Commit 0d8fc67: Added one step orchestration config files, supporting time series data files and postman collection

# Single Step Orchestration Sample
---
## Overview
---
This repository contains sample configuration, data, and Postman files for creating and running an orchestration in your instance of the Predix Analytics Runtime, following the 'Running an Orchestration Using Predix Time Series Tags' roadmap (https://www.predix.io/docs#ogUlV1fl). The orchestration runs the demo-timeseries-adder-java analytic. It takes two sets of (aligned) time series data as input, produces their row-wise sum, and writes the output to a new Time Series tag.
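The analytic's row-wise sum can be sketched in a few lines of Python. This is a minimal illustration of the operation, not the analytic's actual Java source; the sample values are invented, and the Time Series quality column is omitted.

```python
def row_wise_sum(series_a, series_b):
    """Row-wise sum of two aligned time series.

    Each series is a list of [timestamp, value] pairs; the two series are
    assumed to be aligned (same timestamps, in the same order).
    """
    if len(series_a) != len(series_b):
        raise ValueError("series must be aligned")
    return [[ts_a, val_a + val_b]
            for (ts_a, val_a), (_, val_b) in zip(series_a, series_b)]

# Timestamps mirror the sample data in supportingDataFiles/; values are invented.
tag_a = [[1453338376200, 1], [1453338376201, 2]]
tag_b = [[1453338376200, 10], [1453338376201, 20]]
print(row_wise_sum(tag_a, tag_b))
# [[1453338376200, 11], [1453338376201, 22]]
```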
## Files in the repository
Sample data files:

* data files with the time series inputs for the two addends,
* the expected output, and
* sample data files for validating that the demo-timeseries-adder-java analytic has been properly deployed to your Predix Analytics Catalog.

A Postman collection containing requests for
* getting a user token
* loading the orchestration configuration
* running the orchestration

Configuration files for the sample orchestration

## Usage
---
Follow the instructions below to set up the orchestration and trigger its execution in your Predix Analytics environment.

### Task Roadmap Prerequisites
* Create subscriptions to:
  * Predix Analytics Catalog,
  * Predix Analytics Runtime,
  * Predix Analytics UI,
  * Predix Time Series, and
  * Predix UAA

  (see the Getting Started instructions for each service at https://www.predix.io/docs#OqtMIsCd)

* Use the websocket connection in the Predix Tool Kit (for basic: https://predix-starter.run.aws-usw02-pr.ice.predix.io/#!/wsClient, for select: ????) to load the sample data into your Predix Time Series instance.
  1. Log in as a user from your UAA.
  2. Use Time Series ingest to load the data from supportingDataFiles/time-series-tag-A-data.json and supportingDataFiles/time-series-tag-B-data.json:
     * predix-zone-id is your Time Series guid (zone id/instance id)
     1. Open the socket.
     2. Use supportingDataFiles/time-series-tag-A-data.json as the request body and send the message.
     3. Use supportingDataFiles/time-series-tag-B-data.json as the request body and send the message.
     4. Close the socket.
     5. Use Time Series query, request 'Time Bounded Request', to verify that tag-A and tag-B have been loaded. Request bodies can be found in:
        * supportingDataFiles/tag-A-time-bounded-request.json
        * supportingDataFiles/tag-B-time-bounded-request.json

* Set up Postman
  * Install Postman (from the Chrome web store, https://chrome.google.com/webstore/detail/postman/fhbjgbiflinjbdggehcddcbncdddomop?hl=en)
  * Open Postman and import 'SingleStepOrchestrationDemoUsingTagMap.postman_collection'
  * Define the Postman environment (AnalyticsDemo)
    * **uaa_uri**: the uri for your UAA instance
    * **uaa_client_id**: the UAA client id for your Analytics and Time Series instances
    * **uaa_authorization_id**: the base64 encoding of <uaa client id>:<uaa client secret> from your UAA instance
    * **userId**: a user id from the UAA instance
    * **userPassword**: the password for the userId
    * **user_token**: a valid UAA token from your Predix UAA service for your Predix Analytics and Time Series instances
      * Use the user token returned when you logged in to ingest tag-A and tag-B into Predix Time Series, or use 'Request user token' from the Postman collection to get a new user token
    * **catalog_uri**, **config_uri**, **execution_uri**: from running 'cf env <your app>' after binding your Predix Analytics instances to <your app> (see the Getting Started guides)
    * **runtime-zone-id**: the guid for your Predix Analytics Runtime
* Deploy the demo-timeseries-adder-java analytic (https://github.com/PredixDev/predix-analytics-sample/tree/master/analytics/demo-timeseries-adder-java) to your Analytics Catalog (using your Analytics UI)
  1. Add the demo-timeseries-adder-java.jar and demo-timeseries-adder-template.json as 'Executable' and 'Template' attachments.
  2. Deploy and test the analytic using supportingDataFiles/analytic-input-for-demo-timeseries-adder.json.

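The **uaa_authorization_id** value above can be produced programmatically: it is the base64 encoding of `<uaa client id>:<uaa client secret>`, the same value UAA expects in a Basic authorization header. A minimal sketch, using placeholder credentials rather than real ones:

```python
import base64

def uaa_authorization_id(client_id: str, client_secret: str) -> str:
    """Base64-encode '<client id>:<client secret>' for the Basic auth header."""
    raw = f"{client_id}:{client_secret}".encode("utf-8")
    return base64.b64encode(raw).decode("ascii")

# Placeholder credentials, for illustration only.
auth_id = uaa_authorization_id("my-client", "my-secret")
print(auth_id)                            # value to paste into uaa_authorization_id
print(f"Authorization: Basic {auth_id}")  # header form used when requesting a token
```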
## Task Roadmap Instructions
1. Create the orchestration configuration file from orchestrationConfigurationFiles/orchestration-workflow.xml by updating the following entries in line 19:
   * **Analytic Catalog Entry Id** is the Catalog's guid for the analytic. The guid can be found in the URI of the Analytic Detail page: in the Analytics UI, go to the Analytic Detail page for the demo adder analytic; the guid is the part of the URI after '.../view/'.
   * **Analytic Name** is the name you entered when you created the analytic in the Analytics Catalog.
   * **Analytic Version** is the version you entered when you created the analytic in the Analytics Catalog.

2. Validate the orchestration workflow file using the 'Workflow Validation' request from the Postman collection:
   * choose the updated bpmn.xml file in the Body of the request
   * submit the request and look for 'Status: 200 OK'

3. Create a port-to-field map using orchestrationConfigurationFiles/port-to-field-map-for-demoTimeseriesAdder.json. You can use this file as is; no changes are needed.

4. Create an orchestration configuration entry using the following Postman requests:
   1. 'Create Orchestration Configuration Entry' request. Note down the 'id' in the response; this is the orchestration entry id.
   2. 'Upload Orchestration Workflow File' request: change <orchestration entry id> to the orchestration entry id from step i and choose orchestrationConfigurationFiles/orchestration-workflow.xml in the body of the request before sending it.
   3. 'Upload port-to-field Map for Orchestration Step' request: change <orchestration entry id> to the orchestration entry id from step i and choose orchestrationConfigurationFiles/port-to-field-map-for-demoTimeseriesAdder.json before sending the request.

5. Run the orchestration using the 'Run Single Step Orchestration Using Tag Map' request: change <orchestration entry id> to the orchestration entry id from the prior step before submitting the request.

---
## Congratulations!
You have just created and run an orchestration.

---
# Explaining the file contents
The following highlights important entries in the sample files and how values are related across the files.

## Loading values into Time Series
time-series-tag-A-data.json contains:

```
{
  "messageId": "1453338376222",
  "body": [
    {
      "name": "tag-A",
      "datapoints": [
        [
          1453338376200,
          1,
          3
        ],
        [
          1453338376201,
          2,
          3
        ],
        ...
```

Note the **"name": "tag-A"** entry in the JSON object: it causes Time Series to store the data under the **tag-A** key when the data is ingested.

## Mapping data values from tag-A and tag-B to the analytic input json

The port-to-field map file (port-to-field-map-for-demoTimeseriesAdder.json) defines the input as coming from fieldIds **temperature sensor** and **vibration sensor** (see below), and the orchestration request (in Postman) maps **temperature sensor** to **tag-A** and **vibration sensor** to **tag-B**. So **tag-A**'s values are read for the **temperature sensor** and **tag-B**'s values are read for the **vibration sensor**.

port-to-field map entry:

```
"inputMaps": [
  {
    "valueSourceType": "DATA_CONNECTOR",
    "fullyQualifiedPortName": "data.time_series.numberArray1",
    "fieldId": "temperature sensor",
    "queryCriteria": {"start": 0, "end": -1},
    "dataSourceId": "PredixTimeSeries"
  },
  {
    "valueSourceType": "DATA_CONNECTOR",
    "fullyQualifiedPortName": "data.time_series.numberArray2",
    "fieldId": "vibration sensor",
    "queryCriteria": {"start": 0, "end": -1},
    "dataSourceId": "PredixTimeSeries"
  }
],
```

orchestration run request:

```
"assetDataFieldsMap": {
  "temperature sensor": "tag-A",
  "vibration sensor": "tag-B",
  "demo sum": "tag-C"
},
```

The port-to-field map entries

**"fullyQualifiedPortName": "data.time_series.numberArray1"** and
**"fullyQualifiedPortName": "data.time_series.numberArray2"**

tell the processing to put the values in the input JSON object as:

```
{"data" :
  {"time_series" :
    {"numberArray1" : [<values from tag-A>],
     "numberArray2" : [<values from tag-B>]
    }
  }
}
```

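The mechanics of this mapping can be illustrated in a few lines of Python. This is a simplified sketch of what the runtime does, not Predix code, and the sample tag values are invented for the example:

```python
def build_analytic_input(input_maps, tag_map, tag_values):
    """Resolve each port-to-field map entry to a tag's values and place them
    at the nested path named by its fullyQualifiedPortName."""
    result = {}
    for entry in input_maps:
        tag = tag_map[entry["fieldId"]]  # e.g. 'temperature sensor' -> 'tag-A'
        *path, leaf = entry["fullyQualifiedPortName"].split(".")
        node = result
        for key in path:                 # walk/create 'data' -> 'time_series'
            node = node.setdefault(key, {})
        node[leaf] = tag_values[tag]
    return result

input_maps = [
    {"fullyQualifiedPortName": "data.time_series.numberArray1", "fieldId": "temperature sensor"},
    {"fullyQualifiedPortName": "data.time_series.numberArray2", "fieldId": "vibration sensor"},
]
tag_map = {"temperature sensor": "tag-A", "vibration sensor": "tag-B"}
tag_values = {"tag-A": [1, 2], "tag-B": [10, 20]}  # invented sample values

print(build_analytic_input(input_maps, tag_map, tag_values))
# {'data': {'time_series': {'numberArray1': [1, 2], 'numberArray2': [10, 20]}}}
```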
## Mapping the output to Time Series tag-C
This is essentially the same as the input mapping. The orchestration run request maps **demo sum** to **tag-C**, and the port-to-field map maps **demo sum** to the value of the **data.time_series.sum** object in the analytic output.

## Associating the port to field map with the orchestration step

When a port-to-field map is uploaded to the system, the request contains a **name** field (see the request in Postman). This value must match the id in the **\<serviceTask ... id="\<value\>" ... /\>** element of the orchestration's bpmn specification.
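A quick way to check the match is to read the serviceTask id straight out of the workflow file; a sketch using Python's standard library (the bpmn fragment and the id value below are illustrative, not copied from the actual sample file):

```python
import xml.etree.ElementTree as ET

BPMN_NS = "http://www.omg.org/spec/BPMN/20100524/MODEL"

# Illustrative bpmn fragment; in practice, parse
# orchestrationConfigurationFiles/orchestration-workflow.xml instead.
bpmn = f"""<definitions xmlns:bpmn2="{BPMN_NS}">
  <bpmn2:process id="DemoAdderProcess">
    <bpmn2:serviceTask id="demo-timeseries-adder" name="Demo Adder Step"/>
  </bpmn2:process>
</definitions>"""

root = ET.fromstring(bpmn)
task_ids = [t.get("id") for t in root.iter(f"{{{BPMN_NS}}}serviceTask")]
print(task_ids)  # each id must equal the 'name' of an uploaded port-to-field map
# ['demo-timeseries-adder']
```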