Analytics Catalog Service API Model Reference
The following models are used when calling the Analytics Catalog Service API.
Analytic Catalog Entry Model
Field | Field Type | Data Type | Comments |
---|---|---|---|
id | System-generated | String | Generated by the Analytics Catalog service, the ID is returned as part of the response. This field is ignored in the POST request. In the PUT request, this ID is required, and it must match the ID in the REST endpoint path. |
name | Required | String | This field can contain 1-255 alphanumeric, space and '-' (dash) characters. This entry must start with a letter. The name and version combination must be unique. The name cannot be updated with the PUT request. |
author | Required | String | Free-form string up to 255 characters. |
description | Optional | String | Free-form string up to 1024 characters. |
version | Required | String | Can contain 1-32 alphanumeric, '.' (period) and '-' (dash) characters. The version cannot be updated with the PUT request. |
supportedLanguage | Required | String | Possible values: java, matlab, python, python_3. |
customMetadata | Optional | String | Free-form string up to 4096 characters. |
taxonomyLocation | Optional | String | The taxonomy directory where the analytic catalog entry will be created. This location defaults to "/uncategorized" if not specified. |
createdTimestamp | System-generated | String | Marks the time the analytic catalog entry was created. Generated by the Analytics Catalog service, the timestamp is returned in ISO-8601 format as part of the response. |
updatedTimestamp | System-generated | String | Marks the time the analytic catalog entry was last updated. Generated by the Analytics Catalog service, the timestamp is returned in ISO-8601 format as part of the response. |
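Putting the fields above together, a catalog entry response might look like the following sketch (field values are illustrative only; the id and name are borrowed from samples later in this reference):

```json
{
  "id": "09718078-95e7-4b60-b74a-152838f03b41",
  "name": "DoubleArrayAdder",
  "author": "Predix Analytics team",
  "description": "Adds two arrays of doubles",
  "version": "1.0.0",
  "supportedLanguage": "java",
  "customMetadata": "",
  "taxonomyLocation": "/uncategorized",
  "createdTimestamp": "2016-02-17T18:27:49.601",
  "updatedTimestamp": "2016-02-17T18:27:49.601"
}
```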
Analytic Validation Result Model
Field | Field Type | Data Type | Comments |
---|---|---|---|
analyticId | System-generated | String | id of the analytic being validated. |
validationRequestId | System-generated | String | id of the analytic validation request. |
status | System-generated | String | Possible values: QUEUED, PROCESSING, COMPLETED, FAILED. |
message | System-generated | String | Detailed analytic validation status message. |
inputData | System-generated | String | The analytic input data that was passed in to the validation request. |
result | System-generated | String | The result string returned by the analytic. |
createdTimestamp | System-generated | String | Time the analytic validation was initiated. Generated by the Analytics Catalog service, the timestamp is returned in ISO-8601 format as part of the response. |
updatedTimestamp | System-generated | String | Time the analytic validation was last updated. Generated by the Analytics Catalog service, the timestamp is returned in ISO-8601 format as part of the response. |
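An illustrative validation result response assembled from the fields above (all values are hypothetical):

```json
{
  "analyticId": "09718078-95e7-4b60-b74a-152838f03b41",
  "validationRequestId": "1a2b3c4d-0000-0000-0000-000000000000",
  "status": "COMPLETED",
  "message": "Analytic validation completed successfully.",
  "inputData": "{\"number1\": 1, \"number2\": 2}",
  "result": "{\"result\": 3}",
  "createdTimestamp": "2016-02-17T18:27:49.601",
  "updatedTimestamp": "2016-02-17T18:28:05.114"
}
```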
Analytic Deployment Request Model
Field | Field Type | Data Type | Comments |
---|---|---|---|
memory | Optional | Integer | The amount of memory in MB to allocate to the analytic. Default is 512 MB. If no value is provided, or the provided value is less than the default, the default value will be used. |
diskQuota | Optional | Integer | The disk quota in MB to allocate to the analytic. Default is 1024 MB for Java and Python analytics, 2048 for Matlab analytics. If no value is provided, or the provided value is less than the default, the default value will be used. |
instances | Optional | Integer | The number of instances of the analytic to deploy. Default is 1. If no value is provided, or the provided value is less than the default, the default value will be used. |
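A deployment request body using these fields might look like the following (values are illustrative; any omitted field, or any value below the default, falls back to the default):

```json
{
  "memory": 1024,
  "diskQuota": 2048,
  "instances": 2
}
```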
Analytic Deployment Result Model
Field | Field Type | Data Type | Comments |
---|---|---|---|
analyticId | System-generated | String | id of the analytic being deployed. |
requestId | System-generated | String | id of the analytic deployment request. |
status | System-generated | String | Possible values: QUEUED, PROCESSING, COMPLETED, FAILED. |
message | System-generated | String | Detailed analytic deployment status message. |
inputConfigData | System-generated | String | The input configuration data that was passed in to the deployment request. |
result | System-generated | String | Deprecated |
createdTimestamp | System-generated | String | Time the analytic deployment was initiated. Generated by the Analytics Catalog service, the timestamp is returned in ISO-8601 format as part of the response. |
updatedTimestamp | System-generated | String | Time the analytic deployment was last updated. Generated by the Analytics Catalog service, the timestamp is returned in ISO-8601 format as part of the response. |
Analytic Execution Result Model
Field | Field Type | Data Type | Comments |
---|---|---|---|
analyticId | System-generated | String | id of the analytic being executed. |
status | System-generated | String | Possible values: QUEUED, PROCESSING, COMPLETED, FAILED. |
message | System-generated | String | Detailed analytic execution status message. |
inputData | System-generated | String | The analytic input data that was passed in to the execution request. |
result | System-generated | String | The result string returned by the analytic. |
createdTimestamp | System-generated | String | Time the analytic execution was initiated. Generated by the Analytics Catalog service, the timestamp is returned in ISO-8601 format as part of the response. |
updatedTimestamp | System-generated | String | Time the analytic execution was last updated. Generated by the Analytics Catalog service, the timestamp is returned in ISO-8601 format as part of the response. |
Asynchronous Execution Status Response Model
Field | Field Type | Data Type | Comments |
---|---|---|---|
analyticId | System-generated | String | id of the analytic executed. |
requestId | System-generated | String | A unique identifier for retrieving the status and results of the requested execution. |
analyticExecutionState | System-generated | String | The current status of the execution. Possible values: QUEUED, PROCESSING, COMPLETED or FAILED. |
createdTimestamp | System-generated | String | The time when the execution request was received. |
updatedTimestamp | System-generated | String | The time when the status last changed. |
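An illustrative asynchronous status response built from the fields above (all values are hypothetical):

```json
{
  "analyticId": "09718078-95e7-4b60-b74a-152838f03b41",
  "requestId": "7f8e9d0c-0000-0000-0000-000000000000",
  "analyticExecutionState": "PROCESSING",
  "createdTimestamp": "2016-02-17T18:27:49.601",
  "updatedTimestamp": "2016-02-17T18:27:52.314"
}
```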
Deployed Analytic URI
The analytic URI value is needed when issuing certain REST API calls.
The deployed analytic URI format is https://<analytic id>.analytics.<Predix platform domain>.
The parts of the analytic URI are determined as follows.
- "analytic id" is the id returned by the Analytics Catalog service when an analytic entry is created.
- "Predix platform domain" is the substring of the Analytics Catalog service URI starting at "run". For example, if the Analytics Catalog service URI is predix-analytics-catalog-release.run.aws-usw02-pr.ice.predix.io, then the Predix platform domain is run.aws-usw02-pr.ice.predix.io.
For example, if your "analytic id" is 09718078-95e7-4b60-b74a-152838f03b41 and the Analytics Catalog service URI is predix-analytics-catalog-release.run.aws-usw02-pr.ice.predix.io, then the analytic URI is https://09718078-95e7-4b60-b74a-152838f03b41.analytics.run.aws-usw02-pr.ice.predix.io.
Required BPMN 2.0 XML Structures
The Analytics Framework uses an orchestration workflow file to trigger the analytic execution request. The orchestration workflow is defined in a BPMN file, which is an XML file conforming to the BPMN 2.0 standard.
Required Structures
The following table summarizes the required major structures in each BPMN 2.0 XML file created as an orchestration workflow file.
Tag | Description |
---|---|
<definitions> | See BPMN sample files for required content |
<process> | The following properties are required: |
<startEvent> | The following properties are required: |
<serviceTask> | Represents each orchestration step. The following attributes must be defined as shown. The following properties must be defined as described: |
<endEvent> | The following properties are required: |
<sequenceFlow> | The following properties are required: |
serviceTask Naming Conventions
Each analytic in the orchestration is represented as a serviceTask element in the orchestration workflow file. The name attribute of each serviceTask element must use the following format:
name="<analytic_id>::<analytic_name>::<analytic_version>"
where <analytic_id>, <analytic_name>, and <analytic_version> match the ID, name, and version assigned when the analytic was added to the catalog.
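For example, a serviceTask element following this naming convention might look like the snippet below (the id and name values are illustrative; see the BPMN sample files for the full document structure and the other required attributes):

```xml
<serviceTask id="sid-10001"
             name="09718078-95e7-4b60-b74a-152838f03b41::DoubleArrayAdder::1.0.0" />
```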
Descriptive Analytic Template Structure Reference
An analytic template defines the format of the input and output JSON structures. This reference describes how the JSON objects in a descriptive analytic template are used to generate the input structure required by the analytic.
Overview
The analytic template type (hierarchical or descriptive) describes the data format you are working with. The descriptive data format is input and output data that is flat in structure but has self-describing attributes such as the data type and data size.
Each analytic has one template that is stored in the Analytics Catalog. The template is a JSON structure. For more information about JSON terminology, see http://www.json.org.
Type: AnalyticTemplate
This is the overall structure of a descriptive analytic template.
{
"name" : <String>,
"description" : <String>,
"packageName" : <String>,
"codeLanguage" : <String>,
"version" : <String>,
"author" : <String>,
"inputDataPortDefinitions" : [<List of DataPortDefinition>],
"constantDataPortDefinitions" : [<List of DataPortDefinition>],
"outputDataPortDefinitions" : [<List of DataPortDefinition>]
}
Field | Description |
---|---|
name | Name of the analytic. Must be alphanumeric and must start with a letter. |
description | Description of the analytic. It can be in HTML format. However, no image tags are allowed. |
packageName | Reserved for future use. |
codeLanguage | Reserved for future use. |
version | Version of the analytic. |
author | Author of the analytic. |
inputDataPortDefinitions | List of input port definitions. |
constantDataPortDefinitions | List of constant port definitions. |
outputDataPortDefinitions | List of output port definitions. |
Type: DataPortDefinition
The inputDataPortDefinitions, constantDataPortDefinitions, and outputDataPortDefinitions elements of the analytic template contain arrays of DataPortDefinition objects that specify the inputs, constants, and outputs of the analytic.
{
"timeSeriesName" : <String>,
"name" : <String>,
"description" : <String>,
"dataType" : <String>,
"variable" : <Boolean>,
"arrayDimensions" : <Integer>,
"validators" : [<List of String>]
}
See the following table for a description of the elements in a DataPortDefinition.
Field | Description |
---|---|
timeSeriesName | Leave this field blank for a non-timeseries port definition. For a timeseries port definition, this is the name of the timeseries. |
name | Port name. |
description | Description of the data port. |
dataType | The data type in which the analytic expects input data and produces output data. The framework will convert values to and from this data type between the analytic and the data source. Possible values are: DOUBLE, INTEGER, LONG, STRING, BOOLEAN, DATETIME, DATE, TIME, CUSTOM, TIMESERIES. |
variable | Set to true if this port can expand into a variable number of ports at runtime. |
arrayDimensions | Indicates the number of dimensions (for example, 0 is a single value, 1 is a 1-dimensional array, 2 is a 2-dimensional array, and so on). |
validators | Reserved for future use. |
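For instance, a 1-dimensional double timeseries column port could be declared as follows (a sketch based on the fields above; the names and description are illustrative, and the dataType casing follows the samples later in this reference):

```json
{
  "timeSeriesName": "time_series",
  "name": "numberArray1",
  "description": "First input column",
  "dataType": "Double",
  "variable": false,
  "arrayDimensions": 1,
  "validators": []
}
```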
Template Examples
Analytic templates are used at runtime in conjunction with Port-to-Field Map configurations to control the runtime engine's handling of the analytic input and output. See Samples of Descriptive Analytic Templates and Port-to-Field Maps.
Samples of Descriptive Analytic Templates and Port-to-Field Maps
An analytic template is used at runtime in conjunction with a port-to-field map to control how the framework's runtime engine will handle the analytic input and output. The following are samples of descriptive analytic templates and port-to-field maps.
Sample: Analytic Adder with Double Array Ports
The following is a sample JSON input.
{
"internalRecords": [],
"records": [
{
"type": "DOUBLE",
"name": "numberArray1",
"data": [
5.0,
6.0,
7.0,
8.0,
9.0,
10.0
],
"arraySizes": [
6
]
},
{
"type": "DOUBLE",
"name": "numberArray2",
"data": [
500.0,
600.0,
700.0,
800.0,
900.0,
1000.0
],
"arraySizes": [
6
]
}
],
"timeSeriesRecords": []
}
The following is a sample JSON output.
{
"records": [
{
"type": "DOUBLE",
"name": "results",
"data": [
505.0,
606.0,
707.0,
808.0,
909.0,
1010.0
],
"arraySizes": [
6
]
}
]
}
The following is a sample descriptive analytic template.
{
"name": "DoubleArrayAdder",
"description": "",
"packageName": "",
"codeLanguage": "JAVA",
"version": "1.0.0",
"author": "Predix Analytics team",
"inputDataPortDefinitions": [
{
"name": "numberArray1",
"description": "",
"dataType": "Double",
"valueRequired": true,
"variable": false,
"arrayDimensions": 1
},
{
"name": "numberArray2",
"description": "",
"dataType": "Double",
"valueRequired": true,
"variable": false,
"arrayDimensions": 1
}
],
"outputDataPortDefinitions": [
{
"name": "results",
"description": "",
"dataType": "Double",
"valueRequired": false,
"variable": false,
"arrayDimensions": 1
}
]
}
The following is a sample port-to-field map.
{
"analyticName": "DoubleArrayAdder",
"analyticVersion": "1.0",
"orchestrationStepId": "sid-10001",
"iterations": [
{
"inputMaps": [
{
"valueSourceType": "DATA_CONNECTOR",
"fullyQualifiedPortName": "numberArray1",
"fieldId": "KW",
"queryCriteria": {
"start": 1455733669605,
"end": 1455733669610
},
"engUnit": "kw",
"required": true,
"dataSourceId": "Predix Time Series"
},
{
"valueSourceType": "DATA_CONNECTOR",
"fullyQualifiedPortName": "numberArray2",
"fieldId": "vibration",
"queryCriteria": {
"start": 1455733669605,
"end": 1455733669610
},
"engUnit": "hertz",
"required": true,
"dataSourceId": "Predix Time Series"
}
],
"outputMaps": [
{
"fullyQualifiedPortName": "results",
"fieldId": "bearing temperature final",
"engUnit": "Celsius",
"dataSourceId": "Predix Time Series"
}
]
}
]
}
Sample: Analytic Adder with Timeseries and Constant Ports
Scenario: An analytic requires a time series table and one constant, where the time series input table has two fixed columns: numberArray1 and numberArray2. Sample analytic templates and port-to-field maps to support this scenario follow.
All columns in the time series array will share the same time stamps. The quality attribute from Time Series is not supported.
The following is a sample JSON input.
{
"internalRecords": [],
"records": [
{
"type": "DOUBLE",
"name": "constant1",
"data": [
55.0
],
"arraySizes": [
1
]
}
],
"timeSeriesRecords": [
{
"name": "time_series",
"timestamps": [
"2016-02-17T18:27:49.601",
"2016-02-17T18:27:49.602",
"2016-02-17T18:27:49.603"
],
"data": [
{
"type": "DOUBLE",
"name": "numberArray1",
"data": [
1.0,
2.0,
3.0
],
"arraySizes": [
3
]
},
{
"type": "DOUBLE",
"name": "numberArray2",
"data": [
100.0,
200.0,
300.0
],
"arraySizes": [
3
]
}
]
}
]
}
The following is a sample analytic output.
{
"internalRecords": [],
"records": [],
"timeSeriesRecords": [
{
"name": "time_series",
"timestamps": [
"2016-02-17T18:27:49.601",
"2016-02-17T18:27:49.602",
"2016-02-17T18:27:49.603"
],
"data": [
{
"type": "DOUBLE",
"name": "results",
"data": [
101.0,
202.0,
303.0
],
"arraySizes": [
3
]
}
]
}
]
}
The following is a sample descriptive analytic template.
{
"name": "TimeseriesAdder",
"description": "",
"packageName": "",
"codeLanguage": "JAVA",
"version": "1.0.0",
"author": "Predix Analytics team",
"inputDataPortDefinitions": [
{
"name": "time_series",
"description": "",
"dataType": "TimeSeries",
"valueRequired": false,
"variable": false,
"arrayDimensions": 0
},
{
"timeSeriesName": "time_series",
"name": "numberArray1",
"description": "",
"dataType": "Double",
"valueRequired": true,
"variable": false,
"arrayDimensions": 1
},
{
"timeSeriesName": "time_series",
"name": "numberArray2",
"description": "",
"dataType": "Double",
"valueRequired": true,
"variable": false,
"arrayDimensions": 1
}
],
"constantDataPortDefinitions": [
{
"name": "constant1",
"description": "",
"dataType": "Double",
"valueRequired": true,
"variable": false,
"arrayDimensions": 0
}
],
"outputDataPortDefinitions": [
{
"name": "time_series",
"description": "",
"dataType": "TimeSeries",
"valueRequired": false,
"variable": false,
"arrayDimensions": 0
},
{
"timeSeriesName": "time_series",
"name": "results",
"description": "",
"dataType": "Double",
"valueRequired": false,
"variable": false,
"arrayDimensions": 1
}
]
}
The following is a sample port-to-field map.
{
"analyticName": "timeseries-adder",
"analyticVersion": "1.0",
"orchestrationStepId": "sid-10001",
"iterations": [
{
"inputMaps": [
{
"valueSourceType": "DATA_CONNECTOR",
"fullyQualifiedPortName": "time_series.numberArray1",
"fieldId": "KW",
"queryCriteria": {
"start": 1455733669605,
"end": 1455733669610
},
"engUnit": "kw",
"required": true,
"dataSourceId": "Predix Time Series"
},
{
"valueSourceType": "DATA_CONNECTOR",
"fullyQualifiedPortName": "time_series.numberArray2",
"fieldId": "vibration",
"queryCriteria": {
"start": 1455733669605,
"end": 1455733669610
},
"engUnit": "hertz",
"required": true,
"dataSourceId": "Predix Time Series"
},
{
"valueSourceType": "DATA_CONNECTOR",
"fullyQualifiedPortName": "constant1",
"fieldId": "engineType",
"queryCriteria": {
"columns": [
"data_value"
],
"table": "sensor_data",
"conditions": [
{
"key": "asset_id",
"value":"${ASSET_ID}",
"valueType": "string",
"relation": " = "
},
{
"key": "field_id",
"value": "engineType",
"valueType": "string",
"relation": " = "
}
]
},
"required": true,
"dataSourceId": "Custom Data Source"
}
],
"outputMaps": [
{
"fullyQualifiedPortName": "time_series.results",
"fieldId": "bearing temperature final",
"engUnit": "Celsius",
"dataSourceId": "Predix Time Series"
}
]
}
]
}
Sample: Analytic Adder with Variable Timeseries Ports
The following is a sample JSON input.
{
"internalRecords": [],
"records": [],
"timeSeriesRecords": [
{
"name": "time_series",
"timestamps": [
"2016-02-17T18:27:49.601",
"2016-02-17T18:27:49.602",
"2016-02-17T18:27:49.603"
],
"data": [
{
"type": "DOUBLE",
"name": "temperature_2",
"data": [
1.0,
2.0,
3.0
],
"arraySizes": [
3
]
},
{
"type": "DOUBLE",
"name": "temperature_5",
"data": [
5.0,
6.0,
7.0
],
"arraySizes": [
3
]
},
{
"type": "DOUBLE",
"name": "vibration_2",
"data": [
100.0,
200.0,
300.0
],
"arraySizes": [
3
]
},
{
"type": "DOUBLE",
"name": "vibration_5",
"data": [
500.0,
600.0,
700.0
],
"arraySizes": [
3
]
}
]
}
]
}
The following is a sample JSON output.
{
"internalRecords": [],
"records": [],
"timeSeriesRecords": [
{
"name": "time_series",
"timestamps": [
"2016-02-17T18:27:49.601",
"2016-02-17T18:27:49.602",
"2016-02-17T18:27:49.603"
],
"data": [
{
"type": "DOUBLE",
"name": "results_2",
"data": [
101.0,
202.0,
303.0
],
"arraySizes": [
3
]
},
{
"type": "DOUBLE",
"name": "results_5",
"data": [
505.0,
606.0,
707.0
],
"arraySizes": [
3
]
}
]
}
]
}
The following is a sample descriptive analytic template.
{
"name": "TimeseriesAdder",
"description": "",
"packageName": "",
"codeLanguage": "JAVA",
"version": "1.0.0",
"author": "Predix Analytics team",
"inputDataPortDefinitions": [
{
"name": "time_series",
"description": "",
"dataType": "TimeSeries",
"valueRequired": false,
"variable": false,
"arrayDimensions": 0
},
{
"timeSeriesName": "time_series",
"name": "temperature",
"description": "",
"dataType": "Double",
"valueRequired": true,
"variable": true,
"arrayDimensions": 1
},
{
"timeSeriesName": "time_series",
"name": "vibration",
"description": "",
"dataType": "Double",
"valueRequired": true,
"variable": true,
"arrayDimensions": 1
}
],
"outputDataPortDefinitions": [
{
"name": "time_series",
"description": "",
"dataType": "TimeSeries",
"valueRequired": false,
"variable": false,
"arrayDimensions": 0
},
{
"timeSeriesName": "time_series",
"name": "results",
"description": "",
"dataType": "Double",
"valueRequired": false,
"variable": true,
"arrayDimensions": 1
}
]
}
The following is a sample port-to-field map.
{
"analyticName": "timeseries-adder",
"analyticVersion": "1.0",
"orchestrationStepId": "sid-10001",
"iterations": [
{
"inputMaps": [
{
"valueSourceType": "DATA_CONNECTOR",
"fullyQualifiedPortName": "time_series.temperature_2",
"fieldId": "temp.left",
"queryCriteria": {
"start": 1455733669601,
"end": 1455733669603
},
"required": true,
"dataSourceId": "Predix Time Series"
},
{
"valueSourceType": "DATA_CONNECTOR",
"fullyQualifiedPortName": "time_series.temperature_5",
"fieldId": "temp.right",
"queryCriteria": {
"start": 1455733669601,
"end": 1455733669603
},
"required": true,
"dataSourceId": "Predix Time Series"
},
{
"valueSourceType": "DATA_CONNECTOR",
"fullyQualifiedPortName": "time_series.vibration_2",
"fieldId": "vibration.left",
"queryCriteria": {
"start": 1455733669601,
"end": 1455733669603
},
"required": true,
"dataSourceId": "Predix Time Series"
},
{
"valueSourceType": "DATA_CONNECTOR",
"fullyQualifiedPortName": "time_series.vibration_5",
"fieldId": "vibration.right",
"queryCriteria": {
"start": 1455733669601,
"end": 1455733669603
},
"required": true,
"dataSourceId": "Predix Time Series"
}
],
"outputMaps": [
{
"fullyQualifiedPortName": "time_series.results_2",
"fieldId": "results.left",
"dataSourceId": "Predix Time Series"
},
{
"fullyQualifiedPortName": "time_series.results_5",
"fieldId": "results.right",
"dataSourceId": "Predix Time Series"
}
]
}
]
}
Sample: Analytic Adder with Multi-Dimensional Timeseries Ports
The following is a sample JSON input.
{
"internalRecords": [],
"records": [],
"timeSeriesRecords": [
{
"name": "time_series",
"timestamps": [
"2016-02-17T18:27:49.601",
"2016-02-17T18:27:49.602",
"2016-02-17T18:27:49.603",
"2016-03-17T18:27:49.601",
"2016-03-17T18:27:49.602",
"2016-03-17T18:27:49.603"
],
"data": [
{
"type": "DOUBLE",
"name": "temperature",
"data": [
1.0,
2.0,
3.0,
5.0,
6.0,
7.0
],
"arraySizes": [
2,
3
]
},
{
"type": "DOUBLE",
"name": "vibration",
"data": [
100.0,
200.0,
300.0,
500.0,
600.0,
700.0
],
"arraySizes": [
2,
3
]
}
]
}
]
}
The following is a sample analytic output.
{
"internalRecords": [],
"records": [],
"timeSeriesRecords": [
{
"name": "time_series",
"timestamps": [
"2016-02-17T18:27:49.601",
"2016-02-17T18:27:49.602",
"2016-02-17T18:27:49.603",
"2016-03-17T18:27:49.601",
"2016-03-17T18:27:49.602",
"2016-03-17T18:27:49.603"
],
"data": [
{
"type": "DOUBLE",
"name": "results",
"data": [
101.0,
202.0,
303.0,
505.0,
606.0,
707.0
],
"arraySizes": [
2,
3
]
}
]
}
]
}
The following is a sample descriptive analytic template.
{
"name": "TimeseriesAdder",
"description": "",
"packageName": "",
"codeLanguage": "JAVA",
"version": "1.0.0",
"author": "Predix Analytics team",
"inputDataPortDefinitions": [
{
"name": "time_series",
"description": "",
"dataType": "TimeSeries",
"valueRequired": false,
"variable": false,
"arrayDimensions": 0
},
{
"timeSeriesName": "time_series",
"name": "temperature",
"description": "",
"dataType": "Double",
"valueRequired": true,
"variable": false,
"arrayDimensions": 2
},
{
"timeSeriesName": "time_series",
"name": "vibration",
"description": "",
"dataType": "Double",
"valueRequired": true,
"variable": false,
"arrayDimensions": 2
}
],
"outputDataPortDefinitions": [
{
"name": "time_series",
"description": "",
"dataType": "TimeSeries",
"valueRequired": false,
"variable": false,
"arrayDimensions": 0
},
{
"timeSeriesName": "time_series",
"name": "results",
"description": "",
"dataType": "Double",
"valueRequired": false,
"variable": false,
"arrayDimensions": 2
}
]
}
The following is a sample port-to-field map.
{
"analyticName": "timeseries-adder",
"analyticVersion": "1.0",
"orchestrationStepId": "sid-10001",
"iterations": [
{
"inputMaps": [
{
"valueSourceType": "DATA_CONNECTOR",
"fullyQualifiedPortName": "time_series.temperature[0]",
"fieldId": "temperature.previous",
"queryCriteria": {
"start": 1455733669601,
"end": 1455733669603
},
"required": true,
"dataSourceId": "Predix Time Series"
},
{
"valueSourceType": "DATA_CONNECTOR",
"fullyQualifiedPortName": "time_series.temperature[1]",
"fieldId": "temperature.current",
"queryCriteria": {
"start": 1458239269601,
"end": 1458239269603
},
"required": true,
"dataSourceId": "Predix Time Series"
},
{
"valueSourceType": "DATA_CONNECTOR",
"fullyQualifiedPortName": "time_series.vibration[0]",
"fieldId": "vibration.previous",
"queryCriteria": {
"start": 1455733669601,
"end": 1455733669603
},
"required": true,
"dataSourceId": "Predix Time Series"
},
{
"valueSourceType": "DATA_CONNECTOR",
"fullyQualifiedPortName": "time_series.vibration[1]",
"fieldId": "vibration.current",
"queryCriteria": {
"start": 1458239269601,
"end": 1458239269603
},
"required": true,
"dataSourceId": "Predix Time Series"
}
],
"outputMaps": [
{
"fullyQualifiedPortName": "time_series.results[0]",
"fieldId": "results.previous",
"dataSourceId": "Predix Time Series"
},
{
"fullyQualifiedPortName": "time_series.results[1]",
"fieldId": "results.current",
"dataSourceId": "Predix Time Series"
}
]
}
]
}
Hierarchical Analytic Template Structure Reference
An analytic template defines the format of the input and output JSON structures. This reference describes how the JSON objects in a hierarchical analytic template are used to generate the input structure required by the analytic.
Overview
The analytic template type (hierarchical or descriptive) describes the data format you are working with. The hierarchical data format is input and output data that can be arbitrarily nested.
Each analytic has one template that is stored in the Analytics Catalog. The template is a JSON structure. For more information about JSON terminology, see http://www.json.org.
Type: AnalyticTemplate
This is the overall structure of a hierarchical analytic template.
{
"analyticName":<string>,
"analyticVersion":<string>,
"comment":[<string>],
"inputPortDefinitions":[<List of PortDefinition>],
"inputModelDefinitions":[<List of InputModel>],
"outputPortDefinitions":[<List of PortDefinition>]
}
See the following table for a description of the elements in an AnalyticTemplate.
Field | Description |
---|---|
analyticName | Informational only. It does not correlate with any values in the input/output JSON structure. |
analyticVersion | Informational only. It does not correlate with any values in the input/output JSON structure. |
comment | (Optional) Informational only. An array of strings. It does not correlate with any values in the input/output JSON structure. |
inputPortDefinitions | The list of port definitions defining the top level JSON objects in the input. |
inputModelDefinitions | The names of input models needed by the analytic. The runtime will get these models (by name) from the orchestration configuration and pass them in to the analytic as a map of byte arrays. |
outputPortDefinitions | The list of port definitions defining the top level JSON objects in the output. |
Type: PortDefinition
The inputPortDefinitions and outputPortDefinitions elements of the analytic template contain arrays of PortDefinitions that specify the inputs and outputs of the analytic. There are several types of PortDefinitions, but they all have the following elements in common.
{
"comment":[<string>],
"portName":<string>,
"portType":<string>,
"variable":<boolean>
}
The PortDefinition subtypes are FieldPort, TimeseriesArrayPort, and CompositePort.
See the following table for a description of the elements in a PortDefinition.
Field | Description |
---|---|
comment | (Optional) Informational only. An array of strings. It does not correlate with any values in the input/output JSON structure. |
portName | Becomes the name in the name/value pair in the analytic's input or output JSON object. Shows as "<portName>": in the JSON object for this port definition. |
portType | Defines the nature of the value part of this JSON object. Predix Analytics Services supports the following value types: |
variable | (Optional) True when the value in the analytic's input or output JSON object is a JSON array. The array's values will be enclosed in [ ]. Default value is false. |
Type: FieldPort
A FieldPort is a type of PortDefinition that holds the actual data values to be passed to/from the analytic. A FieldPort defines a leaf node in the analytic's JSON structure.
{
"comment":[<string>],
"portName":<string>,
"portType":<string>,
"variable":<boolean>,
"engUnit":<string>,
"dataType":<string>,
"required":<boolean>,
"timestampShared":<string>
}
Extends from PortDefinition.
FieldPort definitions correspond to "<port name>": <value determined at runtime (see Port-to-Field Map description)> pairs in the JSON structure.
In addition to the common elements of a PortDefinition, a FieldPort contains the following elements.
Field | Description |
---|---|
engUnit | (Optional, Reserved for future use) The engineering units of the data in this field. |
dataType | The data type in which the analytic expects input data and produces output data. The framework will convert values to and from this data type between the analytic and the data source. Possible values are: 'LONG', 'INTEGER', 'DOUBLE', 'FLOAT', 'STRING', 'BOOLEAN', "BOOLEAN_ARRAY", 'LONG_ARRAY', 'INTEGER_ARRAY', 'DOUBLE_ARRAY', 'FLOAT_ARRAY', 'STRING_ARRAY'. |
required | (Optional) Whether or not this field is required. |
timestampShared | Whether the data includes separate timestamps for each indexed block of generated data for a variable input port. |
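A FieldPort definition combining these elements might look like the following sketch (the portName and comment are illustrative; the FIELD portType matches the DemoAdder sample later in this reference):

```json
{
  "comment": ["Left-hand operand"],
  "portName": "number1",
  "portType": "FIELD",
  "variable": false,
  "dataType": "DOUBLE",
  "required": true
}
```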
Type: TimeseriesArrayPort
A TimeseriesArrayPort is a type of PortDefinition that is used to define a JSON structure representing a time series data table. This is the overall structure of a TimeseriesArrayPort.
{
"comment": [<string>],
"portName": <string>,
"portType": <string>,
"variable": <boolean>,
"columns": [<List of FieldPorts>]
}
Extends from PortDefinition.
In addition to the common elements of a PortDefinition, a TimeseriesArrayPort holds an array of FieldPorts called columns.
- Field: columns
- Contains FieldPort column definitions, where each FieldPort column is an array data type. The generated JSON structure will contain one JSON array for each FieldPort column and an additional 'time_stamp' array object containing the timestamps of the table values, as the example shows below. Note: All columns in the time series array will share the same time stamps. The quality attribute from Time Series is not supported.
{
"<timeseries array port name>": {
"time_stamp": [<timestamp values associated with field port values>],
[<list of JSON objects for the field port definitions>]
}
}
Note: The time series values for each of the field port definitions must contain the same set of time stamps. If they do not, the platform will trigger an error.
Type: CompositePort
A CompositePort is a type of PortDefinition that contains other PortDefinitions. This is the overall structure of a CompositePort.
{
"comment":[<string>],
"portName":<string>,
"portType":<string>,
"variable":<boolean>,
"childrenPorts":[<List of PortDefinition>]
}
Extends from PortDefinition.
In addition to the common elements of a PortDefinition, a CompositePort holds an array of childrenPorts.
- Field: childrenPorts
- Contains a list of inner (children) PortDefinitions. When the variable flag is false, the generated JSON object (corresponding to the composite port) will contain each child port as a direct child node of the composite port, following this pattern.
{
"<composite port name>": {
<JSON structure from the Port-to-Field Map for the first child port definition>,
...
<JSON structure from the Port-to-Field Map for the last child port definition>
}
}
When the variable flag is true, the generated JSON object (corresponding to the composite port) will wrap the child nodes in an array under the composite port, following this pattern.
{
"<composite port name>": [
{
<JSON structure from Port-to-Field Map for the first child>,
...,
<JSON structure from Port-to-Field Map for the last child>
},
...,
{
<JSON structure from Port-to-Field Map for the first child>,
...,
<JSON structure from Port-to-Field Map for the last child>
}
]
}
Type: InputModel
An InputModel
is a named byte array that the analytic uses in its processing. The values for the model are managed as part of an orchestration configuration and are associated with an asset context.
{
  "modelPortName": <string>,
  "comments": [<string>]
}
See the following table for a description of the elements in an InputModel
.
Field | Description |
---|---|
modelPortName | The name the analytic uses to pull the model from the map of byte arrays. |
comments | (Optional) Comments describing the model. |
Template Examples
Analytic templates are used at runtime in conjunction with port-to-field map configurations to control the runtime engine's handling of the analytic input and output. See Samples of Hierarchical Analytic Templates and Port-to-Field Maps.
Samples of Hierarchical Analytic Templates and Port-to-Field Maps
An analytic template is used at runtime in conjunction with a port-to-field map to control how the framework's runtime engine will handle the analytic input and output. The following are samples of hierarchical analytic templates and port-to-field maps.
- Sample: DemoAdder Analytic
- Sample: Analytic Requiring a Time Series Table, Input Models, and Constants
- Sample: Analytic Requiring a Time Series Table With Variable Time Stamps
- Sample: Array of Arrays Model
- Sample: Multiple Iterations of Same Analytic
- Sample: Analytic Requiring Predix Time Series Data Using an Array of Tags
- Sample: Analytic Requiring Predix Time Series Data Using Variable Input Ports
- Sample: Reusing a Port-to-Field Map When Providing Data Is Optional
Sample: DemoAdder Analytic
The following is a sample JSON input.
{
"number1": <latest value from temperature sensor1>,
"number2": -55
}
The following is a sample JSON output.
{"result" : <result from analytic>}
The following is a sample hierarchical analytic template.
{
"analyticName": "add2Numbers",
"analyticVersion": "V1.0",
"inputPortDefinitions": [
{
"portType": "FIELD",
"portName": "number1",
"variable": false,
"required": true
},
{
"portType": "FIELD",
"portName": "number2",
"variable": false,
"required": true
}
],
"outputPortDefinitions": [
{
"portType": "FIELD",
"portName": "result",
"variable": false,
"required": true
}
]
}
The following is a sample port-to-field map.
{
"analyticName": "add2Numbers",
"analyticVersion": "V1.0",
"orchestrationStepId": "adjust temp by ambient",
"iterations": [
{
"inputMaps": [
{
"valueSourceType": "DATA_CONNECTOR",
"fullyQualifiedPortName": "number1",
"fieldId": "temperature sensor1",
"queryCriteria": {"start": 1464989425852, "end": 1464989425852},
"dataSourceId": "PredixTimeSeries"
},
{
"valueSourceType": "CONSTANT",
"fullyQualifiedPortName": "number2",
"value": -55
}
],
"outputMaps": [
{
"fullyQualifiedPortName": "result",
"fieldId": "adjusted temp",
"dataSourceId": "PredixTimeSeries"
}
]
}
]
}
Sample: Analytic Requiring a Time Series Table, Input Models, and Constants
Scenario: An analytic requires a time series table and two constants, where the time series input table has three fixed columns: column1, column2, and column3. Sample hierarchical analytic templates and port-to-field maps to support this scenario follow.
Note: The columns in the Time Series array will share the same time stamps. The quality attribute from Time Series is not supported.
The following is a sample JSON input.
{
"inputData": {
"data": {
"timeseries": {
"time_stamp": [
3242,
4242,
5242
],
"column1": [
10.0,
20.0,
30.0
],
"column2": [
1.0,
2.0,
3.0
],
"column3": [
0.1,
0.2,
0.3
]
},
"constants": {
"constant1": 55,
"constant2": 100
}
}
},
"inputModel": [
{
"modelPortName": "test-model-port-1",
"isModelEmbedded": "true",
"modelLib": "anVuay1tb2RlbC1iaW5hcnktMQ=="
},
{
"modelPortName": "test-model-port-2",
"isModelEmbedded": "true",
"modelLib": "janVuay1tb2RlbC1iaW5hcnktMg=="
}
]
}
The following is a sample hierarchical analytic template.
{
"analyticName": "analyticWith3ColumnsAnd2Constants",
"analyticVersion": "V1.0",
"inputPortDefinitions": [
{
"portType": "COMPOSITE",
"portName": "data",
"childrenPorts": [
{
"portName": "timeseries",
"portType": "TIMESERIES_ARRAY",
"variable": false,
"columns": [
{
"portName": "column1",
"portType": "FIELD",
"variable": false,
"dataType": "DOUBLE_ARRAY",
"required": true
},
{
"portName": "column2",
"portType": "FIELD",
"variable": false,
"dataType": "DOUBLE_ARRAY",
"required": true
},
{
"portName": "column3",
"portType": "FIELD",
"variable": false,
"dataType": "DOUBLE_ARRAY",
"required": true
}
]
},
{
"portType": "COMPOSITE",
"portName": "constants",
"childrenPorts": [
{
"portType": "FIELD",
"portName": "constant1",
"variable": false,
"dataSourceId": "PredixTimeSeries",
"required": true
},
{
"portType": "FIELD",
"portName": "constant2",
"variable": false,
"dataSourceId": "PredixTimeSeries",
"required": true
}
]
}
]
}
],
"inputModelDefinitions": [
{
"modelPortName": "test-model-port-1"
},
{
"modelPortName": "test-model-port-2"
}
],
"outputPortDefinitions" : [
{
"portType": "FIELD",
"portName": "results",
"variable": false,
"dataSourceId": "PredixTimeSeries"
}
]
}
The following is a sample port-to-field map.
{
"analyticName": "timeseriesAnalyticSampleWith3ColumnsAnd2Constants",
"analyticVersion": "V1.0",
"orchestrationStepId": "adjust temp by ambient",
"iterations": [
{
"inputMaps": [
{
"valueSourceType": "DATA_CONNECTOR",
"fullyQualifiedPortName": "data.timeseries.column1",
"fieldId": "temperature sensor",
"queryCriteria": {"start": 3242, "end": 5242},
"dataSourceId": "PredixTimeSeries"
},
{
"valueSourceType": "DATA_CONNECTOR",
"fullyQualifiedPortName": "data.timeseries.column2",
"fieldId": "vibration sensor",
"queryCriteria": {"start": 3242, "end": 5242},
"dataSourceId": "PredixTimeSeries"
},
{
"valueSourceType": "DATA_CONNECTOR",
"fullyQualifiedPortName": "data.timeseries.column3",
"fieldId": "pressure sensor",
"queryCriteria": {"start": 3242, "end": 5242},
"dataSourceId": "PredixTimeSeries"
},
{
"valueSourceType": "CONSTANT",
"fullyQualifiedPortName": "data.constants.constant1",
"value": -55
},
{
"valueSourceType": "CONSTANT",
"fullyQualifiedPortName": "data.constants.constant2",
"value": 100
}
],
"inputModelMaps": [
{
"modelPortName": "test-model-port-1",
"modelName": "test-model-1",
"modelVersion": "v1"
},
{
"modelPortName": "test-model-port-2",
"modelName": "test-model-2",
"modelVersion": "v1"
}
],
"outputMaps": [
{
"fullyQualifiedPortName": "results",
"fieldId": "xxxField",
"dataSourceId": "PredixTimeSeries"
}
]
}
]
}
Sample: Analytic Requiring a Time Series Table With Variable Time Stamps
Scenario: An analytic requires time series data where each input table (temperature, pressure, and vibration) has its own time stamps. Sample hierarchical analytic templates and port-to-field maps to support this scenario follow.
Note: The columns in the Time Series array will not share the same time stamps. The quality attribute from Time Series is not supported.
The following is a sample JSON input.
{
"time_series_temperature": {
"temperature": [
100
],
"time_stamp": [
"1455733669601"
]
},
"time_series_pressure": {
"pressure": [
57
],
"time_stamp": [
"1455733669605"
]
},
"time_series_vibration": {
"vibration": [
14
],
"time_stamp": [
"1455733669609"
]
}
}
The following is a sample hierarchical analytic template.
{
"analyticName": "analytic-name",
"analyticVersion": "1.0",
"inputPortDefinitions": [
{
"portName": "time_series_temperature",
"portType": "TIMESERIES_ARRAY",
"required": true,
"variable": false,
"columns": [
{
"portName": "temperature",
"portType": "FIELD",
"variable": false,
"dataType": "DOUBLE_ARRAY",
"required": true
}
]
},
{
"portName": "time_series_pressure",
"portType": "TIMESERIES_ARRAY",
"required": true,
"variable": false,
"columns": [
{
"portName": "pressure",
"portType": "FIELD",
"variable": false,
"dataType": "DOUBLE_ARRAY",
"required": true
}
]
},
{
"portName": "time_series_vibration",
"portType": "TIMESERIES_ARRAY",
"required": true,
"variable": false,
"columns": [
{
"portName": "vibration",
"portType": "FIELD",
"variable": false,
"dataType": "DOUBLE_ARRAY",
"required": true
}
]
}
]
}
The following is a sample port-to-field map.
{
"analyticName": "analytic-name",
"analyticVersion": "1.0",
"orchestrationStepId": "sid-10001",
"iterations": [
{
"inputMaps": [
{
"valueSourceType": "DATA_CONNECTOR",
"fullyQualifiedPortName": "time_series_temperature.temperature",
"fieldId": "field-7071",
"queryCriteria": {
"start": 1455733669601,
"end": 1455733669610
},
"engUnit": "Celsius",
"required": true,
"dataSourceId": "Predix Time Series"
},
{
"valueSourceType": "DATA_CONNECTOR",
"fullyQualifiedPortName": "time_series_pressure.pressure",
"fieldId": "field-2934",
"queryCriteria": {
"start": 1455733669601,
"end": 1455733669610
},
"engUnit": "Celsius",
"required": true,
"dataSourceId": "Predix Time Series"
},
{
"valueSourceType": "DATA_CONNECTOR",
"fullyQualifiedPortName": "time_series_vibration.vibration",
"fieldId": "field-9342",
"queryCriteria": {
"start": 1455733669601,
"end": 1455733669610
},
"engUnit": "Celsius",
"required": true,
"dataSourceId": "Predix Time Series"
}
],
"outputMaps": [
{
"fullyQualifiedPortName": "data.result",
"fieldId": "field-7432",
"engUnit": "Celsius",
"dataSourceId": "Predix Time Series"
}
]
}
]
}
Sample: Array of Arrays Model
This example shows how you can create a hierarchical analytic template that represents JSON input data as an array of arrays when starting with input data similar to the following:
"Target_data": [
[-1.3902596851290876, -6.765816812041471, -4.76418388526832, -1.2112080900902589, -1.7569610822677713],
[9.7817179641162, -1.145749275218344, 6.533089409154036, 6.549142463394793, -5.168794582523],
[-0.44253227413493945, -6.43586397529881, -3.291613230961093, 1.2406735399676179, -1.4659275851421456],
[0.01589916825431556, -4.695552518057381, -5.284477375577028, -1.9252859877753656, -2.865932498974862]....
Start by marking the analytic input port as a variable port in the analytic template.
{
"analyticName": "analytic-with-variable-input-port",
"analyticVersion": "1.0",
"inputPortDefinitions": [
{
"portType": "FIELD",
"portName": "Target_data",
"variable": true,
"dataType": "DOUBLE_ARRAY",
"required": true
}
],
"outputPortDefinitions": [
{
"portName": "output",
"portType": "FIELD",
"variable": false,
"dataType": "DOUBLE_ARRAY",
"required": true
}
]
}
The following is a sample port-to-field map.
{
"analyticName": "analytic-with-variable-input-port",
"analyticVersion": "1.0",
"orchestrationStepId": "sid-10001",
"iterations": [
{
"inputMaps": [
{
"valueSourceType": "DATA_CONNECTOR",
"fullyQualifiedPortName": "Target_data.0",
"fieldId": "KW",
"queryCriteria": {
"start": 1455733669605,
"end": 1455733669610
},
"engUnit": "kw",
"required": true,
"dataSourceId": "Predix Time Series"
},
{
"valueSourceType": "DATA_CONNECTOR",
"fullyQualifiedPortName": "Target_data.1",
"fieldId": "vibration",
"queryCriteria": {
"start": 1455733669605,
"end": 1455733669610
},
"engUnit": "hertz",
"required": true,
"dataSourceId": "Predix Time Series"
}
],
"outputMaps": [
{
"fullyQualifiedPortName": "output",
"fieldId": "bearing temperature final",
"engUnit": "Celsius",
"dataSourceId": "Predix Time Series"
}
]
}
]
}
When the orchestration is run, the data input will come from two time series tags. The data input will look as follows.
{
"Target_data": [
[
5,
6,
7,
8,
9,
10
],
[
500,
600,
700,
800,
900,
1000
]
]
}
You can add more data arrays to the top-level array by defining Target_data.2, Target_data.3, and so on.
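The indexed-port convention above (Target_data.0, Target_data.1, and so on) can be illustrated with a small sketch. This is hypothetical Python, not the framework's code; the field ids and values are placeholders.

```python
# Sketch: fold indexed port maps such as "Target_data.0" and "Target_data.1"
# into the array-of-arrays input shown above.
def assemble_variable_port(port_maps, field_values):
    result = {}
    for m in port_maps:
        # Split "Target_data.0" into the base port name and its index.
        base, _, index = m["fullyQualifiedPortName"].rpartition(".")
        result.setdefault(base, {})[int(index)] = field_values[m["fieldId"]]
    # Order the indexed entries into a plain list per port.
    return {k: [v[i] for i in sorted(v)] for k, v in result.items()}

maps = [
    {"fullyQualifiedPortName": "Target_data.0", "fieldId": "KW"},
    {"fullyQualifiedPortName": "Target_data.1", "fieldId": "vibration"},
]
values = {"KW": [5, 6, 7], "vibration": [500, 600, 700]}
combined = assemble_variable_port(maps, values)
# {"Target_data": [[5, 6, 7], [500, 600, 700]]}
```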
Sample: Multiple Iterations of Same Analytic
This example shows how you can run multiple iterations of the same analytic in an orchestration execution. The following is a sample port-to-field map.
{
"analyticName": "demo-timeseries-adder",
"analyticVersion": "V1",
"orchestrationStepId": "sid-10001",
"iterations": [
{
"id": "0",
"name": " First Iteration",
"inputMaps": [
{
"valueSourceType": "DATA_CONNECTOR",
"fullyQualifiedPortName": "data.time_series.numberArray1",
"fieldId": "temperature sensor",
"queryCriteria": {
"start": 0,
"end": -1
},
"dataSourceId": "PredixTimeSeries"
},
{
"valueSourceType": "DATA_CONNECTOR",
"fullyQualifiedPortName": "data.time_series.numberArray2",
"fieldId": "vibration sensor",
"queryCriteria": {
"start": 0,
"end": -1
},
"dataSourceId": "PredixTimeSeries"
}
],
"outputMaps": [
{
"fullyQualifiedPortName": "data.time_series.sum",
"fieldId": "demo sum",
"dataSourceId": "PredixTimeSeries"
}
]
},
{
"id": "1",
"name": " Second Iteration",
"inputMaps": [
{
"valueSourceType": "DATA_CONNECTOR",
"fullyQualifiedPortName": "data.time_series.numberArray1",
"fieldId": "temperature sensor",
"queryCriteria": {
"start": 0,
"end": -1
},
"dataSourceId": "PredixTimeSeries"
},
{
"valueSourceType": "DATA_CONNECTOR",
"fullyQualifiedPortName": "data.time_series.numberArray2",
"fieldId": "vibration sensor",
"queryCriteria": {
"start": 0,
"end": -1
},
"dataSourceId": "PredixTimeSeries"
}
],
"outputMaps": [
{
"fullyQualifiedPortName": "data.time_series.sum",
"fieldId": "demo sum",
"dataSourceId": "PredixTimeSeries"
}
]
}
]
}
Sample: Analytic Requiring Predix Time Series Data Using an Array of Tags
This example shows how you can run an analytic that uses Predix Time Series data that is an array of tags. For steps to run the orchestration request, see afs-orchestration-execution.html#task_9d270d7f-4d4a-4e07-9239-f121d17f1c8d.
The following is a sample hierarchical analytic template.
{
"analyticName": "java-timeseries-demo-adder",
"analyticVersion": "1.0",
"inputPortDefinitions": [
{
"portType": "COMPOSITE",
"portName": "data",
"variable": false,
"childrenPorts": [
{
"portName": "time_series_1",
"portType": "TIMESERIES_ARRAY",
"required": true,
"variable": false,
"columns": [
{
"portName": "doubleArray",
"portType": "FIELD",
"variable": true,
"dataType": "DOUBLE_ARRAY",
"required": true
}
]
},
{
"portName": "time_series_2",
"portType": "TIMESERIES_ARRAY",
"required": true,
"variable": false,
"columns": [
{
"portName": "integerArray",
"portType": "FIELD",
"variable": true,
"dataType": "INTEGER_ARRAY",
"required": true
}
]
}
]
}
],
"outputPortDefinitions": [
{
"portName": "data",
"portType": "COMPOSITE",
"required": true,
"variable": false,
"childrenPorts": [
{
"portName": "time_series",
"portType": "TIMESERIES_ARRAY",
"required": true,
"variable": false,
"columns": [
{
"portName": "sum",
"portType": "FIELD",
"variable": false,
"dataType": "DOUBLE_ARRAY",
"required": true
}
]
}
]
}
]
}
The following is a sample port-to-field map.
{
"analyticName": "4-input-timeseries-adder",
"analyticVersion": "1.0",
"orchestrationStepId": "sid-10001",
"iterations": [
{
"inputMaps": [
{
"valueSourceType": "DATA_CONNECTOR",
"fullyQualifiedPortName": "data.time_series_1.doubleArray",
"fieldId": "doubleArrayFieldId",
"queryCriteria": {
"start": 1455733669601,
"end": 1455733669603
},
"engUnit": "hertz",
"required": true,
"variable": true,
"dataSourceId": "Predix Time Series"
},
{
"valueSourceType": "DATA_CONNECTOR",
"fullyQualifiedPortName": "data.time_series_2.integerArray",
"fieldId": "integerArrayFieldId",
"queryCriteria": {
"start": 1455733669901,
"end": 1455733669904
},
"engUnit": "kw",
"required": true,
"variable": true,
"dataSourceId": "Predix Time Series"
}
],
"outputMaps": [
{
"fullyQualifiedPortName": "data.time_series.sum",
"fieldId": "bearing temperature final",
"engUnit": "Celsius",
"dataSourceId": "Predix Time Series"
}
]
}
]
}
The array of tags is passed in through the assetDataFieldsMap
definition, as shown in the following example.
{
"orchestrationConfigurationId": "3a5cd1da-5b6e-47fd-b74d-9e75e959810f",
"assetDataFieldsMap": {
"doubleArrayFieldId": [
"timeseries_tag1",
"timeseries_tag2",
"timeseries_tag3"
],
"integerArrayFieldId": [
"timeseries_tag4",
"timeseries_tag5"
]
},
"dataSource": []
}
The following shows a sample of generated input with data.
{
"inputData": {
"data": {
"time_series_1": {
"time_stamp": [
1455733669601,
1455733669602,
1455733669603
],
"doubleArray": [
[
500,
600,
700
],
[
5000,
6000,
7000
],
[
50000,
60000,
70000
]
]
},
"time_series_2": {
"time_stamp": [
1455733669901,
1455733669902,
1455733669903,
1455733669904
],
"integerArray": [
[
105,
205,
305,
405
],
[
15,
25,
35,
45
]
]
}
}
}
}
Sample: Analytic Requiring Predix Time Series Data Using Variable Input Ports
This example shows how you can run an analytic that uses Predix Time Series data with a separate set of timestamps for each indexed block of data (variable input ports).
The following is a sample hierarchical template format with variable ports.
Note: When the portType value is TIMESERIES_ARRAY, the timestampShared value must be false to indicate that variable ports are being used.
{
"analyticName": "sample-hirerachical-analytic-template-with-two-variable-ports",
"analyticVersion": "1.0",
"inputPortDefinitions": [
{
"portType": "COMPOSITE",
"portName": "data",
"variable": false,
"childrenPorts": [
{
"portName": "time_series_1",
"portType": "TIMESERIES_ARRAY",
"required": true,
"variable": false,
"columns": [
{
"portName": "doubleArray",
"portType": "FIELD",
"variable": true,
"timestampShared": false,
"dataType": "DOUBLE_ARRAY",
"required": true
}
]
},
{
"portName": "time_series_2",
"portType": "TIMESERIES_ARRAY",
"required": true,
"variable": false,
"columns": [
{
"portName": "integerArray",
"portType": "FIELD",
"variable": true,
"timestampShared": false,
"dataType": "INTEGER_ARRAY",
"required": true
}
]
}
]
}
],
"outputPortDefinitions": [
{
"portName": "data",
"portType": "COMPOSITE",
"required": true,
"variable": false,
"childrenPorts": [
{
"portName": "time_series",
"portType": "TIMESERIES_ARRAY",
"required": true,
"variable": false,
"columns": [
{
"portName": "sum",
"portType": "FIELD",
"variable": false,
"dataType": "DOUBLE_ARRAY",
"required": true
}
]
}
]
}
]
}
The following is a sample port-to-field map showing that separate timestamps are needed for each block of data.
{
"analyticName": "sample-port-to-field-map-with-two-variable-ports",
"analyticVersion": "1.0",
"orchestrationStepId": "sid-10001",
"iterations": [
{
"inputMaps": [
{
"valueSourceType": "DATA_CONNECTOR",
"fullyQualifiedPortName": "data.time_series_1.doubleArray.0",
"fieldId": "vibration1",
"queryCriteria": {
"start": 1455733669601,
"end": 1455733669603
},
"engUnit": "hertz",
"required": true,
"dataSourceId": "Predix Time Series"
},
{
"valueSourceType": "DATA_CONNECTOR",
"fullyQualifiedPortName": "data.time_series_1.doubleArray.1",
"fieldId": "vibration2",
"queryCriteria": {
"start": 1455733669701,
"end": 1455733669705
},
"engUnit": "hertz",
"required": true,
"dataSourceId": "Predix Time Series"
},
{
"valueSourceType": "DATA_CONNECTOR",
"fullyQualifiedPortName": "data.time_series_1.doubleArray.2",
"fieldId": "vibration3",
"queryCriteria": {
"start": 1455733669801,
"end": 1455733669802
},
"engUnit": "hertz",
"required": true,
"dataSourceId": "Predix Time Series"
},
{
"valueSourceType": "DATA_CONNECTOR",
"fullyQualifiedPortName": "data.time_series_2.integerArray.0",
"fieldId": "kw1",
"queryCriteria": {
"start": 1455733669601,
"end": 1455733669603
},
"engUnit": "kw",
"required": true,
"dataSourceId": "Predix Time Series"
},
{
"valueSourceType": "DATA_CONNECTOR",
"fullyQualifiedPortName": "data.time_series_2.integerArray.1",
"fieldId": "kw2",
"queryCriteria": {
"start": 1455733669701,
"end": 1455733669705
},
"engUnit": "kw",
"required": true,
"dataSourceId": "Predix Time Series"
}
],
"outputMaps": [
{
"fullyQualifiedPortName": "data.time_series.sum",
"fieldId": "bearing temperature final",
"engUnit": "Celsius",
"dataSourceId": "Predix Time Series"
}
]
}
]
}
The following is sample generated analytic input for a hierarchical template showing separate timestamps side-by-side with values.
{
"data": {
"time_series_1": {
"doubleArray": [
{
"time_stamp": [
1455733669601,
1455733669602,
1455733669603
],
"values": [
500,
600,
700
]
},
{
"time_stamp": [
1455733669701,
1455733669702,
1455733669703,
1455733669704,
1455733669705
],
"values": [
50,
60,
70,
80,
90
]
},
{
"time_stamp": [
1455733669801,
1455733669802
],
"values": [
1000,
2000
]
}
]
},
"time_series_2": {
"integerArray": [
{
"time_stamp": [
1455733669101,
1455733669102,
1455733669103,
1455733669104
],
"values": [
105,
205,
305,
405
]
},
{
"time_stamp": [
1455733669101,
1455733669202,
1455733669203,
1455733669204,
1455733669205,
1455733669206
],
"values": [
15,
25,
35,
45,
55,
65
]
}
]
}
}
}
When creating the orchestration execution request payload, pass the array type of data in assetDataFieldsMap
for the fieldId
. For example,
{
"orchestrationConfigurationId": "3a5cd1da-5b6e-47fd-b74d-9e75e959810f",
"assetDataFieldsMap": {
"doubleArrayFieldId": [
"timeseries_tag1",
"timeseries_tag2",
"timeseries_tag3"
],
"integerArrayFieldId": [
"timeseries_tag4",
"timeseries_tag5"
]
},
"dataSource": []
}
The following is sample generated input for two time series variable ports.
{
"inputData": {
"data": {
"time_series_1": {
"time_stamp": [
1455733669601,
1455733669602,
1455733669603
],
"doubleArray": [
[
500,
600,
700
],
[
5000,
6000,
7000,
8000,
9000
],
[
50000,
60000
]
]
},
"time_series_2": {
"time_stamp": [
1455733669101,
1455733669102,
1455733669103,
1455733669104
],
"integerArray": [
[
105,
205,
305,
405
],
[
15,
25,
35,
45,
55,
65
]
]
}
}
}
}
Sample: Reusing a Port-to-Field Map When Providing Data Is Optional
Scenario: You may want to use the same port-to-field map for multiple deployments but need to vary which ports will have data mapped for certain iterations. In the InputMaps
definition, you can set the required
field to false
to indicate that data is optional. When a port is marked optional (required: false) and data is not provided during execution, the port is ignored and execution does not fail for missing data. The following examples show how you can achieve this.
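The optional-port behavior can be sketched as follows. This is a hypothetical Python illustration; the fetch callback, field ids, and port names are assumptions, not framework API.

```python
# Sketch: input maps marked "required": false are skipped when the data
# source returns nothing, so one port-to-field map can serve deployments
# with and without the optional field.
def resolve_input_maps(input_maps, fetch):
    resolved = {}
    for m in input_maps:
        data = fetch(m["fieldId"])
        if data is None:
            if m.get("required", True):
                raise RuntimeError(f"missing data for required field {m['fieldId']}")
            continue  # optional port with no data: ignored
        resolved[m["fullyQualifiedPortName"]] = data
    return resolved

maps = [
    {"fullyQualifiedPortName": "data.time_series.numberArray1",
     "fieldId": "KW", "required": True},
    {"fullyQualifiedPortName": "data.time_series.numberArray3",
     "fieldId": "spare", "required": False},
]
available = {"KW": [5.0, 6.0]}
resolved = resolve_input_maps(maps, available.get)
# {"data.time_series.numberArray1": [5.0, 6.0]}
```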
The following is a sample hierarchical analytic template.
{
"analyticName": "java-timeseries-demo-adder",
"analyticVersion": "1.0",
"inputPortDefinitions": [
{
"portType": "COMPOSITE",
"portName": "data",
"variable": false,
"childrenPorts": [
{
"portName": "time_series",
"portType": "TIMESERIES_ARRAY",
"required": true,
"variable": false,
"columns": [
{
"portName": "numberArray1",
"portType": "FIELD",
"variable": false,
"dataType": "DOUBLE_ARRAY",
"required": true
},
{
"portName": "numberArray2",
"portType": "FIELD",
"variable": false,
"dataType": "DOUBLE_ARRAY",
"required": true
},
{
"portName": "numberArray3",
"portType": "FIELD",
"variable": false,
"dataType": "DOUBLE_ARRAY",
"required": false
}
]
}
]
}
],
"outputPortDefinitions": [
{
"portName": "data",
"portType": "COMPOSITE",
"required": true,
"variable": false,
"childrenPorts": [
{
"portName": "time_series",
"portType": "TIMESERIES_ARRAY",
"required": true,
"variable": false,
"columns": [
{
"portName": "sum",
"portType": "FIELD",
"variable": false,
"dataType": "DOUBLE_ARRAY",
"required": true
},
{
"portName": "diff",
"portType": "FIELD",
"variable": false,
"dataType": "DOUBLE_ARRAY",
"required": false
}
]
}
]
}
]
}
The following is a sample descriptive analytic template.
{
"name": "TimeseriesAdder",
"description": "",
"packageName": "",
"codeLanguage": "JAVA",
"version": "1.0.0",
"author": "Predix Analytics team",
"inputDataPortDefinitions": [
{
"name": "time_series",
"description": "",
"dataType": "TimeSeries",
"valueRequired": true,
"variable": true,
"arrayDimensions": 0
},
{
"timeSeriesName": "time_series",
"name": "numberArray1",
"description": "",
"dataType": "Double",
"valueRequired": true,
"variable": true,
"validators": [
"NotNull()",
"IsNumber()"
],
"arrayDimensions": 1
},
{
"timeSeriesName": "time_series",
"name": "numberArray2",
"description": "",
"dataType": "Double",
"valueRequired": true,
"variable": true,
"validators": [
"NotNull()",
"IsNumber()"
],
"arrayDimensions": 1
},
{
"timeSeriesName": "time_series",
"name": "numberArray3",
"description": "",
"dataType": "Double",
"valueRequired": false,
"variable": true,
"validators": [
"NotNull()",
"IsNumber()"
],
"arrayDimensions": 1
}
],
"outputDataPortDefinitions": [
{
"timeSeriesName": "time_series",
"name": "sum",
"description": "",
"dataType": "Custom",
"valueRequired": true,
"variable": true,
"arrayDimensions": 1
},
{
"timeSeriesName": "time_series",
"name": "diff",
"description": "",
"dataType": "Boolean",
"valueRequired": false,
"variable": true,
"validators": [
"NotNull()",
"IsNumber()"
],
"arrayDimensions": 1
}
]
}
The following is a sample port-to-field map.
{
"analyticName": "4-input-timeseries-adder",
"analyticVersion": "1.0",
"orchestrationStepId": "sid-10001",
"iterations": [
{
"inputMaps": [
{
"valueSourceType": "DATA_CONNECTOR",
"fullyQualifiedPortName": "data.time_series.numberArray1",
"fieldId": "KW",
"queryCriteria": {
"start": 1455733669605,
"end": 1455733669610
},
"engUnit": "kw",
"required": true,
"dataSourceId": "Predix Time Series"
},
{
"valueSourceType": "DATA_CONNECTOR",
"fullyQualifiedPortName": "data.time_series.numberArray2",
"fieldId": "vibration",
"queryCriteria": {
"start": 1455733669605,
"end": 1455733669610
},
"engUnit": "hertz",
"required": true,
"dataSourceId": "Predix Time Series"
},
{
"valueSourceType": "DATA_CONNECTOR",
"fullyQualifiedPortName": "data.time_series.numberArray3",
"fieldId": "vibration",
"queryCriteria": {
"start": 1455733669605,
"end": 1455733669610
},
"engUnit": "hertz",
"required": false,
"dataSourceId": "Predix Time Series"
}
],
"outputMaps": [
{
"fullyQualifiedPortName": "data.time_series.sum",
"fieldId": "bearing temperature final",
"engUnit": "Celsius",
"required": true,
"dataSourceId": "Predix Time Series"
},
{
"fullyQualifiedPortName": "data.time_series.diff",
"fieldId": "windspeed final",
"engUnit": "km",
"required": false,
"dataSourceId": "Predix Time Series"
}
]
}
]
}
Sample generated input with data for optional (required: false
) field.
{
"data": {
"time_series": {
"time_stamp": [
1455733669605,
1455733669606,
1455733669607,
1455733669608,
1455733669609,
1455733669610
],
"numberArray1": [
5.0,
6.0,
7.0,
8.0,
9.0,
10.0
],
"numberArray2": [
500.0,
600.0,
700.0,
800.0,
900.0,
1000.0
],
"numberArray3": [
50.0,
60.0,
70.0,
80.0,
90.0,
100.0
]
}
}
}
Sample generated input without data for optional (required: false
) field.
{
"data": {
"time_series": {
"time_stamp": [
1455733669605,
1455733669606,
1455733669607,
1455733669608,
1455733669609,
1455733669610
],
"numberArray1": [
5.0,
6.0,
7.0,
8.0,
9.0,
10.0
],
"numberArray2": [
500.0,
600.0,
700.0,
800.0,
900.0,
1000.0
]
}
}
}
Port-To-Field Map Reference
An analytic template defines the format of its input and output JSON structure. The port-to-field map tells the runtime engine how to get the values to insert into the analytic’s input JSON structure and where to write the values from the analytic's output JSON structure. The port-to-field map simply maps FieldPort
entries from the analytic's template to data sources and sinks.
The port-to-field map is itself a JSON structure. The JSON objects in this structure are summarized in the following tables.
Type: PortToFieldMap
This is the overall structure of a port-to-field map.
{
"analyticName":<string>,
"analyticVersion":<string>,
"comment":<string>,
"orchestrationStepId":<string>,
"iterations":[<list of Iterations>]
}
See the following table for a description of the elements in a PortToFieldMap
.
Field | Description |
---|---|
analyticName | Informational only. It does not correlate with any values in the input/output JSON structure. |
analyticVersion | Informational only. It does not correlate with any values in the input/output JSON structure. |
comment | (Optional) Informational only. It does not correlate with any values in the input/output JSON structure. |
orchestrationStepId | Informational only. It does not correlate with any values in the input/output JSON structure. |
iterations | Create one iteration entry as described below. The Analytics Framework also supports multiple iterations of the same analytic. |
Type: Iteration
An iteration
defines the set of input/output PortMaps
for an execution of the analytic. Fill in the "inputMaps
" array with the list of PortMaps
that define how to get the input value and fill in the "outputMaps
" with the list of OutputDataConnectorMaps
that define where to write the output values.
This is the overall structure of an iteration object.
{
"inputMaps":[<list of PortMaps>],
"outputMaps":[<list of OutputDataConnectorMaps>],
"inputModelMaps":[<list of ModelPortMaps>],
"id":<string>,
"name":<string>
}
See the following table for a description of the elements in an Iteration
.
Field | Description |
---|---|
inputMaps | The list of maps that associate (map) fields from the data source with objects in the JSON input structure. |
outputMaps | The list of maps that map objects from the analytic's JSON output structure to fields in the data sink. |
inputModelMaps | The list of maps that identify models to be passed to the analytic at runtime. |
id | (Optional) The analytic iteration id; it must be unique. Useful for tracking an iteration that contains an error. Type: string. |
name | (Optional) The description for the iteration. Type: string. |
Type: PortMap
Maps an object in the JSON input structure to a constant value or a field in the data source.
{
"valueSourceType":<string>,
"fullyQualifiedPortName":<string>
}
InputConstantMap
InputDataConnectorMap
See the following table for a description of the common elements in a PortMap
.
Field | Description |
---|---|
valueSourceType | A flag indicating whether the map is for a constant value ("CONSTANT") or maps the JSON object to a field in the data source ("DATA_CONNECTOR"). |
fullyQualifiedPortName | Identifies the JSON object in the analytic's input or output JSON structure that will hold the value from the field. The port name is a fully qualified ('.' delimited) string capturing the path from the root node in the JSON structure to the JSON object that will contain the value from the field in the data source. If the port was defined to be variable in the analytic template, the analytic expects an array of field values. In this case, each field value needs a separate PortMap with an index ('.0', '.1', etc.) at the end of the fullyQualifiedPortName. |
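Resolving a fullyQualifiedPortName into the analytic input JSON can be sketched as follows. This is an illustrative Python snippet, not the framework's actual implementation; the port names and values are placeholders.

```python
# Sketch: write a field value into the analytic input JSON at the location
# named by a fullyQualifiedPortName, walking the '.'-delimited path from
# the root node.
def set_port_value(root: dict, fully_qualified_port_name: str, value) -> None:
    parts = fully_qualified_port_name.split(".")
    node = root
    for part in parts[:-1]:
        node = node.setdefault(part, {})  # create intermediate objects as needed
    node[parts[-1]] = value

payload = {}
set_port_value(payload, "data.constants.constant1", -55)
set_port_value(payload, "data.timeseries.column1", [10.0, 20.0, 30.0])
# payload == {"data": {"constants": {"constant1": -55},
#                      "timeseries": {"column1": [10.0, 20.0, 30.0]}}}
```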
Type: InputConstantMap
An InputConstantMap
is a type of PortMap
to be used when the data to be provided to the analytic can be stored directly within the port-to-field map itself. This is the overall structure of an input constant map.
{
"valueSourceType":<string>,
"fullyQualifiedPortName":<string>,
"value":<string>
}
Extends from: PortMap
.
In addition to the common elements of a PortMap
, an InputConstantMap
holds the value of a constant field:
- Field: value
- Contains the value to be passed to the constant field. This can be a numeric value (without quotes) or a string value (with quotes).
A constant input value can be parametrized and passed in at runtime using the customAttributes
field of OrchestrationExecutionRequest
. For example:
{
"valueSourceType":"CONSTANT",
"fullyQualifiedPortName":"data.abc",
"value":"${custom.CONSTANT_VALUE}"
}
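The placeholder substitution can be sketched as follows. This is a hypothetical Python illustration of the assumed ${custom.<NAME>} resolution against the request's customAttributes map; it is not the framework's code.

```python
import re

# Sketch: replace a "${custom.<NAME>}" placeholder in an InputConstantMap
# value with the matching entry from the request's customAttributes map.
def resolve_constant(value, custom_attributes):
    if isinstance(value, str):
        match = re.fullmatch(r"\$\{custom\.(\w+)\}", value)
        if match:
            return custom_attributes[match.group(1)]
    return value  # literal constants pass through unchanged

custom_attributes = {"CONSTANT_VALUE": 42}
resolve_constant("${custom.CONSTANT_VALUE}", custom_attributes)  # 42
resolve_constant(-55, custom_attributes)                         # -55
```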
Type: InputDataConnectorMap
An InputDataConnectorMap
is a type of PortMap
to be used when the data to be provided to the analytic needs to be retrieved from the data source at runtime by the Analytics Data Connector. This is the overall structure of an input data connector map.
{
"valueSourceType":<string>,
"fullyQualifiedPortName":<string>,
"fieldId":<string>,
"dataSourceId":<string>,
"queryCriteria":<string>,
"engUnit":<string>,
"variable":<boolean>,
"tagNameQuery":<string>
}
Extends from: PortMap
.
In addition to the common elements of a PortMap
, an InputDataConnectorMap
contains the following entries.
Field | Description |
---|---|
fieldId | The fieldId is a string that is mapped to a Predix Time Series tag id for a given asset id. At runtime the orchestration engine gets the Predix Time Series tag id for the field id, reads the value from Predix Time Series, and puts that value in the JSON object at the location corresponding to the fullyQualifiedPortName. |
dataSourceId | The identifier of the data source the field is read from, for example "Predix Time Series" (see the samples in this section). |
queryCriteria | The data retrieval instructions for this field. When using Predix Time Series, the contents must be consistent with the Predix Time Series /datapoints request format. See Customizing Query Criteria Attributes in Port-to-Field Map. |
engUnit | (Optional) The engineering units of the data in this field. |
variable | (Optional) True when the value in the analytic's input or output JSON object is a JSON array. The array's values will be enclosed in [ ]. Default value is false. |
tagNameQuery | The GEL query for querying Predix Asset for the Predix Time Series tag id, or a dynamic tag template. See Configuring Dynamic Tags in Port-to-Field Map. |
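Putting these entries together, a minimal input data connector map for a Predix Time Series field might look like the following sketch (the field id, time window, and units are illustrative, modeled on the samples later in this section):

```json
{
  "valueSourceType": "DATA_CONNECTOR",
  "fullyQualifiedPortName": "data.time_series.numberArray1",
  "fieldId": "KW",
  "dataSourceId": "Predix Time Series",
  "queryCriteria": {
    "start": 1455733669601,
    "end": 1455733669610
  },
  "engUnit": "kw",
  "variable": false
}
```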
Type: OutputDataConnectorMap
An OutputDataConnectorMap is used to identify the destination of the output data generated by the analytic at runtime. This is the overall structure of an output data connector map.
{
"fullyQualifiedPortName":<string>,
"fieldId":<string>,
"dataSourceId":<string>,
"tagNameQuery":<string>
}
An OutputDataConnectorMap contains the following entries.
Field | Description |
---|---|
fullyQualifiedPortName | Identifies the JSON object in the analytic's output JSON structure that will hold the value from the field. The port name is a fully qualified ('.' delimited) string capturing the path from the root node in the JSON analytic output data to the JSON object that will contain the value of the field to be stored in the data sink. If the port was defined to be variable in the analytic template, the analytic will produce an array of field values. In this case, each field value needs a separate OutputDataConnectorMap with an index ('.1', '.2', etc.) at the end of the fullyQualifiedPortName. |
fieldId | The field id is a string that is mapped to a Predix Time Series tag id for a given asset id. At runtime the orchestration engine gets the Predix Time Series tag id for the field id (described in the InputDataConnectorMap fieldId description), and writes the value from the JSON object at the location corresponding to the fullyQualifiedPortName to the Predix Time Series tag id. |
dataSourceId | For an OutputDataConnectorMap, multiple dataSourceIds can be specified using a comma-delimited list, for example "Temporary, Predix Timeseries" (see the samples in this section). |
queryCriteria | The data write instructions for this field. When writing to Predix Time Series, the contents must follow the Predix Time Series WebSocket write API structure. See Customizing Query Criteria Attributes in Port-to-Field Map. |
tagNameQuery | The GEL query for querying Predix Asset for the Predix Time Series tag id, or a dynamic tag template. See Configuring Dynamic Tags in Port-to-Field Map. |
Type: ModelPortMap
The ModelPortMap type is supported in hierarchical analytic templates only. It is not supported in descriptive analytic templates. A ModelPortMap is used to retrieve the specified model from the Runtime service. This model is then passed to the analytic at runtime by the Analytics Data Connector. This is the overall structure of a model port map.
{
"modelPortName": <string>,
"modelName": <string>,
"modelVersion": <string>
}
See the following table for a description of the elements in a ModelPortMap.
Field | Description |
---|---|
modelPortName | The name the analytic uses to pull the model from a map of byte arrays. |
modelName | The name of the model stored in the Configuration service. |
modelVersion | The version of the model stored in the Configuration service. |
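As an illustrative sketch (the port name, model name, and version are hypothetical), a model port map might look like:

```json
{
  "modelPortName": "analysisModel",
  "modelName": "pump-efficiency-model",
  "modelVersion": "v1"
}
```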
Customizing Query Criteria Attributes in Port-to-Field Map
The queryCriteria field (InputDataConnectorMap and OutputDataConnectorMap) in the port-to-field map provides the data retrieval instructions. The following examples show how to add custom attributes in the queryCriteria field when using Predix Time Series as a data source.
Pre-Defined System Attributes
System Attribute Name | Description |
---|---|
FIELD_ID | The field id of the port. |
ASSET_ID | The asset instance id of the current orchestration run. |
ANALYTIC_ID | The analytic id of the current orchestration step. |
ORCHESTRATION_REQUEST_ID | The request id of the current orchestration run. |
ORCHESTRATION_CONFIGURATION_ID | The orchestration configuration id of the current orchestration run. |
ORCHESTRATION_STEP_ID | The orchestration step id of the current orchestration run. |
PORT_TO_FIELD_MAP_NAME | The port-to-field map name of the current orchestration step. |
Use Latest Data Point
Retrieve the latest data point from the Predix Time Series service by omitting the start and end time in the port-to-field map's queryCriteria field. The following example shows how.
{
"analyticName": "timeseriesAnalyticSampleWith3ColumnsAnd2Constants",
"analyticVersion": "V1.0",
"orchestrationStepId": "adjust temp by ambient",
"iterations": [
{
"inputMaps": [
{
"valueSourceType": "DATA_CONNECTOR",
"fullyQualifiedPortName": "data.otherData.multipleTimeseries.0.MTSColumn1.0",
"fieldId": "pressure sensor1",
"queryCriteria": {
"tags": [{
"limit": 7,
"order": "asc",
"aggregations": [{
........
}]
}]
},
"required": true,
"dataSourceId": "Predix Time Series"
},
{
"valueSourceType": "DATA_CONNECTOR",
"fullyQualifiedPortName": "data.otherData.multipleTimeseries.0.MTSColumn1.1",
"fieldId": "pressure sensor1",
"queryCriteria": {},
"dataSourceId": "Predix Time Series"
},
......
Use Parametrized Variables
You can use parametrized variables in the queryCriteria field that are resolved at orchestration runtime. The following port-to-field map example shows how to do this in an InputDataConnectorMap. The data must be consistent with the Predix Time Series /datapoints request format.
{
"analyticName": "4-input-timeseries-adder",
"analyticVersion": "1.0",
"orchestrationStepId": "sid-10001",
"iterations": [
{
"inputMaps": [
{
"valueSourceType": "DATA_CONNECTOR",
"fullyQualifiedPortName": "data.time_series.numberArray1",
"fieldId": "KW",
"queryCriteria": {
"start": "${custom.KW.START_TIME}",
"end": "${custom.KW.END_TIME}",
"tags": [{
"limit": 1000,
"order": "desc",
"aggregations": [{
"type": "${custom.AGGREGATION_TYPE}",
"interval": "${custom.INTERVAL}"
}],
"groups": [{
"name": "${ATTRIBUTE_1_KEY}",
"attributes": [
"${context.ASSET_ID}_${context.FIELD_ID}_${custom.KW.ATTRIBUTE_1_VALUE_1}",
"${context.ASSET_ID}_${context.FIELD_ID}_${custom.KW.ATTRIBUTE_1_VALUE_2}"
]
}]
}]
},
.......
{
"valueSourceType": "DATA_CONNECTOR",
"fullyQualifiedPortName": "data.time_series.numberArray2",
"fieldId": "vibration",
"queryCriteria": {
"start": "${custom.vibration.START_TIME}",
"end": "${custom.vibration.END_TIME}",
"tags": [{
"limit": 1000,
"order": "desc",
"aggregations": [{
"type": "${custom.AGGREGATION_TYPE}",
"interval": "${custom.INTERVAL}"
}],
"groups": [{
"name": "${ATTRIBUTE_1_KEY}",
"attributes": [
"${context.ASSET_ID}_${context.FIELD_ID}_${custom.vibration.ATTRIBUTE_1_VALUE_1}",
"${context.ASSET_ID}_${context.FIELD_ID}_${custom.vibration.ATTRIBUTE_1_VALUE_2}"
]
}]
}]
},
When defining the OutputDataConnectorMap, note the following restrictions.
- Output only supports key-value pairs of String datatype.
- You must follow the Predix Time Series WebSocket write API structure.
- The messageId field is mandatory when writing to Time Series. Provide either a hard-coded value or a parametrized variable substituted at orchestration runtime.
- If the queryCriteria field is not specified for the output port, a system-generated messageId and the default attribute "source"="orchestration" key-value pair are added when writing to Time Series.
The following is a sample queryCriteria in an OutputDataConnectorMap.
........
"outputMaps": [
{
"fullyQualifiedPortName": "data.time_series.sum",
"fieldId": "bearing temperature final",
"engUnit": "Celsius",
"dataSourceId": "Predix Time Series",
"queryCriteria" : {
"messageId": "${custom.MESSAGE_ID}",
"body": [
{
"attributes": {
"${custom.ATTRIBUTE_1_KEY}" : "${context.ASSET_ID}_${context.FIELD_ID}_${custom.bearing temperature final.ATTRIBUTE_1_VALUE_1}",
"${custom.ATTRIBUTE_2_KEY}" : "${context.ASSET_ID}_${context.FIELD_ID}_${custom.bearing temperature final.ATTRIBUTE_1_VALUE_2}"
}
}
]
}
.......
Provide block dynamic parametrized variables in the queryCriteria field in an InputDataConnectorMap as follows.
{
"analyticName": "4-input-timeseries-adder",
"analyticVersion": "1.0",
"orchestrationStepId": "sid-10001",
"iterations": [
{
"inputMaps": [
{
"valueSourceType": "DATA_CONNECTOR",
"fullyQualifiedPortName": "data.time_series.numberArray1",
"fieldId": "KW",
"queryCriteria": "${custom.QUERY_CRITERIA1}",
.......
Provide block dynamic parametrized variables in the queryCriteria field in an OutputDataConnectorMap as follows.
........
"outputMaps": [
{
"fullyQualifiedPortName": "data.time_series.sum",
"fieldId": "bearing temperature final",
"engUnit": "Celsius",
"dataSourceId": "Predix Time Series",
"queryCriteria" : "${custom.QUERY_CRITERIA2}"
.......
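The ${custom.QUERY_CRITERIA1} and ${custom.QUERY_CRITERIA2} placeholders in the examples above are resolved from the orchestration run request's customAttributes. The following sketch assumes the whole block can be supplied as a JSON object under the matching key; the key names and values are illustrative:

```json
{
  "customAttributes": {
    "QUERY_CRITERIA1": { "start": 1455733669601, "end": 1455733669610 },
    "QUERY_CRITERIA2": {
      "messageId": "msg-001",
      "body": [ { "attributes": { "source": "orchestration" } } ]
    }
  }
}
```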
Configuring Dynamic Tags in Port-to-Field Map
When running an orchestration request using Predix Time Series tags, you can pass dynamic tags using customAttributes as key-value pairs. The following examples show how to do this.
Overview
The tagNameQuery field (InputDataConnectorMap and OutputDataConnectorMap) in the port-to-field map contains the GEL query for querying Predix Asset for the Predix Time Series tag id. Dynamic time series tags are supported by defining a dynamic tag template in the following format.
##${asset.sourceKey}.${context.portToFieldMapName}.${context.analyticId}.${custom.key}
Where:
- tagNameQuery
- Can contain either the dynamic time series tag or the asset GEL query to query the time series tag id.
- ##
- Indicates this is a dynamic tag. Use dot notation to identify the type of variable being used. The variables have a context/namespace corresponding to where they are defined. The following namespaces are supported: "asset", "context", "custom".
- ${asset.sourceKey}
- In the orchestration request, if the "assetSelectionFilter" is set to an asset GEL query, two assets will be returned as shown in the following sample. You can reference the asset attributes as ${asset.uri}, ${asset.sourceKey}, ${asset.name}, etc.
[
  {
    "uri": "/assets/32-3ed8356b-4c46-431e-b4e7-bb6371c39395",
    "classification": "/classifications/turbine",
    "sourceKey": "70101",
    "name": "Wolverine Ravine 1",
    "serial_number": "4610"
  },
  {
    "uri": "/assets/37-3ed8356b-4c46-431e-b4e7-bb6371c39395",
    "classification": "/classifications/turbine",
    "sourceKey": "70102",
    "name": "Wolverine Ravine 2",
    "serial_number": "4620"
  }
]
- ${context.portToFieldMapName}
- The "context" namespace includes: portToFieldMapName, analyticId, assetId, orchestrationStepId, orchestrationRequestId, orchestrationConfigurationId.
- ${custom.key}
- The value is passed in by OrchestrationExecutionRequest as "customAttributes". In the following example, ${custom.key} will be replaced with "Hello World".
{
  "orchestrationConfigurationId": "<Orchestration Configuration Id>",
  "assetId": null,
  "assetDataFieldsMap": null,
  "assetGroup": {
    "dataSourceId": "PredixAsset",
    "assetSelectionFilter": "/assets?filter=classification=/classifications/turbine:name=15sl-46606c64-619d-4db0-a059-bc2d879640ca<turbine_type"
  },
  "modelGroupKey": null,
  "dataSource": [
    {
      "dataSourceId": "Postgres Reference External Data Connector",
      "apiVersion": "v1",
      "baseUri": "http://localhost:18888"
    }
  ],
  "customAttributes": {
    "key": "Hello World"
  }
}
Note the following.
- During orchestration execution, if the "tagNameQuery" is configured as a dynamic tag in the port-to-field map, the defined format is processed as follows.
  - Orchestration context attributes, such as the ANALYTIC_ID value, will replace ${context.analyticId}.
  - Asset parameters from the orchestration execution request, such as sourceKey, will replace ${asset.sourceKey}.
  - portToFieldMap metadata attributes, such as portToFieldMapName, will replace ${context.portToFieldMapName}.
  - The processed value will then be used as the tag name to write the data to the data source, and the entry will also be added to the tag map. If the same fieldId is used as an input in another portToFieldMap, the tag name can be found for reading the data from the data source.
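As a worked sketch of this substitution, assume the template ##${asset.sourceKey}.${context.portToFieldMapName}.${context.analyticId}.${custom.key}, an asset whose sourceKey is 70101, a port-to-field map named pfm1, an analytic id a1, and customAttributes {"key": "attr1"} (all values illustrative). The resolved tag name written to the data source would be:

```
70101.pfm1.a1.attr1
```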
Sample: Dynamic Tag in Port-to-Field Map Output Port
The following is an example of a dynamic tag defined in the port-to-field map output port (OutputDataConnectorMap).
{
"analyticName": "java-timeseries-demo-adder",
"analyticVersion": "1.0",
"orchestrationStepId": "sid-10001",
"iterations": [
{
"inputMaps": [
{
"valueSourceType": "DATA_CONNECTOR",
"fullyQualifiedPortName": "data.time_series.numberArray1",
"fieldId": "KW",
"queryCriteria": {
"start": 1455733669601,
"end": 1455733669610
},
"engUnit": "kw",
"required": true,
"dataSourceId": "PredixTimeSeries"
},
{
"valueSourceType": "DATA_CONNECTOR",
"fullyQualifiedPortName": "data.time_series.numberArray2",
"fieldId": "vibration",
"queryCriteria": {
"start": 1455733669601,
"end": 1455733669610
},
"engUnit": "hertz",
"required": true,
"dataSourceId": "predixtimeseries"
}
],
"outputMaps": [
{
"fullyQualifiedPortName": "data.time_series.sum",
"fieldId": "bearing_temperature",
"tagNameQuery": "##${context.orchestrationRequestId}.${asset.serial_number}.${context.portToFieldMapName}.${context.fieldId}.${custom.ATTR_KEY1}",
"engUnit": "Celsius",
"dataSourceId": "Temporary, Predix Timeseries"
},
{
"fullyQualifiedPortName": "data.time_series.diff",
"fieldId": "local_windspeed",
"tagNameQuery": "##${context.orchestrationRequestId}.${asset.serial_number}.${context.portToFieldMapName}.${context.fieldId}.${custom.ATTR_KEY1}",
"engUnit": "Celsius",
"dataSourceId": "Temporary, Predix Timeseries"
}
]
}
]
}
Pre-Defined System Attributes
The following pre-defined system attributes and custom attributes (from orchestration run request) are supported for parameter substitution.
System Attribute Name | Description |
---|---|
fieldId | The field id of the port. |
assetId | The asset instance id of the current orchestration run. |
analyticId | The analytic id of the current orchestration step. |
orchestrationRequestId | The request id of the current orchestration run. |
orchestrationConfigurationId | The orchestration configuration id of the current orchestration run. |
orchestrationStepId | The orchestration step id of the current orchestration run. |
portToFieldMapName | The port-to-field-map name of the current orchestration step. |
Sample: Custom Attributes in Orchestration Run Request
The following is a sample orchestration run request with "customAttributes" as a key-value pair.
{
"orchestrationConfigurationId": "a7d1c09d-000c-4017-a073-8a3df9058ffc",
"assetId": null,
"assetDataFieldsMap": null,
"assetGroup": {
"dataSourceId": "PredixAsset",
"assetSelectionFilter": "/assets?filter=classification=/classifications/turbine:name=15sl-46606c64-619d-4db0-a059-bc2d879640ca<turbine_type"
},
"customAttributes": {
"ATTR_KEY1": "test_attr1"
}
}
Note the following.
- When the orchestration execution is completed, the dynamically created time series tags will be sent to the monitoring service.
- You can query the monitoring service to read details about the dynamically generated time series tags for each orchestration execution.
Sample: Dynamic Tags in Orchestration Execution Result from Monitoring Service Query Response
The following is a sample orchestration execution result detail received from a monitoring service query response.
{
"orchestrationRequestId": "74084eb3-680c-43b3-b8b6-e50979bb447d",
"status": "COMPLETED",
"orchestrationExecutionStatus": [
{
"contextId": "74084eb3-680c-43b3-b8b6-e50979bb447d-/assets/37-3ed8356b-4c46-431e-b4e7-bb6371c39395",
"assetId": "/assets/37-3ed8356b-4c46-431e-b4e7-bb6371c39395",
"status": "COMPLETED",
"orchestrationStepStatus": [
{
"status": "COMPLETED",
"analyticId": "adc54168-237e-4c13-8076-8de4359207a7",
"analyticName": "Java Timeseries Demo Adder With Model - c81928a3-e76b-400b-a0ec-8b1b0b86778c",
"analyticVersion": "v1",
"analyticRequestId": "8c84e2af-f85b-11e6-9d69-ba7f8cc76dfa-sid-10001",
"stepId": "sid-10001",
"startTime": 1487698276520,
"endTime": 1487698279280,
"output": null,
"errorResponse": null,
"fieldTagMap": {
"bearing_temperature": "74084eb3-680c-43b3-b8b6-e50979bb447d.4620.step1PortToFieldMapName.bearing_temperature.test_attr1",
"local_windspeed": "74084eb3-680c-43b3-b8b6-e50979bb447d.4620.step1PortToFieldMapName.local_windspeed.test_attr1"
}
}
],
"startTime": 1487698276460,
"endTime": 1487698280700,
"errorResponse": null
},
{
"contextId": "74084eb3-680c-43b3-b8b6-e50979bb447d-/assets/32-3ed8356b-4c46-431e-b4e7-bb6371c39395",
"assetId": "/assets/32-3ed8356b-4c46-431e-b4e7-bb6371c39395",
"status": "COMPLETED",
"orchestrationStepStatus": [
{
"status": "COMPLETED",
"analyticId": "adc54168-237e-4c13-8076-8de4359207a7",
"analyticName": "Java Timeseries Demo Adder With Model - c81928a3-e76b-400b-a0ec-8b1b0b86778c",
"analyticVersion": "v1",
"analyticRequestId": "8b8dff7d-f85b-11e6-9d69-ba7f8cc76dfa-sid-10001",
"stepId": "sid-10001",
"startTime": 1487698274900,
"endTime": 1487698278040,
"output": null,
"errorResponse": null,
"fieldTagMap": {
"bearing_temperature": "74084eb3-680c-43b3-b8b6-e50979bb447d.4610.step1PortToFieldMapName.bearing_temperature.test_attr1",
"local_windspeed": "74084eb3-680c-43b3-b8b6-e50979bb447d.4610.step1PortToFieldMapName.local_windspeed.test_attr1"
}
}
],
"startTime": 1487698274840,
"endTime": 1487698279810,
"errorResponse": null
}
],
"errorResponse": null
}
Custom Data Connector Reference
You will build a Custom Data Connector service when analytics in your orchestration rely upon an external data source. This data connector service must be reachable by the Predix cloud using HTTPS.
A Java-based reference implementation with PostgreSQL DB support is available. Use Reference Data Connector for PostgreSQL as a starting point to develop a Custom Data Connector service that uses a PostgreSQL data source.
- Data Format
- Type: DataRequest
- Type: DataResponse
- Type: Field
- Type: OrchestrationExecutionContext
- Type: ErrorResponse
- Sample AnalyticReadDataRequest
- Sample AnalyticReadDataResponse
- Sample AnalyticDataReadResponse with Error
- Sample AnalyticDataWriteRequest
- Sample AnalyticDataWriteResponse
- Sample AnalyticDataWriteResponse with Error
Data Format
This is the overall data format:
[ [ <EpochInMs>, <Measurement Value>, <Quality> ] ]
Where:
- <EpochInMs> — Epoch Time in milliseconds
- <Measurement Value> — Measurement data value
- <Quality> — Quality (Optional)
See the following table for the expected format for all the data types.
Data Type | Expected Data Format | Description |
---|---|---|
DOUBLE | [ [ null, 10.0 ] ] | Double is represented in standard time series format. Note: If the data type is DOUBLE, the system reads the measurement data value from index 1 (the second element). |
DOUBLE_ARRAY | [ [ null, 10.0 ] , [ null, 11.0] ] | |
TIMESERIES_ARRAY | [ [1435776300000, 2, 1], [1435776400000, null], [1435776500000, 10.5, 3] ] |
Payloads are expected as follows.
API | Request Payload Type | Response Payload Type |
---|---|---|
/api/v1/analytics/customdata/read | AnalyticReadDataRequest extends DataRequest | AnalyticReadDataResponse extends DataResponse |
/api/v1/analytics/customdata/write | AnalyticWriteDataRequest extends DataRequest | AnalyticWriteDataResponse extends DataResponse |
/api/v1/analytics/customdata/healthcheck |
Type: DataRequest
This is the overall structure of a DataRequest object.
{
"field": List<Field>,
"customAttributes": Object,
"systemAttributes": Object,
"orchestrationExecutionContext": OrchestrationExecutionContext,
"dataSourceId": String
}
See the following table for a description of the elements in a DataRequest.
Attribute | Description |
---|---|
Field | List of Fields |
CustomAttributes | User-defined JSON object. |
SystemAttributes | Map of analytics system-generated key/value pairs. Reserved for future use. |
OrchestrationExecutionContext | Orchestration execution context with system generated IDs to track the request within analytics services. |
DataSourceId | An external data connector service identifier, added for monitoring purposes. |
Type: DataResponse
This is the overall structure of a DataResponse object.
{
"field": List<Field>,
"orchestrationExecutionContext": OrchestrationExecutionContext,
"errorResponse": ErrorResponse,
"dataSourceId": String
}
See the following table for a description of the elements in a DataResponse object.
Attribute | Description |
---|---|
Field | List of Fields |
OrchestrationExecutionContext | Orchestration execution context with system generated IDs to track the request within analytics services. |
DataSourceId | An external data connector service identifier, added for monitoring purposes. |
ErrorResponse | Error message details. |
Type: Field
This is the overall structure of a Field object.
{
"fieldId": String,
"fullyQualifiedPortName": String,
"dataType": String,
"engUnit": String,
"data": Object,
"queryCriteria": Object,
"errorResponse": ErrorResponse
}
See the following table for a description of the elements in a Field object.
Attribute | Description |
---|---|
FieldId | Field identifier defined in port-to-field map. |
FullyQualifiedPortName | Unique port name to identify a port in the port-to-field map. |
DataType | Field data type, for example DOUBLE, DOUBLE_ARRAY, or TIMESERIES_ARRAY (see the Data Format table). |
EngUnit | Engineering Unit as defined in port-to-field map. |
QueryCriteria | A custom object to define the criteria to query the requested field as defined in port-to-field map. |
Data | The data value, in the format described in Data Format. |
ErrorResponse | If there is an error in processing the request, this attribute is updated with error details. |
Type: OrchestrationExecutionContext
This is the overall structure of an OrchestrationExecutionContext object.
{
"assetId": String,
"orchestrationConfigurationID": String,
"orchestrationExecutionRequestID": String,
"analyticId": String,
"analyticName": String,
"analyticVersion": String,
"analyticExecutionRequestID": String
}
See the following table for a description of the elements in an OrchestrationExecutionContext object.
Attribute | Description |
---|---|
assetId | The asset identifier. |
orchestrationConfigurationID | The orchestration configuration identifier. |
orchestrationExecutionRequestID | The orchestration execution request identifier. |
analyticId | The analytic catalog entry identifier. |
analyticName | The analytic name. |
analyticVersion | The analytic version. |
analyticExecutionRequestID | The analytic execution request identifier. |
Type: ErrorResponse
This is the overall structure of an ErrorResponse object.
{
"code": String,
"severity": String,
"detail": String,
"message": String
}
See the following table for a description of the elements in an ErrorResponse object.
Attribute | Description |
---|---|
code | The error code. |
severity | The error severity. |
message | A short error message. |
detail | A detailed error message with stack trace, etc. |
Sample AnalyticReadDataRequest
{
"field": [
{
"fieldId": "KW",
"fullyQualifiedPortName": "data.time_series.numberArray1",
"dataType": "DOUBLE_ARRAY",
"engUnit": "kw",
"data": [],
"queryCriteria": {
"columns": [
"recorded_at",
"data_value"
],
"table": "sensor_data",
"conditions": [
{
"key": "asset_id",
"value": "${ASSET_ID}",
"valueType": "string",
"relation": " = "
},
{
"key": "recorded_at",
"value": "current_timestamp",
"valueType": "none",
"relation": " < "
},
{
"key": "field_id",
"value": "KW",
"valueType": "string",
"relation": " = "
}
]
},
"errorResponse": null
},
{
"fieldId": "vibration",
"fullyQualifiedPortName": "data.time_series.numberArray2",
"dataType": "DOUBLE_ARRAY",
"engUnit": "hertz",
"data": [],
"queryCriteria": {
"columns": [
"recorded_at",
"data_value"
],
"table": "sensor_data",
"conditions": [
{
"key": "asset_id",
"value": "${ASSET_ID}",
"valueType": "string",
"relation": " = "
},
{
"key": "recorded_at",
"value": "current_timestamp",
"valueType": "none",
"relation": " < "
},
{
"key": "field_id",
"value": "vibration",
"valueType": "string",
"relation": " = "
}
]
},
"errorResponse": null
}
],
"customAttributes": {
"IS_GENERIC_SCHEMA": "TRUE"
},
"systemAttributes": null,
"orchestrationExecutionContext": {
"assetId": "/assets/32-90effe42-eb21-4611-b734-83f707d89d7a",
"orchestrationConfigurationID": "e06982f5-f446-46d2-bc21-9fdf64c111ab",
"orchestrationExecutionRequestID": "8f990980-b208-4418-99b1-f46d26cd4530",
"analyticId": "b9e999a1-8931-459d-8d34-2f76c2b9fd95",
"analyticName": null,
"analyticVersion": null,
"analyticExecutionRequestID": null
},
"dataSourceId": "Postgres Reference External Data Connector"
}
Sample AnalyticReadDataResponse
{
"field": [
{
"fieldId": "KW",
"fullyQualifiedPortName": "data.time_series.numberArray1",
"dataType": "DOUBLE_ARRAY",
"engUnit": "kw",
"data": [
[
1473366334967,
144.33
],
[
1473366334968,
244.33
],
[
1473366334969,
344.33
]
],
"queryCriteria": {
"columns": [
"recorded_at",
"data_value"
],
"table": "sensor_data",
"conditions": [
{
"key": "asset_id",
"value": "${ASSET_ID}",
"valueType": "string",
"relation": " = "
},
{
"key": "recorded_at",
"value": "current_timestamp",
"valueType": "none",
"relation": " < "
},
{
"key": "field_id",
"value": "KW",
"valueType": "string",
"relation": " = "
}
]
}
},
{
"fieldId": "vibration",
"fullyQualifiedPortName": "data.time_series.numberArray2",
"dataType": "DOUBLE_ARRAY",
"engUnit": "hertz",
"data": [
[
1473366334967,
1244.33
],
[
1473366334968,
2244.33
],
[
1473366334969,
3244.33
]
],
"queryCriteria": {
"columns": [
"recorded_at",
"data_value"
],
"table": "sensor_data",
"conditions": [
{
"key": "asset_id",
"value": "${ASSET_ID}",
"valueType": "string",
"relation": " = "
},
{
"key": "recorded_at",
"value": "current_timestamp",
"valueType": "none",
"relation": " < "
},
{
"key": "field_id",
"value": "vibration",
"valueType": "string",
"relation": " = "
}
]
}
}
],
"orchestrationExecutionContext": {
"assetId": "/assets/32-90effe42-eb21-4611-b734-83f707d89d7a",
"orchestrationConfigurationID": "e06982f5-f446-46d2-bc21-9fdf64c111ab",
"orchestrationExecutionRequestID": "8f990980-b208-4418-99b1-f46d26cd4530",
"analyticId": "b9e999a1-8931-459d-8d34-2f76c2b9fd95"
}
}
Sample AnalyticDataReadResponse with Error
{
"field": [
{
"fieldId": "KW",
"fullyQualifiedPortName": "data.time_series.numberArray1",
"dataType": "DOUBLE_ARRAY",
"engUnit": "kw",
"data": [],
"queryCriteria": {
"columns": [
"recorded_at",
"value"
],
"table": "sensor_data",
"conditions": [
{
"key": "asset_id",
"value": "${ASSET_ID}",
"valueType": "string",
"relation": " = "
},
{
"key": "recorded_at",
"value": "current_timestamp",
"valueType": "none",
"relation": " < "
},
{
"key": "field_id",
"value": "KW",
"valueType": "string",
"relation": " = "
}
]
},
"errorResponse": {
"code": "FIELD_EXCEPTION",
"message": "Unable to retrieve field KW. StatementCallback; bad SQL grammar [select recorded_at, value from sensor_data where asset_id = '/assets/32-3c686c25-2f57-4f13-8cf0-04bc6bb26866' and recorded_at < current_timestamp and field_id = 'KW']; nested exception is org.postgresql.util.PSQLException: ERROR: column \"value\" does not exist\n Position: 21",
"parameters": []
}
},
{
"fieldId": "vibration",
"fullyQualifiedPortName": "data.time_series.numberArray2",
"dataType": "DOUBLE_ARRAY",
"engUnit": "hertz",
"data": [
[
1473366334967,
1244.33
],
[
1473366334968,
2244.33
],
[
1473366334969,
3244.33
]
],
"queryCriteria": {
"columns": [
"recorded_at",
"data_value"
],
"table": "sensor_data",
"conditions": [
{
"key": "asset_id",
"value": "${ASSET_ID}",
"valueType": "string",
"relation": " = "
},
{
"key": "recorded_at",
"value": "current_timestamp",
"valueType": "none",
"relation": " < "
},
{
"key": "field_id",
"value": "vibration",
"valueType": "string",
"relation": " = "
}
]
}
}
],
"orchestrationExecutionContext": {
"assetId": "/assets/32-3c686c25-2f57-4f13-8cf0-04bc6bb26866",
"orchestrationConfigurationID": "6dfae6f5-8be1-4f90-b567-0970a34f2c53",
"orchestrationExecutionRequestID": "accab63f-da95-4ac4-8120-fcfca109011d",
"analyticId": "b9e999a1-8931-459d-8d34-2f76c2b9fd95"
}
}
Sample AnalyticDataWriteRequest
{
"field": [
{
"fieldId": "bearing temperature",
"fullyQualifiedPortName": "data.time_series.sum",
"dataType": "DOUBLE_ARRAY",
"engUnit": "Celsius",
"data": [
[
1473366334967,
1388.6599999999999,
"2"
],
[
1473366334968,
2488.66,
"2"
],
[
1473366334969,
3588.66,
"2"
]
],
"queryCriteria": {
"columns": [
"field_id",
"recorded_at",
"data_value"
],
"table": "sensor_data"
},
"errorResponse": null
}
],
"customAttributes": {
"IS_GENERIC_SCHEMA": "TRUE"
},
"systemAttributes": null,
"orchestrationExecutionContext": {
"assetId": "/assets/32-90effe42-eb21-4611-b734-83f707d89d7a",
"orchestrationConfigurationID": "e06982f5-f446-46d2-bc21-9fdf64c111ab",
"orchestrationExecutionRequestID": "8f990980-b208-4418-99b1-f46d26cd4530",
"analyticId": null,
"analyticName": null,
"analyticVersion": null,
"analyticExecutionRequestID": null
},
"dataSourceId": "Postgres Reference External Data Connector"
}
Sample AnalyticDataWriteResponse
{
"field": [
{
"fieldId": "bearing temperature",
"fullyQualifiedPortName": "data.time_series.sum",
"dataType": "DOUBLE_ARRAY",
"engUnit": "Celsius",
"data": [
[
1473366334967,
1388.6599999999999,
"2"
],
[
1473366334968,
2488.66,
"2"
],
[
1473366334969,
3588.66,
"2"
]
],
"queryCriteria": {
"columns": [
"field_id",
"recorded_at",
"data_value"
],
"table": "sensor_data"
}
}
],
"orchestrationExecutionContext": {
"assetId": "/assets/32-90effe42-eb21-4611-b734-83f707d89d7a",
"orchestrationConfigurationID": "e06982f5-f446-46d2-bc21-9fdf64c111ab",
"orchestrationExecutionRequestID": "8f990980-b208-4418-99b1-f46d26cd4530"
}
}
Sample AnalyticDataWriteResponse with Error
"field": [
{
"fieldId": "bearing temperature",
"fullyQualifiedPortName": "data.time_series.sum",
"dataType": "DOUBLE_ARRAY",
"engUnit": "Celsius",
"data": [
[
1473366334967,
1388.6599999999999,
"2"
],
[
1473366334968,
2488.66,
"2"
],
[
1473366334969,
3588.66,
"2"
]
],
"queryCriteria": {
"columns": [
"field_id",
"recorded_at",
"value"
],
"table": "sensor_data"
},
"errorResponse": {
"code": "FIELD_EXCEPTION",
"message": "Unable to update field bearing temperature. StatementCallback; bad SQL grammar [ insert into sensor_data(asset_id, field_id, recorded_at, value) values( '/assets/37-a57b74c1-28e1-44a5-b59f-7456411a7ab5', 'bearing temperature', to_timestamp(1473366334967::double precision/1000), 1388.6599999999999 ); insert into sensor_data(asset_id, field_id, recorded_at, value) values( '/assets/37-a57b74c1-28e1-44a5-b59f-7456411a7ab5', 'bearing temperature', to_timestamp(1473366334968::double precision/1000), 2488.66 ); insert into sensor_data(asset_id, field_id, recorded_at, value) values( '/assets/37-a57b74c1-28e1-44a5-b59f-7456411a7ab5', 'bearing temperature', to_timestamp(1473366334969::double precision/1000), 3588.66 )]; nested exception is java.sql.BatchUpdateException: Batch entry 0 insert into sensor_data(asset_id, field_id, recorded_at, value) values( '/assets/37-a57b74c1-28e1-44a5-b59f-7456411a7ab5', 'bearing temperature', to_timestamp(1473366334967::double precision/1000), 1388.6599999999999 ) was aborted. Call getNextException to see the cause.",
"parameters": []
}
}
],
"orchestrationExecutionContext": {
"assetId": "/assets/37-a57b74c1-28e1-44a5-b59f-7456411a7ab5",
"orchestrationConfigurationID": "eecc3b4a-274a-4fba-804a-c2d47f85e9f8",
"orchestrationExecutionRequestID": "a418c284-774a-4080-8c59-b2afd5d0cc27"
}
}
Data Conversion with Predix Time Series
The following table summarizes data conversion from Predix Time Series to analytic input.
Analytic Data Type | Conversion Logic |
---|---|
Boolean |
|
Integer |
|
Long |
|
Float |
|
Double |
|
String |
|
The following table summarizes the data type conversion of analytic output from the analytic template's input/output data types to Predix Time Series data types.
Analytic Data Type | Conversion Logic |
---|---|
Integer, Long, Float, Double |
|
Boolean |
|
String |
|
Orchestration Configuration Service API Model Reference
The following models are used when calling the Orchestration Configuration Service.
Orchestration Configuration Entry Model
Field | Field Type | Data Type | Comments |
---|---|---|---|
id | System-generated | String | Generated by the Orchestration Configuration service, the ID is returned as part of the response. This field is ignored in the POST request. In the PUT request, the ID is required, and it must match the ID in the REST endpoint path. |
name | Required | String | This field can contain 1-255 alphanumeric, space and '-' (dash) characters. This entry must start with a letter. The name must be unique. |
author | Required | String | Free-form string up to 255 characters. |
description | Optional | String | Free-form string up to 1024 characters. |
createdTimestamp | System-generated | String | Marks the time the analytic configuration entry was created. Generated by the Orchestration Configuration service, the timestamp is returned in ISO-8601 format as part of the response. |
updatedTimestamp | System-generated | String | Marks the time the analytic configuration entry was last updated. Generated by the Orchestration Configuration service, the timestamp is returned in ISO-8601 format as part of the response. |
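Following the field rules in the table above, a minimal POST body for creating an orchestration configuration entry could be assembled and locally validated as sketched below (the entry values are illustrative placeholders, and the validation mirrors, rather than replaces, the service's own checks):

```python
import json
import re

def validate_entry(entry):
    """Locally check an orchestration configuration entry against the
    field rules documented above before POSTing it."""
    name = entry.get("name", "")
    # 1-255 alphanumeric, space, and '-' characters, starting with a letter.
    if not re.fullmatch(r"[A-Za-z][A-Za-z0-9 \-]{0,254}", name):
        raise ValueError("invalid name: %r" % name)
    if not entry.get("author") or len(entry["author"]) > 255:
        raise ValueError("author is required (max 255 characters)")
    if len(entry.get("description", "")) > 1024:
        raise ValueError("description exceeds 1024 characters")
    return entry

entry = validate_entry({
    "name": "Compressor Health Orchestration",
    "author": "analytics-team",
    "description": "Runs the bearing-temperature analytic chain.",
})
body = json.dumps(entry)  # send as the POST request body
```

Note that `id`, `createdTimestamp`, and `updatedTimestamp` are omitted: they are system-generated and ignored in a POST request.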
Orchestration Artifact Model
Field | Field Type | Data Type | Comments |
---|---|---|---|
id | System-generated | String | Generated by the Orchestration Configuration service, the ID is returned as part of the response. This field is ignored in the POST request. In the PUT request, this ID is required, and it must match the ID in the REST endpoint path. |
name | Required | String | This field can contain 1-255 alphanumeric, space and '-' (dash) characters. This entry must start with a letter. The name must be unique. |
filename | Automatically populated | String | This field is the name of the attached artifact file. It is automatically populated by the Orchestration Configuration service. |
type | Required | String | Type of the artifact. Must be one of the following values: bpmn, iomap. |
originalBinaryFile | Required | Binary | Content of the attached artifact file. |
author | Required | String | Free-form string up to 255 characters. |
description | Optional | String | Free-form string up to 1024 characters. |
createdTimestamp | System-generated | String | Marks the time the analytic artifact entry was created. Generated by the Orchestration Configuration service, the timestamp is returned in ISO-8601 format as part of the response. |
updatedTimestamp | System-generated | String | Marks the time the analytic artifact entry was last updated. Generated by the Orchestration Configuration service, the timestamp is returned in ISO-8601 format as part of the response. |
Orchestration Execution (v2) Model
One of the following fields must be provided in the request: assetId, assetGroup, or assetDataFieldsMap.
Field | Field Type | Data Type | Comments |
---|---|---|---|
orchestrationConfigurationId | Required | String | ID of the orchestration configuration entry (containing the BPMN workflow XML describing the orchestration and the port-to-field maps defining the inputs and outputs for each analytic). |
assetId | One of three required | String | The asset ID (from an asset model stored in Predix Asset) from which the platform will retrieve the Time Series tags for data input and output during the orchestration run. |
assetGroup | One of three required | AssetGroup (see following table) | The definition of a group of assets for which the orchestration should be run. |
assetDataFieldsMap | One of three required | Map | A map that connects field IDs (found in the port-to-field maps) to Time Series tags. |
modelGroupKey | Required when using trained models | String | Key used for the trained model selection during runtime. This field can contain 1-255 alphanumeric, space and '-' (dash) characters. This entry must start with a letter. |
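A minimal execution request using the assetId variant could be built as follows (the IDs reuse the sample values from the error example earlier in this section; the local check on asset selectors is an illustrative convention, not a platform API):

```python
import json

# Hypothetical orchestration execution (v2) request using the assetId
# variant. One of assetId, assetGroup, or assetDataFieldsMap is supplied.
request = {
    "orchestrationConfigurationId": "eecc3b4a-274a-4fba-804a-c2d47f85e9f8",
    "assetId": "/assets/37-a57b74c1-28e1-44a5-b59f-7456411a7ab5",
}

# Local sanity check: exactly one asset-selection field is present.
selectors = [k for k in ("assetId", "assetGroup", "assetDataFieldsMap")
             if k in request]
assert len(selectors) == 1, "exactly one asset selector must be set"

body = json.dumps(request)  # send as the execution request body
```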
Asset Group Model
Field | Field Type | Data Type | Comments |
---|---|---|---|
dataSourceId | Required | String | The source type of the asset model. Currently, only “PredixAsset” is supported. |
assetSelectionFilter | Required | String | A Predix Asset query that will return a list of asset IDs. The platform will run the orchestration for each asset ID returned. |
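Using the assetGroup variant instead, the request carries a dataSourceId and an asset-selection query. The filter string below is a made-up example of a Predix Asset query, not a verified one:

```python
import json

# Hypothetical execution request using the assetGroup variant.
# The assetSelectionFilter value is an illustrative, made-up query.
request = {
    "orchestrationConfigurationId": "eecc3b4a-274a-4fba-804a-c2d47f85e9f8",
    "assetGroup": {
        "dataSourceId": "PredixAsset",  # currently the only supported source
        "assetSelectionFilter": "/assets?filter=classification=/classifications/compressor",
    },
}
body = json.dumps(request)
```

The platform runs the orchestration once for each asset ID the query returns.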
Trained Model
Field | Field Type | Data Type | Comments |
---|---|---|---|
id | System-generated | String | Generated by the Orchestration Configuration service, the ID is returned as part of the response. This field is ignored in the POST request. In the PUT request, this ID is required, and it must match the ID in the REST endpoint path. |
fileName | Automatically populated | String | This field is the name of the attached trained model file. It is automatically populated by the Orchestration Configuration service. |
modelKey | Required | String | Key used for the trained model selection during runtime. This field can contain 1-255 alphanumeric, space and '-' (dash) characters. This entry must start with a letter. |
modelVersion | Required | String | Version of the model library. This field can contain 1-255 alphanumeric, space and '-' (dash) characters. |
md5 | System-generated | String | md5 checksum of the model content. The field is automatically generated. |
description | Optional | String | Free-form string up to 1024 characters. |
createdTimestamp | System-generated | String | Marks the time the trained model entry was created. Generated by the Orchestration Configuration service. |
updatedTimestamp | System-generated | String | Marks the time the trained model entry was updated. Generated by the Orchestration Configuration service. |
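The md5 field is generated by the service, but you can compute the same checksum locally to verify that an uploaded trained model arrived intact:

```python
import hashlib

def md5_of(content: bytes) -> str:
    """Return the hex md5 checksum of trained-model file content,
    comparable against the system-generated md5 field."""
    return hashlib.md5(content).hexdigest()

checksum = md5_of(b"example model bytes")
```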
Job Execution Configuration Reference
Field | Mandatory | Allowed Values | Description |
---|---|---|---|
url | Yes | | REST API endpoint to invoke when the job execution is triggered by the job scheduler. |
httpMethod | Yes | GET , POST , HEAD , OPTIONS , PUT , PATCH , DELETE , TRACE | HTTP method |
httpHeaders | No | | HTTP headers needed by the REST API endpoint. |
inputData | | | Input data for the REST API request. |
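A job execution configuration matching the fields above might look like this (the URL is a placeholder, not a real endpoint; the configuration ID reuses the sample value from earlier in this section):

```python
# Hypothetical job execution configuration. The url and header values
# are placeholders for illustration only.
job_execution = {
    "url": "https://example-orchestration-host/api/v2/execution",
    "httpMethod": "POST",
    "httpHeaders": {"Content-Type": "application/json"},
    "inputData": {
        "orchestrationConfigurationId": "eecc3b4a-274a-4fba-804a-c2d47f85e9f8",
    },
}
```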
Job Scheduling Configuration Reference
Field | Mandatory | Allowed Values | Allowed Special Characters | Notes |
---|---|---|---|---|
seconds | Yes | 0-59 | , - * / | |
minutes | Yes | 0-59 | , - * / | |
hours | Yes | 0-23 | , - * / | |
dayOfMonth | Yes | 1-31 | , - * ? / L W | The special character '?' cannot be used in both the dayOfMonth and dayOfWeek fields at the same time. Example: To schedule a job for the 15th day of the month, set dayOfMonth to '15' and dayOfWeek to '?'. |
months | Yes | 1-12 or JAN-DEC | , - * / | The allowed abbreviated month names are JAN, FEB, MAR, APR, MAY, JUN, JUL, AUG, SEP, OCT, NOV and DEC. |
dayOfWeek | Yes | 1-7 or SUN-SAT | , - * ? / L # | The allowed abbreviated names for the days are SUN, MON, TUE, WED, THU, FRI and SAT. Example: To schedule a job for the 15th day of the month, set dayOfWeek to '?' and dayOfMonth to '15'. |
years | No (Default: empty) | empty, 1970-2099 | , - * / | |
timeZoneId | No (Default: UTC) | Java Time Zone ID | | |
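Putting the fields together, a schedule that fires at 12:00 noon on the 15th day of every month could be expressed as:

```python
# Scheduling configuration for "noon on the 15th day of every month".
# dayOfWeek is '?' because dayOfMonth carries the specific value.
schedule = {
    "seconds": "0",
    "minutes": "0",
    "hours": "12",
    "dayOfMonth": "15",
    "months": "*",
    "dayOfWeek": "?",
    "timeZoneId": "UTC",
}
```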
The following table describes the allowed special characters.
Character | Description |
---|---|
',' | Used to specify a list of values. For example, setting the months field to 'JAN,MAY,SEP' means the months of January, May, and September. |
'-' | Used to specify a range of values. For example, setting the months field to 'JUN-AUG' means the months of June, July, and August. |
'*' | Used to specify "all values". For example, '*' in the seconds field means every second. |
'?' | Used to specify "no specific value". Allowed only in the dayOfMonth and dayOfWeek fields. |
'L' | Special character for dayOfMonth and dayOfWeek, short for "last". In the dayOfMonth field, 'L' specifies the last day of the month. In the dayOfWeek field, 'L' used alone means '7' (Saturday); used after a number, as in '6L', it specifies the last occurrence of that day in the month (here, the last Friday). |
'W' | Special character for dayOfMonth. Used to specify the weekday nearest to the given day. For example, '20W' specifies "the nearest weekday to the 20th day of the month". A weekday is Monday to Friday. |
'#' | Special character for dayOfWeek. Used with a number before and a number after. For example, in 'm#n', 'm' specifies the day of the week and 'n' specifies the nth occurrence in the month. '3#2' specifies "the second Tuesday of the month" ('3' = "Tuesday" and '#2' = "the second one in the month"). |
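A few example field values using these special characters, with their interpretations per the table above:

```python
# Example cron field values using the special characters described above.
examples = {
    "dayOfMonth": {
        "L": "last day of the month",
        "20W": "nearest weekday to the 20th of the month",
    },
    "dayOfWeek": {
        "6L": "last Friday of the month",
        "3#2": "second Tuesday of the month",
    },
}
```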
For more information about these special characters, please see the CronTrigger Tutorial in Quartz documentation at http://www.quartz-scheduler.org/documentation/quartz-2.x/tutorials/crontrigger.
For more information about available values for the Java time zone ID, please see the Java documentation at http://docs.oracle.com/javase/7/docs/api/java/util/TimeZone.html.