Analytics Catalog Service API Model Reference

The following models are used when calling the Analytics Catalog Service API.

Analytic Catalog Entry Model

  • id (System-generated, String): Generated by the Analytics Catalog service and returned as part of the response. This field is ignored in the POST request. In the PUT request, this ID is required and must match the ID in the REST endpoint path.
  • name (Required, String): Can contain 1-255 alphanumeric, space, and '-' (dash) characters, and must start with a letter. The name and version combination must be unique. The name cannot be updated with the PUT request.
  • author (Required, String): Free-form string up to 255 characters.
  • description (Optional, String): Free-form string up to 1024 characters.
  • version (Required, String): Can contain 1-32 alphanumeric, '.' (period), and '-' (dash) characters. The version cannot be updated with the PUT request.
  • supportedLanguage (Required, String): Possible values: java, matlab, python, python_3.
  • customMetadata (Optional, String): Free-form string up to 4096 characters.
  • taxonomyLocation (Optional, String): The taxonomy directory where the analytic catalog entry will be created. Defaults to "/uncategorized" if not specified.
  • createdTimestamp (System-generated, String): The time the analytic catalog entry was created. Generated by the Analytics Catalog service and returned in ISO-8601 format as part of the response.
  • updatedTimestamp (System-generated, String): The time the analytic catalog entry was last updated. Generated by the Analytics Catalog service and returned in ISO-8601 format as part of the response.
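
For illustration, a POST request body for creating a catalog entry might look like the following sketch. All values are hypothetical; the system-generated fields (id, createdTimestamp, updatedTimestamp) are omitted because they are ignored on POST.

```json
{
  "name": "Sample Adder Analytic",
  "author": "Analytics team",
  "description": "Adds two input arrays",
  "version": "1.0.0",
  "supportedLanguage": "java",
  "customMetadata": "",
  "taxonomyLocation": "/uncategorized"
}
```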

Analytic Validation Result Model

  • analyticId (System-generated, String): The id of the analytic being validated.
  • validationRequestId (System-generated, String): The id of the analytic validation request.
  • status (System-generated, String): Possible values: QUEUED, PROCESSING, COMPLETED, FAILED.
  • message (System-generated, String): Detailed analytic validation status message.
  • inputData (System-generated, String): The analytic input data that was passed in to the validation request.
  • result (System-generated, String): The result string returned by the analytic.
  • createdTimestamp (System-generated, String): The time the analytic validation was initiated. Generated by the Analytics Catalog service and returned in the response in ISO-8601 format.
  • updatedTimestamp (System-generated, String): The time the analytic validation was last updated. Generated by the Analytics Catalog service and returned in the response in ISO-8601 format.

Analytic Deployment Request Model

  • memory (Optional, Integer): The amount of memory in MB to allocate to the analytic. Default is 512 MB. If no value is provided, or the provided value is less than the default, the default value is used.
  • diskQuota (Optional, Integer): The disk quota in MB to allocate to the analytic. Default is 1024 MB for Java and Python analytics, 2048 MB for Matlab analytics. If no value is provided, or the provided value is less than the default, the default value is used.
  • instances (Optional, Integer): The number of instances of the analytic to deploy. Default is 1. If no value is provided, or the provided value is less than the default, the default value is used.
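
As a sketch, a deployment request body overriding all three optional fields might look like the following (the values are hypothetical; any omitted or below-default value falls back to the default described above):

```json
{
  "memory": 1024,
  "diskQuota": 2048,
  "instances": 2
}
```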

Analytic Deployment Result Model

  • analyticId (System-generated, String): The id of the analytic being deployed.
  • requestId (System-generated, String): The id of the analytic deployment request.
  • status (System-generated, String): Possible values: QUEUED, PROCESSING, COMPLETED, FAILED.
  • message (System-generated, String): Detailed analytic deployment status message.
  • inputConfigData (System-generated, String): The input configuration data that was passed in to the deployment request.
  • result (System-generated, String): Deprecated.
  • createdTimestamp (System-generated, String): The time the analytic deployment was initiated. Generated by the Analytics Catalog service and returned in the response in ISO-8601 format.
  • updatedTimestamp (System-generated, String): The time the analytic deployment was last updated. Generated by the Analytics Catalog service and returned in the response in ISO-8601 format.

Analytic Execution Result Model

  • analyticId (System-generated, String): The id of the analytic being executed.
  • status (System-generated, String): Possible values: QUEUED, PROCESSING, COMPLETED, FAILED.
  • message (System-generated, String): Detailed analytic execution status message.
  • inputData (System-generated, String): The analytic input data that was passed in to the execution request.
  • result (System-generated, String): The result string returned by the analytic.
  • createdTimestamp (System-generated, String): The time the analytic execution was initiated. Generated by the Analytics Catalog service and returned in the response in ISO-8601 format.
  • updatedTimestamp (System-generated, String): The time the analytic execution was last updated. Generated by the Analytics Catalog service and returned in the response in ISO-8601 format.

Asynchronous Execution Status Response Model

  • analyticId (System-generated, String): The id of the analytic executed.
  • requestId (System-generated, String): A unique identifier for retrieving the status and results of the requested execution.
  • analyticExecutionState (System-generated, String): The current status of the execution. Possible values: QUEUED, PROCESSING, COMPLETED, FAILED.
  • createdTimestamp (System-generated, String): The time when the execution request was received.
  • updatedTimestamp (System-generated, String): The time when the status last changed.

Deployed Analytic URI

The analytic URI value is needed when issuing certain REST API calls.

The deployed analytic URI format is https://<analytic id>.analytics.<Predix platform domain>.

The parts of the analytic URI are determined as follows.

  • "analytic id" is the id returned by the Analytics Catalog service when an analytic entry is created.
  • "Predix platform domain" is the substring of the Analytics Catalog service URI starting at "run" (that is, everything after the ".run" boundary with the leading dot dropped). For example, if the Analytics Catalog service URI is predix-analytics-catalog-release.run.aws-usw02-pr.ice.predix.io, then the Predix platform domain is run.aws-usw02-pr.ice.predix.io.

For example, if your "analytic id" is 09718078-95e7-4b60-b74a-152838f03b41 and the Analytics Catalog service URI is predix-analytics-catalog-release.run.aws-usw02-pr.ice.predix.io, then the analytic URI is https://09718078-95e7-4b60-b74a-152838f03b41.analytics.run.aws-usw02-pr.ice.predix.io.
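
The rule above can be sketched in a few lines of Python. The function name is hypothetical; it simply applies the substring rule to derive the deployed analytic URI.

```python
def deployed_analytic_uri(analytic_id: str, catalog_uri: str) -> str:
    """Build the deployed analytic URI from the catalog service URI."""
    # The Predix platform domain starts at "run": find the ".run"
    # boundary and drop the leading dot.
    domain = catalog_uri[catalog_uri.index(".run") + 1:]
    return f"https://{analytic_id}.analytics.{domain}"

print(deployed_analytic_uri(
    "09718078-95e7-4b60-b74a-152838f03b41",
    "predix-analytics-catalog-release.run.aws-usw02-pr.ice.predix.io"))
# https://09718078-95e7-4b60-b74a-152838f03b41.analytics.run.aws-usw02-pr.ice.predix.io
```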

Required BPMN 2.0 XML Structures

The Analytics Framework uses an orchestration workflow file to trigger the analytic execution request. The orchestration workflow is defined in a BPMN file, which is an XML file conforming to the BPMN 2.0 standard.

Required Structures

The following table summarizes the required major structures in each BPMN 2.0 XML file created as an orchestration workflow file.

<definitions>
    See BPMN sample files for required content.

<process>
    The following properties are required:
      • id: The name of the orchestration workflow file
      • isExecutable: Set this flag to true

<startEvent>
    The following properties are required:
      • id: Must be unique within the orchestration workflow file
      • outgoing: The outgoing sequenceFlow id

<serviceTask>
    Represents each orchestration step.
    The following attributes must be defined as shown:
      • completionQuantity="1"
      • isForCompensation="false"
      • startQuantity="1"
      • activiti:delegateExpression="${javaDelegate}"
      • xmlns:activiti="http://activiti.org/bpmn"
    The following properties must be defined as described:
      • id: Must be unique within the orchestration workflow file
      • name: Must follow the platform serviceTask naming conventions
      • incoming: The incoming sequenceFlow id
      • outgoing: The outgoing sequenceFlow id

<endEvent>
    The following properties are required:
      • id: Must be unique within the orchestration workflow file
      • incoming: The incoming sequenceFlow id

<sequenceFlow>
    The following properties are required:
      • id: Must be unique within the orchestration workflow file
      • sourceRef: The id of the event or task from which this sequenceFlow begins
      • targetRef: The id of the event or task to which this sequenceFlow leads

The serviceTask Naming Conventions

Each analytic in the orchestration is represented as a serviceTask element in the orchestration workflow file.

For analytics hosted by the Analytics Catalog, the name property of the serviceTask element must use the following format:
name="<analytic_id>::<analytic_name>::<analytic_version>"

where <analytic_id>, <analytic_name>, and <analytic_version> match the ID, name, and version assigned when the analytic was added to the catalog.
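
Putting the required structures and the naming convention together, a minimal orchestration workflow might look like the following sketch. All ids, the target namespace, and the analytic identifier in the serviceTask name are hypothetical placeholders; consult the BPMN sample files for the authoritative content.

```xml
<definitions xmlns="http://www.omg.org/spec/BPMN/20100524/MODEL"
             xmlns:activiti="http://activiti.org/bpmn"
             targetNamespace="http://example.com/workflows">
  <process id="my-orchestration" isExecutable="true">
    <startEvent id="start-1">
      <outgoing>flow-1</outgoing>
    </startEvent>
    <serviceTask id="task-1"
                 name="09718078-95e7-4b60-b74a-152838f03b41::MyAnalytic::1.0.0"
                 completionQuantity="1"
                 isForCompensation="false"
                 startQuantity="1"
                 activiti:delegateExpression="${javaDelegate}">
      <incoming>flow-1</incoming>
      <outgoing>flow-2</outgoing>
    </serviceTask>
    <endEvent id="end-1">
      <incoming>flow-2</incoming>
    </endEvent>
    <sequenceFlow id="flow-1" sourceRef="start-1" targetRef="task-1"/>
    <sequenceFlow id="flow-2" sourceRef="task-1" targetRef="end-1"/>
  </process>
</definitions>
```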

Descriptive Analytic Template Structure Reference

An analytic template defines the format of the input and output JSON structures. This reference describes how the JSON objects in a descriptive analytic template are used to generate the input structure required by the analytic.

Overview

The analytic template type (hierarchical or descriptive) describes the data format you are working with. The descriptive data format is input and output data that is flat in structure but has self-describing attributes such as the data type and data size.

Each analytic has one template that is stored in the Analytics Catalog. The template is a JSON structure. For more information about JSON terminology, see http://www.json.org.

Type: AnalyticTemplate

This is the overall structure of a descriptive analytic template.

{
    "name" : <String>,
    "description" : <String>,
    "packageName" : <String>,
    "codeLanguage" : <String>,
    "version" : <String>,
    "author" : <String>,
    "inputDataPortDefinitions" : [<List of DataPortDefinition>],
    "constantDataPortDefinitions" : [<List of DataPortDefinition>],
    "outputDataPortDefinitions" : [<List of DataPortDefinition>]
}
  • name: Name of the analytic. Must be alphanumeric and must start with a letter.
  • description: Description of the analytic. It can be in HTML format; however, no image tags are allowed.
  • packageName: Reserved for future use.
  • codeLanguage: Reserved for future use.
  • version: Version of the analytic.
  • author: Author of the analytic.
  • inputDataPortDefinitions: List of input port definitions.
  • constantDataPortDefinitions: List of constant port definitions.
  • outputDataPortDefinitions: List of output port definitions.

Type: DataPortDefinition

The inputDataPortDefinitions, constantDataPortDefinitions and outputDataPortDefinitions elements of the analytic template contain arrays of DataPortDefinition that specify the inputs, constants and outputs of the analytic.

{
    "timeSeriesName" : <String>,
    "name" : <String>,
    "description" : <String>,
    "dataType" : <String>,
    "variable" : <Boolean>,
    "arrayDimensions" : <Integer>,
    "validators" : [<List of String>]
}

See the following table for a description of the elements in a DataPortDefinition.

  • timeSeriesName: Leave this field blank for a non-timeseries port definition. For a timeseries port definition, this is the name of the timeseries.
  • name: Port name.
  • description: Description of the data port.
  • dataType: The data type in which the analytic expects input data and produces output data. The framework will convert values to and from this data type between the analytic and the data source. Possible values are: DOUBLE, INTEGER, LONG, STRING, BOOLEAN, DATETIME, DATE, TIME, CUSTOM, TIMESERIES.
  • variable: Set to true if the port can expand to a variable number of ports at runtime.
  • arrayDimensions: The number of dimensions (0 is a single value, 1 is a 1-dimensional array, 2 is a 2-dimensional array, and so on).
  • validators: Reserved for future use.

Template Examples

Analytic templates are used at runtime in conjunction with Port-to-Field Map configurations to control the runtime engine's handling of the analytic input and output. See Samples of Descriptive Analytic Templates and Port-to-Field Maps.

Samples of Descriptive Analytic Templates and Port-to-Field Maps

An analytic template is used at runtime in conjunction with a port-to-field map to control how the framework's runtime engine will handle the analytic input and output. The following are samples of descriptive analytic templates and port-to-field maps.

Sample: Analytic Adder with Double Array Ports

The following is a sample JSON input.

{
  "internalRecords": [],
  "records": [
    {
      "type": "DOUBLE",
      "name": "numberArray1",
      "data": [
        5.0,
        6.0,
        7.0,
        8.0,
        9.0,
        10.0
      ],
      "arraySizes": [
        6
      ]
    },
    {
      "type": "DOUBLE",
      "name": "numberArray2",
      "data": [
        500.0,
        600.0,
        700.0,
        800.0,
        900.0,
        1000.0
      ],
      "arraySizes": [
        6
      ]
    }
  ],
  "timeSeriesRecords": []
}

The following is a sample JSON output.

{
  "records": [
    {
      "type": "DOUBLE",
      "name": "results",
      "data": [
        505.0,
        606.0,
        707.0,
        808.0,
        909.0,
        1010.0
      ],
      "arraySizes": [
        6
      ]
    }
  ]
}

The following is a sample descriptive analytic template.

{
  "name": "DoubleArrayAdder",
  "description": "",
  "packageName": "",
  "codeLanguage": "JAVA",
  "version": "1.0.0",
  "author": "Predix Analytics team",
  "inputDataPortDefinitions": [
    {
      "name": "numberArray1",
      "description": "",
      "dataType": "Double",
      "valueRequired": true,
      "variable": false,
      "arrayDimensions": 1
    },
    {
      "name": "numberArray2",
      "description": "",
      "dataType": "Double",
      "valueRequired": true,
      "variable": false,
      "arrayDimensions": 1
    }
  ],
  "outputDataPortDefinitions": [
    {
      "name": "results",
      "description": "",
      "dataType": "Double",
      "valueRequired": false,
      "variable": false,
      "arrayDimensions": 1
    }
  ]
}

The following is a sample port-to-field map.

{
  "analyticName": "DoubleArrayAdder",
  "analyticVersion": "1.0",
  "orchestrationStepId": "sid-10001",
  "iterations": [
    {
      "inputMaps": [
        {
          "valueSourceType": "DATA_CONNECTOR",
          "fullyQualifiedPortName": "numberArray1",
          "fieldId": "KW",
          "queryCriteria": {
            "start": 1455733669605,
            "end": 1455733669610
          },
          "engUnit": "kw",
          "required": true,
          "dataSourceId": "Predix Time Series"
        },
        {
          "valueSourceType": "DATA_CONNECTOR",
          "fullyQualifiedPortName": "numberArray2",
          "fieldId": "vibration",
          "queryCriteria": {
            "start": 1455733669605,
            "end": 1455733669610
          },
          "engUnit": "hertz",
          "required": true,
          "dataSourceId": "Predix Time Series"
        }
      ],
      "outputMaps": [
        {
          "fullyQualifiedPortName": "results",
          "fieldId": "bearing temperature final",
          "engUnit": "Celsius",
          "dataSourceId": "Predix Time Series"
        }
      ]
    }
  ]
}

Sample: Analytic Adder with Timeseries and Constant Ports

Scenario: An analytic requires a time series table and one constant, where the time series input table has two fixed columns: numberArray1 and numberArray2. Sample analytic templates and port-to-field maps to support this scenario follow.

Note:

All columns in the Time Series array will share the same timestamps. The quality attribute from Time Series is not supported.

The following is a sample JSON input.

{
    "internalRecords": [],
    "records": [
        {
            "type": "DOUBLE",
            "name": "constant1",
            "data": [
                55.0
            ],
            "arraySizes": [
                1
            ]
        }
    ],
    "timeSeriesRecords": [
        {
            "name": "time_series",
            "timestamps": [
                "2016-02-17T18:27:49.601",
                "2016-02-17T18:27:49.602",
                "2016-02-17T18:27:49.603"
            ],
            "data": [
                {
                    "type": "DOUBLE",
                    "name": "numberArray1",
                    "data": [
                        1.0,
                        2.0,
                        3.0
                    ],
                    "arraySizes": [
                        3
                    ]
                },
                {
                    "type": "DOUBLE",
                    "name": "numberArray2",
                    "data": [
                        100.0,
                        200.0,
                        300.0
                    ],
                    "arraySizes": [
                        3
                    ]
                }
            ]
        }
    ]
}

The following is a sample analytic output.

{
  "internalRecords": [],
  "records": [],
  "timeSeriesRecords": [
    {
      "name": "time_series",
      "timestamps": [
        "2016-02-17T18:27:49.601",
        "2016-02-17T18:27:49.602",
        "2016-02-17T18:27:49.603"
      ],
      "data": [
        {
          "type": "DOUBLE",
          "name": "results",
          "data": [
            101.0,
            202.0,
            303.0
          ],
          "arraySizes": [
            3
          ]
        }
      ]
    }
  ]
}

The following is a sample descriptive analytic template.

{
  "name": "TimeseriesAdder",
  "description": "",
  "packageName": "",
  "codeLanguage": "JAVA",
  "version": "1.0.0",
  "author": "Predix Analytics team",
  "inputDataPortDefinitions": [
    {
      "name": "time_series",
      "description": "",
      "dataType": "TimeSeries",
      "valueRequired": false,
      "variable": false,
      "arrayDimensions": 0
    },
    {
      "timeSeriesName": "time_series",
      "name": "numberArray1",
      "description": "",
      "dataType": "Double",
      "valueRequired": true,
      "variable": false,
      "arrayDimensions": 1
    },
    {
      "timeSeriesName": "time_series",
      "name": "numberArray2",
      "description": "",
      "dataType": "Double",
      "valueRequired": true,
      "variable": false,
      "arrayDimensions": 1
    }
  ],
  "constantDataPortDefinitions": [
    {
      "name": "constant1",
      "description": "",
      "dataType": "Double",
      "valueRequired": true,
      "variable": false,
      "arrayDimensions": 0
    }
  ],
  "outputDataPortDefinitions": [
    {
      "name": "time_series",
      "description": "",
      "dataType": "TimeSeries",
      "valueRequired": false,
      "variable": false,
      "arrayDimensions": 0
    },
    {
      "timeSeriesName": "time_series",
      "name": "results",
      "description": "",
      "dataType": "Double",
      "valueRequired": false,
      "variable": false,
      "arrayDimensions": 1
    }
  ]
}

The following is a sample port-to-field map.

{
  "analyticName": "timeseries-adder",
  "analyticVersion": "1.0",
  "orchestrationStepId": "sid-10001",
  "iterations": [
    {
      "inputMaps": [
        {
          "valueSourceType": "DATA_CONNECTOR",
          "fullyQualifiedPortName": "time_series.numberArray1",
          "fieldId": "KW",
          "queryCriteria": {
            "start": 1455733669605,
            "end": 1455733669610
          },
          "engUnit": "kw",
          "required": true,
          "dataSourceId": "Predix Time Series"
        },
        {
          "valueSourceType": "DATA_CONNECTOR",
          "fullyQualifiedPortName": "time_series.numberArray2",
          "fieldId": "vibration",
          "queryCriteria": {
            "start": 1455733669605,
            "end": 1455733669610
          },
          "engUnit": "hertz",
          "required": true,
          "dataSourceId": "Predix Time Series"
        },
        {
          "valueSourceType": "DATA_CONNECTOR",
          "fullyQualifiedPortName": "constant1",
          "fieldId": "engineType",
          "queryCriteria": {
            "columns": [
              "data_value"
            ],
            "table": "sensor_data",
            "conditions": [
              {
                "key": "asset_id",
                "value":"${ASSET_ID}",
                "valueType": "string",
                "relation": " = "
              },
              {
                "key": "field_id",
                "value": "engineType",
                "valueType": "string",
                "relation": " = "
              }
            ]
          },
          "required": true,
          "dataSourceId": "Custom Data Source"
        }
      ],
      "outputMaps": [
        {
          "fullyQualifiedPortName": "time_series.results",
          "fieldId": "bearing temperature final",
          "engUnit": "Celsius",
          "dataSourceId": "Predix Time Series"
        }
      ]
    }
  ]
}

Sample: Analytic Adder with Variable Timeseries Ports

The following is a sample JSON input.

{
    "internalRecords": [],
    "records": [],
    "timeSeriesRecords": [
        {
            "name": "time_series",
            "timestamps": [
                "2016-02-17T18:27:49.601",
                "2016-02-17T18:27:49.602",
                "2016-02-17T18:27:49.603"
            ],
            "data": [
                {
                    "type": "DOUBLE",
                    "name": "temperature_2",
                    "data": [
                        1.0,
                        2.0,
                        3.0
                    ],
                    "arraySizes": [
                        3
                    ]
                },
                {
                    "type": "DOUBLE",
                    "name": "temperature_5",
                    "data": [
                        5.0,
                        6.0,
                        7.0
                    ],
                    "arraySizes": [
                        3
                    ]
                },
                {
                    "type": "DOUBLE",
                    "name": "vibration_2",
                    "data": [
                        100.0,
                        200.0,
                        300.0
                    ],
                    "arraySizes": [
                        3
                    ]
                },
                {
                    "type": "DOUBLE",
                    "name": "vibration_5",
                    "data": [
                        500.0,
                        600.0,
                        700.0
                    ],
                    "arraySizes": [
                        3
                    ]
                }
            ]
        }
    ]
}

The following is a sample JSON output.

{
    "internalRecords": [],
    "records": [],
    "timeSeriesRecords": [
        {
            "name": "time_series",
            "timestamps": [
                "2016-02-17T18:27:49.601",
                "2016-02-17T18:27:49.602",
                "2016-02-17T18:27:49.603"
            ],
            "data": [
                {
                    "type": "DOUBLE",
                    "name": "results_2",
                    "data": [
                        101.0,
                        202.0,
                        303.0
                    ],
                    "arraySizes": [
                        3
                    ]
                },
                {
                    "type": "DOUBLE",
                    "name": "results_5",
                    "data": [
                        505.0,
                        606.0,
                        707.0
                    ],
                    "arraySizes": [
                        3
                    ]
                }
            ]
        }
    ]
}

The following is a sample descriptive analytic template.

{
    "name": "TimeseriesAdder",
    "description": "",
    "packageName": "",
    "codeLanguage": "JAVA",
    "version": "1.0.0",
    "author": "Predix Analytics team",
    "inputDataPortDefinitions": [
        {
            "name": "time_series",
            "description": "",
            "dataType": "TimeSeries",
            "valueRequired": false,
            "variable": false,
            "arrayDimensions": 0
        },
        {
            "timeSeriesName": "time_series",
            "name": "temperature",
            "description": "",
            "dataType": "Double",
            "valueRequired": true,
            "variable": true,
            "arrayDimensions": 1
        },
        {
            "timeSeriesName": "time_series",
            "name": "vibration",
            "description": "",
            "dataType": "Double",
            "valueRequired": true,
            "variable": true,
            "arrayDimensions": 1
        }
    ],
    "outputDataPortDefinitions": [
        {
            "name": "time_series",
            "description": "",
            "dataType": "TimeSeries",
            "valueRequired": false,
            "variable": false,
            "arrayDimensions": 0
        },
        {
            "timeSeriesName": "time_series",
            "name": "results",
            "description": "",
            "dataType": "Double",
            "valueRequired": false,
            "variable": true,
            "arrayDimensions": 1
        }
    ]
}

The following is a sample port-to-field map.

{
    "analyticName": "timeseries-adder",
    "analyticVersion": "1.0",
    "orchestrationStepId": "sid-10001",
    "iterations": [
        {
            "inputMaps": [
                {
                    "valueSourceType": "DATA_CONNECTOR",
                    "fullyQualifiedPortName": "time_series.temperature_2",
                    "fieldId": "temp.left",
                    "queryCriteria": {
                        "start": 1455733669601,
                        "end": 1455733669603
                    },
                    "required": true,
                    "dataSourceId": "Predix Time Series"
                },
                {
                    "valueSourceType": "DATA_CONNECTOR",
                    "fullyQualifiedPortName": "time_series.temperature_5",
                    "fieldId": "temp.right",
                    "queryCriteria": {
                        "start": 1455733669601,
                        "end": 1455733669603
                    },
                    "required": true,
                    "dataSourceId": "Predix Time Series"
                },
                {
                    "valueSourceType": "DATA_CONNECTOR",
                    "fullyQualifiedPortName": "time_series.vibration_2",
                    "fieldId": "vibration.left",
                    "queryCriteria": {
                        "start": 1455733669601,
                        "end": 1455733669603
                    },
                    "required": true,
                    "dataSourceId": "Predix Time Series"
                },
                {
                    "valueSourceType": "DATA_CONNECTOR",
                    "fullyQualifiedPortName": "time_series.vibration_5",
                    "fieldId": "vibration.right",
                    "queryCriteria": {
                        "start": 1455733669601,
                        "end": 1455733669603
                    },
                    "required": true,
                    "dataSourceId": "Predix Time Series"
                }
            ],
            "outputMaps": [
                {
                    "fullyQualifiedPortName": "time_series.results_2",
                    "fieldId": "results.left",
                    "dataSourceId": "Predix Time Series"
                },
                {
                    "fullyQualifiedPortName": "time_series.results_5",
                    "fieldId": "results.right",
                    "dataSourceId": "Predix Time Series"
                }
            ]
        }
    ]
}

Sample: Analytic Adder with Multi-Dimensional Timeseries Ports

The following is a sample JSON input.

{
    "internalRecords": [],
    "records": [],
    "timeSeriesRecords": [
        {
            "name": "time_series",
            "timestamps": [
                "2016-02-17T18:27:49.601",
                "2016-02-17T18:27:49.602",
                "2016-02-17T18:27:49.603",
                "2016-03-17T18:27:49.601",
                "2016-03-17T18:27:49.602",
                "2016-03-17T18:27:49.603"
            ],
            "data": [
                {
                    "type": "DOUBLE",
                    "name": "temperature",
                    "data": [
                        1.0,
                        2.0,
                        3.0,
                        5.0,
                        6.0,
                        7.0
                    ],
                    "arraySizes": [
                        2,
                        3
                    ]
                },
                {
                    "type": "DOUBLE",
                    "name": "vibration",
                    "data": [
                        100.0,
                        200.0,
                        300.0,
                        500.0,
                        600.0,
                        700.0
                    ],
                    "arraySizes": [
                        2,
                        3
                    ]
                }
            ]
        }
    ]
}

The following is a sample analytic output.

{
    "internalRecords": [],
    "records": [],
    "timeSeriesRecords": [
        {
            "name": "time_series",
            "timestamps": [
                "2016-02-17T18:27:49.601",
                "2016-02-17T18:27:49.602",
                "2016-02-17T18:27:49.603",
                "2016-03-17T18:27:49.601",
                "2016-03-17T18:27:49.602",
                "2016-03-17T18:27:49.603"
            ],
            "data": [
                {
                    "type": "DOUBLE",
                    "name": "results",
                    "data": [
                        101.0,
                        202.0,
                        303.0,
                        505.0,
                        606.0,
                        707.0
                    ],
                    "arraySizes": [
                        2,
                        3
                    ]
                }
            ]
        }
    ]
}

The following is a sample descriptive analytic template.

{
    "name": "TimeseriesAdder",
    "description": "",
    "packageName": "",
    "codeLanguage": "JAVA",
    "version": "1.0.0",
    "author": "Predix Analytics team",
    "inputDataPortDefinitions": [
        {
            "name": "time_series",
            "description": "",
            "dataType": "TimeSeries",
            "valueRequired": false,
            "variable": false,
            "arrayDimensions": 0
        },
        {
            "timeSeriesName": "time_series",
            "name": "temperature",
            "description": "",
            "dataType": "Double",
            "valueRequired": true,
            "variable": false,
            "arrayDimensions": 2
        },
        {
            "timeSeriesName": "time_series",
            "name": "vibration",
            "description": "",
            "dataType": "Double",
            "valueRequired": true,
            "variable": false,
            "arrayDimensions": 2
        }
    ],
    "outputDataPortDefinitions": [
        {
            "name": "time_series",
            "description": "",
            "dataType": "TimeSeries",
            "valueRequired": false,
            "variable": false,
            "arrayDimensions": 0
        },
        {
            "timeSeriesName": "time_series",
            "name": "results",
            "description": "",
            "dataType": "Double",
            "valueRequired": false,
            "variable": false,
            "arrayDimensions": 2
        }
    ]
}

The following is a sample port-to-field map.

{
    "analyticName": "timeseries-adder",
    "analyticVersion": "1.0",
    "orchestrationStepId": "sid-10001",
    "iterations": [
        {
            "inputMaps": [
                {
                    "valueSourceType": "DATA_CONNECTOR",
                    "fullyQualifiedPortName": "time_series.temperature[0]",
                    "fieldId": "temperature.previous",
                    "queryCriteria": {
                        "start": 1455733669601,
                        "end": 1455733669603
                    },
                    "required": true,
                    "dataSourceId": "Predix Time Series"
                },
                {
                    "valueSourceType": "DATA_CONNECTOR",
                    "fullyQualifiedPortName": "time_series.temperature[1]",
                    "fieldId": "temperature.current",
                    "queryCriteria": {
                        "start": 1458239269601,
                        "end": 1458239269603
                    },
                    "required": true,
                    "dataSourceId": "Predix Time Series"
                },
                {
                    "valueSourceType": "DATA_CONNECTOR",
                    "fullyQualifiedPortName": "time_series.vibration[0]",
                    "fieldId": "vibration.previous",
                    "queryCriteria": {
                        "start": 1455733669601,
                        "end": 1455733669603
                    },
                    "required": true,
                    "dataSourceId": "Predix Time Series"
                },
                {
                    "valueSourceType": "DATA_CONNECTOR",
                    "fullyQualifiedPortName": "time_series.vibration[1]",
                    "fieldId": "vibration.current",
                    "queryCriteria": {
                        "start": 1458239269601,
                        "end": 1458239269603
                    },
                    "required": true,
                    "dataSourceId": "Predix Time Series"
                }
            ],
            "outputMaps": [
                {
                    "fullyQualifiedPortName": "time_series.results[0]",
                    "fieldId": "results.previous",
                    "dataSourceId": "Predix Time Series"
                },
                {
                    "fullyQualifiedPortName": "time_series.results[1]",
                    "fieldId": "results.current",
                    "dataSourceId": "Predix Time Series"
                }
            ]
        }
    ]
}

Hierarchical Analytic Template Structure Reference

An analytic template defines the format of the input and output JSON structures. This reference describes how the JSON objects in a hierarchical analytic template are used to generate the input structure required by the analytic.

Overview

The analytic template type (hierarchical or descriptive) indicates the data format you are working with. The hierarchical format supports input and output data that can be arbitrarily nested.

Each analytic has one template that is stored in the Analytics Catalog. The template is a JSON structure. For more information about JSON terminology, see http://www.json.org.

Type: AnalyticTemplate

This is the overall structure of a hierarchical analytic template.

{
    "analyticName":<string>,
    "analyticVersion":<string>,
    "comment":[<string>],
    "inputPortDefinitions":[<List of PortDefinition>],
    "inputModelDefinitions":[<List of InputModel>],
    "outputPortDefinitions":[<List of PortDefinition>]
}

The elements of an AnalyticTemplate are as follows.

  • analyticName: Informational only. It does not correlate with any values in the input/output JSON structure.
  • analyticVersion: Informational only. It does not correlate with any values in the input/output JSON structure.
  • comment: (Optional) Informational only. An array of strings. It does not correlate with any values in the input/output JSON structure.
  • inputPortDefinitions: The list of port definitions defining the top-level JSON objects in the input.
  • inputModelDefinitions: The names of the input models needed by the analytic. The runtime retrieves these models (by name) from the orchestration configuration and passes them to the analytic as a map of byte arrays.
  • outputPortDefinitions: The list of port definitions defining the top-level JSON objects in the output.

Type: PortDefinition

The inputPortDefinitions and outputPortDefinitions elements of the analytic template contain arrays of PortDefinitions that specify the inputs and outputs of the analytic. There are several types of PortDefinitions, but they all have the following elements in common.

{
    "comment":[<string>],
    "portName":<string>,
    "portType":<string>,
    "variable":<boolean>
}
PortDefinition is extended by the following.
  • FieldPort
  • TimeseriesArrayPort
  • CompositePort

The elements common to all PortDefinitions are as follows.

  • comment: (Optional) Informational only. An array of strings. It does not correlate with any values in the input/output JSON structure.
  • portName: Becomes the name in the name/value pair in the analytic's input or output JSON object. Appears as "<portName>": in the JSON object for this port definition.
  • portType: Defines the nature of the value part of this JSON object. Predix Analytics Services supports the following value types:
      • COMPOSITE: the value is an inner JSON object defined by a list of inner PortDefinitions.
      • FIELD: the value is a single value or an array of values obtained from the port-to-field map.
      • TIMESERIES_ARRAY: the value is a Predix Time Series JSON object (see Type: TimeseriesArrayPort).
  • variable: (Optional) True when the value in the analytic's input or output JSON object is a JSON array. The array's values are enclosed in [ ]. Default value is false.

Type: FieldPort

A FieldPort is a type of PortDefinition that holds the actual data values to be passed to/from the analytic. A FieldPort defines a leaf node in the analytic's JSON structure.

{
    "comment":[<string>],
    "portName":<string>,
    "portType":<string>,
    "variable":<boolean>,
    "engUnit":<string>,
    "dataType":<string>,
    "required":<boolean>,
    "timestampShared":<string>
}

Extends from PortDefinition.

FieldPort definitions correspond to "<port name>": <value determined at runtime (see the Port-to-Field Map description)> pairs in the JSON structure.

In addition to the common elements of a PortDefinition, a FieldPort contains the following elements.

  • engUnit: (Optional; reserved for future use) The engineering units of the data in this field.
  • dataType: The data type in which the analytic expects input data and produces output data. The framework converts values to and from this data type between the analytic and the data source. Possible values are: LONG, INTEGER, DOUBLE, FLOAT, STRING, BOOLEAN, LONG_ARRAY, INTEGER_ARRAY, DOUBLE_ARRAY, FLOAT_ARRAY, STRING_ARRAY, BOOLEAN_ARRAY.
  • required: (Optional) Whether or not this field is required.
  • timestampShared: Whether the columns of a variable input port share a single timestamps block.
      • false: a separate timestamps block accompanies each indexed block of generated data for the variable input port.
      • true: a separate timestamps block is not needed.

Type: TimeseriesArrayPort

A TimeseriesArrayPort is a type of PortDefinition that is used to define a JSON structure that represents a time series data table. This is the overall structure of a TimeseriesArrayPort.

{
    "comment": [<string>],
    "portName": <string>,
    "portType": <string>,
    "variable": <boolean>,
    "columns": [<List of FieldPorts>]
}

Extends from PortDefinition.

In addition to the common elements of a PortDefinition, a TimeseriesArrayPort holds an array of FieldPorts called columns.

Field: columns

Contains FieldPort column definitions where each FieldPort column is an array data type. The generated JSON structure will contain one JSON array for each FieldPort column and an additional 'time_stamp' array object containing the timestamps of the table values, as the example shows below.

Note: All columns in the time series array will share the same time stamps. The quality attribute from Time Series is not supported.
{
    "<timeseries array port name>": {
        "time_stamp": [<timestamp values associated with field port values>],
        <JSON array objects for each of the field port definitions>
    }
}
Note: The time series values for each of the field port definitions must contain the same set of time stamps. If they do not, the platform will trigger an error.
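As a concrete illustration of the pattern above, a timeseries array port with two field port columns would generate a structure like the following. The port and column names (engine_data, temperature, vibration) and all values here are made up for illustration.

```json
{
    "engine_data": {
        "time_stamp": [1455733669601, 1455733669602],
        "temperature": [98.6, 99.1],
        "vibration": [0.2, 0.3]
    }
}
```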

Type: CompositePort

A CompositePort is a type of PortDefinition that contains other PortDefinitions. This is the overall structure of a CompositePort.

{
    "comment":[<string>],
    "portName":<string>,
    "portType":<string>,
    "variable":<boolean>,
    "childrenPorts":[<List of PortDefinition>]
}

Extends from: PortDefinition.

In addition to the common elements of a PortDefinition, a CompositePort holds an array of childrenPorts.

Field: childrenPorts
Contains a list of inner (children) PortDefinitions. When the variable flag is false, the generated JSON object (corresponding to the composite port) will contain each child port as a direct child node of the composite port, following this pattern.
{
    "<composite port name>": {
        <JSON structure from the Port-to-Field Map for the first child port definition>,

        ...

        <JSON structure from the Port-to-Field Map for the last child port definition>
    }
}
When the variable flag is true, the generated JSON object (corresponding to the composite port) will wrap the child nodes in an array under the composite port, following this pattern.
{
    "<composite port name>" : [
        {
            <JSON structure from Port-to-Field Map for the first child>,
            
            ...,
            
            <JSON structure from Port-to-Field Map for the last child>,        
        },
        
        ...,
        
        {
            <JSON structure from Port-to-Field Map for the first child>,
            
            ...,
            
            <JSON structure from Port-to-Field Map for the last child>,        
        }
    ]
}
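The port-to-field maps shown later address leaves in these nested structures with dot-separated fullyQualifiedPortName values such as data.constants.constant1. The following is an illustrative Python sketch of that addressing scheme; the helper function and payload are hypothetical, not part of the platform.

```python
# Illustrative sketch only: shows how a dot-separated fullyQualifiedPortName
# (e.g. "data.constants.constant1") addresses a leaf in the nested JSON
# structure generated from a hierarchical template.
def set_port_value(doc, fully_qualified_port_name, value):
    """Create nested objects along the dot path and set the leaf value."""
    keys = fully_qualified_port_name.split(".")
    node = doc
    for key in keys[:-1]:
        node = node.setdefault(key, {})
    node[keys[-1]] = value
    return doc

payload = {}
set_port_value(payload, "data.constants.constant1", -55)
set_port_value(payload, "data.constants.constant2", 100)
# payload == {"data": {"constants": {"constant1": -55, "constant2": 100}}}
```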

Type: InputModel

An InputModel is a named byte array that the analytic uses in its processing. The values for the model are managed as part of an orchestration configuration and are associated with an asset context.

{
    "modelPortName": <string>,
    "comments": [<string>]
}

The elements of an InputModel are as follows.

  • modelPortName: The name the analytic uses to pull the model from the map of byte arrays.
  • comments: (Optional) Comments describing the model.
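The map-of-byte-arrays handoff can be sketched as follows. This is illustrative Python, not the platform's code; the model name and binary content are made up, and the base64 encoding mirrors the modelLib values seen in the samples.

```python
import base64

# Illustrative sketch only: how an analytic might look up a named model
# from the map of byte arrays the runtime passes in.
input_model = [
    {"modelPortName": "test-model-port-1",
     "modelLib": base64.b64encode(b"model-binary-1").decode("ascii")},
]

# Build the name -> bytes map the analytic receives.
models = {entry["modelPortName"]: base64.b64decode(entry["modelLib"])
          for entry in input_model}

model_bytes = models["test-model-port-1"]  # raw bytes for this model port
```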

Template Examples

Analytic templates are used at runtime in conjunction with port-to-field map configurations to control the runtime engine's handling of the analytic input and output. See Samples of Hierarchical Analytic Templates and Port-to-Field Maps.

Samples of Hierarchical Analytic Templates and Port-to-Field Maps

An analytic template is used at runtime in conjunction with a port-to-field map to control how the framework's runtime engine will handle the analytic input and output. The following are samples of hierarchical analytic templates and port-to-field maps.

Sample: DemoAdder Analytic

The following is a sample JSON input.

{
  "number1": <latest value from temperature sensor1>,
  "number2": -55
}

The following is a sample JSON output.

{"result" : <result from analytic>}

The following is a sample hierarchical analytic template.

{
  "analyticName": "add2Numbers",
  "analyticVersion": "V1.0",
  "inputPortDefinitions": [
    {
      "portType": "FIELD",
      "portName": "number1",
      "variable": false,
      "required": true
    },
    {
      "portType": "FIELD",
      "portName": "number2",
      "variable": false,
      "required": true
    }
  ],
  "outputPortDefinitions": [
    {
      "portType": "FIELD",
      "portName": "result",
      "variable": false,
      "required": true
    }
  ]
}

The following is a sample port-to-field map.

{
    "analyticName": "add2Numbers",
    "analyticVersion": "V1.0",
    "orchestrationStepId": "adjust temp by ambient",
    "iterations": [
        {
            "inputMaps": [
                {
                    "valueSourceType": "DATA_CONNECTOR",
                    "fullyQualifiedPortName": "number1",
                    "fieldId": "temperature sensor1",
                    "queryCriteria": {"start": 1464989425852, "end": 1464989425852},
                    "dataSourceId": "PredixTimeSeries"
                },
                {
                    "valueSourceType": "CONSTANT",
                    "fullyQualifiedPortName": "number2",
                    "value": -55
                }
            ],
            "outputMaps": [
                {
                    "fullyQualifiedPortName": "result",
                    "fieldId": "adjusted temp",
                    "dataSourceId": "PredixTimeSeries"
                }
            ]
        }
    ]
}

Sample: Analytic Requiring a Time Series Table, Input Models, and Constants

Scenario: An analytic requires a time series table and two constants, where the time series input table has three fixed columns: column1, column2, and column3. Sample hierarchical analytic templates and port-to-field maps to support this scenario follow.

Note: All columns in the Time Series array will share the same time stamps. The quality attribute from Time Series is not supported.

The following is a sample JSON input.

{
    "inputData": {
        "data": {
            "timeseries": {
                "time_stamp": [
                    3242,
                    4242,
                    5242
                ],
                "column1": [
                    10.0,
                    20.0,
                    30.0
                ],
                "column2": [
                    1.0,
                    2.0,
                    3.0
                ],
                "column3": [
                    0.1,
                    0.2,
                    0.3
                ]
            },
            "constants": {
                "constant1": 55,
                "constant2": 100
            }
        }
    },
    "inputModel": [
        {
            "modelPortName": "test-model-port-1",
            "isModelEmbedded": "true",
            "modelLib": "anVuay1tb2RlbC1iaW5hcnktMQ=="
        },
        {
            "modelPortName": "test-model-port-2",
            "isModelEmbedded": "true",
            "modelLib": "anVuay1tb2RlbC1iaW5hcnktMg=="
        }
    ]
}

The following is a sample hierarchical analytic template.

{
    "analyticName": "analyticWith3ColumnsAnd2Constants",
    "analyticVersion": "V1.0",
    "inputPortDefinitions": [
        {
            "portType": "COMPOSITE",
            "portName": "data",
            "childrenPorts": [
                {
                    "portName": "timeseries",
                    "portType": "TIMESERIES_ARRAY",
                    "variable": false,
                    "columns": [
                        {
                            "portName": "column1",
                            "portType": "FIELD",
                            "variable": false,
                            "dataType": "DOUBLE_ARRAY",
                            "required": true
                        },
                        {
                            "portName": "column2",
                            "portType": "FIELD",
                            "variable": false,
                            "dataType": "DOUBLE_ARRAY",
                            "required": true
                        },
                        {
                            "portName": "column3",
                            "portType": "FIELD",
                            "variable": false,
                            "dataType": "DOUBLE_ARRAY",
                            "required": true
                        }
                    ]
                },
                {
                    "portType": "COMPOSITE",
                    "portName": "constants",
                    "childrenPorts": [
                        {
                            "portType": "FIELD",
                            "portName": "constant1",
                            "variable": false,
                            "required": true
                        },
                        {
                            "portType": "FIELD",
                            "portName": "constant2",
                            "variable": false,
                            "required": true
                        }
                    ]
                }
            ]
        }
    ],
    "inputModelDefinitions": [
        {
            "modelPortName": "test-model-port-1"
        },
        {
            "modelPortName": "test-model-port-2"
        }
    ],
    "outputPortDefinitions": [
        {
            "portType": "FIELD",
            "portName": "results",
            "variable": false
        }
    ]
}

The following is a sample port-to-field map.

{
    "analyticName": "analyticWith3ColumnsAnd2Constants",
    "analyticVersion": "V1.0",
    "orchestrationStepId": "adjust temp by ambient",
    "iterations": [
        {
            "inputMaps": [
                {
                    "valueSourceType": "DATA_CONNECTOR",
                    "fullyQualifiedPortName": "data.timeseries.column1",
                    "fieldId": "temperature sensor",
                    "queryCriteria": {"start": 3242, "end": 5242},
                    "dataSourceId": "PredixTimeSeries"
                },
                {
                    "valueSourceType": "DATA_CONNECTOR",
                    "fullyQualifiedPortName": "data.timeseries.column2",
                    "fieldId": "vibration sensor",
                    "queryCriteria": {"start": 3242, "end": 5242},
                    "dataSourceId": "PredixTimeSeries"
                },
                {
                    "valueSourceType": "DATA_CONNECTOR",
                    "fullyQualifiedPortName": "data.timeseries.column3",
                    "fieldId": "pressure sensor",
                    "queryCriteria": {"start": 3242, "end": 5242},
                    "dataSourceId": "PredixTimeSeries"
                },
                {
                    "valueSourceType": "CONSTANT",
                    "fullyQualifiedPortName": "data.constants.constant1",
                    "value": -55
                },
                {
                    "valueSourceType": "CONSTANT",
                    "fullyQualifiedPortName": "data.constants.constant2",
                    "value": 100
                }
            ],
            "inputModelMaps": [
                {
                    "modelPortName": "test-model-port-1",
                    "modelName": "test-model-1",
                    "modelVersion": "v1"
                },
                {
                    "modelPortName": "test-model-port-2",
                    "modelName": "test-model-2",
                    "modelVersion": "v1"
                }
            ],
            "outputMaps": [
                {
                    "fullyQualifiedPortName": "results",
                    "fieldId": "xxxField",
                    "dataSourceId": "PredixTimeSeries"
                }
            ]
        }
    ]
}

Sample: Analytic Requiring a Time Series Table With Variable Time Stamps

Scenario: An analytic requires time series data where the input ports carry different time stamps. Sample hierarchical analytic templates and port-to-field maps to support this scenario follow.

Note: The columns in the Time Series array do not share the same time stamps. The quality attribute from Time Series is not supported.

The following is a sample JSON input.

{
  "time_series_temperature": {
    "temperature": [
      100
    ],
    "time_stamp": [
      "1455733669601"
    ]
  },
  "time_series_pressure": {
    "pressure": [
      57
    ],
    "time_stamp": [
      "1455733669605"
    ]
  },
  "time_series_vibration": {
    "vibration": [
      14
    ],
    "time_stamp": [
      "1455733669609"
    ]
  }
}

The following is a sample hierarchical analytic template.

{
  "analyticName": "analytic-name",
  "analyticVersion": "1.0",
  "inputPortDefinitions": [
    {
      "portName": "time_series_temperature",
      "portType": "TIMESERIES_ARRAY",
      "required": true,
      "variable": false,
      "columns": [
        {
          "portName": "temperature",
          "portType": "FIELD",
          "variable": false,
          "dataType": "DOUBLE_ARRAY",
          "required": true
        }
      ]
    },
    {
      "portName": "time_series_pressure",
      "portType": "TIMESERIES_ARRAY",
      "required": true,
      "variable": false,
      "columns": [
        {
          "portName": "pressure",
          "portType": "FIELD",
          "variable": false,
          "dataType": "DOUBLE_ARRAY",
          "required": true
        }
      ]
    },
    {
      "portName": "time_series_vibration",
      "portType": "TIMESERIES_ARRAY",
      "required": true,
      "variable": false,
      "columns": [
        {
          "portName": "vibration",
          "portType": "FIELD",
          "variable": false,
          "dataType": "DOUBLE_ARRAY",
          "required": true
        }
      ]
    }
  ]
}

The following is a sample port-to-field map.

{
  "analyticName": "analytic-name",
  "analyticVersion": "1.0",
  "orchestrationStepId": "sid-10001",
  "iterations": [
    {
      "inputMaps": [
        {
          "valueSourceType": "DATA_CONNECTOR",
          "fullyQualifiedPortName": "time_series_temperature.temperature",
          "fieldId": "field-7071",
          "queryCriteria": {
            "start": 1455733669601,
            "end": 1455733669610
          },
          "engUnit": "Celsius",
          "required": true,
          "dataSourceId": "Predix Time Series"
        },
        {
          "valueSourceType": "DATA_CONNECTOR",
          "fullyQualifiedPortName": "time_series_pressure.pressure",
          "fieldId": "field-2934",
          "queryCriteria": {
            "start": 1455733669601,
            "end": 1455733669610
          },
          "engUnit": "Celsius",
          "required": true,
          "dataSourceId": "Predix Time Series"
        },
        {
          "valueSourceType": "DATA_CONNECTOR",
          "fullyQualifiedPortName": "time_series_vibration.vibration",
          "fieldId": "field-9342",
          "queryCriteria": {
            "start": 1455733669601,
            "end": 1455733669610
          },
          "engUnit": "Celsius",
          "required": true,
          "dataSourceId": "Predix Time Series"
        }
      ],
      "outputMaps": [
        {
          "fullyQualifiedPortName": "data.result",
          "fieldId": "field-7432",
          "engUnit": "Celsius",
          "dataSourceId": "Predix Time Series"
        }
      ]
    }
  ]
}

Sample: Array of Arrays Model

This example shows how you can create a hierarchical analytic template that represents JSON input data as an array of arrays when starting with input data similar to the following:

"Target_data": [
    [-1.3902596851290876, -6.765816812041471, -4.76418388526832, -1.2112080900902589, -1.7569610822677713], 
    [9.7817179641162, -1.145749275218344, 6.533089409154036, 6.549142463394793, -5.168794582523],
    [-0.44253227413493945, -6.43586397529881, -3.291613230961093, 1.2406735399676179, -1.4659275851421456],
    [0.01589916825431556, -4.695552518057381, -5.284477375577028, -1.9252859877753656, -2.865932498974862]....

Start by marking the analytic input port as a variable port in the analytic template.

{
  "analyticName": "analytic-with-variable-input-port",
  "analyticVersion": "1.0",
  "inputPortDefinitions": [
    {
      "portType": "FIELD",
      "portName": "Target_data",
      "variable": true,
      "dataType": "DOUBLE_ARRAY",
      "required": true
    }
  ],
  "outputPortDefinitions": [
    {
      "portName": "output",
      "portType": "FIELD",
      "variable": false,
      "dataType": "DOUBLE_ARRAY",
      "required": true
    }
  ]
}

The following is a sample port-to-field map.

{
  "analyticName": "analytic-with-variable-input-port",
  "analyticVersion": "1.0",
  "orchestrationStepId": "sid-10001",
  "iterations": [
    {
      "inputMaps": [
        {
          "valueSourceType": "DATA_CONNECTOR",
          "fullyQualifiedPortName": "Target_data.0",
          "fieldId": "KW",
          "queryCriteria": {
            "start": 1455733669605,
            "end": 1455733669610
          },
          "engUnit": "kw",
          "required": true,
          "dataSourceId": "Predix Time Series"
        },
        {
          "valueSourceType": "DATA_CONNECTOR",
          "fullyQualifiedPortName": "Target_data.1",
          "fieldId": "vibration",
          "queryCriteria": {
            "start": 1455733669605,
            "end": 1455733669610
          },
          "engUnit": "hertz",
          "required": true,
          "dataSourceId": "Predix Time Series"
        }
      ],
      "outputMaps": [
        {
          "fullyQualifiedPortName": "output",
          "fieldId": "bearing temperature final",
          "engUnit": "Celsius",
          "dataSourceId": "Predix Time Series"
        }
      ]
    }
  ]
}

When the orchestration is run, the data input will come from two time series tags. The data input will look as follows.

{
  "Target_data": [
    [
      5,
      6,
      7,
      8,
      9,
      10
    ],
    [
      500,
      600,
      700,
      800,
      900,
      1000
    ]
  ]
}

You can add more data arrays to the top-level array by defining Target_data.2, Target_data.3, and so on.
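For example, a third inner array could be added with an inputMaps entry like the following. The fieldId shown here is hypothetical.

```json
{
  "valueSourceType": "DATA_CONNECTOR",
  "fullyQualifiedPortName": "Target_data.2",
  "fieldId": "pressure",
  "queryCriteria": {
    "start": 1455733669605,
    "end": 1455733669610
  },
  "required": true,
  "dataSourceId": "Predix Time Series"
}
```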

Sample: Multiple Iterations of Same Analytic

This example shows how you can run multiple iterations of the same analytic in an orchestration execution. The following is a sample port-to-field map.

{
  "analyticName": "demo-timeseries-adder",
  "analyticVersion": "V1",
  "orchestrationStepId": "sid-10001",
  "iterations": [
    {
      "id": "0",
      "name": "First Iteration",
      "inputMaps": [
        {
          "valueSourceType": "DATA_CONNECTOR",
          "fullyQualifiedPortName": "data.time_series.numberArray1",
          "fieldId": "temperature sensor",
          "queryCriteria": {
            "start": 0,
            "end": -1
          },
          "dataSourceId": "PredixTimeSeries"
        },
        {
          "valueSourceType": "DATA_CONNECTOR",
          "fullyQualifiedPortName": "data.time_series.numberArray2",
          "fieldId": "vibration sensor",
          "queryCriteria": {
            "start": 0,
            "end": -1
          },
          "dataSourceId": "PredixTimeSeries"
        }
      ],
      "outputMaps": [
        {
          "fullyQualifiedPortName": "data.time_series.sum",
          "fieldId": "demo sum",
          "dataSourceId": "PredixTimeSeries"
        }
      ]
    },
    {
      "id": "1",
      "name": "Second Iteration",
      "inputMaps": [
        {
          "valueSourceType": "DATA_CONNECTOR",
          "fullyQualifiedPortName": "data.time_series.numberArray1",
          "fieldId": "temperature sensor",
          "queryCriteria": {
            "start": 0,
            "end": -1
          },
          "dataSourceId": "PredixTimeSeries"
        },
        {
          "valueSourceType": "DATA_CONNECTOR",
          "fullyQualifiedPortName": "data.time_series.numberArray2",
          "fieldId": "vibration sensor",
          "queryCriteria": {
            "start": 0,
            "end": -1
          },
          "dataSourceId": "PredixTimeSeries"
        }
      ],
      "outputMaps": [
        {
          "fullyQualifiedPortName": "data.time_series.sum",
          "fieldId": "demo sum",
          "dataSourceId": "PredixTimeSeries"
        }
      ]
    }
  ]
}

Sample: Analytic Requiring Predix Time Series Data Using an Array of Tags

This example shows how you can run an analytic that uses Predix Time Series data that is an array of tags. For steps to run the orchestration request, see afs-orchestration-execution.html#task_9d270d7f-4d4a-4e07-9239-f121d17f1c8d.

The following is a sample hierarchical analytic template.


{
  "analyticName": "java-timeseries-demo-adder",
  "analyticVersion": "1.0",
  "inputPortDefinitions": [
    {
      "portType": "COMPOSITE",
      "portName": "data",
      "variable": false,
      "childrenPorts": [
        {
          "portName": "time_series_1",
          "portType": "TIMESERIES_ARRAY",
          "required": true,
          "variable": false,
          "columns": [
            {
              "portName": "doubleArray",
              "portType": "FIELD",
              "variable": true,
              "dataType": "DOUBLE_ARRAY",
              "required": true
            }
          ]
        },
        {
          "portName": "time_series_2",
          "portType": "TIMESERIES_ARRAY",
          "required": true,
          "variable": false,
          "columns": [
            {
              "portName": "integerArray",
              "portType": "FIELD",
              "variable": true,
              "dataType": "INTEGER_ARRAY",
              "required": true
            }
          ]
        }
      ]
    }
  ],
  "outputPortDefinitions": [
    {
      "portName": "data",
      "portType": "COMPOSITE",
      "required": true,
      "variable": false,
      "childrenPorts": [
        {
          "portName": "time_series",
          "portType": "TIMESERIES_ARRAY",
          "required": true,
          "variable": false,
          "columns": [
            {
              "portName": "sum",
              "portType": "FIELD",
              "variable": false,
              "dataType": "DOUBLE_ARRAY",
              "required": true
            }
          ]
        }
      ]
    }
  ]
}

The following is a sample port-to-field map.

{
  "analyticName": "java-timeseries-demo-adder",
  "analyticVersion": "1.0",
  "orchestrationStepId": "sid-10001",
  "iterations": [
    {
      "inputMaps": [
        {
          "valueSourceType": "DATA_CONNECTOR",
          "fullyQualifiedPortName": "data.time_series_1.doubleArray",
          "fieldId": "doubleArrayFieldId",
          "queryCriteria": {
            "start": 1455733669601,
            "end": 1455733669603
          },
          "engUnit": "hertz",
          "required": true,
          "variable": true,
          "dataSourceId": "Predix Time Series"
        },
        {
          "valueSourceType": "DATA_CONNECTOR",
          "fullyQualifiedPortName": "data.time_series_2.integerArray",
          "fieldId": "integerArrayFieldId",
          "queryCriteria": {
            "start": 1455733669901,
            "end": 1455733669904
          },
          "engUnit": "kw",
          "required": true,
          "variable": true,
          "dataSourceId": "Predix Time Series"
        }
      ],
      "outputMaps": [
        {
          "fullyQualifiedPortName": "data.time_series.sum",
          "fieldId": "bearing temperature final",
          "engUnit": "Celsius",
          "dataSourceId": "Predix Time Series"
        }
      ]
    }
  ]
}

The array of tags is passed in through the orchestrationConfigurationId definition as shown in the following example.

{
  "orchestrationConfigurationId": "3a5cd1da-5b6e-47fd-b74d-9e75e959810f",
  "assetDataFieldsMap": {
    "doubleArrayFieldId": [
      "timeseries_tag1",
      "timeseries_tag2",
      "timeseries_tag3"
    ],
    "integerArrayFieldId": [
      "timeseries_tag4",
      "timeseries_tag5"
    ]
  },
  "dataSource": []
}

The following shows a sample of generated input with data.

{
  "inputData": {
    "data": {
      "time_series_1": {
        "time_stamp": [
          1455733669601,
          1455733669602,
          1455733669603
        ],
        "doubleArray": [
          [
            500,
            600,
            700
          ],
          [
            5000,
            6000,
            7000
          ],
          [
            50000,
            60000,
            70000
          ]
        ]
      },
      "time_series_2": {
        "time_stamp": [
          1455733669901,
          1455733669902,
          1455733669903,
          1455733669904
        ],
        "integerArray": [
          [
            105,
            205,
            305,
            405
          ],
          [
            15,
            25,
            35,
            45
          ]
        ]
      }
    }
  }
}
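
Since the blocks of a variable port here share one time_stamp array, an analytic can process them as column-aligned arrays. The following is a minimal Python sketch (illustrative only, not the actual 4-input-timeseries-adder implementation):

```python
# Hypothetical sketch of how an analytic might consume the generated
# input shown above. Field names match the sample; the summing logic is
# illustrative, not the platform's generated code.
input_data = {
    "inputData": {
        "data": {
            "time_series_1": {
                "time_stamp": [1455733669601, 1455733669602, 1455733669603],
                "doubleArray": [[500, 600, 700],
                                [5000, 6000, 7000],
                                [50000, 60000, 70000]],
            }
        }
    }
}

port = input_data["inputData"]["data"]["time_series_1"]
# Every indexed block shares the port's single time_stamp array, so the
# blocks can be summed position by position.
summed = [sum(column) for column in zip(*port["doubleArray"])]

output = {"data": {"time_series": {"time_stamp": port["time_stamp"],
                                   "sum": summed}}}
print(output["data"]["time_series"]["sum"])  # → [55500, 66600, 77700]
```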

Sample: Analytic Requiring Predix Time Series Data Using Variable Input Ports

This example shows how to run an analytic whose Predix Time Series data carries a separate timestamp array for each indexed block of generated data (variable input ports).

The following is a sample hierarchical template format with variable ports.

Note: When the portType value is TIMESERIES_ARRAY, the timestampShared value must be false to indicate that variable ports are being used.

{
  "analyticName": "sample-hierarchical-analytic-template-with-two-variable-ports",
  "analyticVersion": "1.0",
  "inputPortDefinitions": [
    {
      "portType": "COMPOSITE",
      "portName": "data",
      "variable": false,
      "childrenPorts": [
        {
          "portName": "time_series_1",
          "portType": "TIMESERIES_ARRAY",
          "required": true,
          "variable": false,
          "columns": [
            {
              "portName": "doubleArray",
              "portType": "FIELD",
              "variable": true,
              "timestampShared": false,
              "dataType": "DOUBLE_ARRAY",
              "required": true
            }
          ]
        },
        {
          "portName": "time_series_2",
          "portType": "TIMESERIES_ARRAY",
          "required": true,
          "variable": false,
          "columns": [
            {
              "portName": "integerArray",
              "portType": "FIELD",
              "variable": true,
              "timestampShared": false,
              "dataType": "INTEGER_ARRAY",
              "required": true
            }
          ]
        }
      ]
    }
  ],
  "outputPortDefinitions": [
    {
      "portName": "data",
      "portType": "COMPOSITE",
      "required": true,
      "variable": false,
      "childrenPorts": [
        {
          "portName": "time_series",
          "portType": "TIMESERIES_ARRAY",
          "required": true,
          "variable": false,
          "columns": [
            {
              "portName": "sum",
              "portType": "FIELD",
              "variable": false,
              "dataType": "DOUBLE_ARRAY",
              "required": true
            }
          ]
        }
      ]
    }
  ]
}

The following is a sample port-to-field map showing that separate timestamps are needed for each block of data.

{
  "analyticName": "sample-port-to-field-map-with-two-variable-ports",
  "analyticVersion": "1.0",
  "orchestrationStepId": "sid-10001",
  "iterations": [
    {
      "inputMaps": [
        {
          "valueSourceType": "DATA_CONNECTOR",
          "fullyQualifiedPortName": "data.time_series_1.doubleArray.0",
          "fieldId": "vibration1",
          "queryCriteria": {
            "start": 1455733669601,
            "end": 1455733669603
          },
          "engUnit": "hertz",
          "required": true,
          "dataSourceId": "Predix Time Series"
        },
        {
          "valueSourceType": "DATA_CONNECTOR",
          "fullyQualifiedPortName": "data.time_series_1.doubleArray.1",
          "fieldId": "vibration2",
          "queryCriteria": {
            "start": 1455733669701,
            "end": 1455733669705
          },
          "engUnit": "hertz",
          "required": true,
          "dataSourceId": "Predix Time Series"
        },
        {
          "valueSourceType": "DATA_CONNECTOR",
          "fullyQualifiedPortName": "data.time_series_1.doubleArray.2",
          "fieldId": "vibration3",
          "queryCriteria": {
            "start": 1455733669801,
            "end": 1455733669802
          },
          "engUnit": "hertz",
          "required": true,
          "dataSourceId": "Predix Time Series"
        },
        {
          "valueSourceType": "DATA_CONNECTOR",
          "fullyQualifiedPortName": "data.time_series_2.integerArray.0",
          "fieldId": "kw1",
          "queryCriteria": {
            "start": 1455733669601,
            "end": 1455733669603
          },
          "engUnit": "kw",
          "required": true,
          "dataSourceId": "Predix Time Series"
        },
        {
          "valueSourceType": "DATA_CONNECTOR",
          "fullyQualifiedPortName": "data.time_series_2.integerArray.1",
          "fieldId": "kw2",
          "queryCriteria": {
            "start": 1455733669701,
            "end": 1455733669705
          },
          "engUnit": "kw",
          "required": true,
          "dataSourceId": "Predix Time Series"
        }
      ],
      "outputMaps": [
        {
          "fullyQualifiedPortName": "data.time_series.sum",
          "fieldId": "bearing temperature final",
          "engUnit": "Celsius",
          "dataSourceId": "Predix Time Series"
        }
      ]
    }
  ]
}

The following is sample generated analytic input for a hierarchical template showing separate timestamps side-by-side with values.

{
  "data": {
    "time_series_1": {
      "doubleArray": [
        {
          "time_stamp": [
            1455733669601,
            1455733669602,
            1455733669603
          ],
          "values": [
            500,
            600,
            700
          ]
        },
        {
          "time_stamp": [
            1455733669701,
            1455733669702,
            1455733669703,
            1455733669704,
            1455733669705
          ],
          "values": [
            50,
            60,
            70,
            80,
            90
          ]
        },
        {
          "time_stamp": [
            1455733669801,
            1455733669802
          ],
          "values": [
            1000,
            2000
          ]
        }
      ]
    },
    "time_series_2": {
      "integerArray": [
        {
          "time_stamp": [
            1455733669101,
            1455733669102,
            1455733669103,
            1455733669104
          ],
          "values": [
            105,
            205,
            305,
            405
          ]
        },
        {
          "time_stamp": [
            1455733669101,
            1455733669202,
            1455733669203,
            1455733669204,
            1455733669205,
            1455733669206
          ],
          "values": [
            15,
            25,
            35,
            45,
            55,
            65
          ]
        }
      ]
    }
  }
}
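
Because timestampShared is false here, each block in the generated input carries its own time_stamp array. A short Python sketch (assumed consumer code, not framework code) of iterating such blocks independently:

```python
# With variable ports and timestampShared=false, each indexed block of
# the port carries its own time_stamp array, so blocks are processed
# one at a time rather than column-aligned.
generated = {
    "data": {
        "time_series_1": {
            "doubleArray": [
                {"time_stamp": [1455733669601, 1455733669602, 1455733669603],
                 "values": [500, 600, 700]},
                {"time_stamp": [1455733669801, 1455733669802],
                 "values": [1000, 2000]},
            ]
        }
    }
}

block_sizes = []
for block in generated["data"]["time_series_1"]["doubleArray"]:
    # Pair each value with its own timestamp; lengths vary per block.
    pairs = list(zip(block["time_stamp"], block["values"]))
    block_sizes.append(len(pairs))

print(block_sizes)  # → [3, 2]
```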

When creating the orchestration execution request payload, pass the array of tags for each fieldId in the assetDataFieldsMap. For example:

{
  "orchestrationConfigurationId": "3a5cd1da-5b6e-47fd-b74d-9e75e959810f",
  "assetDataFieldsMap": {
    "doubleArrayFieldId": [
      "timeseries_tag1",
      "timeseries_tag2",
      "timeseries_tag3"
    ],
    "integerArrayFieldId": [
      "timeseries_tag4",
      "timeseries_tag5"
    ]
  },
  "dataSource": []
}

The following is sample generated input for two timeseries variable ports.

{
  "inputData": {
    "data": {
      "time_series_1": {
        "time_stamp": [
          1455733669601,
          1455733669602,
          1455733669603
        ],
        "doubleArray": [
          [
            500,
            600,
            700
          ],
          [
            5000,
            6000,
            7000,
            8000,
            9000
          ],
          [
            50000,
            60000
          ]
        ]
      },
      "time_series_2": {
        "time_stamp": [
          1455733669101,
          1455733669102,
          1455733669103,
          1455733669104
        ],
        "integerArray": [
          [
            105,
            205,
            305,
            405
          ],
          [
            15,
            25,
            35,
            45,
            55,
            65
          ]
        ]
      }
    }
  }
}

Sample: Reusing a Port-to-Field Map When Providing Data Is Optional

Scenario: You may want to use the same port-to-field map for multiple deployments but need to vary which ports have data mapped for certain iterations. In the inputMaps definition, you can set the required field to false to indicate that data is optional. When a port is marked optional (required: false) and data is not provided during execution, the port is ignored and execution does not fail for missing data. The following examples show how to achieve this.

The following is a sample hierarchical analytic template.

{
  "analyticName": "java-timeseries-demo-adder",
  "analyticVersion": "1.0",
  "inputPortDefinitions": [
    {
      "portType": "COMPOSITE",
      "portName": "data",
      "variable": false,
      "childrenPorts": [
        {
          "portName": "time_series",
          "portType": "TIMESERIES_ARRAY",
          "required": true,
          "variable": false,
          "columns": [
            {
              "portName": "numberArray1",
              "portType": "FIELD",
              "variable": false,
              "dataType": "DOUBLE_ARRAY",
              "required": true
            },
            {
              "portName": "numberArray2",
              "portType": "FIELD",
              "variable": false,
              "dataType": "DOUBLE_ARRAY",
              "required": true
            },
            {
              "portName": "numberArray3",
              "portType": "FIELD",
              "variable": false,
              "dataType": "DOUBLE_ARRAY",
              "required": false
            }
          ]
        }
      ]
    }
  ],
  "outputPortDefinitions": [
    {
      "portName": "data",
      "portType": "COMPOSITE",
      "required": true,
      "variable": false,
      "childrenPorts": [
        {
          "portName": "time_series",
          "portType": "TIMESERIES_ARRAY",
          "required": true,
          "variable": false,
          "columns": [
            {
              "portName": "sum",
              "portType": "FIELD",
              "variable": false,
              "dataType": "DOUBLE_ARRAY",
              "required": true
            },
            {
              "portName": "diff",
              "portType": "FIELD",
              "variable": false,
              "dataType": "DOUBLE_ARRAY",
              "required": false
            }
          ]
        }
      ]
    }
  ]
}

The following is a sample descriptive analytic template.

{
  "name": "TimeseriesAdder",
  "description": "",
  "packageName": "",
  "codeLanguage": "JAVA",
  "version": "1.0.0",
  "author": "Predix Analytics team",
  "inputDataPortDefinitions": [
    {
      "name": "time_series",
      "description": "",
      "dataType": "TimeSeries",
      "valueRequired": true,
      "variable": true,
      "arrayDimensions": 0
    },
    {
      "timeSeriesName": "time_series",
      "name": "numberArray1",
      "description": "",
      "dataType": "Double",
      "valueRequired": true,
      "variable": true,
      "validators": [
        "NotNull()",
        "IsNumber()"
      ],
      "arrayDimensions": 1
    },
    {
      "timeSeriesName": "time_series",
      "name": "numberArray2",
      "description": "",
      "dataType": "Double",
      "valueRequired": true,
      "variable": true,
      "validators": [
        "NotNull()",
        "IsNumber()"
      ],
      "arrayDimensions": 1
    },
    {
      "timeSeriesName": "time_series",
      "name": "numberArray3",
      "description": "",
      "dataType": "Double",
      "valueRequired": false,
      "variable": true,
      "validators": [
        "NotNull()",
        "IsNumber()"
      ],
      "arrayDimensions": 1
    }
  ],
  "outputDataPortDefinitions": [
    {
      "timeSeriesName": "time_series",
      "name": "sum",
      "description": "",
      "dataType": "Custom",
      "valueRequired": true,
      "variable": true,
      "arrayDimensions": 1
    },
    {
      "timeSeriesName": "time_series",
      "name": "diff",
      "description": "",
      "dataType": "Boolean",
      "valueRequired": false,
      "variable": true,
      "validators": [
        "NotNull()",
        "IsNumber()"
      ],
      "arrayDimensions": 1
    }
  ]
}

The following is a sample port-to-field map.

{
  "analyticName": "4-input-timeseries-adder",
  "analyticVersion": "1.0",
  "orchestrationStepId": "sid-10001",
  "iterations": [
    {
      "inputMaps": [
        {
          "valueSourceType": "DATA_CONNECTOR",
          "fullyQualifiedPortName": "data.time_series.numberArray1",
          "fieldId": "KW",
          "queryCriteria": {
            "start": 1455733669605,
            "end": 1455733669610
          },
          "engUnit": "kw",
          "required": true,
          "dataSourceId": "Predix Time Series"
        },
        {
          "valueSourceType": "DATA_CONNECTOR",
          "fullyQualifiedPortName": "data.time_series.numberArray2",
          "fieldId": "vibration",
          "queryCriteria": {
            "start": 1455733669605,
            "end": 1455733669610
          },
          "engUnit": "hertz",
          "required": true,
          "dataSourceId": "Predix Time Series"
        },
        {
          "valueSourceType": "DATA_CONNECTOR",
          "fullyQualifiedPortName": "data.time_series.numberArray3",
          "fieldId": "vibration",
          "queryCriteria": {
            "start": 1455733669605,
            "end": 1455733669610
          },
          "engUnit": "hertz",
          "required": false,
          "dataSourceId": "Predix Time Series"
        }
      ],
      "outputMaps": [
        {
          "fullyQualifiedPortName": "data.time_series.sum",
          "fieldId": "bearing temperature final",
          "engUnit": "Celsius",
          "required": true,
          "dataSourceId": "Predix Time Series"
        },
        {
          "fullyQualifiedPortName": "data.time_series.diff",
          "fieldId": "windspeed final",
          "engUnit": "km",
          "required": false,
          "dataSourceId": "Predix Time Series"
        }
      ]
    }
  ]
}

The following is sample generated input with data for the optional (required: false) field.

{
    "data": {
      "time_series": {
        "time_stamp": [
          1455733669605,
          1455733669606,
          1455733669607,
          1455733669608,
          1455733669609,
          1455733669610
        ],
        "numberArray1": [
          5.0,
          6.0,
          7.0,
          8.0,
          9.0,
          10.0
        ],
        "numberArray2": [
          500.0,
          600.0,
          700.0,
          800.0,
          900.0,
          1000.0
        ],
        "numberArray3": [
          50.0,
          60.0,
          70.0,
          80.0,
          90.0,
          100.0
        ]
      }
    }
}

The following is sample generated input without data for the optional (required: false) field.

{
    "data": {
      "time_series": {
        "time_stamp": [
          1455733669605,
          1455733669606,
          1455733669607,
          1455733669608,
          1455733669609,
          1455733669610
        ],
        "numberArray1": [
          5.0,
          6.0,
          7.0,
          8.0,
          9.0,
          10.0
        ],
        "numberArray2": [
          500.0,
          600.0,
          700.0,
          800.0,
          900.0,
          1000.0
        ]
      }
    }
}
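
An analytic consuming this input should not assume the optional port is present: when required is false and no data is provided, the field is simply omitted, as the second sample shows. A hedged Python sketch (names follow the samples; the summing logic is illustrative, not the actual adder):

```python
# Defensive handling of an optional (required: false) port: the
# framework omits the field entirely when no data was provided, so the
# analytic falls back to zeros of the right length.
def adder(payload):
    ts = payload["data"]["time_series"]
    a1 = ts["numberArray1"]
    a2 = ts["numberArray2"]
    a3 = ts.get("numberArray3", [0.0] * len(a1))  # optional port may be absent
    return [x + y + z for x, y, z in zip(a1, a2, a3)]

with_optional = {"data": {"time_series": {
    "time_stamp": [1, 2], "numberArray1": [5.0, 6.0],
    "numberArray2": [500.0, 600.0], "numberArray3": [50.0, 60.0]}}}
without_optional = {"data": {"time_series": {
    "time_stamp": [1, 2], "numberArray1": [5.0, 6.0],
    "numberArray2": [500.0, 600.0]}}}

print(adder(with_optional))     # → [555.0, 666.0]
print(adder(without_optional))  # → [505.0, 606.0]
```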

Port-To-Field Map Reference

An analytic template defines the format of its input and output JSON structure. The port-to-field map tells the runtime engine how to get the values to insert into the analytic’s input JSON structure and where to write the values from the analytic's output JSON structure. The port-to-field map simply maps FieldPort entries from the analytic's template to data sources and sinks.

The port-to-field map is itself a JSON structure. The JSON objects in this structure are summarized in the following tables.

Type: PortToFieldMap

This is the overall structure of a port-to-field map.

{
    "analyticName":<string>,
    "analyticVersion":<string>,
    "comment":<string>,
    "orchestrationStepId":<string>,
    "iterations":[<list of Iterations>]
}

See the following table for a description of the elements in a PortToFieldMap.

analyticName: Informational only. It does not correlate with any values in the input/output JSON structure.
analyticVersion: Informational only. It does not correlate with any values in the input/output JSON structure.
comment: (Optional) Informational only. It does not correlate with any values in the input/output JSON structure.
orchestrationStepId: Informational only. It does not correlate with any values in the input/output JSON structure.
iterations: Create one iteration entry as described below. The Analytics Framework also supports multiple iterations of the same analytic.

Type: Iteration

An iteration defines the set of input/output PortMaps for an execution of the analytic. Fill in the "inputMaps" array with the list of PortMaps that define how to get the input value and fill in the "outputMaps" with the list of OutputDataConnectorMaps that define where to write the output values.

This is the overall structure of an iteration object.

{
    "inputMaps":[<list of PortMaps>],
    "outputMaps":[<list of OutputDataConnectorMaps>],
    "inputModelMaps":[<list of ModelPortMaps>],
    "id":<string>,
    "name":<string>
}

See the following table for a description of the elements in an Iteration.

inputMaps: The list of maps that associate (map) fields from the data source with objects in the JSON input structure.
outputMaps: The list of maps that map objects from the analytic's JSON output structure to fields in the data sink.
inputModelMaps: The list of maps that identify models to be passed to the analytic at runtime.
id: (Optional) The analytic iteration id; must be a unique sequence. Useful for tracking an iteration that contains an error. Type string.
  • For a single iteration: if id is missing, the default value is "0".
  • For multiple iterations: if id is missing, the default value is "0" for the first iteration and is increased by 1 for each subsequent iteration.
name: (Optional) The description for the iteration. Type string.
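
The id-defaulting rule above can be sketched as follows (illustrative helper, not framework code):

```python
# Assign the documented defaults: a missing iteration id becomes the
# iteration's zero-based position, rendered as a string.
def assign_default_ids(iterations):
    for position, iteration in enumerate(iterations):
        iteration.setdefault("id", str(position))
    return iterations

iterations = assign_default_ids([{"name": "first"}, {"name": "second"}])
print([it["id"] for it in iterations])  # → ['0', '1']
```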

Type: PortMap

Maps an object in the JSON input structure to a constant value or a field in the data source.

{
    "valueSourceType":<string>,
    "fullyQualifiedPortName":<string>
}
Extended by:
  • InputConstantMap
  • InputDataConnectorMap

See the following table for a description of the common elements in a PortMap.

valueSourceType: A flag indicating whether the map is for a constant value ("CONSTANT") or maps the JSON object to a field in the data source ("DATA_CONNECTOR").
fullyQualifiedPortName: Identifies the JSON object in the analytic's input or output JSON structure that will hold the value from the field. The port name is a fully qualified ('.'-delimited) string capturing the path from the root node in the JSON structure to the JSON object that will contain the value from the field in the data source. If the port was defined as variable in the analytic template, the analytic expects an array of field values. In this case, each field value needs a separate PortMap with an index ('.0', '.1', etc.) at the end of the fullyQualifiedPortName.
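
The fullyQualifiedPortName traversal can be sketched as follows: dotted segments walk the JSON structure, and numeric segments index into the array of a variable port (assumed helper, not framework code).

```python
# Set a value into a nested JSON structure from a fully qualified
# ('.'-delimited) port name, creating dicts and lists along the way.
def set_port_value(root, fq_name, value):
    parts = fq_name.split(".")
    node = root
    for i, key in enumerate(parts[:-1]):
        nxt = parts[i + 1]
        if key.isdigit():  # variable-port index into an array
            index = int(key)
            while len(node) <= index:
                node.append([] if nxt.isdigit() else {})
            node = node[index]
        else:
            node = node.setdefault(key, [] if nxt.isdigit() else {})
    last = parts[-1]
    if last.isdigit():
        index = int(last)
        while len(node) <= index:
            node.append(None)
        node[index] = value
    else:
        node[last] = value

doc = {}
set_port_value(doc, "data.time_series_1.doubleArray.0", [500, 600, 700])
set_port_value(doc, "data.time_series_1.doubleArray.1", [50, 60])
print(doc)
```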

Type: InputConstantMap

An InputConstantMap is a type of PortMap to be used when the data to be provided to the analytic can be stored directly within the port-to-field map itself. This is the overall structure of an input constant map.

{
    "valueSourceType":<string>,
    "fullyQualifiedPortName":<string>,
    "value":<string>
}

Extends from: PortMap.

In addition to the common elements of a PortMap, an InputConstantMap holds the value of a constant field:

value: Contains the value to be passed to the constant field. This can be a numeric value (without quotes) or a string value (with quotes).

A constant input value can be parametrized and passed in at runtime using the customAttributes field of OrchestrationExecutionRequest. For example:

{
    "valueSourceType":"CONSTANT",
    "fullyQualifiedPortName":"data.abc",
    "value":"${custom.CONSTANT_VALUE}"
}

Type: InputDataConnectorMap

An InputDataConnectorMap is a type of PortMap to be used when the data to be provided to the analytic needs to be retrieved from the data source at runtime by the Analytics Data Connector. This is the overall structure of an input data connector map.

{
    "valueSourceType":<string>,
    "fullyQualifiedPortName":<string>,
    "fieldId":<string>,
    "dataSourceId":<string>,
    "queryCriteria":<string>,
    "engUnit":<string>,
    "variable":<boolean>,
    "tagNameQuery":<string>
}

Extends from: PortMap.

In addition to the common elements of a PortMap, an InputDataConnectorMap contains the following entries.

fieldId: A string that is mapped to a Predix Time Series tag id for a given asset id. At runtime the orchestration engine gets the Predix Time Series tag id for the field id (as described below), reads the value from Predix Time Series, and puts that value in the JSON object at the location corresponding to the fullyQualifiedPortName. The runtime acquires the Predix Time Series tag id for the field id as follows:
  • At runtime, an asset id is included in the orchestration request.
  • The request may also contain a map of fieldIds to Predix Time Series tag ids.
  • If the map in the request contains an entry for the field id, the runtime uses the Time Series tag id from that entry.
  • If the map in the request does not contain the field id, and the InputDataConnectorMap contains a query in tagNameQuery, the runtime executes the query in Predix Asset to get the tag id.
  • If the tag id for this field id still has not been found, the runtime uses the tenant's defaultTagQuery to retrieve the default map of field ids to Predix Time Series tag ids, and gets the Predix Time Series tag id for this InputDataConnectorMap's field id from that map.
dataSourceId: The following values are supported:
  • "Temporary": A built-in data source. The data is held in memory for the life of the orchestration (temporary store).
  • "PredixTimeSeries": A built-in data source. The data is held in Predix Time Series (persistent store).
  • "external_data_store": The unique value you provide to identify the custom data connector service and its external data source. This value cannot begin with "Predix".
queryCriteria:
  • For the PredixTimeSeries data source, this field provides the Predix Time Series query string for retrieving the value from Predix Time Series. Note that this string should not contain any Predix Time Series tag ids. The tag ids are inserted at runtime per the logic for mapping the fieldId to a Predix Time Series tag id (see the fieldId description). Any tag ids in this query string will be overwritten.
  • To customize queryCriteria attributes, see Customizing Query Criteria Attributes in Port-to-Field Map.
  • For an external data source, the value defined in this field is passed to the custom data connector service when a read/write request is made.
  • Optional for the Temporary data source.
engUnit: (Optional) The engineering units of the data in this field.
variable: (Optional) True when the value in the analytic's input or output JSON object is a JSON array. The array's values are enclosed in [ ]. The default value is false.
tagNameQuery:
  • (Optional) Contains the GEL query for querying Predix Asset for the Predix Time Series tag id that will be used to retrieve the runtime value for the JSON object that corresponds to this InputDataConnectorMap's fullyQualifiedPortName. This query can be parametrized by the request's asset id and the InputDataConnectorMap's fieldId using the ${ASSET_ID} and ${FIELD_ID} variables.
  • To configure dynamic tags, see Configuring Dynamic Tags in Port-to-Field Map.
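
The fieldId resolution order described above can be sketched as a simple fallback chain (function and parameter names are illustrative placeholders, not the framework's actual API):

```python
# Resolve a fieldId to a Predix Time Series tag id, in the documented
# order: request map, then tagNameQuery against Predix Asset, then the
# tenant's defaultTagQuery result.
def resolve_tag(field_id, request_map, tag_name_query, run_asset_query,
                default_tag_map):
    # 1. An explicit fieldId -> tag entry in the orchestration request wins.
    if field_id in request_map:
        return request_map[field_id]
    # 2. Otherwise, if the InputDataConnectorMap has a tagNameQuery,
    #    run it against Predix Asset.
    if tag_name_query is not None:
        tag = run_asset_query(tag_name_query)
        if tag is not None:
            return tag
    # 3. Fall back to the tenant's defaultTagQuery map.
    return default_tag_map.get(field_id)

tag = resolve_tag(
    "doubleArrayFieldId",
    request_map={"doubleArrayFieldId": "timeseries_tag1"},
    tag_name_query=None,
    run_asset_query=lambda query: None,
    default_tag_map={},
)
print(tag)  # → timeseries_tag1
```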

Type: OutputDataConnectorMap

An OutputDataConnectorMap is used to identify the destination of the output data generated by the analytic at runtime. This is the overall structure of an output data connector map.

{
    "fullyQualifiedPortName":<string>,
    "fieldId":<string>,
    "dataSourceId":<string>,
    "queryCriteria":<string>,
    "tagNameQuery":<string>
}

An OutputDataConnectorMap contains the following entries.

fullyQualifiedPortName: Identifies the JSON object in the analytic's output JSON structure that holds the value for the field. The port name is a fully qualified ('.'-delimited) string capturing the path from the root node in the JSON analytic output data to the JSON object that contains the value of the field to be stored in the data sink. If the port was defined as variable in the analytic template, the analytic will produce an array of field values. In this case, each field value needs a separate OutputDataConnectorMap with an index ('.0', '.1', etc.) at the end of the fullyQualifiedPortName.
fieldId: A string that is mapped to a Predix Time Series tag id for a given asset id. At runtime the orchestration engine gets the Predix Time Series tag id for the field id (see the InputDataConnectorMap fieldId description), and writes the value from the JSON object at the location corresponding to the fullyQualifiedPortName to that Predix Time Series tag id.
dataSourceId: For an OutputDataConnectorMap, multiple dataSourceIds can be specified using a comma-delimited list. The following values are supported:
  • "Temporary": A built-in data source. The data is held in memory for the life of the orchestration (temporary store).
  • "PredixTimeSeries": A built-in data source. The data is held in Predix Time Series (persistent store).
  • "external_data_store": The unique value you provide to identify the custom data connector service and its external data source. This value cannot begin with "Predix".
queryCriteria: (Optional) For Predix Time Series, provides the write payload structure; see the restrictions and sample under Customizing Query Criteria Attributes in Port-to-Field Map. For an external data source, the value defined in this field is passed to the custom data connector service when a write request is made.
tagNameQuery:
  • (Optional) Contains the GEL query for querying Predix Asset for the Predix Time Series tag id that will be used when writing the value from the JSON object that corresponds to this OutputDataConnectorMap's fullyQualifiedPortName. This query can be parametrized by the request's asset id and the OutputDataConnectorMap's fieldId using the ${ASSET_ID} and ${FIELD_ID} variables.
  • To configure dynamic tags, see Configuring Dynamic Tags in Port-to-Field Map.

Type: ModelPortMap

Note: The ModelPortMap type is supported in hierarchical analytic templates only. It is not supported in descriptive analytic templates.

A ModelPortMap is used to retrieve the specified model from the Runtime service. This model is then passed to the analytic at runtime by the Analytics Data Connector. This is the overall structure of a model port map.

{
    "modelPortName": <string>,
    "modelName": <string>,
    "modelVersion": <string>
}

See the following table for a description of the elements in a ModelPortMap.

modelPortName: The name the analytic uses to pull the model from a map of byte arrays.
modelName: The name of the model stored in the Configuration service.
modelVersion: The version of the model stored in the Configuration service.

Customizing Query Criteria Attributes in Port-to-Field Map

The queryCriteria field (InputDataConnectorMap and OutputDataConnectorMap) in the port-to-field map provides the data retrieval instructions. The following examples show how to add custom attributes in the queryCriteria field when using Predix Time Series as a data source.

Pre-Defined System Attributes

FIELD_ID: The field id of the port.
ASSET_ID: The asset instance id of the current orchestration run.
ANALYTIC_ID: The analytic id of the current orchestration step.
ORCHESTRATION_REQUEST_ID: The request id of the current orchestration run.
ORCHESTRATION_CONFIGURATION_ID: The orchestration configuration id of the current orchestration run.
ORCESTRATION_STEP_ID: The orchestration step id of the current orchestration run.
PORT_TO_FIELD_MAP_NAME: The port-to-field map name of the current orchestration step.
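
A sketch of how such placeholders might be resolved: ${context.*} names come from the pre-defined system attributes above, and ${custom.*} names from the execution request's customAttributes (illustrative only; the framework's actual substitution rules may differ).

```python
import re

# Resolve ${context.NAME} and ${custom.NAME} placeholders in a
# queryCriteria string against two lookup maps.
def substitute(text, context, custom):
    def repl(match):
        scope, name = match.group(1), match.group(2)
        values = context if scope == "context" else custom
        return str(values[name])
    return re.sub(r"\$\{(context|custom)\.([^}]+)\}", repl, text)

query = '"attributes": ["${context.ASSET_ID}_${context.FIELD_ID}_${custom.SUFFIX}"]'
resolved = substitute(query,
                      context={"ASSET_ID": "asset-1", "FIELD_ID": "KW"},
                      custom={"SUFFIX": "v1"})
print(resolved)  # → "attributes": ["asset-1_KW_v1"]
```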

Use Latest Data Point

You can retrieve the latest data point from the Predix Time Series service by omitting the start and end times in the port-to-field map's queryCriteria field. The following example shows how.

{
  "analyticName": "timeseriesAnalyticSampleWith3ColumnsAnd2Constants",
  "analyticVersion": "V1.0",
  "orchestrationStepId": "adjust temp by ambient",
  "iterations": [
    {
      "inputMaps": [
        {
          "valueSourceType": "DATA_CONNECTOR",
          "fullyQualifiedPortName": "data.otherData.multipleTimeseries.0.MTSColumn1.0",
          "fieldId": "pressure sensor1",
          "queryCriteria": {
            "tags": [{
              "limit": 7,
              "order": "asc",
              "aggregations": [{
                ........
              }]
            }]
          },
          "required": true,
          "dataSourceId": "Predix Time Series"
        },
        {
          "valueSourceType": "DATA_CONNECTOR",
          "fullyQualifiedPortName": "data.otherData.multipleTimeseries.0.MTSColumn1.1",
          "fieldId": "pressure sensor1",
          "queryCriteria": {},
          "dataSourceId": "Predix Time Series"
        },
        ......

Use Parametrized Variables

You can use parametrized variables in the queryCriteria field that are resolved at orchestration runtime. The following port-to-field map example shows how to do this in an InputDataConnectorMap. The data must be consistent with the Predix Time Series /datapoints request format.

{
  "analyticName": "4-input-timeseries-adder",
  "analyticVersion": "1.0",
  "orchestrationStepId": "sid-10001",
  "iterations": [
    {
      "inputMaps": [
        {
          "valueSourceType": "DATA_CONNECTOR",
          "fullyQualifiedPortName": "data.time_series.numberArray1",
          "fieldId": "KW",
          "queryCriteria": {
            "start": "${custom.KW.START_TIME}",
            "end": "${custom.KW.END_TIME}",
            "tags": [{
              "limit": 1000,
              "order": "desc",
              "aggregations": [{
                "type": "${custom.AGGREGATION_TYPE}",
                "interval": "${custom.INTERVAL}"
              }],
              "groups": [{
                "name": "${custom.ATTRIBUTE_1_KEY}",
                "attributes": [
                  "${context.ASSET_ID}_${context.FIELD_ID}_${custom.KW.ATTRIBUTE_1_VALUE_1}",
                  "${context.ASSET_ID}_${context.FIELD_ID}_${custom.KW.ATTRIBUTE_1_VALUE_2}"
                ]
              }]
            }]
          },
.......
        {
          "valueSourceType": "DATA_CONNECTOR",
          "fullyQualifiedPortName": "data.time_series.numberArray2",
          "fieldId": "vibration",
          "queryCriteria": {
            "start": "${custom.vibration.START_TIME}",
            "end": "${custom.vibration.END_TIME}",
            "tags": [{
              "limit": 1000,
              "order": "desc",
              "aggregations": [{
                "type": "${custom.AGGREGATION_TYPE}",
                "interval": "${custom.INTERVAL}"
              }],
              "groups": [{
                "name": "${ATTRIBUTE_1_KEY}",
                "attributes": [
                  "${context.ASSET_ID}_${context.FIELD_ID}_${custom.vibration.ATTRIBUTE_1_VALUE_1}",
                  "${context.ASSET_ID}_${context.FIELD_ID}_${custom.vibration.ATTRIBUTE_1_VALUE_2}"
                ]
              }]
            }]
          },

When defining the OutputDataConnectorMap, note the following restrictions.

  • Output only supports key-value pairs of String data type.
  • The structure must follow the Predix Time Series WebSocket write API format.
  • The messageId field is mandatory when writing to Time Series. Provide either a hard-coded value or a parametrized variable that is substituted at orchestration runtime.
  • If the queryCriteria field is not specified for the output port, a system-generated messageId and the default attribute "source"="orchestration" key-value pair are added when writing to Time Series.
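The restrictions above can be checked with a small helper. The following is an illustrative sketch only; validate_output_query_criteria is a hypothetical function, not part of the service API.

```python
def validate_output_query_criteria(query_criteria):
    """Check an OutputDataConnectorMap queryCriteria against the
    restrictions above: messageId must be present (hard-coded or a
    ${...} parametrized variable), and every attribute in the body
    must be a string-to-string key-value pair."""
    if "messageId" not in query_criteria:
        raise ValueError("messageId is mandatory when writing to Time Series")
    for entry in query_criteria.get("body", []):
        for key, value in entry.get("attributes", {}).items():
            # Output only supports key-value pairs of String data type.
            if not isinstance(key, str) or not isinstance(value, str):
                raise ValueError(f"attribute {key!r} must map a string to a string")

criteria = {
    "messageId": "${custom.MESSAGE_ID}",
    "body": [{"attributes": {"${custom.ATTRIBUTE_1_KEY}": "site_01"}}],
}
validate_output_query_criteria(criteria)  # passes silently
```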

The following is a sample queryCriteria in OutputDataConnectorMap.

........
 "outputMaps": [
  {
    "fullyQualifiedPortName": "data.time_series.sum",
    "fieldId": "bearing temperature final",
    "engUnit": "Celsius",
    "dataSourceId": "Predix Time Series",
    "queryCriteria" :     {
      "messageId": "${custom.MESSAGE_ID}",
      "body": [
       {
         "attributes": {
          "${custom.ATTRIBUTE_1_KEY}" : "${custom.ASSET_ID}_${custom.FIELD_ID}_${custom.bearing temperature final.ATTRIBUTE_1_VALUE_1}",
          "${custom.ATTRIBUTE_2_KEY}" : "${context.ASSET_ID}_${context.FIELD_ID}_${custom.bearing temperature final.ATTRIBUTE_1_VALUE_2}"
 
          }
        }
      ]
    }
.......

To supply the entire queryCriteria block as a single dynamic parametrized variable in an InputDataConnectorMap, provide it as follows.

{
  "analyticName": "4-input-timeseries-adder",
  "analyticVersion": "1.0",
  "orchestrationStepId": "sid-10001",
  "iterations": [
    {
      "inputMaps": [
        {
          "valueSourceType": "DATA_CONNECTOR",
          "fullyQualifiedPortName": "data.time_series.numberArray1",
          "fieldId": "KW",
          "queryCriteria": "${custom.QUERY_CRITERIA1}",
.......

To supply the entire queryCriteria block as a single dynamic parametrized variable in an OutputDataConnectorMap, provide it as follows.

........
 "outputMaps": [
  {
    "fullyQualifiedPortName": "data.time_series.sum",
    "fieldId": "bearing temperature final",
    "engUnit": "Celsius",
    "dataSourceId": "Predix Time Series",
    "queryCriteria" :  "${custom.QUERY_CRITERIA2}"
.......

Configuring Dynamic Tags in Port-to-Field Map

When running an orchestration request using Predix Time Series tags, you can pass dynamic tags using customAttributes as key-value pairs. The following examples show how to do this.

Overview

The tagNameQuery field (InputDataConnectorMap and OutputDataConnectorMap) in the port-to-field map contains the GEL query for querying Predix Asset for the Predix Time Series tag id. Dynamic time series tags are supported by defining a dynamic tag template in the following format.

##${asset.sourceKey}.${context.portToFieldMapName}.${context.analyticId}.${custom.key}

Where,

tagNameQuery
Can contain either the dynamic time series tag or the asset GEL query to query the time series tag id.
##
Indicates this is a dynamic tag. Use dot notation to identify the type of variable being used. Each variable has a namespace corresponding to where it is defined. The following namespaces are supported: "asset", "context", "custom".
${asset.sourceKey}
In the orchestration request, if the "assetSelectionFilter" is set to an asset GEL query, the matching assets are returned; the following sample returns two. You can reference the asset attributes as ${asset.uri}, ${asset.sourceKey}, ${asset.name}, and so on.
[
  {
    "uri": "/assets/32-3ed8356b-4c46-431e-b4e7-bb6371c39395",
    "classification": "/classifications/turbine",
    "sourceKey": "70101",
    "name": "Wolverine Ravine 1",
    "serial_number": "4610"
  },
  {
    "uri": "/assets/37-3ed8356b-4c46-431e-b4e7-bb6371c39395",
    "classification": "/classifications/turbine",
    "sourceKey": "70102",
    "name": "Wolverine Ravine 2",
    "serial_number": "4620"
  }
]
${context.portToFieldMapName}
"context" namespace includes: portToFieldMapName, analyticId, assetId, orchestrationStepId, orchestrationRequestId, orchestrationConfigurationId.
${custom.key}
The value is passed in by the OrchestrationExecutionRequest as "customAttributes". In the following example, ${custom.key} is replaced with "Hello World".
{
  "orchestrationConfigurationId": "<Orchestration Configuration Id>",
  "assetId": null,
  "assetDataFieldsMap": null,
  "assetGroup": {
    "dataSourceId": "PredixAsset",
    "assetSelectionFilter": "/assets?filter=classification=/classifications/turbine:name=15sl-46606c64-619d-4db0-a059-bc2d879640ca<turbine_type"
  },
  "modelGroupKey": null,
  "dataSource": [
    {
      "dataSourceId": "Postgres Reference External Data Connector",
      "apiVersion": "v1",
      "baseUri": "http://localhost:18888"
    }
  ],
  "customAttributes": {
    "key": "Hello World"
  }
}

Note the following.

  • During orchestration execution, if "tagNameQuery" is configured as a dynamic tag in the port-to-field map, the defined format is processed as follows.
    • Orchestration context attributes, such as the analytic id, replace ${context.analyticId}.
    • Asset parameters from the orchestration execution request, such as sourceKey, replace ${asset.sourceKey}.
    • Port-to-field map metadata attributes, such as portToFieldMapName, replace ${context.portToFieldMapName}.
    • The processed value is then used as the tag name for writing the data to the data source, and the entry is also added to the tag map. If the same fieldId is used as an input in another portToFieldMap, the tag name can be looked up for reading the data from the data source.
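The substitution steps above can be sketched as follows. This is a hypothetical helper, not service code, shown under the assumption that every ${namespace.key} placeholder is replaced from its corresponding attribute map.

```python
import re

def resolve_dynamic_tag(template, asset, context, custom):
    """Resolve a ##-prefixed dynamic tag template by substituting
    ${asset.*}, ${context.*}, and ${custom.*} placeholders."""
    if not template.startswith("##"):
        return template  # not a dynamic tag; treat as a GEL query
    namespaces = {"asset": asset, "context": context, "custom": custom}

    def substitute(match):
        namespace, key = match.group(1), match.group(2)
        return str(namespaces[namespace][key])

    return re.sub(r"\$\{(asset|context|custom)\.(\w+)\}", substitute, template[2:])

tag = resolve_dynamic_tag(
    "##${context.orchestrationRequestId}.${asset.serial_number}.${custom.ATTR_KEY1}",
    asset={"serial_number": "4620"},
    context={"orchestrationRequestId": "74084eb3"},
    custom={"ATTR_KEY1": "test_attr1"},
)
# tag == "74084eb3.4620.test_attr1"
```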

Sample: Dynamic Tag in Port-to-Field Map Output Port

The following is an example of a dynamic tag defined in the port-to-field map output port (OutputDataConnectorMap).

{
  "analyticName": "java-timeseries-demo-adder",
  "analyticVersion": "1.0",
  "orchestrationStepId": "sid-10001",
  "iterations": [
    {
      "inputMaps": [
        {
          "valueSourceType": "DATA_CONNECTOR",
          "fullyQualifiedPortName": "data.time_series.numberArray1",
          "fieldId": "KW",
          "queryCriteria": {
            "start": 1455733669601,
            "end": 1455733669610
          },
          "engUnit": "kw",
          "required": true,
          "dataSourceId": " PredixTimeSeries "
        },
        {
          "valueSourceType": "DATA_CONNECTOR",
          "fullyQualifiedPortName": "data.time_series.numberArray2",
          "fieldId": "vibration",
          "queryCriteria": {
            "start": 1455733669601,
            "end": 1455733669610
          },
          "engUnit": "hertz",
          "required": true,
          "dataSourceId": "predixtimeseries"
        }
      ],
      "outputMaps": [
        {
          "fullyQualifiedPortName": "data.time_series.sum",
          "fieldId": "bearing_temperature",
          "tagNameQuery": "##${context.orchestrationRequestId}.${asset.serial_number}.${context.portToFieldMapName}.${context.fieldId}.${custom.ATTR_KEY1}",
          "engUnit": "Celsius",
          "dataSourceId": " Temporary , Predix Timeseries"
        },
        {
          "fullyQualifiedPortName": "data.time_series.diff",
          "fieldId": "local_windspeed",
          "tagNameQuery": "##${context.orchestrationRequestId}.${asset.serial_number}.${context.portToFieldMapName}.${context.fieldId}.${custom.ATTR_KEY1}",
          "engUnit": "Celsius",
          "dataSourceId": " Temporary , Predix Timeseries"
        }
      ]
    }
  ]
}

Pre-Defined System Attributes

The following pre-defined system attributes and custom attributes (from orchestration run request) are supported for parameter substitution.

System Attribute NameDescription
fieldIdThe field id of the port.
assetIdThe asset instance id of the current orchestration run.
analyticIdThe analytic id of the current orchestration step.
orchestrationRequestIdThe request id of the current orchestration run.
orchestrationConfigurationIdThe orchestration configuration id of the current orchestration run.
orchestrationStepIdThe orchestration step id of the current orchestration run.
portToFieldMapNameThe port-to-field-map name of the current orchestration step.

Sample: Custom Attributes in Orchestration Run Request

The following is a sample orchestration run request with "customAttributes" as key-value pair.

{
    "orchestrationConfigurationId": "a7d1c09d-000c-4017-a073-8a3df9058ffc",
    "assetId": null,
    "assetDataFieldsMap": null,
     "assetGroup": {
        "dataSourceId": "PredixAsset",
        "assetSelectionFilter": "/assets?filter=classification=/classifications/turbine:name=15sl-46606c64-619d-4db0-a059-bc2d879640ca<turbine_type"
    },
    "customAttributes": {   
        "ATTR_KEY1": "test_attr1"
    }
}

Note the following.

  • When the orchestration execution is completed, the dynamically created time series tags will be sent to the monitoring service.
  • You can query the monitoring service to read details about the dynamically generated time series tags for each orchestration execution.

Sample: Dynamic Tags in Orchestration Execution Result from Monitoring Service Query Response

The following is a sample orchestration execution result detail received from a monitoring service query response.

{
  "orchestrationRequestId": "74084eb3-680c-43b3-b8b6-e50979bb447d",
  "status": "COMPLETED",
  "orchestrationExecutionStatus": [
    {
      "contextId": "74084eb3-680c-43b3-b8b6-e50979bb447d-/assets/37-3ed8356b-4c46-431e-b4e7-bb6371c39395",
      "assetId": "/assets/37-3ed8356b-4c46-431e-b4e7-bb6371c39395",
      "status": "COMPLETED",
      "orchestrationStepStatus": [
        {
          "status": "COMPLETED",
          "analyticId": "adc54168-237e-4c13-8076-8de4359207a7",
          "analyticName": "Java Timeseries Demo Adder With Model - c81928a3-e76b-400b-a0ec-8b1b0b86778c",
          "analyticVersion": "v1",
          "analyticRequestId": "8c84e2af-f85b-11e6-9d69-ba7f8cc76dfa-sid-10001",
          "stepId": "sid-10001",
          "startTime": 1487698276520,
          "endTime": 1487698279280,
          "output": null,
          "errorResponse": null,
          "fieldTagMap": {
            "bearing_temperature": "74084eb3-680c-43b3-b8b6-e50979bb447d.4620.step1PortToFieldMapName.bearing_temperature.test_attr1",
            "local_windspeed": "74084eb3-680c-43b3-b8b6-e50979bb447d.4620.step1PortToFieldMapName.local_windspeed.test_attr1"
          }
        }
      ],
      "startTime": 1487698276460,
      "endTime": 1487698280700,
      "errorResponse": null
    },
    {
      "contextId": "74084eb3-680c-43b3-b8b6-e50979bb447d-/assets/32-3ed8356b-4c46-431e-b4e7-bb6371c39395",
      "assetId": "/assets/32-3ed8356b-4c46-431e-b4e7-bb6371c39395",
      "status": "COMPLETED",
      "orchestrationStepStatus": [
        {
          "status": "COMPLETED",
          "analyticId": "adc54168-237e-4c13-8076-8de4359207a7",
          "analyticName": "Java Timeseries Demo Adder With Model - c81928a3-e76b-400b-a0ec-8b1b0b86778c",
          "analyticVersion": "v1",
          "analyticRequestId": "8b8dff7d-f85b-11e6-9d69-ba7f8cc76dfa-sid-10001",
          "stepId": "sid-10001",
          "startTime": 1487698274900,
          "endTime": 1487698278040,
          "output": null,
          "errorResponse": null,
          "fieldTagMap": {
            "bearing_temperature": "74084eb3-680c-43b3-b8b6-e50979bb447d.4610.step1PortToFieldMapName.bearing_temperature.test_attr1",
            "local_windspeed": "74084eb3-680c-43b3-b8b6-e50979bb447d.4610.step1PortToFieldMapName.local_windspeed.test_attr1"
          }
        }
      ],
      "startTime": 1487698274840,
      "endTime": 1487698279810,
      "errorResponse": null
    }
  ],
  "errorResponse": null
}

Custom Data Connector Reference

You will build a Custom Data Connector service when analytics in your orchestration rely upon an external data source. This data connector service must be reachable by the Predix cloud using HTTPS.

A Java-based reference implementation with PostgreSQL DB support is available. Use Reference Data Connector for PostgreSQL as a starting point to develop a Custom Data Connector service that uses a PostgreSQL data source.

Data Format

This is the overall data format:

 [ [ <EpochInMs>, <Measurement Value>, <Quality> ] ] 

Where:

  • <EpochInMs> — Epoch Time in milliseconds
  • <Measurement Value> — Measurement data value
  • <Quality> — Quality (Optional)
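For illustration, the triplet format can be checked with a small predicate. This is a sketch; is_valid_datapoint is a hypothetical helper, not part of the connector contract.

```python
def is_valid_datapoint(point):
    """A data point is [epochMs, value, quality?]: a 2- or 3-element
    list whose timestamp and value may each be null (None)."""
    if not isinstance(point, list) or len(point) not in (2, 3):
        return False
    epoch, value = point[0], point[1]
    epoch_ok = epoch is None or isinstance(epoch, int)
    value_ok = value is None or isinstance(value, (int, float))
    return epoch_ok and value_ok

assert is_valid_datapoint([1435776300000, 2, 1])  # TIMESERIES_ARRAY entry
assert is_valid_datapoint([None, 10.0])           # DOUBLE entry
assert not is_valid_datapoint([1435776300000])    # measurement value missing
```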

See the following table for the expected format for all the data types.

Data TypeExpected Data FormatDescription
DOUBLE[ [ null, 10.0 ] ]Double is represented in standard time series format.
Note: If the data type is DOUBLE, the system reads the measurement value from index 1 (the second element of the data point).
DOUBLE_ARRAY[ [ null, 10.0 ] , [ null, 11.0] ]
TIMESERIES_ARRAY[ [1435776300000, 2, 1], [1435776400000, null], [1435776500000, 10.5, 3] ]

Payloads are expected as follows.

APIRequest Payload TypeResponse Payload Type
/api/v1/analytics/customdata/readAnalyticReadDataRequest extends DataRequestAnalyticReadDataResponse extends DataResponse
/api/v1/analytics/customdata/writeAnalyticWriteDataRequest extends DataRequestAnalyticWriteDataResponse extends DataResponse
/api/v1/analytics/customdata/healthcheck

Type: DataRequest

This is the overall structure of a DataRequest object.

{
  "field": List<Field>,
  "customAttributes": Object,
  "systemAttributes": Object,
  "orchestrationExecutionContext": OrchestrationExecutionContext,
  "dataSourceId": String
}

See the following table for a description of the elements in a DataRequest.

AttributeDescription
FieldList of Fields
CustomAttributesUser-defined JSON object.
SystemAttributesMap of analytics system generated Key/Value(s). Reserved for future use.
OrchestrationExecutionContextOrchestration execution context with system generated IDs to track the request within analytics services.
DataSourceIdAn external data connector service identifier, added for monitoring purposes.

Type: DataResponse

This is the overall structure of a DataResponse object.

{
  "field": List<Field>,
  "orchestrationExecutionContext": OrchestrationExecutionContext,
  "errorResponse": ErrorResponse,
  "dataSourceId": String
}

See the following table for a description of the elements in a DataResponse object.

AttributeDescription
FieldList of Fields
OrchestrationExecutionContextOrchestration execution context with system generated IDs to track the request within analytics services.
DataSourceIdAn external data connector service identifier, added for monitoring purposes.
ErrorResponseError message details

Type: Field

This is the overall structure of a Field object.

{
  "fieldId": String,
  "fullyQualifiedPortName": String,
  "dataType": String,
  "engUnit": String,
  "data": Object,
  "queryCriteria": Object,
  "errorResponse": ErrorResponse
}

See the following table for a description of the elements in a Field object.

AttributeDescription
FieldIdField identifier defined in port-to-field map.
FullyQualifiedPortNameUnique port name to identify a port in the port-to-field map.
DataTypeField data type. The following is the list of supported analytic data types.
  • LONG
  • INTEGER
  • DOUBLE
  • FLOAT
  • STRING
  • BOOLEAN
  • LONG_ARRAY
  • INTEGER_ARRAY
  • DOUBLE_ARRAY
  • FLOAT_ARRAY
  • STRING_ARRAY
  • BOOLEAN_ARRAY
  • TIMESERIES_ARRAY
EngUnitEngineering Unit as defined in port-to-field map.
QueryCriteriaA custom object to define the criteria to query the requested field as defined in port-to-field map.
DataData value.
  • For a read request, provide the data for this field in the AnalyticReadDataResponse.
  • For a write request, analytics adds the analytic-generated output data in this attribute.
ErrorResponseIf there is an error in processing the request, this attribute is updated with error details.

Type: OrchestrationExecutionContext

This is the overall structure of an OrchestrationExecutionContext object.

{
  "assetId": String,
  "orchestrationConfigurationID": String,
  "orchestrationExecutionRequestID": String,
  "analyticId": String,
  "analyticName": String,
  "analyticVersion": String,
  "analyticExecutionRequestID": String
}

See the following table for a description of the elements in an OrchestrationExecutionContext object.

AttributeDescription
assetIdThe asset identifier.
orchestrationConfigurationIDThe orchestration configuration identifier.
orchestrationExecutionRequestIDThe orchestration execution request identifier.
analyticIdThe analytic catalog entry identifier.
analyticNameThe analytic name.
analyticVersionThe analytic version.
analyticExecutionRequestIDThe analytic execution request identifier.

Type: ErrorResponse

This is the overall structure of an ErrorResponse object.

{
  "code": String,
  "severity": String,
  "detail": String,
  "message": String
}

See the following table for a description of the elements in an ErrorResponse object.

AttributeDescription
codeThe error code.
severityThe error severity.
messageA short error message.
detailA detailed error message with stack trace, etc.

Sample AnalyticReadDataRequest

{
    "field": [
        {
            "fieldId": "KW",
            "fullyQualifiedPortName": "data.time_series.numberArray1",
            "dataType": "DOUBLE_ARRAY",
            "engUnit": "kw",
            "data": [],
            "queryCriteria": {
                "columns": [
                    "recorded_at",
                    "data_value"
                ],
                "table": "sensor_data",
                "conditions": [
                    {
                        "key": "asset_id",
                        "value": "${ASSET_ID}",
                        "valueType": "string",
                        "relation": " = "
                    },
                    {
                        "key": "recorded_at",
                        "value": "current_timestamp",
                        "valueType": "none",
                        "relation": " < "
                    },
                    {
                        "key": "field_id",
                        "value": "KW",
                        "valueType": "string",
                        "relation": " = "
                    }
                ]
            },
            "errorResponse": null
        },
        {
            "fieldId": "vibration",
            "fullyQualifiedPortName": "data.time_series.numberArray2",
            "dataType": "DOUBLE_ARRAY",
            "engUnit": "hertz",
            "data": [],
            "queryCriteria": {
                "columns": [
                    "recorded_at",
                    "data_value"
                ],
                "table": "sensor_data",
                "conditions": [
                    {
                        "key": "asset_id",
                        "value": "${ASSET_ID}",
                        "valueType": "string",
                        "relation": " = "
                    },
                    {
                        "key": "recorded_at",
                        "value": "current_timestamp",
                        "valueType": "none",
                        "relation": " < "
                    },
                    {
                        "key": "field_id",
                        "value": "vibration",
                        "valueType": "string",
                        "relation": " = "
                    }
                ]
            },
            "errorResponse": null
        }
    ],
    "customAttributes": {
        "IS_GENERIC_SCHEMA": "TRUE"
    },
    "systemAttributes": null,
    "orchestrationExecutionContext": {
        "assetId": "/assets/32-90effe42-eb21-4611-b734-83f707d89d7a",
        "orchestrationConfigurationID": "e06982f5-f446-46d2-bc21-9fdf64c111ab",
        "orchestrationExecutionRequestID": "8f990980-b208-4418-99b1-f46d26cd4530",
        "analyticId": "b9e999a1-8931-459d-8d34-2f76c2b9fd95",
        "analyticName": null,
        "analyticVersion": null,
        "analyticExecutionRequestID": null
    },
    "dataSourceId": "Postgres Reference External Data Connector"
}

Sample AnalyticReadDataResponse

{
    "field": [
        {
            "fieldId": "KW",
            "fullyQualifiedPortName": "data.time_series.numberArray1",
            "dataType": "DOUBLE_ARRAY",
            "engUnit": "kw",
            "data": [
                [
                    1473366334967,
                    144.33
                ],
                [
                    1473366334968,
                    244.33
                ],
                [
                    1473366334969,
                    344.33
                ]
            ],
            "queryCriteria": {
                "columns": [
                    "recorded_at",
                    "data_value"
                ],
                "table": "sensor_data",
                "conditions": [
                    {
                        "key": "asset_id",
                        "value": "${ASSET_ID}",
                        "valueType": "string",
                        "relation": " = "
                    },
                    {
                        "key": "recorded_at",
                        "value": "current_timestamp",
                        "valueType": "none",
                        "relation": " < "
                    },
                    {
                        "key": "field_id",
                        "value": "KW",
                        "valueType": "string",
                        "relation": " = "
                    }
                ]
            }
        },
        {
            "fieldId": "vibration",
            "fullyQualifiedPortName": "data.time_series.numberArray2",
            "dataType": "DOUBLE_ARRAY",
            "engUnit": "hertz",
            "data": [
                [
                    1473366334967,
                    1244.33
                ],
                [
                    1473366334968,
                    2244.33
                ],
                [
                    1473366334969,
                    3244.33
                ]
            ],
            "queryCriteria": {
                "columns": [
                    "recorded_at",
                    "data_value"
                ],
                "table": "sensor_data",
                "conditions": [
                    {
                        "key": "asset_id",
                        "value": "${ASSET_ID}",
                        "valueType": "string",
                        "relation": " = "
                    },
                    {
                        "key": "recorded_at",
                        "value": "current_timestamp",
                        "valueType": "none",
                        "relation": " < "
                    },
                    {
                        "key": "field_id",
                        "value": "vibration",
                        "valueType": "string",
                        "relation": " = "
                    }
                ]
            }
        }
    ],
    "orchestrationExecutionContext": {
        "assetId": "/assets/32-90effe42-eb21-4611-b734-83f707d89d7a",
        "orchestrationConfigurationID": "e06982f5-f446-46d2-bc21-9fdf64c111ab",
        "orchestrationExecutionRequestID": "8f990980-b208-4418-99b1-f46d26cd4530",
        "analyticId": "b9e999a1-8931-459d-8d34-2f76c2b9fd95"
    }
}

Sample AnalyticReadDataResponse with Error

{
    "field": [
        {
            "fieldId": "KW",
            "fullyQualifiedPortName": "data.time_series.numberArray1",
            "dataType": "DOUBLE_ARRAY",
            "engUnit": "kw",
            "data": [],
            "queryCriteria": {
                "columns": [
                    "recorded_at",
                    "value"
                ],
                "table": "sensor_data",
                "conditions": [
                    {
                        "key": "asset_id",
                        "value": "${ASSET_ID}",
                        "valueType": "string",
                        "relation": " = "
                    },
                    {
                        "key": "recorded_at",
                        "value": "current_timestamp",
                        "valueType": "none",
                        "relation": " < "
                    },
                    {
                        "key": "field_id",
                        "value": "KW",
                        "valueType": "string",
                        "relation": " = "
                    }
                ]
            },
            "errorResponse": {
                "code": "FIELD_EXCEPTION",
                "message": "Unable to retrieve field KW. StatementCallback; bad SQL grammar [select recorded_at, value from sensor_data where  asset_id  =  '/assets/32-3c686c25-2f57-4f13-8cf0-04bc6bb26866' and  recorded_at  <  current_timestamp and  field_id  =  'KW']; nested exception is org.postgresql.util.PSQLException: ERROR: column \"value\" does not exist\n  Position: 21",
                "parameters": []
            }
        },
        {
            "fieldId": "vibration",
            "fullyQualifiedPortName": "data.time_series.numberArray2",
            "dataType": "DOUBLE_ARRAY",
            "engUnit": "hertz",
            "data": [
                [
                    1473366334967,
                    1244.33
                ],
                [
                    1473366334968,
                    2244.33
                ],
                [
                    1473366334969,
                    3244.33
                ]
            ],
            "queryCriteria": {
                "columns": [
                    "recorded_at",
                    "data_value"
                ],
                "table": "sensor_data",
                "conditions": [
                    {
                        "key": "asset_id",
                        "value": "${ASSET_ID}",
                        "valueType": "string",
                        "relation": " = "
                    },
                    {
                        "key": "recorded_at",
                        "value": "current_timestamp",
                        "valueType": "none",
                        "relation": " < "
                    },
                    {
                        "key": "field_id",
                        "value": "vibration",
                        "valueType": "string",
                        "relation": " = "
                    }
                ]
            }
        }
    ],
    "orchestrationExecutionContext": {
        "assetId": "/assets/32-3c686c25-2f57-4f13-8cf0-04bc6bb26866",
        "orchestrationConfigurationID": "6dfae6f5-8be1-4f90-b567-0970a34f2c53",
        "orchestrationExecutionRequestID": "accab63f-da95-4ac4-8120-fcfca109011d",
        "analyticId": "b9e999a1-8931-459d-8d34-2f76c2b9fd95"
    }
}

Sample AnalyticWriteDataRequest

{
    "field": [
        {
            "fieldId": "bearing temperature",
            "fullyQualifiedPortName": "data.time_series.sum",
            "dataType": "DOUBLE_ARRAY",
            "engUnit": "Celsius",
            "data": [
                [
                    1473366334967,
                    1388.6599999999999,
                    "2"
                ],
                [
                    1473366334968,
                    2488.66,
                    "2"
                ],
                [
                    1473366334969,
                    3588.66,
                    "2"
                ]
            ],
            "queryCriteria": {
                "columns": [
                    "field_id",
                    "recorded_at",
                    "data_value"
                ],
                "table": "sensor_data"
            },
            "errorResponse": null
        }
    ],
    "customAttributes": {
        "IS_GENERIC_SCHEMA": "TRUE"
    },
    "systemAttributes": null,
    "orchestrationExecutionContext": {
        "assetId": "/assets/32-90effe42-eb21-4611-b734-83f707d89d7a",
        "orchestrationConfigurationID": "e06982f5-f446-46d2-bc21-9fdf64c111ab",
        "orchestrationExecutionRequestID": "8f990980-b208-4418-99b1-f46d26cd4530",
        "analyticId": null,
        "analyticName": null,
        "analyticVersion": null,
        "analyticExecutionRequestID": null
    },
    "dataSourceId": "Postgres Reference External Data Connector"
}

Sample AnalyticWriteDataResponse

{
    "field": [
        {
            "fieldId": "bearing temperature",
            "fullyQualifiedPortName": "data.time_series.sum",
            "dataType": "DOUBLE_ARRAY",
            "engUnit": "Celsius",
            "data": [
                [
                    1473366334967,
                    1388.6599999999999,
                    "2"
                ],
                [
                    1473366334968,
                    2488.66,
                    "2"
                ],
                [
                    1473366334969,
                    3588.66,
                    "2"
                ]
            ],
            "queryCriteria": {
                "columns": [
                    "field_id",
                    "recorded_at",
                    "data_value"
                ],
                "table": "sensor_data"
            }
        }
    ],
    "orchestrationExecutionContext": {
        "assetId": "/assets/32-90effe42-eb21-4611-b734-83f707d89d7a",
        "orchestrationConfigurationID": "e06982f5-f446-46d2-bc21-9fdf64c111ab",
        "orchestrationExecutionRequestID": "8f990980-b208-4418-99b1-f46d26cd4530"
    }
}

Sample AnalyticWriteDataResponse with Error

  "field": [
    {
      "fieldId": "bearing temperature",
      "fullyQualifiedPortName": "data.time_series.sum",
      "dataType": "DOUBLE_ARRAY",
      "engUnit": "Celsius",
      "data": [
        [
          1473366334967,
          1388.6599999999999,
          "2"
        ],
        [
          1473366334968,
          2488.66,
          "2"
        ],
        [
          1473366334969,
          3588.66,
          "2"
        ]
      ],
      "queryCriteria": {
        "columns": [
          "field_id",
          "recorded_at",
          "value"
        ],
        "table": "sensor_data"
      },
      "errorResponse": {
        "code": "FIELD_EXCEPTION",
        "message": "Unable to update field bearing temperature. StatementCallback; bad SQL grammar [ insert into sensor_data(asset_id, field_id, recorded_at, value) values( '/assets/37-a57b74c1-28e1-44a5-b59f-7456411a7ab5', 'bearing temperature', to_timestamp(1473366334967::double precision/1000), 1388.6599999999999 );  insert into sensor_data(asset_id, field_id, recorded_at, value) values( '/assets/37-a57b74c1-28e1-44a5-b59f-7456411a7ab5', 'bearing temperature', to_timestamp(1473366334968::double precision/1000), 2488.66 );  insert into sensor_data(asset_id, field_id, recorded_at, value) values( '/assets/37-a57b74c1-28e1-44a5-b59f-7456411a7ab5', 'bearing temperature', to_timestamp(1473366334969::double precision/1000), 3588.66 )]; nested exception is java.sql.BatchUpdateException: Batch entry 0  insert into sensor_data(asset_id, field_id, recorded_at, value) values( '/assets/37-a57b74c1-28e1-44a5-b59f-7456411a7ab5', 'bearing temperature', to_timestamp(1473366334967::double precision/1000), 1388.6599999999999 ) was aborted.  Call getNextException to see the cause.",
        "parameters": []
      }
    }
  ],
  "orchestrationExecutionContext": {
    "assetId": "/assets/37-a57b74c1-28e1-44a5-b59f-7456411a7ab5",
    "orchestrationConfigurationID": "eecc3b4a-274a-4fba-804a-c2d47f85e9f8",
    "orchestrationExecutionRequestID": "a418c284-774a-4080-8c59-b2afd5d0cc27"
  }
}

Data Conversion with Predix Time Series

The following table summarizes data conversion from Predix Time Series to analytic input.

Analytic Data Type | Conversion Logic
Boolean
  • null converted to null
  • 0.0 converted to false
  • Otherwise converted to true
Integer
  • The value is rounded toward zero using IEEE 754 round-toward-zero mode. If the resulting value can be represented as an integer, that value is returned. Otherwise, the value saturates to the nearest representable integer: 2147483647 on positive overflow or -2147483648 on negative overflow.
  • null to null
  • 4.2 to 4
  • 4.9 to 4
  • -16.1 to -16
  • -16.9 to -16
  • 9000000000.0 to 2147483647
  • -9000000000.0 to -2147483648
Long
  • The value is rounded toward zero using IEEE 754 round-toward-zero mode. If the resulting value can be represented as a long, that value is returned. Otherwise, the value saturates to the nearest representable long: 9223372036854775807 on positive overflow or -9223372036854775808 on negative overflow.
  • null to null
  • 4.2 to 4
  • 4.9 to 4
  • -16.1 to -16
  • -16.9 to -16
  • 9000000000.0 to 9000000000
  • -9000000000.0 to -9000000000
Float
  • The value is converted using IEEE 754 rounding rules. The conversion can lose precision as well as range.
  • null to null
  • -16.9889998989 to -16.989
Double
  • null to null
  • otherwise returns the value as is
String
  • null to null
  • otherwise returns the string representation of the value
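
The round-toward-zero and saturation behavior in the table above can be sketched as follows. This is a minimal illustration in Python; the platform performs these conversions itself, and the function names here are hypothetical.

```python
import math

# Hypothetical sketch of the Time Series -> analytic input conversions above.
INT_MIN, INT_MAX = -2**31, 2**31 - 1
LONG_MIN, LONG_MAX = -2**63, 2**63 - 1

def to_boolean(value):
    """null -> null, 0.0 -> false, anything else -> true."""
    if value is None:
        return None
    return value != 0.0

def to_fixed_width(value, lo, hi):
    """Round toward zero, then saturate to the target type's range."""
    if value is None:
        return None
    truncated = math.trunc(value)  # IEEE 754 round-toward-zero
    return max(lo, min(hi, truncated))

def to_integer(value):
    return to_fixed_width(value, INT_MIN, INT_MAX)

def to_long(value):
    return to_fixed_width(value, LONG_MIN, LONG_MAX)
```

For example, to_integer(9000000000.0) saturates to 2147483647, while to_long(9000000000.0) returns 9000000000 unchanged, matching the examples in the table.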

The following table summarizes how analytic output values (typed according to the analytic template's input/output data types) are converted to Predix Time Series data types.

Analytic Data Type | Conversion Logic
Integer, Long, Float, Double
  • null to null
  • Otherwise value is written as is.
Boolean
  • null to null
  • true to 1.0
  • false to 0.0
String
  • null to null
  • If the string is a representation of a numeric value, the value is written as is.
  • Otherwise, an exception is thrown.
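
The output-side rules can be sketched the same way. This is an illustrative helper only (the platform performs this conversion itself); note that the boolean check must come before the numeric check, since Python booleans are an integer subtype.

```python
def analytic_output_to_timeseries(value):
    """Sketch of the analytic output -> Predix Time Series conversion rules above."""
    if value is None:
        return None                      # null to null
    if isinstance(value, bool):          # check bool before int: bool is an int subtype
        return 1.0 if value else 0.0
    if isinstance(value, (int, float)):
        return value                     # numeric values are written as is
    if isinstance(value, str):
        float(value)                     # raises ValueError for non-numeric strings
        return value                     # numeric strings are written as is
    raise ValueError("unsupported analytic output type: %r" % type(value))
```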

Orchestration Configuration Service API Model Reference

The following models are used when calling the Orchestration Configuration Service.

Orchestration Configuration Entry Model

Field | Field Type | Data Type | Comments
id | System-generated | String | Generated by the Orchestration Configuration service, the ID is returned as part of the response. This field is ignored in the POST request. In the PUT request, the ID is required, and it must match the ID in the REST endpoint path.
name | Required | String | This field can contain 1-255 alphanumeric, space and '-' (dash) characters. This entry must start with a letter. The name must be unique.
author | Required | String | Free-form string up to 255 characters.
description | Optional | String | Free-form string up to 1024 characters.
createdTimestamp | System-generated | String | Marks the time the orchestration configuration entry was created. Generated by the Orchestration Configuration service, the timestamp is returned in ISO-8601 format as part of the response.
updatedTimestamp | System-generated | String | Marks the time the orchestration configuration entry was last updated. Generated by the Orchestration Configuration service, the timestamp is returned in ISO-8601 format as part of the response.

Orchestration Artifact Model

Field | Field Type | Data Type | Comments
id | System-generated | String | Generated by the Orchestration Configuration service, the ID is returned as part of the response. This field is ignored in the POST request. In the PUT request, this ID is required, and it must match the ID in the REST endpoint path.
name | Required | String | This field can contain 1-255 alphanumeric, space and '-' (dash) characters. This entry must start with a letter. The name must be unique.
filename | Automatically populated | String | The name of the attached artifact file. It is automatically populated by the Orchestration Configuration service.
type | Required | String | Type of the artifact. Must be one of the following values: bpmn, iomap.
originalBinaryFile | Required | Binary | Content of the attached artifact file.
author | Required | String | Free-form string up to 255 characters.
description | Optional | String | Free-form string up to 1024 characters.
createdTimestamp | System-generated | String | Marks the time the artifact entry was created. Generated by the Orchestration Configuration service, the timestamp is returned in ISO-8601 format as part of the response.
updatedTimestamp | System-generated | String | Marks the time the artifact entry was last updated. Generated by the Orchestration Configuration service, the timestamp is returned in ISO-8601 format as part of the response.

Orchestration Execution (v2) Model

Note: An orchestration execution request can contain only one of the following fields: assetId, assetGroup, assetDataFieldsMap.
Field | Field Type | Data Type | Comments
orchestrationConfigurationId | Required | String | ID of the orchestration configuration entry (containing the BPMN workflow XML describing the orchestration and the port-to-field maps defining the inputs and outputs for each analytic).
assetId | Optional (see note) | String | The asset ID (from an asset model stored in Predix Asset) from which the platform will retrieve the Time Series tags for data input and output during the orchestration run.
assetGroup | Optional (see note) | AssetGroup (see following table) | The definition of a group of assets for which the orchestration should be run.
assetDataFieldsMap | Optional (see note) | Map | A map that connects field IDs (found in the port-to-field maps) to Time Series tags.
modelGroupKey | Required when using trained models | String | Key used for trained model selection at runtime. This field can contain 1-255 alphanumeric, space and '-' (dash) characters. This entry must start with a letter.
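
The mutual-exclusion rule from the note above can be illustrated with a small request builder. This is a hypothetical client-side helper; only the field names come from the model.

```python
def build_execution_request(orchestration_configuration_id,
                            asset_id=None, asset_group=None,
                            asset_data_fields_map=None, model_group_key=None):
    """Build an orchestration execution (v2) request body, enforcing that
    exactly one of assetId, assetGroup, assetDataFieldsMap is supplied."""
    targets = {"assetId": asset_id,
               "assetGroup": asset_group,
               "assetDataFieldsMap": asset_data_fields_map}
    supplied = {k: v for k, v in targets.items() if v is not None}
    if len(supplied) != 1:
        raise ValueError("supply exactly one of assetId, assetGroup, assetDataFieldsMap")
    body = {"orchestrationConfigurationId": orchestration_configuration_id}
    body.update(supplied)
    if model_group_key is not None:      # only required when using trained models
        body["modelGroupKey"] = model_group_key
    return body
```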

Asset Group Model

Field | Field Type | Data Type | Comments
dataSourceId | Required | String | The source type of the asset model. Currently, only "PredixAsset" is supported.
assetSelectionFilter | Required | String | A Predix Asset query that returns a list of asset IDs. The platform runs the orchestration for each asset ID returned.

Trained Model

Field | Field Type | Data Type | Comments
id | System-generated | String | Generated by the Orchestration Configuration service, the ID is returned as part of the response. This field is ignored in the POST request. In the PUT request, this ID is required, and it must match the ID in the REST endpoint path.
fileName | Automatically populated | String | The name of the attached trained model file. It is automatically populated by the Orchestration Configuration service.
modelKey | Required | String | Key used for trained model selection at runtime. This field can contain 1-255 alphanumeric, space and '-' (dash) characters. This entry must start with a letter.
modelVersion | Required | String | Version of the model library. This field can contain 1-255 alphanumeric, space and '-' (dash) characters.
md5 | System-generated | String | md5 checksum of the model content. This field is automatically generated.
description | Optional | String | Free-form string up to 1024 characters.
createdTimestamp | System-generated | String | Marks the time the trained model entry was created. Generated by the Orchestration Configuration service.
updatedTimestamp | System-generated | String | Marks the time the trained model entry was last updated. Generated by the Orchestration Configuration service.
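
The md5 field can be reproduced client-side to verify that an upload arrived intact. A sketch using Python's standard library (the service computes and stores this value itself; the helper names are hypothetical):

```python
import hashlib

def model_content_md5(data):
    """md5 checksum of in-memory model content, as a hex string."""
    return hashlib.md5(data).hexdigest()

def model_file_md5(path, chunk_size=8192):
    """Stream a model file from disk so large models need not fit in memory."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()
```

Comparing the local result with the md5 returned in the service response confirms the stored model content matches what was uploaded.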

Job Execution Configuration Reference

Field | Mandatory | Allowed Values | Description
url | Yes | | REST API endpoint to invoke when the job execution is triggered by the job scheduler.
httpMethod | Yes | GET, POST, HEAD, OPTIONS, PUT, PATCH, DELETE, TRACE | HTTP method.
httpHeaders | No | | HTTP headers needed by the REST API endpoint.
inputData | | | Input data for the REST API request.

Job Scheduling Configuration Reference

Field | Mandatory | Allowed Values | Allowed Special Characters | Notes
seconds | Yes | 0-59 | , - * / |
minutes | Yes | 0-59 | , - * / |
hours | Yes | 0-23 | , - * / |
dayOfMonth | Yes | 1-31 | , - * ? / L W | The special character '?' cannot be used in both the dayOfMonth and dayOfWeek fields at the same time. Example: to schedule a job for the 15th day of the month, set dayOfMonth to 15 and dayOfWeek to '?'.
months | Yes | 1-12 or JAN-DEC | , - * / | The allowed abbreviated month names are JAN, FEB, MAR, APR, MAY, JUN, JUL, AUG, SEP, OCT, NOV and DEC.
dayOfWeek | Yes | 1-7 or SUN-SAT | , - * ? / L # | The allowed abbreviated day names are SUN, MON, TUE, WED, THU, FRI and SAT. Example: to schedule a job for the 15th day of the month, set dayOfMonth to 15 and dayOfWeek to '?'.
years | No (Default: empty) | empty, 1970-2099 | , - * / |
timeZoneId | No (Default: UTC) | Java Time Zone ID | |
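
Taken together, the fields above form a Quartz-style cron expression, in the order seconds, minutes, hours, dayOfMonth, months, dayOfWeek, and an optional years field. The table's "15th day of the month" example could be assembled like this (the 06:00:00 time of day is an arbitrary illustration):

```python
# Assemble the fields from the table above into a Quartz-style cron expression.
# Run at 06:00:00 on the 15th of every month.
fields = [
    ("seconds", "0"),
    ("minutes", "0"),
    ("hours", "6"),
    ("dayOfMonth", "15"),
    ("months", "*"),
    ("dayOfWeek", "?"),   # '?' because dayOfMonth already pins the day
    ("years", ""),        # optional; empty by default
]
expression = " ".join(value for _, value in fields if value != "")
# expression is "0 0 6 15 * ?"
```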
Table 1. Special Characters Summary
Character | Description
',' | Used to specify a list of values. For example, setting the months field to 'JAN,MAY,SEP' means the months of January, May and September.
'-' | Used to specify a range of values. For example, setting the months field to 'JUN-AUG' means the months of June, July and August.
'*' | Used to specify "all values". For example, '*' in the seconds field means every second.
'?' | Used to specify "no specific value".
'L' | In dayOfMonth, specifies the last day of the month: for example, the 31st in January, or the 28th in February in a non-leap year. In dayOfWeek, 'L' by itself specifies '7' or 'SAT'. 'L' can be combined with another value, for example '3L', to designate "the last Tuesday of the month" ('3' specifies 'TUE').
'W' | Special character for dayOfMonth. Used to specify the weekday nearest to the given day. For example, '20W' specifies "the nearest weekday to the 20th day of the month". A weekday is Monday to Friday.
'#' | Special character for dayOfWeek. Used with a number before and a number after: 'm#n', where 'm' specifies the day of the week and 'n' specifies the nth occurrence in the month. '3#2' specifies "the second Tuesday of the month" ('3' = Tuesday, '#2' = the second one in the month).
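
The '#' and 'L' day-of-week tokens can be resolved against a real calendar as follows. This is an illustrative sketch (the scheduler does this internally); it assumes the Quartz numbering of 1=SUN through 7=SAT from the table above.

```python
import calendar

def weekdays_in_month(year, month, quartz_day):
    """All days of the month falling on the given Quartz day of week (1=SUN..7=SAT)."""
    python_weekday = (quartz_day + 5) % 7   # convert to Python's 0=MON..6=SUN
    last_day = calendar.monthrange(year, month)[1]
    return [d for d in range(1, last_day + 1)
            if calendar.weekday(year, month, d) == python_weekday]

def resolve_hash(year, month, quartz_day, n):
    """'m#n': the nth occurrence of weekday m in the month, e.g. '3#2'."""
    return weekdays_in_month(year, month, quartz_day)[n - 1]

def resolve_last(year, month, quartz_day):
    """'mL': the last occurrence of weekday m in the month, e.g. '3L'."""
    return weekdays_in_month(year, month, quartz_day)[-1]
```

For September 2016, '3#2' (the second Tuesday) resolves to the 13th, and '3L' (the last Tuesday) to the 27th.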

For more information about these special characters, see the CronTrigger Tutorial in the Quartz documentation at http://www.quartz-scheduler.org/documentation/quartz-2.x/tutorials/crontrigger.

For more information about available values for the Java time zone ID, see the Java documentation at http://docs.oracle.com/javase/7/docs/api/java/util/TimeZone.html.