Orchestration Execution

Deploying an Orchestration Workflow File

Add the orchestration workflow file to the database by deploying it to the runtime engine. Do this once before running the first orchestration execution that uses the workflow file, and again whenever you update the BPMN workflow file.

Run the following REST API call using the orchestrationConfigurationId (see Uploading an Orchestration Configuration Entry).
POST <execution_uri>/api/v2/execution/orchestrations/{orchestrationConfigurationId}/deployment
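
The call can be issued from any HTTP client. The following minimal sketch uses Python and the requests library; the execution URI, bearer token, and header values are placeholders that you obtain as described in Configuring REST Request Headers.

import requests

EXECUTION_URI = "https://<execution_uri>"                  # placeholder
ORCHESTRATION_CONFIGURATION_ID = "a7d1c09d-000c-4017-a073-8a3df9058ffc"

headers = {
    "Authorization": "Bearer <token>",                     # placeholder (UAA token)
    "Predix-Zone-Id": "<analytics-zone-id>",               # placeholder
}

# Deploy the BPMN workflow file associated with the orchestration configuration.
response = requests.post(
    f"{EXECUTION_URI}/api/v2/execution/orchestrations/"
    f"{ORCHESTRATION_CONFIGURATION_ID}/deployment",
    headers=headers,
)
response.raise_for_status()
print(response.status_code)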

Passing Custom Attributes During an Orchestration

When running an orchestration request, you can pass custom attributes as follows.

Custom attributes are used for the following situations.

  • Customizing query criteria attributes in the port-to-field map.
  • Configuring dynamic tags in the port-to-field map.
  • Passing values to custom data connectors.

The following is a sample of the structure required to add custom attributes (customAttributes) in an orchestration run request for a single asset.

{
    "orchestrationConfigurationId": "a7d1c09d-000c-4017-a073-8a3df9058ffc",
    "assetId": null,
    "assetDataFieldsMap": null,
    "assetGroup": {
        "dataSourceId": "PredixAsset",
        "assetSelectionFilter": "/assets?filter=classification=/classifications/turbine:name=15sl-b3d857c0-bb3d-4b82-b61c-199b75c6f1ef<turbine_type"
    },
    "modelGroupKey": null,
    "dataSource": [
 
    ],
    "customAttributes": {
        "KW.START_TIME": 1455733669601,
        "KW.END_TIME": 1455733669610,
        "vibration.START_TIME": 1455733669601,
        "vibration.END_TIME": 1455733669610,
        "ATTRIBUTE_1_KEY": "test_attr_key1",
        "ATTRIBUTE_2_KEY": "test_attr_key2",
        "KW.ATTRIBUTE_1_VALUE_1": "test-val1",
        "KW.ATTRIBUTE_1_VALUE_2": "test-val2",
        "vibration.ATTRIBUTE_1_VALUE_1": "test-val1",
        "vibration.ATTRIBUTE_1_VALUE_2": "test-val2",
        "AGGREGATION_TYPE":"interpolation",
        "INTERVAL":"1hr", 
        "MESSAGE_ID": "test-message-id-1",
        "bearing temperature final.ATTRIBUTE_1_VALUE_1":"test_write_val1",
        "bearing temperature final.ATTRIBUTE_1_VALUE_2":"test_write_val2",
 
        "QUERY_CRITERIA1": {
            "start": "${START_TIME}",
            "end": "${END_TIME}",
            "tags": [
                {
                    "limit": "${LIMIT}",
                    "order": "${ORDER}"
                }
            ]
        },
        "START_TIME": 1455733669601,
        "END_TIME": 1455733669610,
        "LIMIT": 1000,
        "ORDER": "desc",
        "QUERY_CRITERIA2": {
            "messageId": "${MESSAGE_ID}",
            "body": [
                {
                    "name": null,
                    "attributes": {
                        "${ATTRIBUTE_2_KEY}": "${ASSET_ID}_${FIELD_ID}_${double array.ATTRIBUTE_2_VALUE}",
                        "${ATTRIBUTE_1_KEY}": "${ASSET_ID}_${FIELD_ID}_${double array.ATTRIBUTE_1_VALUE}"
                    }
                }
            ]
        },
        "double array.ATTRIBUTE_2_VALUE": "test_write_attr_val2",
        "double array.ATTRIBUTE_1_VALUE": "test_write_attr_val1"
    }
}

A constant input value can be parametrized and passed in at runtime through the customAttributes field of the OrchestrationExecutionRequest. The following sample uses customAttributes to pass in the value for a CONSTANT input port, setting the value at runtime for port "abc".
{
    "orchestrationConfigurationId": "a7d1c09d-000c-4017-a073-8a3df9058ffc",
    "assetId": null,
    "assetDataFieldsMap": null,
     "assetGroup": {
        "dataSourceId": "PredixAsset",
        "assetSelectionFilter": "/assets?filter=classification=/classifications/turbine:name=15sl-46606c64-619d-4db0-a059-bc2d879640ca<turbine_type"
    },
    "customAttributes": {   
        "CONSTANT_VALUE": "testvalue-for-constant-inputport-abc"    }
}

Note the following requirements when passing custom attributes.

  • Variable placeholders are case sensitive and must be enclosed by ${..}.
  • Variables belong to the namespace in which they are defined. Supported namespaces are: "asset", "context", "custom". At runtime:
    • The context namespace is for system-defined variables. For example, the analytic id for the current orchestration step will replace ${context.ANALYTIC_ID} or ${context.analyticId}. This applies to all the pre-defined system attributes.
    • The asset namespace is for the asset attributes from the assetGroupQuery result. For example, the sourceKey attribute in the response from assetGroupQuery will replace ${asset.sourceKey}.
    • The custom namespace is for the customAttributes defined in the orchestration execution request. For example, the value of the custom attribute MESSAGE_ID in the request will replace ${custom.MESSAGE_ID}.
  • Variable names can contain alphanumeric characters, dots (.), underscores (_), and spaces ( ).
  • If a placeholder does not have a valid namespace, its value is searched across all the supported namespaces in the following order: custom, context, asset.
  • Query criteria must be valid JSON.
  • If the pre-defined system attributes are used in the query, the system will substitute values automatically.
  • Numeric placeholders (for example, ${LIMIT}) must be double quoted. The system automatically substitutes the placeholder with a numeric value.
  • Only string and numeric values can be substituted. Boolean values (true and false) must be double-quoted as strings.
  • All variables must be specified in the customAttributes field. Any unsubstituted variable will cause the orchestration to fail.
  • Dynamic query criteria (queryCriteria) can have embedded parametrized system or custom attributes.
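
The following sketch (Python) is purely illustrative and is not the runtime engine's implementation; it only shows the effect of the lookup rules above: a namespaced placeholder resolves against its named map, and a placeholder without a namespace is searched in custom, then context, then asset.

import re

# Purely illustrative lookup (not the actual engine): resolves ${...}
# placeholders against the custom, context, and asset namespaces.
def resolve(template, custom, context, asset):
    namespaces = {"custom": custom, "context": context, "asset": asset}
    search_order = [custom, context, asset]   # order used when no namespace prefix is given

    def lookup(name):
        prefix = name.split(".", 1)[0]
        if "." in name and prefix in namespaces:
            return namespaces[prefix].get(name.split(".", 1)[1])
        for ns_map in search_order:           # e.g. "KW.START_TIME" is looked up as a whole key
            if name in ns_map:
                return ns_map[name]
        return None

    def substitute(match):
        value = lookup(match.group(1))
        if value is None:                     # unsubstituted variables fail the orchestration
            raise ValueError("unresolved placeholder: " + match.group(0))
        return str(value)                     # simplification: quoted numerics stay quoted here

    return re.sub(r"\$\{([^}]+)\}", substitute, template)

print(resolve('{"limit": "${LIMIT}", "messageId": "${custom.MESSAGE_ID}"}',
              custom={"LIMIT": 1000, "MESSAGE_ID": "test-message-id-1"},
              context={}, asset={}))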

Running an Orchestration with One Analytic

The following example walks you through the process to run an orchestration with one analytic.

Before You Begin

  • The REST headers for making requests to Analytics services must be properly configured. See Configuring REST Request Headers.
  • The analytic must be deployed to Predix platform (you can use a sample analytic). You will need the following information to complete this task.
    • analytic catalog entry ID
    • analytic name
    • analytic version

Procedure

  1. Download the sample BPMN workflow (OrchestrationWithOneAnalytic.xml) for the Activiti workflow engine from https://github.com/PredixDev/predix-analytics-sample/tree/master/orchestrations.

    Alternatively, copy and save the following sample BPMN workflow file for the Activiti workflow engine.

    <?xml version="1.0" encoding="UTF-8"?>
    <definitions xmlns="http://www.omg.org/spec/BPMN/20100524/MODEL"
                 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
                 expressionLanguage="http://www.w3.org/1999/XPath" id="sid-81430087-7a44-4be3-8517-914faf923256"
                 targetNamespace="DSP-PM" typeLanguage="http://www.w3.org/2001/XMLSchema"
                 xsi:schemaLocation="http://www.omg.org/spec/BPMN/20100524/MODEL http://www.omg.org/spec/BPMN/2.0/20100501/BPMN20.xsd"
                 xmlns:activiti="http://activiti.org/bpmn">
    
        <process id="OrchestrationWithOneAnalytic" isExecutable="true">
    
            <startEvent id="sid-start-event"
                        name="">
                <outgoing>sid-flow1</outgoing>
            </startEvent>
    
            <serviceTask completionQuantity="1"
                         id="sid-10001"
                         isForCompensation="false"
                         name="<Analytic Catalog Entry Id>::<Analytic Name>::<Analytic Version>"
                         startQuantity="1"
                         activiti:delegateExpression="${javaDelegate}"
                         xmlns:activiti="http://activiti.org/bpmn">
                <incoming>sid-flow1</incoming>
                <outgoing>sid-flow2</outgoing>
            </serviceTask>
    
            <endEvent id="sid-end-event" name="">
                <incoming>sid-flow2</incoming>
            </endEvent>
    
    
            <sequenceFlow id="sid-flow1"
                          name="" sourceRef="sid-start-event"
                          targetRef="sid-10001"/>
    
            <sequenceFlow id="sid-flow2"
                          name="" sourceRef="sid-10001" targetRef="sid-end-event"/>
    
        </process>
    
    </definitions>
    
  2. Edit the saved sample BPMN workflow file and replace <Analytic Catalog Entry Id>, <Analytic Name>, and <Analytic Version> with your analytic's catalog entry ID, name, and version.
  3. Prepare the input for the demo analytic and save as a file.
    [
      {
        "analyticStepId": "sid-10001",
        "data": "{ \"number1\" : 5, \"number2\" : 24 }"
      }
    ]
    
    • It is an array of input data for the orchestration.
    • Each element of the analyticInputData array represents the input data for the corresponding serviceTask in the BPMN workflow. In the BPMN workflow XML, the serviceTask is identified by /definition/process/serviceTask/@id. This serviceTask id is correlated with analyticInputData.analyticStepId in the orchestration execution request. The input data for the serviceTask is specified with analyticInputData.data.
    • Note that if the input data is in JSON format, it must be properly escaped, as in the sketch below.
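
    The following is a minimal sketch (Python) of building and saving this input file. The file name analytic-input.json is an assumption, reused in the next step.

    import json

    analytic_input = {"number1": 5, "number2": 24}
    analytic_input_data = [
        {
            "analyticStepId": "sid-10001",          # serviceTask id from the BPMN workflow
            "data": json.dumps(analytic_input),     # inner JSON escaped as a string
        }
    ]

    with open("analytic-input.json", "w") as f:     # assumed file name
        json.dump(analytic_input_data, f, indent=2)
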
  4. Send an orchestration execution request by issuing the following REST API request.
    POST  <execution_uri>/api/v2/execution/testrun

    The following is a sample orchestration execution request. The request is a multipart/form-data type with the following parts.

    Name (Case Sensitive)    Value                                    Type
    bpmn                     BPMN workflow XML file from Step 2       file
    input                    Analytic input data file from Step 3     file
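
    The request can be issued with any HTTP client that supports multipart uploads. The following minimal sketch uses Python and the requests library; the token and header values are placeholders from Configuring REST Request Headers, and the input file name is the one assumed in Step 3.

    import requests

    headers = {
        "Authorization": "Bearer <token>",          # placeholder
        "Predix-Zone-Id": "<analytics-zone-id>",    # placeholder
    }

    files = {
        "bpmn": open("OrchestrationWithOneAnalytic.xml", "rb"),   # workflow file from Step 2
        "input": open("analytic-input.json", "rb"),               # input file from Step 3 (assumed name)
    }

    # Submit the test run as multipart/form-data.
    response = requests.post(
        "https://<execution_uri>/api/v2/execution/testrun",
        headers=headers,
        files=files,
    )
    print(response.json())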

    The following is a sample orchestration execution response.

    {
        "orchestrationStatus": "COMPLETED",
        "analyticOutputData": [
            {
                "analyticStepId": "sid-10001",
                "data": "{\"result\":29}"
            }
        ],
        "contextID": "Execution of Orchestration with One Analytic_777536",
        "output": null,
        "name": "Orchestration with One Analytic"
    } 

Running an Orchestration Using Predix Time Series Tags

To run an orchestration using Predix Time Series tag ids, proceed as follows. The orchestration will communicate with the Predix Time Series service to retrieve and store data for use by the referenced analytics.

Before You Begin

  • All referenced analytics hosted by the Analytics Catalog must be deployed to Cloud Foundry and running.
  • All referenced analytics must be written to accept input and provide output in the expected format.
  • All referenced analytics must have their analytic templates uploaded to the Analytics Catalog.

Procedure

  1. Create an Orchestration Configuration and make a note of its id (orchestrationConfigurationId).
  2. Associate each field in the orchestration configuration's port-to-field map with a tag name in the Time Series service. This information must be collected into a map similar to the following example.
    "assetDataFieldsMap": {
             "temperatureSensor2": "temperatureSensor2Tag",
             "temperatureSensor1": "temperatureSensor1Tag",
             "temperatureSensorSum": "temperatureSensorSumTag"
     }

    The example contains the fieldIds of three fields: two input fields (temperatureSensor1 and temperatureSensor2) and one output field (temperatureSensorSum), along with their associated tags in the Time Series service.

  3. Create the orchestration request body containing the orchestrationConfigurationId from Step 1 and the Predix Time Series tag map from Step 2.

    The following is a sample orchestration execution request.

    {
         "orchestrationConfigurationId": "cdf096be-a590-4d0e-a5a9-aaf9d8f3f6db",
         "assetDataFieldsMap": {
             "temperatureSensor2": "temperatureSensor2Tag",
             "temperatureSensor1": "temperatureSensor1Tag",
             "temperatureSensorSum": "temperatureSensorSumTag"
         },
         "modelGroupKey": "example-model-key"
     }
    Note: "modelGroupKey" is only required when an orchestration contains a trained analytic.
  4. Optional: If you are passing custom attributes, the orchestration execution request with a "customAttributes" definition will be similar to the following.
    {
      "orchestrationConfigurationId": "cdf096be-a590-4d0e-a5a9-aaf9d8f3f6db",
      "assetDataFieldsMap": {
        "temperatureSensor2": "temperatureSensor2Tag",
        "temperatureSensor1": "temperatureSensor1Tag",
        "temperatureSensorSum": "temperatureSensorSumTag"
      },
      "modelGroupKey": "example-model-key",
      "dataSource": [],
      "customAttributes": {
        "KW.START_TIME": 1455733669601,
        "KW.END_TIME": 1455733669610,
        "vibration.START_TIME": 1455733669601,
        "vibration.END_TIME": 1455733669610,
        "ATTRIBUTE_1_KEY": "test_attr_key1",
        "ATTRIBUTE_2_KEY": "test_attr_key2",
        "KW.ATTRIBUTE_1_VALUE_1": "test-val1",
        "KW.ATTRIBUTE_1_VALUE_2": "test-val2",
        "vibration.ATTRIBUTE_1_VALUE_1": "test-val1",
        "vibration.ATTRIBUTE_1_VALUE_2": "test-val2",
        "AGGREGATION_TYPE": "interpolation",
        "INTERVAL": "1hr",
        "MESSAGE_ID": "test-message-id-1",
        "bearing temperature final.ATTRIBUTE_1_VALUE_1": "test_write_val1",
        "bearing temperature final.ATTRIBUTE_1_VALUE_2": "test_write_val2",
        "QUERY_CRITERIA1": {
          "start": "${START_TIME}",
          "end": "${END_TIME}",
          "tags": [
            {
              "limit": "${LIMIT}",
              "order": "${ORDER}"
            }
          ]
        },
        "START_TIME": 1455733669601,
        "END_TIME": 1455733669610,
        "LIMIT": 1000,
        "ORDER": "desc",
        "QUERY_CRITERIA2": {
          "messageId": "${MESSAGE_ID}",
          "body": [
            {
              "name": null,
              "attributes": {
                "${ATTRIBUTE_2_KEY}": "${ASSET_ID}_${FIELD_ID}_${double array.ATTRIBUTE_2_VALUE}",
                "${ATTRIBUTE_1_KEY}": "${ASSET_ID}_${FIELD_ID}_${double array.ATTRIBUTE_1_VALUE}"
              }
            }
          ]
        },
        "double array.ATTRIBUTE_2_VALUE": "test_write_attr_val2",
        "double array.ATTRIBUTE_1_VALUE": "test_write_attr_val1"
      }
    }
  5. Run the orchestration using Predix Time Series tags by issuing the following REST API request.
    POST <execution_uri>/api/v2/execution/async
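
    The following minimal sketch (Python, requests) issues this call with the request body from Step 3; the execution URI, token, and header values are placeholders from Configuring REST Request Headers.

    import requests

    request_body = {
        "orchestrationConfigurationId": "cdf096be-a590-4d0e-a5a9-aaf9d8f3f6db",
        "assetDataFieldsMap": {
            "temperatureSensor2": "temperatureSensor2Tag",
            "temperatureSensor1": "temperatureSensor1Tag",
            "temperatureSensorSum": "temperatureSensorSumTag",
        },
        "modelGroupKey": "example-model-key",
    }

    headers = {
        "Authorization": "Bearer <token>",          # placeholder
        "Predix-Zone-Id": "<analytics-zone-id>",    # placeholder
    }

    # Submit the asynchronous orchestration execution request.
    response = requests.post(
        "https://<execution_uri>/api/v2/execution/async",
        headers=headers,
        json=request_body,
    )
    print(response.json())   # contains status PROCESSING and an orchestrationRequestId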

Results

The following is a sample response.

{
  "status": "PROCESSING",
  "orchestrationExecutionStatus": [
    {
      "status": "PROCESSING",
      "contextId": "f55ac5ba-40a3-4999-8f61-8bf3ef45ece3-null",
      "assetId": null
      "errorResponse": null,
      "orchestrationStepStatus": [],
      "startTime": null,
      "endTime": null
    }
  ],
  "orchestrationRequestId": "f55ac5ba-40a3-4999-8f61-8bf3ef45ece3",
  "errorResponse": null
}

Running an Orchestration Using Predix Time Series with an Array of Tags

To run an orchestration using Predix Time Series array tag ids, proceed as follows. The orchestration will communicate with the Predix Time Series service to retrieve and store data for use by the referenced analytics.

Before You Begin

  • All referenced analytics hosted by the Analytics Catalog must be deployed to Cloud Foundry and running.
  • All referenced analytics must be written to accept input and provide output in the expected format.
  • All referenced analytics must have their analytic templates uploaded to the Analytics Catalog.

About This Task

Note: You must use a hierarchical analytic template type when running an orchestration with Predix Time Series array data.

Procedure

  1. Create an orchestration configuration entry and make a note of its id (orchestrationConfigurationId).
  2. Associate each field in the orchestration configuration's port-to-field map with a tag name in the Time Series service. This information must be collected into a map similar to the following example. An array of tags is passed for each field id, as shown.
    "assetDataFieldsMap": {
        "doubleArrayFieldId":  ["timerseries_tag1" , "timerseries_tag2", "timerseries_tag3" ] ,
        "integerArrayFieldId":  ["timerseries_tag4" , "timerseries_tag5" ]
      }
  3. Create the orchestration request body containing the orchestrationConfigurationId from Step 1 and the Predix Time Series tag map from Step 2.

    The following is a sample orchestration execution request.

    {
      "orchestrationConfigurationId": "3a5cd1da-5b6e-47fd-b74d-9e75e959810f",
      "assetDataFieldsMap": {
        "doubleArrayFieldId": [
          "timerseries_tag1",
          "timerseries_tag2",
          "timerseries_tag3"
        ],
        "integerArrayFieldId": [
          "timerseries_tag4",
          "timerseries_tag5"
        ]
      },
      "dataSource": []
    }
  4. Run the orchestration using Predix Time Series tags by issuing the following REST API request.
    POST <execution_uri>/api/v2/execution/async

Running an Orchestration for a Single Asset

To run an orchestration for a single asset, proceed as follows. The orchestration will automatically communicate with both the Predix Asset service to retrieve tags, and the Time Series service to retrieve and store data corresponding to these tags.

Before You Begin

Before executing an orchestration:
  • All referenced analytics hosted by the Analytics Catalog must be deployed to Cloud Foundry and running.
  • All referenced analytics must be written to accept input and provide output in the expected format.
  • All referenced analytics must have their analytic templates uploaded to the Analytics Catalog.

Procedure

  1. Create an Orchestration Configuration and make a note of its id (orchestrationConfigurationId).
  2. Create the orchestration request body containing the orchestrationConfigurationId from Step 1 and the asset id for which to run the orchestration.

    The following is a sample orchestration execution request.

    {
         "orchestrationConfigurationId": "ca9a61ec-5bd0-43ff-b53f-e7f98a001cc0",
         "assetId": "/assets/32-aa17df93-1915-4da2-804a-6378cee1301f",
         "modelGroupKey": "example-model-key"
     }
    Note: "modelGroupKey" is only required when an orchestration contains a trained analytic.
  3. Optional: If you are passing custom attributes, the orchestration execution request with a "customAttributes" definition will be similar to the following.
    {
      "orchestrationConfigurationId": "ca9a61ec-5bd0-43ff-b53f-e7f98a001cc0",
      "assetId": "/assets/32-aa17df93-1915-4da2-804a-6378cee1301f",
      "modelGroupKey": "example-model-key",
      "dataSource": [],
      "customAttributes": {
        "KW.START_TIME": 1455733669601,
        "KW.END_TIME": 1455733669610,
        "vibration.START_TIME": 1455733669601,
        "vibration.END_TIME": 1455733669610,
        "ATTRIBUTE_1_KEY": "test_attr_key1",
        "ATTRIBUTE_2_KEY": "test_attr_key2",
        "KW.ATTRIBUTE_1_VALUE_1": "test-val1",
        "KW.ATTRIBUTE_1_VALUE_2": "test-val2",
        "vibration.ATTRIBUTE_1_VALUE_1": "test-val1",
        "vibration.ATTRIBUTE_1_VALUE_2": "test-val2",
        "AGGREGATION_TYPE": "interpolation",
        "INTERVAL": "1hr",
        "MESSAGE_ID": "test-message-id-1",
        "bearing temperature final.ATTRIBUTE_1_VALUE_1": "test_write_val1",
        "bearing temperature final.ATTRIBUTE_1_VALUE_2": "test_write_val2",
        "QUERY_CRITERIA1": {
          "start": "${START_TIME}",
          "end": "${END_TIME}",
          "tags": [
            {
              "limit": "${LIMIT}",
              "order": "${ORDER}"
            }
          ]
        },
        "START_TIME": 1455733669601,
        "END_TIME": 1455733669610,
        "LIMIT": 1000,
        "ORDER": "desc",
        "QUERY_CRITERIA2": {
          "messageId": "${MESSAGE_ID}",
          "body": [
            {
              "name": null,
              "attributes": {
                "${ATTRIBUTE_2_KEY}": "${ASSET_ID}_${FIELD_ID}_${double array.ATTRIBUTE_2_VALUE}",
                "${ATTRIBUTE_1_KEY}": "${ASSET_ID}_${FIELD_ID}_${double array.ATTRIBUTE_1_VALUE}"
              }
            }
          ]
        },
        "double array.ATTRIBUTE_2_VALUE": "test_write_attr_val2",
        "double array.ATTRIBUTE_1_VALUE": "test_write_attr_val1"
      }
    }
  4. Run the orchestration for a single asset by issuing the following REST API request.
    POST <execution_uri>/api/v2/execution/async

Results

The following is a sample response.

{
  "status": "PROCESSING",
  "orchestrationExecutionStatus": [
    {
      "status": "PROCESSING",
      "contextId": "f55ac5ba-40a3-4999-8f61-8bf3ef45ece3-/assets/32",
      "assetId": "/assets/32 ",
      "errorResponse": null,
      "orchestrationStepStatus": [],
      "startTime": null,
      "endTime": null
    }
  ],
  "orchestrationRequestId": "f55ac5ba-40a3-4999-8f61-8bf3ef45ece3",
  "errorResponse": null
}

Running an Orchestration for an Asset Group

To run an orchestration for an asset group, proceed as follows. The orchestration will automatically communicate with both the Predix Asset service to retrieve tags, and the Time Series service to retrieve and store data corresponding to these tags.

Before You Begin

Before executing an orchestration:
  • All referenced analytics hosted by the Analytics Catalog must be deployed to Cloud Foundry and running.
  • All referenced analytics must be written to accept input and provide output in the expected format.
  • All referenced analytics must have their analytic templates uploaded to the Analytics Catalog.

Procedure

  1. Create an Orchestration Configuration and make a note of its id (orchestrationConfigurationId).
  2. Create the orchestration request body containing the orchestrationConfigurationId from Step 1 and an asset group query.

    The following is a sample orchestration execution request.

    {
         "orchestrationConfigurationId": "ca9a61ec-5bd0-43ff-b53f-e7f98a001cc0",
         "assetGroup": {
             "dataSourceId": "PredixAsset",
             "assetSelectionFilter": "/assets?filter=classification=/classifications/turbine:name=1.5sl<turbine_type"
         },
         "modelGroupKey": "example-model-key"
     }
    Note: "modelGroupKey" is only required when an orchestration contains a trained analytic.
  3. Optional: See Passing Custom Attributes During an Orchestration for an example of the orchestration execution request with "customAttributes" definition.
  4. Run the orchestration for an asset group by issuing the following REST API Request.
    POST <execution_uri>/api/v2/execution/async

Results

The following is a sample response.

{
  "status": "PROCESSING",
  "orchestrationExecutionStatus": [
    {
      "status": "PROCESSING",
      "contextId": "f55ac5ba-40a3-4999-8f61-8bf3ef45ece3-/assets/32",
      "assetId": "/assets/32 ",
      "errorResponse": null,
      "orchestrationStepStatus": [],
      "startTime": null,
      "endTime": null
    },
    {
      "status": "PROCESSING",
      "contextId": "f55ac5ba-40a3-4999-8f61-8bf3ef45ece3-/assets/37",
      "assetId": "/assets/37 ",
      "errorResponse": null,
      "orchestrationStepStatus": [],
      "startTime": null,
      "endTime": null
    }
  ],
  "orchestrationRequestId": "f55ac5ba-40a3-4999-8f61-8bf3ef45ece3",
  "errorResponse": null
} 

Running an Orchestration Using an External Data Source

To run an orchestration for an analytic using an external data source, proceed as follows. The orchestration will automatically communicate with your Custom Data Connector to retrieve and store data from an external source.

Before You Begin

Before executing an orchestration:

  • All referenced analytics hosted by the Analytics Catalog must be deployed to Cloud Foundry and running.
  • All referenced analytics must be written to accept input and provide output in the expected format.
  • All referenced analytics must have their analytic templates uploaded to the Analytics Catalog.
  • Your Custom Data Connector service must have been developed and deployed to Predix cloud.

Procedure

  1. Create an Orchestration Configuration and make a note of its id (orchestrationConfigurationId).
  2. Create the orchestration request body containing the orchestrationConfigurationId from Step 1.
  3. Run the orchestration by issuing the following REST API request.
    POST <execution_uri>/api/v2/execution/async

    The following is a sample orchestration execution request.

    
    {
      "orchestrationConfigurationId": "ca9a61ec-5bd0-43ff-b53f-e7f98a001cc0",
      "assetGroup": {
        "dataSourceId": "PredixAsset",
        "assetSelectionFilter": "/assets?filter=classification=/classifications/turbine:name=1.5sl<turbine_type"
      },
      "modelGroupKey": "example-model-key",
      "dataSource": [
        {
          "dataSourceId": "pg_colo_123",
          "baseUri": "https://data_connector_fqdn",
          "apiVersion": "v1"
        }
      ],
      "customAttributes": {
        "key1": "value1",
        "key2": "value2"
      }
    }
    
    Note: "modelGroupKey" is only required when an orchestration contains a trained analytic.

Results

The following is a sample response.

{
  "status": "PROCESSING",
  "orchestrationExecutionStatus": [
    {
      "status": "PROCESSING",
      "contextId": "f55ac5ba-40a3-4999-8f61-8bf3ef45ece3-/assets/32",
      "assetId": "/assets/32 ",
      "errorResponse": null,
      "orchestrationStepStatus": [],
      "startTime": null,
      "endTime": null
    },
    {
      "status": "PROCESSING",
      "contextId": "f55ac5ba-40a3-4999-8f61-8bf3ef45ece3-/assets/37",
      "assetId": "/assets/37 ",
      "errorResponse": null,
      "orchestrationStepStatus": [],
      "startTime": null,
      "endTime": null
    }
  ],
  "orchestrationRequestId": "f55ac5ba-40a3-4999-8f61-8bf3ef45ece3",
  "errorResponse": null
}