Manage Deployment Schedules

About Analytics Deployment Schedules

Use the Schedule step to create a schedule for running a deployed analytic, or to run it on demand.

You can run the analytic on demand, set the analytic to run continuously, define an end date and time, or stop the run manually. You can select the following units of time and frequency for a recurrent schedule:
Table 1. Execution Time Options
Seconds: Select a numerical value from 1 to 59. Applicable to:
  • Repeats Every
  • Sample Duration (Batch Size)
  • Sampling Interval
Minutes: Select a numerical value from 1 to 59. Applicable to:
  • Repeats Every
  • Sample Duration (Batch Size)
  • Sampling Interval
Hours: Select a numerical value from 1 to 23. Applicable to:
  • Repeats Every
  • Offset
  • Sample Duration (Batch Size)
Days: Select one day. Applicable to:
  • Offset
  • Sample Duration (Batch Size)
Samples: Select a numerical value for the number of samples before the offset. Applicable to:
  • Sample Duration (Batch Size)
For each data request, you must define how much data will be pulled and the interval between data points.
Table 2. Data Requests
Offset: The amount of time in the past to begin the data pull.
Sample Duration (Batch Size): The amount of data pulled before the Offset.
Time Span: The time span for which the data is pulled.
Sampling Interval: The interval between data points.
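
Taken together, these settings define a window of historical data for each run. The sketch below is only an illustration of that arithmetic (the function names are hypothetical, not part of the product): the window ends Offset before the run time, extends Sample Duration further back, and is sampled at the Sampling Interval.

```python
from datetime import datetime, timedelta

def data_request_window(run_time, offset, sample_duration):
    """Window of data pulled for one run: ends `offset` before the
    run time and spans `sample_duration` back from that point."""
    end = run_time - offset
    start = end - sample_duration
    return start, end

def sample_points(start, end, sampling_interval):
    """Timestamps at each Sampling Interval within the window."""
    points = []
    t = start
    while t <= end:
        points.append(t)
        t += sampling_interval
    return points

# Example: 1-hour offset, 30-minute batch size, 10-minute sampling interval.
run_time = datetime(2024, 1, 1, 12, 0)
start, end = data_request_window(run_time, timedelta(hours=1), timedelta(minutes=30))
points = sample_points(start, end, timedelta(minutes=10))
```

In this example the window runs from 10:30 to 11:00 and yields four data points.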

Schedule a Recurrent Analytic

You can set an analytic to run continuously and define how often it runs, how far back in time data is requested, how large each sample is, and how often the data is sampled.

Before You Begin

Make sure that you have selected the applicable assets and mapped the input and output tags for deployment.

Procedure

  1. In the module navigation menu, navigate to Analytics > Deployments.
  2. In the Search pane, select an existing deployment to access its workspace.
  3. In the page heading, select Edit to access the 1. Asset Selection section.
  4. Select the 3. Schedule tab.
  5. In the SCHEDULE subsection, select Recurrent.
  6. Enter a value in the Repeats Every box, and select a time unit from the drop-down list box.
  7. In the DATA REQUEST subsection, enter a value in the Offset Before Schedule box, and then select a time unit from the drop-down list box. The permitted time units for the offset are hours, days, months, and years.
    The offset determines how long ago the data request should begin.
    Note: Entering 0 in the Offset box equates to the present day.
  8. Enter a value in the Sample Duration (Batch Size) box, and then select a time unit from the drop-down list box.
    The sample duration determines how much data is pulled.
  9. Enter a value in the Sampling Interval box, and then select a time unit from the drop-down list box.
    The sampling interval determines the interval between data points.
  10. Select Save to save the scheduling configuration.
  11. Select Deploy.
    Deploy is not enabled unless all conditions are met.
    Tip: Do not select Deploy until you have reviewed the deployment configuration. If you modify the schedule, you must redeploy the analytic to activate these changes.
    The analytic is deployed to the respective runtime environment. After deployment, the analytic runs when scheduled.
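
As a rough mental model of the recurrent schedule above (the names here are hypothetical, not the product's internal API), each run fires at the Repeats Every interval, and each run requests a data window that ends Offset before its own run time:

```python
from datetime import datetime, timedelta

def recurrent_runs(first_run, repeats_every, count):
    """Times at which a recurrent analytic fires."""
    return [first_run + i * repeats_every for i in range(count)]

def window_for_run(run_time, offset, sample_duration):
    """Data window for one run: ends `offset` before the run and
    spans `sample_duration` back from there."""
    end = run_time - offset
    return end - sample_duration, end

# Example: runs every 15 minutes, each pulling a 15-minute batch
# of data from one hour in the past.
runs = recurrent_runs(datetime(2024, 1, 1, 0, 0), timedelta(minutes=15), 3)
windows = [window_for_run(r, timedelta(hours=1), timedelta(minutes=15)) for r in runs]
```

Because the window is anchored to each run time, a recurrent schedule effectively slides the same-sized batch forward on every execution.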

What To Do Next

Review the deployment.

Schedule an Analytic On Demand

You can schedule an analytic to run on demand and configure how the data is sampled.

Before You Begin

Make sure that you have selected the applicable assets and mapped the input and output tags for deployment.

Procedure

  1. In the module navigation menu, navigate to Analytics > Deployments.
  2. In the Search pane, select an existing deployment to access its workspace.
  3. In the page heading, select Edit to access the 1. Asset Selection section.
  4. Select the 3. Schedule tab.
  5. In the SCHEDULE subsection, select Only Once.
    The Time Span date picker appears under the DATA REQUEST subsection. The default time span is one day.
  6. Select the start and end of the time span in the Time Span date picker.
  7. Enter a value in the Priority box.
  8. Select Save to save the scheduling configuration.
  9. Select Deploy.
    Deploy is not enabled unless all conditions are met.
    Tip: Do not select Deploy until you have reviewed the deployment configuration. If you modify the schedule, you must redeploy the analytic to activate these changes.
    The analytic runs in the background.
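
Unlike a recurrent run, a run-once deployment pulls data for a fixed Time Span rather than a rolling window. As a sketch (illustrative only, with hypothetical names), the number of data points follows directly from the span and the sampling interval:

```python
from datetime import datetime, timedelta

def points_in_span(start, end, sampling_interval):
    """Timestamps sampled across a fixed Time Span, inclusive of both ends."""
    points = []
    t = start
    while t <= end:
        points.append(t)
        t += sampling_interval
    return points

# The default time span is one day; sample once per hour.
start = datetime(2024, 1, 1)
end = start + timedelta(days=1)
points = points_in_span(start, end, timedelta(hours=1))
```

A one-day span sampled hourly yields 25 points, because both endpoints are included.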

What To Do Next

Review the deployment.

Schedule a Streaming Analytic

Depending on your tenant configuration, you can deploy and run analytics for continuous streaming.

Before You Begin

This procedure assumes that the following prerequisite tasks have been completed.

  • You have uploaded the assets, tags, and time series data required for this analytic.
  • Your tenant has been configured to stream to Predix Time Series.
  • You have the required access permissions to manage analytics on the Spark runtime for the OPM tenant.
  • You have added and configured a streaming analytic in the analytic catalog.
  • You have created a deployment for that streaming analytic.
  • You have applied the deployment to the applicable assets through asset selection.
  • You have mapped tags to the inputs, constants, and outputs before deploying the analytic job to the Spark runtime environment.

About This Task

The streaming schedule is only available for tenants that are configured to support big data streaming through the Spark runtime cluster environment. Depending on the tenant configuration, this option may not be available. Use the Schedule section to deploy and run your forecasting and streaming analytic jobs. You can run the deployed job on demand or through continuous streaming. Once started, the streaming job runs continuously.

Once you have selected the tags to map, you can continue to the 4. Review tab, or save and close. Navigating by selecting outside the deployment screen saves and closes the deployment. Select the Deployment Name link to continue your deployment.

Procedure

  1. In the module navigation menu, navigate to Analytics > Deployments.
  2. In the Search pane, select an existing deployment to access its workspace.
  3. In the page heading, select Edit to access the 1. Asset Selection section.
  4. Select the 3. Schedule tab.
  5. In the SCHEDULE subsection, select Streaming.
  6. Enter a value in the Sampling Interval box, and then select Seconds or Minutes from the drop-down list box. For example, to fetch one data point for every minute within the specified date and time range, enter 1, and then select Minutes.
  7. Select Save to save the scheduling configuration.
  8. Select Deploy.
    Deploy is not enabled unless all conditions are met.
  9. Alternatively, select Close to save and close the deployment without deploying.

Results

After deployment, the analytic is initiated in the Spark runtime environment and runs continuously until it is manually stopped.
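
Conceptually, a streaming deployment behaves like an unbounded loop that evaluates the analytic once per sampling interval until the job is stopped. The sketch below is hypothetical, not product code; it caps the simulation at a fixed number of ticks only so the example terminates.

```python
from datetime import datetime, timedelta

def stream_ticks(start, sampling_interval, stop_after):
    """Simulate the tick times of a streaming job.

    A real streaming job runs until it is manually stopped; here the
    loop is capped at `stop_after` ticks for illustration.
    """
    ticks = []
    t = start
    for _ in range(stop_after):
        ticks.append(t)            # the analytic would evaluate on each tick
        t += sampling_interval
    return ticks

# Example: a 1-minute sampling interval, simulated for five ticks.
ticks = stream_ticks(datetime(2024, 1, 1, 0, 0), timedelta(minutes=1), 5)
```
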

Schedule an Orchestration

You can schedule an orchestration to run at regular intervals, or run the orchestration on demand.

About This Task

As with any analytic deployment, you can use the 3. Schedule section of the deployment to run orchestrations.

Note: If you have configured the Data Points value for a particular analytic step and/or any of its iterations, those data point values override any data-related settings when scheduling a recurrent orchestration.
The following orchestration scheduling enhancements are available:
  • Data request parameters can be set on a per-step basis. Parameters such as Start Time and End Time (for Run Once), Offset Before Schedule, and Sample Duration, which are set at the orchestration level, apply to all steps by default. However, you can override these parameters for individual steps.
  • In addition to the Historian mode, the following modes are available in the Time Series data:
    • Math
    • Sample and Hold
  • The runtime is supported for Predix Insights only.
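
The per-step override behavior described above can be sketched as a simple dictionary merge (the parameter names here are hypothetical; the product applies this resolution internally): orchestration-level data request parameters apply to every step unless the step supplies its own value.

```python
def effective_params(orchestration_params, step_overrides):
    """Resolve data request parameters for one orchestration step.

    Orchestration-level values apply by default; any value set at the
    step level overrides the orchestration-level one.
    """
    return {**orchestration_params, **step_overrides}

orchestration = {"offset_hours": 2, "sample_duration_minutes": 30}
step = {"sample_duration_minutes": 10}   # this step overrides only the batch size
params = effective_params(orchestration, step)
```
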
You can run the orchestration on demand, set the orchestration to run continuously, and define an end date and time. You can select the following units of time and frequency for a recurrent schedule:
Table 3. Execution Time Options
Seconds: Select a numerical value from 1 to 59. Applies to:
  • Repeats Every
  • Sample Duration (Batch Size)
  • Sampling Interval
Minutes: Select a numerical value from 1 to 59. Applies to:
  • Repeats Every
  • Sample Duration (Batch Size)
  • Sampling Interval
Hours: Select a numerical value from 1 to 24. Applies to:
  • Repeats Every
  • Offset
  • Sample Duration (Batch Size)
Days: Select one or more days of the week. Applies to:
  • Offset
  • Sample Duration (Batch Size)
Months: Select a numerical value from 1 to 12. Applies to:
  • Offset
Years: Select a numerical value. Applies to:
  • Offset
Samples: Select a numerical value for the number of samples before the offset. Applies to:
  • Sample Duration (Batch Size)
For each data request, you must define how much data will be pulled and the interval between data points.
Table 4. Data Requests
Offset: The amount of time in the past to begin the data pull.
Sample Duration (Batch Size): The amount of data sampled before the Offset.
Time Span: The time span for which the data is sampled.
Sampling Interval: The interval between data points.

Schedule an Analytic to Trigger on an Event

Subject to your tenant configuration, you can trigger an analytic to run when a specific alert event happens.

Before You Begin

This procedure assumes that the following prerequisite tasks have been completed.

  • You have uploaded the assets, tags, and time series data required for this analytic.
  • You have alert templates defined for your tenant.
  • Your tenant has been configured to enable the PowerDataFabric runtime.
  • You have the required access permissions to manage analytics.
  • You have added and configured an analytic template for the PowerDataFabric runtime in the analytic catalog.
  • You have created a deployment for that analytic template.
  • You have applied the deployment to the applicable assets through asset selection.
  • You have mapped tags to the inputs, constants, and outputs before deploying the analytic job.

About This Task

The triggering schedule is only available for tenants that are configured to support event triggers through the PowerDataFabric runtime environment. Depending on the tenant configuration, this option may not be available. Use the Schedule section to deploy and trigger analytic jobs. You can run the deployed job on demand or through an event trigger. Once started, the job is triggered only when one or more of the configured alert events occur.

Once you have selected the tags to map, you can continue to the 4. Review section, or save and close. Navigating by selecting outside the deployment screen saves and closes the deployment.

Procedure

  1. In the module navigation menu, navigate to Analytics > Deployments.
  2. In the Search pane, select an existing deployment to access its workspace.
  3. In the page heading, select Edit to access the 1. Asset Selection section.
  4. Select the 3. Schedule tab.
  5. In the SCHEDULE subsection, select Triggered.
  6. In the Alert Templates subsection, add alert templates.
    Important: Do not add alert templates that are configured in the analytic template used in the current deployment. Doing so causes a circular dependency, and the job will fail.
    1. In the TEMPLATE column, add the alert template.
    2. In the EXECUTION DELAY column, add the time delay needed to trigger the job to run.
    3. In the START OFFSET column, add the time offset before the job starts.
    4. In the END OFFSET column, add the time offset after the job ends.
  7. In the DATA REQUEST subsection, add the following:
    Offset Before Schedule: Enter a value in the box, and then select a time unit from the drop-down list box. The permitted time units for the offset are hours, days, months, and years.
    Sample Duration (Batch Size): Enter the amount of data to be sampled before the Offset.
    The offset determines how long ago the data request should begin.
    Note: Entering 0 in the Offset box equates to the present day.
  8. Enter a value in the Sampling Interval box, and then select Seconds or Minutes from the drop-down list box. For example, to fetch one data point for every minute within the specified date and time range, enter 1, and then select Minutes.
  9. Select Save to save the scheduling configuration.
  10. Select Deploy.
    The Deploy button is not enabled unless all conditions are met.
  11. Alternatively, select Close to save and close the deployment without deploying.

Results

After deployment, the analytic is initiated in the PowerDataFabric runtime environment whenever the configured alert events happen.
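
The trigger settings above can be read as arithmetic on the alert event time. The sketch below shows one plausible reading (the function and field names are hypothetical, taken from the columns in step 6; the product's exact semantics may differ): the job fires Execution Delay after the event, and requests data from Start Offset before the event to End Offset after it.

```python
from datetime import datetime, timedelta

def triggered_job(event_time, execution_delay, start_offset, end_offset):
    """One plausible reading of the trigger settings:

    - the job runs `execution_delay` after the alert event, and
    - requests data from `start_offset` before the event to
      `end_offset` after it.
    """
    run_time = event_time + execution_delay
    window = (event_time - start_offset, event_time + end_offset)
    return run_time, window

# Example: an alert at 12:00 with a 5-minute execution delay,
# a 10-minute start offset, and a 2-minute end offset.
run_time, window = triggered_job(
    datetime(2024, 1, 1, 12, 0),
    execution_delay=timedelta(minutes=5),
    start_offset=timedelta(minutes=10),
    end_offset=timedelta(minutes=2),
)
```
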