Steps to upload and deploy a sample Python analytic with alerts on the Spark runtime.
Before You Begin
This procedure assumes that the following prerequisite tasks have been completed.
Procedure
- Add the alert template
- Select Alert Templates.
- Select + to add a new template.
- Enter Analytics in the Alert Template field.
- Select Save.
- Upload the analytic template to the catalog and configure the following information for the analytic:
- In the Analytic Template, configure the input definition, constant, and output definition through a CSV upload, and then select Save.
- Enter Analytics in the Output Events field and select Save.
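The input definition, constant, and output definition are supplied as a single CSV upload. The exact column layout depends on the product version and is not shown in this procedure; a hypothetical layout, assuming one temperature input tag, one constant, and one alert output event, might look like:

```
FIELD_NAME,FIELD_TYPE,DATA_TYPE
OO_TAG_Temperature_ID14,INPUT,DOUBLE_ARRAY
temperature_limit,CONSTANT,DOUBLE
Analytics,OUTPUT,DOUBLE_ARRAY
```

The input field name here matches the tag mapped later in the I/O Mapping step, and the output field name matches the Analytics alert template created above.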
- Add and configure the deployment as follows:
- Enter deployment_analytics_alert in the Deployment Name field and then select Submit.
- In the 1. Asset Selection step, select the asset defined in the analytic, and then select Save.
- Select Next to access the 2. I/O Mapping step.
- Select the Tag drop-down menu, and then select Add Tags....
- In the tag browser, search for the tag defined in the analytic. For the sample, search for OO_TAG_Temperature_ID14. When the search displays the tag, drag and drop it onto the input to map it.
- Select Save and Next to save the I/O Mapping configuration.
- In the 3. Schedule step, leave the selection at Only Once for the Define how often a new run will be executed option.
- Select a Time Span from May 23, 2017 to the current date.
- Leave the Sample Interval at the default value of 1 Minute.
- Select Save and then select Deploy.
The deployment is saved to the Spark runtime. After successful deployment, the status updates to Run Once.
- Claim the analytic alert in the Alerts module.
- Filter the alerts by Time Received.
Search for the alert dated Oct 06, 2017. The alert date is hardcoded in the sample analytic.
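The sample analytic raises its alert with a hardcoded timestamp, which is why the Alerts module is filtered to Oct 06, 2017. The platform's actual analytic contract is not shown in this procedure; the following is a hypothetical sketch of what such an entry point could look like (the function name `run`, the input tag name, the constant, and the output shape are all assumptions, not the product's documented API):

```python
import json
from datetime import datetime, timezone

# Hypothetical hardcoded alert date: the sample analytic pins its alert
# to Oct 06, 2017, so the Alerts module must be filtered to that date.
ALERT_DATE = datetime(2017, 10, 6, tzinfo=timezone.utc)
TEMPERATURE_LIMIT = 100.0  # assumed constant from the CSV constant definition

def run(data: dict) -> dict:
    """Hypothetical analytic entry point: check the mapped input tag
    and emit an alert event for each reading that exceeds the limit."""
    readings = data.get("OO_TAG_Temperature_ID14", [])
    alerts = []
    for value in readings:
        if value > TEMPERATURE_LIMIT:
            alerts.append({
                "name": "Analytics",  # matches the alert template name
                "timestamp": ALERT_DATE.isoformat(),
                "value": value,
            })
    return {"alerts": alerts}

if __name__ == "__main__":
    # Example invocation with one reading over the assumed limit.
    print(json.dumps(run({"OO_TAG_Temperature_ID14": [98.6, 120.4]})))
```

Because the date is fixed in the code rather than taken from the run time, redeploying the analytic on a later date still produces alerts stamped Oct 06, 2017.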