Troubleshoot Time Series
Troubleshoot Time Series Queries
Error Message When Querying the Time Series Service Instance
When you submit a query using raw query mode or another aggregation, you receive an error message in this format:
"<tag_name> [0] exceeded the maximum number of data points that could be retrieved (<count of data points found> > <query maximum limit>). Reduce the time window and try again."
Cause
The number of data points retrieved by a query cannot exceed the maximum limit of 500,000.
Solution
Narrow your query criteria (for example, the time window) so that the query returns fewer data points.
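For instance, a raw query over a long window of one-second data can easily exceed 500,000 points; splitting it into shorter windows keeps each response under the limit. The body below is an illustrative sketch of a narrowed query with a capped point count; treat the exact field names (start, end, tags, name, limit) as assumptions to confirm against the service's query documentation.

```json
{
  "start": 1577836800000,
  "end": 1577840400000,
  "tags": [
    {
      "name": "Compressor-2017.discharge_pressure",
      "limit": 1000
    }
  ]
}
```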
401 Unauthorized Message Received When Making a Query Request
Cause
If the client cannot be authenticated, an HTTP response code 401: Unauthorized is returned.
- The <Predix-Zone-Id> tag you are using is incorrect.
- The token is invalid or expired.
- The client ID is incorrect.
- The client is missing the OAuth2 scopes and authorities.
Solution
Check the environment variables for your application:
cf env <application_name>
Ensure that:
- You are using the correct <Predix-Zone-Id> for your service instance.
- You are using a valid token.
- You are using the correct client ID.
- You added the OAuth2 scopes and authorities to the client.
See Updating the OAuth2 Client Using UAAC for more information.
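One quick way to rule out the expired-token cause above is to decode the token's payload and inspect its exp claim. The sketch below does this without signature verification, purely for debugging; the token string, the crude claim extraction, and the class name are illustrative assumptions, not part of the Time Series API.

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class TokenCheck {
    // A JWT is three base64url segments separated by dots; the payload is the
    // second segment. No signature check here -- this is debugging only.
    static String decodePayload(String jwt) {
        String[] parts = jwt.split("\\.");
        return new String(Base64.getUrlDecoder().decode(parts[1]), StandardCharsets.UTF_8);
    }

    // Crude "exp" claim extraction; a real client would use a JSON parser.
    static boolean isExpired(String jwt, long nowEpochSeconds) {
        String payload = decodePayload(jwt);
        int i = payload.indexOf("\"exp\":");
        if (i < 0) return false; // no exp claim: treat as non-expiring here
        int start = i + 6;
        int end = start;
        while (end < payload.length() && Character.isDigit(payload.charAt(end))) end++;
        long exp = Long.parseLong(payload.substring(start, end));
        return exp < nowEpochSeconds;
    }

    public static void main(String[] args) {
        // Fabricated example token whose payload decodes to {"exp":0}.
        String token = "hdr.eyJleHAiOjB9.sig";
        System.out.println(decodePayload(token));
        System.out.println(isExpired(token, System.currentTimeMillis() / 1000));
    }
}
```

If exp (seconds since the UNIX epoch) is in the past, request a fresh token from UAA before retrying the query.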
503 Acknowledgement Message
Cause
There could be an issue with the environment.
Solution
File a support ticket. See tss-troubleshoot.html#task_f2d2f6a2-a469-4a10-bb5e-cecbdeb704a2.
Unable to Query the Data Points That Were Ingested
Cause
- The <Predix-Zone-Id> used in the ingestion request is different from the <Predix-Zone-Id> used in the query request.
- The application bound to the time series instance used for data ingestion is not the same as the one bound to the instance used for querying.
- Invalid characters are used in the tag name, attribute names, or attribute values.
- The data does not comply with the required structure.
Solution
- Verify that the <Predix-Zone-Id> you are using in your query request is the same as the one used in the ingestion request.
- Verify that the application bound to the time series service instance for querying is the same as the one used for data ingestion.
- Verify that you are using the correct tag structure and valid characters. See Time Series Tag Structure in tss-using-service.html#concept_dc613f2c-bb63-4287-9c95-8aaf2c1ca6f7.
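A pre-flight check on tag and attribute names can catch the invalid-character cause before ingestion. The allowed character set used below (alphanumerics plus periods, slashes, dashes, and underscores) is an assumption for illustration; confirm the exact rules in Time Series Tag Structure.

```java
import java.util.regex.Pattern;

public class TagNameCheck {
    // ASSUMPTION: allowed characters are alphanumerics plus . / - _
    // Verify against the Time Series Tag Structure documentation.
    private static final Pattern VALID = Pattern.compile("[A-Za-z0-9./_-]+");

    static boolean isValidName(String name) {
        return name != null && !name.isEmpty() && VALID.matcher(name).matches();
    }

    public static void main(String[] args) {
        System.out.println(isValidName("Compressor-2017.discharge_pressure")); // valid
        System.out.println(isValidName("bad name with spaces"));               // invalid
    }
}
```

Running the same check over attribute names and values before sending the ingestion message avoids silently unqueryable data points.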
Unable to Query Some of the Data Points That Were Ingested
Description
After sending an ingestion request, you are unable to query some of the data points.
Cause
- Network delays
- Missing data points
Solution
- Check the list of possible causes in Unable to Query the Data Points That Were Ingested.
- File a support ticket. See tss-troubleshoot.html#task_f2d2f6a2-a469-4a10-bb5e-cecbdeb704a2.
Query Returns No Data Points
Description
When you submit a query, you receive a 200 response, indicating the query was successful, but no data points are returned.
Cause
This can happen if the Timestamp in your time series tag is not in milliseconds.
Solution
Use only UNIX epoch time in milliseconds in the time series tag Timestamp.
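A seconds-resolution timestamp is still valid JSON, so ingestion succeeds, but the points land near 1970 and a query over a recent window returns nothing. A minimal sketch of guarding against this (the 13-digit heuristic and method names are illustrative assumptions):

```java
import java.time.Instant;

public class TimestampCheck {
    static long toEpochMillis(long epochSeconds) {
        return epochSeconds * 1000L;
    }

    // Heuristic: epoch-millisecond values for recent dates are >= 10^12
    // (13 digits), while epoch-second values are 10 digits.
    static boolean looksLikeSeconds(long ts) {
        return ts < 1_000_000_000_000L;
    }

    public static void main(String[] args) {
        long seconds = Instant.parse("2020-01-01T00:00:00Z").getEpochSecond();
        System.out.println(looksLikeSeconds(seconds)); // true -> convert first
        System.out.println(toEpochMillis(seconds));    // 1577836800000
    }
}
```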
Troubleshoot Time Series Data Ingestion
The following are some issues you may encounter when sending ingestion requests to the Time Series service.
400 Acknowledgement Message
You receive a 400 acknowledgement message when making an ingestion request.
Cause
- The data points you are attempting to ingest do not conform to the message envelope JSON.
- You are using the Java WebSocket package with the sendText method.
- The write buffer for your WebSocket client is not set to 512KB.
Solution
If you receive a 400 error, check the following:
- If the message ID is returned with the 400 error, check the message and make sure it is well-formed JSON.
- Verify that you are not chunking your ingestion request. Each message must conform to the JSON format specified.
- If you are using the Java WebSocket package (javax.websocket), use the sendBinary method instead of sendText.
- If you are using a WebSocket package in any language other than Java, verify that the write buffer for your WebSocket client is set to 512KB (524,288 bytes).
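The buffer and chunking checks above amount to keeping each complete message within 512KB (524,288 bytes) measured in UTF-8 bytes, not characters. A minimal pre-send sketch (the class and method names are illustrative):

```java
import java.nio.charset.StandardCharsets;

public class MessageSizeCheck {
    static final int MAX_BYTES = 512 * 1024; // 524,288 bytes

    // Count UTF-8 bytes rather than String length: multi-byte characters
    // make the wire size larger than the character count.
    static boolean fitsInOneMessage(String json) {
        return json.getBytes(StandardCharsets.UTF_8).length <= MAX_BYTES;
    }

    public static void main(String[] args) {
        System.out.println(fitsInOneMessage("{\"messageId\":1}")); // true
    }
}
```

With javax.websocket, the same constant can size the client buffer, for example via WebSocketContainer.setDefaultMaxBinaryMessageBufferSize(MAX_BYTES) before opening the session.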
401 Acknowledgement Message
If the client cannot be authenticated, an HTTP response code 401: Unauthorized is returned.
Cause
When you receive a 401: Unauthorized acknowledgement message when making a data ingestion request, it may be due to one of the following reasons:
- The <Predix-Zone-Id> you are using is incorrect.
- The token is invalid or expired.
- The client ID is incorrect.
- The client is missing the OAuth2 scopes and authorities.
- If you are using the Time Series client library on Linux, the Java program may not be picking up the environment variables you set for the client secret.
Solution
If you receive a 401: Unauthorized error, try the following:
- Check the environment variables for your application:
cf env <application_name>
- Verify that you are using the correct <Predix-Zone-Id> for your service instance.
- Verify that you are using a valid token.
- Verify that you are using the correct client ID.
- Verify that you added the OAuth2 scopes and authorities to the client.
- If you are using the Time Series client library on Linux, make sure the Java program is picking up the environment variables you set for the client secret. Run the following command:
System.out.println(System.getenv())
This prints all of the environment variables Java has access to. If the environment variable is not there, do one of the following:
  - Make sure it is set correctly, either by exporting it prior to runtime or setting it in ~/.bashrc.
  - Set the environment variable in the run configuration of the IDE.
See Updating the OAuth2 Client Using UAAC for more information.
413 Acknowledgement Message
You receive a 413 acknowledgement message when making a data ingestion request.
Cause
The ingestion request payload exceeds the maximum limit of 512KB. Note that even though GZIP-compressed JSON payloads are supported, the decompressed payload must still be less than 512KB.
Solution
Resubmit the ingestion request with a request payload that is less than 512KB.
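Since the 512KB limit applies to the decompressed payload, it helps to check the raw JSON size before compressing rather than relying on the compressed size. A sketch of that order of operations using the JDK's GZIP streams (class and method names are illustrative):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class GzipPayload {
    static final int MAX_UNCOMPRESSED = 512 * 1024; // limit is on decompressed bytes

    static byte[] gzip(String json) {
        byte[] raw = json.getBytes(StandardCharsets.UTF_8);
        if (raw.length > MAX_UNCOMPRESSED) {
            // Rejecting here avoids a 413 after the server decompresses the payload.
            throw new IllegalArgumentException("decompressed payload exceeds 512KB");
        }
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        try (GZIPOutputStream gz = new GZIPOutputStream(out)) {
            gz.write(raw);
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
        return out.toByteArray();
    }

    static String gunzip(byte[] compressed) {
        try (GZIPInputStream gz = new GZIPInputStream(new ByteArrayInputStream(compressed))) {
            return new String(gz.readAllBytes(), StandardCharsets.UTF_8);
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    public static void main(String[] args) {
        String payload = "{\"messageId\":1,\"body\":[]}";
        byte[] compressed = gzip(payload);
        System.out.println(gunzip(compressed).equals(payload)); // round-trips intact
    }
}
```

If the raw JSON exceeds the limit, split the data points across multiple ingestion messages instead of relying on compression.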