Time Series Debugging

When running applications or monitoring normalization, you might encounter errors in time series data. You can debug these errors using the procedures described in this section.

Check if a series has been normalized

Follow these steps to check whether a series has been normalized correctly:

Example:

JavaScript
 //Find the id
 var id = PhysicalMeasurementSeries.normalizedTimeseriesKey('YOUR_MS_ID', 'TS_FIELD_NAME');

 //Retrieve the normalized time series object
 var object = NormalizedTimeseriesPersister.getId(id);

 //Find and retrieve in one API call
 NormalizedTimeseriesPersister.getId(PhysicalMeasurementSeries.normalizedTimeseriesKey('YOUR_MS_ID', 'TS_FIELD_NAME'));

Check the time range of normalized data

Inspect the earliest and latest fields on the series header to find the total range of normalized data.
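As a plain-JavaScript illustration of this check (the header object and its field values here are hypothetical; on the platform, earliest and latest are read from the normalized series header itself):

```javascript
// Hypothetical series header; on the platform, earliest and latest
// are fields on the normalized series header.
var header = {
  earliest: '2010-01-01T00:00:00Z',
  latest: '2011-01-01T00:00:00Z'
};

// Total range of normalized data, in days
var rangeMs = new Date(header.latest) - new Date(header.earliest);
var rangeDays = rangeMs / (24 * 60 * 60 * 1000);
console.log(rangeDays); // 365
```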

Querying normalized data

Execute a query using tsEval on the time series header. See the Evaluating Time Series page for more information on tsEval commands.

Example:

JavaScript
PhysicalMeasurementSeries.tsEval({
  projection:"sum(normalized.data.quantity)",
  start:"2010-01-01", end:"2011-01-01",
  interval: "MONTH", filter:"id == 'YOUR_MS_ID'"
});
  • If the normalized values are not what you expect, check the AggOp applied on the series header, which determines how the series is normalized.

See Time Series Data Treatments for more information.

  • If your data is not normalized even after creating Types, check whether the time series field is marked with the @ts annotation.

Note the @ts annotation on the usage field below; it is required for normalization to occur on this field.

Type
@db(datastore='kv')
entity type UtilityBill mixes IntervalDataPoint<UtilityBillSeries> schema name 'structure_UtilityBill' {
  /**
   * Time series field to track actual usage
   */
  @ts(treatment='integral', unitPath='parent.currencyUnit')
  usage : !double
}
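To see why the AggOp matters, the following plain-JavaScript sketch collapses a few raw readings into one normalized point with two different operations. The values and the two-operation choice are illustrative only; this is not the platform's normalization code.

```javascript
// Raw readings that fall inside one normalized interval (illustrative)
var raw = [10, 20, 30];

// The AggOp on the series header determines how these collapse into
// a single normalized value; SUM and AVG give different results.
var sum = raw.reduce(function (a, b) { return a + b; }, 0); // 60
var avg = sum / raw.length;                                 // 20
```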

Limits on normalization

Total number of normalized data points

By default, normalization accumulates at most 20 years of 15-minute intervals, that is, 4 * 24 * 365 * 20 = 700,800 data points.

If the number of normalized points exceeds 700,800, an exception is thrown, indicating a potential flaw in the shape of the data.
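The limit arithmetic can be checked directly in plain JavaScript (exceedsLimit is an illustrative helper, not a platform API):

```javascript
// 15-minute intervals: 4 per hour, 24 hours per day, 365 days, 20 years
var maxPoints = 4 * 24 * 365 * 20;
console.log(maxPoints); // 700800

// Illustrative check mirroring the exception condition described above
function exceedsLimit(numPoints) {
  return numPoints > maxPoints;
}
```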

Total number of lock attempts

Before normalization begins, a DbLock is acquired on the series header (the same series cannot be normalized concurrently within the cluster). Currently, the normalization engine attempts to acquire the lock 10 times, with a 100 ms delay between attempts, after which it fails with a Could not acquire lock exception.

Although an exception is thrown, failed jobs can be recovered from the queue, and the normalization engine can then attempt to acquire the lock again.
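The retry policy can be sketched in plain JavaScript. Here tryAcquire is a stand-in for the real DbLock acquisition, and the 100 ms wait between attempts is omitted for brevity; this is a sketch of the described behavior, not the engine's implementation.

```javascript
// Sketch of the retry policy: up to maxAttempts tries before failing.
// tryAcquire is a stand-in for the actual DbLock acquisition.
function acquireWithRetry(tryAcquire, maxAttempts) {
  for (var i = 1; i <= maxAttempts; i++) {
    if (tryAcquire()) return i; // attempt number that succeeded
  }
  throw new Error('Could not acquire lock');
}

// Example: a lock that frees up on the third attempt
var calls = 0;
var attemptNumber = acquireWithRetry(function () { return ++calls >= 3; }, 10);
console.log(attemptNumber); // 3
```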

Data types

The value types that can be used as time series fields are limited to:

See also
