Nexalis Macros

Nexalis Cloud provides built-in functions for advanced time-series data processing. These functions simplify complex operations and ensure accurate calculations.

@nexalis/scale

Applies scaling transformations to GTS values using multiplier and adder attributes. This is useful for unit conversions, calibration adjustments, or normalization.

Use Cases

The primary purpose of @nexalis/scale is to transform raw industrial values into standardized units defined by the Nexalis data model. Values are stored “raw” (as sent from the data sources), and the multiplier and adder attributes bring them into the correct units. Example: A site sends active power measurements in Watts (W), but the Nexalis data model specifies kilowatts (kW). The GTS would have:
  • multiplier: 0.001 (converts W → kW)
  • adder: 0
When you fetch this data and apply @nexalis/scale, a raw value of 5000 W becomes 5 kW.

Common Transformations

  1. Power Units: W → kW (multiplier: 0.001, adder: 0)
  2. Temperature Scales: Celsius → Fahrenheit (multiplier: 1.8, adder: 32)
  3. Sensor Offsets: Apply zero-point corrections using adder

Attributes Used

The function reads these attributes from each GTS:
  • multiplier: Multiplies each value (e.g., for unit conversion)
  • adder: Adds to each value (e.g., for offset correction)
If multiplier or adder is missing or null, the corresponding operation is skipped; if both are missing, the values are returned unchanged (raw).

Formula

For each data point:
scaled_value = original_value × multiplier + adder
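As a concrete illustration, here is a minimal Python sketch of this rule (not the actual @nexalis/scale implementation; the function name is hypothetical, and the skip-when-missing behavior follows the attribute description above):

```python
def scale_values(values, multiplier=None, adder=None):
    """Apply scaled = value * multiplier + adder to each value.

    A missing (None) multiplier or adder is skipped, mirroring the
    attribute handling described above.
    """
    scaled = []
    for v in values:
        if multiplier is not None:
            v = v * multiplier
        if adder is not None:
            v = v + adder
        scaled.append(v)
    return scaled

# W -> kW (multiplier 0.001, adder 0): 5000 W becomes 5 kW
print(scale_values([5000, 12500], multiplier=0.001, adder=0))  # [5.0, 12.5]
# Celsius -> Fahrenheit (multiplier 1.8, adder 32)
print(scale_values([0, 100], multiplier=1.8, adder=32))        # [32.0, 212.0]
```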

Parameters

Parameter   Type        Description
DATA        LIST<GTS>   List of GTS to apply scaling transformations to

Returns

A list of GTS with scaled values. The original multiplier and adder attributes remain in the GTS metadata.

Example: Power Standardization

// Fetch raw active power data (stored in Watts from SCADA)
{
  'token' $read_token
  'class' 'nx.value'
  'labels' { 'assetType' 'INV' 'dataObject' 'TotW' }
  'start' '2026-01-15T00:00:00Z'    // the trailing "Z" means "UTC timezone"
  'end' '2026-01-18T23:59:59Z' 
} FETCH

// Apply scaling to convert W → kW per Nexalis data model
// GTS has attributes: multiplier=0.001, adder=0

@nexalis/scale
⚠️ Warning: This function is not idempotent. If you call it twice, it will scale the data twice.
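To make the pitfall concrete, this hypothetical Python snippet shows what double application does with the W → kW example (the `scale` helper is illustrative, not the macro itself):

```python
def scale(values, multiplier, adder):
    # scaled = value * multiplier + adder, applied to every point
    return [v * multiplier + adder for v in values]

once = scale([5000.0], 0.001, 0)   # [5.0] -> correct kW value
twice = scale(once, 0.001, 0)      # ~0.005 -> scaled a second time, wrong
print(once, twice)
```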

@nexalis/fetch_trapezoidal_averages

Fetches time-series data from Nexalis Cloud and computes time-weighted (trapezoidal) averages over fixed-width time buckets. This is ideal for accurate aggregation of non-uniformly sampled data. By default, this macro calls the @nexalis/scale macro, so the returned values are scaled.

Why Trapezoidal Averages?

Unlike simple arithmetic means, trapezoidal averaging accounts for the time duration between data points, providing more accurate averages when:
  • Data points are irregularly spaced
  • Sampling rates vary over time
  • You need true time-weighted calculations
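The idea can be sketched in Python for a single bucket, assuming linear interpolation between samples (an illustration of the trapezoid rule, not the macro's actual code):

```python
def trapezoidal_mean(timestamps, values):
    """Time-weighted mean of a piecewise-linear signal.

    timestamps must be sorted ascending; any consistent time unit works.
    """
    if len(values) < 2:
        return values[0] if values else float("nan")
    area = 0.0
    for i in range(1, len(timestamps)):
        dt = timestamps[i] - timestamps[i - 1]
        area += 0.5 * (values[i] + values[i - 1]) * dt  # trapezoid area
    return area / (timestamps[-1] - timestamps[0])

# Irregular sampling: the signal sits at 10.0 for 9 s, then jumps toward 20.0.
ts, vs = [0, 9, 10], [10.0, 10.0, 20.0]
print(trapezoidal_mean(ts, vs))  # 10.5 (time-weighted)
print(sum(vs) / len(vs))         # 13.33... (naive mean, overweights the spike)
```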

Parameters

Parameter     Type             Required   Description
read_token    STRING           Yes        Warp 10 read token with fetch permissions
start         STRING or LONG   Yes        Start timestamp (ISO 8601 string or microseconds)
end           STRING or LONG   Yes        End timestamp (ISO 8601 string or microseconds)
bucket_size   LONG             Yes        Bucket width in minutes
labels        MAP              Yes        Labels and attributes to filter GTS (regexp supported with the ~ prefix)
scaling       BOOLEAN          No         Apply scaling via the @nexalis/scale macro (default: true)
class         STRING           No         Selector for the GTS classes (default: 'nx.value')

Returns

A list of GTS containing trapezoidal averages for each bucket. Only LONG and DOUBLE value types are processed; discrete measurements (STRING/BOOLEAN) are automatically filtered out.
NaN values: The macro returns NaN (Not a Number) for buckets where no values were recorded during the interval and no earlier values exist for trapezoidal interpolation. This only occurs at the start of a time series.

Example Usage

// Fetch 15-minute trapezoidal averages over 1 day

{
  'token' $read_token
  'start' '2026-01-01T00:00:00Z'
  'end' '2026-01-02T00:00:00Z'
  'bucket_size' 15    // minutes
  'labels' { 'assetType' 'INV' 'dataObject' 'TotW' }
}
@nexalis/fetch_trapezoidal_averages

Example with regular expression

// Fetch dataPoints containing "INV" or "METER"

{
  'token' $read_token
  'start' '2026-01-01T00:00:00Z'
  'end' '2026-01-02T00:00:00Z'
  'bucket_size' 15    // minutes
  'labels' { 'dataPoint' '~(.*INV.*|.*METER.*)' }  // regexp pattern
}
@nexalis/fetch_trapezoidal_averages

How It Works

  1. Retrieves GTS data matching the labels selection and extends the time window with one boundary point before and after it, for accurate edge calculations
  2. Filters to keep only analog measurements (STRING/BOOLEAN values are discarded)
  3. Applies trapezoidal averaging to compute a time-weighted mean per bucket
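Putting the steps together, a simplified Python sketch of the pipeline might look like this (a hypothetical helper, with the boundary-point extension of step 1 omitted for brevity, so each bucket's mean uses only the points inside it):

```python
from math import nan

def bucketed_trapezoidal_means(points, bucket_size):
    """points: sorted list of (timestamp, value); bucket_size in the same time unit."""
    # Step 2: keep only analog (numeric) measurements; bool is excluded
    # explicitly because Python treats it as a subtype of int.
    analog = [(t, v) for t, v in points
              if isinstance(v, (int, float)) and not isinstance(v, bool)]
    if not analog:
        return []
    start, end = analog[0][0], analog[-1][0]
    n_buckets = int((end - start) // bucket_size) + 1
    means = []
    for b in range(n_buckets):
        lo, hi = start + b * bucket_size, start + (b + 1) * bucket_size
        pts = [(t, v) for t, v in analog if lo <= t < hi]
        if len(pts) < 2:
            # Simplification: a lone point stands for the bucket; an
            # empty bucket becomes NaN (the real macro interpolates
            # from neighboring points where it can).
            means.append(pts[0][1] if pts else nan)
            continue
        # Step 3: trapezoidal (time-weighted) mean within the bucket
        area = sum(0.5 * (pts[i][1] + pts[i - 1][1]) * (pts[i][0] - pts[i - 1][0])
                   for i in range(1, len(pts)))
        means.append(area / (pts[-1][0] - pts[0][0]))
    return means
```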