# Analytics Reference Guide

## Absolute value

For each input time series, calculates the absolute value of each datapoint.

## Ceiling

Rounds datapoints up (away from zero) to the nearest integer.

## Count

For each time interval, publishes the number of input time series that reported a datapoint within that time interval. Count is typically used to determine whether any datapoints are missing.

Figure 1. Example of using the Count function

## Delta

For each time interval, calculates the difference between the current value and the previous value. It operates independently on each time series in the plot.

Figure 2. Example of using the Delta function
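The per-interval difference can be sketched as a pairwise subtraction (a minimal illustration in Python; `delta` is a hypothetical helper, not the product API):

```python
def delta(series):
    """Difference between each value and the previous one.

    A minimal sketch of the Delta behavior described above; the real
    function operates independently on each time series in the plot.
    """
    return [curr - prev for prev, curr in zip(series, series[1:])]

changes = delta([3, 5, 4, 10])   # [2, -1, 6]
```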

## EWMA and Double EWMA

Calculates an exponentially weighted moving average, where more recent datapoints are given higher weight. The weight of a datapoint decreases exponentially with time.

The EWMA (exponentially weighted moving average) summarizes a window of data with an emphasis on points received recently. Thresholds for alerts can be constructed by forming a band around the EWMA using standard deviations or a percentage. Alternatively, alerting on the EWMA, much like alerting on the usual moving average, can be used in the place of duration conditions.

Double EWMA also incorporates a weighted moving average of the metric’s trend, and can be used to forecast. For example, if the forecast parameter (see below) is set to 10m, the output time series estimates the value of the input time series 10 minutes from now. This can be used to predict when a resource is likely to be exhausted, or more generally as a way of getting alerts earlier. This will also eliminate some false alarms in the scenario where the values are worrisome (too high, say) but the trend is benign (decreasing back to healthy).

Data Smoothing (number)

Smoothing parameter, often called alpha, applied to the datapoints of the input time series. Must be between 0 and 1. Smaller values correspond to longer time windows and thus more smoothing (weights decay more slowly). Always uses the finest resolution available.

Trend Smoothing (number; applies only to Double EWMA)

Smoothing parameter, often called beta, applied to the trend of the input time series. Must be between 0 and 1. Smaller values correspond to longer time windows and thus more smoothing (weights decay more slowly). Always uses the finest resolution available.

Forecast (duration; applies only to Double EWMA)

How far into the future to forecast. Calculated by adding an appropriate multiple of the trend term to the level term. The default value of 0 simply smooths the series.

Damping (number; applies only to Double EWMA)

A number between 0 and 1. A value of 1 projects that the trend will continue indefinitely (no damping). Smaller values decay the trend towards zero as the projection gets further into the future. This parameter is relevant only when Forecast is not zero.
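The Double EWMA recurrences can be sketched using the standard Holt formulation (an assumption: the document does not give the exact equations the product uses). Here `alpha`, `beta`, `forecast_steps`, and `damping` stand in for the Data Smoothing, Trend Smoothing, Forecast, and Damping parameters, with the forecast expressed as a number of intervals rather than a duration:

```python
def double_ewma(series, alpha, beta, forecast_steps=0, damping=1.0):
    """Holt-style double EWMA: smooths both the level and the trend,
    optionally projecting `forecast_steps` intervals into the future.

    A sketch under assumed standard Holt equations, not the product's
    exact implementation.
    """
    level, trend = series[0], 0.0
    out = []
    for x in series[1:]:
        prev_level = level
        # Smooth the level towards the new datapoint.
        level = alpha * x + (1 - alpha) * (level + damping * trend)
        # Smooth the trend towards the latest change in level.
        trend = beta * (level - prev_level) + (1 - beta) * damping * trend
        # Project the trend forward, decaying it by the damping factor.
        proj = sum(damping ** i for i in range(1, forecast_steps + 1)) * trend
        out.append(level + proj)
    return out
```

With `forecast_steps=0` the output is simply the smoothed series, matching the stated default behavior of the Forecast parameter.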

## Exclude

Restricts the data to be analyzed by filtering out values above or below given thresholds. If a time series value meets the criteria set in the function, you can choose to drop the value altogether (such that the output time series has no datapoint at that time) or pin it to the defined threshold.

Exclude is particularly useful in situations where you want to apply a conditional to another analytics function. For example, if you want to count the number of servers with a CPU utilization above 80%, you can use `CPUUtilization` as the metric, apply an `Exclude x < 80` function, then apply a Count.
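The exclude-then-count workflow can be sketched as follows (hypothetical helpers, with invented CPU readings for illustration):

```python
def exclude_below(series, threshold, pin=False):
    """Exclude values below `threshold`: drop them (None) or pin them
    to the threshold, mirroring the two choices described above."""
    return [v if v >= threshold else (threshold if pin else None)
            for v in series]

def count_present(values_at_t):
    """Count how many series reported a (non-excluded) datapoint."""
    return sum(1 for v in values_at_t if v is not None)

# One time interval across four servers: CPU utilization readings.
cpus = [85, 40, 92, 79]
filtered = exclude_below(cpus, 80)   # values below 80 become None
busy = count_present(filtered)       # number of servers above 80%
```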

## Floor

Rounds datapoints down (toward zero) to the nearest integer.

## Integrate

Multiplies the values of each input time series by the resolution (in seconds) of the chart. Integrate is most useful in SignalFx for gauge metrics. For example, if your gauge is an accelerometer, then Integrate can calculate the change in velocity over a window of time.

Figure 3. Example of using the Integrate function

For counters and cumulative counters, Integrate is less useful simply because a built-in rollup with equivalent functionality already exists. For counters, applying an Integrate function to the *Rate/sec* (rate per second) rollup is equivalent to simply using the *Sum* rollup, assuming no missing datapoints. For cumulative counters, the same is true for the *Delta* rollup.
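The counter equivalence above can be sketched numerically: multiplying each *Rate/sec* value by the resolution recovers the per-interval totals that the *Sum* rollup would report (the rate values here are invented for illustration):

```python
def integrate(series, resolution_seconds):
    """Multiply each datapoint by the chart resolution in seconds,
    as the Integrate function is described above."""
    return [v * resolution_seconds for v in series]

# Hypothetical Rate/sec rollup values on a 10-second-resolution chart.
rate_per_sec = [2.0, 3.5, 1.0]
resolution = 10
per_interval_sums = integrate(rate_per_sec, resolution)
```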

## LN - Log (natural)

LN calculates the natural logarithm (base e) of each datapoint. For each input time series, LN generates a corresponding output time series.

## Log10

Calculates the logarithm (base 10) of each datapoint. For each input time series, Log10 generates a corresponding output time series.

## Mean

Calculates the average `μ` of the available datapoints by dividing the sum of the values of the available datapoints by the number of available datapoints.

- Mean:Aggregation
  Outputs one time series for each group of input time series expressing, for each time period, the mean of the values present in the input in this time period for that group. Missing datapoints are treated as `null` values. Optional parameter: group‑by.

Figure 4. Example of using the Mean:Aggregation function

- Mean:Transformation
Calculates a moving average over a configurable time window. For each input time series, Mean:Transformation outputs a corresponding time series expressing for each time period the mean of the values of the input time series over a configurable time window leading up to said period. Required parameter: time window (default is 1 hour).

Figure 5. Example of using the Mean:Transformation function over a time window of 10 seconds

The Mean function also supports transformation over a calendar window (day, week, month, etc.) instead of a moving window. For more information, see Calendar window transformations.
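A moving average like Mean:Transformation can be sketched with a trailing buffer. Note the assumption: this sketch uses a window measured in datapoints, whereas the real function uses a time window:

```python
from collections import deque

def moving_mean(series, window):
    """Mean over a trailing window of up to `window` datapoints;
    a count-based stand-in for the time window described above."""
    buf, out = deque(maxlen=window), []
    for v in series:
        buf.append(v)
        # Early periods average over however many points exist so far.
        out.append(sum(buf) / len(buf))
    return out
```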

## Mean + standard deviation

Applies the formula `μ+n*σ`, where `μ` is the mean, `σ` is the standard deviation, and `n` is a given number of standard deviations to add (or subtract, for negative numbers) from the mean. The aggregation and transformation modes work in the same manner as for the independent mean and standard deviation functions. Required parameter: number of standard deviations (default is 1). Optional parameter: group‑by.
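The formula can be sketched directly in aggregation style over one time period's values (an assumption: this uses the population standard deviation; the document does not say which estimator the product applies):

```python
import statistics

def mean_plus_n_stdev(values, n=1):
    """mu + n * sigma over a set of datapoints; a negative n subtracts
    standard deviations instead, as described above."""
    mu = statistics.fmean(values)
    sigma = statistics.pstdev(values)   # population sigma, an assumption
    return mu + n * sigma
```

A band for alerting can then be formed by evaluating the function once with `n` and once with `-n`.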

## Minimum / Maximum

- Minimum:Aggregation and Maximum:Aggregation
Output one time series for each group of input time series expressing, for each time period, the minimum (or maximum) of the values present in the input in this time period. Optional parameter: group‑by.

Figure 6. Example of using the Minimum and Maximum Aggregation functions

- Minimum:Transformation and Maximum:Transformation
For each input time series, outputs a corresponding time series expressing for each time period the minimum (or maximum) of the values of the input time series over a configurable time window leading up to that period. Required parameter: time window (default is 1 hour).

Figure 7. Example of using the Minimum and Maximum Transformation functions over a time window of 10 seconds.

The Minimum and Maximum functions also support transformation over a calendar window (day, week, month, etc.) instead of a moving window. For more information, see Calendar window transformations.

## Percentile

- Percentile:Aggregation
- Outputs one time series for each group of input time series expressing, for each time period, the configured percentile (between 1 and 100, inclusive) of the values present in the input in this time period. Required parameter: percentile value (default is 95). Optional parameter: group‑by.
- Percentile:Transformation
- For each input time series, outputs a corresponding time series expressing, for each time period, the configured percentile (between 1 and 100, inclusive) of the input time series over a configurable time window leading up to that period. Required parameters: percentile value (default is 95), time window (default is 1 hour).
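A percentile over one time period's values can be sketched with the nearest-rank definition (an assumption: the document does not state which interpolation method the product uses):

```python
import math

def percentile(values, p):
    """p-th percentile (1-100) by nearest rank: the value at position
    ceil(p/100 * N) in the sorted list, 1-indexed."""
    ranked = sorted(values)
    k = max(1, math.ceil(p / 100 * len(ranked)))
    return ranked[k - 1]
```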

## Power

Outputs the value of each datapoint raised to a specified power, or of a specified number to the power of the datapoint value.

## Rate of change

Similar to the Delta function, except that it divides the difference between the current value and the previous value by the time elapsed, in seconds, between these two values to normalize the change over the compute resolution.

Figure 8. Example of using the Rate of change function
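The normalization step can be sketched as a delta divided by the elapsed time (a minimal illustration assuming evenly spaced datapoints at the chart's compute resolution):

```python
def rate_of_change(series, interval_seconds):
    """Delta between consecutive values divided by the elapsed seconds,
    yielding a per-second rate as described above."""
    return [(curr - prev) / interval_seconds
            for prev, curr in zip(series, series[1:])]

# A value that rose by 30 over a 10-second interval changed at 3.0/sec.
rates = rate_of_change([0, 30, 30], 10)
```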

## Scale

Transforms each input time series by applying the given scale factor (by multiplication) to each datapoint. Scale is often used for converting to percentages (using `Scale(100)` to represent the value `0.12` as `12`, with the Y-axis appropriately labeled percent) or for converting between units of time (using `Scale(60)` to calculate a rate per minute from an input rate per second). Similar results can be obtained by simply entering an algebraic expression in a subsequent plot, e.g. `100 * A`. Required parameter: scale factor (default is 1).

## Square root

Calculates the square root of the datapoint values.

## Standard deviation

The standard deviation `σ` is the square root of the variance. See Variance for how the variance is calculated for both aggregation and transformation modes.

## Sum

- Sum:Aggregation
- When no group-by is specified, outputs a single time series expressing, for each period, the sum of all the values of the input time series from that same period. Otherwise, it outputs one time series for each unique combination of the values of the grouping properties, each of those time series expressing the sum of the values of the input time series whose metadata match that group. Input time series that do not have dimensions or properties matching the grouping properties are not included in the computation or the output. Optional parameter: group‑by.
- Sum:Transformation
Calculates the sum of the values of an input time series over a moving time window. As with other transformations, an output time series is generated for each input time series. Required parameter: size of time window (default is 1 hour).

Figure 9. Example of using the Sum Aggregation and Transformation functions over a time window of 10 seconds

The Sum function also supports transformation over a calendar window (day, week, month, etc.) instead of a moving window. For more information, see Calendar window transformations.

## Timeshift

Answers the question: “How does `x` compare to a week (or month, or year) ago?” Timeshift is not an analytics function per se; the presence of a Timeshift element in a plot affects the entirety of the plot it is on, regardless of its position. It instructs the Analytics engine to fetch data for all the time series of this plot with the specified time offset (in the past).

For example, specifying `Timeshift(1d)` will fetch data for these time series from one day in the past, then stream the offset data in real time. This allows you to compare the current value reported in a time series with the value that was reported in the past with a constant relative offset (day-over-day in this example).

Required parameter: offset value, specified in weeks (w), days (d), hours (h), minutes (m), and seconds (s) (default is 0). The offset value is always assumed to be towards the past, and must be zero or positive. To specify an offset of 2 weeks and 2 hours, enter an offset value of `2w2h`. Note also that the offset value must be greater than or equal to the minimum resolution of the data used in the current chart. For example, if you select `Timeshift(30s)` but the resolution of your chart is 5 minutes, the function will be invalid.

## Top / Bottom

Can be used to select a subset of the time series in the plot.

- By count
- When operating by count, the output will be the top-N (or bottom-N) time series with the highest (respectively, lowest) values in each time period, where N is the given count value. Required parameter: count value (default is 5).
- By percent
- When operating by percent, the output will be the time series for which the value in each time period is higher (respectively, lower) than the P-th percentile, where P is the given percentage value between 1% and 100% (inclusive). This is equivalent to the “top x%” or “bottom x%” of time series, by value. Required parameter: percentage value (default is 5).

A line chart using this analytic will show all series that were in the Top/Bottom N at any point in the specified window. The value for a series will be masked (replaced with a null) at a timestamp if that series is not in the Top/Bottom N at that timestamp.
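The masking behavior can be sketched per timestamp as follows (a hypothetical helper; each row maps series names to their values at one timestamp):

```python
def top_n_mask(rows, n):
    """For each timestamp, keep the top-n values and mask the rest
    with None, mirroring the line-chart behavior described above."""
    out = []
    for row in rows:
        # Names of the n series with the highest values at this timestamp.
        keep = set(sorted(row, key=row.get, reverse=True)[:n])
        out.append({k: (v if k in keep else None) for k, v in row.items()})
    return out
```

A series that enters and leaves the top n over the window therefore appears as a broken line: present where it qualified, null elsewhere.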

## Variance

Calculated by dividing the sum of the squared differences between each value and the mean of the available datapoints by the number of available datapoints.

- Variance:Aggregation
- Calculates the variance of values across a group of input time series at a given point in time. Optional parameter: group‑by.
- Variance:Transformation
- Calculates the variance of the values of an input time series over a moving time window. As with other transformations, an output time series is generated for each input time series. Required parameter: size of time window (default is 1 hour).
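The definition above is the population variance, which can be written out directly:

```python
def variance(values):
    """Population variance as defined in the text: the mean of the
    squared deviations from the mean of the available datapoints."""
    mu = sum(values) / len(values)
    return sum((v - mu) ** 2 for v in values) / len(values)
```

Taking the square root of this result gives the standard deviation `σ` described earlier.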