diff --git a/README.md b/README.md
index 20c3b3a64..3b72a0cea 100644
--- a/README.md
+++ b/README.md
@@ -40,7 +40,8 @@ VictoriaMetrics has the following prominent features:
 * It can be used as long-term storage for Prometheus. See [these docs](#prometheus-setup) for details.
 * It can be used as a drop-in replacement for Prometheus in Grafana, because it supports [Prometheus querying API](#prometheus-querying-api-usage).
 * It can be used as a drop-in replacement for Graphite in Grafana, because it supports [Graphite API](#graphite-api-usage).
-* It features easy setup and operation:
+  VictoriaMetrics allows reducing infrastructure costs by more than 10x compared to Graphite - see [this case study](https://docs.victoriametrics.com/CaseStudies.html#grammarly).
+* It is easy to set up and operate:
   * VictoriaMetrics consists of a single [small executable](https://medium.com/@valyala/stripping-dependency-bloat-in-victoriametrics-docker-image-983fb5912b0d) without external dependencies.
   * All the configuration is done via explicit command-line flags with reasonable defaults.
@@ -627,7 +628,6 @@ The `__graphite__` pseudo-label supports e.g. alternate regexp filters such as `
 
 VictoriaMetrics also supports Graphite query language - see [these docs](#graphite-render-api-usage).
 
-
 ## How to send data from OpenTSDB-compatible agents
 
 VictoriaMetrics supports [telnet put protocol](http://opentsdb.net/docs/build/html/api_telnet/put.html)
@@ -829,10 +829,10 @@ VictoriaMetrics supports `__graphite__` pseudo-label for filtering time series w
 
 ### Graphite Render API usage
 
-[VictoriaMetrics Enterprise](https://docs.victoriametrics.com/enterprise.html) supports [Graphite Render API](https://graphite.readthedocs.io/en/stable/render_api.html) subset
+VictoriaMetrics supports a subset of [Graphite Render API](https://graphite.readthedocs.io/en/stable/render_api.html)
 at `/render` endpoint, which is used by [Graphite datasource in Grafana](https://grafana.com/docs/grafana/latest/datasources/graphite/).
-When configuring Graphite datasource in Grafana, the `Storage-Step` http request header must be set to a step between Graphite data points stored in VictoriaMetrics. For example, `Storage-Step: 10s` would mean 10 seconds distance between Graphite datapoints stored in VictoriaMetrics.
-Enterprise binaries can be downloaded and evaluated for free from [the releases page](https://github.com/VictoriaMetrics/VictoriaMetrics/releases).
+When configuring the Graphite datasource in Grafana, the `Storage-Step` HTTP request header must be set to the step between Graphite data points
+stored in VictoriaMetrics. For example, `Storage-Step: 10s` means a 10-second distance between Graphite data points stored in VictoriaMetrics.
 
 ### Graphite Metrics API usage
 
@@ -2438,9 +2438,9 @@ Pass `-help` to VictoriaMetrics in order to see the list of supported command-li
  -search.disableCache
     Whether to disable response caching. This may be useful during data backfilling
  -search.graphiteMaxPointsPerSeries int
-    The maximum number of points per series Graphite render API can return. This flag is available only in VictoriaMetrics enterprise. See https://docs.victoriametrics.com/enterprise.html (default 1000000)
+    The maximum number of points per series Graphite render API can return (default 1000000)
  -search.graphiteStorageStep duration
-    The interval between datapoints stored in the database. It is used at Graphite Render API handler for normalizing the interval between datapoints in case it isn't normalized. 
It can be overridden by sending 'storage_step' query arg to /render API or by sending the desired interval via 'Storage-Step' http header during querying /render API. This flag is available only in VictoriaMetrics enterprise. See https://docs.victoriametrics.com/enterprise.html (default 10s) + The interval between datapoints stored in the database. It is used at Graphite Render API handler for normalizing the interval between datapoints in case it isn't normalized. It can be overridden by sending 'storage_step' query arg to /render API or by sending the desired interval via 'Storage-Step' http header during querying /render API (default 10s) -search.latencyOffset duration The time when data points become visible in query results after the collection. It can be overridden on per-query basis via latency_offset arg. Too small value can result in incomplete last points for query results (default 30s) -search.logQueryMemoryUsage size @@ -2457,7 +2457,7 @@ Pass `-help` to VictoriaMetrics in order to see the list of supported command-li -search.maxFederateSeries int The maximum number of time series, which can be returned from /federate. This option allows limiting memory usage (default 1000000) -search.maxGraphiteSeries int - The maximum number of time series, which can be scanned during queries to Graphite Render API. See https://docs.victoriametrics.com/#graphite-render-api-usage . This flag is available only in VictoriaMetrics enterprise. See https://docs.victoriametrics.com/enterprise.html (default 300000) + The maximum number of time series, which can be scanned during queries to Graphite Render API. See https://docs.victoriametrics.com/#graphite-render-api-usage (default 300000) -search.maxLookback duration Synonym to -search.lookback-delta from Prometheus. The value is dynamically detected from interval between time series datapoints if not set. It can be overridden on per-query basis via max_lookback arg. 
See also '-search.maxStalenessInterval' flag, which has the same meaining due to historical reasons -search.maxMemoryPerQuery size diff --git a/app/vmselect/graphite/aggr.go b/app/vmselect/graphite/aggr.go new file mode 100644 index 000000000..49c51efbf --- /dev/null +++ b/app/vmselect/graphite/aggr.go @@ -0,0 +1,259 @@ +package graphite + +import ( + "fmt" + "math" + "strings" + "sync" + + "github.com/valyala/histogram" +) + +var aggrFuncs = map[string]aggrFunc{ + "average": aggrAvg, + "avg": aggrAvg, + "avg_zero": aggrAvgZero, + "median": aggrMedian, + "sum": aggrSum, + "total": aggrSum, + "min": aggrMin, + "max": aggrMax, + "diff": aggrDiff, + "pow": aggrPow, + "stddev": aggrStddev, + "count": aggrCount, + "range": aggrRange, + "rangeOf": aggrRange, + "multiply": aggrMultiply, + "first": aggrFirst, + "last": aggrLast, + "current": aggrLast, +} + +func getAggrFunc(funcName string) (aggrFunc, error) { + s := strings.TrimSuffix(funcName, "Series") + aggrFunc := aggrFuncs[s] + if aggrFunc == nil { + return nil, fmt.Errorf("unsupported aggregate function %q", funcName) + } + return aggrFunc, nil +} + +type aggrFunc func(values []float64) float64 + +func (af aggrFunc) apply(xFilesFactor float64, values []float64) float64 { + if aggrCount(values) >= float64(len(values))*xFilesFactor { + return af(values) + } + return nan +} + +func aggrAvg(values []float64) float64 { + pos := getFirstNonNaNPos(values) + if pos < 0 { + return nan + } + sum := values[pos] + count := 1 + for _, v := range values[pos+1:] { + if !math.IsNaN(v) { + sum += v + count++ + } + } + return sum / float64(count) +} + +func aggrAvgZero(values []float64) float64 { + if len(values) == 0 { + return nan + } + sum := float64(0) + for _, v := range values { + if !math.IsNaN(v) { + sum += v + } + } + return sum / float64(len(values)) +} + +var aggrMedian = newAggrFuncPercentile(50) + +func aggrSum(values []float64) float64 { + pos := getFirstNonNaNPos(values) + if pos < 0 { + return nan + } + sum := values[pos] + for _, v := range values[pos+1:] { + if !math.IsNaN(v) { + sum += v + } + } + return sum +} + +func aggrMin(values []float64) float64 { + pos := getFirstNonNaNPos(values) + if pos < 0 { + return nan + } + min := values[pos] + for _, v := range values[pos+1:] { + if !math.IsNaN(v) && v < min { + min = v + } + } + return min +} + +func aggrMax(values []float64) float64 { + pos := getFirstNonNaNPos(values) + if pos < 0 { + return nan + } + max := values[pos] + for _, v := range values[pos+1:] { + if !math.IsNaN(v) && v > max { + max = v + } + } + return max +} + +func aggrDiff(values []float64) float64 { + pos := getFirstNonNaNPos(values) + if pos < 0 { + return nan + } + sum := float64(0) + for _, v := range values[pos+1:] { + if !math.IsNaN(v) { + sum += v + } + } + return values[pos] - sum +} + +func aggrPow(values []float64) float64 { + pos := getFirstNonNaNPos(values) + if pos < 0 { + return nan + } + pow := values[pos] + for _, v := range values[pos+1:] { + if !math.IsNaN(v) { + pow = math.Pow(pow, v) + } + } + return pow +} + +func aggrStddev(values []float64) float64 { + avg := aggrAvg(values) + if math.IsNaN(avg) { + return nan + } + sum := float64(0) + count := 0 + for _, v := range values { + if !math.IsNaN(v) { + d := avg - v + sum += d * d + count++ + } + } + return math.Sqrt(sum / float64(count)) +} + +func aggrCount(values []float64) float64 { + count := 0 + for _, v := range values { + if !math.IsNaN(v) { + count++ + } + } + return float64(count) +} + +func aggrRange(values []float64) float64 { + min := 
aggrMin(values) + if math.IsNaN(min) { + return nan + } + max := aggrMax(values) + return max - min +} + +func aggrMultiply(values []float64) float64 { + pos := getFirstNonNaNPos(values) + if pos < 0 { + return nan + } + p := values[pos] + for _, v := range values[pos+1:] { + if !math.IsNaN(v) { + p *= v + } + } + return p +} + +func aggrFirst(values []float64) float64 { + pos := getFirstNonNaNPos(values) + if pos < 0 { + return nan + } + return values[pos] +} + +func aggrLast(values []float64) float64 { + for i := len(values) - 1; i >= 0; i-- { + v := values[i] + if !math.IsNaN(v) { + return v + } + } + return nan +} + +func getFirstNonNaNPos(values []float64) int { + for i, v := range values { + if !math.IsNaN(v) { + return i + } + } + return -1 +} + +var nan = math.NaN() + +func newAggrFuncPercentile(n float64) aggrFunc { + f := func(values []float64) float64 { + h := getHistogram() + for _, v := range values { + if !math.IsNaN(v) { + h.Update(v) + } + } + p := h.Quantile(n / 100) + putHistogram(h) + return p + } + return f +} + +func getHistogram() *histogram.Fast { + return histogramPool.Get().(*histogram.Fast) +} + +func putHistogram(h *histogram.Fast) { + h.Reset() + histogramPool.Put(h) +} + +var histogramPool = &sync.Pool{ + New: func() interface{} { + return histogram.NewFast() + }, +} diff --git a/app/vmselect/graphite/aggr_state.go b/app/vmselect/graphite/aggr_state.go new file mode 100644 index 000000000..38fbe183f --- /dev/null +++ b/app/vmselect/graphite/aggr_state.go @@ -0,0 +1,724 @@ +package graphite + +import ( + "fmt" + "math" + "strings" + + "github.com/valyala/histogram" +) + +var aggrStateFuncs = map[string]func(int) aggrState{ + "average": newAggrStateAvg, + "avg": newAggrStateAvg, + "avg_zero": newAggrStateAvgZero, + "median": newAggrStateMedian, + "sum": newAggrStateSum, + "total": newAggrStateSum, + "min": newAggrStateMin, + "max": newAggrStateMax, + "diff": newAggrStateDiff, + "pow": newAggrStatePow, + "stddev": newAggrStateStddev, + "count": newAggrStateCount, + "range": newAggrStateRange, + "rangeOf": newAggrStateRange, + "multiply": newAggrStateMultiply, + "first": newAggrStateFirst, + "last": newAggrStateLast, + "current": newAggrStateLast, +} + +type aggrState interface { + Update(values []float64) + Finalize(xFilesFactor float64) []float64 +} + +func newAggrState(pointsLen int, funcName string) (aggrState, error) { + s := strings.TrimSuffix(funcName, "Series") + asf := aggrStateFuncs[s] + if asf == nil { + return nil, fmt.Errorf("unsupported aggregate function %q", funcName) + } + return asf(pointsLen), nil +} + +type aggrStateAvg struct { + pointsLen int + sums []float64 + counts []int + seriesTotal int +} + +func newAggrStateAvg(pointsLen int) aggrState { + return &aggrStateAvg{ + pointsLen: pointsLen, + sums: make([]float64, pointsLen), + counts: make([]int, pointsLen), + } +} + +func (as *aggrStateAvg) Update(values []float64) { + if len(values) != as.pointsLen { + panic(fmt.Errorf("BUG: unexpected number of points in values; got %d; want %d", len(values), as.pointsLen)) + } + sums := as.sums + counts := as.counts + for i, v := range values { + if !math.IsNaN(v) { + sums[i] += v + counts[i]++ + } + } + as.seriesTotal++ +} + +func (as *aggrStateAvg) Finalize(xFilesFactor float64) []float64 { + sums := as.sums + counts := as.counts + values := make([]float64, as.pointsLen) + xff := int(xFilesFactor * float64(as.seriesTotal)) + for i, count := range counts { + v := nan + if count > 0 && count >= xff { + v = sums[i] / float64(count) + } + values[i] = v + } 
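+	// values[i] stays NaN at positions where no series contributed a sample,
+	// or where fewer than int(xFilesFactor * seriesTotal) series contributed one.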
+ return values +} + +type aggrStateAvgZero struct { + pointsLen int + sums []float64 + seriesTotal int +} + +func newAggrStateAvgZero(pointsLen int) aggrState { + return &aggrStateAvgZero{ + pointsLen: pointsLen, + sums: make([]float64, pointsLen), + } +} + +func (as *aggrStateAvgZero) Update(values []float64) { + if len(values) != as.pointsLen { + panic(fmt.Errorf("BUG: unexpected number of points in values; got %d; want %d", len(values), as.pointsLen)) + } + sums := as.sums + for i, v := range values { + if !math.IsNaN(v) { + sums[i] += v + } + } + as.seriesTotal++ +} + +func (as *aggrStateAvgZero) Finalize(xFilesFactor float64) []float64 { + sums := as.sums + values := make([]float64, as.pointsLen) + count := float64(as.seriesTotal) + for i, sum := range sums { + v := nan + if count > 0 { + v = sum / count + } + values[i] = v + } + return values +} + +func newAggrStateMedian(pointsLen int) aggrState { + return newAggrStatePercentile(pointsLen, 50) +} + +type aggrStatePercentile struct { + phi float64 + pointsLen int + hs []*histogram.Fast + counts []int + seriesTotal int +} + +func newAggrStatePercentile(pointsLen int, n float64) aggrState { + hs := make([]*histogram.Fast, pointsLen) + for i := 0; i < pointsLen; i++ { + hs[i] = histogram.NewFast() + } + return &aggrStatePercentile{ + phi: n / 100, + pointsLen: pointsLen, + hs: hs, + counts: make([]int, pointsLen), + } +} + +func (as *aggrStatePercentile) Update(values []float64) { + if len(values) != as.pointsLen { + panic(fmt.Errorf("BUG: unexpected number of points in values; got %d; want %d", len(values), as.pointsLen)) + } + hs := as.hs + counts := as.counts + for i, v := range values { + if !math.IsNaN(v) { + hs[i].Update(v) + counts[i]++ + } + } + as.seriesTotal++ +} + +func (as *aggrStatePercentile) Finalize(xFilesFactor float64) []float64 { + xff := int(xFilesFactor * float64(as.seriesTotal)) + values := make([]float64, as.pointsLen) + hs := as.hs + for i, count := range as.counts { + v := nan + if count > 0 && count >= xff { + v = hs[i].Quantile(as.phi) + } + values[i] = v + } + return values +} + +type aggrStateSum struct { + pointsLen int + sums []float64 + counts []int + seriesTotal int +} + +func newAggrStateSum(pointsLen int) aggrState { + return &aggrStateSum{ + pointsLen: pointsLen, + sums: make([]float64, pointsLen), + counts: make([]int, pointsLen), + } +} + +func (as *aggrStateSum) Update(values []float64) { + if len(values) != as.pointsLen { + panic(fmt.Errorf("BUG: unexpected number of points in values; got %d; want %d", len(values), as.pointsLen)) + } + sums := as.sums + counts := as.counts + for i, v := range values { + if !math.IsNaN(v) { + sums[i] += v + counts[i]++ + } + } + as.seriesTotal++ +} + +func (as *aggrStateSum) Finalize(xFilesFactor float64) []float64 { + xff := int(xFilesFactor * float64(as.seriesTotal)) + values := make([]float64, as.pointsLen) + sums := as.sums + counts := as.counts + for i, count := range counts { + v := nan + if count > 0 && count >= xff { + v = sums[i] + } + values[i] = v + } + return values +} + +type aggrStateMin struct { + pointsLen int + mins []float64 + counts []int + seriesTotal int +} + +func newAggrStateMin(pointsLen int) aggrState { + return &aggrStateMin{ + pointsLen: pointsLen, + mins: make([]float64, pointsLen), + counts: make([]int, pointsLen), + } +} + +func (as *aggrStateMin) Update(values []float64) { + if len(values) != as.pointsLen { + panic(fmt.Errorf("BUG: unexpected number of points in values; got %d; want %d", len(values), as.pointsLen)) + } + mins := 
as.mins + counts := as.counts + for i, v := range values { + if math.IsNaN(v) { + continue + } + counts[i]++ + if counts[i] == 1 { + mins[i] = v + } else if v < mins[i] { + mins[i] = v + } + } + as.seriesTotal++ +} + +func (as *aggrStateMin) Finalize(xFilesFactor float64) []float64 { + xff := int(xFilesFactor * float64(as.seriesTotal)) + values := make([]float64, as.pointsLen) + mins := as.mins + counts := as.counts + for i, count := range counts { + v := nan + if count > 0 && count >= xff { + v = mins[i] + } + values[i] = v + } + return values +} + +type aggrStateMax struct { + pointsLen int + maxs []float64 + counts []int + seriesTotal int +} + +func newAggrStateMax(pointsLen int) aggrState { + return &aggrStateMax{ + pointsLen: pointsLen, + maxs: make([]float64, pointsLen), + counts: make([]int, pointsLen), + } +} + +func (as *aggrStateMax) Update(values []float64) { + if len(values) != as.pointsLen { + panic(fmt.Errorf("BUG: unexpected number of points in values; got %d; want %d", len(values), as.pointsLen)) + } + maxs := as.maxs + counts := as.counts + for i, v := range values { + if math.IsNaN(v) { + continue + } + counts[i]++ + if counts[i] == 1 { + maxs[i] = v + } else if v > maxs[i] { + maxs[i] = v + } + } + as.seriesTotal++ +} + +func (as *aggrStateMax) Finalize(xFilesFactor float64) []float64 { + xff := int(xFilesFactor * float64(as.seriesTotal)) + values := make([]float64, as.pointsLen) + maxs := as.maxs + counts := as.counts + for i, count := range counts { + v := nan + if count > 0 && count >= xff { + v = maxs[i] + } + values[i] = v + } + return values +} + +type aggrStateDiff struct { + pointsLen int + vs []float64 + counts []int + seriesTotal int +} + +func newAggrStateDiff(pointsLen int) aggrState { + return &aggrStateDiff{ + pointsLen: pointsLen, + vs: make([]float64, pointsLen), + counts: make([]int, pointsLen), + } +} + +func (as *aggrStateDiff) Update(values []float64) { + if len(values) != as.pointsLen { + panic(fmt.Errorf("BUG: unexpected number of points in values; got %d; want %d", len(values), as.pointsLen)) + } + vs := as.vs + counts := as.counts + for i, v := range values { + if !math.IsNaN(v) { + if counts[i] == 0 { + vs[i] = v + } else { + vs[i] -= v + } + counts[i]++ + } + } + as.seriesTotal++ +} + +func (as *aggrStateDiff) Finalize(xFilesFactor float64) []float64 { + xff := int(xFilesFactor * float64(as.seriesTotal)) + values := make([]float64, as.pointsLen) + vs := as.vs + counts := as.counts + for i, count := range counts { + v := nan + if count > 0 && count >= xff { + v = vs[i] + } + values[i] = v + } + return values +} + +type aggrStatePow struct { + pointsLen int + vs []float64 + counts []int + seriesTotal int +} + +func newAggrStatePow(pointsLen int) aggrState { + return &aggrStatePow{ + pointsLen: pointsLen, + vs: make([]float64, pointsLen), + counts: make([]int, pointsLen), + } +} + +func (as *aggrStatePow) Update(values []float64) { + if len(values) != as.pointsLen { + panic(fmt.Errorf("BUG: unexpected number of points in values; got %d; want %d", len(values), as.pointsLen)) + } + vs := as.vs + counts := as.counts + for i, v := range values { + if !math.IsNaN(v) { + if counts[i] == 0 { + vs[i] = v + } else { + vs[i] = math.Pow(vs[i], v) + } + counts[i]++ + } + } + as.seriesTotal++ +} + +func (as *aggrStatePow) Finalize(xFilesFactor float64) []float64 { + xff := int(xFilesFactor * float64(as.seriesTotal)) + values := make([]float64, as.pointsLen) + vs := as.vs + counts := as.counts + for i, count := range counts { + v := nan + if count > 0 && count 
>= xff { + v = vs[i] + } + values[i] = v + } + return values +} + +type aggrStateStddev struct { + pointsLen int + means []float64 + m2s []float64 + counts []int + seriesTotal int +} + +func newAggrStateStddev(pointsLen int) aggrState { + return &aggrStateStddev{ + pointsLen: pointsLen, + means: make([]float64, pointsLen), + m2s: make([]float64, pointsLen), + counts: make([]int, pointsLen), + } +} + +func (as *aggrStateStddev) Update(values []float64) { + if len(values) != as.pointsLen { + panic(fmt.Errorf("BUG: unexpected number of points in values; got %d; want %d", len(values), as.pointsLen)) + } + means := as.means + m2s := as.m2s + counts := as.counts + for i, v := range values { + if math.IsNaN(v) { + continue + } + // See https://en.m.wikipedia.org/wiki/Algorithms_for_calculating_variance#Welford's_online_algorithm + count := counts[i] + mean := means[i] + count++ + delta := v - mean + mean += delta / float64(count) + delta2 := v - mean + means[i] = mean + m2s[i] += delta * delta2 + counts[i] = count + } + as.seriesTotal++ +} + +func (as *aggrStateStddev) Finalize(xFilesFactor float64) []float64 { + xff := int(xFilesFactor * float64(as.seriesTotal)) + values := make([]float64, as.pointsLen) + m2s := as.m2s + counts := as.counts + for i, count := range counts { + v := nan + if count > 0 && count >= xff { + v = math.Sqrt(m2s[i] / float64(count)) + } + values[i] = v + } + return values +} + +type aggrStateCount struct { + pointsLen int + counts []int + seriesTotal int +} + +func newAggrStateCount(pointsLen int) aggrState { + return &aggrStateCount{ + pointsLen: pointsLen, + counts: make([]int, pointsLen), + } +} + +func (as *aggrStateCount) Update(values []float64) { + if len(values) != as.pointsLen { + panic(fmt.Errorf("BUG: unexpected number of points in values; got %d; want %d", len(values), as.pointsLen)) + } + counts := as.counts + for i, v := range values { + if !math.IsNaN(v) { + counts[i]++ + } + } + as.seriesTotal++ +} + +func (as *aggrStateCount) Finalize(xFilesFactor float64) []float64 { + xff := int(xFilesFactor * float64(as.seriesTotal)) + values := make([]float64, as.pointsLen) + counts := as.counts + for i, count := range counts { + v := nan + if count > 0 && count >= xff { + v = float64(count) + } + values[i] = v + } + return values +} + +type aggrStateRange struct { + pointsLen int + mins []float64 + maxs []float64 + counts []int + seriesTotal int +} + +func newAggrStateRange(pointsLen int) aggrState { + return &aggrStateRange{ + pointsLen: pointsLen, + mins: make([]float64, pointsLen), + maxs: make([]float64, pointsLen), + counts: make([]int, pointsLen), + } +} + +func (as *aggrStateRange) Update(values []float64) { + if len(values) != as.pointsLen { + panic(fmt.Errorf("BUG: unexpected number of points in values; got %d; want %d", len(values), as.pointsLen)) + } + mins := as.mins + maxs := as.maxs + counts := as.counts + for i, v := range values { + if math.IsNaN(v) { + continue + } + counts[i]++ + if counts[i] == 1 { + mins[i] = v + maxs[i] = v + } else if v < mins[i] { + mins[i] = v + } else if v > maxs[i] { + maxs[i] = v + } + } + as.seriesTotal++ +} + +func (as *aggrStateRange) Finalize(xFilesFactor float64) []float64 { + xff := int(xFilesFactor * float64(as.seriesTotal)) + values := make([]float64, as.pointsLen) + mins := as.mins + maxs := as.maxs + counts := as.counts + for i, count := range counts { + v := nan + if count > 0 && count >= xff { + v = maxs[i] - mins[i] + } + values[i] = v + } + return values +} + +type aggrStateMultiply struct { + pointsLen int + 
ms []float64 + counts []int + seriesTotal int +} + +func newAggrStateMultiply(pointsLen int) aggrState { + return &aggrStateMultiply{ + pointsLen: pointsLen, + ms: make([]float64, pointsLen), + counts: make([]int, pointsLen), + } +} + +func (as *aggrStateMultiply) Update(values []float64) { + if len(values) != as.pointsLen { + panic(fmt.Errorf("BUG: unexpected number of points in values; got %d; want %d", len(values), as.pointsLen)) + } + ms := as.ms + counts := as.counts + for i, v := range values { + if math.IsNaN(v) { + continue + } + counts[i]++ + if counts[i] == 1 { + ms[i] = v + } else { + ms[i] *= v + } + } + as.seriesTotal++ +} + +func (as *aggrStateMultiply) Finalize(xFilesFactor float64) []float64 { + xff := int(xFilesFactor * float64(as.seriesTotal)) + values := make([]float64, as.pointsLen) + ms := as.ms + counts := as.counts + for i, count := range counts { + v := nan + if count > 0 && count >= xff { + v = ms[i] + } + values[i] = v + } + return values +} + +type aggrStateFirst struct { + pointsLen int + vs []float64 + counts []int + seriesTotal int +} + +func newAggrStateFirst(pointsLen int) aggrState { + return &aggrStateFirst{ + pointsLen: pointsLen, + vs: make([]float64, pointsLen), + counts: make([]int, pointsLen), + } +} + +func (as *aggrStateFirst) Update(values []float64) { + if len(values) != as.pointsLen { + panic(fmt.Errorf("BUG: unexpected number of points in values; got %d; want %d", len(values), as.pointsLen)) + } + vs := as.vs + counts := as.counts + for i, v := range values { + if math.IsNaN(v) { + continue + } + counts[i]++ + if counts[i] == 1 { + vs[i] = v + } + } + as.seriesTotal++ +} + +func (as *aggrStateFirst) Finalize(xFilesFactor float64) []float64 { + xff := int(xFilesFactor * float64(as.seriesTotal)) + values := make([]float64, as.pointsLen) + vs := as.vs + counts := as.counts + for i, count := range counts { + v := nan + if count > 0 && count >= xff { + v = vs[i] + } + values[i] = v + } + return values +} + +type aggrStateLast struct { + pointsLen int + vs []float64 + counts []int + seriesTotal int +} + +func newAggrStateLast(pointsLen int) aggrState { + return &aggrStateLast{ + pointsLen: pointsLen, + vs: make([]float64, pointsLen), + counts: make([]int, pointsLen), + } +} + +func (as *aggrStateLast) Update(values []float64) { + if len(values) != as.pointsLen { + panic(fmt.Errorf("BUG: unexpected number of points in values; got %d; want %d", len(values), as.pointsLen)) + } + vs := as.vs + counts := as.counts + for i, v := range values { + if math.IsNaN(v) { + continue + } + vs[i] = v + counts[i]++ + } + as.seriesTotal++ +} + +func (as *aggrStateLast) Finalize(xFilesFactor float64) []float64 { + xff := int(xFilesFactor * float64(as.seriesTotal)) + values := make([]float64, as.pointsLen) + vs := as.vs + counts := as.counts + for i, count := range counts { + v := nan + if count > 0 && count >= xff { + v = vs[i] + } + values[i] = v + } + return values +} diff --git a/app/vmselect/graphite/eval.go b/app/vmselect/graphite/eval.go new file mode 100644 index 000000000..22526ddd1 --- /dev/null +++ b/app/vmselect/graphite/eval.go @@ -0,0 +1,210 @@ +package graphite + +import ( + "flag" + "fmt" + "time" + + "github.com/VictoriaMetrics/VictoriaMetrics/app/vmselect/graphiteql" + "github.com/VictoriaMetrics/VictoriaMetrics/app/vmselect/netstorage" + "github.com/VictoriaMetrics/VictoriaMetrics/app/vmselect/searchutils" + "github.com/VictoriaMetrics/VictoriaMetrics/lib/cgroup" + "github.com/VictoriaMetrics/VictoriaMetrics/lib/logger" + 
"github.com/VictoriaMetrics/VictoriaMetrics/lib/storage" + "github.com/VictoriaMetrics/VictoriaMetrics/lib/timerpool" +) + +var maxGraphiteSeries = flag.Int("search.maxGraphiteSeries", 300e3, "The maximum number of time series, which can be scanned during queries to Graphite Render API. "+ + "See https://docs.victoriametrics.com/#graphite-render-api-usage") + +type evalConfig struct { + startTime int64 + endTime int64 + storageStep int64 + deadline searchutils.Deadline + + currentTime time.Time + + // xFilesFactor is used for determining when consolidateFunc must be applied. + // + // 0 means that consolidateFunc should be applied if at least a single non-NaN data point exists on the given step. + // 1 means that consolidateFunc should be applied if all the data points are non-NaN on the given step. + xFilesFactor float64 + + // Enforced tag filters + etfs [][]storage.TagFilter + + // originalQuery contains the original query - used for debug logging. + originalQuery string +} + +func (ec *evalConfig) pointsLen(step int64) int { + return int((ec.endTime - ec.startTime) / step) +} + +func (ec *evalConfig) newTimestamps(step int64) []int64 { + pointsLen := ec.pointsLen(step) + timestamps := make([]int64, pointsLen) + ts := ec.startTime + for i := 0; i < pointsLen; i++ { + timestamps[i] = ts + ts += step + } + return timestamps +} + +type series struct { + Name string + Tags map[string]string + Timestamps []int64 + Values []float64 + + // holds current path expression like graphite does. + pathExpression string + + expr graphiteql.Expr + + // consolidateFunc is applied to raw samples in order to generate data points algined to the given step. + // see series.consolidate() function for details. + consolidateFunc aggrFunc + + // xFilesFactor is used for determining when consolidateFunc must be applied. + // + // 0 means that consolidateFunc should be applied if at least a single non-NaN data point exists on the given step. + // 1 means that consolidateFunc should be applied if all the data points are non-NaN on the given step. + xFilesFactor float64 + + step int64 +} + +func (s *series) consolidate(ec *evalConfig, step int64) { + aggrFunc := s.consolidateFunc + if aggrFunc == nil { + aggrFunc = aggrAvg + } + xFilesFactor := s.xFilesFactor + if s.xFilesFactor <= 0 { + xFilesFactor = ec.xFilesFactor + } + s.summarize(aggrFunc, ec.startTime, ec.endTime, step, xFilesFactor) +} + +func (s *series) summarize(aggrFunc aggrFunc, startTime, endTime, step int64, xFilesFactor float64) { + pointsLen := int((endTime - startTime) / step) + timestamps := s.Timestamps + values := s.Values + dstTimestamps := make([]int64, 0, pointsLen) + dstValues := make([]float64, 0, pointsLen) + ts := startTime + i := 0 + for len(dstTimestamps) < pointsLen { + tsEnd := ts + step + j := i + for j < len(timestamps) && timestamps[j] < tsEnd { + j++ + } + if i == j && i > 0 && ts-timestamps[i-1] <= 2000 { + // The current [ts ... tsEnd) interval has no samples, + // but the last sample on the previous interval [ts - step ... ts) + // is closer than 2 seconds to the current interval. + // Let's consider that this sample belongs to the current interval, + // since such discrepancy could appear because of small jitter in samples' ingestion. 
+ i-- + } + v := aggrFunc.apply(xFilesFactor, values[i:j]) + dstTimestamps = append(dstTimestamps, ts) + dstValues = append(dstValues, v) + ts = tsEnd + i = j + } + // Do not reuse s.Timestamps and s.Values, since they can be too big + s.Timestamps = dstTimestamps + s.Values = dstValues + s.step = step +} + +func execExpr(ec *evalConfig, query string) (nextSeriesFunc, error) { + expr, err := graphiteql.Parse(query) + if err != nil { + return nil, fmt.Errorf("cannot parse %q: %w", query, err) + } + return evalExpr(ec, expr) +} + +func evalExpr(ec *evalConfig, expr graphiteql.Expr) (nextSeriesFunc, error) { + switch t := expr.(type) { + case *graphiteql.MetricExpr: + return evalMetricExpr(ec, t) + case *graphiteql.FuncExpr: + return evalFuncExpr(ec, t) + default: + return nil, fmt.Errorf("unexpected expression type %T; want graphiteql.MetricExpr or graphiteql.FuncExpr; expr: %q", t, t.AppendString(nil)) + } +} + +func evalMetricExpr(ec *evalConfig, me *graphiteql.MetricExpr) (nextSeriesFunc, error) { + tfs := []storage.TagFilter{{ + Key: []byte("__graphite__"), + Value: []byte(me.Query), + }} + tfss := joinTagFilterss(tfs, ec.etfs) + sq := storage.NewSearchQuery(ec.startTime, ec.endTime, tfss, *maxGraphiteSeries) + return newNextSeriesForSearchQuery(ec, sq, me) +} + +func newNextSeriesForSearchQuery(ec *evalConfig, sq *storage.SearchQuery, expr graphiteql.Expr) (nextSeriesFunc, error) { + rss, err := netstorage.ProcessSearchQuery(nil, sq, ec.deadline) + if err != nil { + return nil, fmt.Errorf("cannot fetch data for %q: %w", sq, err) + } + seriesCh := make(chan *series, cgroup.AvailableCPUs()) + errCh := make(chan error, 1) + go func() { + err := rss.RunParallel(nil, func(rs *netstorage.Result, workerID uint) error { + nameWithTags := getCanonicalPath(&rs.MetricName) + tags := unmarshalTags(nameWithTags) + s := &series{ + Name: tags["name"], + Tags: tags, + Timestamps: append([]int64{}, rs.Timestamps...), + Values: append([]float64{}, rs.Values...), + expr: expr, + pathExpression: string(expr.AppendString(nil)), + } + s.summarize(aggrAvg, ec.startTime, ec.endTime, ec.storageStep, 0) + t := timerpool.Get(30 * time.Second) + select { + case seriesCh <- s: + case <-t.C: + logger.Errorf("resource leak when processing the %s (full query: %s); please report this error to VictoriaMetrics developers", + expr.AppendString(nil), ec.originalQuery) + } + timerpool.Put(t) + return nil + }) + close(seriesCh) + errCh <- err + }() + f := func() (*series, error) { + s := <-seriesCh + if s != nil { + return s, nil + } + err := <-errCh + return nil, err + } + return f, nil +} + +func evalFuncExpr(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + // Do not lowercase the fe.FuncName, since Graphite function names are case-sensitive. 
tf := transformFuncs[fe.FuncName]
+	if tf == nil {
+		return nil, fmt.Errorf("unknown function %q", fe.FuncName)
+	}
+	nextSeries, err := tf(ec, fe)
+	if err != nil {
+		return nil, fmt.Errorf("cannot evaluate %s: %w", fe.AppendString(nil), err)
+	}
+	return nextSeries, nil
+}
diff --git a/app/vmselect/graphite/eval_test.go b/app/vmselect/graphite/eval_test.go
new file mode 100644
index 000000000..053c92a4f
--- /dev/null
+++ b/app/vmselect/graphite/eval_test.go
@@ -0,0 +1,4064 @@
+package graphite
+
+import (
+	"fmt"
+	"math"
+	"reflect"
+	"strings"
+	"testing"
+	"time"
+
+	"github.com/VictoriaMetrics/VictoriaMetrics/app/vmselect/graphiteql"
+)
+
+func TestExecExprSuccess(t *testing.T) {
+	ec := &evalConfig{
+		startTime:   120e3,
+		endTime:     210e3,
+		storageStep: 30e3,
+		currentTime: time.Unix(150e3, 0),
+	}
+	f := func(query string, expectedSeries []*series) {
+		t.Helper()
+		ecCopy := *ec
+		nextSeries, err := execExpr(&ecCopy, query)
+		if err != nil {
+			t.Fatalf("unexpected error in execExpr(%q): %s", query, err)
+		}
+		ss, err := fetchAllSeries(nextSeries)
+		if err != nil {
+			t.Fatalf("cannot fetch all series: %s", err)
+		}
+		expr, err := graphiteql.Parse(query)
+		if err != nil {
+			t.Fatalf("cannot parse query %q: %s", query, err)
+		}
+		if err := compareSeries(ss, expectedSeries, expr); err != nil {
+			t.Fatalf("series mismatch for query %q: %s\ngot series\n%s\nexpected series\n%s", query, err, printSeriess(ss), printSeriess(expectedSeries))
+		}
+		// Make sure ec isn't changed during query execution.
+		if !reflect.DeepEqual(ec, &ecCopy) {
+			t.Fatalf("unexpected ec\ngot\n%v\nwant\n%v", &ecCopy, ec)
+		}
+	}
+
+	f("absolute(constantLine(-1.23))", []*series{
+		{
+			Timestamps: []int64{ec.startTime, (ec.startTime + ec.endTime) / 2, ec.endTime},
+			Values:     []float64{1.23, 1.23, 1.23},
+			Name:       "absolute(-1.23)",
+			Tags:       map[string]string{"name": "-1.23", "absolute": "1"},
+		},
+	})
+	f("add(constantLine(1.23), 4.57)", []*series{
+		{
+			Timestamps:     []int64{ec.startTime, (ec.startTime + ec.endTime) / 2, ec.endTime},
+			Values:         []float64{5.8, 5.8, 5.8},
+			Name:           "add(1.23,4.57)",
+			Tags:           map[string]string{"name": "1.23", "add": "4.57"},
+			pathExpression: "add(1.23,4.57)",
+		},
+	})
+	f("add(constantLine(-123), constant=-457)", []*series{
+		{
+			Timestamps:     []int64{ec.startTime, (ec.startTime + ec.endTime) / 2, ec.endTime},
+			Values:         []float64{-580, -580, -580},
+			Name:           "add(-123,-457)",
+			Tags:           map[string]string{"name": "-123", "add": "-457"},
+			pathExpression: "add(-123,-457)",
+		},
+	})
+	f(`aggregate(
+		group(
+			constantLine(1)|alias("foo"),
+			constantLine(2)|alias("bar;aa=bb")
+		),
+		"sum"
+	)`, []*series{
+		{
+			Timestamps: []int64{120000, 165000},
+			Values:     []float64{3, 3},
+			Name:       "sumSeries(constantLine(1),constantLine(2))",
+			Tags:       map[string]string{"name": "sumSeries(constantLine(1),constantLine(2))", "aggregatedBy": "sum"},
+		},
+	})
+	f(`aggregate(
+		group(
+			constantLine(1)|alias("foo"),
+			time("bar", 10),
+		),
+		"count",
+		xFilesFactor = 1,
+	)`, []*series{
+		{
+			Timestamps: []int64{120000, 165000},
+			Values:     []float64{2, 2},
+			Name:       "countSeries(bar,constantLine(1))",
+			Tags:       map[string]string{"name": "countSeries(bar,constantLine(1))", "aggregatedBy": "count"},
+		},
+	})
+	f(`aggregate(
+		group(
+			constantLine(1)|alias("foo"),
+			time("bar", 10)
+		),
+		"avg_zero"
+	)`, []*series{
+		{
+			Timestamps: []int64{120000, 165000},
+			Values:     []float64{70.5, 93},
+			Name:       "avg_zeroSeries(bar,constantLine(1))",
+			Tags:       map[string]string{"name": "avg_zeroSeries(bar,constantLine(1))", "aggregatedBy": 
"avg_zero"}, + }, + }) + f(`aggregate( + group( + constantLine(1)|alias("foo"), + time("bar", 10) + ), + "min" + )`, []*series{ + { + Timestamps: []int64{120000, 165000}, + Values: []float64{1, 1}, + Name: "minSeries(bar,constantLine(1))", + Tags: map[string]string{"name": "minSeries(bar,constantLine(1))", "aggregatedBy": "min"}, + }, + }) + f(`aggregate( + group( + constantLine(1)|alias("foo"), + time("bar", 10), + ), + "diff", + )`, []*series{ + { + Timestamps: []int64{120000, 165000}, + Values: []float64{-139, -184}, + Name: "diffSeries(constantLine(1),bar)", + Tags: map[string]string{"name": "diffSeries(constantLine(1),bar)", "aggregatedBy": "diff"}, + }, + }) + f(`aggregate( + group( + constantLine(1)|alias("foo"), + time("bar", 10), + ), + "range", + )`, []*series{ + { + Timestamps: []int64{120000, 165000}, + Values: []float64{139, 184}, + Name: "rangeSeries(bar,constantLine(1))", + Tags: map[string]string{"name": "rangeSeries(bar,constantLine(1))", "aggregatedBy": "range"}, + }, + }) + f(`aggregate( + group( + constantLine(2)|alias("foo"), + time("bar", 10), + ), + "multiply", + )`, []*series{ + { + Timestamps: []int64{120000, 165000}, + Values: []float64{280, 370}, + Name: "multiplySeries(bar,constantLine(2))", + Tags: map[string]string{"name": "multiplySeries(bar,constantLine(2))", "aggregatedBy": "multiply"}, + }, + }) + f(`aggregate( + group( + constantLine(2)|alias("foo"), + time("bar", 10), + ), + "first", + )`, []*series{ + { + Timestamps: []int64{120000, 165000}, + Values: []float64{2, 2}, + Name: "firstSeries(constantLine(2),bar)", + Tags: map[string]string{"name": "firstSeries(constantLine(2),bar)", "aggregatedBy": "first"}, + }, + }) + f(`aggregate( + group( + constantLine(2)|alias("foo"), + time("bar", 10), + ), + "last", + )`, []*series{ + { + Timestamps: []int64{120000, 165000}, + Values: []float64{140, 185}, + Name: "lastSeries(constantLine(2),bar)", + Tags: map[string]string{"name": "lastSeries(constantLine(2),bar)", "aggregatedBy": "last"}, + }, + }) + f("aggregate(group(),'avg')", []*series{}) + f(`aggregateLine( + group( + time("foo", 10), + time("bar", 25), + ) + )`, []*series{ + { + Timestamps: []int64{ec.startTime, (ec.startTime + ec.endTime) / 2, ec.endTime}, + Values: []float64{165, 165, 165}, + Name: "aggregateLine(foo,165)", + Tags: map[string]string{"name": "foo"}, + }, + { + Timestamps: []int64{ec.startTime, (ec.startTime + ec.endTime) / 2, ec.endTime}, + Values: []float64{157.5, 157.5, 157.5}, + Name: "aggregateLine(bar,157.5)", + Tags: map[string]string{"name": "bar"}, + }, + }) + f(`aggregateLine(constantLine(1),"count")`, []*series{ + { + Timestamps: []int64{ec.startTime, (ec.startTime + ec.endTime) / 2, ec.endTime}, + Values: []float64{3, 3, 3}, + Name: "aggregateLine(1,3)", + Tags: map[string]string{"name": "1"}, + }, + }) + f(`aggregateLine(time('foo',10),"median")`, []*series{ + { + Timestamps: []int64{ec.startTime, (ec.startTime + ec.endTime) / 2, ec.endTime}, + Values: []float64{170, 170, 170}, + Name: "aggregateLine(foo,170)", + Tags: map[string]string{"name": "foo"}, + }, + }) + f(`aggregateLine(time('foo',10),"max")`, []*series{ + { + Timestamps: []int64{ec.startTime, (ec.startTime + ec.endTime) / 2, ec.endTime}, + Values: []float64{210, 210, 210}, + Name: "aggregateLine(foo,210)", + Tags: map[string]string{"name": "foo"}, + }, + }) + f(`aggregateLine(time('foo',10),"diff")`, []*series{ + { + Timestamps: []int64{ec.startTime, (ec.startTime + ec.endTime) / 2, ec.endTime}, + Values: []float64{-1410, -1410, -1410}, + Name: 
"aggregateLine(foo,-1410)", + Tags: map[string]string{"name": "foo"}, + }, + }) + f(`aggregateLine(time('foo',10),"stddev")`, []*series{ + { + Timestamps: []int64{ec.startTime, (ec.startTime + ec.endTime) / 2, ec.endTime}, + Values: []float64{28.722813232690143, 28.722813232690143, 28.722813232690143}, + Name: "aggregateLine(foo,28.722813232690143)", + Tags: map[string]string{"name": "foo"}, + }, + }) + f(`aggregateLine(time('foo',10),"range")`, []*series{ + { + Timestamps: []int64{ec.startTime, (ec.startTime + ec.endTime) / 2, ec.endTime}, + Values: []float64{90, 90, 90}, + Name: "aggregateLine(foo,90)", + Tags: map[string]string{"name": "foo"}, + }, + }) + f(`aggregateLine(time('foo',10),"multiply")`, []*series{ + { + Timestamps: []int64{ec.startTime, (ec.startTime + ec.endTime) / 2, ec.endTime}, + Values: []float64{1.2799358208e+22, 1.2799358208e+22, 1.2799358208e+22}, + Name: "aggregateLine(foo,1.2799358208e+22)", + Tags: map[string]string{"name": "foo"}, + }, + }) + f(`aggregateLine(time("foo",20),func="min",keepStep=True)`, []*series{ + { + Timestamps: []int64{120000, 140000, 160000, 180000, 200000}, + Values: []float64{120, 120, 120, 120, 120}, + Name: "aggregateLine(foo,120)", + Tags: map[string]string{"name": "foo"}, + }, + }) + f(`aggregateWithWildcards( + group( + time("foo.bar", 30), + time("foo.baz", 60) + ), + func='max' + )`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{120, nan, 180}, + Name: "foo.baz", + Tags: map[string]string{"name": "foo.baz", "aggregatedBy": "max"}, + }, + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{120, 150, 180}, + Name: "foo.bar", + Tags: map[string]string{"name": "foo.bar", "aggregatedBy": "max"}, + }, + }) + f(`aggregateWithWildcards( + group( + time("foo.bar", 30), + time("foo.baz", 60) + ), + func='median', + 1 + )`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{120, 150, 180}, + Name: "foo", + Tags: map[string]string{"name": "medianSeries(foo.bar,foo.baz)", "aggregatedBy": "median"}, + pathExpression: "medianSeries(foo.bar,foo.baz)", + }, + }) + f(`aggregateWithWildcards( + group( + time("foo.bar", 30), + time("foo.baz", 60) + ), + func='stddev', + 1, 0, 2 + )`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{0, 0, 0}, + Name: "", + Tags: map[string]string{"aggregatedBy": "stddev", "name": "stddevSeries(foo.bar,foo.baz)"}, + pathExpression: "stddevSeries(foo.bar,foo.baz)", + }, + }) + + f("alias(constantLine(123), 'foo.bar;baz=aaa')", []*series{ + { + Timestamps: []int64{ec.startTime, (ec.startTime + ec.endTime) / 2, ec.endTime}, + Values: []float64{123, 123, 123}, + Name: "foo.bar;baz=aaa", + Tags: map[string]string{"name": "123"}, + pathExpression: "constantLine(123)", + }, + }) + f(`aliasByMetric( + group( + time("foo.bar.baz;x=y"), + time("aaa.bb", 30), + summarize(group(time('a'),time('c.d.b')),'30s'), + ) + )`, []*series{ + { + Timestamps: []int64{ec.startTime, ec.startTime + 60*1000}, + Values: []float64{120, 180}, + Name: "baz;x=y", + Tags: map[string]string{"name": "foo.bar.baz", "x": "y"}, + pathExpression: "foo.bar.baz;x=y", + }, + { + Timestamps: []int64{ec.startTime, ec.startTime + 30*1000, ec.startTime + 60*1000, ec.startTime + 90*1000}, + Values: []float64{120, 150, 180, 210}, + Name: "bb", + Tags: map[string]string{"name": "aaa.bb"}, + pathExpression: "aaa.bb", + }, + { + Timestamps: []int64{120000, 150000, 180000, 210000}, + Values: []float64{120, nan, 180, nan}, + Name: "a", + Tags: 
map[string]string{"name": "a", "summarize": "30s", "summarizeFunction": "sum"}, + pathExpression: "summarize(a,'30s','sum')", + }, + { + Timestamps: []int64{120000, 150000, 180000, 210000}, + Values: []float64{120, nan, 180, nan}, + Name: "b", + Tags: map[string]string{"name": "c.d.b", "summarize": "30s", "summarizeFunction": "sum"}, + pathExpression: "summarize(c.d.b,'30s','sum')", + }, + }) + f(`aliasByMetric( + summarize( + exclude( + groupByNode( + time("svc.default.first.prod.srv.1.http.returned-codes.500"), + 8, + 'sum'), + '200'), + '5min', + 'sum', + false))`, []*series{ + { + Timestamps: []int64{0}, + Values: []float64{600}, + Name: "500", + Tags: map[string]string{ + "aggregatedBy": "sum", + "name": "svc.default.first.prod.srv.1.http.returned-codes.500", + "summarize": "5min", + "summarizeFunction": "sum", + }, + pathExpression: "summarize(500,'5min','sum')", + }, + }) + + f(`aliasByNode(time("foo.bar.baz"))`, []*series{ + { + Timestamps: []int64{ec.startTime, ec.startTime + 60*1000}, + Values: []float64{120, 180}, + Name: "", + Tags: map[string]string{"name": "foo.bar.baz"}, + pathExpression: "foo.bar.baz", + }, + }) + f(`aliasByTags( + group( + time("foo.bar.baz;aa=bb", 20), + time("foo.xx", 50) + ), + 1, "aa" + )`, []*series{ + { + Timestamps: []int64{120000, 140000, 160000, 180000, 200000}, + Values: []float64{120, 140, 160, 180, 200}, + Name: "bar.bb", + Tags: map[string]string{"name": "foo.bar.baz", "aa": "bb"}, + pathExpression: "foo.bar.baz;aa=bb", + }, + { + Timestamps: []int64{120000, 170000}, + Values: []float64{120, 170}, + Name: "xx", + Tags: map[string]string{"name": "foo.xx"}, + pathExpression: "foo.xx", + }, + }) + f(`aliasQuery( + group( + time("foo.1.2", 20), + time("foo.3.4", 50), + ), + "foo\.([^.]+\.[^.]+)", + "constantLine(\1)|alias('aaa.\1')", + "foo %d bar %g" + )`, []*series{ + { + Timestamps: []int64{120000, 140000, 160000, 180000, 200000}, + Values: []float64{120, 140, 160, 180, 200}, + Name: "foo 1 bar 1.2", + Tags: map[string]string{"name": "foo.1.2"}, + pathExpression: "foo.1.2", + }, + { + Timestamps: []int64{120000, 170000}, + Values: []float64{120, 170}, + Name: "foo 3 bar 3.4", + Tags: map[string]string{"name": "foo.3.4"}, + pathExpression: "foo.3.4", + }, + }) + f(`aliasSub( + group( + time("foo.1.2", 20), + time("foo.3.4", 50), + ), + "foo\.([^.]+)\.([^.]+)", + "bar\2\1.x\2" + )`, []*series{ + { + Timestamps: []int64{120000, 140000, 160000, 180000, 200000}, + Values: []float64{120, 140, 160, 180, 200}, + Name: "bar21.x2", + Tags: map[string]string{"name": "foo.1.2"}, + pathExpression: "foo.1.2", + }, + { + Timestamps: []int64{120000, 170000}, + Values: []float64{120, 170}, + Name: "bar43.x4", + Tags: map[string]string{"name": "foo.3.4"}, + pathExpression: "foo.3.4", + }, + }) + f(`alpha(time("foo",50),0.5)`, []*series{ + { + Timestamps: []int64{120000, 170000}, + Values: []float64{120, 170}, + Name: "foo", + Tags: map[string]string{"name": "foo"}, + }, + }) + f(`applyByNode( + time("foo.bar.baz",25), + 1, + "time('%.abc;de=fg',50)" + )`, []*series{ + { + Timestamps: []int64{120000, 170000}, + Values: []float64{120, 170}, + Name: "foo.bar.abc;de=fg", + Tags: map[string]string{"name": "foo.bar.abc", "de": "fg"}, + pathExpression: "foo.bar", + }, + }) + f(`applyByNode( + time("foo.bar.baz",25), + 1, + "time('%.abc;de=fg',50)", + "a.%.end" + )`, []*series{ + { + Timestamps: []int64{120000, 170000}, + Values: []float64{120, 170}, + Name: "a.foo.bar.end", + Tags: map[string]string{"name": "foo.bar.abc", "de": "fg"}, + pathExpression: "foo.bar", + }, 
+ }) + f(`areaBetween( + group( + time("a"), + time("b"), + ) + )`, []*series{ + { + Timestamps: []int64{120000, 180000}, + Values: []float64{120, 180}, + Name: "areaBetween(a)", + Tags: map[string]string{"name": "a", "areaBetween": "1"}, + }, + { + Timestamps: []int64{120000, 180000}, + Values: []float64{120, 180}, + Name: "areaBetween(b)", + Tags: map[string]string{"name": "b", "areaBetween": "1"}, + }, + }) + f(`asPercent( + group( + time("foo", 30), + time("bar", 30), + ) + )`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{50, 50, 50}, + Name: "asPercent(foo,sumSeries(bar,foo))", + Tags: map[string]string{"name": "asPercent(foo,sumSeries(bar,foo))"}, + }, + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{50, 50, 50}, + Name: "asPercent(bar,sumSeries(bar,foo))", + Tags: map[string]string{"name": "asPercent(bar,sumSeries(bar,foo))"}, + }, + }) + f(`asPercent( + group( + time("foo", 17), + time("bar", 23), + ), + 150 + )`, []*series{ + { + Timestamps: []int64{120000, 137000, 154000, 171000, 188000, 205000}, + Values: []float64{80, 91.33333333333333, 102.66666666666666, 113.99999999999999, 125.33333333333334, 136.66666666666666}, + Name: "asPercent(foo,150)", + Tags: map[string]string{"name": "foo"}, + }, + { + Timestamps: []int64{120000, 143000, 166000, 189000}, + Values: []float64{80, 95.33333333333334, 110.66666666666667, 126}, + Name: "asPercent(bar,150)", + Tags: map[string]string{"name": "bar"}, + }, + }) + f(`asPercent( + group( + time("foo.x", 30), + time("bar.x", 30), + time("bar.y", 30), + ), + group(), + )`, []*series{}) + f(`asPercent( + group( + time("foo.x", 30), + time("bar.x", 30), + time("bar.y", 30), + ), + None, + 0 + )`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{100, 100, 100}, + Name: "asPercent(foo.x,foo.x)", + Tags: map[string]string{"name": "asPercent(foo.x,foo.x)"}, + }, + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{50, 50, 50}, + Name: "asPercent(bar.x,sumSeries(bar.x,bar.y))", + Tags: map[string]string{"name": "asPercent(bar.x,sumSeries(bar.x,bar.y))"}, + }, + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{50, 50, 50}, + Name: "asPercent(bar.y,sumSeries(bar.x,bar.y))", + Tags: map[string]string{"name": "asPercent(bar.y,sumSeries(bar.x,bar.y))"}, + }, + }) + f(`asPercent( + group( + time("foo;a=b", 30), + time("bar", 30) + ), + constantLine(100)|alias("baz;x=y") + )`, []*series{ + { + Timestamps: []int64{120000, 165000}, + Values: []float64{135, 180}, + Name: "asPercent(bar,baz;x=y)", + Tags: map[string]string{"name": "asPercent(bar,baz;x=y)"}, + }, + { + Timestamps: []int64{120000, 165000}, + Values: []float64{135, 180}, + Name: "asPercent(foo;a=b,baz;x=y)", + Tags: map[string]string{"name": "asPercent(foo;a=b,baz;x=y)", "a": "b"}, + }, + }) + f(`asPercent( + group( + time("foo", 30), + time("bar", 30), + ), + group( + time("x", 30), + time("y", 30), + ) + )`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{100, 100, 100}, + Name: "asPercent(bar,y)", + Tags: map[string]string{"name": "asPercent(bar,y)"}, + }, + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{100, 100, 100}, + Name: "asPercent(foo,x)", + Tags: map[string]string{"name": "asPercent(foo,x)"}, + }, + }) + f(`asPercent( + group( + time("foo.x;c=d", 30), + time("bar.b;a=b", 30), + time("bar.a", 30) + ), + group( + time("bar.sss", 30), + time("abc;e=g", 30) + ), + 0 + )`, []*series{ + { + Timestamps: 
[]int64{120000, 150000, 180000}, + Values: []float64{100, 100, 100}, + Name: "asPercent(bar.b;a=b,bar.sss)", + Tags: map[string]string{"name": "asPercent(bar.b;a=b,bar.sss)", "a": "b"}, + }, + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{100, 100, 100}, + Name: "asPercent(bar.a,bar.sss)", + Tags: map[string]string{"name": "asPercent(bar.a,bar.sss)"}, + }, + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{nan, nan, nan}, + Name: `asPercent(MISSING,abc;e=g)`, + Tags: map[string]string{"name": `asPercent(MISSING,abc;e=g)`}, + }, + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{nan, nan, nan}, + Name: "asPercent(foo.x;c=d,MISSING)", + Tags: map[string]string{"name": "asPercent(foo.x;c=d,MISSING)", "c": "d"}, + }, + }) + f(`averageAbove( + group( + time('foo'), + constantLine(10)|alias('bar'), + time('baz', 20)|add(100), + ), + 160 + )`, []*series{ + { + Timestamps: []int64{120000, 140000, 160000, 180000, 200000}, + Values: []float64{220, 240, 260, 280, 300}, + Name: "add(baz,100)", + Tags: map[string]string{"name": "baz", "add": "100"}, + }, + }) + f(`averageBelow( + group( + time('foo'), + constantLine(10)|alias('bar'), + time('baz', 20)|add(100), + ), + 160 + )`, []*series{ + { + Timestamps: []int64{120000, 180000}, + Values: []float64{120, 180}, + Name: "foo", + Tags: map[string]string{"name": "foo"}, + }, + { + Timestamps: []int64{120000, 165000, 210000}, + Values: []float64{10, 10, 10}, + Name: "bar", + Tags: map[string]string{"name": "10"}, + pathExpression: "constantLine(10)", + }, + }) + f(`averageOutsidePercentile( + group( + add(time('a'),-10), + time('b'), + add(time('c'),10), + add(time('d'),20), + ), + 75 + )`, []*series{ + { + Timestamps: []int64{120000, 180000}, + Values: []float64{110, 170}, + Name: "add(a,-10)", + Tags: map[string]string{"name": "a", "add": "-10"}, + }, + { + Timestamps: []int64{120000, 180000}, + Values: []float64{140, 200}, + Name: "add(d,20)", + Tags: map[string]string{"name": "d", "add": "20"}, + }, + }) + f(`averageSeries( + time('foo',30), + time('bar',30), + )`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{120, 150, 180}, + Name: "averageSeries(bar,foo)", + Tags: map[string]string{"name": "averageSeries(bar,foo)", "aggregatedBy": "average"}, + }, + }) + f(`averageSeriesWithWildcards( + group( + time('foo.bar',30), + time('foo.baz',30), + time('xxx.yy',30), + ), + 1 + )`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{120, 150, 180}, + Name: "xxx", + Tags: map[string]string{"aggregatedBy": "average", "name": "xxx.yy"}, + pathExpression: "xxx.yy", + }, + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{120, 150, 180}, + Name: "foo", + Tags: map[string]string{"aggregatedBy": "average", "name": "averageSeries(foo.bar,foo.baz)"}, + pathExpression: "averageSeries(foo.bar,foo.baz)", + }, + }) + f(`averageSeriesWithWildcards( + group( + time('foo.bar',30), + time('foo.baz',30), + time('xxx.yy',30), + ) + )`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{120, 150, 180}, + Name: "xxx.yy", + Tags: map[string]string{"aggregatedBy": "average", "name": "xxx.yy"}, + }, + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{120, 150, 180}, + Name: "foo.bar", + Tags: map[string]string{"aggregatedBy": "average", "name": "foo.bar"}, + }, + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{120, 150, 180}, + Name: "foo.baz", + Tags: 
map[string]string{"aggregatedBy": "average", "name": "foo.baz"}, + }, + }) + f(`avg( + group( + time('foo',30), + time('xxx',30), + ), + time('bar',30), + )`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{120, 150, 180}, + Name: "averageSeries(bar,foo,xxx)", + Tags: map[string]string{"name": "averageSeries(bar,foo,xxx)", "aggregatedBy": "average"}, + }, + }) + f(`changed( + group( + constantLine(123)|alias('foo'), + time('bar') + ) + )`, []*series{ + { + Timestamps: []int64{120000, 165000, 210000}, + Values: []float64{0, 0, 0}, + Name: "changed(foo)", + Tags: map[string]string{"name": "123"}, + pathExpression: "changed(foo)", + }, + { + Timestamps: []int64{120000, 180000}, + Values: []float64{0, 1}, + Name: "changed(bar)", + Tags: map[string]string{"name": "bar"}, + pathExpression: "changed(bar)", + }, + }) + f(`color(time("foo",50),'green')`, []*series{ + { + Timestamps: []int64{120000, 170000}, + Values: []float64{120, 170}, + Name: "foo", + Tags: map[string]string{"name": "foo"}, + }, + }) + f(`averageSeries( + consolidateBy( + group( + time('foo',30), + time('bar',30) + ), + 'first' + ) + )`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{120, 150, 180}, + Name: `averageSeries(consolidateBy(bar,'first'),consolidateBy(foo,'first'))`, + Tags: map[string]string{ + "name": "averageSeries(consolidateBy(bar,'first'),consolidateBy(foo,'first'))", + "aggregatedBy": "average", + "consolidateBy": "first", + }, + }, + }) + f(`constantLine(123) | alias("foo.bar;baz=aaa")`, []*series{ + { + Timestamps: []int64{ec.startTime, (ec.startTime + ec.endTime) / 2, ec.endTime}, + Values: []float64{123, 123, 123}, + Name: "foo.bar;baz=aaa", + Tags: map[string]string{"name": "123"}, + pathExpression: "constantLine(123)", + }, + }) + f("constantLine(123.456)", []*series{ + { + Timestamps: []int64{ec.startTime, (ec.startTime + ec.endTime) / 2, ec.endTime}, + Values: []float64{123.456, 123.456, 123.456}, + Name: "123.456", + Tags: map[string]string{"name": "123.456"}, + pathExpression: "constantLine(123.456)", + }, + }) + f("constantLine(value=-123)", []*series{ + { + Timestamps: []int64{ec.startTime, (ec.startTime + ec.endTime) / 2, ec.endTime}, + Values: []float64{-123, -123, -123}, + Name: "-123", + Tags: map[string]string{"name": "-123"}, + pathExpression: "constantLine(value=-123)", + }, + }) + f(`countSeries( + time('foo',30), + time('bar',30), + )`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{2, 2, 2}, + Name: "countSeries(bar,foo)", + Tags: map[string]string{"name": "countSeries(bar,foo)", "aggregatedBy": "count"}, + }, + }) + f(`averageSeries( + cumulative( + time('foo', 30) + ) + )`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{120, 150, 180}, + Name: `averageSeries(consolidateBy(foo,'sum'))`, + Tags: map[string]string{"name": "foo", "aggregatedBy": "average", "consolidateBy": "sum"}, + }, + }) + f(`currentAbove( + group( + time('foo'), + constantLine(10)|alias('bar'), + time('baz', 20)|add(100), + ), + 200 + )`, []*series{ + { + Timestamps: []int64{120000, 140000, 160000, 180000, 200000}, + Values: []float64{220, 240, 260, 280, 300}, + Name: "add(baz,100)", + Tags: map[string]string{"name": "baz", "add": "100"}, + }, + }) + f(`currentBelow( + group( + time('foo'), + constantLine(10)|alias('bar'), + time('baz', 20)|add(100), + ), + 200 + )`, []*series{ + { + Timestamps: []int64{120000, 180000}, + Values: []float64{120, 180}, + Name: "foo", + Tags: 
map[string]string{"name": "foo"}, + }, + { + Timestamps: []int64{120000, 165000, 210000}, + Values: []float64{10, 10, 10}, + Name: "bar", + Tags: map[string]string{"name": "10"}, + pathExpression: "constantLine(10)", + }, + }) + f(`dashed(time('foo'))`, []*series{ + { + Timestamps: []int64{120000, 180000}, + Values: []float64{120, 180}, + Name: "dashed(foo,5)", + Tags: map[string]string{"name": "foo", "dashed": "5"}, + pathExpression: "foo", + }, + }) + f(`delay(time('foo',20),1)`, []*series{ + { + Timestamps: []int64{120000, 140000, 160000, 180000, 200000}, + Values: []float64{nan, 120, 140, 160, 180}, + Name: "delay(foo,1)", + Tags: map[string]string{"name": "foo", "delay": "1"}, + }, + }) + f(`delay(time('foo',20),-1)`, []*series{ + { + Timestamps: []int64{120000, 140000, 160000, 180000, 200000}, + Values: []float64{140, 160, 180, 200, nan}, + Name: "delay(foo,-1)", + Tags: map[string]string{"name": "foo", "delay": "-1"}, + }, + }) + f(`delay(time('foo',20),0)`, []*series{ + { + Timestamps: []int64{120000, 140000, 160000, 180000, 200000}, + Values: []float64{120, 140, 160, 180, 200}, + Name: "delay(foo,0)", + Tags: map[string]string{"name": "foo", "delay": "0"}, + }, + }) + f(`delay(time('foo',20),100)`, []*series{ + { + Timestamps: []int64{120000, 140000, 160000, 180000, 200000}, + Values: []float64{nan, nan, nan, nan, nan}, + Name: "delay(foo,100)", + Tags: map[string]string{"name": "foo", "delay": "100"}, + }, + }) + f(`delay(time('foo',20),-100)`, []*series{ + { + Timestamps: []int64{120000, 140000, 160000, 180000, 200000}, + Values: []float64{nan, nan, nan, nan, nan}, + Name: "delay(foo,-100)", + Tags: map[string]string{"name": "foo", "delay": "-100"}, + }, + }) + f(`derivative(time('foo',25))`, []*series{ + { + Timestamps: []int64{120000, 145000, 170000, 195000}, + Values: []float64{nan, 25, 25, 25}, + Name: "derivative(foo)", + Tags: map[string]string{"name": "foo", "derivative": "1"}, + }, + }) + f(`diffSeries( + time('foo',30), + time('bar',30), + )`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{0, 0, 0}, + Name: "diffSeries(foo,bar)", + Tags: map[string]string{"name": "diffSeries(foo,bar)", "aggregatedBy": "diff"}, + }, + }) + f(`divideSeries( + group( + time('foo',30), + time('bar',30) + ), + add(time('xx',30),100) + )`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{0.5454545454545454, 0.6, 0.6428571428571429}, + Name: "divideSeries(foo,add(xx,100))", + Tags: map[string]string{"name": "foo"}, + }, + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{0.5454545454545454, 0.6, 0.6428571428571429}, + Name: "divideSeries(bar,add(xx,100))", + Tags: map[string]string{"name": "bar"}, + }, + }) + f(`divideSeries( + group( + time('foo',20), + time('bar',30) + ), + group() + )`, []*series{ + { + Timestamps: []int64{120000, 140000, 160000, 180000, 200000}, + Values: []float64{nan, nan, nan, nan, nan}, + Name: "divideSeries(foo,MISSING)", + Tags: map[string]string{"name": "foo"}, + pathExpression: "divideSeries(foo,MISSING)", + }, + { + Timestamps: []int64{120000, 150000, 180000, 210000}, + Values: []float64{nan, nan, nan, nan}, + Name: "divideSeries(bar,MISSING)", + Tags: map[string]string{"name": "bar"}, + pathExpression: "divideSeries(bar,MISSING)", + }, + }) + f(`divideSeriesLists( + group( + time('foo',30), + time('bar',30) + ), + group( + time('xx',30), + time('y',30), + ) + )`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{1, 1, 1}, + Name: 
"divideSeries(foo,xx)", + Tags: map[string]string{"name": "foo"}, + }, + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{1, 1, 1}, + Name: "divideSeries(bar,y)", + Tags: map[string]string{"name": "bar"}, + }, + }) + f(`drawAsInfinite(time('a'))`, []*series{ + { + Timestamps: []int64{120000, 180000}, + Values: []float64{120, 180}, + Name: "drawAsInfinite(a)", + Tags: map[string]string{"name": "a", "drawAsInfinite": "1"}, + pathExpression: "a", + }, + }) + f(`events()`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{nan, nan, nan}, + Name: "events()", + Tags: map[string]string{"name": "events()"}, + }, + }) + f(`events("foo","bar")`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{nan, nan, nan}, + Name: "events('foo','bar')", + Tags: map[string]string{"name": "events('foo','bar')"}, + }, + }) + f(`exclude( + group( + time("foo.bar.baz"), + time("x"), + ), + "bar" + )`, []*series{ + { + Timestamps: []int64{120000, 180000}, + Values: []float64{120, 180}, + Name: "x", + Tags: map[string]string{"name": "x"}, + }, + }) + f(`exp(scale(time('a',25),1e-2))`, []*series{ + { + Timestamps: []int64{120000, 145000, 170000, 195000}, + Values: []float64{3.3201169227365472, 4.263114515168817, 5.4739473917272, 7.028687580589293}, + Name: "exp(scale(a,0.01))", + Tags: map[string]string{"name": "a", "exp": "e"}, + pathExpression: "exp(scale(a,0.01))", + }, + }) + f(`exponentialMovingAverage(time('a',20),'1min')`, []*series{ + { + Timestamps: []int64{120000, 140000, 160000, 180000, 200000}, + Values: []float64{81.31147540983606, 83.23568933082504, 85.75255197571603, 88.8426322388073, 92.48713609983001}, + Name: "exponentialMovingAverage(a,'1min')", + Tags: map[string]string{"name": "a", "exponentialMovingAverage": "'1min'"}, + }, + }) + f(`exponentialMovingAverage(time('a',20),'10s')`, []*series{ + { + Timestamps: []int64{130000, 150000, 170000, 190000, 210000}, + Values: []float64{113.63636363636364, 120.24793388429751, 129.2937640871525, 140.33126152585203, 152.998304884788}, + Name: "exponentialMovingAverage(a,'10s')", + Tags: map[string]string{"name": "a", "exponentialMovingAverage": "'10s'"}, + }, + }) + f(`exponentialMovingAverage(time('a',20),5)`, []*series{ + { + Timestamps: []int64{130000, 150000, 170000, 190000, 210000}, + Values: []float64{70, 96.66666666666667, 121.11111111111111, 144.07407407407408, 166.0493827160494}, + Name: "exponentialMovingAverage(a,5)", + Tags: map[string]string{"name": "a", "exponentialMovingAverage": "5"}, + }, + }) + f(`fallbackSeries(time('a'),constantLine(10))`, []*series{ + { + Timestamps: []int64{120000, 180000}, + Values: []float64{120, 180}, + Name: "a", + Tags: map[string]string{"name": "a"}, + }, + }) + f(`fallbackSeries(group(),constantLine(10))`, []*series{ + { + Timestamps: []int64{120000, 165000, 210000}, + Values: []float64{10, 10, 10}, + Name: "10", + Tags: map[string]string{"name": "10"}, + pathExpression: "constantLine(10)", + }, + }) + f(`filterSeries( + group( + time('a',20), + add(time('b',20),200), + ), + 'last','>=',300 + )`, []*series{ + { + Timestamps: []int64{120000, 140000, 160000, 180000, 200000}, + Values: []float64{320, 340, 360, 380, 400}, + Name: "add(b,200)", + Tags: map[string]string{"name": "b", "add": "200"}, + }, + }) + f(`filterSeries( + group( + time('a',20), + add(time('b',20),200), + ), + 'first','<=',120 + )`, []*series{ + { + Timestamps: []int64{120000, 140000, 160000, 180000, 200000}, + Values: []float64{120, 140, 160, 180, 200}, + Name: "a", 
+ Tags: map[string]string{"name": "a"}, + }, + }) + f(`filterSeries( + group( + time('a',20), + add(time('b',20),200), + ), + 'first','=',120 + )`, []*series{ + { + Timestamps: []int64{120000, 140000, 160000, 180000, 200000}, + Values: []float64{120, 140, 160, 180, 200}, + Name: "a", + Tags: map[string]string{"name": "a"}, + }, + }) + f(`filterSeries( + group( + time('a',20), + add(time('b',20),200), + ), + 'first','!=',120 + )`, []*series{ + { + Timestamps: []int64{120000, 140000, 160000, 180000, 200000}, + Values: []float64{320, 340, 360, 380, 400}, + Name: "add(b,200)", + Tags: map[string]string{"name": "b", "add": "200"}, + }, + }) + f(`grep( + group( + time("foo.bar.baz"), + time("x"), + ), + "bar" + )`, []*series{ + { + Timestamps: []int64{120000, 180000}, + Values: []float64{120, 180}, + Name: "foo.bar.baz", + Tags: map[string]string{"name": "foo.bar.baz"}, + }, + }) + f("group()", []*series{}) + f("group(constantLine(1)|alias('foo'), constantLine(2) | alias('bar'))", []*series{ + { + Timestamps: []int64{ec.startTime, (ec.startTime + ec.endTime) / 2, ec.endTime}, + Values: []float64{1, 1, 1}, + Name: "foo", + Tags: map[string]string{"name": "1"}, + pathExpression: "constantLine(1)", + }, + { + Timestamps: []int64{ec.startTime, (ec.startTime + ec.endTime) / 2, ec.endTime}, + Values: []float64{2, 2, 2}, + Name: "bar", + Tags: map[string]string{"name": "2"}, + pathExpression: "constantLine(2)", + }, + }) + f(`groupByNode( + group( + time("foo.bar", 30), + time("foo.baz", 30) + ), + 0 + )`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{120, 150, 180}, + Name: "foo", + Tags: map[string]string{"aggregatedBy": "average", "name": "averageSeries(foo.bar,foo.baz)"}, + pathExpression: "averageSeries(foo.bar,foo.baz)", + }, + }) + f(`groupByNode( + group( + time("foo.bar", 30), + time("foo.baz", 30) + ), + 0, + 'last' + )`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{120, 150, 180}, + Name: "foo", + Tags: map[string]string{"aggregatedBy": "last", "name": "lastSeries(foo.bar,foo.baz)"}, + pathExpression: "lastSeries(foo.bar,foo.baz)", + }, + }) + f(`groupByNodes( + group( + time("foo.bar", 30), + time("foo.baz", 30) + ), + callback='first', + )`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{120, 150, 180}, + Name: "", + Tags: map[string]string{"aggregatedBy": "first", "name": "firstSeries(foo.bar,foo.baz)"}, + pathExpression: "firstSeries(foo.bar,foo.baz)", + }, + }) + f(`groupByNodes( + group( + time("foo.bar", 30), + time("foo.baz", 30) + ), + 'median', + 0 + )`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{120, 150, 180}, + Name: "foo", + Tags: map[string]string{"aggregatedBy": "median", "name": "medianSeries(foo.bar,foo.baz)"}, + pathExpression: "medianSeries(foo.bar,foo.baz)", + }, + }) + f(`groupByTags( + group( + time("foo;bar=baz", 30), + time("x;bar=baz;aa=bb", 30) + ), + 'median', + 'bar' + )`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{120, 150, 180}, + Name: "median;bar=baz", + Tags: map[string]string{"aggregatedBy": "median", "bar": "baz", "name": `medianSeries(foo;bar=baz,x;bar=baz;aa=bb)`}, + pathExpression: "medianSeries(foo;bar=baz,x;bar=baz;aa=bb)", + }, + }) + f(`groupByTags( + group( + time("foo;bar=baz", 30), + time("x;bar=baz;aa=bb", 30) + ), + 'median', + 'bar', 'name' + )`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{120, 150, 180}, + Name: 
"foo;bar=baz", + Tags: map[string]string{"aggregatedBy": "median", "bar": "baz", "name": "foo"}, + pathExpression: "foo", + }, + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{120, 150, 180}, + Name: "x;bar=baz", + Tags: map[string]string{"aa": "bb", "aggregatedBy": "median", "bar": "baz", "name": "x"}, + pathExpression: "x", + }, + }) + f(`highest( + group( + time('foo',25), + time('bar',27), + time('baz',23), + ) + )`, []*series{ + { + Timestamps: []int64{120000, 147000, 174000, 201000}, + Values: []float64{120, 147, 174, 201}, + Name: "bar", + Tags: map[string]string{"name": "bar"}, + }, + }) + f(`highest( + group( + time('foo',25), + time('bar',27), + time('baz',23), + ), + 4, + 'avg' + )`, []*series{ + { + Timestamps: []int64{120000, 147000, 174000, 201000}, + Values: []float64{120, 147, 174, 201}, + Name: "bar", + Tags: map[string]string{"name": "bar"}, + }, + { + Timestamps: []int64{120000, 145000, 170000, 195000}, + Values: []float64{120, 145, 170, 195}, + Name: "foo", + Tags: map[string]string{"name": "foo"}, + }, + { + Timestamps: []int64{120000, 143000, 166000, 189000}, + Values: []float64{120, 143, 166, 189}, + Name: "baz", + Tags: map[string]string{"name": "baz"}, + }, + }) + f(`highestAverage( + group( + time('foo',25), + time('bar',27), + time('baz',23), + ), + 1 + )`, []*series{ + { + Timestamps: []int64{120000, 147000, 174000, 201000}, + Values: []float64{120, 147, 174, 201}, + Name: "bar", + Tags: map[string]string{"name": "bar"}, + }, + }) + f(`highestCurrent( + group( + time('foo',25), + time('bar',27), + time('baz',23), + ), + 1 + )`, []*series{ + { + Timestamps: []int64{120000, 147000, 174000, 201000}, + Values: []float64{120, 147, 174, 201}, + Name: "bar", + Tags: map[string]string{"name": "bar"}, + }, + }) + f(`highestMax( + group( + time('foo',25), + time('bar',27), + time('baz',23), + ), + 2 + )`, []*series{ + { + Timestamps: []int64{120000, 147000, 174000, 201000}, + Values: []float64{120, 147, 174, 201}, + Name: "bar", + Tags: map[string]string{"name": "bar"}, + }, + { + Timestamps: []int64{120000, 145000, 170000, 195000}, + Values: []float64{120, 145, 170, 195}, + Name: "foo", + Tags: map[string]string{"name": "foo"}, + }, + }) + f(`hitcount(time('foo',20),'60s')`, []*series{ + { + Timestamps: []int64{120000, 180000}, + Values: []float64{6000, 4000}, + Name: "hitcount(foo,'60s')", + Tags: map[string]string{"name": "foo", "hitcount": "60s"}, + }, + }) + f(`hitcount(time('foo',25),'60s')`, []*series{ + { + Timestamps: []int64{120000, 180000}, + Values: []float64{7875, 5475}, + Name: "hitcount(foo,'60s')", + Tags: map[string]string{"name": "foo", "hitcount": "60s"}, + }, + }) + f(`hitcount(time('foo',25),'60s',true)`, []*series{ + { + Timestamps: []int64{120000, 180000}, + Values: []float64{7875, 5475}, + Name: "hitcount(foo,'60s',true)", + Tags: map[string]string{"name": "foo", "hitcount": "60s"}, + }, + }) + f(`identity('foo')`, []*series{ + { + Timestamps: []int64{120000, 180000}, + Values: []float64{120, 180}, + Name: "foo", + Tags: map[string]string{"name": "foo"}, + }, + }) + f(`integral( + group( + time('foo',30), + time('bar',25), + ) + )`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000, 210000}, + Values: []float64{120, 270, 450, 660}, + Name: "integral(foo)", + Tags: map[string]string{"name": "foo", "integral": "1"}, + }, + { + Timestamps: []int64{120000, 145000, 170000, 195000}, + Values: []float64{120, 265, 435, 630}, + Name: "integral(bar)", + Tags: map[string]string{"name": "bar", "integral": "1"}, + }, + }) + 
f(`integralByInterval( + group( + time('foo',30), + time('bar',25), + ), + '60s' + )`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000, 210000}, + Values: []float64{120, 270, 180, 390}, + Name: "integralByInterval(foo,'60s')", + Tags: map[string]string{"name": "foo", "integralByInterval": "1"}, + }, + { + Timestamps: []int64{120000, 145000, 170000, 195000}, + Values: []float64{120, 265, 435, 195}, + Name: "integralByInterval(bar,'60s')", + Tags: map[string]string{"name": "bar", "integralByInterval": "1"}, + }, + }) + f(`interpolate(time('a'))`, []*series{ + { + Timestamps: []int64{120000, 180000}, + Values: []float64{120, 180}, + Name: "interpolate(a)", + Tags: map[string]string{"name": "a"}, + pathExpression: "interpolate(a)", + }, + }) + f(`invert(time('a'))`, []*series{ + { + Timestamps: []int64{120000, 180000}, + Values: []float64{0.008333333333333333, 0.005555555555555556}, + Name: "invert(a)", + Tags: map[string]string{"name": "a", "invert": "1"}, + pathExpression: "invert(a)", + }, + }) + f(`keepLastValue(removeAboveValue(time('a'),150))`, []*series{ + { + Timestamps: []int64{120000, 180000}, + Values: []float64{120, 120}, + Name: "keepLastValue(removeAboveValue(a,150))", + Tags: map[string]string{"name": "a"}, + pathExpression: ("keepLastValue(removeAboveValue(a,150))"), + }, + }) + f(`limit( + group( + time('foo',25), + time('bar',27), + time('baz',23), + ), + 1 + )`, []*series{ + { + Timestamps: []int64{120000, 145000, 170000, 195000}, + Values: []float64{120, 145, 170, 195}, + Name: "foo", + Tags: map[string]string{"name": "foo"}, + }, + }) + f(`lineWidth(time('a'),2)`, []*series{ + { + Timestamps: []int64{120000, 180000}, + Values: []float64{120, 180}, + Name: "a", + Tags: map[string]string{"name": "a"}, + }, + }) + f(`logarithm(time('a'))`, []*series{ + { + Timestamps: []int64{120000, 180000}, + Values: []float64{2.0791812460476247, 2.255272505103306}, + Name: "log(a,10)", + Tags: map[string]string{"name": "a", "log": "10"}, + }, + }) + f(`logarithm(time('a'), 2)`, []*series{ + { + Timestamps: []int64{120000, 180000}, + Values: []float64{6.906890595608519, 7.491853096329675}, + Name: "log(a,2)", + Tags: map[string]string{"name": "a", "log": "2"}, + }, + }) + f(`logarithm(time('a'),-2)`, []*series{ + { + Timestamps: []int64{120000, 180000}, + Values: []float64{nan, nan}, + Name: "log(a,-2)", + Tags: map[string]string{"name": "a", "log": "-2"}, + }, + }) + f(`logit(invert(time('a')))`, []*series{ + { + Timestamps: []int64{120000, 180000}, + Values: []float64{-4.77912349311153, -5.187385805840755}, + Name: "logit(invert(a))", + Tags: map[string]string{"name": "a", "invert": "1", "logit": "logit"}, + pathExpression: "logit(invert(a))", + }, + }) + f(`logit(time('a'))`, []*series{ + { + Timestamps: []int64{120000, 180000}, + Values: []float64{nan, nan}, + Name: "logit(a)", + Tags: map[string]string{"name": "a", "logit": "logit"}, + pathExpression: "logit(a)", + }, + }) + f(`lowest( + group( + time('foo',25), + time('bar',27), + time('baz',23), + ) + )`, []*series{ + { + Timestamps: []int64{120000, 143000, 166000, 189000}, + Values: []float64{120, 143, 166, 189}, + Name: "baz", + Tags: map[string]string{"name": "baz"}, + }, + }) + f(`lowest( + group( + time('foo',25), + time('bar',27), + time('baz',23), + ), + 2, + 'sum' + )`, []*series{ + { + Timestamps: []int64{120000, 143000, 166000, 189000}, + Values: []float64{120, 143, 166, 189}, + Name: "baz", + Tags: map[string]string{"name": "baz"}, + }, + { + Timestamps: []int64{120000, 145000, 170000, 195000}, + Values: 
[]float64{120, 145, 170, 195}, + Name: "foo", + Tags: map[string]string{"name": "foo"}, + }, + }) + f(`lowestAverage( + group( + time('foo',25), + time('bar',27), + time('baz',23), + ), + 1 + )`, []*series{ + { + Timestamps: []int64{120000, 143000, 166000, 189000}, + Values: []float64{120, 143, 166, 189}, + Name: "baz", + Tags: map[string]string{"name": "baz"}, + }, + }) + f(`lowestCurrent( + group( + time('foo',25), + time('bar',27), + time('baz',23), + ), + 1 + )`, []*series{ + { + Timestamps: []int64{120000, 143000, 166000, 189000}, + Values: []float64{120, 143, 166, 189}, + Name: "baz", + Tags: map[string]string{"name": "baz"}, + }, + }) + f(`maxSeries( + time('foo',30), + time('bar',30), + )`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{120, 150, 180}, + Name: "maxSeries(bar,foo)", + Tags: map[string]string{"name": "maxSeries(bar,foo)", "aggregatedBy": "max"}, + }, + }) + f(`maximumAbove( + group( + time('foo'), + constantLine(10)|alias('bar'), + time('baz', 20)|add(100), + ), + 200 + )`, []*series{ + { + Timestamps: []int64{120000, 140000, 160000, 180000, 200000}, + Values: []float64{220, 240, 260, 280, 300}, + Name: "add(baz,100)", + Tags: map[string]string{"name": "baz", "add": "100"}, + }, + }) + f(`maximumBelow( + group( + time('foo'), + constantLine(10)|alias('bar'), + time('baz', 20)|add(100), + ), + 200 + )`, []*series{ + { + Timestamps: []int64{120000, 180000}, + Values: []float64{120, 180}, + Name: "foo", + Tags: map[string]string{"name": "foo"}, + }, + { + Timestamps: []int64{120000, 165000, 210000}, + Values: []float64{10, 10, 10}, + Name: "bar", + Tags: map[string]string{"name": "10"}, + pathExpression: "constantLine(10)", + }, + }) + f(`minMax(time('foo',20))`, []*series{ + { + Timestamps: []int64{120000, 140000, 160000, 180000, 200000}, + Values: []float64{0, 0.25, 0.5, 0.75, 1}, + Name: "minMax(foo)", + Tags: map[string]string{"name": "foo"}, + pathExpression: "minMax(foo)", + }, + }) + f(`minSeries( + time('foo',30), + time('bar',30), + )`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{120, 150, 180}, + Name: "minSeries(bar,foo)", + Tags: map[string]string{"name": "minSeries(bar,foo)", "aggregatedBy": "min"}, + }, + }) + f(`minimumAbove( + group( + time('foo'), + constantLine(10)|alias('bar'), + time('baz', 20)|add(100), + ), + 200 + )`, []*series{ + { + Timestamps: []int64{120000, 140000, 160000, 180000, 200000}, + Values: []float64{220, 240, 260, 280, 300}, + Name: "add(baz,100)", + Tags: map[string]string{"name": "baz", "add": "100"}, + }, + }) + f(`minimumBelow( + group( + time('foo'), + constantLine(10)|alias('bar'), + time('baz', 20)|add(100), + ), + 200 + )`, []*series{ + { + Timestamps: []int64{120000, 180000}, + Values: []float64{120, 180}, + Name: "foo", + Tags: map[string]string{"name": "foo"}, + }, + { + Timestamps: []int64{120000, 165000, 210000}, + Values: []float64{10, 10, 10}, + Name: "bar", + Tags: map[string]string{"name": "10"}, + pathExpression: "constantLine(10)", + }, + }) + f(`mostDeviant( + group( + time('foo',18), + time('bar',23), + time('baz',30), + ), + 1 + )`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000, 210000}, + Values: []float64{120, 150, 180, 210}, + Name: "baz", + Tags: map[string]string{"name": "baz"}, + }, + }) + f(`movingAverage( + group( + time('foo',30), + time('bar',30), + ), + 5 + )`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000, 210000}, + Values: []float64{30, 60, 90, 120}, + Name: "movingAverage(foo,5)", + Tags: 
map[string]string{"name": "foo", "movingAverage": "5"}, + }, + { + Timestamps: []int64{120000, 150000, 180000, 210000}, + Values: []float64{30, 60, 90, 120}, + Name: "movingAverage(bar,5)", + Tags: map[string]string{"name": "bar", "movingAverage": "5"}, + }, + }) + f(`movingAverage( + summarize( + group( + time('foo',10), + time('bar',20), + ),'1m','sum',false + ), + 2 + )`, []*series{ + { + Timestamps: []int64{120000, 180000}, + Values: []float64{330, 690}, + Name: "movingAverage(summarize(foo,'1m','sum'),2)", + Tags: map[string]string{"name": "foo", "movingAverage": "2", "summarize": "1m", "summarizeFunction": "sum"}, + }, + { + Timestamps: []int64{120000, 180000}, + Values: []float64{150, 330}, + Name: "movingAverage(summarize(bar,'1m','sum'),2)", + Tags: map[string]string{"name": "bar", "movingAverage": "2", "summarize": "1m", "summarizeFunction": "sum"}, + }, + }) + f(`movingMax( + group( + time('foo',30), + time('bar',30), + ), + 5 + )`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000, 210000}, + Values: []float64{90, 120, 150, 180}, + Name: "movingMax(foo,5)", + Tags: map[string]string{"name": "foo", "movingMax": "5"}, + }, + { + Timestamps: []int64{120000, 150000, 180000, 210000}, + Values: []float64{90, 120, 150, 180}, + Name: "movingMax(bar,5)", + Tags: map[string]string{"name": "bar", "movingMax": "5"}, + }, + }) + f(`movingMedian( + group( + time('foo',30), + time('bar',30), + ), + 5 + )`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000, 210000}, + Values: []float64{30, 60, 90, 120}, + Name: "movingMedian(foo,5)", + Tags: map[string]string{"name": "foo", "movingMedian": "5"}, + }, + { + Timestamps: []int64{120000, 150000, 180000, 210000}, + Values: []float64{30, 60, 90, 120}, + Name: "movingMedian(bar,5)", + Tags: map[string]string{"name": "bar", "movingMedian": "5"}, + }, + }) + f(`movingMin( + group( + time('foo',30), + time('bar',30), + ), + 5 + )`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000, 210000}, + Values: []float64{-30, 0, 30, 60}, + Name: "movingMin(foo,5)", + Tags: map[string]string{"name": "foo", "movingMin": "5"}, + }, + { + Timestamps: []int64{120000, 150000, 180000, 210000}, + Values: []float64{-30, 0, 30, 60}, + Name: "movingMin(bar,5)", + Tags: map[string]string{"name": "bar", "movingMin": "5"}, + }, + }) + f(`movingSum( + group( + time('foo',30), + time('bar',30), + ), + 5 + )`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000, 210000}, + Values: []float64{150, 300, 450, 600}, + Name: "movingSum(foo,5)", + Tags: map[string]string{"name": "foo", "movingSum": "5"}, + }, + { + Timestamps: []int64{120000, 150000, 180000, 210000}, + Values: []float64{150, 300, 450, 600}, + Name: "movingSum(bar,5)", + Tags: map[string]string{"name": "bar", "movingSum": "5"}, + }, + }) + f(`movingWindow( + group( + time('foo',30), + time('bar',30), + ), + 5 + )`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000, 210000}, + Values: []float64{30, 60, 90, 120}, + Name: "movingAvg(foo,5)", + Tags: map[string]string{"name": "foo", "movingAvg": "5"}, + }, + { + Timestamps: []int64{120000, 150000, 180000, 210000}, + Values: []float64{30, 60, 90, 120}, + Name: "movingAvg(bar,5)", + Tags: map[string]string{"name": "bar", "movingAvg": "5"}, + }, + }) + f(`movingWindow( + group( + time('foo',30), + time('bar',30), + ), + '30s', + 'avg_zero' + )`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000, 210000}, + Values: []float64{90, 120, 150, 180}, + Name: `movingAvg_zero(foo,'30s')`, + Tags: map[string]string{"name": "foo", 
"movingAvg_zero": "'30s'"}, + }, + { + Timestamps: []int64{120000, 150000, 180000, 210000}, + Values: []float64{90, 120, 150, 180}, + Name: `movingAvg_zero(bar,'30s')`, + Tags: map[string]string{"name": "bar", "movingAvg_zero": "'30s'"}, + }, + }) + f(`multiplySeries( + time('foo',30), + time('bar',30), + )`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{14400, 22500, 32400}, + Name: "multiplySeries(bar,foo)", + Tags: map[string]string{"name": "multiplySeries(bar,foo)", "aggregatedBy": "multiply"}, + }, + }) + f(`multiplySeriesWithWildcards( + group( + time('foo.bar',30), + time('foo.baz',30), + time('xxx.yy',30), + ), + 1 + )`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{120, 150, 180}, + Name: "xxx", + Tags: map[string]string{"aggregatedBy": "multiply", "name": "xxx.yy"}, + pathExpression: "xxx.yy", + }, + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{14400, 22500, 32400}, + Name: "foo", + Tags: map[string]string{"aggregatedBy": "multiply", "name": "multiplySeries(foo.bar,foo.baz)"}, + pathExpression: "multiplySeries(foo.bar,foo.baz)", + }, + }) + f(`nPercentile( + group( + time('a',20), + time('b',17) + ), + 30 + )`, []*series{ + { + Timestamps: []int64{120000, 140000, 160000, 180000, 200000}, + Values: []float64{140, 140, 140, 140, 140}, + Name: "nPercentile(a,30)", + Tags: map[string]string{"name": "a", "nPercentile": "30"}, + }, + { + Timestamps: []int64{120000, 137000, 154000, 171000, 188000, 205000}, + Values: []float64{154, 154, 154, 154, 154, 154}, + Name: "nPercentile(b,30)", + Tags: map[string]string{"name": "b", "nPercentile": "30"}, + }, + }) + f(`nonNegativeDerivative(time('foo.bar;baz=1',25))`, []*series{ + { + Timestamps: []int64{120000, 145000, 170000, 195000}, + Values: []float64{nan, 25, 25, 25}, + Name: "nonNegativeDerivative(foo.bar;baz=1)", + Tags: map[string]string{"name": "foo.bar", "baz": "1", "nonNegativeDerivative": "1"}, + }, + }) + f(`offset(time('a'),10)`, []*series{ + { + Timestamps: []int64{120000, 180000}, + Values: []float64{130, 190}, + Name: "offset(a,10)", + Tags: map[string]string{"name": "a", "offset": "10"}, + pathExpression: "offset(a,10)", + }, + }) + f(`offsetToZero(time('a',30))`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000, 210000}, + Values: []float64{0, 30, 60, 90}, + Name: "offsetToZero(a)", + Tags: map[string]string{"name": "a", "offsetToZero": "120"}, + pathExpression: "offsetToZero(a)", + }, + }) + f(`rangeOfSeries( + time('foo',30), + time('bar',30), + )`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{0, 0, 0}, + Name: "rangeOfSeries(bar,foo)", + Tags: map[string]string{"name": "rangeOfSeries(bar,foo)", "aggregatedBy": "rangeOf"}, + }, + }) + f(`pow(time('a'),0.5)`, []*series{ + { + Timestamps: []int64{120000, 180000}, + Values: []float64{10.954451150103322, 13.416407864998739}, + Name: "pow(a,0.5)", + Tags: map[string]string{"name": "a", "pow": "0.5"}, + pathExpression: "pow(a,0.5)", + }, + }) + f(`powSeries( + time('a',30), + time('b',30), + )`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{3.1750423737803376e+249, math.Inf(1), math.Inf(1)}, + Name: "powSeries(a,b)", + Tags: map[string]string{"name": "powSeries(a,b)", "aggregatedBy": "pow"}, + }, + }) + f(`removeAbovePercentile(time('a',35), 50)`, []*series{ + { + Timestamps: []int64{120000, 155000, 190000}, + Values: []float64{120, 155, nan}, + Name: "removeAbovePercentile(a,50)", + Tags: 
map[string]string{"name": "a"}, + pathExpression: "removeAbovePercentile(a,50)", + }, + }) + f(`removeAboveValue(time('a'), 150)`, []*series{ + { + Timestamps: []int64{120000, 180000}, + Values: []float64{120, nan}, + Name: "removeAboveValue(a,150)", + Tags: map[string]string{"name": "a"}, + pathExpression: "removeAboveValue(a,150)", + }, + }) + f(`removeBelowPercentile(time('a'), 50)`, []*series{ + { + Timestamps: []int64{120000, 180000}, + Values: []float64{nan, 180}, + Name: "removeBelowPercentile(a,50)", + Tags: map[string]string{"name": "a"}, + pathExpression: "removeBelowPercentile(a,50)", + }, + }) + f(`removeBelowValue(time('a'), 150)`, []*series{ + { + Timestamps: []int64{120000, 180000}, + Values: []float64{nan, 180}, + Name: "removeBelowValue(a,150)", + Tags: map[string]string{"name": "a"}, + pathExpression: "removeBelowValue(a,150)", + }, + }) + f(`removeBetweenPercentile( + group( + time('a',30), + time('b',30), + time('c',30), + ), + 70 + )`, []*series{}) + f(`removeEmptySeries(time('a'))`, []*series{ + { + Timestamps: []int64{120000, 180000}, + Values: []float64{120, 180}, + Name: "a", + Tags: map[string]string{"name": "a"}, + }, + }) + f(`removeEmptySeries(removeBelowValue(time('a'),150),1)`, []*series{}) + f(`round(time('a',17),-1)`, []*series{ + { + Timestamps: []int64{120000, 137000, 154000, 171000, 188000, 205000}, + Values: []float64{120, 140, 150, 170, 190, 210}, + Name: "round(a,-1)", + Tags: map[string]string{"name": "a"}, + pathExpression: "round(a,-1)", + }, + }) + f(`round(time('a',17))`, []*series{ + { + Timestamps: []int64{120000, 137000, 154000, 171000, 188000, 205000}, + Values: []float64{120, 137, 154, 171, 188, 205}, + Name: "round(a)", + Tags: map[string]string{"name": "a"}, + pathExpression: "round(a)", + }, + }) + f(`scale(time('a'),0.5)`, []*series{ + { + Timestamps: []int64{120000, 180000}, + Values: []float64{60, 90}, + Name: "scale(a,0.5)", + Tags: map[string]string{"name": "a"}, + pathExpression: ("scale(a,0.5)"), + }, + }) + f(`setXFilesFactor( + time('foo',20), + 0.5 + )`, []*series{ + { + Timestamps: []int64{120000, 140000, 160000, 180000, 200000}, + Values: []float64{120, 140, 160, 180, 200}, + Name: "foo", + Tags: map[string]string{"name": "foo", "xFilesFactor": "0.5"}, + }, + }) + f(`sumSeriesWithWildcards( + group( + time('foo.bar',30), + time('foo.baz',30), + time('xxx.yy',30), + ), + 1 + )`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{120, 150, 180}, + Name: "xxx", + Tags: map[string]string{"aggregatedBy": "sum", "name": "xxx.yy"}, + pathExpression: "xxx.yy", + }, + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{240, 300, 360}, + Name: "foo", + Tags: map[string]string{"aggregatedBy": "sum", "name": "sumSeries(foo.bar,foo.baz)"}, + pathExpression: "sumSeries(foo.bar,foo.baz)", + }, + }) + f(`summarize( + group( + time('foo',13), + time('bar',21), + ), + '45s' + )`, []*series{ + { + Timestamps: []int64{90000, 135000, 180000}, + Values: []float64{333, 327, 411}, + Name: `summarize(bar,'45s','sum')`, + Tags: map[string]string{"name": "bar", "summarize": "45s", "summarizeFunction": "sum"}, + }, + { + Timestamps: []int64{90000, 135000, 180000}, + Values: []float64{438, 465, 802}, + Name: `summarize(foo,'45s','sum')`, + Tags: map[string]string{"name": "foo", "summarize": "45s", "summarizeFunction": "sum"}, + }, + }) + f(`summarize( + group( + time('foo',13), + time('bar',21), + ), + '45s', + 'sum', + True + )`, []*series{ + { + Timestamps: []int64{120000, 165000}, + Values: []float64{558, 
555}, + Name: `summarize(foo,'45s','sum',true)`, + Tags: map[string]string{"name": "foo", "summarize": "45s", "summarizeFunction": "sum"}, + }, + { + Timestamps: []int64{120000, 165000}, + Values: []float64{423, 387}, + Name: `summarize(bar,'45s','sum',true)`, + Tags: map[string]string{"name": "bar", "summarize": "45s", "summarizeFunction": "sum"}, + }, + }) + f(`summarize( + group( + time('foo',13), + time('bar',21), + ), + '45s', + 'sumSeries', + True + )`, []*series{ + { + Timestamps: []int64{120000, 165000}, + Values: []float64{558, 555}, + Name: `summarize(foo,'45s','sumSeries',true)`, + Tags: map[string]string{"name": "foo", "summarize": "45s", "summarizeFunction": "sumSeries"}, + }, + { + Timestamps: []int64{120000, 165000}, + Values: []float64{423, 387}, + Name: `summarize(bar,'45s','sumSeries',true)`, + Tags: map[string]string{"name": "bar", "summarize": "45s", "summarizeFunction": "sumSeries"}, + }, + }) + f(`summarize( + group( + time('foo',13), + time('bar',21), + ), + '45s', + 'last', + True + )`, []*series{ + { + Timestamps: []int64{120000, 165000}, + Values: []float64{159, 198}, + Name: `summarize(foo,'45s','last',true)`, + Tags: map[string]string{"name": "foo", "summarize": "45s", "summarizeFunction": "last"}, + }, + { + Timestamps: []int64{120000, 165000}, + Values: []float64{162, 204}, + Name: `summarize(bar,'45s','last',true)`, + Tags: map[string]string{"name": "bar", "summarize": "45s", "summarizeFunction": "last"}, + }, + }) + f(`time('foo.bar;baz=aa', 40)`, []*series{ + { + Timestamps: []int64{ec.startTime, ec.startTime + 40e3, ec.startTime + 80e3}, + Values: []float64{float64(ec.startTime) / 1e3, float64(ec.startTime)/1e3 + 40, float64(ec.startTime)/1e3 + 80}, + Name: "foo.bar;baz=aa", + Tags: map[string]string{"name": "foo.bar", "baz": "aa"}, + pathExpression: "foo.bar;baz=aa", + }, + }) + f(`timeFunction("foo.bar.baz")`, []*series{ + { + Timestamps: []int64{ec.startTime, ec.startTime + 60e3}, + Values: []float64{float64(ec.startTime) / 1e3, float64(ec.startTime)/1e3 + 60}, + Name: "foo.bar.baz", + Tags: map[string]string{"name": "foo.bar.baz"}, + pathExpression: "foo.bar.baz", + }, + }) + f(`timeFunction('foo.bar;baz=aa', step=30)`, []*series{ + { + Timestamps: []int64{ec.startTime, ec.startTime + 30e3, ec.startTime + 60e3, ec.startTime + 90e3}, + Values: []float64{float64(ec.startTime) / 1e3, float64(ec.startTime)/1e3 + 30, float64(ec.startTime)/1e3 + 60, float64(ec.startTime)/1e3 + 90}, + Name: "foo.bar;baz=aa", + Tags: map[string]string{"name": "foo.bar", "baz": "aa"}, + }, + }) + f(`weightedAverage(time('foo',30),time('bar',30))`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{120, 150, 180}, + Name: "weightedAverage(foo,bar,)", + Tags: map[string]string{"name": "weightedAverage(foo,bar,)"}, + }, + }) + f(`weightedAverage( + group( + time("foo.x", 30), + time("bar.y", 30), + ), + group( + time("bar.x", 30), + time("foo.y", 30), + ), + 0 + )`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{120, 150, 180}, + Name: "weightedAverage(bar.y,foo.x,bar.x,foo.y,0)", + Tags: map[string]string{"name": "weightedAverage(bar.y,foo.x,bar.x,foo.y,0)"}, + }, + }) + f(`weightedAverage( + group( + time("foo", 10) | alias("foo.bar1"), + time("bar", 10) | alias("foo.bar2"), + ), + group( + time("bar", 10) | alias("foo.bar3"), + time("foo", 10) | alias("foo.bar4"), + ), + 1 + )`, []*series{}) + f(`weightedAverage( + group( + time("foo0.bar2",30), + time("foo0.bar1",30) , + ), + group( + time("foo1.bar1",30), + 
time("foo1.bar2",30), + ), + 1 + )`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{120, 150, 180}, + Name: "weightedAverage(foo0.bar1,foo0.bar2,foo1.bar1,foo1.bar2,1)", + Tags: map[string]string{"name": "weightedAverage(foo0.bar1,foo0.bar2,foo1.bar1,foo1.bar2,1)"}, + }, + }) + f(`xFilesFactor( + time('foo',20), + 0.5 + )`, []*series{ + { + Timestamps: []int64{120000, 140000, 160000, 180000, 200000}, + Values: []float64{120, 140, 160, 180, 200}, + Name: "foo", + Tags: map[string]string{"name": "foo", "xFilesFactor": "0.5"}, + }, + }) + f(`verticalLine("00:03_19700101","event","blue")`, []*series{ + { + Timestamps: []int64{180000, 180000}, + Values: []float64{1.0, 1.0}, + Name: "event", + Tags: map[string]string{"name": "event"}, + }, + }) + f(`verticalLine("00:0319700101","event","blue")`, []*series{ + { + Timestamps: []int64{180000, 180000}, + Values: []float64{1.0, 1.0}, + Name: "event", + Tags: map[string]string{"name": "event"}, + }, + }) + f(`useSeriesAbove(time('foo.baz',10),10000,"reqs","time")`, []*series{}) + f(`unique(time('foo',30),time('foo',30))`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000, 210000}, + Values: []float64{120.0, 150.0, 180.0, 210.0}, + Name: "foo", + Tags: map[string]string{"name": "foo"}, + }, + }) + f(`unique(time('foo',30),time('foo',40),time('foo.bar',40))`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000, 210000}, + Values: []float64{120.0, 150.0, 180.0, 210.0}, + Name: "foo", + Tags: map[string]string{"name": "foo"}, + }, + { + Timestamps: []int64{120000, 160000, 200000}, + Values: []float64{120.0, 160.0, 200.0}, + Name: "foo.bar", + Tags: map[string]string{"name": "foo.bar"}, + }, + }) + f(`perSecond(time('foo.bar;baz=1',25))`, []*series{ + { + Timestamps: []int64{120000, 145000, 170000, 195000}, + Values: []float64{nan, 1, 1, 1}, + Name: "perSecond(foo.bar;baz=1)", + Tags: map[string]string{"name": "foo.bar", "baz": "1", "perSecond": "1"}, + }, + }) + f(`perSecond(time('foo.bar;baz=1',25),150)`, []*series{ + { + Timestamps: []int64{120000, 145000, 170000, 195000}, + Values: []float64{nan, 1, nan, nan}, + Name: "perSecond(foo.bar;baz=1)", + Tags: map[string]string{"name": "foo.bar", "baz": "1", "perSecond": "1"}, + }, + }) + f(`perSecond(time('foo.bar;baz=1',25),None,140)`, []*series{ + { + Timestamps: []int64{120000, 145000, 170000, 195000}, + Values: []float64{nan, nan, 1, 1}, + Name: "perSecond(foo.bar;baz=1)", + Tags: map[string]string{"name": "foo.bar", "baz": "1", "perSecond": "1"}, + }, + }) + f(`percentileOfSeries( + group( + time('a',30), + time('b',30), + time('c',30) + ), + 40 + )`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{120, 150, 180}, + Name: "percentileOfSeries(a,40)", + Tags: map[string]string{"name": "percentileOfSeries(a,40)"}, + }, + }) + f(`percentileOfSeries( + group( + time('a',30), + time('b',30), + time('c',30), + ), + 90 + )`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{120, 150, 180}, + Name: "percentileOfSeries(a,90)", + Tags: map[string]string{"name": "percentileOfSeries(a,90)"}, + }, + }) + f(`transformNull(time('foo.bar',35),-1,time('foo.bar',30))`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{120, 155, 190}, + Name: "transformNull(foo.bar,-1,referenceSeries)", + Tags: map[string]string{"name": "foo.bar", "referenceSeries": "1", "transformNull": "-1"}, + pathExpression: "transformNull(foo.bar,-1,referenceSeries)", + }, + }) + 
f(`transformNull(time('foo.bar',35),-1)`, []*series{ + { + Timestamps: []int64{120000, 155000, 190000}, + Values: []float64{120, 155, 190}, + Name: "transformNull(foo.bar,-1)", + Tags: map[string]string{"name": "foo.bar", "transformNull": "-1"}, + pathExpression: "transformNull(foo.bar,-1)", + }, + }) + f(`timeShift(time('foo.bar;baz=1',25),"+1min")`, []*series{ + { + Timestamps: []int64{120000, 145000}, + Values: []float64{180, 205}, + Name: `timeShift(foo.bar;baz=1,'+1min')`, + Tags: map[string]string{"name": "foo.bar", "baz": "1", "timeShift": "+1min"}, + pathExpression: "foo.bar;baz=1", + }, + }) + f(`timeShift(time('foo.bar;baz=1',25),"+1min",false)`, []*series{ + { + Timestamps: []int64{120000, 145000, 170000, 195000}, + Values: []float64{180, 205, 230, 255}, + Name: `timeShift(foo.bar;baz=1,'+1min')`, + Tags: map[string]string{"name": "foo.bar", "baz": "1", "timeShift": "+1min"}, + pathExpression: "foo.bar;baz=1", + }, + }) + f(`timeShift(time('foo.bar;baz=1',25),"-1min",true)`, []*series{ + { + Timestamps: []int64{120000, 145000, 170000, 195000}, + Values: []float64{60, 85, 110, 135}, + Name: "timeShift(foo.bar;baz=1,'-1min')", + Tags: map[string]string{"name": "foo.bar", "baz": "1", "timeShift": "-1min"}, + pathExpression: "foo.bar;baz=1", + }, + }) + f(`timeShift(time('foo.bar;baz=1',25),"1min",false,true)`, []*series{ + { + Timestamps: []int64{120000, 145000, 170000, 195000}, + Values: []float64{60, 85, 110, 135}, + Name: `timeShift(foo.bar;baz=1,'1min')`, + Tags: map[string]string{"name": "foo.bar", "baz": "1", "timeShift": "1min"}, + pathExpression: "foo.bar;baz=1", + }, + }) + f(`timeSlice(time('foo.bar;bar=1',20),"00:00 19700101","00:03 19700101")`, []*series{ + { + Timestamps: []int64{120000, 140000, 160000, 180000, 200000}, + Values: []float64{120, 140, 160, 180, nan}, + Name: "timeSlice(foo.bar;bar=1,0,180)", + Tags: map[string]string{"name": "foo.bar", "bar": "1", "timeSliceEnd": "180", "timeSliceStart": "0"}, + pathExpression: "foo.bar;bar=1", + }, + }) + f(`timeStack(time('foo.bar',35),"+1min",1,3)`, []*series{ + { + Timestamps: []int64{120000, 155000, 190000}, + Values: []float64{180, 215, 250}, + Name: "timeShift(foo.bar,+1min,1)", + Tags: map[string]string{"name": "foo.bar", "timeShift": "1", "timeShiftUnit": "+1min"}, + pathExpression: `timeShift(foo.bar,+1min,1)`, + }, + { + Timestamps: []int64{120000, 155000, 190000}, + Values: []float64{240, 275, 310}, + Name: "timeShift(foo.bar,+1min,2)", + Tags: map[string]string{"name": "foo.bar", "timeShift": "2", "timeShiftUnit": "+1min"}, + pathExpression: `timeShift(foo.bar,+1min,2)`, + }, + { + Timestamps: []int64{120000, 155000, 190000}, + Values: []float64{300, 335, 370}, + Name: "timeShift(foo.bar,+1min,3)", + Tags: map[string]string{"name": "foo.bar", "timeShift": "3", "timeShiftUnit": "+1min"}, + pathExpression: `timeShift(foo.bar,+1min,3)`, + }, + }) + f(`timeStack(time('foo.bar',35),"1min",1,3)`, []*series{ + { + Timestamps: []int64{120000, 155000, 190000}, + Values: []float64{60, 95, 130}, + Name: "timeShift(foo.bar,1min,1)", + Tags: map[string]string{"name": "foo.bar", "timeShift": "1", "timeShiftUnit": "1min"}, + pathExpression: `timeShift(foo.bar,1min,1)`, + }, + { + Timestamps: []int64{120000, 155000, 190000}, + Values: []float64{0, 35, 70}, + Name: "timeShift(foo.bar,1min,2)", + Tags: map[string]string{"name": "foo.bar", "timeShift": "2", "timeShiftUnit": "1min"}, + pathExpression: `timeShift(foo.bar,1min,2)`, + }, + { + Timestamps: []int64{120000, 155000, 190000}, + Values: []float64{-60, -25, 10}, + Name: 
"timeShift(foo.bar,1min,3)", + Tags: map[string]string{"name": "foo.bar", "timeShift": "3", "timeShiftUnit": "1min"}, + pathExpression: `timeShift(foo.bar,1min,3)`, + }, + }) + f(`threshold(1.5)`, []*series{ + { + Timestamps: []int64{120000, 165000, 210000}, + Values: []float64{1.5, 1.5, 1.5}, + Name: "1.5", + Tags: map[string]string{"name": "1.5"}, + pathExpression: "threshold(1.5)", + }, + }) + f(`threshold(1.5,"max","black")`, []*series{ + { + Timestamps: []int64{120000, 165000, 210000}, + Values: []float64{1.5, 1.5, 1.5}, + Name: "max", + Tags: map[string]string{"name": "1.5"}, + pathExpression: "threshold(1.5,'max','black')", + }, + }) + f(`sum( + time('foo',30), + time('bar',30), + )`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{240, 300, 360}, + Name: "sumSeries(bar,foo)", + Tags: map[string]string{"name": "sumSeries(bar,foo)", "aggregatedBy": "sum"}, + }, + }) + f(`sumSeries( + time('foo',30), + time('bar',30), + )`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{240, 300, 360}, + Name: "sumSeries(bar,foo)", + Tags: map[string]string{"name": "sumSeries(bar,foo)", "aggregatedBy": "sum"}, + }, + }) + f(`substr(time('collectd.test-db1.load.value;tag1=value1;tag2=value2'),1,3)`, []*series{ + { + Timestamps: []int64{120000, 180000}, + Values: []float64{120, 180}, + Name: "test-db1.load", + Tags: map[string]string{"name": "collectd.test-db1.load.value", "tag1": "value1", "tag2": "value2"}, + pathExpression: "collectd.test-db1.load.value;tag1=value1;tag2=value2", + }, + }) + + f(`substr(time('foo.baz.host;tag1=value1;tag2=value2'),1)`, []*series{ + { + Timestamps: []int64{120000, 180000}, + Values: []float64{120, 180}, + Name: "baz.host;tag1=value1;tag2=value2", + Tags: map[string]string{"name": "foo.baz.host", "tag1": "value1", "tag2": "value2"}, + pathExpression: "foo.baz.host;tag1=value1;tag2=value2", + }, + }) + + f(`substr(time('foo.baz.host;tag1=value1;tag2=value2'),5)`, []*series{ + { + Timestamps: []int64{120000, 180000}, + Values: []float64{120, 180}, + Name: "", + Tags: map[string]string{"name": "foo.baz.host", "tag1": "value1", "tag2": "value2"}, + pathExpression: "foo.baz.host;tag1=value1;tag2=value2", + }, + }) + f(`substr(time('foo.baz.host;tag1=value1;tag2=value2'),1,10)`, []*series{ + { + Timestamps: []int64{120000, 180000}, + Values: []float64{120, 180}, + Name: "baz.host;tag1=value1;tag2=value2", + Tags: map[string]string{"name": "foo.baz.host", "tag1": "value1", "tag2": "value2"}, + pathExpression: "foo.baz.host;tag1=value1;tag2=value2", + }, + }) + f(`substr(time('foo.baz.host;tag1=value1;tag2=value2'),-1)`, []*series{ + { + Timestamps: []int64{120000, 180000}, + Values: []float64{120, 180}, + Name: "host;tag1=value1;tag2=value2", + Tags: map[string]string{"name": "foo.baz.host", "tag1": "value1", "tag2": "value2"}, + pathExpression: "foo.baz.host;tag1=value1;tag2=value2", + }, + }) + f(`substr(time('foo.baz.host;tag1=value1;tag2=value2'),1,-1)`, []*series{ + { + Timestamps: []int64{120000, 180000}, + Values: []float64{120, 180}, + Name: "baz", + Tags: map[string]string{"name": "foo.baz.host", "tag1": "value1", "tag2": "value2"}, + pathExpression: "foo.baz.host;tag1=value1;tag2=value2", + }, + }) + f(`stdev(time('foo.baz',20),3,0.1)`, []*series{ + { + Timestamps: []int64{120000, 140000, 160000, 180000, 200000}, + Values: []float64{0, 10, 16.32993161855452, 16.32993161855452, 16.32993161855452}, + Name: "stdev(foo.baz,3)", + Tags: map[string]string{"name": "foo.baz", "stdev": "3"}, + 
pathExpression: "foo.baz", + }, + }) + f(`stdev(time('foo.baz',20),3,0.5)`, []*series{ + { + Timestamps: []int64{120000, 140000, 160000, 180000, 200000}, + Values: []float64{nan, 10, 16.32993161855452, 16.32993161855452, 16.32993161855452}, + Name: "stdev(foo.baz,3)", + Tags: map[string]string{"name": "foo.baz", "stdev": "3"}, + pathExpression: "foo.baz", + }, + }) + + f(`stddevSeries(time('foo.baz',30))`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{0, 0, 0}, + Name: "stddevSeries(foo.baz)", + Tags: map[string]string{"name": "foo.baz", "aggregatedBy": "stddev"}, + }, + }) + + f(`stacked( + group( + time("foo", 30) | alias("foo1.bar2"), + time("bar", 30) | alias("foo1.bar3") + ))`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{120, 150, 180}, + Name: "stacked(foo1.bar2)", + Tags: map[string]string{"name": "foo", "stacked": "__DEFAULT__"}, + }, + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{240, 300, 360}, + Name: "stacked(foo1.bar3)", + Tags: map[string]string{"name": "bar", "stacked": "__DEFAULT__"}, + }, + }) + f(`stacked( + group( + time("bar", 30)| alias("foo1.bar1"), + time("foo", 30) | alias("foo1.bar2"), + time("foo", 30) | alias("foo1.bar3") + ), + '' + )`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{120, 150, 180}, + Name: "foo1.bar1", + Tags: map[string]string{"name": "bar"}, + }, + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{240, 300, 360}, + Name: "foo1.bar2", + Tags: map[string]string{"name": "foo"}, + }, + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{360, 450, 540}, + Name: "foo1.bar3", + Tags: map[string]string{"name": "foo"}, + }, + }) + + f(`squareRoot(time('foo.baz',10))`, []*series{ + { + Timestamps: []int64{120000, 130000, 140000, 150000, 160000, 170000, 180000, 190000, 200000, 210000}, + Values: []float64{10.954451150103322, 11.40175425099138, 11.832159566199232, 12.24744871391589, 12.649110640673518, 13.038404810405298, 13.416407864998739, 13.784048752090222, 14.142135623730951, 14.491376746189438}, + Name: "squareRoot(foo.baz)", + Tags: map[string]string{"name": "foo.baz", "squareRoot": "1"}, + pathExpression: "squareRoot(foo.baz)", + }, + }) + + f(`sortByTotal( + group( + time("bar", 10)| alias("foo1.bar1"), + time("foo", 15) | alias("foo1.bar2"), + time("foo", 30) | alias("foo1.bar3") + ))`, []*series{ + { + Timestamps: []int64{120000, 130000, 140000, 150000, 160000, 170000, 180000, 190000, 200000, 210000}, + Values: []float64{120, 130, 140, 150, 160, 170, 180, 190, 200, 210}, + Name: "foo1.bar1", + Tags: map[string]string{"name": "bar"}, + pathExpression: "bar", + }, + { + Timestamps: []int64{120000, 135000, 150000, 165000, 180000, 195000, 210000}, + Values: []float64{120, 135, 150, 165, 180, 195, 210}, + Name: "foo1.bar2", + Tags: map[string]string{"name": "foo"}, + pathExpression: "foo", + }, + { + Timestamps: []int64{120000, 150000, 180000, 210000}, + Values: []float64{120, 150, 180, 210}, + Name: "foo1.bar3", + Tags: map[string]string{"name": "foo"}, + pathExpression: "foo", + }, + }) + + f(`sortBy( + group( + time("bar", 10)| alias("foo1.bar1"), + time("foo", 15) | alias("foo1.bar2") + ) + )`, []*series{ + { + Timestamps: []int64{120000, 130000, 140000, 150000, 160000, 170000, 180000, 190000, 200000, 210000}, + Values: []float64{120, 130, 140, 150, 160, 170, 180, 190, 200, 210}, + Name: "foo1.bar1", + Tags: map[string]string{"name": "bar"}, + pathExpression: "bar", + }, + { + 
Timestamps: []int64{120000, 135000, 150000, 165000, 180000, 195000, 210000}, + Values: []float64{120, 135, 150, 165, 180, 195, 210}, + Name: "foo1.bar2", + Tags: map[string]string{"name": "foo"}, + pathExpression: "foo", + }, + }) + f(`sortBy( + group( + time("bar", 10)| alias("foo1.bar1"), + time("foo", 15) | alias("foo1.bar2"), + ),'average',true)`, []*series{ + { + Timestamps: []int64{120000, 135000, 150000, 165000, 180000, 195000, 210000}, + Values: []float64{120, 135, 150, 165, 180, 195, 210}, + Name: "foo1.bar2", + Tags: map[string]string{"name": "foo"}, + pathExpression: "foo", + }, + { + Timestamps: []int64{120000, 130000, 140000, 150000, 160000, 170000, 180000, 190000, 200000, 210000}, + Values: []float64{120, 130, 140, 150, 160, 170, 180, 190, 200, 210}, + Name: "foo1.bar1", + Tags: map[string]string{"name": "bar"}, + pathExpression: "bar", + }, + }) + f(`sortBy( + group( + time("bar", 10)| alias("foo1.bar1"), + time("foo", 15) | alias("foo1.bar2"), + ),'multiply',true)`, []*series{ + { + Timestamps: []int64{120000, 135000, 150000, 165000, 180000, 195000, 210000}, + Values: []float64{120, 135, 150, 165, 180, 195, 210}, + Name: "foo1.bar2", + Tags: map[string]string{"name": "foo"}, + pathExpression: "foo", + }, + { + Timestamps: []int64{120000, 130000, 140000, 150000, 160000, 170000, 180000, 190000, 200000, 210000}, + Values: []float64{120, 130, 140, 150, 160, 170, 180, 190, 200, 210}, + Name: "foo1.bar1", + Tags: map[string]string{"name": "bar"}, + pathExpression: "bar", + }, + }) + f(`sortBy( + group( + time("bar", 10)| alias("foo1.bar1"), + time("foo", 15) | alias("foo1.bar2"), + ),'diff')`, []*series{ + { + Timestamps: []int64{120000, 135000, 150000, 165000, 180000, 195000, 210000}, + Values: []float64{120, 135, 150, 165, 180, 195, 210}, + Name: "foo1.bar2", + Tags: map[string]string{"name": "foo"}, + pathExpression: "foo", + }, + { + Timestamps: []int64{120000, 130000, 140000, 150000, 160000, 170000, 180000, 190000, 200000, 210000}, + Values: []float64{120, 130, 140, 150, 160, 170, 180, 190, 200, 210}, + Name: "foo1.bar1", + Tags: map[string]string{"name": "bar"}, + pathExpression: "bar", + }, + }) + f(`sortByName( + group( + time("bar", 10)| alias("foo1.bar1"), + time("foo", 15) | alias("foo1.bar2") + ))`, []*series{ + { + Timestamps: []int64{120000, 135000, 150000, 165000, 180000, 195000, 210000}, + Values: []float64{120, 135, 150, 165, 180, 195, 210}, + Name: "foo1.bar2", + Tags: map[string]string{"name": "foo"}, + pathExpression: "foo", + }, + { + Timestamps: []int64{120000, 130000, 140000, 150000, 160000, 170000, 180000, 190000, 200000, 210000}, + Values: []float64{120, 130, 140, 150, 160, 170, 180, 190, 200, 210}, + Name: "foo1.bar1", + Tags: map[string]string{"name": "bar"}, + pathExpression: "bar", + }, + }) + f(`sortByMaxima( + group( + time("bar", 10)| alias("foo1.bar1"), + constantLine( 15) | alias("foo1.bar2") + ))`, []*series{ + { + Timestamps: []int64{120000, 165000, 210000}, + Values: []float64{15, 15, 15}, + Name: "foo1.bar2", + Tags: map[string]string{"name": "15"}, + pathExpression: "constantLine(15)", + }, + { + Timestamps: []int64{120000, 130000, 140000, 150000, 160000, 170000, 180000, 190000, 200000, 210000}, + Values: []float64{120, 130, 140, 150, 160, 170, 180, 190, 200, 210}, + Name: "foo1.bar1", + Tags: map[string]string{"name": "bar"}, + pathExpression: "bar", + }, + }) + f(`sortByMinima( + group( + time("bar", 10)| alias("foo1.bar1"), + constantLine( 15) | alias("foo1.bar2") + ))`, []*series{ + { + Timestamps: []int64{120000, 130000, 140000, 150000, 
160000, 170000, 180000, 190000, 200000, 210000}, + Values: []float64{120, 130, 140, 150, 160, 170, 180, 190, 200, 210}, + Name: "foo1.bar1", + Tags: map[string]string{"name": "bar"}, + pathExpression: "bar", + }, + { + Timestamps: []int64{120000, 165000, 210000}, + Values: []float64{15, 15, 15}, + Name: "foo1.bar2", + Tags: map[string]string{"name": "15"}, + pathExpression: "constantLine(15)", + }, + }) + f(`sortByMinima( + group( + time("bar", 10)| alias("foo1.bar1"), + constantLine( 0) | alias("foo1.bar2") + ))`, []*series{ + { + Timestamps: []int64{120000, 130000, 140000, 150000, 160000, 170000, 180000, 190000, 200000, 210000}, + Values: []float64{120, 130, 140, 150, 160, 170, 180, 190, 200, 210}, + Name: "foo1.bar1", + Tags: map[string]string{"name": "bar"}, + pathExpression: "bar", + }, + }) + f(`smartSummarize( + group( + time('foo',13), + time('bar',21), + ), + '45s' + )`, []*series{ + { + Timestamps: []int64{120000, 165000}, + Values: []float64{423, 387}, + Name: `smartSummarize(bar,'45s','sum')`, + Tags: map[string]string{"name": "bar", "smartSummarize": "45s", "smartSummarizeFunction": "sum"}, + }, + { + Timestamps: []int64{120000, 165000}, + Values: []float64{558, 555}, + Name: `smartSummarize(foo,'45s','sum')`, + Tags: map[string]string{"name": "foo", "smartSummarize": "45s", "smartSummarizeFunction": "sum"}, + }, + }) + f(`smartSummarize( + group( + time('foo',13), + time('bar',21), + ), + '1min','sum','hour' + )`, []*series{ + { + Timestamps: []int64{0, 60000, 120000}, + Values: []float64{130, 455, 598}, + Name: `smartSummarize(foo,'1min','sum')`, + Tags: map[string]string{"name": "foo", "smartSummarize": "1min", "smartSummarizeFunction": "sum"}, + }, + { + Timestamps: []int64{0, 60000, 120000}, + Values: []float64{63, 252, 441}, + Name: `smartSummarize(bar,'1min','sum')`, + Tags: map[string]string{"name": "bar", "smartSummarize": "1min", "smartSummarizeFunction": "sum"}, + }, + }) + + f(`sinFunction("base",1,30)`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{0.5806111842123143, -0.7148764296291645, -0.8011526357338306}, + Name: "base", + Tags: map[string]string{"name": "base"}, + }, + }) + f(`sinFunction("base",2,30)`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{1.1612223684246286, -1.429752859258329, -1.602305271467661}, + Name: "base", + Tags: map[string]string{"name": "base"}, + }, + }) + f(`sinFunction("base",step=20)`, []*series{ + { + Timestamps: []int64{120000, 140000, 160000, 180000, 200000}, + Values: []float64{0.5806111842123143, 0.9802396594403116, 0.21942525837900473, -0.8011526357338306, -0.8732972972139945}, + Name: "base", + Tags: map[string]string{"name": "base"}, + }, + }) + + f(`sigmoid(time('foo.baz'))`, []*series{ + { + Timestamps: []int64{120000, 180000}, + Values: []float64{1, 1}, + Name: "sigmoid(foo.baz)", + Tags: map[string]string{"name": "foo.baz", "sigmoid": "sigmoid"}, + pathExpression: "sigmoid(foo.baz)", + }, + }) + + f(`scaleToSeconds(time('foo.bas',20),5)`, []*series{ + { + Timestamps: []int64{120000, 140000, 160000, 180000, 200000}, + Values: []float64{30, 35, 40, 45, 50}, + Name: "scaleToSeconds(foo.bas,5)", + Tags: map[string]string{"name": "foo.bas", "scaleToSeconds": "5"}, + pathExpression: "scaleToSeconds(foo.bas,5)", + }, + }) + + f(`secondYAxis(time('foo.bas',30))`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000, 210000}, + Values: []float64{120, 150, 180, 210}, + Name: "secondYAxis(foo.bas)", + Tags: map[string]string{"name": "foo.bas", 
"secondYAxis": "1"}, + pathExpression: "foo.bas", + }, + }) + + f(`isNonNull(timeSlice(time('foo.bar',20),"00:00 19700101","00:03 19700101"))`, []*series{ + { + Timestamps: []int64{120000, 140000, 160000, 180000, 200000}, + Values: []float64{1, 1, 1, 1, 0}, + Name: "isNonNull(timeSlice(foo.bar,0,180))", + Tags: map[string]string{"name": "foo.bar", "isNonNull": "1", "timeSliceEnd": "180", "timeSliceStart": "0"}, + }, + }) + f(`linearRegression( + group( + time("foo.baz",30), + time("baz.bar",30), + ) + )`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{120, 150, 180}, + Name: "linearRegression(foo.baz, 120, 210)", + Tags: map[string]string{"name": "foo.baz", "linearRegressions": "120, 210"}, + pathExpression: "linearRegression(foo.baz, 120, 210)", + }, + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{120, 150, 180}, + Name: "linearRegression(baz.bar, 120, 210)", + Tags: map[string]string{"name": "baz.bar", "linearRegressions": "120, 210"}, + pathExpression: "linearRegression(baz.bar, 120, 210)", + }, + }) + f(`linearRegression( + group( + time("foo.baz",30), + time("baz.bar",30), + ), + startSourceAt=100, + EndSourceAt=None, + )`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{120, 150, 180}, + Name: "linearRegression(foo.baz, 100, 210)", + Tags: map[string]string{"name": "foo.baz", "linearRegressions": "100, 210"}, + pathExpression: "linearRegression(foo.baz, 100, 210)", + }, + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{120, 150, 180}, + Name: "linearRegression(baz.bar, 100, 210)", + Tags: map[string]string{"name": "baz.bar", "linearRegressions": "100, 210"}, + pathExpression: "linearRegression(baz.bar, 100, 210)", + }, + }) + f(`linearRegression( + group( + time("foo.baz",30), + time("baz.bar",30), + ), + startSourceAt=None, + endSourceAt="00:08 19700101" + )`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{120, 150, 180}, + Name: "linearRegression(foo.baz, 120, 480)", + Tags: map[string]string{"name": "foo.baz", "linearRegressions": "120, 480"}, + pathExpression: "linearRegression(foo.baz, 120, 480)", + }, + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{120, 150, 180}, + Name: "linearRegression(baz.bar, 120, 480)", + Tags: map[string]string{"name": "baz.bar", "linearRegressions": "120, 480"}, + pathExpression: "linearRegression(baz.bar, 120, 480)", + }, + }) + f(`holtWintersForecast(time("foo.baz",30))`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{120.00026583823248, 151.53351196300892, 182.8503377518708}, + Name: "holtWintersForecast(foo.baz)", + Tags: map[string]string{"name": "holtWintersForecast(foo.baz)", "holtWintersForecast": "1"}, + }, + }) + f(`holtWintersForecast(time("foo.baz",30),"4d")`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{120.00027210295323, 152.034912932407, 183.72178095512407}, + Name: "holtWintersForecast(foo.baz)", + Tags: map[string]string{"name": "holtWintersForecast(foo.baz)", "holtWintersForecast": "1"}, + }, + }) + f(`holtWintersForecast(time("foo.baz",30),"8d","2d")`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{120.00000001724152, 152.03464171718454, 183.72151060765324}, + Name: "holtWintersForecast(foo.baz)", + Tags: map[string]string{"name": "holtWintersForecast(foo.baz)", "holtWintersForecast": "1"}, + }, + }) + f(`holtWintersConfidenceBands(time("foo.bar",30))`, 
[]*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{120.00214864234894, 158.95265929117159, 196.72661235783855}, + Name: "holtWintersConfidenceUpper(foo.bar)", + Tags: map[string]string{"name": "foo.bar", "holtWintersConfidenceUpper": "1"}, + }, + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{119.99838303411602, 144.11436463484625, 168.97406314590305}, + Name: "holtWintersConfidenceLower(foo.bar)", + Tags: map[string]string{"name": "foo.bar", "holtWintersConfidenceLower": "1"}, + }, + }) + f(`holtWintersConfidenceBands(time("foo.bar",30),5,"4d")`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{120.00407891280487, 165.87605713703562, 209.67633502193422}, + Name: "holtWintersConfidenceUpper(foo.bar)", + Tags: map[string]string{"name": "foo.bar", "holtWintersConfidenceUpper": "1"}, + }, + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{119.9964652931016, 138.19376872777838, 157.76722688831393}, + Name: "holtWintersConfidenceLower(foo.bar)", + Tags: map[string]string{"name": "foo.bar", "holtWintersConfidenceLower": "1"}, + }, + }) + f(`holtWintersConfidenceBands(time("foo.bar",30),5,"8d","2d")`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{120.00000014163967, 165.87883899077733, 209.679106539474}, + Name: "holtWintersConfidenceUpper(foo.bar)", + Tags: map[string]string{"name": "foo.bar", "holtWintersConfidenceUpper": "1"}, + }, + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{119.99999989284336, 138.19044444359176, 157.7639146758325}, + Name: "holtWintersConfidenceLower(foo.bar)", + Tags: map[string]string{"name": "foo.bar", "holtWintersConfidenceLower": "1"}, + }, + }) + + f(`holtWintersAberration(time("baz.baf",30))`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{0, 0, 0}, + Name: "holtWintersAberration(baz.baf)", + Tags: map[string]string{"name": "baz.baf", "holtWintersAberration": "1"}, + }, + }) + f(`holtWintersAberration(time("baz.baf",30),2)`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{0, 0, 0}, + Name: "holtWintersAberration(baz.baf)", + Tags: map[string]string{"name": "baz.baf", "holtWintersAberration": "1"}, + }, + }) + + f(`holtWintersConfidenceArea(time("foo.baz",30))`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{120.00214864234894, 158.95265929117159, 196.72661235783855}, + Name: "areaBetween(holtWintersConfidenceUpper(foo.baz))", + Tags: map[string]string{"holtWintersConfidenceUpper": "1", "areaBetween": "1", "name": "foo.baz"}, + pathExpression: "holtWintersConfidenceUpper(foo.baz)", + }, + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{119.99838303411602, 144.11436463484625, 168.97406314590305}, + Name: "areaBetween(holtWintersConfidenceLower(foo.baz))", + Tags: map[string]string{"holtWintersConfidenceLower": "1", "areaBetween": "1", "name": "foo.baz"}, + pathExpression: "holtWintersConfidenceLower(foo.baz)", + }, + }) + + f(`groupByNode(summarize( + group( + time('foo.bar.baz',20), + time('bar.foo.bad',20), + time('bar.foo.bad',20), + ), + '45s' + ),1)`, []*series{ + { + Timestamps: []int64{120000, 165000}, + Values: []float64{325, 400}, + Name: `foo`, + Tags: map[string]string{"aggregatedBy": "average", "name": "bar.foo.bad", "summarize": "45s", "summarizeFunction": "sum"}, + pathExpression: "bar.foo.bad", + }, + { + Timestamps: []int64{120000, 165000}, + Values: []float64{325, 
400}, + Name: `bar`, + Tags: map[string]string{"aggregatedBy": "average", "name": "foo.bar.baz", "summarize": "45s", "summarizeFunction": "sum"}, + pathExpression: "foo.bar.baz", + }, + }) + f(`divideSeries( + summarize( + group( + time('foo.bar.baz',10), + time('bar.foo.bad',10) + ), + '45s' + ), + summarize( + group( + time('foo.bar.baz',10) + ), + '45s' + ))`, []*series{ + { + Timestamps: []int64{120000, 165000}, + Values: []float64{1, 1}, + Name: `divideSeries(summarize(foo.bar.baz,'45s','sum'),summarize(foo.bar.baz,'45s','sum'))`, + Tags: map[string]string{"name": "foo.bar.baz", "summarize": "45s", "summarizeFunction": "sum"}, + }, + { + Timestamps: []int64{120000, 165000}, + Values: []float64{1, 1}, + Name: `divideSeries(summarize(bar.foo.bad,'45s','sum'),summarize(foo.bar.baz,'45s','sum'))`, + Tags: map[string]string{"name": "bar.foo.bad", "summarize": "45s", "summarizeFunction": "sum"}, + }, + }) + f(`divideSeriesLists( + summarize( + time('foo.bar.baz',10), + '45s' + ), + summarize( + time('bar.foo.bad',10), + '45s' + ))`, []*series{ + { + Timestamps: []int64{120000, 165000}, + Values: []float64{1, 1}, + Name: `divideSeries(summarize(foo.bar.baz,'45s','sum'),summarize(bar.foo.bad,'45s','sum'))`, + Tags: map[string]string{"name": "foo.bar.baz", "summarize": "45s", "summarizeFunction": "sum"}, + }, + }) + f(`weightedAverage( + summarize( + group( + time('foo.bar.baz',10), + time('bar.foo.bad',10) + ), + '45s' + ), + summarize( + group( + time('bar.foo.bad',10), + time('foo.bar.baz',10) + ), + '45s' + ))`, []*series{ + { + Timestamps: []int64{120000, 165000}, + Values: []float64{292.5, 500}, + Name: `weightedAverage(summarize(bar.foo.bad,'45s','sum'),summarize(foo.bar.baz,'45s','sum'),summarize(bar.foo.bad,'45s','sum'),summarize(foo.bar.baz,'45s','sum'),)`, + Tags: map[string]string{"name": "weightedAverage(summarize(bar.foo.bad,'45s','sum'),summarize(foo.bar.baz,'45s','sum'),summarize(bar.foo.bad,'45s','sum'),summarize(foo.bar.baz,'45s','sum'),)"}, + }, + }) + f(`transformNull( + time('foo.bar.baz',30), + -1, + summarize( + group( + time('foo.bar.baz',10) + ), + '30s' + ))`, []*series{ + { + Timestamps: []int64{120000, 150000, 180000}, + Values: []float64{120, 150, 180}, + Name: `transformNull(foo.bar.baz,-1,referenceSeries)`, + Tags: map[string]string{"name": "foo.bar.baz", "referenceSeries": "1", "transformNull": "-1"}, + pathExpression: ("transformNull(foo.bar.baz,-1,referenceSeries)"), + }, + }) +} + +func TestExecExprFailure(t *testing.T) { + f := func(query string) { + t.Helper() + ec := &evalConfig{ + startTime: 120e3, + endTime: 420e3, + storageStep: 60e3, + } + nextSeries, err := execExpr(ec, query) + if err == nil { + if _, err = drainAllSeries(nextSeries); err == nil { + t.Fatalf("expecting non-nil error for query %q", query) + } + nextSeries = nil + } + if nextSeries != nil { + t.Fatalf("expecting nil nextSeries") + } + } + f("123") + f("nonExistingFunc()") + + f("absolute()") + f("absolute(1)") + f("absolute('foo')") + + f("add()") + f("add(foo.bar)") + f("add(1.23)") + f("add(1.23, 4.56)") + f("add(time('a'), baz)") + + f("aggregate()") + f("x.y|aggregate()") + f("aggregate(1)") + f("aggregate(time('a'), 123)") + f("aggregate(time('a'), 123, bar.baz)") + f("aggregate(time('a'), bar)") + f("aggregate(time('a'), 'non-existing-func')") + f("aggregate(time('a'), 'sum', 'bar')") + f("aggregate(1,'sum')") + + f("aggregateLine()") + f("aggregateLine(123)") + f("aggregateLine(time('a'), bar)") + f("aggregateLine(time('a'),'non-existing-func')") + 
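+ // keepStep must be a boolean, so a bare identifier or a numeric value in that position must fail: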
f("aggregateLine(time('a'),keepStep=aaa)") + f("aggregateLine(time('a'),'sum',123)") + + f("aggregateWithWildcards()") + f("aggregateWithWildcards(time('a'),bar)") + f("aggregateWithWildcards(constantLine(123),'non-existing-func')") + f("aggregateWithWildcards(time('a'),'sum',bar)") + f("aggregateWithWildcards(1,'sum')") + + f("alias()") + f("alias(time('a'))") + f("alias(time('a'),123)") + f("alias(1,'aa')") + + f("aliasByMetric()") + f("aliasByMetric(123)") + + f("aliasByNode()") + f("aliasByNode(123)") + f("aliasByNode(time('a'),bar)") + + f("aliasByTags()") + f("aliasByTags(123)") + f("aliasByTags(time('a'),bar)") + + f("aliasQuery()") + f("aliasQuery(1,2,3,4)") + f("aliasQuery(1,'foo[bar',3,4)") + f("aliasQuery(1,'foo',3,4)") + f("aliasQuery(1,'foo','bar',4)") + f("aliasQuery(constantLine(1)|alias('x'),'x','abc(de','aa')") + f("aliasQuery(constantLine(1)|alias('x'),'x','group()','aa')") + f("aliasQuery(1,'foo','bar','aaa')") + + f("aliasSub()") + f("aliasSub(1,2,3)") + f("aliasSub(1,'foo[bar',3)") + f("aliasSub(1,'foo',3)") + f("aliasSub(1,'foo','bar')") + + f("alpha()") + f("alpha(1,2)") + f("alpha(1,foo)") + + f("applyByNode()") + f("applyByNode(1,2,3)") + f("applyByNode(1,foo,3)") + f("applyByNode(1,2,'aaa',4)") + f("applyByNode(1,2,'foo')") + + f("areaBetween()") + f("areaBetween(1)") + f("areaBetween(group(time('1'),time('2'),time('3')))") + + f("asPercent()") + f("asPercent(1)") + f("asPercent(time('abc'),'foo')") + f("asPercent(1,'foo',bar)") + f("asPercent(time('abc'),100,1)") + f("asPercent(time('a'),group(time('b'),time('c')))") + + f("averageAbove()") + f("averageAbove(1,2)") + f("averageAbove(1,foo)") + + f("averageBelow()") + f("averageBelow(1,2)") + f("averageBelow(1,foo)") + + f("averageOutsidePercentile()") + f("averageOutsidePercentile(1,2)") + f("averageOutsidePercentile(1,'foo')") + + f("averageSeries(1)") + f("averageSeries(time('a'),1)") + + f("averageSeriesWithWildcards()") + f("averageSeriesWithWildcards(1)") + f("averageSeriesWithWildcards(time('a'),'foo')") + + f("avg(1)") + + f("changed()") + f("changed(1)") + + f("color()") + f("color(1,'foo')") + f("color(1,foo)") + + f("consolidateBy()") + f("consolidateBy(1,2)") + f("consolidateBy(1,'foobar')") + f("consolidateBy(1,'sum')") + + f("constantLine()") + f("constantLine('foobar')") + f("constantLine(time('a'))") + f("constantLine(true)") + f("constantLine(None)") + f("constantLine(constantLine(123))") + f("constantLine(foo=123)") + f("constantLine(123, 456)") + + f("countSeries(1)") + f("countSeries(time('a'),1)") + + f("cumulative()") + f("cumulative(1)") + + f("currentAbove()") + f("currentAbove(1,2)") + f("currentAbove(1,foo)") + + f("currentBelow()") + f("currentBelow(1,2)") + f("currentBelow(1,foo)") + + f("dashed()") + f("dashed(1)") + f("dashed(time('a'),'foo')") + + f("delay()") + f("delay(1,2)") + f("delay(time('a'),'foo')") + + f("derivative()") + f("derivative(1)") + + f("diffSeries(1)") + f("diffSeries(time('a'),1)") + + f("divideSeries()") + f("divideSeries(1,2)") + f("divideSeries(time('a'),group(time('a'),time('b')))") + + f("divideSeriesLists()") + f("divideSeriesLists(1,2)") + f("divideSeriesLists(time('a'),2)") + f("divideSeriesLists(time('a'),group(time('b'),time('c')))") + + f("drawAsInfinite()") + f("drawAsInfinite(1)") + + f("events(1)") + + f("exclude()") + f("exclude(1)") + f("exclude(time('a'),2)") + f("exclude(1,'foo')") + f("exclude(1,'f[')") + + f("exp()") + f("exp(1)") + + f("exponentialMovingAverage()") + f("exponentialMovingAverage(1,time('a'))") + 
f("exponentialMovingAverage(time('a'),'foobar')") + + f("fallbackSeries()") + f("fallbackSeries(1,2)") + f("fallbackSeries(group(),2)") + + f("filterSeries()") + f("filterSeries(1,2,3,4)") + f("filterSeries(time('a'),'foo','bar','baz')") + f("filterSeries(time('a'),'sum',1,'baz')") + f("filterSeries(time('a'),'sum','bar','baz')") + f("filterSeries(time('a'),'sum','>','baz')") + f("filterSeries(time('a'),'foo','>',3)") + f("filterSeries(time('a'),'sum','xxx',3)") + + f("grep()") + f("grep(1)") + f("grep(time('a'),2)") + f("grep(1,'foo')") + f("grep(1,'f[')") + + f("group(1)") + f("group('a.b.c')") + f("group(xx=aa.bb)") + f("group(constantLine(1),123)") + + f("groupByNode()") + f("groupByNode(1,123)") + f("groupByNode(1,time('a'))") + f("groupByNode(time('a'),1,'foobar')") + f("groupByNode(time('a'),1,2)") + + f("groupByNodes()") + f("groupByNodes(time('a'),123)") + f("groupByNodes(time('a'),'foobar')") + f("groupByNodes(time('a'),'sum',time('b'))") + f("groupByNodes(1,'sum')") + + f("groupByTags()") + f("groupByTags(1,1)") + f("groupByTags(1,'foo')") + f("groupByTags(1,'sum',1)") + + f("highest()") + f("highest(1,'foo')") + f("highest(1,2,3)") + f("highest(1,2,'foo')") + f("highest(1,2,'sum')") + + f("highestAverage()") + f("highestAverage(1,2)") + f("highestAverage(1,'foo')") + + f("highestCurrent()") + f("highestCurrent(1,2)") + f("highestCurrent(1,'foo')") + + f("highestMax()") + f("highestMax(1,2)") + f("highestMax(1,'foo')") + + f("hitcount()") + f("hitcount(1,2)") + f("hitcount(1,'5min')") + f("hitcount(1,'5min','foo')") + + f("identity()") + f("identity(1)") + + f("integral()") + f("integral('a')") + + f("integralByInterval()") + f("integralByInterval(1,2)") + f("integralByInterval(1,'1h')") + + f("interpolate()") + f("interpolate(1)") + + f("invert()") + f("invert(1)") + + f("keepLastValue()") + f("keepLastValue(1)") + + f("limit()") + f("limit(1,2)") + f("limit(1,'foo')") + + f("lineWidth()") + f("lineWidth(1,2)") + f("lineWidth(1,'foo')") + + f("logarithm()") + f("logarithm(1)") + f("logarithm(1,'foo')") + + f("logit()") + f("logit(1)") + + f("lowest()") + f("lowest(1,'foo')") + f("lowest(1,2,3)") + f("lowest(1,2,'foo')") + f("lowest(1,2,'sum')") + + f("lowestAverage()") + f("lowestAverage(1,2)") + f("lowestAverage(1,'foo')") + + f("lowestCurrent()") + f("lowestCurrent(1,2)") + f("lowestCurrent(1,'foo')") + + f("maxSeries(1)") + f("maxSeries(time('a'),1)") + + f("maximumAbove()") + f("maximumAbove(1,2)") + f("maximumAbove(1,foo)") + + f("maximumBelow()") + f("maximumBelow(1,2)") + f("maximumBelow(1,foo)") + + f("minMax()") + f("minMax(1)") + + f("minSeries(1)") + f("minSeries(time('a'),1)") + + f("minimumAbove()") + f("minimumAbove(1,2)") + f("minimumAbove(1,foo)") + + f("minimumBelow()") + f("minimumBelow(1,2)") + f("minimumBelow(1,foo)") + + f("mostDeviant()") + f("mostDeviant(1,2)") + f("mostDeviant(1,foo)") + + f("movingAverage()") + f("movingAverage(time('a'),time('b'))") + f("movingAverage(1,1)") + f("movingAverage(time('a),1,'foo')") + f("movingAverage(foo=a,bar=2,baz=3)") + + f("movingMax()") + f("movingMax(1,'5min')") + f("movingMax(1,foo=true)") + + f("movingMedian()") + f("movingMedian(1,'5min')") + f("movingMedian(1,foo=true)") + + f("movingMin()") + f("movingMin(1,'5min')") + f("movingMin(1,foo=true)") + + f("movingSum()") + f("movingSum(1,'5min')") + f("movingSum(1,foo=true)") + + f("movingWindow()") + f("movingWindow(1,foo)") + f("movingWindow(1,'foo')") + f("movingWindow(1,-1)") + f("movingWindow(1,2)") + f("movingWindow(1,2,3)") + 
f("movingWindow(1,2,'non-existing-aggr-func')") + f("movingWindow(1,2,'sum',foo)") + + f("multiplySeries(1)") + f("multiplySeries(time('a'),1)") + + f("multiplyWithWildcards()") + f("multiplyWithWildcards(time('a'),bar)") + f("multiplyWithWildcards(constantLine(123),'non-existing-func')") + f("multiplyWithWildcards(time('a'),'sum',bar)") + f("multiplyWithWildcards(1,'sum')") + + f(`nPercentile()`) + f(`nPercentile(1,1)`) + f(`nPercentile(1,'foo')`) + + f(`nonNegativeDerivative()`) + f(`nonNegativeDerivative(1)`) + + f("offset()") + f("offset(1,2)") + f("offset(time('a'),'fo')") + + f("offsetToZero()") + f("offsetToZero(1)") + + f("pow()") + f("pow(1,2)") + f("pow(1,'foo')") + + f("rangeOfSeries(1)") + f("rangeOfSeries(time('a'),1)") + + f("randomWalk()") + f("randomWalk(1)") + f("randomWalk('foo','bar')") + + f("removeAbovePercentile()") + f("removeAbovePercentile(1, 2)") + f("removeAbovePercentile(1, 'foo')") + + f("removeAboveValue()") + f("removeAboveValue(1, 2)") + f("removeAboveValue(1, 'foo')") + + f("removeBelowPercentile()") + f("removeBelowPercentile(1, 2)") + f("removeBelowPercentile(1, 'foo')") + + f("removeBelowValue()") + f("removeBelowValue(1, 2)") + f("removeBelowValue(1, 'foo')") + + f("removeBetweenPercentile()") + f("removeBetweenPercentile(1,2)") + f("removeBetweenPercentile(1,'foo')") + + f("removeEmptySeries()") + f("removeEmptySeries(1)") + f("removeEmptySeries(1,'fii')") + + f("round()") + f("round(1)") + f("round(1,'foo')") + + f("scale()") + f("scale(1,2)") + f("scale(time('a'),'foo')") + + f("setXFilesFactor()") + f("setXFilesFactor(1,'foo')") + f("setXFilesFactor(1,0.5)") + + f("sumWithWildcards()") + f("sumWithWildcards(time('a'),bar)") + f("sumWithWildcards(constantLine(123),'non-existing-func')") + f("sumWithWildcards(time('a'),'sum',bar)") + f("sumWithWildcards(1,'sum')") + + f("summarize()") + f("summarize(1,2)") + f("summarize(1,'foobar')") + f("summarize(1,'-2min')") + f("summarize(1, '0seconds')") + f("summarize(1,'2min',3)") + f("summarize(1,'1s','non-existing-func')") + f("summarize(1,'1s','sum',3)") + f("summarize(1,'1s')") + + f("time()") + f("time(1)") + + f("timeFunction()") + f("timeFunction(1)") + f("timeFunction(False)") + f("timeFunction(None)") + f("timeFunction(constantLine(123))") + f("timeFunction(foo='bar')") + f("timeFunction('foo', 'bar')") + + f(`verticalLine("12:3420131108","event","blue",5)`) + f(`verticalLine(10)`) + f(`verticalLine("12:3420131108",4,5)`) + f(`verticalLine("12:3420131108","event",4)`) + f(`verticalLine("12:3420131108SF1bad","event")`) + f(`verticalLine("00:01 19700101","event")`) + + f(`useSeriesAbove()`) + f(`useSeriesAbove(1,10)`) + f(`useSeriesAbove(1,10,10,5)`) + f(`useSeriesAbove(1,"10",10,15)`) + f(`useSeriesAbove(1,10,"(?=10)",15)`) + f(`useSeriesAbove(1,10,"10",5)`) + + f(`unique(5,2)`) + + f(`perSecond()`) + f(`perSecond(1)`) + + f(`percentileOfSeries()`) + f(`percentileOfSeries(1)`) + + f(`substr()`) + f(`substr(time('a'),'foo')`) + f(`substr(time('a'),1,'foo')`) + + f(`sumSeries(1)`) + f("sumSeries(time('a'),1)") + + f(`threshold()`) + f(`threshold("bad arg")`) + f(`threshold(1.5,5,"black")`) + f(`threshold(1.5,"max",5)`) + + f(`timeShift()`) + f(`timeShift(time('a'),1)`) + f(`timeShift(time('a'),'foo')`) + + f(`timeSlice()`) + f(`timeSlice(time('a'),1)`) + f(`timeSlice(time('a'),'foo')`) + f(`timeSlice(time('a'),'5min',1)`) + f(`timeSlice(time('a'),'5min','bar')`) + f(`timeSlice(1,'5min','10min')`) + + f(`timeStack()`) + f(`timeStack(time('a'),timeShiftUnit=123)`) + f(`timeStack(time('a'),'foo')`) + 
f(`timeStack(time('a'),'1m',timeShiftStart='foo')`) + f(`timeStack(time('a'),'1m',timeShiftEnd='bar')`) + f(`timeStack(time('a'),'1m',10,1)`) + + f(`transformNull()`) + f(`transformNull(1,-1,5,2)`) + f(`transformNull(None)`) + f(`transformNull(time('a'),2,'xxx')`) + f(`transformNull(time('a'),'foo')`) + + f("weightedAverage()") + f("weightedAverage(1,2)") + f("weightedAverage(time('a'),2)") + f("weightedAverage(time('a'),time('b'),foo.bar)") + f("weightedAverage(time('a'),group(time('b'),time('c')))") + + f("xFilesFactor()") + f("xFilesFactor(1,'foo')") + f("xFilesFactor(1,0.5)") + + f(`stdev()`) + f(`stdev(1,3,0.5)`) + f(`stdev(1,"5",0.5)`) + f(`stdev(1,3,"0.5")`) + + f(`stddevSeries(5)`) + f(`stddevSeries(1)`) + + f(`stacked()`) + f(`stacked(1)`) + f(`stacked(1,5)`) + + f(`squareRoot()`) + f(`squareRoot(5)`) + + f(`sortByTotal()`) + f(`sortByTotal(1)`) + + f(`sortBy()`) + f(`sortBy(1)`) + f(`sortBy(1,5)`) + f(`sortBy(1,'bad func name')`) + f(`sortBy(1,'sum','non bool')`) + + f(`sortByName()`) + f(`sortByName(1)`) + f(`sortByName(1,"bad bool")`) + f(`sortByName(1,true,"bad bool")`) + f(`sortByName(1,5,5,6)`) + + f(`sortByMinima()`) + f(`sortByMinima(1)`) + + f(`sortByMaxima()`) + f(`sortByMaxima(1)`) + + f(`smartSummarize(1)`) + f(`smartSummarize(1,"1d")`) + f(`smartSummarize(1,"1d","sum","1light year")`) + f(`smartSummarize(1,1)`) + f(`smartSummarize(1,"1d",1)`) + f(`smartSummarize(1,"1light year")`) + f(`smartSummarize(1,"-1d")`) + f(`smartSummarize(1,"1d","bad func")`) + f(`smartSummarize(1,"1d","sum",true)`) + + f(`sinFunction()`) + f(`sinFunction(5)`) + f(`sinFunction("name","bad arg")`) + f(`sinFunction("name",1,"bad arg")`) + f(`sinFunction(1,-2,3)`) + + f(`sigmoid()`) + f(`sigmoid(1)`) + + f(`scaleToSeconds()`) + f(`scaleToSeconds(1,10)`) + f(`scaleToSeconds(1,"10")`) + + f(`secondYAxis()`) + f(`secondYAxis(1)`) + + f(`isNonNull()`) + f(`isNonNull(1)`) + + f(`linearRegression()`) + f(`linearRegression(10)`) + f(`linearRegression(none.exist.metric)`) + f(`linearRegression(none.exist.metric,"badarg1")`) + f(`linearRegression(time("foo.baz",15),"-1min","badargv2")`) + + f(`holtWintersForecast()`) + f(`holtWintersForecast(none.exist.metric)`) + f(`holtWintersForecast(none.exist.metric,124124)`) + f(`holtWintersForecast(none.exist.metric,7d,"ads124")`) + f(`holtWintersForecast(none.exist.metric,"7d","ads124")`) + f(`holtWintersForecast(none.exist.metric,"afsf","7d")`) + f(`holtWintersForecast(none.exist.metric,"7d",124214)`) + + f(`holtWintersConfidenceBands()`) + f(`holtWintersConfidenceBands(none.exist.metric)`) + f(`holtWintersConfidenceBands(none.exist.metric,"124124")`) + f(`holtWintersConfidenceBands(none.exist.metric,7,123)`) + f(`holtWintersConfidenceBands(none.exist.metric,7,"ads124")`) + f(`holtWintersConfidenceBands(none.exist.metric,7,"7d","ads124")`) + f(`holtWintersConfidenceBands(none.exist.metric,7,"afsf","7d")`) + f(`holtWintersConfidenceBands(none.exist.metric,7,"7d",124214)`) + + f(`holtWintersAberration()`) + f(`holtWintersAberration(124)`) + f(`holtWintersAberration(none.exist.metric)`) + + f(`holtWintersConfidenceArea(group(time("foo.baz",15),time("foo.baz",15)))`) + f(`holtWintersConfidenceArea()`) +} + +func compareSeries(ss, ssExpected []*series, expr graphiteql.Expr) error { + if len(ss) != len(ssExpected) { + return fmt.Errorf("unexpected series count; got %d; want %d", len(ss), len(ssExpected)) + } + m := make(map[string]*series) + for _, s := range ssExpected { + m[s.Name] = s + } + exprStrExpected := string(expr.AppendString(nil)) + for _, s := range ss { 
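+ // look up each returned series by name, so the comparison does not depend on the order in which series arrive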
+ sExpected := m[s.Name] + if sExpected == nil { + return fmt.Errorf("missing series with name %q", s.Name) + } + if !reflect.DeepEqual(s.Tags, sExpected.Tags) { + return fmt.Errorf("unexpected tag for series %q\ngot\n%s\nwant\n%s", s.Name, s.Tags, sExpected.Tags) + } + if !reflect.DeepEqual(s.Timestamps, sExpected.Timestamps) { + return fmt.Errorf("unexpected timestamps for series %q\ngot\n%d\nwant\n%d", s.Name, s.Timestamps, sExpected.Timestamps) + } + if !equalFloats(s.Values, sExpected.Values) { + return fmt.Errorf("unexpected values for series %q\ngot\n%g\nwant\n%g", s.Name, s.Values, sExpected.Values) + } + expectedPathExpression := sExpected.Name + if sExpected.pathExpression != "" { + expectedPathExpression = sExpected.pathExpression + } + if expectedPathExpression != s.pathExpression { + return fmt.Errorf("unexpected pathExpression for series %q\ngot\n%s\nwant\n%s", s.Name, s.pathExpression, expectedPathExpression) + } + exprStr := string(s.expr.AppendString(nil)) + if exprStr != exprStrExpected { + return fmt.Errorf("unexpected expr for series %q\ngot\n%s\nwant\n%s", s.Name, exprStr, exprStrExpected) + } + } + return nil +} + +func equalFloats(a, b []float64) bool { + if len(a) != len(b) { + return false + } + for i, v1 := range a { + v2 := b[i] + if math.IsNaN(v1) { + if math.IsNaN(v2) { + continue + } + return false + } else if math.IsNaN(v2) { + return false + } + eps := math.Abs(v1) / 1e15 + if math.Abs(v1-v2) > eps { + return false + } + } + return true +} + +func printSeriess(ss []*series) string { + var sb strings.Builder + fmt.Fprintf(&sb, "[\n") + for _, s := range ss { + fmt.Fprintf(&sb, "\t{name=%q,tags=%v,timestamps=%s,values=%s}\n", s.Name, s.Tags, formatTimestamps(s.Timestamps), formatValues(s.Values)) + } + fmt.Fprintf(&sb, "]\n") + return sb.String() +} + +func formatValues(vs []float64) string { + if len(vs) == 0 { + return "[]" + } + var sb strings.Builder + fmt.Fprintf(&sb, "[ ") + for i, v := range vs { + if math.IsNaN(v) { + fmt.Fprintf(&sb, "nan") + } else { + fmt.Fprintf(&sb, "%g", v) + } + if i != len(vs)-1 { + fmt.Fprintf(&sb, ", ") + } + } + fmt.Fprintf(&sb, " ]") + return sb.String() +} + +func formatTimestamps(tss []int64) string { + if len(tss) == 0 { + return "[]" + } + var sb strings.Builder + fmt.Fprintf(&sb, "[ ") + for i, ts := range tss { + fmt.Fprintf(&sb, "%d", ts) + if i != len(tss)-1 { + fmt.Fprintf(&sb, ", ") + } + } + fmt.Fprintf(&sb, " ]") + return sb.String() +} diff --git a/app/vmselect/graphite/functions.json b/app/vmselect/graphite/functions.json new file mode 100644 index 000000000..3279a3b7a --- /dev/null +++ b/app/vmselect/graphite/functions.json @@ -0,0 +1,3389 @@ +{ + "aggregate": { + "name": "aggregate", + "function": "aggregate(seriesList, func, xFilesFactor=None)", + "description": "Aggregate series using the specified function.\n\nExample:\n\n.. code-block:: none\n\n &target=aggregate(host.cpu-[0-7].cpu-{user,system}.value, \"sum\")\n\nThis would be the equivalent of\n\n.. 
code-block:: none\n\n &target=sumSeries(host.cpu-[0-7].cpu-{user,system}.value)\n\nThis function can be used with aggregation functions ``average`` (or ``avg``), ``avg_zero``,\n``median``, ``sum`` (or ``total``), ``min``, ``max``, ``diff``, ``stddev``, ``count``,\n``range`` (or ``rangeOf``) , ``multiply`` & ``last`` (or ``current``).", + "module": "graphite.render.functions", + "group": "Combine", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "func", + "type": "aggFunc", + "required": true, + "options": [ + "average", + "avg", + "avg_zero", + "count", + "current", + "diff", + "last", + "max", + "median", + "min", + "multiply", + "range", + "rangeOf", + "stddev", + "sum", + "total" + ] + }, + { + "name": "xFilesFactor", + "type": "float" + } + ] + }, + "aggregateWithWildcards": { + "name": "aggregateWithWildcards", + "function": "aggregateWithWildcards(seriesList, func, *positions)", + "description": "Call aggregator after inserting wildcards at the given position(s).\n\nExample:\n\n.. code-block:: none\n\n &target=aggregateWithWildcards(host.cpu-[0-7].cpu-{user,system}.value, \"sum\", 1)\n\nThis would be the equivalent of\n\n.. code-block:: none\n\n &target=sumSeries(host.cpu-[0-7].cpu-user.value)&target=sumSeries(host.cpu-[0-7].cpu-system.value)\n # or\n &target=aggregate(host.cpu-[0-7].cpu-user.value,\"sum\")&target=aggregate(host.cpu-[0-7].cpu-system.value,\"sum\")\n\nThis function can be used with all aggregation functions supported by\n:py:func:`aggregate `: ``average``, ``median``, ``sum``, ``min``, ``max``, ``diff``,\n``stddev``, ``range`` & ``multiply``.\n\nThis complements :py:func:`groupByNodes ` which takes a list of nodes that must match in each group.", + "module": "graphite.render.functions", + "group": "Combine", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "func", + "type": "aggFunc", + "required": true, + "options": [ + "average", + "avg", + "avg_zero", + "count", + "current", + "diff", + "last", + "max", + "median", + "min", + "multiply", + "range", + "rangeOf", + "stddev", + "sum", + "total" + ] + }, + { + "name": "position", + "type": "node", + "multiple": true + } + ] + }, + "applyByNode": { + "name": "applyByNode", + "function": "applyByNode(seriesList, nodeNum, templateFunction, newName=None)", + "description": "Takes a seriesList and applies some complicated function (described by a string), replacing templates with unique\nprefixes of keys from the seriesList (the key is all nodes up to the index given as `nodeNum`).\n\nIf the `newName` parameter is provided, the name of the resulting series will be given by that parameter, with any\n\"%\" characters replaced by the unique prefix.\n\nExample:\n\n.. code-block:: none\n\n &target=applyByNode(servers.*.disk.bytes_free,1,\"divideSeries(%.disk.bytes_free,sumSeries(%.disk.bytes_*))\")\n\nWould find all series which match `servers.*.disk.bytes_free`, then trim them down to unique series up to the node\ngiven by nodeNum, then fill them into the template function provided (replacing % by the prefixes).\n\nAdditional Examples:\n\nGiven keys of\n\n- `stats.counts.haproxy.web.2XX`\n- `stats.counts.haproxy.web.3XX`\n- `stats.counts.haproxy.web.5XX`\n- `stats.counts.haproxy.microservice.2XX`\n- `stats.counts.haproxy.microservice.3XX`\n- `stats.counts.haproxy.microservice.5XX`\n\nThe following will return the rate of 5XX's per service:\n\n.. 
code-block:: none\n\n    applyByNode(stats.counts.haproxy.*.*XX, 3, \"asPercent(%.5XX, sumSeries(%.*XX))\", \"%.pct_5XX\")\n\nThe output series would have keys `stats.counts.haproxy.web.pct_5XX` and `stats.counts.haproxy.microservice.pct_5XX`.", + "module": "graphite.render.functions", + "group": "Combine", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "nodeNum", + "type": "node", + "required": true + }, + { + "name": "templateFunction", + "type": "string", + "required": true + }, + { + "name": "newName", + "type": "string" + } + ] + }, + "asPercent": { + "name": "asPercent", + "function": "asPercent(seriesList, total=None, *nodes)", + "description": "Calculates a percentage of the total of a wildcard series. If `total` is specified,\neach series will be calculated as a percentage of that total. If `total` is not specified,\nthe sum of all points in the wildcard series will be used instead.\n\nA list of nodes can optionally be provided, if so they will be used to match series with their\ncorresponding totals following the same logic as :py:func:`groupByNodes `.\n\nWhen passing `nodes` the `total` parameter may be a series list or `None`. If it is `None` then\nfor each series in `seriesList` the percentage of the sum of series in that group will be returned.\n\nWhen not passing `nodes`, the `total` parameter may be a single series, reference the same number\nof series as `seriesList` or be a numeric value.\n\nExample:\n\n.. code-block:: none\n\n  # Server01 connections failed and succeeded as a percentage of Server01 connections attempted\n  &target=asPercent(Server01.connections.{failed,succeeded}, Server01.connections.attempted)\n\n  # For each server, its connections failed as a percentage of its connections attempted\n  &target=asPercent(Server*.connections.failed, Server*.connections.attempted)\n\n  # For each server, its connections failed and succeeded as a percentage of its connections attempted\n  &target=asPercent(Server*.connections.{failed,succeeded}, Server*.connections.attempted, 0)\n\n  # apache01.threads.busy as a percentage of 1500\n  &target=asPercent(apache01.threads.busy,1500)\n\n  # Server01 cpu stats as a percentage of its total\n  &target=asPercent(Server01.cpu.*.jiffies)\n\n  # cpu stats for each server as a percentage of its total\n  &target=asPercent(Server*.cpu.*.jiffies, None, 0)\n\nWhen using `nodes`, any series or totals that can't be matched will create output series with\nnames like ``asPercent(someSeries,MISSING)`` or ``asPercent(MISSING,someTotalSeries)`` and all\nvalues set to None. If desired these series can be filtered out by piping the result through\n``|exclude(\"MISSING\")`` as shown below:\n\n.. code-block:: none\n\n  &target=asPercent(Server{1,2}.memory.used,Server{1,3}.memory.total,0)\n\n  # will produce 3 output series:\n  # asPercent(Server1.memory.used,Server1.memory.total) [values will be as expected]\n  # asPercent(Server2.memory.used,MISSING) [all values will be None]\n  # asPercent(MISSING,Server3.memory.total) [all values will be None]\n\n  &target=asPercent(Server{1,2}.memory.used,Server{1,3}.memory.total,0)|exclude(\"MISSING\")\n\n  # will produce 1 output series:\n  # asPercent(Server1.memory.used,Server1.memory.total) [values will be as expected]\n\nEach node may be an integer referencing a node in the series name or a string identifying a tag.\n\n.. 
note::\n\n When `total` is a seriesList, specifying `nodes` to match series with the corresponding total\n series will increase reliability.", + "module": "graphite.render.functions", + "group": "Combine", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "total", + "type": "any" + }, + { + "name": "nodes", + "type": "nodeOrTag", + "multiple": true + } + ] + }, + "averageSeries": { + "name": "averageSeries", + "function": "averageSeries(*seriesLists)", + "description": "Short Alias: avg()\n\nTakes one metric or a wildcard seriesList.\nDraws the average value of all metrics passed at each time.\n\nExample:\n\n.. code-block:: none\n\n &target=averageSeries(company.server.*.threads.busy)\n\nThis is an alias for :py:func:`aggregate ` with aggregation ``average``.", + "module": "graphite.render.functions", + "group": "Combine", + "params": [ + { + "name": "seriesLists", + "type": "seriesList", + "required": true, + "multiple": true + } + ] + }, + "averageSeriesWithWildcards": { + "name": "averageSeriesWithWildcards", + "function": "averageSeriesWithWildcards(seriesList, *position)", + "description": "Call averageSeries after inserting wildcards at the given position(s).\n\nExample:\n\n.. code-block:: none\n\n &target=averageSeriesWithWildcards(host.cpu-[0-7].cpu-{user,system}.value, 1)\n\nThis would be the equivalent of\n\n.. code-block:: none\n\n &target=averageSeries(host.*.cpu-user.value)&target=averageSeries(host.*.cpu-system.value)\n\nThis is an alias for :py:func:`aggregateWithWildcards ` with aggregation ``average``.", + "module": "graphite.render.functions", + "group": "Combine", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "position", + "type": "node", + "multiple": true + } + ] + }, + "avg": { + "name": "avg", + "function": "avg(*seriesLists)", + "description": "Short Alias: avg()\n\nTakes one metric or a wildcard seriesList.\nDraws the average value of all metrics passed at each time.\n\nExample:\n\n.. code-block:: none\n\n &target=averageSeries(company.server.*.threads.busy)\n\nThis is an alias for :py:func:`aggregate ` with aggregation ``average``.", + "module": "graphite.render.functions", + "group": "Combine", + "params": [ + { + "name": "seriesLists", + "type": "seriesList", + "required": true, + "multiple": true + } + ] + }, + "countSeries": { + "name": "countSeries", + "function": "countSeries(*seriesLists)", + "description": "Draws a horizontal line representing the number of nodes found in the seriesList.\n\n.. code-block:: none\n\n &target=countSeries(carbon.agents.*.*)", + "module": "graphite.render.functions", + "group": "Combine", + "params": [ + { + "name": "seriesLists", + "type": "seriesList", + "multiple": true + } + ] + }, + "diffSeries": { + "name": "diffSeries", + "function": "diffSeries(*seriesLists)", + "description": "Subtracts series 2 through n from series 1.\n\nExample:\n\n.. code-block:: none\n\n &target=diffSeries(service.connections.total,service.connections.failed)\n\nTo diff a series and a constant, one should use offset instead of (or in\naddition to) diffSeries\n\nExample:\n\n.. 
code-block:: none\n\n &target=offset(service.connections.total,-5)\n\n &target=offset(diffSeries(service.connections.total,service.connections.failed),-4)\n\nThis is an alias for :py:func:`aggregate ` with aggregation ``diff``.", + "module": "graphite.render.functions", + "group": "Combine", + "params": [ + { + "name": "seriesLists", + "type": "seriesList", + "required": true, + "multiple": true + } + ] + }, + "divideSeries": { + "name": "divideSeries", + "function": "divideSeries(dividendSeriesList, divisorSeries)", + "description": "Takes a dividend metric and a divisor metric and draws the division result.\nA constant may *not* be passed. To divide by a constant, use the scale()\nfunction (which is essentially a multiplication operation) and use the inverse\nof the dividend. (Division by 8 = multiplication by 1/8 or 0.125)\n\nExample:\n\n.. code-block:: none\n\n &target=divideSeries(Series.dividends,Series.divisors)", + "module": "graphite.render.functions", + "group": "Combine", + "params": [ + { + "name": "dividendSeriesList", + "type": "seriesList", + "required": true + }, + { + "name": "divisorSeries", + "type": "seriesList", + "required": true + } + ] + }, + "divideSeriesLists": { + "name": "divideSeriesLists", + "function": "divideSeriesLists(dividendSeriesList, divisorSeriesList)", + "description": "Iterates over a two lists and divides list1[0] by list2[0], list1[1] by list2[1] and so on.\nThe lists need to be the same length", + "module": "graphite.render.functions", + "group": "Combine", + "params": [ + { + "name": "dividendSeriesList", + "type": "seriesList", + "required": true + }, + { + "name": "divisorSeriesList", + "type": "seriesList", + "required": true + } + ] + }, + "group": { + "name": "group", + "function": "group(*seriesLists)", + "description": "Takes an arbitrary number of seriesLists and adds them to a single seriesList. This is used\nto pass multiple seriesLists to a function which only takes one", + "module": "graphite.render.functions", + "group": "Combine", + "params": [ + { + "name": "seriesLists", + "type": "seriesList", + "multiple": true + } + ] + }, + "groupByNode": { + "name": "groupByNode", + "function": "groupByNode(seriesList, nodeNum, callback='average')", + "description": "Takes a serieslist and maps a callback to subgroups within as defined by a common node\n\n.. code-block:: none\n\n &target=groupByNode(ganglia.by-function.*.*.cpu.load5,2,\"sumSeries\")\n\nWould return multiple series which are each the result of applying the \"sumSeries\" function\nto groups joined on the second node (0 indexed) resulting in a list of targets like\n\n.. 
code-block:: none\n\n sumSeries(ganglia.by-function.server1.*.cpu.load5),sumSeries(ganglia.by-function.server2.*.cpu.load5),...\n\nNode may be an integer referencing a node in the series name or a string identifying a tag.\n\nThis is an alias for using :py:func:`groupByNodes ` with a single node.", + "module": "graphite.render.functions", + "group": "Combine", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "nodeNum", + "type": "nodeOrTag", + "required": true + }, + { + "name": "callback", + "type": "aggOrSeriesFunc", + "default": "average", + "options": [ + "average", + "averageSeries", + "averageSeriesWithWildcards", + "avg", + "avg_zero", + "count", + "countSeries", + "current", + "diff", + "diffSeries", + "last", + "max", + "maxSeries", + "median", + "min", + "minSeries", + "multiply", + "multiplySeries", + "multiplySeriesWithWildcards", + "powSeries", + "range", + "rangeOf", + "rangeOfSeries", + "stddev", + "stddevSeries", + "sum", + "sumSeries", + "sumSeriesWithWildcards", + "total" + ] + } + ] + }, + "groupByNodes": { + "name": "groupByNodes", + "function": "groupByNodes(seriesList, callback, *nodes)", + "description": "Takes a serieslist and maps a callback to subgroups within as defined by multiple nodes\n\n.. code-block:: none\n\n &target=groupByNodes(ganglia.server*.*.cpu.load*,\"sum\",1,4)\n\nWould return multiple series which are each the result of applying the \"sum\" aggregation\nto groups joined on the nodes' list (0 indexed) resulting in a list of targets like\n\n.. code-block:: none\n\n sumSeries(ganglia.server1.*.cpu.load5),sumSeries(ganglia.server1.*.cpu.load10),sumSeries(ganglia.server1.*.cpu.load15),sumSeries(ganglia.server2.*.cpu.load5),sumSeries(ganglia.server2.*.cpu.load10),sumSeries(ganglia.server2.*.cpu.load15),...\n\nThis function can be used with all aggregation functions supported by\n:py:func:`aggregate `: ``average``, ``median``, ``sum``, ``min``, ``max``, ``diff``,\n``stddev``, ``range`` & ``multiply``.\n\nEach node may be an integer referencing a node in the series name or a string identifying a tag.\n\n.. code-block:: none\n\n &target=seriesByTag(\"name=~cpu.load.*\", \"server=~server[1-9]+\", \"datacenter=~dc[1-9]+\")|groupByNodes(\"average\", \"datacenter\", 1)\n\n # will produce output series like\n # dc1.load5, dc2.load5, dc1.load10, dc2.load10\n\nThis complements :py:func:`aggregateWithWildcards ` which takes a list of wildcard nodes.", + "module": "graphite.render.functions", + "group": "Combine", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "callback", + "type": "aggOrSeriesFunc", + "required": true, + "options": [ + "average", + "averageSeries", + "averageSeriesWithWildcards", + "avg", + "avg_zero", + "count", + "countSeries", + "current", + "diff", + "diffSeries", + "last", + "max", + "maxSeries", + "median", + "min", + "minSeries", + "multiply", + "multiplySeries", + "multiplySeriesWithWildcards", + "powSeries", + "range", + "rangeOf", + "rangeOfSeries", + "stddev", + "stddevSeries", + "sum", + "sumSeries", + "sumSeriesWithWildcards", + "total" + ] + }, + { + "name": "nodes", + "type": "nodeOrTag", + "required": true, + "multiple": true + } + ] + }, + "groupByTags": { + "name": "groupByTags", + "function": "groupByTags(seriesList, callback, *tags)", + "description": "Takes a serieslist and maps a callback to subgroups within as defined by multiple tags\n\n.. 
code-block:: none\n\n &target=seriesByTag(\"name=cpu\")|groupByTags(\"average\",\"dc\")\n\nWould return multiple series which are each the result of applying the \"averageSeries\" function\nto groups joined on the specified tags resulting in a list of targets like\n\n.. code-block :: none\n\n averageSeries(seriesByTag(\"name=cpu\",\"dc=dc1\")),averageSeries(seriesByTag(\"name=cpu\",\"dc=dc2\")),...\n\nThis function can be used with all aggregation functions supported by\n:py:func:`aggregate `: ``average`` (or ``avg``), ``avg_zero``,\n``median``, ``sum`` (or ``total``), ``min``, ``max``, ``diff``, ``stddev``, ``count``,\n``range`` (or ``rangeOf``) , ``multiply`` & ``last`` (or ``current``).", + "module": "graphite.render.functions", + "group": "Combine", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "callback", + "type": "aggOrSeriesFunc", + "required": true, + "options": [ + "average", + "averageSeries", + "averageSeriesWithWildcards", + "avg", + "avg_zero", + "count", + "countSeries", + "current", + "diff", + "diffSeries", + "last", + "max", + "maxSeries", + "median", + "min", + "minSeries", + "multiply", + "multiplySeries", + "multiplySeriesWithWildcards", + "powSeries", + "range", + "rangeOf", + "rangeOfSeries", + "stddev", + "stddevSeries", + "sum", + "sumSeries", + "sumSeriesWithWildcards", + "total" + ] + }, + { + "name": "tags", + "type": "tag", + "required": true, + "multiple": true + } + ] + }, + "isNonNull": { + "name": "isNonNull", + "function": "isNonNull(seriesList)", + "description": "Takes a metric or wildcard seriesList and counts up the number of non-null\nvalues. This is useful for understanding the number of metrics that have data\nat a given point in time (i.e. to count which servers are alive).\n\nExample:\n\n.. code-block:: none\n\n &target=isNonNull(webapp.pages.*.views)\n\nReturns a seriesList where 1 is specified for non-null values, and\n0 is specified for null values.", + "module": "graphite.render.functions", + "group": "Combine", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + } + ] + }, + "map": { + "name": "map", + "function": "map(seriesList, *mapNodes)", + "description": "Short form: ``map()``\n\nTakes a seriesList and maps it to a list of seriesList. Each seriesList has the\ngiven mapNodes in common.\n\n.. note:: This function is not very useful alone. It should be used with :py:func:`reduceSeries`\n\n.. code-block:: none\n\n mapSeries(servers.*.cpu.*,1) =>\n\n [\n servers.server1.cpu.*,\n servers.server2.cpu.*,\n ...\n servers.serverN.cpu.*\n ]\n\nEach node may be an integer referencing a node in the series name or a string identifying a tag.", + "module": "graphite.render.functions", + "group": "Combine", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "mapNodes", + "type": "nodeOrTag", + "required": true, + "multiple": true + } + ] + }, + "mapSeries": { + "name": "mapSeries", + "function": "mapSeries(seriesList, *mapNodes)", + "description": "Short form: ``map()``\n\nTakes a seriesList and maps it to a list of seriesList. Each seriesList has the\ngiven mapNodes in common.\n\n.. note:: This function is not very useful alone. It should be used with :py:func:`reduceSeries`\n\n.. 
code-block:: none\n\n mapSeries(servers.*.cpu.*,1) =>\n\n [\n servers.server1.cpu.*,\n servers.server2.cpu.*,\n ...\n servers.serverN.cpu.*\n ]\n\nEach node may be an integer referencing a node in the series name or a string identifying a tag.", + "module": "graphite.render.functions", + "group": "Combine", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "mapNodes", + "type": "nodeOrTag", + "required": true, + "multiple": true + } + ] + }, + "maxSeries": { + "name": "maxSeries", + "function": "maxSeries(*seriesLists)", + "description": "Takes one metric or a wildcard seriesList.\nFor each datapoint from each metric passed in, pick the maximum value and graph it.\n\nExample:\n\n.. code-block:: none\n\n &target=maxSeries(Server*.connections.total)\n\nThis is an alias for :py:func:`aggregate ` with aggregation ``max``.", + "module": "graphite.render.functions", + "group": "Combine", + "params": [ + { + "name": "seriesLists", + "type": "seriesList", + "required": true, + "multiple": true + } + ] + }, + "minSeries": { + "name": "minSeries", + "function": "minSeries(*seriesLists)", + "description": "Takes one metric or a wildcard seriesList.\nFor each datapoint from each metric passed in, pick the minimum value and graph it.\n\nExample:\n\n.. code-block:: none\n\n &target=minSeries(Server*.connections.total)\n\nThis is an alias for :py:func:`aggregate ` with aggregation ``min``.", + "module": "graphite.render.functions", + "group": "Combine", + "params": [ + { + "name": "seriesLists", + "type": "seriesList", + "required": true, + "multiple": true + } + ] + }, + "multiplySeries": { + "name": "multiplySeries", + "function": "multiplySeries(*seriesLists)", + "description": "Takes two or more series and multiplies their points. A constant may not be\nused. To multiply by a constant, use the scale() function.\n\nExample:\n\n.. code-block:: none\n\n &target=multiplySeries(Series.dividends,Series.divisors)\n\nThis is an alias for :py:func:`aggregate ` with aggregation ``multiply``.", + "module": "graphite.render.functions", + "group": "Combine", + "params": [ + { + "name": "seriesLists", + "type": "seriesList", + "required": true, + "multiple": true + } + ] + }, + "multiplySeriesWithWildcards": { + "name": "multiplySeriesWithWildcards", + "function": "multiplySeriesWithWildcards(seriesList, *position)", + "description": "Call multiplySeries after inserting wildcards at the given position(s).\n\nExample:\n\n.. code-block:: none\n\n &target=multiplySeriesWithWildcards(web.host-[0-7].{avg-response,total-request}.value, 2)\n\nThis would be the equivalent of\n\n.. code-block:: none\n\n &target=multiplySeries(web.host-0.{avg-response,total-request}.value)&target=multiplySeries(web.host-1.{avg-response,total-request}.value)...\n\nThis is an alias for :py:func:`aggregateWithWildcards ` with aggregation ``multiply``.", + "module": "graphite.render.functions", + "group": "Combine", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "position", + "type": "node", + "multiple": true + } + ] + }, + "pct": { + "name": "pct", + "function": "pct(seriesList, total=None, *nodes)", + "description": "Calculates a percentage of the total of a wildcard series. If `total` is specified,\neach series will be calculated as a percentage of that total. 
If `total` is not specified,\nthe sum of all points in the wildcard series will be used instead.\n\nA list of nodes can optionally be provided, if so they will be used to match series with their\ncorresponding totals following the same logic as :py:func:`groupByNodes `.\n\nWhen passing `nodes` the `total` parameter may be a series list or `None`. If it is `None` then\nfor each series in `seriesList` the percentage of the sum of series in that group will be returned.\n\nWhen not passing `nodes`, the `total` parameter may be a single series, reference the same number\nof series as `seriesList` or be a numeric value.\n\nExample:\n\n.. code-block:: none\n\n  # Server01 connections failed and succeeded as a percentage of Server01 connections attempted\n  &target=asPercent(Server01.connections.{failed,succeeded}, Server01.connections.attempted)\n\n  # For each server, its connections failed as a percentage of its connections attempted\n  &target=asPercent(Server*.connections.failed, Server*.connections.attempted)\n\n  # For each server, its connections failed and succeeded as a percentage of its connections attempted\n  &target=asPercent(Server*.connections.{failed,succeeded}, Server*.connections.attempted, 0)\n\n  # apache01.threads.busy as a percentage of 1500\n  &target=asPercent(apache01.threads.busy,1500)\n\n  # Server01 cpu stats as a percentage of its total\n  &target=asPercent(Server01.cpu.*.jiffies)\n\n  # cpu stats for each server as a percentage of its total\n  &target=asPercent(Server*.cpu.*.jiffies, None, 0)\n\nWhen using `nodes`, any series or totals that can't be matched will create output series with\nnames like ``asPercent(someSeries,MISSING)`` or ``asPercent(MISSING,someTotalSeries)`` and all\nvalues set to None. If desired these series can be filtered out by piping the result through\n``|exclude(\"MISSING\")`` as shown below:\n\n.. code-block:: none\n\n  &target=asPercent(Server{1,2}.memory.used,Server{1,3}.memory.total,0)\n\n  # will produce 3 output series:\n  # asPercent(Server1.memory.used,Server1.memory.total) [values will be as expected]\n  # asPercent(Server2.memory.used,MISSING) [all values will be None]\n  # asPercent(MISSING,Server3.memory.total) [all values will be None]\n\n  &target=asPercent(Server{1,2}.memory.used,Server{1,3}.memory.total,0)|exclude(\"MISSING\")\n\n  # will produce 1 output series:\n  # asPercent(Server1.memory.used,Server1.memory.total) [values will be as expected]\n\nEach node may be an integer referencing a node in the series name or a string identifying a tag.\n\n.. note::\n\n  When `total` is a seriesList, specifying `nodes` to match series with the corresponding total\n  series will increase reliability.", + "module": "graphite.render.functions", + "group": "Combine", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "total", + "type": "any" + }, + { + "name": "nodes", + "type": "nodeOrTag", + "multiple": true + } + ] + }, + "percentileOfSeries": { + "name": "percentileOfSeries", + "function": "percentileOfSeries(seriesList, n, interpolate=False)", + "description": "percentileOfSeries returns a single series which is composed of the n-percentile\nvalues taken across a wildcard series at each point. 
Unless `interpolate` is\nset to True, percentile values are actual values contained in one of the\nsupplied series.", + "module": "graphite.render.functions", + "group": "Combine", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "n", + "type": "float", + "required": true + }, + { + "name": "interpolate", + "type": "boolean", + "default": false + } + ] + }, + "rangeOfSeries": { + "name": "rangeOfSeries", + "function": "rangeOfSeries(*seriesLists)", + "description": "Takes a wildcard seriesList.\nDistills down a set of inputs into the range of the series\n\nExample:\n\n.. code-block:: none\n\n &target=rangeOfSeries(Server*.connections.total)\n\nThis is an alias for :py:func:`aggregate ` with aggregation ``rangeOf``.", + "module": "graphite.render.functions", + "group": "Combine", + "params": [ + { + "name": "seriesLists", + "type": "seriesList", + "required": true, + "multiple": true + } + ] + }, + "reduce": { + "name": "reduce", + "function": "reduce(seriesLists, reduceFunction, reduceNode, *reduceMatchers)", + "description": "Short form: ``reduce()``\n\nTakes a list of seriesLists and reduces it to a list of series by means of the reduceFunction.\n\nReduction is performed by matching the reduceNode in each series against the list of\nreduceMatchers. Then each series is passed to the reduceFunction as arguments in the order\ngiven by reduceMatchers. The reduceFunction should yield a single series.\n\nThe resulting list of series are aliased so that they can easily be nested in other functions.\n\n**Example**: Map/Reduce asPercent(bytes_used,total_bytes) for each server\n\nAssume that metrics in the form below exist:\n\n.. code-block:: none\n\n servers.server1.disk.bytes_used\n servers.server1.disk.total_bytes\n servers.server2.disk.bytes_used\n servers.server2.disk.total_bytes\n servers.server3.disk.bytes_used\n servers.server3.disk.total_bytes\n ...\n servers.serverN.disk.bytes_used\n servers.serverN.disk.total_bytes\n\nTo get the percentage of disk used for each server:\n\n.. code-block:: none\n\n reduceSeries(mapSeries(servers.*.disk.*,1),\"asPercent\",3,\"bytes_used\",\"total_bytes\") =>\n\n alias(asPercent(servers.server1.disk.bytes_used,servers.server1.disk.total_bytes),\"servers.server1.disk.reduce.asPercent\"),\n alias(asPercent(servers.server2.disk.bytes_used,servers.server2.disk.total_bytes),\"servers.server2.disk.reduce.asPercent\"),\n alias(asPercent(servers.server3.disk.bytes_used,servers.server3.disk.total_bytes),\"servers.server3.disk.reduce.asPercent\"),\n ...\n alias(asPercent(servers.serverN.disk.bytes_used,servers.serverN.disk.total_bytes),\"servers.serverN.disk.reduce.asPercent\")\n\nIn other words, we will get back the following metrics::\n\n servers.server1.disk.reduce.asPercent\n servers.server2.disk.reduce.asPercent\n servers.server3.disk.reduce.asPercent\n ...\n servers.serverN.disk.reduce.asPercent\n\n.. 
seealso:: :py:func:`mapSeries`", + "module": "graphite.render.functions", + "group": "Combine", + "params": [ + { + "name": "seriesLists", + "type": "seriesLists", + "required": true + }, + { + "name": "reduceFunction", + "type": "string", + "required": true + }, + { + "name": "reduceNode", + "type": "node", + "required": true + }, + { + "name": "reduceMatchers", + "type": "string", + "required": true, + "multiple": true + } + ] + }, + "reduceSeries": { + "name": "reduceSeries", + "function": "reduceSeries(seriesLists, reduceFunction, reduceNode, *reduceMatchers)", + "description": "Short form: ``reduce()``\n\nTakes a list of seriesLists and reduces it to a list of series by means of the reduceFunction.\n\nReduction is performed by matching the reduceNode in each series against the list of\nreduceMatchers. Then each series is passed to the reduceFunction as arguments in the order\ngiven by reduceMatchers. The reduceFunction should yield a single series.\n\nThe resulting list of series are aliased so that they can easily be nested in other functions.\n\n**Example**: Map/Reduce asPercent(bytes_used,total_bytes) for each server\n\nAssume that metrics in the form below exist:\n\n.. code-block:: none\n\n servers.server1.disk.bytes_used\n servers.server1.disk.total_bytes\n servers.server2.disk.bytes_used\n servers.server2.disk.total_bytes\n servers.server3.disk.bytes_used\n servers.server3.disk.total_bytes\n ...\n servers.serverN.disk.bytes_used\n servers.serverN.disk.total_bytes\n\nTo get the percentage of disk used for each server:\n\n.. code-block:: none\n\n reduceSeries(mapSeries(servers.*.disk.*,1),\"asPercent\",3,\"bytes_used\",\"total_bytes\") =>\n\n alias(asPercent(servers.server1.disk.bytes_used,servers.server1.disk.total_bytes),\"servers.server1.disk.reduce.asPercent\"),\n alias(asPercent(servers.server2.disk.bytes_used,servers.server2.disk.total_bytes),\"servers.server2.disk.reduce.asPercent\"),\n alias(asPercent(servers.server3.disk.bytes_used,servers.server3.disk.total_bytes),\"servers.server3.disk.reduce.asPercent\"),\n ...\n alias(asPercent(servers.serverN.disk.bytes_used,servers.serverN.disk.total_bytes),\"servers.serverN.disk.reduce.asPercent\")\n\nIn other words, we will get back the following metrics::\n\n servers.server1.disk.reduce.asPercent\n servers.server2.disk.reduce.asPercent\n servers.server3.disk.reduce.asPercent\n ...\n servers.serverN.disk.reduce.asPercent\n\n.. seealso:: :py:func:`mapSeries`", + "module": "graphite.render.functions", + "group": "Combine", + "params": [ + { + "name": "seriesLists", + "type": "seriesLists", + "required": true + }, + { + "name": "reduceFunction", + "type": "string", + "required": true + }, + { + "name": "reduceNode", + "type": "node", + "required": true + }, + { + "name": "reduceMatchers", + "type": "string", + "required": true, + "multiple": true + } + ] + }, + "stddevSeries": { + "name": "stddevSeries", + "function": "stddevSeries(*seriesLists)", + "description": "Takes one metric or a wildcard seriesList.\nDraws the standard deviation of all metrics passed at each time.\n\nExample:\n\n.. 
code-block:: none\n\n &target=stddevSeries(company.server.*.threads.busy)\n\nThis is an alias for :py:func:`aggregate ` with aggregation ``stddev``.", + "module": "graphite.render.functions", + "group": "Combine", + "params": [ + { + "name": "seriesLists", + "type": "seriesList", + "required": true, + "multiple": true + } + ] + }, + "sum": { + "name": "sum", + "function": "sum(*seriesLists)", + "description": "Short form: sum()\n\nThis will add metrics together and return the sum at each datapoint. (See\nintegral for a sum over time)\n\nExample:\n\n.. code-block:: none\n\n &target=sum(company.server.application*.requestsHandled)\n\nThis would show the sum of all requests handled per minute (provided\nrequestsHandled are collected once a minute). If metrics with different\nretention rates are combined, the coarsest metric is graphed, and the sum\nof the other metrics is averaged for the metrics with finer retention rates.\n\nThis is an alias for :py:func:`aggregate ` with aggregation ``sum``.", + "module": "graphite.render.functions", + "group": "Combine", + "params": [ + { + "name": "seriesLists", + "type": "seriesList", + "required": true, + "multiple": true + } + ] + }, + "sumSeries": { + "name": "sumSeries", + "function": "sumSeries(*seriesLists)", + "description": "Short form: sum()\n\nThis will add metrics together and return the sum at each datapoint. (See\nintegral for a sum over time)\n\nExample:\n\n.. code-block:: none\n\n &target=sum(company.server.application*.requestsHandled)\n\nThis would show the sum of all requests handled per minute (provided\nrequestsHandled are collected once a minute). If metrics with different\nretention rates are combined, the coarsest metric is graphed, and the sum\nof the other metrics is averaged for the metrics with finer retention rates.\n\nThis is an alias for :py:func:`aggregate ` with aggregation ``sum``.", + "module": "graphite.render.functions", + "group": "Combine", + "params": [ + { + "name": "seriesLists", + "type": "seriesList", + "required": true, + "multiple": true + } + ] + }, + "sumSeriesWithWildcards": { + "name": "sumSeriesWithWildcards", + "function": "sumSeriesWithWildcards(seriesList, *position)", + "description": "Call sumSeries after inserting wildcards at the given position(s).\n\nExample:\n\n.. code-block:: none\n\n &target=sumSeriesWithWildcards(host.cpu-[0-7].cpu-{user,system}.value, 1)\n\nThis would be the equivalent of\n\n.. code-block:: none\n\n &target=sumSeries(host.cpu-[0-7].cpu-user.value)&target=sumSeries(host.cpu-[0-7].cpu-system.value)\n\nThis is an alias for :py:func:`aggregateWithWildcards ` with aggregation ``sum``.", + "module": "graphite.render.functions", + "group": "Combine", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "position", + "type": "node", + "multiple": true + } + ] + }, + "weightedAverage": { + "name": "weightedAverage", + "function": "weightedAverage(seriesListAvg, seriesListWeight, *nodes)", + "description": "Takes a series of average values and a series of weights and\nproduces a weighted average for all values.\nThe corresponding values should share one or more zero-indexed nodes and/or tags.\n\nExample:\n\n.. 
code-block:: none\n\n &target=weightedAverage(*.transactions.mean,*.transactions.count,0)\n\nEach node may be an integer referencing a node in the series name or a string identifying a tag.", + "module": "graphite.render.functions", + "group": "Combine", + "params": [ + { + "name": "seriesListAvg", + "type": "seriesList", + "required": true + }, + { + "name": "seriesListWeight", + "type": "seriesList", + "required": true + }, + { + "name": "nodes", + "type": "nodeOrTag", + "multiple": true + } + ] + }, + "add": { + "name": "add", + "function": "add(seriesList, constant)", + "description": "Takes one metric or a wildcard seriesList followed by a constant, and adds the\nconstant to each datapoint. Also works for negative numbers.\n\nExample:\n\n.. code-block:: none\n\n &target=add(Server.instance01.threads.busy, 10)\n &target=add(Server.instance*.threads.busy, 10)", + "module": "graphite.render.functions", + "group": "Transform", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "constant", + "type": "float", + "required": true + } + ] + }, + "absolute": { + "name": "absolute", + "function": "absolute(seriesList)", + "description": "Takes one metric or a wildcard seriesList and applies the mathematical abs function to each\ndatapoint transforming it to its absolute value.\n\nExample:\n\n.. code-block:: none\n\n &target=absolute(Server.instance01.threads.busy)\n &target=absolute(Server.instance*.threads.busy)", + "module": "graphite.render.functions", + "group": "Transform", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + } + ] + }, + "delay": { + "name": "delay", + "function": "delay(seriesList, steps)", + "description": "This shifts all samples later by an integer number of steps. This can be\nused for custom derivative calculations, among other things. Note: this\nwill pad the early end of the data with None for every step shifted.\n\nThis complements other time-displacement functions such as timeShift and\ntimeSlice, in that this function is indifferent about the step intervals\nbeing shifted.\n\nExample:\n\n.. code-block:: none\n\n &target=divideSeries(server.FreeSpace,delay(server.FreeSpace,1))\n\nThis computes the change in server free space as a percentage of the previous\nfree space.", + "module": "graphite.render.functions", + "group": "Transform", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "steps", + "type": "integer", + "required": true + } + ] + }, + "derivative": { + "name": "derivative", + "function": "derivative(seriesList)", + "description": "This is the opposite of the integral function. This is useful for taking a\nrunning total metric and calculating the delta between subsequent data points.\n\nThis function does not normalize for periods of time, as a true derivative would.\nInstead see the perSecond() function to calculate a rate of change over time.\n\nExample:\n\n.. code-block:: none\n\n &target=derivative(company.server.application01.ifconfig.TXPackets)\n\nEach time you run ifconfig, the RX and TXPackets are higher (assuming there\nis network traffic.) 
By applying the derivative function, you can get an\nidea of the packets per minute sent or received, even though you're only\nrecording the total.", + "module": "graphite.render.functions", + "group": "Transform", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + } + ] + }, + "exp": { + "name": "exp", + "function": "exp(seriesList)", + "description": "Raise e to the power of the datapoint,\nwhere e = 2.718281... is the base of natural logarithms.\n\nExample:\n\n.. code-block:: none\n\n &target=exp(Server.instance01.threads.busy)", + "module": "graphite.render.functions", + "group": "Transform", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + } + ] + }, + "hitcount": { + "name": "hitcount", + "function": "hitcount(seriesList, intervalString, alignToInterval=False)", + "description": "Estimate hit counts from a list of time series.\n\nThis function assumes the values in each time series represent\nhits per second. It calculates hits per some larger interval\nsuch as per day or per hour. This function is like summarize(),\nexcept that it compensates automatically for different time scales\n(so that a similar graph results from using either fine-grained\nor coarse-grained records) and handles rarely-occurring events\ngracefully.", + "module": "graphite.render.functions", + "group": "Transform", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "intervalString", + "type": "interval", + "required": true, + "suggestions": [ + "10min", + "1h", + "1d" + ] + }, + { + "name": "alignToInterval", + "type": "boolean", + "default": false + } + ] + }, + "integral": { + "name": "integral", + "function": "integral(seriesList)", + "description": "This will show the sum over time, sort of like a continuous addition function.\nUseful for finding totals or trends in metrics that are collected per minute.\n\nExample:\n\n.. code-block:: none\n\n &target=integral(company.sales.perMinute)\n\nThis would start at zero on the left side of the graph, adding the sales each\nminute, and show the total sales for the time period selected at the right\nside (time now, or the time specified by '&until=').", + "module": "graphite.render.functions", + "group": "Transform", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + } + ] + }, + "integralByInterval": { + "name": "integralByInterval", + "function": "integralByInterval(seriesList, intervalUnit)", + "description": "This will do the same as the integral() function, except resetting the total to 0\nat the given time in the parameter \"from\".\nUseful for finding totals per hour/day/week/..\n\nExample:\n\n.. 
code-block:: none\n\n &target=integralByInterval(company.sales.perMinute, \"1d\")&from=midnight-10days\n\nThis would start at zero on the left side of the graph, adding the sales each\nminute, and show the evolution of sales per day during the last 10 days.", + "module": "graphite.render.functions", + "group": "Transform", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "intervalUnit", + "type": "string", + "required": true + } + ] + }, + "interpolate": { + "name": "interpolate", + "function": "interpolate(seriesList, limit=inf)", + "description": "Takes one metric or a wildcard seriesList, and optionally a limit to the number of 'None' values to skip over.\nContinues the line with the last received value when gaps ('None' values) appear in your data, rather than breaking your line.\n\nExample:\n\n.. code-block:: none\n\n &target=interpolate(Server01.connections.handled)\n &target=interpolate(Server01.connections.handled, 10)", + "module": "graphite.render.functions", + "group": "Transform", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "limit", + "type": "intOrInf", + "default": 1.7976931348623157e+308 + } + ] + }, + "invert": { + "name": "invert", + "function": "invert(seriesList)", + "description": "Takes one metric or a wildcard seriesList, and inverts each datapoint (i.e. 1/x).\n\nExample:\n\n.. code-block:: none\n\n &target=invert(Server.instance01.threads.busy)", + "module": "graphite.render.functions", + "group": "Transform", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + } + ] + }, + "keepLastValue": { + "name": "keepLastValue", + "function": "keepLastValue(seriesList, limit=inf)", + "description": "Takes one metric or a wildcard seriesList, and optionally a limit to the number of 'None' values to skip over.\nContinues the line with the last received value when gaps ('None' values) appear in your data, rather than breaking your line.\n\nExample:\n\n.. code-block:: none\n\n &target=keepLastValue(Server01.connections.handled)\n &target=keepLastValue(Server01.connections.handled, 10)", + "module": "graphite.render.functions", + "group": "Transform", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "limit", + "type": "intOrInf", + "default": 1.7976931348623157e+308 + } + ] + }, + "log": { + "name": "log", + "function": "log(seriesList, base=10)", + "description": "Takes one metric or a wildcard seriesList, a base, and draws the y-axis in logarithmic\nformat. If base is omitted, the function defaults to base 10.\n\nExample:\n\n.. code-block:: none\n\n &target=log(carbon.agents.hostname.avgUpdateTime,2)", + "module": "graphite.render.functions", + "group": "Transform", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "base", + "type": "integer", + "default": 10 + } + ] + }, + "logit": { + "name": "logit", + "function": "logit(seriesList)", + "description": "Takes one metric or a wildcard seriesList and applies the logit\nfunction `log(x / (1 - x))` to each datapoint.\n\nExample:\n\n.. 
code-block:: none\n\n &target=logit(Server.instance01.threads.busy)\n &target=logit(Server.instance*.threads.busy)", + "module": "graphite.render.functions", + "group": "Transform", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + } + ] + }, + "minMax": { + "name": "minMax", + "function": "minMax(seriesList)", + "description": "Applies the popular min max normalization technique, which takes\neach point and applies the following normalization transformation\nto it: normalized = (point - min) / (max - min).\n\nExample:\n\n.. code-block:: none\n\n &target=minMax(Server.instance01.threads.busy)", + "module": "graphite.render.functions", + "group": "Transform", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + } + ] + }, + "nonNegativeDerivative": { + "name": "nonNegativeDerivative", + "function": "nonNegativeDerivative(seriesList, maxValue=None, minValue=None)", + "description": "Same as the derivative function above, but ignores datapoints that trend\ndown. Useful for counters that increase for a long time, then wrap or\nreset. (Such as if a network interface is destroyed and recreated by unloading\nand re-loading a kernel module, common with USB / WiFi cards.)\n\nBy default, a null value is returned in place of negative datapoints. When\n``maxValue`` is supplied, the missing value is computed as if the counter\nhad wrapped at ``maxValue``. When ``minValue`` is supplied, the missing\nvalue is computed as if the counter had wrapped to ``minValue``.\n\nExample:\n\n.. code-block:: none\n\n &target=nonNegativeDerivative(company.server.application01.ifconfig.TXPackets)", + "module": "graphite.render.functions", + "group": "Transform", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "maxValue", + "type": "float" + }, + { + "name": "minValue", + "type": "float" + } + ] + }, + "offset": { + "name": "offset", + "function": "offset(seriesList, factor)", + "description": "Takes one metric or a wildcard seriesList followed by a constant, and adds the constant to\neach datapoint.\n\nExample:\n\n.. code-block:: none\n\n &target=offset(Server.instance01.threads.busy,10)", + "module": "graphite.render.functions", + "group": "Transform", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "factor", + "type": "float", + "required": true + } + ] + }, + "offsetToZero": { + "name": "offsetToZero", + "function": "offsetToZero(seriesList)", + "description": "Offsets a metric or wildcard seriesList by subtracting the minimum\nvalue in the series from each datapoint.\n\nUseful to compare different series where the values in each series\nmay be higher or lower on average but you're only interested in the\nrelative difference.\n\nAn example use case is for comparing different round trip time\nresults. When measuring RTT (like pinging a server), different\ndevices may come back with consistently different results due to\nnetwork latency which will be different depending on how many\nnetwork hops between the probe and the device. To compare different\ndevices in the same graph, the network latency to each has to be\nfactored out of the results. This is a shortcut that takes the\nfastest response (lowest number in the series) and sets that to zero\nand then offsets all of the other datapoints in that series by that\namount. 
This makes the assumption that the lowest response is the\nfastest the device can respond, of course the more datapoints that\nare in the series the more accurate this assumption is.\n\nExample:\n\n.. code-block:: none\n\n &target=offsetToZero(Server.instance01.responseTime)\n &target=offsetToZero(Server.instance*.responseTime)", + "module": "graphite.render.functions", + "group": "Transform", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + } + ] + }, + "perSecond": { + "name": "perSecond", + "function": "perSecond(seriesList, maxValue=None, minValue=None)", + "description": "NonNegativeDerivative adjusted for the series time interval\nThis is useful for taking a running total metric and showing how many requests\nper second were handled.\n\nThe optional ``minValue`` and ``maxValue`` parameters have the same\nmeaning as in ``nonNegativeDerivative``.\n\nExample:\n\n.. code-block:: none\n\n &target=perSecond(company.server.application01.ifconfig.TXPackets)\n\nEach time you run ifconfig, the RX and TXPackets are higher (assuming there\nis network traffic.) By applying the perSecond function, you can get an\nidea of the packets per second sent or received, even though you're only\nrecording the total.", + "module": "graphite.render.functions", + "group": "Transform", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "maxValue", + "type": "float" + }, + { + "name": "minValue", + "type": "float" + } + ] + }, + "pow": { + "name": "pow", + "function": "pow(seriesList, factor)", + "description": "Takes one metric or a wildcard seriesList followed by a constant, and raises the datapoint\nby the power of the constant provided at each point.\n\nExample:\n\n.. code-block:: none\n\n &target=pow(Server.instance01.threads.busy,10)\n &target=pow(Server.instance*.threads.busy,10)", + "module": "graphite.render.functions", + "group": "Transform", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "factor", + "type": "float", + "required": true + } + ] + }, + "powSeries": { + "name": "powSeries", + "function": "powSeries(*seriesLists)", + "description": "Takes two or more series and pows their points. A constant line may be\nused.\n\nExample:\n\n.. code-block:: none\n\n &target=powSeries(Server.instance01.app.requests, Server.instance01.app.replies)", + "module": "graphite.render.functions", + "group": "Transform", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true, + "multiple": true + } + ] + }, + "round": { + "name": "round", + "function": "round(seriesList, precision=None)", + "description": "Takes one metric or a wildcard seriesList optionally followed by a precision, and rounds each\ndatapoint to the specified precision.\n\nExample:\n\n.. code-block:: none\n\n &target=round(Server.instance01.threads.busy)\n &target=round(Server.instance01.threads.busy,2)", + "module": "graphite.render.functions", + "group": "Transform", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "precision", + "type": "integer", + "default": 0 + } + ] + }, + "scale": { + "name": "scale", + "function": "scale(seriesList, factor)", + "description": "Takes one metric or a wildcard seriesList followed by a constant, and multiplies the datapoint\nby the constant provided at each point.\n\nExample:\n\n.. 
code-block:: none\n\n &target=scale(Server.instance01.threads.busy,10)\n &target=scale(Server.instance*.threads.busy,10)", + "module": "graphite.render.functions", + "group": "Transform", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "factor", + "type": "float", + "required": true + } + ] + }, + "scaleToSeconds": { + "name": "scaleToSeconds", + "function": "scaleToSeconds(seriesList, seconds)", + "description": "Takes one metric or a wildcard seriesList and returns \"value per seconds\" where\nseconds is the last argument to this function.\n\nUseful in conjunction with the derivative or integral function if you want\nto normalize its result to a known resolution for arbitrary retentions.", + "module": "graphite.render.functions", + "group": "Transform", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "seconds", + "type": "float", + "required": true + } + ] + }, + "sigmoid": { + "name": "sigmoid", + "function": "sigmoid(seriesList)", + "description": "Takes one metric or a wildcard seriesList and applies the sigmoid\nfunction `1 / (1 + exp(-x))` to each datapoint.\n\nExample:\n\n.. code-block:: none\n\n &target=sigmoid(Server.instance01.threads.busy)\n &target=sigmoid(Server.instance*.threads.busy)", + "module": "graphite.render.functions", + "group": "Transform", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + } + ] + }, + "smartSummarize": { + "name": "smartSummarize", + "function": "smartSummarize(seriesList, intervalString, func='sum', alignTo=None)", + "description": "Smarter version of summarize.\n\nThe alignToFrom boolean parameter has been replaced by alignTo and no longer has any effect.\nAlignment can be to years, months, weeks, days, hours, and minutes.\n\nThis function can be used with aggregation functions ``average``, ``median``, ``sum``, ``min``,\n``max``, ``diff``, ``stddev``, ``count``, ``range``, ``multiply`` & ``last``.", + "module": "graphite.render.functions", + "group": "Transform", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "intervalString", + "type": "interval", + "required": true, + "suggestions": [ + "10min", + "1h", + "1d" + ] + }, + { + "name": "func", + "type": "aggFunc", + "default": "sum", + "options": [ + "average", + "avg", + "avg_zero", + "count", + "current", + "diff", + "last", + "max", + "median", + "min", + "multiply", + "range", + "rangeOf", + "stddev", + "sum", + "total" + ] + }, + { + "name": "alignTo", + "type": "string", + "options": [ + null, + "days", + "hours", + "minutes", + "months", + "seconds", + "weeks", + "years" + ] + } + ] + }, + "squareRoot": { + "name": "squareRoot", + "function": "squareRoot(seriesList)", + "description": "Takes one metric or a wildcard seriesList, and computes the square root of each datapoint.\n\nExample:\n\n.. code-block:: none\n\n &target=squareRoot(Server.instance01.threads.busy)", + "module": "graphite.render.functions", + "group": "Transform", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + } + ] + }, + "summarize": { + "name": "summarize", + "function": "summarize(seriesList, intervalString, func='sum', alignToFrom=False)", + "description": "Summarize the data into interval buckets of a certain size.\n\nBy default, the contents of each interval bucket are summed together. 
This is\nuseful for counters where each increment represents a discrete event and\nretrieving a \"per X\" value requires summing all the events in that interval.\n\nSpecifying 'average' instead will return the mean for each bucket, which can be more\nuseful when the value is a gauge that represents a certain value in time.\n\nThis function can be used with aggregation functions ``average``, ``median``, ``sum``, ``min``,\n``max``, ``diff``, ``stddev``, ``count``, ``range``, ``multiply`` & ``last``.\n\nBy default, buckets are calculated by rounding to the nearest interval. This\nworks well for intervals smaller than a day. For example, 22:32 will end up\nin the bucket 22:00-23:00 when the interval=1hour.\n\nPassing alignToFrom=true will instead create buckets starting at the from\ntime. In this case, the bucket for 22:32 depends on the from time. If\nfrom=6:30 then the 1hour bucket for 22:32 is 22:30-23:30.\n\nExample:\n\n.. code-block:: none\n\n &target=summarize(counter.errors, \"1hour\") # total errors per hour\n &target=summarize(nonNegativeDerivative(gauge.num_users), \"1week\") # new users per week\n &target=summarize(queue.size, \"1hour\", \"avg\") # average queue size per hour\n &target=summarize(queue.size, \"1hour\", \"max\") # maximum queue size during each hour\n &target=summarize(metric, \"13week\", \"avg\", true)&from=midnight+20100101 # 2010 Q1-4", + "module": "graphite.render.functions", + "group": "Transform", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "intervalString", + "type": "interval", + "required": true, + "suggestions": [ + "10min", + "1h", + "1d" + ] + }, + { + "name": "func", + "type": "aggFunc", + "default": "sum", + "options": [ + "average", + "avg", + "avg_zero", + "count", + "current", + "diff", + "last", + "max", + "median", + "min", + "multiply", + "range", + "rangeOf", + "stddev", + "sum", + "total" + ] + }, + { + "name": "alignToFrom", + "type": "boolean", + "default": false + } + ] + }, + "timeShift": { + "name": "timeShift", + "function": "timeShift(seriesList, timeShift, resetEnd=True, alignDST=False)", + "description": "Takes one metric or a wildcard seriesList, followed by a quoted string with the\nlength of time (See ``from / until`` in the :doc:`Render API ` for examples of time formats).\n\nDraws the selected metrics shifted in time. If no sign is given, a minus sign ( - ) is\nimplied which will shift the metric back in time. If a plus sign ( + ) is given, the\nmetric will be shifted forward in time.\n\nWill reset the end date range automatically to the end of the base stat unless\nresetEnd is False. An example case is when you timeshift to last week and have the graph\ndate range set to include a time in the future; this will limit the timeshift to pretend it\nends at the current time. If resetEnd is False, will instead draw full range including\nfuture time.\n\nBecause time is shifted by a fixed number of seconds, comparing a time period with DST to\na time period without DST, and vice-versa, will result in an apparent misalignment. For\nexample, 8am might be overlaid with 7am. To compensate for this, use the alignDST option.\n\nUseful for comparing a metric against itself at past periods or correcting data\nstored at an offset.\n\nExample:\n\n.. 
code-block:: none\n\n &target=timeShift(Sales.widgets.largeBlue,\"7d\")\n &target=timeShift(Sales.widgets.largeBlue,\"-7d\")\n &target=timeShift(Sales.widgets.largeBlue,\"+1h\")", + "module": "graphite.render.functions", + "group": "Transform", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "timeShift", + "type": "interval", + "required": true, + "suggestions": [ + "1h", + "6h", + "12h", + "1d", + "2d", + "7d", + "14d", + "30d" + ] + }, + { + "name": "resetEnd", + "type": "boolean", + "default": true + }, + { + "name": "alignDst", + "type": "boolean", + "default": false + } + ] + }, + "timeSlice": { + "name": "timeSlice", + "function": "timeSlice(seriesList, startSliceAt, endSliceAt='now')", + "description": "Takes one metric or a wildcard metric, followed by a quoted string with the\ntime to start the line and another quoted string with the time to end the line.\nThe start and end times are inclusive. See ``from / until`` in the :doc:`Render API `\nfor examples of time formats.\n\nUseful for filtering out a part of a series of data from a wider range of\ndata.\n\nExample:\n\n.. code-block:: none\n\n &target=timeSlice(network.core.port1,\"00:00 20140101\",\"11:59 20140630\")\n &target=timeSlice(network.core.port1,\"12:00 20140630\",\"now\")", + "module": "graphite.render.functions", + "group": "Transform", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "startSliceAt", + "type": "date", + "required": true + }, + { + "name": "endSliceAt", + "type": "date", + "default": "now" + } + ] + }, + "timeStack": { + "name": "timeStack", + "function": "timeStack(seriesList, timeShiftUnit='1d', timeShiftStart=0, timeShiftEnd=7)", + "description": "Takes one metric or a wildcard seriesList, followed by a quoted string with the\nlength of time (See ``from / until`` in the :doc:`Render API ` for examples of time formats).\nAlso takes a start multiplier and end multiplier for the length of time.\n\nCreates a seriesList which is composed of the original metric series stacked with time shifts,\nstarting from the start multiplier through the end multiplier.\n\nUseful for looking at history, or feeding into averageSeries or stddevSeries.\n\nExample:\n\n.. code-block:: none\n\n &target=timeStack(Sales.widgets.largeBlue,\"1d\",0,7) # create a series for today and each of the previous 7 days", + "module": "graphite.render.functions", + "group": "Transform", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "timeShiftUnit", + "type": "interval", + "default": "1d", + "suggestions": [ + "1h", + "6h", + "12h", + "1d", + "2d", + "7d", + "14d", + "30d" + ] + }, + { + "name": "timeShiftStart", + "type": "integer", + "default": 0 + }, + { + "name": "timeShiftEnd", + "type": "integer", + "default": 7 + } + ] + }, + "transformNull": { + "name": "transformNull", + "function": "transformNull(seriesList, default=0, referenceSeries=None)", + "description": "Takes a metric or wildcard seriesList and replaces null values with the value\nspecified by `default`. The value 0 is used if not specified. The optional\nreferenceSeries, if specified, is a metric or wildcard series list that governs\nin which time intervals nulls should be replaced. If specified, nulls are replaced\nonly in intervals where a non-null is found for the same interval in any of\nreferenceSeries. 
This method complements the drawNullAsZero function in\ngraphical mode, but also works in text-only mode.\n\nExample:\n\n.. code-block:: none\n\n &target=transformNull(webapp.pages.*.views,-1)\n\nThis would take any page that didn't have values and supply negative 1 as a default.\nAny other numeric value may be used as well.", + "module": "graphite.render.functions", + "group": "Transform", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "default", + "type": "float", + "default": 0 + }, + { + "name": "referenceSeries", + "type": "seriesList" + } + ] + }, + "aggregateLine": { + "name": "aggregateLine", + "function": "aggregateLine(seriesList, func='average', keepStep=False)", + "description": "Takes a metric or wildcard seriesList and draws a horizontal line\nbased on the function applied to each series.\n\nIf the optional keepStep parameter is set to True, the result will\nhave the same time period and step as the source series.\n\nNote: By default, the graphite renderer consolidates data points by\naveraging data points over time. If you are using the 'min' or 'max'\nfunction for aggregateLine, this can cause an unusual gap in the\nline drawn by this function and the data itself. To fix this, you\nshould use the consolidateBy() function with the same function\nargument you are using for aggregateLine. This will ensure that the\nproper data points are retained and the graph should line up\ncorrectly.\n\nExample:\n\n.. code-block:: none\n\n &target=aggregateLine(server01.connections.total, 'avg')\n &target=aggregateLine(server*.connections.total, 'avg')", + "module": "graphite.render.functions", + "group": "Calculate", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "func", + "type": "aggFunc", + "default": "average", + "options": [ + "average", + "avg", + "avg_zero", + "count", + "current", + "diff", + "last", + "max", + "median", + "min", + "multiply", + "range", + "rangeOf", + "stddev", + "sum", + "total" + ] + }, + { + "name": "keepStep", + "type": "boolean", + "default": false + } + ] + }, + "exponentialMovingAverage": { + "name": "exponentialMovingAverage", + "function": "exponentialMovingAverage(seriesList, windowSize)", + "description": "Takes a series of values and a window size and produces an exponential moving\naverage utilizing the following formula:\n\n.. code-block:: none\n\n ema(current) = constant * (Current Value) + (1 - constant) * ema(previous)\n\nThe Constant is calculated as:\n\n.. code-block:: none\n\n constant = 2 / (windowSize + 1)\n\nThe first period EMA uses a simple moving average for its value.\n\nExample:\n\n.. 
code-block:: none\n\n &target=exponentialMovingAverage(*.transactions.count, 10)\n &target=exponentialMovingAverage(*.transactions.count, '-10s')", + "module": "graphite.render.functions", + "group": "Calculate", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "windowSize", + "type": "intOrInterval", + "required": true, + "suggestions": [ + 5, + 7, + 10, + "1min", + "5min", + "10min", + "30min", + "1hour" + ] + } + ] + }, + "holtWintersAberration": { + "name": "holtWintersAberration", + "function": "holtWintersAberration(seriesList, delta=3, bootstrapInterval='7d', seasonality='1d')", + "description": "Performs a Holt-Winters forecast using the series as input data and plots the\npositive or negative deviation of the series data from the forecast.", + "module": "graphite.render.functions", + "group": "Calculate", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "delta", + "type": "integer", + "default": 3 + }, + { + "name": "bootstrapInterval", + "type": "interval", + "default": "7d", + "suggestions": [ + "7d", + "30d" + ] + }, + { + "name": "seasonality", + "type": "interval", + "default": "1d", + "suggestions": [ + "1d", + "7d" + ] + } + ] + }, + "holtWintersConfidenceArea": { + "name": "holtWintersConfidenceArea", + "function": "holtWintersConfidenceArea(seriesList, delta=3, bootstrapInterval='7d', seasonality='1d')", + "description": "Performs a Holt-Winters forecast using the series as input data and plots the\narea between the upper and lower bands of the predicted forecast deviations.", + "module": "graphite.render.functions", + "group": "Calculate", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "delta", + "type": "integer", + "default": 3 + }, + { + "name": "bootstrapInterval", + "type": "interval", + "default": "7d", + "suggestions": [ + "7d", + "30d" + ] + }, + { + "name": "seasonality", + "type": "interval", + "default": "1d", + "suggestions": [ + "1d", + "7d" + ] + } + ] + }, + "holtWintersConfidenceBands": { + "name": "holtWintersConfidenceBands", + "function": "holtWintersConfidenceBands(seriesList, delta=3, bootstrapInterval='7d', seasonality='1d')", + "description": "Performs a Holt-Winters forecast using the series as input data and plots\nupper and lower bands with the predicted forecast deviations.", + "module": "graphite.render.functions", + "group": "Calculate", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "delta", + "type": "integer", + "default": 3 + }, + { + "name": "bootstrapInterval", + "type": "interval", + "default": "7d", + "suggestions": [ + "7d", + "30d" + ] + }, + { + "name": "seasonality", + "type": "interval", + "default": "1d", + "suggestions": [ + "1d", + "7d" + ] + } + ] + }, + "holtWintersForecast": { + "name": "holtWintersForecast", + "function": "holtWintersForecast(seriesList, bootstrapInterval='7d', seasonality='1d')", + "description": "Performs a Holt-Winters forecast using the series as input data. 
Data from\n`bootstrapInterval` (one week by default) previous to the series is used to bootstrap the initial forecast.", + "module": "graphite.render.functions", + "group": "Calculate", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "bootstrapInterval", + "type": "interval", + "default": "7d", + "suggestions": [ + "7d", + "30d" + ] + }, + { + "name": "seasonality", + "type": "interval", + "default": "1d", + "suggestions": [ + "1d", + "7d" + ] + } + ] + }, + "linearRegression": { + "name": "linearRegression", + "function": "linearRegression(seriesList, startSourceAt=None, endSourceAt=None)", + "description": "Graphs the linear regression function by least squares method.\n\nTakes one metric or a wildcard seriesList, followed by a quoted string with the\ntime to start the line and another quoted string with the time to end the line.\nThe start and end times are inclusive (default range is from to until). See\n``from / until`` in the :doc:`Render API ` for examples of time formats. Datapoints\nin the range are used for the regression.\n\nExample:\n\n.. code-block:: none\n\n &target=linearRegression(Server.instance01.threads.busy, '-1d')\n &target=linearRegression(Server.instance*.threads.busy, \"00:00 20140101\",\"11:59 20140630\")", + "module": "graphite.render.functions", + "group": "Calculate", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "startSourceAt", + "type": "date" + }, + { + "name": "endSourceAt", + "type": "date" + } + ] + }, + "movingAverage": { + "name": "movingAverage", + "function": "movingAverage(seriesList, windowSize, xFilesFactor=None)", + "description": "Graphs the moving average of a metric (or metrics) over a fixed number of\npast points, or a time interval.\n\nTakes one metric or a wildcard seriesList followed by a number N of datapoints\nor a quoted string with a length of time like '1hour' or '5min' (See ``from /\nuntil`` in the :doc:`Render API ` for examples of time formats), and an xFilesFactor value to specify\nhow many points in the window must be non-null for the output to be considered valid. Graphs the\naverage of the preceding datapoints for each point on the graph.\n\nExample:\n\n.. code-block:: none\n\n &target=movingAverage(Server.instance01.threads.busy,10)\n &target=movingAverage(Server.instance*.threads.idle,'5min')", + "module": "graphite.render.functions", + "group": "Calculate", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "windowSize", + "type": "intOrInterval", + "required": true, + "suggestions": [ + 5, + 7, + 10, + "1min", + "5min", + "10min", + "30min", + "1hour" + ] + }, + { + "name": "xFilesFactor", + "type": "float" + } + ] + }, + "movingMax": { + "name": "movingMax", + "function": "movingMax(seriesList, windowSize, xFilesFactor=None)", + "description": "Graphs the moving maximum of a metric (or metrics) over a fixed number of\npast points, or a time interval.\n\nTakes one metric or a wildcard seriesList followed by a number N of datapoints\nor a quoted string with a length of time like '1hour' or '5min' (See ``from /\nuntil`` in the :doc:`Render API ` for examples of time formats), and an xFilesFactor value to specify\nhow many points in the window must be non-null for the output to be considered valid. Graphs the\nmaximum of the preceding datapoints for each point on the graph.\n\nExample:\n\n.. 
code-block:: none\n\n &target=movingMax(Server.instance01.requests,10)\n &target=movingMax(Server.instance*.errors,'5min')", + "module": "graphite.render.functions", + "group": "Calculate", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "windowSize", + "type": "intOrInterval", + "required": true, + "suggestions": [ + 5, + 7, + 10, + "1min", + "5min", + "10min", + "30min", + "1hour" + ] + }, + { + "name": "xFilesFactor", + "type": "float" + } + ] + }, + "movingMedian": { + "name": "movingMedian", + "function": "movingMedian(seriesList, windowSize, xFilesFactor=None)", + "description": "Graphs the moving median of a metric (or metrics) over a fixed number of\npast points, or a time interval.\n\nTakes one metric or a wildcard seriesList followed by a number N of datapoints\nor a quoted string with a length of time like '1hour' or '5min' (See ``from /\nuntil`` in the :doc:`Render API ` for examples of time formats), and an xFilesFactor value to specify\nhow many points in the window must be non-null for the output to be considered valid. Graphs the\nmedian of the preceding datapoints for each point on the graph.\n\nExample:\n\n.. code-block:: none\n\n &target=movingMedian(Server.instance01.threads.busy,10)\n &target=movingMedian(Server.instance*.threads.idle,'5min')", + "module": "graphite.render.functions", + "group": "Calculate", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "windowSize", + "type": "intOrInterval", + "required": true, + "suggestions": [ + 5, + 7, + 10, + "1min", + "5min", + "10min", + "30min", + "1hour" + ] + }, + { + "name": "xFilesFactor", + "type": "float" + } + ] + }, + "movingMin": { + "name": "movingMin", + "function": "movingMin(seriesList, windowSize, xFilesFactor=None)", + "description": "Graphs the moving minimum of a metric (or metrics) over a fixed number of\npast points, or a time interval.\n\nTakes one metric or a wildcard seriesList followed by a number N of datapoints\nor a quoted string with a length of time like '1hour' or '5min' (See ``from /\nuntil`` in the :doc:`Render API ` for examples of time formats), and an xFilesFactor value to specify\nhow many points in the window must be non-null for the output to be considered valid. Graphs the\nminimum of the preceding datapoints for each point on the graph.\n\nExample:\n\n.. code-block:: none\n\n &target=movingMin(Server.instance01.requests,10)\n &target=movingMin(Server.instance*.errors,'5min')", + "module": "graphite.render.functions", + "group": "Calculate", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "windowSize", + "type": "intOrInterval", + "required": true, + "suggestions": [ + 5, + 7, + 10, + "1min", + "5min", + "10min", + "30min", + "1hour" + ] + }, + { + "name": "xFilesFactor", + "type": "float" + } + ] + }, + "movingSum": { + "name": "movingSum", + "function": "movingSum(seriesList, windowSize, xFilesFactor=None)", + "description": "Graphs the moving sum of a metric (or metrics) over a fixed number of\npast points, or a time interval.\n\nTakes one metric or a wildcard seriesList followed by a number N of datapoints\nor a quoted string with a length of time like '1hour' or '5min' (See ``from /\nuntil`` in the :doc:`Render API ` for examples of time formats), and an xFilesFactor value to specify\nhow many points in the window must be non-null for the output to be considered valid. 
Graphs the\nsum of the preceding datapoints for each point on the graph.\n\nExample:\n\n.. code-block:: none\n\n &target=movingSum(Server.instance01.requests,10)\n &target=movingSum(Server.instance*.errors,'5min')", + "module": "graphite.render.functions", + "group": "Calculate", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "windowSize", + "type": "intOrInterval", + "required": true, + "suggestions": [ + 5, + 7, + 10, + "1min", + "5min", + "10min", + "30min", + "1hour" + ] + }, + { + "name": "xFilesFactor", + "type": "float" + } + ] + }, + "movingWindow": { + "name": "movingWindow", + "function": "movingWindow(seriesList, windowSize, func='average', xFilesFactor=None)", + "description": "Graphs a moving window function of a metric (or metrics) over a fixed number of\npast points, or a time interval.\n\nTakes one metric or a wildcard seriesList, a number N of datapoints\nor a quoted string with a length of time like '1hour' or '5min' (See ``from /\nuntil`` in the :doc:`Render API ` for examples of time formats), a function to apply to the points\nin the window to produce the output, and an xFilesFactor value to specify how many points in the\nwindow must be non-null for the output to be considered valid. Graphs the\noutput of the function for the preceding datapoints for each point on the graph.\n\nExample:\n\n.. code-block:: none\n\n &target=movingWindow(Server.instance01.threads.busy,10)\n &target=movingWindow(Server.instance*.threads.idle,'5min','median',0.5)\n\n.. note::\n\n `xFilesFactor` follows the same semantics as in Whisper storage schemas. Setting it to 0 (the\n default) means that only a single value in a given interval needs to be non-null, setting it to\n 1 means that all values in the interval must be non-null. A setting of 0.5 means that at least\n half the values in the interval must be non-null.", + "module": "graphite.render.functions", + "group": "Calculate", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "windowSize", + "type": "intOrInterval", + "required": true, + "suggestions": [ + 5, + 7, + 10, + "1min", + "5min", + "10min", + "30min", + "1hour" + ] + }, + { + "name": "func", + "type": "aggFunc", + "default": "average", + "options": [ + "average", + "avg", + "avg_zero", + "count", + "current", + "diff", + "last", + "max", + "median", + "min", + "multiply", + "range", + "rangeOf", + "stddev", + "sum", + "total" + ] + }, + { + "name": "xFilesFactor", + "type": "float" + } + ] + }, + "nPercentile": { + "name": "nPercentile", + "function": "nPercentile(seriesList, n)", + "description": "Returns n-percent of each series in the seriesList.", + "module": "graphite.render.functions", + "group": "Calculate", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "n", + "type": "float", + "required": true + } + ] + }, + "stdev": { + "name": "stdev", + "function": "stdev(seriesList, points, windowTolerance=0.1)", + "description": "Takes one metric or a wildcard seriesList followed by an integer N.\nDraws the standard deviation of all metrics passed for the past N datapoints.\nIf the ratio of null points in the window is greater than windowTolerance,\nskip the calculation. The default for windowTolerance is 0.1 (up to 10% of points\nin the window can be missing). Note that if this is set to 0.0, it will cause large\ngaps in the output anywhere a single point is missing.\n\nExample:\n\n.. 
code-block:: none\n\n &target=stdev(server*.instance*.threads.busy,30)\n &target=stdev(server*.instance*.cpu.system,30,0.0)", + "module": "graphite.render.functions", + "group": "Calculate", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "points", + "type": "integer", + "required": true + }, + { + "name": "windowTolerance", + "type": "float", + "default": 0.1 + } + ] + }, + "averageAbove": { + "name": "averageAbove", + "function": "averageAbove(seriesList, n)", + "description": "Takes one metric or a wildcard seriesList followed by a constant N.\nOut of all metrics passed, draws only the metrics with an average value\nabove N for the time period specified.\n\nExample:\n\n.. code-block:: none\n\n &target=averageAbove(server*.instance*.threads.busy,25)\n\nDraws the servers with average values above 25.", + "module": "graphite.render.functions", + "group": "Filter Series", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "n", + "type": "float", + "required": true + } + ] + }, + "averageBelow": { + "name": "averageBelow", + "function": "averageBelow(seriesList, n)", + "description": "Takes one metric or a wildcard seriesList followed by a constant N.\nOut of all metrics passed, draws only the metrics with an average value\nbelow N for the time period specified.\n\nExample:\n\n.. code-block:: none\n\n &target=averageBelow(server*.instance*.threads.busy,25)\n\nDraws the servers with average values below 25.", + "module": "graphite.render.functions", + "group": "Filter Series", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "n", + "type": "float", + "required": true + } + ] + }, + "averageOutsidePercentile": { + "name": "averageOutsidePercentile", + "function": "averageOutsidePercentile(seriesList, n)", + "description": "Removes series lying inside an average percentile interval", + "module": "graphite.render.functions", + "group": "Filter Series", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "n", + "type": "float", + "required": true + } + ] + }, + "currentAbove": { + "name": "currentAbove", + "function": "currentAbove(seriesList, n)", + "description": "Takes one metric or a wildcard seriesList followed by a constant N.\nOut of all metrics passed, draws only the metrics whose value is above N\nat the end of the time period specified.\n\nExample:\n\n.. code-block:: none\n\n &target=currentAbove(server*.instance*.threads.busy,50)\n\nDraws the servers with more than 50 busy threads.", + "module": "graphite.render.functions", + "group": "Filter Series", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "n", + "type": "float", + "required": true + } + ] + }, + "currentBelow": { + "name": "currentBelow", + "function": "currentBelow(seriesList, n)", + "description": "Takes one metric or a wildcard seriesList followed by a constant N.\nOut of all metrics passed, draws only the metrics whose value is below N\nat the end of the time period specified.\n\nExample:\n\n.. 
code-block:: none\n\n &target=currentBelow(server*.instance*.threads.busy,3)\n\nDraws the servers with fewer than 3 busy threads.", + "module": "graphite.render.functions", + "group": "Filter Series", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "n", + "type": "float", + "required": true + } + ] + }, + "exclude": { + "name": "exclude", + "function": "exclude(seriesList, pattern)", + "description": "Takes a metric or a wildcard seriesList, followed by a regular expression\nin double quotes. Excludes metrics that match the regular expression.\n\nExample:\n\n.. code-block:: none\n\n &target=exclude(servers*.instance*.threads.busy,\"server02\")", + "module": "graphite.render.functions", + "group": "Filter Series", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "pattern", + "type": "string", + "required": true + } + ] + }, + "filterSeries": { + "name": "filterSeries", + "function": "filterSeries(seriesList, func, operator, threshold)", + "description": "Takes one metric or a wildcard seriesList followed by a consolidation function, an operator and a threshold.\nDraws only the metrics which match the filter expression.\n\nExample:\n\n.. code-block:: none\n\n &target=filterSeries(system.interface.eth*.packetsSent, 'max', '>', 1000)\n\nThis would only display interfaces which have a peak throughput higher than 1000 packets/min.\n\nSupported aggregation functions: ``average``, ``median``, ``sum``, ``min``,\n``max``, ``diff``, ``stddev``, ``range``, ``multiply`` & ``last``.\n\nSupported operators: ``=``, ``!=``, ``>``, ``>=``, ``<`` & ``<=``.", + "module": "graphite.render.functions", + "group": "Filter Series", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "func", + "type": "aggFunc", + "required": true, + "options": [ + "average", + "avg", + "avg_zero", + "count", + "current", + "diff", + "last", + "max", + "median", + "min", + "multiply", + "range", + "rangeOf", + "stddev", + "sum", + "total" + ] + }, + { + "name": "operator", + "type": "string", + "required": true, + "options": [ + "!=", + "<", + "<=", + "=", + ">", + ">=" + ] + }, + { + "name": "threshold", + "type": "float", + "required": true + } + ] + }, + "grep": { + "name": "grep", + "function": "grep(seriesList, pattern)", + "description": "Takes a metric or a wildcard seriesList, followed by a regular expression\nin double quotes. Excludes metrics that don't match the regular expression.\n\nExample:\n\n.. code-block:: none\n\n &target=grep(servers*.instance*.threads.busy,\"server02\")", + "module": "graphite.render.functions", + "group": "Filter Series", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "pattern", + "type": "string", + "required": true + } + ] + }, + "highest": { + "name": "highest", + "function": "highest(seriesList, n=1, func='average')", + "description": "Takes one metric or a wildcard seriesList followed by an integer N and an aggregation function.\nOut of all metrics passed, draws only the N metrics with the highest aggregated value over the\ntime period specified.\n\nExample:\n\n.. 
code-block:: none\n\n &target=highest(server*.instance*.threads.busy,5,'max')\n\nDraws the 5 servers with the highest number of busy threads.", + "module": "graphite.render.functions", + "group": "Filter Series", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "n", + "type": "integer", + "required": true + }, + { + "name": "func", + "type": "aggFunc", + "default": "average", + "options": [ + "average", + "avg", + "avg_zero", + "count", + "current", + "diff", + "last", + "max", + "median", + "min", + "multiply", + "range", + "rangeOf", + "stddev", + "sum", + "total" + ] + } + ] + }, + "highestAverage": { + "name": "highestAverage", + "function": "highestAverage(seriesList, n)", + "description": "Takes one metric or a wildcard seriesList followed by an integer N.\nOut of all metrics passed, draws only the top N metrics with the highest\naverage value for the time period specified.\n\nExample:\n\n.. code-block:: none\n\n &target=highestAverage(server*.instance*.threads.busy,5)\n\nDraws the top 5 servers with the highest average value.\n\nThis is an alias for :py:func:`highest ` with aggregation ``average``.", + "module": "graphite.render.functions", + "group": "Filter Series", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "n", + "type": "integer", + "required": true + } + ] + }, + "highestCurrent": { + "name": "highestCurrent", + "function": "highestCurrent(seriesList, n)", + "description": "Takes one metric or a wildcard seriesList followed by an integer N.\nOut of all metrics passed, draws only the N metrics with the highest value\nat the end of the time period specified.\n\nExample:\n\n.. code-block:: none\n\n &target=highestCurrent(server*.instance*.threads.busy,5)\n\nDraws the 5 servers with the highest busy threads.\n\nThis is an alias for :py:func:`highest ` with aggregation ``current``.", + "module": "graphite.render.functions", + "group": "Filter Series", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "n", + "type": "integer", + "required": true + } + ] + }, + "highestMax": { + "name": "highestMax", + "function": "highestMax(seriesList, n)", + "description": "Takes one metric or a wildcard seriesList followed by an integer N.\n\nOut of all metrics passed, draws only the N metrics with the highest maximum\nvalue in the time period specified.\n\nExample:\n\n.. code-block:: none\n\n &target=highestMax(server*.instance*.threads.busy,5)\n\nDraws the top 5 servers who have had the most busy threads during the time\nperiod specified.\n\nThis is an alias for :py:func:`highest ` with aggregation ``max``.", + "module": "graphite.render.functions", + "group": "Filter Series", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "n", + "type": "integer", + "required": true + } + ] + }, + "limit": { + "name": "limit", + "function": "limit(seriesList, n)", + "description": "Takes one metric or a wildcard seriesList followed by an integer N.\n\nOnly draw the first N metrics. Useful when testing a wildcard in a metric.\n\nExample:\n\n.. 
code-block:: none\n\n &target=limit(server*.instance*.memory.free,5)\n\nDraws only the first 5 instances' free memory.", + "module": "graphite.render.functions", + "group": "Filter Series", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "n", + "type": "integer", + "required": true + } + ] + }, + "lowest": { + "name": "lowest", + "function": "lowest(seriesList, n=1, func='average')", + "description": "Takes one metric or a wildcard seriesList followed by an integer N and an aggregation function.\nOut of all metrics passed, draws only the N metrics with the lowest aggregated value over the\ntime period specified.\n\nExample:\n\n.. code-block:: none\n\n &target=lowest(server*.instance*.threads.busy,5,'min')\n\nDraws the 5 servers with the lowest number of busy threads.", + "module": "graphite.render.functions", + "group": "Filter Series", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "n", + "type": "integer", + "required": true + }, + { + "name": "func", + "type": "aggFunc", + "default": "average", + "options": [ + "average", + "avg", + "avg_zero", + "count", + "current", + "diff", + "last", + "max", + "median", + "min", + "multiply", + "range", + "rangeOf", + "stddev", + "sum", + "total" + ] + } + ] + }, + "lowestAverage": { + "name": "lowestAverage", + "function": "lowestAverage(seriesList, n)", + "description": "Takes one metric or a wildcard seriesList followed by an integer N.\nOut of all metrics passed, draws only the bottom N metrics with the lowest\naverage value for the time period specified.\n\nExample:\n\n.. code-block:: none\n\n &target=lowestAverage(server*.instance*.threads.busy,5)\n\nDraws the bottom 5 servers with the lowest average value.\n\nThis is an alias for :py:func:`lowest ` with aggregation ``average``.", + "module": "graphite.render.functions", + "group": "Filter Series", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "n", + "type": "integer", + "required": true + } + ] + }, + "lowestCurrent": { + "name": "lowestCurrent", + "function": "lowestCurrent(seriesList, n)", + "description": "Takes one metric or a wildcard seriesList followed by an integer N.\nOut of all metrics passed, draws only the N metrics with the lowest value at\nthe end of the time period specified.\n\nExample:\n\n.. code-block:: none\n\n &target=lowestCurrent(server*.instance*.threads.busy,5)\n\nDraws the 5 servers with the least busy threads right now.\n\nThis is an alias for :py:func:`lowest ` with aggregation ``current``.", + "module": "graphite.render.functions", + "group": "Filter Series", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "n", + "type": "integer", + "required": true + } + ] + }, + "maximumAbove": { + "name": "maximumAbove", + "function": "maximumAbove(seriesList, n)", + "description": "Takes one metric or a wildcard seriesList followed by a constant n.\nDraws only the metrics with a maximum value above n.\n\nExample:\n\n.. 
code-block:: none\n\n &target=maximumAbove(system.interface.eth*.packetsSent,1000)\n\nThis would only display interfaces which sent more than 1000 packets/min.", + "module": "graphite.render.functions", + "group": "Filter Series", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "n", + "type": "float", + "required": true + } + ] + }, + "maximumBelow": { + "name": "maximumBelow", + "function": "maximumBelow(seriesList, n)", + "description": "Takes one metric or a wildcard seriesList followed by a constant n.\nDraws only the metrics with a maximum value below n.\n\nExample:\n\n.. code-block:: none\n\n &target=maximumBelow(system.interface.eth*.packetsSent,1000)\n\nThis would only display interfaces which sent less than 1000 packets/min.", + "module": "graphite.render.functions", + "group": "Filter Series", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "n", + "type": "float", + "required": true + } + ] + }, + "minimumAbove": { + "name": "minimumAbove", + "function": "minimumAbove(seriesList, n)", + "description": "Takes one metric or a wildcard seriesList followed by a constant n.\nDraws only the metrics with a minimum value above n.\n\nExample:\n\n.. code-block:: none\n\n &target=minimumAbove(system.interface.eth*.packetsSent,1000)\n\nThis would only display interfaces which sent more than 1000 packets/min.", + "module": "graphite.render.functions", + "group": "Filter Series", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "n", + "type": "float", + "required": true + } + ] + }, + "minimumBelow": { + "name": "minimumBelow", + "function": "minimumBelow(seriesList, n)", + "description": "Takes one metric or a wildcard seriesList followed by a constant n.\nDraws only the metrics with a minimum value below or equal to n.\n\nExample:\n\n.. code-block:: none\n\n &target=minimumBelow(system.interface.eth*.packetsSent,1000)\n\nThis would only display interfaces which at one point sent less than 1000 packets/min.", + "module": "graphite.render.functions", + "group": "Filter Series", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "n", + "type": "float", + "required": true + } + ] + }, + "mostDeviant": { + "name": "mostDeviant", + "function": "mostDeviant(seriesList, n)", + "description": "Takes one metric or a wildcard seriesList followed by an integer N.\nDraws the N most deviant metrics.\nTo find the deviants, the standard deviation (sigma) of each series\nis taken and ranked. The top N standard deviations are returned.\n\n Example:\n\n.. 
code-block:: none\n\n &target=mostDeviant(server*.instance*.memory.free, 5)\n\nDraws the 5 instances furthest from the average memory free.", + "module": "graphite.render.functions", + "group": "Filter Series", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "n", + "type": "integer", + "required": true + } + ] + }, + "removeBetweenPercentile": { + "name": "removeBetweenPercentile", + "function": "removeBetweenPercentile(seriesList, n)", + "description": "Removes series that do not have a value lying in the x-percentile of all the values at a moment", + "module": "graphite.render.functions", + "group": "Filter Series", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "n", + "type": "float", + "required": true + } + ] + }, + "removeEmptySeries": { + "name": "removeEmptySeries", + "function": "removeEmptySeries(seriesList, xFilesFactor=None)", + "description": "Takes one metric or a wildcard seriesList.\nOut of all metrics passed, draws only the metrics with non-empty data\n\nExample:\n\n.. code-block:: none\n\n &target=removeEmptySeries(server*.instance*.threads.busy)\n\nDraws only live servers with non-empty data.\n\n`xFilesFactor` follows the same semantics as in Whisper storage schemas. Setting it to 0 (the\ndefault) means that only a single value in the series needs to be non-null for it to be\nconsidered non-empty, setting it to 1 means that all values in the series must be non-null.\nA setting of 0.5 means that at least half the values in the series must be non-null.", + "module": "graphite.render.functions", + "group": "Filter Series", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "xFilesFactor", + "type": "float" + } + ] + }, + "unique": { + "name": "unique", + "function": "unique(*seriesLists)", + "description": "Takes an arbitrary number of seriesLists and returns unique series, filtered by name.\n\nExample:\n\n.. code-block:: none\n\n &target=unique(mostDeviant(server.*.disk_free,5),lowestCurrent(server.*.disk_free,5))\n\nDraws servers with low disk space, and servers with highly deviant disk space, but never the same series twice.", + "module": "graphite.render.functions", + "group": "Filter Series", + "params": [ + { + "name": "seriesLists", + "type": "seriesList", + "required": true, + "multiple": true + } + ] + }, + "useSeriesAbove": { + "name": "useSeriesAbove", + "function": "useSeriesAbove(seriesList, value, search, replace)", + "description": "Compares the maximum of each series against the given `value`. If the series\nmaximum is greater than `value`, the regular expression search and replace is\napplied against the series name to plot a related metric\n\ne.g. given useSeriesAbove(ganglia.metric1.reqs,10,'reqs','time'),\nthe response time metric will be plotted only when the maximum value of the\ncorresponding request/s metric is > 10\n\n..
code-block:: none\n\n &target=useSeriesAbove(ganglia.metric1.reqs,10,\"reqs\",\"time\")", + "module": "graphite.render.functions", + "group": "Filter Series", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "value", + "type": "float", + "required": true + }, + { + "name": "search", + "type": "string", + "required": true + }, + { + "name": "replace", + "type": "string", + "required": true + } + ] + }, + "removeAbovePercentile": { + "name": "removeAbovePercentile", + "function": "removeAbovePercentile(seriesList, n)", + "description": "Removes data above the nth percentile from the series or list of series provided.\nValues above this percentile are assigned a value of None.", + "module": "graphite.render.functions", + "group": "Filter Data", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "n", + "type": "float", + "required": true + } + ] + }, + "removeAboveValue": { + "name": "removeAboveValue", + "function": "removeAboveValue(seriesList, n)", + "description": "Removes data above the given threshold from the series or list of series provided.\nValues above this threshold are assigned a value of None.", + "module": "graphite.render.functions", + "group": "Filter Data", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "n", + "type": "float", + "required": true + } + ] + }, + "removeBelowPercentile": { + "name": "removeBelowPercentile", + "function": "removeBelowPercentile(seriesList, n)", + "description": "Removes data below the nth percentile from the series or list of series provided.\nValues below this percentile are assigned a value of None.", + "module": "graphite.render.functions", + "group": "Filter Data", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "n", + "type": "float", + "required": true + } + ] + }, + "removeBelowValue": { + "name": "removeBelowValue", + "function": "removeBelowValue(seriesList, n)", + "description": "Removes data below the given threshold from the series or list of series provided.\nValues below this threshold are assigned a value of None.", + "module": "graphite.render.functions", + "group": "Filter Data", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "n", + "type": "float", + "required": true + } + ] + }, + "sortBy": { + "name": "sortBy", + "function": "sortBy(seriesList, func='average', reverse=False)", + "description": "Takes one metric or a wildcard seriesList followed by an aggregation function and an\noptional ``reverse`` parameter.\n\nReturns the metrics sorted according to the specified function.\n\nExample:\n\n.. 
code-block:: none\n\n &target=sortBy(server*.instance*.threads.busy,'max')\n\nDraws the servers in ascending order by maximum.", + "module": "graphite.render.functions", + "group": "Sorting", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "func", + "type": "aggFunc", + "default": "average", + "options": [ + "average", + "avg", + "avg_zero", + "count", + "current", + "diff", + "last", + "max", + "median", + "min", + "multiply", + "range", + "rangeOf", + "stddev", + "sum", + "total" + ] + }, + { + "name": "reverse", + "type": "boolean", + "default": false + } + ] + }, + "sortByMaxima": { + "name": "sortByMaxima", + "function": "sortByMaxima(seriesList)", + "description": "Takes one metric or a wildcard seriesList.\n\nSorts the list of metrics in descending order by the maximum value across the time period\nspecified. Useful with the &areaMode=all parameter, to keep the\nlowest value lines visible.\n\nExample:\n\n.. code-block:: none\n\n &target=sortByMaxima(server*.instance*.memory.free)", + "module": "graphite.render.functions", + "group": "Sorting", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + } + ] + }, + "sortByMinima": { + "name": "sortByMinima", + "function": "sortByMinima(seriesList)", + "description": "Takes one metric or a wildcard seriesList.\n\nSorts the list of metrics by the lowest value across the time period\nspecified, including only series that have a maximum value greater than 0.\n\nExample:\n\n.. code-block:: none\n\n &target=sortByMinima(server*.instance*.memory.free)", + "module": "graphite.render.functions", + "group": "Sorting", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + } + ] + }, + "sortByName": { + "name": "sortByName", + "function": "sortByName(seriesList, natural=False, reverse=False)", + "description": "Takes one metric or a wildcard seriesList.\nSorts the list of metrics by the metric name using either alphabetical order or natural sorting.\nNatural sorting allows names containing numbers to be sorted more naturally, e.g:\n- Alphabetical sorting: server1, server11, server12, server2\n- Natural sorting: server1, server2, server11, server12", + "module": "graphite.render.functions", + "group": "Sorting", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "natural", + "type": "boolean", + "default": false + }, + { + "name": "reverse", + "type": "boolean", + "default": false + } + ] + }, + "sortByTotal": { + "name": "sortByTotal", + "function": "sortByTotal(seriesList)", + "description": "Takes one metric or a wildcard seriesList.\n\nSorts the list of metrics in descending order by the sum of values across the time period\nspecified.", + "module": "graphite.render.functions", + "group": "Sorting", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + } + ] + }, + "alias": { + "name": "alias", + "function": "alias(seriesList, newName)", + "description": "Takes one metric or a wildcard seriesList and a string in quotes.\nPrints the string instead of the metric name in the legend.\n\n.. 
code-block:: none\n\n &target=alias(Sales.widgets.largeBlue,\"Large Blue Widgets\")", + "module": "graphite.render.functions", + "group": "Alias", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "newName", + "type": "string", + "required": true + } + ] + }, + "aliasByMetric": { + "name": "aliasByMetric", + "function": "aliasByMetric(seriesList)", + "description": "Takes a seriesList and applies an alias derived from the base metric name.\n\n.. code-block:: none\n\n &target=aliasByMetric(carbon.agents.graphite.creates)", + "module": "graphite.render.functions", + "group": "Alias", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + } + ] + }, + "aliasByNode": { + "name": "aliasByNode", + "function": "aliasByNode(seriesList, *nodes)", + "description": "Takes a seriesList and applies an alias derived from one or more \"node\"\nportion/s of the target name or tags. Node indices are 0 indexed.\n\n.. code-block:: none\n\n &target=aliasByNode(ganglia.*.cpu.load5,1)\n\nEach node may be an integer referencing a node in the series name or a string identifying a tag.\n\n.. code-block:: none\n\n &target=seriesByTag(\"name=~cpu.load.*\", \"server=~server[1-9]+\", \"datacenter=dc1\")|aliasByNode(\"datacenter\", \"server\", 1)\n\n # will produce output series like\n # dc1.server1.load5, dc1.server2.load5, dc1.server1.load10, dc1.server2.load10", + "module": "graphite.render.functions", + "group": "Alias", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "nodes", + "type": "nodeOrTag", + "required": true, + "multiple": true + } + ] + }, + "aliasByTags": { + "name": "aliasByTags", + "function": "aliasByTags(seriesList, *tags)", + "description": "Takes a seriesList and applies an alias derived from one or more tags and/or nodes\n\n.. code-block:: none\n\n &target=seriesByTag(\"name=cpu\")|aliasByTags(\"server\",\"name\")\n\nThis is an alias for :py:func:`aliasByNode `.", + "module": "graphite.render.functions", + "group": "Alias", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "tags", + "type": "nodeOrTag", + "required": true, + "multiple": true + } + ] + }, + "aliasQuery": { + "name": "aliasQuery", + "function": "aliasQuery(seriesList, search, replace, newName)", + "description": "Performs a query to alias the metrics in seriesList.\n\n.. code-block:: none\n\n &target=aliasQuery(channel.power.*,\"channel\\.power\\.([0-9]+)\",\"channel.frequency.\\1\", \"Channel %d MHz\")\n\nThe series in seriesList will be aliased by first translating the series names using\nthe search & replace parameters, then using the last value of the resulting series\nto construct the alias using sprintf-style syntax.", + "module": "graphite.render.functions", + "group": "Alias", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "search", + "type": "string", + "required": true + }, + { + "name": "replace", + "type": "string", + "required": true + }, + { + "name": "newName", + "type": "string", + "required": true + } + ] + }, + "aliasSub": { + "name": "aliasSub", + "function": "aliasSub(seriesList, search, replace)", + "description": "Runs series names through a regex search/replace.\n\n.. 
code-block:: none\n\n &target=aliasSub(ip.*TCP*,\"^.*TCP(\\d+)\",\"\\1\")", + "module": "graphite.render.functions", + "group": "Alias", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "search", + "type": "string", + "required": true + }, + { + "name": "replace", + "type": "string", + "required": true + } + ] + }, + "legendValue": { + "name": "legendValue", + "function": "legendValue(seriesList, *valueTypes)", + "description": "Takes one metric or a wildcard seriesList and a string in quotes.\nAppends a value to the metric name in the legend. Currently one or several of: `last`, `avg`,\n`total`, `min`, `max`.\nThe last argument can be `si` (default) or `binary`, in that case values will be formatted in the\ncorresponding system.\n\n.. code-block:: none\n\n &target=legendValue(Sales.widgets.largeBlue, 'avg', 'max', 'si')", + "module": "graphite.render.functions", + "group": "Alias", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "valuesTypes", + "type": "string", + "multiple": true, + "options": [ + "average", + "averageSeries", + "avg", + "avgSeries", + "avg_zero", + "avg_zeroSeries", + "binary", + "count", + "countSeries", + "current", + "currentSeries", + "diff", + "diffSeries", + "last", + "lastSeries", + "max", + "maxSeries", + "median", + "medianSeries", + "min", + "minSeries", + "multiply", + "multiplySeries", + "range", + "rangeOf", + "rangeOfSeries", + "rangeSeries", + "si", + "stddev", + "stddevSeries", + "sum", + "sumSeries", + "total", + "totalSeries" + ] + } + ] + }, + "alpha": { + "name": "alpha", + "function": "alpha(seriesList, alpha)", + "description": "Assigns the given alpha transparency setting to the series. Takes a float value between 0 and 1.", + "module": "graphite.render.functions", + "group": "Graph", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "alpha", + "type": "float", + "required": true + } + ] + }, + "areaBetween": { + "name": "areaBetween", + "function": "areaBetween(seriesList)", + "description": "Draws the vertical area in between the two series in seriesList. Useful for\nvisualizing a range such as the minimum and maximum latency for a service.\n\nareaBetween expects **exactly one argument** that results in exactly two series\n(see example below). The order of the lower and higher values series does not\nmatter. The visualization only works when used in conjunction with\n``areaMode=stacked``.\n\nMost likely use case is to provide a band within which another metric should\nmove. In such case applying an ``alpha()``, as in the second example, gives\nbest visual results.\n\nExample:\n\n.. code-block:: none\n\n &target=areaBetween(service.latency.{min,max})&areaMode=stacked\n\n &target=alpha(areaBetween(service.latency.{min,max}),0.3)&areaMode=stacked\n\nIf for instance, you need to build a seriesList, you should use the ``group``\nfunction, like so:\n\n.. code-block:: none\n\n &target=areaBetween(group(minSeries(a.*.min),maxSeries(a.*.max)))", + "module": "graphite.render.functions", + "group": "Graph", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + } + ] + }, + "color": { + "name": "color", + "function": "color(seriesList, theColor)", + "description": "Assigns the given color to the seriesList\n\nExample:\n\n.. 
code-block:: none\n\n &target=color(collectd.hostname.cpu.0.user, 'green')\n &target=color(collectd.hostname.cpu.0.system, 'ff0000')\n &target=color(collectd.hostname.cpu.0.idle, 'gray')\n &target=color(collectd.hostname.cpu.0.idle, '6464ffaa')", + "module": "graphite.render.functions", + "group": "Graph", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "theColor", + "type": "string", + "required": true + } + ] + }, + "dashed": { + "name": "dashed", + "function": "dashed(seriesList, dashLength=5)", + "description": "Takes one metric or a wildcard seriesList, followed by a float F.\n\nDraw the selected metrics with a dotted line with segments of length F\nIf omitted, the default length of the segments is 5.0\n\nExample:\n\n.. code-block:: none\n\n &target=dashed(server01.instance01.memory.free,2.5)", + "module": "graphite.render.functions", + "group": "Graph", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "dashLength", + "type": "integer", + "default": 5 + } + ] + }, + "drawAsInfinite": { + "name": "drawAsInfinite", + "function": "drawAsInfinite(seriesList)", + "description": "Takes one metric or a wildcard seriesList.\nIf the value is zero, draw the line at 0. If the value is above zero, draw\nthe line at infinity. If the value is null or less than zero, do not draw\nthe line.\n\nUseful for displaying on/off metrics, such as exit codes. (0 = success,\nanything else = failure.)\n\nExample:\n\n.. code-block:: none\n\n drawAsInfinite(Testing.script.exitCode)", + "module": "graphite.render.functions", + "group": "Graph", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + } + ] + }, + "lineWidth": { + "name": "lineWidth", + "function": "lineWidth(seriesList, width)", + "description": "Takes one metric or a wildcard seriesList, followed by a float F.\n\nDraw the selected metrics with a line width of F, overriding the default\nvalue of 1, or the &lineWidth=X.X parameter.\n\nUseful for highlighting a single metric out of many, or having multiple\nline widths in one graph.\n\nExample:\n\n.. code-block:: none\n\n &target=lineWidth(server01.instance01.memory.free,5)", + "module": "graphite.render.functions", + "group": "Graph", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "width", + "type": "float", + "required": true + } + ] + }, + "secondYAxis": { + "name": "secondYAxis", + "function": "secondYAxis(seriesList)", + "description": "Graph the series on the secondary Y axis.", + "module": "graphite.render.functions", + "group": "Graph", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + } + ] + }, + "stacked": { + "name": "stacked", + "function": "stacked(seriesLists, stackName='__DEFAULT__')", + "description": "Takes one metric or a wildcard seriesList and change them so they are\nstacked. This is a way of stacking just a couple of metrics without having\nto use the stacked area mode (that stacks everything). By means of this a mixed\nstacked and non stacked graph can be made\n\nIt can also take an optional argument with a name of the stack, in case there is\nmore than one, e.g. for input and output metrics.\n\nExample:\n\n.. 
code-block:: none\n\n &target=stacked(company.server.application01.ifconfig.TXPackets, 'tx')", + "module": "graphite.render.functions", + "group": "Graph", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "stack", + "type": "string" + } + ] + }, + "threshold": { + "name": "threshold", + "function": "threshold(value, label=None, color=None)", + "description": "Takes a float F, followed by a label (in double quotes) and a color.\n(See ``bgcolor`` in the :doc:`Render API ` for valid color names & formats.)\n\nDraws a horizontal line at value F across the graph.\n\nExample:\n\n.. code-block:: none\n\n &target=threshold(123.456, \"omgwtfbbq\", \"red\")", + "module": "graphite.render.functions", + "group": "Graph", + "params": [ + { + "name": "value", + "type": "float", + "required": true + }, + { + "name": "label", + "type": "string" + }, + { + "name": "color", + "type": "string" + } + ] + }, + "verticalLine": { + "name": "verticalLine", + "function": "verticalLine(ts, label=None, color=None)", + "description": "Takes a timestamp string ts.\n\nDraws a vertical line at the designated timestamp with optional\n'label' and 'color'. Supported timestamp formats include both\nrelative (e.g. -3h) and absolute (e.g. 16:00_20110501) strings,\nsuch as those used with ``from`` and ``until`` parameters. When\nset, the 'label' will appear in the graph legend.\n\nNote: Any timestamps defined outside the requested range will\nraise a 'ValueError' exception.\n\nExample:\n\n.. code-block:: none\n\n &target=verticalLine(\"12:3420131108\",\"event\",\"blue\")\n &target=verticalLine(\"16:00_20110501\",\"event\")\n &target=verticalLine(\"-5mins\")", + "module": "graphite.render.functions", + "group": "Graph", + "params": [ + { + "name": "ts", + "type": "date", + "required": true + }, + { + "name": "label", + "type": "string" + }, + { + "name": "color", + "type": "string" + } + ] + }, + "cactiStyle": { + "name": "cactiStyle", + "function": "cactiStyle(seriesList, system=None, units=None)", + "description": "Takes a series list and modifies the aliases to provide column aligned\noutput with Current, Max, and Min values in the style of cacti. Optionally\ntakes a \"system\" value to apply unit formatting in the same style as the\nY-axis, or a \"unit\" string to append an arbitrary unit suffix.\n\n.. code-block:: none\n\n &target=cactiStyle(ganglia.*.net.bytes_out,\"si\")\n &target=cactiStyle(ganglia.*.net.bytes_out,\"si\",\"b\")\n\nA possible value for ``system`` is ``si``, which would express your values in\nmultiples of a thousand. A second option is to use ``binary`` which will\ninstead express your values in multiples of 1024 (useful for network devices).\n\nColumn alignment of the Current, Max, Min values works under two conditions:\nyou use a monospace font such as terminus and use a single cactiStyle call, as\nseparate cactiStyle calls are not aware of each other. In case you have\ndifferent targets for which you would like to have cactiStyle to line up, you\ncan use ``group()`` to combine them before applying cactiStyle, such as:\n\n.. 
code-block:: none\n\n &target=cactiStyle(group(metricA,metricB))", + "module": "graphite.render.functions", + "group": "Special", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "system", + "type": "string", + "options": [ + "binary", + "si" + ] + }, + { + "name": "units", + "type": "string" + } + ] + }, + "changed": { + "name": "changed", + "function": "changed(seriesList)", + "description": "Takes one metric or a wildcard seriesList.\nOutput 1 when the value changed, 0 when null or the same\n\nExample:\n\n.. code-block:: none\n\n &target=changed(Server01.connections.handled)", + "module": "graphite.render.functions", + "group": "Special", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + } + ] + }, + "consolidateBy": { + "name": "consolidateBy", + "function": "consolidateBy(seriesList, consolidationFunc)", + "description": "Takes one metric or a wildcard seriesList and a consolidation function name.\n\nValid function names are 'sum', 'average'/'avg', 'min', 'max', 'first' & 'last'.\n\nWhen a graph is drawn where width of the graph size in pixels is smaller than\nthe number of datapoints to be graphed, Graphite consolidates the values\nto prevent line overlap. The consolidateBy() function changes the consolidation\nfunction from the default of 'average' to one of 'sum', 'max', 'min', 'first', or 'last'.\nThis is especially useful in sales graphs, where fractional values make no sense and a 'sum'\nof consolidated values is appropriate.\n\n.. code-block:: none\n\n &target=consolidateBy(Sales.widgets.largeBlue, 'sum')\n &target=consolidateBy(Servers.web01.sda1.free_space, 'max')", + "module": "graphite.render.functions", + "group": "Special", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "consolidationFunc", + "type": "string", + "required": true, + "options": [ + "average", + "avg", + "avg_zero", + "first", + "last", + "max", + "min", + "sum" + ] + } + ] + }, + "constantLine": { + "name": "constantLine", + "function": "constantLine(value)", + "description": "Takes a float F.\n\nDraws a horizontal line at value F across the graph.\n\nExample:\n\n.. code-block:: none\n\n &target=constantLine(123.456)", + "module": "graphite.render.functions", + "group": "Special", + "params": [ + { + "name": "value", + "type": "float", + "required": true + } + ] + }, + "events": { + "name": "events", + "function": "events(*tags)", + "description": "Returns the number of events at this point in time. Usable with\ndrawAsInfinite.\n\nExample:\n\n.. code-block:: none\n\n &target=events(\"tag-one\", \"tag-two\")\n &target=events(\"*\")\n\nReturns all events tagged as \"tag-one\" and \"tag-two\" and the second one\nreturns all events.", + "module": "graphite.render.functions", + "group": "Special", + "params": [ + { + "name": "tags", + "type": "string", + "required": true, + "multiple": true + } + ] + }, + "cumulative": { + "name": "cumulative", + "function": "cumulative(seriesList)", + "description": "Takes one metric or a wildcard seriesList.\n\nWhen a graph is drawn where width of the graph size in pixels is smaller than\nthe number of datapoints to be graphed, Graphite consolidates the values\nto prevent line overlap. The cumulative() function changes the consolidation\nfunction from the default of 'average' to 'sum'.
This is especially useful in\nsales graphs, where fractional values make no sense and a 'sum' of consolidated\nvalues is appropriate.\n\nAlias for :func:`consolidateBy(series, 'sum') `\n\n.. code-block:: none\n\n &target=cumulative(Sales.widgets.largeBlue)", + "module": "graphite.render.functions", + "group": "Special", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + } + ] + }, + "fallbackSeries": { + "name": "fallbackSeries", + "function": "fallbackSeries(seriesList, fallback)", + "description": "Takes a wildcard seriesList, and a second fallback metric.\nIf the wildcard does not match any series, draws the fallback metric.\n\nExample:\n\n.. code-block:: none\n\n &target=fallbackSeries(server*.requests_per_second, constantLine(0))\n\nDraws a 0 line when server metric does not exist.", + "module": "graphite.render.functions", + "group": "Special", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "fallback", + "type": "seriesList", + "required": true + } + ] + }, + "identity": { + "name": "identity", + "function": "identity(name)", + "description": "Identity function:\nReturns datapoints where the value equals the timestamp of the datapoint.\nUseful when you have another series where the value is a timestamp, and\nyou want to compare it to the time of the datapoint, to render an age\n\nExample:\n\n.. code-block:: none\n\n &target=identity(\"The.time.series\")\n\nThis would create a series named \"The.time.series\" that contains points where\nx(t) == t.", + "module": "graphite.render.functions", + "group": "Calculate", + "params": [ + { + "name": "name", + "type": "string", + "required": true + } + ] + }, + "randomWalk": { + "name": "randomWalk", + "function": "randomWalk(name, step=60)", + "description": "Short Alias: randomWalk()\n\nReturns a random walk starting at 0. This is great for testing when there is\nno real data in whisper.\n\nExample:\n\n.. code-block:: none\n\n &target=randomWalk(\"The.time.series\")\n\nThis would create a series named \"The.time.series\" that contains points where\nx(t) == x(t-1)+random()-0.5, and x(0) == 0.\nAccepts optional second argument as 'step' parameter (default step is 60 sec)", + "module": "graphite.render.functions", + "group": "Special", + "params": [ + { + "name": "name", + "type": "string", + "required": true + }, + { + "name": "step", + "type": "integer", + "default": 60 + } + ] + }, + "randomWalkFunction": { + "name": "randomWalkFunction", + "function": "randomWalkFunction(name, step=60)", + "description": "Short Alias: randomWalk()\n\nReturns a random walk starting at 0. This is great for testing when there is\nno real data in whisper.\n\nExample:\n\n.. 
code-block:: none\n\n &target=randomWalk(\"The.time.series\")\n\nThis would create a series named \"The.time.series\" that contains points where\nx(t) == x(t-1)+random()-0.5, and x(0) == 0.\nAccepts optional second argument as 'step' parameter (default step is 60 sec)", + "module": "graphite.render.functions", + "group": "Special", + "params": [ + { + "name": "name", + "type": "string", + "required": true + }, + { + "name": "step", + "type": "integer", + "default": 60 + } + ] + }, + "setXFilesFactor": { + "name": "setXFilesFactor", + "function": "setXFilesFactor(seriesList, xFilesFactor)", + "description": "Short form: xFilesFactor()\n\nTakes one metric or a wildcard seriesList and an xFilesFactor value between 0 and 1\n\nWhen a series needs to be consolidated, this sets the fraction of values in an interval that must\nnot be null for the consolidation to be considered valid. If there are not enough values then\nNone will be returned for that interval.\n\n.. code-block:: none\n\n &target=xFilesFactor(Sales.widgets.largeBlue, 0.5)\n &target=Servers.web01.sda1.free_space|consolidateBy('max')|xFilesFactor(0.5)\n\nThe `xFilesFactor` set via this function is used as the default for all functions that accept an\n`xFilesFactor` parameter, all functions that aggregate data across multiple series and/or\nintervals, and `maxDataPoints `_ consolidation.\n\nA default for the entire render request can also be set using the\n`xFilesFactor `_ query parameter.\n\n.. note::\n\n `xFilesFactor` follows the same semantics as in Whisper storage schemas. Setting it to 0 (the\n default) means that only a single value in a given interval needs to be non-null, setting it to\n 1 means that all values in the interval must be non-null. A setting of 0.5 means that at least\n half the values in the interval must be non-null.", + "module": "graphite.render.functions", + "group": "Special", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "xFilesFactor", + "type": "float", + "required": true + } + ] + }, + "sin": { + "name": "sin", + "function": "sin(name, amplitude=1, step=60)", + "description": "Short Alias: sin()\n\nJust returns the sine of the current time. The optional amplitude parameter\nchanges the amplitude of the wave.\n\nExample:\n\n.. code-block:: none\n\n &target=sin(\"The.time.series\", 2)\n\nThis would create a series named \"The.time.series\" that contains sin(x)*2.\nAccepts optional second argument as 'amplitude' parameter (default amplitude is 1)\nAccepts optional third argument as 'step' parameter (default step is 60 sec)", + "module": "graphite.render.functions", + "group": "Transform", + "params": [ + { + "name": "name", + "type": "string", + "required": true + }, + { + "name": "amplitude", + "type": "integer", + "default": 1 + }, + { + "name": "step", + "type": "integer", + "default": 60 + } + ] + }, + "sinFunction": { + "name": "sinFunction", + "function": "sinFunction(name, amplitude=1, step=60)", + "description": "Short Alias: sin()\n\nJust returns the sine of the current time. The optional amplitude parameter\nchanges the amplitude of the wave.\n\nExample:\n\n.. 
code-block:: none\n\n &target=sin(\"The.time.series\", 2)\n\nThis would create a series named \"The.time.series\" that contains sin(x)*2.\nAccepts optional second argument as 'amplitude' parameter (default amplitude is 1)\nAccepts optional third argument as 'step' parameter (default step is 60 sec)", + "module": "graphite.render.functions", + "group": "Transform", + "params": [ + { + "name": "name", + "type": "string", + "required": true + }, + { + "name": "amplitude", + "type": "integer", + "default": 1 + }, + { + "name": "step", + "type": "integer", + "default": 60 + } + ] + }, + "seriesByTag": { + "name": "seriesByTag", + "function": "seriesByTag(*tagExpressions)", + "description": "Returns a SeriesList of series matching all the specified tag expressions.\n\nExample:\n\n.. code-block:: none\n\n &target=seriesByTag(\"tag1=value1\",\"tag2!=value2\")\n\nReturns a seriesList of all series that have tag1 set to value1, AND do not have tag2 set to value2.\n\nTag specifiers are strings, and may have the following formats:\n\n.. code-block:: none\n\n tag=spec tag value exactly matches spec\n tag!=spec tag value does not exactly match spec\n tag=~spec tag value matches the regular expression spec\n tag!=~spec tag value does not match the regular expression spec\n\nAny tag spec that matches an empty value is considered to match series that don't have that tag.\n\nAt least one tag spec must require a non-empty value.\n\nRegular expression conditions are treated as being anchored at the start of the value.\n\nSee :ref:`querying tagged series ` for more detail.", + "module": "graphite.render.functions", + "group": "Special", + "params": [ + { + "name": "tagExpressions", + "type": "string", + "required": true, + "multiple": true + } + ] + }, + "substr": { + "name": "substr", + "function": "substr(seriesList, start=0, stop=0)", + "description": "Takes one metric or a wildcard seriesList followed by 1 or 2 integers. Assume that the\nmetric name is a list or array, with each element separated by dots. Prints\nn - length elements of the array (if only one integer n is passed) or n - m\nelements of the array (if two integers n and m are passed). The list starts\nwith element 0 and ends with element (length - 1).\n\nExample:\n\n.. code-block:: none\n\n &target=substr(carbon.agents.hostname.avgUpdateTime,2,4)\n\nThe label would be printed as \"hostname.avgUpdateTime\".", + "module": "graphite.render.functions", + "group": "Special", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "start", + "type": "node", + "default": 0 + }, + { + "name": "stop", + "type": "node", + "default": 0 + } + ] + }, + "time": { + "name": "time", + "function": "time(name, step=60)", + "description": "Short Alias: time()\n\nJust returns the timestamp for each X value.\n\nExample:\n\n.. code-block:: none\n\n &target=time(\"The.time.series\")\n\nThis would create a series named \"The.time.series\" that contains in Y the same\nvalue (in seconds) as X.\nAccepts optional second argument as 'step' parameter (default step is 60 sec)", + "module": "graphite.render.functions", + "group": "Transform", + "params": [ + { + "name": "name", + "type": "string", + "required": true + }, + { + "name": "step", + "type": "integer", + "default": 60 + } + ] + }, + "timeFunction": { + "name": "timeFunction", + "function": "timeFunction(name, step=60)", + "description": "Short Alias: time()\n\nJust returns the timestamp for each X value.\n\nExample:\n\n..
code-block:: none\n\n &target=time(\"The.time.series\")\n\nThis would create a series named \"The.time.series\" that contains in Y the same\nvalue (in seconds) as X.\nAccepts optional second argument as 'step' parameter (default step is 60 sec)", + "module": "graphite.render.functions", + "group": "Transform", + "params": [ + { + "name": "name", + "type": "string", + "required": true + }, + { + "name": "step", + "type": "integer", + "default": 60 + } + ] + }, + "xFilesFactor": { + "name": "xFilesFactor", + "function": "xFilesFactor(seriesList, xFilesFactor)", + "description": "Short form: xFilesFactor()\n\nTakes one metric or a wildcard seriesList and an xFilesFactor value between 0 and 1\n\nWhen a series needs to be consolidated, this sets the fraction of values in an interval that must\nnot be null for the consolidation to be considered valid. If there are not enough values then\nNone will be returned for that interval.\n\n.. code-block:: none\n\n &target=xFilesFactor(Sales.widgets.largeBlue, 0.5)\n &target=Servers.web01.sda1.free_space|consolidateBy('max')|xFilesFactor(0.5)\n\nThe `xFilesFactor` set via this function is used as the default for all functions that accept an\n`xFilesFactor` parameter, all functions that aggregate data across multiple series and/or\nintervals, and `maxDataPoints `_ consolidation.\n\nA default for the entire render request can also be set using the\n`xFilesFactor `_ query parameter.\n\n.. note::\n\n `xFilesFactor` follows the same semantics as in Whisper storage schemas. Setting it to 0 (the\n default) means that only a single value in a given interval needs to be non-null, setting it to\n 1 means that all values in the interval must be non-null. A setting of 0.5 means that at least\n half the values in the interval must be non-null.", + "module": "graphite.render.functions", + "group": "Special", + "params": [ + { + "name": "seriesList", + "type": "seriesList", + "required": true + }, + { + "name": "xFilesFactor", + "type": "float", + "required": true + } + ] + } +} diff --git a/app/vmselect/graphite/functions_api.go b/app/vmselect/graphite/functions_api.go new file mode 100644 index 000000000..a8140b3e1 --- /dev/null +++ b/app/vmselect/graphite/functions_api.go @@ -0,0 +1,88 @@ +package graphite + +import ( + // embed functions.json file + _ "embed" + "encoding/json" + "fmt" + "net/http" + "time" + + "github.com/VictoriaMetrics/VictoriaMetrics/app/vmselect/searchutils" +) + +// FunctionsHandler implements /functions handler. +// +// See https://graphite.readthedocs.io/en/latest/functions.html#function-api +func FunctionsHandler(startTime time.Time, w http.ResponseWriter, r *http.Request) error { + grouped := searchutils.GetBool(r, "grouped") + group := r.FormValue("group") + result := make(map[string]interface{}) + for funcName, fi := range funcs { + if group != "" && fi.Group != group { + continue + } + if grouped { + v := result[fi.Group] + if v == nil { + v = make(map[string]*funcInfo) + result[fi.Group] = v + } + m := v.(map[string]*funcInfo) + m[funcName] = fi + } else { + result[funcName] = fi + } + } + return writeJSON(result, w, r) +} + +// FunctionDetailsHandler implements /functions/ handler. 
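+// For example, a request to /functions/aliasByNode is expected to return the JSON metadata for aliasByNode, while an unknown function name yields a "cannot find function" error (see the lookup below).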
+// +// See https://graphite.readthedocs.io/en/latest/functions.html#function-api +func FunctionDetailsHandler(startTime time.Time, funcName string, w http.ResponseWriter, r *http.Request) error { + result := funcs[funcName] + if result == nil { + return fmt.Errorf("cannot find function %q", funcName) + } + return writeJSON(result, w, r) +} + +func writeJSON(result interface{}, w http.ResponseWriter, r *http.Request) error { + data, err := json.Marshal(result) + if err != nil { + return fmt.Errorf("cannot marshal response to JSON: %w", err) + } + jsonp := r.FormValue("jsonp") + contentType := getContentType(jsonp) + w.Header().Set("Content-Type", contentType) + if jsonp != "" { + fmt.Fprintf(w, "%s(", jsonp) + } + w.Write(data) + if jsonp != "" { + fmt.Fprintf(w, ")") + } + return nil +} + +//go:embed functions.json +var funcsJSON []byte + +type funcInfo struct { + Name string `json:"name"` + Function string `json:"function"` + Description string `json:"description"` + Module string `json:"module"` + Group string `json:"group"` + Params json.RawMessage `json:"params"` +} + +var funcs = func() map[string]*funcInfo { + var m map[string]*funcInfo + if err := json.Unmarshal(funcsJSON, &m); err != nil { + // Do not use logger.Panicf, since it isn't ready yet. + panic(fmt.Errorf("cannot parse funcsJSON: %s", err)) + } + return m +}() diff --git a/app/vmselect/graphite/natural_compare.go b/app/vmselect/graphite/natural_compare.go new file mode 100644 index 000000000..7d25e1502 --- /dev/null +++ b/app/vmselect/graphite/natural_compare.go @@ -0,0 +1,48 @@ +package graphite + +import ( + "strconv" +) + +func naturalLess(a, b string) bool { + for { + var aPrefix, bPrefix string + aPrefix, a = getNonNumPrefix(a) + bPrefix, b = getNonNumPrefix(b) + if aPrefix != bPrefix { + return aPrefix < bPrefix + } + if len(a) == 0 || len(b) == 0 { + return a < b + } + var aNum, bNum int + aNum, a = getNumPrefix(a) + bNum, b = getNumPrefix(b) + if aNum != bNum { + return aNum < bNum + } + } +} + +func getNonNumPrefix(s string) (prefix string, tail string) { + for i := 0; i < len(s); i++ { + ch := s[i] + if ch >= '0' && ch <= '9' { + return s[:i], s[i:] + } + } + return s, "" +} + +func getNumPrefix(s string) (prefix int, tail string) { + i := 0 + for i < len(s) { + ch := s[i] + if ch < '0' || ch > '9' { + break + } + i++ + } + prefix, _ = strconv.Atoi(s[:i]) + return prefix, s[i:] +} diff --git a/app/vmselect/graphite/natural_compare_test.go b/app/vmselect/graphite/natural_compare_test.go new file mode 100644 index 000000000..f11bbbfcc --- /dev/null +++ b/app/vmselect/graphite/natural_compare_test.go @@ -0,0 +1,29 @@ +package graphite + +import ( + "testing" +) + +func TestNaturalLess(t *testing.T) { + f := func(a, b string, okExpected bool) { + t.Helper() + ok := naturalLess(a, b) + if ok != okExpected { + t.Fatalf("unexpected result for naturalLess(%q, %q); got %v; want %v", a, b, ok, okExpected) + } + } + f("", "", false) + f("a", "b", true) + f("", "foo", true) + f("foo", "", false) + f("foo", "foo", false) + f("b", "a", false) + f("1", "2", true) + f("10", "2", false) + f("foo100", "foo12", false) + f("foo12", "foo100", true) + f("10foo2", "10foo10", true) + f("10foo10", "10foo2", false) + f("foo1bar10", "foo1bar2aa", false) + f("foo1bar2aa", "foo1bar10aa", true) +} diff --git a/app/vmselect/graphite/render_api.go b/app/vmselect/graphite/render_api.go new file mode 100644 index 000000000..f5c576c36 --- /dev/null +++ b/app/vmselect/graphite/render_api.go @@ -0,0 +1,273 @@ +package graphite + +import ( + "flag" 
+ "fmt" + "net/http" + "strconv" + "strings" + "time" + + "github.com/VictoriaMetrics/VictoriaMetrics/app/vmselect/bufferedwriter" + "github.com/VictoriaMetrics/VictoriaMetrics/app/vmselect/searchutils" + "github.com/VictoriaMetrics/metrics" +) + +var ( + storageStep = flag.Duration("search.graphiteStorageStep", 10*time.Second, "The interval between datapoints stored in the database. "+ + "It is used at Graphite Render API handler for normalizing the interval between datapoints in case it isn't normalized. "+ + "It can be overridden by sending 'storage_step' query arg to /render API or "+ + "by sending the desired interval via 'Storage-Step' http header during querying /render API") + maxPointsPerSeries = flag.Int("search.graphiteMaxPointsPerSeries", 1e6, "The maximum number of points per series Graphite render API can return") +) + +// RenderHandler implements /render endpoint from Graphite Render API. +// +// See https://graphite.readthedocs.io/en/stable/render_api.html +func RenderHandler(startTime time.Time, w http.ResponseWriter, r *http.Request) error { + deadline := searchutils.GetDeadlineForQuery(r, startTime) + format := r.FormValue("format") + if format != "json" { + return fmt.Errorf("unsupported format=%q; supported values: json", format) + } + xFilesFactor := float64(0) + if xff := r.FormValue("xFilesFactor"); len(xff) > 0 { + f, err := strconv.ParseFloat(xff, 64) + if err != nil { + return fmt.Errorf("cannot parse xFilesFactor=%q: %w", xff, err) + } + xFilesFactor = f + } + from := r.FormValue("from") + fromTime := startTime.UnixNano()/1e6 - 24*3600*1000 + if len(from) != 0 { + fv, err := parseTime(startTime, from) + if err != nil { + return fmt.Errorf("cannot parse from=%q: %w", from, err) + } + fromTime = fv + } + until := r.FormValue("until") + untilTime := startTime.UnixNano() / 1e6 + if len(until) != 0 { + uv, err := parseTime(startTime, until) + if err != nil { + return fmt.Errorf("cannot parse until=%q: %w", until, err) + } + untilTime = uv + } + storageStep, err := getStorageStep(r) + if err != nil { + return err + } + fromAlign := fromTime % storageStep + fromTime -= fromAlign + if fromAlign > 0 { + fromTime += storageStep + } + untilAlign := untilTime % storageStep + untilTime -= untilAlign + if untilAlign > 0 { + untilTime += storageStep + } + if untilTime < fromTime { + return fmt.Errorf("from=%s cannot exceed until=%s", from, until) + } + pointsPerSeries := (untilTime - fromTime) / storageStep + if pointsPerSeries > int64(*maxPointsPerSeries) { + return fmt.Errorf("too many points per series must be returned on the given [from=%s ... 
until=%s] time range and the given storageStep=%d: %d; "+ + "either reduce the time range or increase -search.graphiteMaxPointsPerSeries=%d", from, until, storageStep, pointsPerSeries, *maxPointsPerSeries) + } + maxDataPoints := 0 + if s := r.FormValue("maxDataPoints"); len(s) > 0 { + n, err := strconv.ParseFloat(s, 64) + if err != nil { + return fmt.Errorf("cannot parse maxDataPoints=%q: %w", s, err) + } + if n <= 0 { + return fmt.Errorf("maxDataPoints must be greater than 0; got %f", n) + } + maxDataPoints = int(n) + } + etfs, err := searchutils.GetExtraTagFilters(r) + if err != nil { + return fmt.Errorf("cannot setup tag filters: %w", err) + } + var nextSeriess []nextSeriesFunc + targets := r.Form["target"] + for _, target := range targets { + ec := &evalConfig{ + startTime: fromTime, + endTime: untilTime, + storageStep: storageStep, + deadline: deadline, + currentTime: startTime, + xFilesFactor: xFilesFactor, + etfs: etfs, + originalQuery: target, + } + nextSeries, err := execExpr(ec, target) + if err != nil { + for _, f := range nextSeriess { + _, _ = drainAllSeries(f) + } + return fmt.Errorf("cannot eval target=%q: %w", target, err) + } + // do not use nextSeriesConcurrentWrapper here in order to preserve series order. + if maxDataPoints > 0 { + step := (ec.endTime - ec.startTime) / int64(maxDataPoints) + nextSeries = nextSeriesSerialWrapper(nextSeries, func(s *series) (*series, error) { + aggrFunc := s.consolidateFunc + if aggrFunc == nil { + aggrFunc = aggrAvg + } + xFilesFactor := s.xFilesFactor + if s.xFilesFactor <= 0 { + xFilesFactor = ec.xFilesFactor + } + if len(s.Values) > maxDataPoints { + s.summarize(aggrFunc, ec.startTime, ec.endTime, step, xFilesFactor) + } + return s, nil + }) + } + nextSeriess = append(nextSeriess, nextSeries) + } + f := nextSeriesGroup(nextSeriess, nil) + jsonp := r.FormValue("jsonp") + contentType := getContentType(jsonp) + w.Header().Set("Content-Type", contentType) + bw := bufferedwriter.Get(w) + defer bufferedwriter.Put(bw) + WriteRenderJSONResponse(bw, f, jsonp) + if err := bw.Flush(); err != nil { + return err + } + renderDuration.UpdateDuration(startTime) + return nil +} + +var renderDuration = metrics.NewSummary(`vm_request_duration_seconds{path="/render"}`) + +const msecsPerDay = 24 * 3600 * 1000 + +// parseTime parses Graphite time in s. +// +// If the time in s is relative, then it is relative to startTime.
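+// +// For example (derived from the formats handled below): "now" resolves to startTime itself, "-3h" resolves to startTime minus three hours, while absolute forms such as "16:00_20110501", "2021-02-23" or a raw unix timestamp are independent of startTime. The result is a unix timestamp in milliseconds.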
+func parseTime(startTime time.Time, s string) (int64, error) { + switch s { + case "now": + return startTime.UnixNano() / 1e6, nil + case "today": + ts := startTime.UnixNano() / 1e6 + return ts - ts%msecsPerDay, nil + case "yesterday": + ts := startTime.UnixNano() / 1e6 + return ts - (ts % msecsPerDay) - msecsPerDay, nil + } + // Attempt to parse RFC3339 (YYYY-MM-DDTHH:mm:SS with an optional timezone offset) + if t, err := time.Parse(time.RFC3339, s); err == nil { + return t.UnixNano() / 1e6, nil + } + // Attempt to parse HH:MM_YYYYMMDD + if t, err := time.Parse("15:04_20060102", s); err == nil { + return t.UnixNano() / 1e6, nil + } + // Attempt to parse HH:MMYYYYMMDD + if t, err := time.Parse("15:0420060102", s); err == nil { + return t.UnixNano() / 1e6, nil + } + // Attempt to parse YYYYMMDD + if t, err := time.Parse("20060102", s); err == nil { + return t.UnixNano() / 1e6, nil + } + // Attempt to parse HH:MM YYYYMMDD + if t, err := time.Parse("15:04 20060102", s); err == nil { + return t.UnixNano() / 1e6, nil + } + // Attempt to parse YYYY-MM-DD + if t, err := time.Parse("2006-01-02", s); err == nil { + return t.UnixNano() / 1e6, nil + } + // Attempt to parse MM/DD/YY + if t, err := time.Parse("01/02/06", s); err == nil { + return t.UnixNano() / 1e6, nil + } + + // Attempt to parse time as unix timestamp + if n, err := strconv.ParseInt(s, 10, 64); err == nil { + return n * 1000, nil + } + // Attempt to parse interval + if interval, err := parseInterval(s); err == nil { + return startTime.UnixNano()/1e6 + interval, nil + } + return 0, fmt.Errorf("unsupported time %q", s) +} + +func parseInterval(s string) (int64, error) { + s = strings.TrimSpace(s) + prefix := s + var suffix string + for i := 0; i < len(s); i++ { + ch := s[i] + if ch != '-' && ch != '+' && ch != '.' && (ch < '0' || ch > '9') { + prefix = s[:i] + suffix = s[i:] + break + } + } + n, err := strconv.ParseFloat(prefix, 64) + if err != nil { + return 0, fmt.Errorf("cannot parse interval %q: %w", s, err) + } + suffix = strings.TrimSpace(suffix) + if len(suffix) == 0 { + return 0, fmt.Errorf("missing suffix for interval %q; expecting s, min, h, d, w, mon or y suffix", s) + } + var m float64 + switch { + case strings.HasPrefix(suffix, "ms"): + m = 1 + case strings.HasPrefix(suffix, "s"): + m = 1000 + case strings.HasPrefix(suffix, "mi"), + strings.HasPrefix(suffix, "m") && !strings.HasPrefix(suffix, "mo"): + m = 60 * 1000 + case strings.HasPrefix(suffix, "h"): + m = 3600 * 1000 + case strings.HasPrefix(suffix, "d"): + m = 24 * 3600 * 1000 + case strings.HasPrefix(suffix, "w"): + m = 7 * 24 * 3600 * 1000 + case strings.HasPrefix(suffix, "mo"): + m = 30 * 24 * 3600 * 1000 + case strings.HasPrefix(suffix, "y"): + m = 365 * 24 * 3600 * 1000 + default: + return 0, fmt.Errorf("unsupported interval %q", s) + } + return int64(n * m), nil +} + +func getStorageStep(r *http.Request) (int64, error) { + s := r.FormValue("storage_step") + if len(s) == 0 { + s = r.Header.Get("Storage-Step") + } + if len(s) == 0 { + step := int64(storageStep.Seconds() * 1000) + if step <= 0 { + return 0, fmt.Errorf("the `-search.graphiteStorageStep` command-line flag value must be positive; got %s", storageStep.String()) + } + return step, nil + } + step, err := parseInterval(s) + if err != nil { + return 0, fmt.Errorf("cannot parse datapoints interval %s: %w", s, err) + } + if step <= 0 { + return 0, fmt.Errorf("storage_step must be positive; got %s", s) + } + return step, nil +} diff --git a/app/vmselect/graphite/render_api_test.go b/app/vmselect/graphite/render_api_test.go new file
mode 100644 index 000000000..7e9cd8685 --- /dev/null +++ b/app/vmselect/graphite/render_api_test.go @@ -0,0 +1,103 @@ +package graphite + +import ( + "testing" + "time" +) + +func TestParseIntervalSuccess(t *testing.T) { + f := func(s string, intervalExpected int64) { + t.Helper() + interval, err := parseInterval(s) + if err != nil { + t.Fatalf("unexpected error in parseInterval(%q): %s", s, err) + } + if interval != intervalExpected { + t.Fatalf("unexpected result for parseInterval(%q); got %d; want %d", s, interval, intervalExpected) + } + } + f(`1ms`, 1) + f(`-10.5ms`, -10) + f(`+5.5s`, 5500) + f(`7.85s`, 7850) + f(`-7.85sec`, -7850) + f(`-7.85secs`, -7850) + f(`5seconds`, 5000) + f(`10min`, 10*60*1000) + f(`10 mins`, 10*60*1000) + f(` 10 mins `, 10*60*1000) + f(`10m`, 10*60*1000) + f(`-10.5min`, -10.5*60*1000) + f(`-10.5m`, -10.5*60*1000) + f(`3minutes`, 3*60*1000) + f(`3h`, 3*3600*1000) + f(`-4.5hour`, -4.5*3600*1000) + f(`7hours`, 7*3600*1000) + f(`5d`, 5*24*3600*1000) + f(`-3.5days`, -3.5*24*3600*1000) + f(`0.5w`, 0.5*7*24*3600*1000) + f(`10weeks`, 10*7*24*3600*1000) + f(`2months`, 2*30*24*3600*1000) + f(`2mo`, 2*30*24*3600*1000) + f(`1.2y`, 1.2*365*24*3600*1000) + f(`-3years`, -3*365*24*3600*1000) +} + +func TestParseIntervalError(t *testing.T) { + f := func(s string) { + t.Helper() + interval, err := parseInterval(s) + if err == nil { + t.Fatalf("expecting non-nil error for parseInterval(%q)", s) + } + if interval != 0 { + t.Fatalf("unexpected non-zero interval for parseInterval(%q): %d", s, interval) + } + } + f("") + f("foo") + f(`'1minute'`) + f(`123`) +} + +func TestParseTimeSuccess(t *testing.T) { + startTime := time.Now() + startTimestamp := startTime.UnixNano() / 1e6 + f := func(s string, timestampExpected int64) { + t.Helper() + timestamp, err := parseTime(startTime, s) + if err != nil { + t.Fatalf("unexpected error from parseTime(%q): %s", s, err) + } + if timestamp != timestampExpected { + t.Fatalf("unexpected timestamp returned from parseTime(%q); got %d; want %d", s, timestamp, timestampExpected) + } + } + f("now", startTimestamp) + f("today", startTimestamp-startTimestamp%msecsPerDay) + f("yesterday", startTimestamp-(startTimestamp%msecsPerDay)-msecsPerDay) + f("1234567890", 1234567890000) + f("18:36_20210223", 1614105360000) + f("20210223", 1614038400000) + f("02/23/21", 1614038400000) + f("2021-02-23", 1614038400000) + f("2021-02-23T18:36:12Z", 1614105372000) + f("-3hours", startTimestamp-3*3600*1000) + f("1.5minutes", startTimestamp+1.5*60*1000) +} + +func TestParseTimeFailure(t *testing.T) { + f := func(s string) { + t.Helper() + timestamp, err := parseTime(time.Now(), s) + if err == nil { + t.Fatalf("expecting non-nil error for parseTime(%q)", s) + } + if timestamp != 0 { + t.Fatalf("expecting zero value for parseTime(%q); got %d", s, timestamp) + } + } + f("") + f("foobar") + f("1235aafb") +} diff --git a/app/vmselect/graphite/render_response.qtpl b/app/vmselect/graphite/render_response.qtpl new file mode 100644 index 000000000..308a378ad --- /dev/null +++ b/app/vmselect/graphite/render_response.qtpl @@ -0,0 +1,59 @@ +{% stripspace %} + +{% import ( + "math" + "sort" +) %} + +RenderJSONResponse generates response for /render?format=json .
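+For a single series the emitted JSON has the shape (illustrative example, derived from the template below): [{"target":"foo","tags":{"name":"foo"},"datapoints":[[1.5,1614105360],[null,1614105420]]}], where null encodes NaN values and timestamps are in seconds.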
+See https://graphite.readthedocs.io/en/stable/render_api.html#json +{% func RenderJSONResponse(nextSeries nextSeriesFunc, jsonp string) %} + {% if jsonp != "" %}{%s= jsonp %}({% endif %} + {% code ss, err := fetchAllSeries(nextSeries) %} + {% if err != nil %} + { + "error": {%q= err.Error() %} + } + {% return %} + {% endif %} + {% code sort.Slice(ss, func(i, j int) bool { return ss[i].Name < ss[j].Name }) %} + [ + {% for i, s := range ss %} + {%= renderSeriesJSON(s) %} + {% if i+1 < len(ss) %},{% endif %} + {% endfor %} + ] + {% if jsonp != "" %}){% endif %} +{% endfunc %} + +{% func renderSeriesJSON(s *series) %} + { + "target": {%q= s.Name %}, + "tags":{ + {% code + tagKeys := make([]string, 0, len(s.Tags)) + for k := range s.Tags { + tagKeys = append(tagKeys, k) + } + sort.Strings(tagKeys) + %} + {% for i, k := range tagKeys %} + {% code v := s.Tags[k] %} + {%q= k %}: {%q= v %} + {% if i+1 < len(tagKeys) %},{% endif %} + {% endfor %} + }, + "datapoints":[ + {% code timestamps := s.Timestamps %} + {% for i, v := range s.Values %} + [ + {% if math.IsNaN(v) %}null{% else %}{%f= v %}{% endif %}, + {%dl= timestamps[i]/1e3 %} + ] + {% if i+1 < len(timestamps) %},{% endif %} + {% endfor %} + ] + } +{% endfunc %} + +{% endstripspace %} diff --git a/app/vmselect/graphite/render_response.qtpl.go b/app/vmselect/graphite/render_response.qtpl.go new file mode 100644 index 000000000..213403fb1 --- /dev/null +++ b/app/vmselect/graphite/render_response.qtpl.go @@ -0,0 +1,203 @@ +// Code generated by qtc from "render_response.qtpl". DO NOT EDIT. +// See https://github.com/valyala/quicktemplate for details. + +//line app/vmselect/graphite/render_response.qtpl:3 +package graphite + +//line app/vmselect/graphite/render_response.qtpl:3 +import ( + "math" + "sort" +) + +// RenderJSONResponse generates response for /render?format=json .See https://graphite.readthedocs.io/en/stable/render_api.html#json + +//line app/vmselect/graphite/render_response.qtpl:10 +import ( + qtio422016 "io" + + qt422016 "github.com/valyala/quicktemplate" +) + +//line app/vmselect/graphite/render_response.qtpl:10 +var ( + _ = qtio422016.Copy + _ = qt422016.AcquireByteBuffer +) + +//line app/vmselect/graphite/render_response.qtpl:10 +func StreamRenderJSONResponse(qw422016 *qt422016.Writer, nextSeries nextSeriesFunc, jsonp string) { +//line app/vmselect/graphite/render_response.qtpl:11 + if jsonp != "" { +//line app/vmselect/graphite/render_response.qtpl:11 + qw422016.N().S(jsonp) +//line app/vmselect/graphite/render_response.qtpl:11 + qw422016.N().S(`(`) +//line app/vmselect/graphite/render_response.qtpl:11 + } +//line app/vmselect/graphite/render_response.qtpl:12 + ss, err := fetchAllSeries(nextSeries) + +//line app/vmselect/graphite/render_response.qtpl:13 + if err != nil { +//line app/vmselect/graphite/render_response.qtpl:13 + qw422016.N().S(`{"error":`) +//line app/vmselect/graphite/render_response.qtpl:15 + qw422016.N().Q(err.Error()) +//line app/vmselect/graphite/render_response.qtpl:15 + qw422016.N().S(`}`) +//line app/vmselect/graphite/render_response.qtpl:17 + return +//line app/vmselect/graphite/render_response.qtpl:18 + } +//line app/vmselect/graphite/render_response.qtpl:19 + sort.Slice(ss, func(i, j int) bool { return ss[i].Name < ss[j].Name }) + +//line app/vmselect/graphite/render_response.qtpl:19 + qw422016.N().S(`[`) +//line app/vmselect/graphite/render_response.qtpl:21 + for i, s := range ss { +//line app/vmselect/graphite/render_response.qtpl:22 + streamrenderSeriesJSON(qw422016, s) +//line 
app/vmselect/graphite/render_response.qtpl:23 + if i+1 < len(ss) { +//line app/vmselect/graphite/render_response.qtpl:23 + qw422016.N().S(`,`) +//line app/vmselect/graphite/render_response.qtpl:23 + } +//line app/vmselect/graphite/render_response.qtpl:24 + } +//line app/vmselect/graphite/render_response.qtpl:24 + qw422016.N().S(`]`) +//line app/vmselect/graphite/render_response.qtpl:26 + if jsonp != "" { +//line app/vmselect/graphite/render_response.qtpl:26 + qw422016.N().S(`)`) +//line app/vmselect/graphite/render_response.qtpl:26 + } +//line app/vmselect/graphite/render_response.qtpl:27 +} + +//line app/vmselect/graphite/render_response.qtpl:27 +func WriteRenderJSONResponse(qq422016 qtio422016.Writer, nextSeries nextSeriesFunc, jsonp string) { +//line app/vmselect/graphite/render_response.qtpl:27 + qw422016 := qt422016.AcquireWriter(qq422016) +//line app/vmselect/graphite/render_response.qtpl:27 + StreamRenderJSONResponse(qw422016, nextSeries, jsonp) +//line app/vmselect/graphite/render_response.qtpl:27 + qt422016.ReleaseWriter(qw422016) +//line app/vmselect/graphite/render_response.qtpl:27 +} + +//line app/vmselect/graphite/render_response.qtpl:27 +func RenderJSONResponse(nextSeries nextSeriesFunc, jsonp string) string { +//line app/vmselect/graphite/render_response.qtpl:27 + qb422016 := qt422016.AcquireByteBuffer() +//line app/vmselect/graphite/render_response.qtpl:27 + WriteRenderJSONResponse(qb422016, nextSeries, jsonp) +//line app/vmselect/graphite/render_response.qtpl:27 + qs422016 := string(qb422016.B) +//line app/vmselect/graphite/render_response.qtpl:27 + qt422016.ReleaseByteBuffer(qb422016) +//line app/vmselect/graphite/render_response.qtpl:27 + return qs422016 +//line app/vmselect/graphite/render_response.qtpl:27 +} + +//line app/vmselect/graphite/render_response.qtpl:29 +func streamrenderSeriesJSON(qw422016 *qt422016.Writer, s *series) { +//line app/vmselect/graphite/render_response.qtpl:29 + qw422016.N().S(`{"target":`) +//line app/vmselect/graphite/render_response.qtpl:31 + qw422016.N().Q(s.Name) +//line app/vmselect/graphite/render_response.qtpl:31 + qw422016.N().S(`,"tags":{`) +//line app/vmselect/graphite/render_response.qtpl:34 + tagKeys := make([]string, 0, len(s.Tags)) + for k := range s.Tags { + tagKeys = append(tagKeys, k) + } + sort.Strings(tagKeys) + +//line app/vmselect/graphite/render_response.qtpl:40 + for i, k := range tagKeys { +//line app/vmselect/graphite/render_response.qtpl:41 + v := s.Tags[k] + +//line app/vmselect/graphite/render_response.qtpl:42 + qw422016.N().Q(k) +//line app/vmselect/graphite/render_response.qtpl:42 + qw422016.N().S(`:`) +//line app/vmselect/graphite/render_response.qtpl:42 + qw422016.N().Q(v) +//line app/vmselect/graphite/render_response.qtpl:43 + if i+1 < len(tagKeys) { +//line app/vmselect/graphite/render_response.qtpl:43 + qw422016.N().S(`,`) +//line app/vmselect/graphite/render_response.qtpl:43 + } +//line app/vmselect/graphite/render_response.qtpl:44 + } +//line app/vmselect/graphite/render_response.qtpl:44 + qw422016.N().S(`},"datapoints":[`) +//line app/vmselect/graphite/render_response.qtpl:47 + timestamps := s.Timestamps + +//line app/vmselect/graphite/render_response.qtpl:48 + for i, v := range s.Values { +//line app/vmselect/graphite/render_response.qtpl:48 + qw422016.N().S(`[`) +//line app/vmselect/graphite/render_response.qtpl:50 + if math.IsNaN(v) { +//line app/vmselect/graphite/render_response.qtpl:50 + qw422016.N().S(`null`) +//line app/vmselect/graphite/render_response.qtpl:50 + } else { +//line 
app/vmselect/graphite/render_response.qtpl:50 + qw422016.N().F(v) +//line app/vmselect/graphite/render_response.qtpl:50 + } +//line app/vmselect/graphite/render_response.qtpl:50 + qw422016.N().S(`,`) +//line app/vmselect/graphite/render_response.qtpl:51 + qw422016.N().DL(timestamps[i] / 1e3) +//line app/vmselect/graphite/render_response.qtpl:51 + qw422016.N().S(`]`) +//line app/vmselect/graphite/render_response.qtpl:53 + if i+1 < len(timestamps) { +//line app/vmselect/graphite/render_response.qtpl:53 + qw422016.N().S(`,`) +//line app/vmselect/graphite/render_response.qtpl:53 + } +//line app/vmselect/graphite/render_response.qtpl:54 + } +//line app/vmselect/graphite/render_response.qtpl:54 + qw422016.N().S(`]}`) +//line app/vmselect/graphite/render_response.qtpl:57 +} + +//line app/vmselect/graphite/render_response.qtpl:57 +func writerenderSeriesJSON(qq422016 qtio422016.Writer, s *series) { +//line app/vmselect/graphite/render_response.qtpl:57 + qw422016 := qt422016.AcquireWriter(qq422016) +//line app/vmselect/graphite/render_response.qtpl:57 + streamrenderSeriesJSON(qw422016, s) +//line app/vmselect/graphite/render_response.qtpl:57 + qt422016.ReleaseWriter(qw422016) +//line app/vmselect/graphite/render_response.qtpl:57 +} + +//line app/vmselect/graphite/render_response.qtpl:57 +func renderSeriesJSON(s *series) string { +//line app/vmselect/graphite/render_response.qtpl:57 + qb422016 := qt422016.AcquireByteBuffer() +//line app/vmselect/graphite/render_response.qtpl:57 + writerenderSeriesJSON(qb422016, s) +//line app/vmselect/graphite/render_response.qtpl:57 + qs422016 := string(qb422016.B) +//line app/vmselect/graphite/render_response.qtpl:57 + qt422016.ReleaseByteBuffer(qb422016) +//line app/vmselect/graphite/render_response.qtpl:57 + return qs422016 +//line app/vmselect/graphite/render_response.qtpl:57 +} diff --git a/app/vmselect/graphite/transform.go b/app/vmselect/graphite/transform.go new file mode 100644 index 000000000..480abcc26 --- /dev/null +++ b/app/vmselect/graphite/transform.go @@ -0,0 +1,5605 @@ +package graphite + +import ( + "container/heap" + "fmt" + "math" + "math/rand" + "regexp" + "sort" + "strconv" + "strings" + "sync" + "sync/atomic" + "time" + + "github.com/VictoriaMetrics/VictoriaMetrics/app/vmselect/graphiteql" + "github.com/VictoriaMetrics/VictoriaMetrics/lib/cgroup" +) + +// nextSeriesFunc must return the next series to process. +// +// nextSeriesFunc must release all the occupied resources before returning non-nil error. +// drainAllSeries can be used for releasing occupied resources. +// +// When there are no more series to return, (nil, nil) must be returned. 
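+//
+// A typical consumer loop looks like this (illustrative sketch; process() is a hypothetical callback):
+//
+//	for {
+//		s, err := nextSeries()
+//		if err != nil {
+//			return err // the occupied resources are already released at this point
+//		}
+//		if s == nil {
+//			break // all series have been processed
+//		}
+//		process(s)
+//	}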
+type nextSeriesFunc func() (*series, error) + +type transformFunc func(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) + +var transformFuncs = map[string]transformFunc{} + +func init() { + // A workaround for https://github.com/golang/go/issues/43741 + transformFuncs = map[string]transformFunc{ + "absolute": transformAbsolute, + "add": transformAdd, + "aggregate": transformAggregate, + "aggregateLine": transformAggregateLine, + "aggregateWithWildcards": transformAggregateWithWildcards, + "alias": transformAlias, + "aliasByMetric": transformAliasByMetric, + "aliasByNode": transformAliasByNode, + "aliasByTags": transformAliasByNode, + "aliasQuery": transformAliasQuery, + "aliasSub": transformAliasSub, + "alpha": transformAlpha, + "applyByNode": transformApplyByNode, + "areaBetween": transformAreaBetween, + "asPercent": transformAsPercent, + "averageAbove": transformAverageAbove, + "averageBelow": transformAverageBelow, + "averageOutsidePercentile": transformAverageOutsidePercentile, + "averageSeries": transformAverageSeries, + "averageSeriesWithWildcards": transformAverageSeriesWithWildcards, + "avg": transformAverageSeries, + "cactiStyle": transformTODO, + "changed": transformChanged, + "color": transformColor, + "consolidateBy": transformConsolidateBy, + "constantLine": transformConstantLine, + "countSeries": transformCountSeries, + "cumulative": transformCumulative, + "currentAbove": transformCurrentAbove, + "currentBelow": transformCurrentBelow, + "dashed": transformDashed, + "delay": transformDelay, + "derivative": transformDerivative, + "diffSeries": transformDiffSeries, + "divideSeries": transformDivideSeries, + "divideSeriesLists": transformDivideSeriesLists, + "drawAsInfinite": transformDrawAsInfinite, + "events": transformEvents, + "exclude": transformExclude, + "exp": transformExp, + "exponentialMovingAverage": transformExponentialMovingAverage, + "fallbackSeries": transformFallbackSeries, + "filterSeries": transformFilterSeries, + "grep": transformGrep, + "group": transformGroup, + "groupByNode": transformGroupByNode, + "groupByNodes": transformGroupByNodes, + "groupByTags": transformGroupByTags, + "highest": transformHighest, + "highestAverage": transformHighestAverage, + "highestCurrent": transformHighestCurrent, + "highestMax": transformHighestMax, + "hitcount": transformHitcount, + "holtWintersAberration": transformHoltWintersAberration, + "holtWintersConfidenceArea": transformHoltWintersConfidenceArea, + "holtWintersConfidenceBands": transformHoltWintersConfidenceBands, + "holtWintersForecast": transformHoltWintersForecast, + "identity": transformIdentity, + "integral": transformIntegral, + "integralByInterval": transformIntegralByInterval, + "interpolate": transformInterpolate, + "invert": transformInvert, + "isNonNull": transformIsNonNull, + "keepLastValue": transformKeepLastValue, + "legendValue": transformTODO, + "limit": transformLimit, + "lineWidth": transformLineWidth, + "linearRegression": transformLinearRegression, + "log": transformLogarithm, + "logarithm": transformLogarithm, + "logit": transformLogit, + "lowest": transformLowest, + "lowestAverage": transformLowestAverage, + "lowestCurrent": transformLowestCurrent, + "map": transformTODO, + "mapSeries": transformTODO, + "max": transformMaxSeries, + "maxSeries": transformMaxSeries, + "maximumAbove": transformMaximumAbove, + "maximumBelow": transformMaximumBelow, + "minMax": transformMinMax, + "min": transformMinSeries, + "minSeries": transformMinSeries, + "minimumAbove": transformMinimumAbove, + 
"minimumBelow": transformMinimumBelow, + "mostDeviant": transformMostDeviant, + "movingAverage": transformMovingAverage, + "movingMax": transformMovingMax, + "movingMedian": transformMovingMedian, + "movingMin": transformMovingMin, + "movingSum": transformMovingSum, + "movingWindow": transformMovingWindow, + "multiplySeries": transformMultiplySeries, + "multiplySeriesWithWildcards": transformMultiplySeriesWithWildcards, + "nPercentile": transformNPercentile, + "nonNegativeDerivative": transformNonNegativeDerivative, + "offset": transformOffset, + "offsetToZero": transformOffsetToZero, + "perSecond": transformPerSecond, + "percentileOfSeries": transformPercentileOfSeries, + // It looks like pie* functions aren't needed for Graphite render API + // "pieAverage": transformTODO, + // "pieMaximum": transformTODO, + // "pieMinimum": transformTODO, + "pow": transformPow, + "powSeries": transformPowSeries, + "randomWalk": transformRandomWalk, + "randomWalkFunction": transformRandomWalk, + "rangeOfSeries": transformRangeOfSeries, + "reduce": transformTODO, + "reduceSeries": transformTODO, + "removeAbovePercentile": transformRemoveAbovePercentile, + "removeAboveValue": transformRemoveAboveValue, + "removeBelowPercentile": transformRemoveBelowPercentile, + "removeBelowValue": transformRemoveBelowValue, + "removeBetweenPercentile": transformRemoveBetweenPercentile, + "removeEmptySeries": transformRemoveEmptySeries, + "round": transformRoundFunction, + "roundFunction": transformRoundFunction, + "scale": transformScale, + "scaleToSeconds": transformScaleToSeconds, + "secondYAxis": transformSecondYAxis, + "seriesByTag": transformSeriesByTag, + "setXFilesFactor": transformSetXFilesFactor, + "sigmoid": transformSigmoid, + "sin": transformSinFunction, + "sinFunction": transformSinFunction, + "smartSummarize": transformSmartSummarize, + "sortBy": transformSortBy, + "sortByMaxima": transformSortByMaxima, + "sortByMinima": transformSortByMinima, + "sortByName": transformSortByName, + "sortByTotal": transformSortByTotal, + "squareRoot": transformSquareRoot, + "stacked": transformStacked, + "stddevSeries": transformStddevSeries, + "stdev": transformStdev, + "substr": transformSubstr, + "sum": transformSumSeries, + "sumSeries": transformSumSeries, + "sumSeriesWithWildcards": transformSumSeriesWithWildcards, + "summarize": transformSummarize, + "threshold": transformThreshold, + "time": transformTimeFunction, + "timeFunction": transformTimeFunction, + "timeShift": transformTimeShift, + "timeSlice": transformTimeSlice, + "timeStack": transformTimeStack, + "transformNull": transformTransformNull, + "unique": transformUnique, + "useSeriesAbove": transformUseSeriesAbove, + "verticalLine": transformVerticalLine, + "weightedAverage": transformWeightedAverage, + "xFilesFactor": transformSetXFilesFactor, + } +} + +func transformTODO(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + return nil, fmt.Errorf("TODO: implement this function") +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.absolute +func transformAbsolute(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 1 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 1", len(args)) + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + f := nextSeriesSerialWrapper(nextSeries, func(s *series) (*series, error) { + values := s.Values + for i, v := range values { + values[i] = math.Abs(v) 
+ } + s.Name = fmt.Sprintf("absolute(%s)", s.Name) + s.Tags["absolute"] = "1" + s.expr = fe + s.pathExpression = s.Name + return s, nil + }) + return f, nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.add +func transformAdd(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 2 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 2", len(args)) + } + n, err := getNumber(args, "constant", 1) + if err != nil { + return nil, err + } + nString := fmt.Sprintf("%g", n) + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + f := nextSeriesConcurrentWrapper(nextSeries, func(s *series) (*series, error) { + values := s.Values + for i := range values { + values[i] += n + } + s.Tags["add"] = nString + s.Name = fmt.Sprintf("add(%s,%s)", s.Name, nString) + s.expr = fe + s.pathExpression = s.Name + return s, nil + }) + return f, nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.aggregate +func transformAggregate(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 2 && len(args) != 3 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 2 or 3", len(args)) + } + funcName, err := getString(args, "func", 1) + if err != nil { + return nil, err + } + funcName = strings.TrimSuffix(funcName, "Series") + xFilesFactor, err := getOptionalNumber(args, "xFilesFactor", 2, ec.xFilesFactor) + if err != nil { + return nil, err + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + return aggregateSeries(ec, fe, nextSeries, funcName, xFilesFactor) +} + +func aggregateSeries(ec *evalConfig, expr graphiteql.Expr, nextSeries nextSeriesFunc, funcName string, xFilesFactor float64) (nextSeriesFunc, error) { + step, err := nextSeries.peekStep(ec.storageStep) + if err != nil { + return nil, err + } + as, err := newAggrState(ec.pointsLen(step), funcName) + if err != nil { + _, _ = drainAllSeries(nextSeries) + return nil, err + } + nextSeriesWrapper := getNextSeriesWrapperForAggregateFunc(funcName) + var seriesTags []map[string]string + var seriesExpressions []string + var mu sync.Mutex + f := nextSeriesWrapper(nextSeries, func(s *series) (*series, error) { + s.consolidate(ec, step) + mu.Lock() + as.Update(s.Values) + seriesTags = append(seriesTags, s.Tags) + seriesExpressions = append(seriesExpressions, s.pathExpression) + mu.Unlock() + return s, nil + }) + if _, err := drainAllSeries(f); err != nil { + return nil, err + } + if len(seriesTags) == 0 { + return newZeroSeriesFunc(), nil + } + tags := seriesTags[0] + for _, m := range seriesTags[1:] { + for k, v := range tags { + if m[k] != v { + delete(tags, k) + } + } + } + name := formatAggrFuncForSeriesNames(funcName, seriesExpressions) + tags["aggregatedBy"] = funcName + if tags["name"] == "" { + tags["name"] = name + } + s := &series{ + Name: name, + Tags: tags, + Timestamps: ec.newTimestamps(step), + Values: as.Finalize(xFilesFactor), + pathExpression: name, + expr: expr, + step: step, + } + return singleSeriesFunc(s), nil +} + +func aggregateSeriesGeneric(ec *evalConfig, fe *graphiteql.FuncExpr, funcName string) (nextSeriesFunc, error) { + nextSeries, err := groupSeriesLists(ec, fe.Args, fe) + if err != nil { + return nil, err + } + return aggregateSeries(ec, fe, nextSeries, funcName, ec.xFilesFactor) +} + +// See 
https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.aggregateLine +func transformAggregateLine(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) < 1 || len(args) > 3 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 1, 2 or 3", len(args)) + } + funcName, err := getOptionalString(args, "func", 1, "avg") + if err != nil { + return nil, err + } + aggrFunc, err := getAggrFunc(funcName) + if err != nil { + return nil, err + } + keepStep, err := getOptionalBool(args, "keepStep", 2, false) + if err != nil { + return nil, err + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + f := nextSeriesConcurrentWrapper(nextSeries, func(s *series) (*series, error) { + values := s.Values + v := aggrFunc(values) + if keepStep { + for i := range values { + values[i] = v + } + } else { + s.Timestamps = []int64{ec.startTime, (ec.endTime + ec.startTime) / 2, ec.endTime} + s.Values = []float64{v, v, v} + } + vString := "None" + if !math.IsNaN(v) { + vString = fmt.Sprintf("%g", v) + } + s.Name = fmt.Sprintf("aggregateLine(%s,%s)", s.Name, vString) + s.expr = fe + s.pathExpression = s.Name + return s, nil + }) + return f, nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.aggregateWithWildcards +func transformAggregateWithWildcards(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) < 2 { + return nil, fmt.Errorf("unexpected number of args; got %d; want at least 2", len(args)) + } + funcName, err := getString(args, "func", 1) + if err != nil { + return nil, err + } + positions, err := getInts(args[2:], "positions") + if err != nil { + return nil, err + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + return aggregateSeriesWithWildcards(ec, fe, nextSeries, funcName, positions) +} + +func aggregateSeriesWithWildcards(ec *evalConfig, expr graphiteql.Expr, nextSeries nextSeriesFunc, funcName string, positions []int) (nextSeriesFunc, error) { + positionsMap := make(map[int]struct{}) + for _, pos := range positions { + positionsMap[pos] = struct{}{} + } + keyFunc := func(name string, tags map[string]string) string { + parts := strings.Split(getPathFromName(name), ".") + dstParts := parts[:0] + for i, part := range parts { + if _, ok := positionsMap[i]; ok { + continue + } + dstParts = append(dstParts, part) + } + return strings.Join(dstParts, ".") + } + return groupByKeyFunc(ec, expr, nextSeries, funcName, keyFunc) +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.alias +func transformAlias(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 2 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 2", len(args)) + } + newName, err := getString(args, "newName", 1) + if err != nil { + return nil, err + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + f := nextSeriesSerialWrapper(nextSeries, func(s *series) (*series, error) { + s.Name = newName + s.expr = fe + return s, nil + }) + return f, nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.aliasByMetric +func transformAliasByMetric(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 1 { + return nil, 
fmt.Errorf("unexpected number of args; got %d; want 1", len(args)) + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + f := nextSeriesSerialWrapper(nextSeries, func(s *series) (*series, error) { + path := getPathFromName(s.Name) + n := strings.LastIndexByte(path, '.') + if n > 0 { + path = path[n+1:] + } + s.Name = path + s.expr = fe + return s, nil + }) + return f, nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.aliasByNode +func transformAliasByNode(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) < 1 { + return nil, fmt.Errorf("unexpected number of args; got %d; want at least 1", len(args)) + } + nodes, err := getNodes(args[1:]) + if err != nil { + return nil, err + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + f := nextSeriesSerialWrapper(nextSeries, func(s *series) (*series, error) { + s.Name = getNameFromNodes(s.Name, s.Tags, nodes) + s.expr = fe + return s, nil + }) + return f, nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.aliasQuery +func transformAliasQuery(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 4 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 4", len(args)) + } + re, err := getRegexp(args, "search", 1) + if err != nil { + return nil, err + } + replace, err := getRegexpReplacement(args, "replace", 2) + if err != nil { + return nil, err + } + newName, err := getString(args, "newName", 3) + if err != nil { + return nil, err + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + f := nextSeriesConcurrentWrapper(nextSeries, func(s *series) (*series, error) { + query := re.ReplaceAllString(s.Name, replace) + next, err := execExpr(ec, query) + if err != nil { + return nil, fmt.Errorf("cannot evaluate query %q: %w", query, err) + } + ss, err := fetchAllSeries(next) + if err != nil { + return nil, fmt.Errorf("cannot fetch series for query %q: %w", query, err) + } + if len(ss) == 0 { + return nil, fmt.Errorf("cannot find series for query %q", query) + } + v := aggrLast(ss[0].Values) + if math.IsNaN(v) { + return nil, fmt.Errorf("cannot find values for query %q", query) + } + name := strings.ReplaceAll(newName, "%d", fmt.Sprintf("%d", int(v))) + name = strings.ReplaceAll(name, "%g", fmt.Sprintf("%g", v)) + name = strings.ReplaceAll(name, "%f", fmt.Sprintf("%f", v)) + s.Name = name + s.expr = fe + return s, nil + }) + return f, nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.aliasSub +func transformAliasSub(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 3 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 3", len(args)) + } + re, err := getRegexp(args, "search", 1) + if err != nil { + return nil, err + } + replace, err := getRegexpReplacement(args, "replace", 2) + if err != nil { + return nil, err + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + f := nextSeriesSerialWrapper(nextSeries, func(s *series) (*series, error) { + s.Name = re.ReplaceAllString(s.Name, replace) + s.expr = fe + return s, nil + }) + return f, nil +} + +// See 
https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.alpha +func transformAlpha(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 2 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 2", len(args)) + } + _, err := getNumber(args, "alpha", 1) + if err != nil { + return nil, err + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + f := nextSeriesSerialWrapper(nextSeries, func(s *series) (*series, error) { + s.expr = fe + return s, nil + }) + return f, nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.applyByNode +func transformApplyByNode(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) < 3 || len(args) > 4 { + return nil, fmt.Errorf("unexpected number of args; got %d; want from 3 to 4", len(args)) + } + nn, err := getNumber(args, "nodeNum", 1) + if err != nil { + return nil, err + } + nodeNum := int(nn) + templateFunction, err := getString(args, "templateFunction", 2) + if err != nil { + return nil, err + } + newName, err := getOptionalString(args, "newName", 3, "") + if err != nil { + return nil, err + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + nextTemplateSeries := newZeroSeriesFunc() + prefix := "" + visitedPrefixes := make(map[string]struct{}) + f := func() (*series, error) { + for { + ts, err := nextTemplateSeries() + if err != nil { + _, _ = drainAllSeries(nextSeries) + return nil, err + } + if ts != nil { + if newName != "" { + ts.Name = strings.ReplaceAll(newName, "%", prefix) + } + ts.expr = fe + ts.pathExpression = prefix + return ts, nil + } + for { + s, err := nextSeries() + if err != nil { + return nil, err + } + if s == nil { + return nil, nil + } + prefix = getPathFromName(s.Name) + nodes := strings.Split(prefix, ".") + if nodeNum >= 0 && nodeNum < len(nodes) { + prefix = strings.Join(nodes[:nodeNum+1], ".") + } + if _, ok := visitedPrefixes[prefix]; !ok { + visitedPrefixes[prefix] = struct{}{} + break + } + } + query := strings.ReplaceAll(templateFunction, "%", prefix) + next, err := execExpr(ec, query) + if err != nil { + _, _ = drainAllSeries(nextSeries) + return nil, err + } + nextTemplateSeries = next + } + } + return f, nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.areaBetween +func transformAreaBetween(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 1 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 1", len(args)) + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + seriesFound := 0 + f := nextSeriesSerialWrapper(nextSeries, func(s *series) (*series, error) { + seriesFound++ + if seriesFound > 2 { + return nil, fmt.Errorf("expecting exactly two series; got more series") + } + s.Tags["areaBetween"] = "1" + s.Name = fmt.Sprintf("areaBetween(%s)", s.Name) + s.expr = fe + s.pathExpression = s.Name + return s, nil + }) + return f, nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.asPercent +func transformAsPercent(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) < 1 { + return nil, fmt.Errorf("unexpected number of args; got %d; want at least 1", len(args)) + } + totalArg := 
getOptionalArg(args, "total", 1) + if totalArg == nil { + totalArg = &graphiteql.ArgExpr{ + Expr: &graphiteql.NoneExpr{}, + } + } + var nodes []graphiteql.Expr + if len(args) > 2 { + ns, err := getNodes(args[2:]) + if err != nil { + return nil, err + } + nodes = ns + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + switch t := totalArg.Expr.(type) { + case *graphiteql.NoneExpr: + if len(nodes) == 0 { + ss, step, err := fetchNormalizedSeries(ec, nextSeries, true) + if err != nil { + return nil, err + } + inplacePercentForMultiSeries(ec, fe, ss, step) + return multiSeriesFunc(ss), nil + } + m, step, err := fetchNormalizedSeriesByNodes(ec, nextSeries, nodes) + if err != nil { + return nil, err + } + var ssAll []*series + for _, ss := range m { + inplacePercentForMultiSeries(ec, fe, ss, step) + ssAll = append(ssAll, ss...) + } + return multiSeriesFunc(ssAll), nil + case *graphiteql.NumberExpr: + if len(nodes) > 0 { + _, _ = drainAllSeries(nextSeries) + return nil, fmt.Errorf("unexpected non-empty nodes for numeric total") + } + total := t.N + f := nextSeriesConcurrentWrapper(nextSeries, func(s *series) (*series, error) { + values := s.Values + for i, v := range values { + values[i] = v / total * 100 + } + s.Name = fmt.Sprintf("asPercent(%s,%g)", s.Name, total) + s.expr = fe + s.pathExpression = s.Name + return s, nil + }) + return f, nil + default: + nextTotal, err := evalExpr(ec, t) + if err != nil { + _, _ = drainAllSeries(nextSeries) + return nil, err + } + if len(nodes) == 0 { + // Fetch series serially in order to preserve the original order of series returned by nextTotal, + // so the returned series could be matched against series returned by nextSeries. + ssTotal, stepTotal, err := fetchNormalizedSeries(ec, nextTotal, false) + if err != nil { + _, _ = drainAllSeries(nextSeries) + return nil, err + } + if len(ssTotal) == 0 { + _, _ = drainAllSeries(nextSeries) + // The `total` expression matches zero series. Return empty response in this case. 
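+ // multiSeriesFunc(nil) presumably yields a nextSeriesFunc which returns (nil, nil) on the first call, i.e. an empty series list.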
+ return multiSeriesFunc(nil), nil + } + if len(ssTotal) == 1 { + sTotal := ssTotal[0] + f := nextSeriesConcurrentWrapper(nextSeries, func(s *series) (*series, error) { + s.consolidate(ec, stepTotal) + inplacePercentForSingleSeries(fe, s, sTotal) + return s, nil + }) + return f, nil + } + // Fetch series serially in order to preserve the original order of series returned by nextSeries + // and match these series to ssTotal + ss, step, err := fetchNormalizedSeries(ec, nextSeries, false) + if err != nil { + return nil, err + } + if len(ss) != len(ssTotal) { + return nil, fmt.Errorf("unexpected number of series returned by total expression; got %d; want %d", len(ssTotal), len(ss)) + } + if step != stepTotal { + return nil, fmt.Errorf("step mismatch for series and total series: %d vs %d", step, stepTotal) + } + for i, s := range ss { + inplacePercentForSingleSeries(fe, s, ssTotal[i]) + } + return multiSeriesFunc(ss), nil + } + m, step, err := fetchNormalizedSeriesByNodes(ec, nextSeries, nodes) + if err != nil { + _, _ = drainAllSeries(nextTotal) + return nil, err + } + mTotal, stepTotal, err := fetchNormalizedSeriesByNodes(ec, nextTotal, nodes) + if err != nil { + return nil, err + } + if step != stepTotal { + return nil, fmt.Errorf("step mismatch for series and total series: %d vs %d", step, stepTotal) + } + var ssAll []*series + for key, ssTotal := range mTotal { + seriesExpressions := make([]string, 0, len(ssTotal)) + as := newAggrStateSum(ec.pointsLen(step)) + for _, s := range ssTotal { + seriesExpressions = append(seriesExpressions, s.pathExpression) + as.Update(s.Values) + } + totalValues := as.Finalize(ec.xFilesFactor) + totalName := formatAggrFuncForPercentSeriesNames("sum", seriesExpressions) + ss := m[key] + if ss == nil { + s := newNaNSeries(ec, step) + newName := fmt.Sprintf("asPercent(MISSING,%s)", totalName) + s.Name = newName + s.Tags["name"] = newName + s.expr = fe + s.pathExpression = s.Name + ssAll = append(ssAll, s) + continue + } + for _, s := range ss { + values := s.Values + for i, v := range values { + values[i] = v / totalValues[i] * 100 + } + newName := fmt.Sprintf("asPercent(%s,%s)", s.Name, totalName) + s.Name = newName + s.Tags["name"] = newName + s.expr = fe + s.pathExpression = s.Name + ssAll = append(ssAll, s) + } + } + for key, ss := range m { + ssTotal := mTotal[key] + if ssTotal != nil { + continue + } + for _, s := range ss { + values := s.Values + for i := range values { + values[i] = nan + } + newName := fmt.Sprintf("asPercent(%s,MISSING)", s.Name) + s.Name = newName + s.Tags["name"] = newName + s.expr = fe + s.pathExpression = s.Name + ssAll = append(ssAll, s) + } + } + return multiSeriesFunc(ssAll), nil + } +} + +func inplacePercentForSingleSeries(expr graphiteql.Expr, s, sTotal *series) { + values := s.Values + totalValues := sTotal.Values + for i, v := range values { + values[i] = v / totalValues[i] * 100 + } + newName := fmt.Sprintf("asPercent(%s,%s)", s.Name, sTotal.Name) + s.Name = newName + s.Tags["name"] = newName + s.expr = expr + s.pathExpression = s.Name +} + +func inplacePercentForMultiSeries(ec *evalConfig, expr graphiteql.Expr, ss []*series, step int64) { + seriesExpressions := make([]string, 0, len(ss)) + as := newAggrStateSum(ec.pointsLen(step)) + for _, s := range ss { + seriesExpressions = append(seriesExpressions, s.pathExpression) + as.Update(s.Values) + } + totalValues := as.Finalize(ec.xFilesFactor) + totalName := formatAggrFuncForPercentSeriesNames("sum", seriesExpressions) + for _, s := range ss { + values := s.Values + for i, v := 
range values { + values[i] = v / totalValues[i] * 100 + } + newName := fmt.Sprintf("asPercent(%s,%s)", s.Name, totalName) + s.Name = newName + s.Tags["name"] = newName + s.expr = expr + s.pathExpression = s.Name + } +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.averageAbove +func transformAverageAbove(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 2 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 2", len(args)) + } + n, err := getNumber(args, "n", 1) + if err != nil { + return nil, err + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + return filterSeriesGeneric(fe, nextSeries, "average", ">", n) +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.averageBelow +func transformAverageBelow(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 2 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 2", len(args)) + } + n, err := getNumber(args, "n", 1) + if err != nil { + return nil, err + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + return filterSeriesGeneric(fe, nextSeries, "average", "<", n) +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.averageOutsidePercentile +func transformAverageOutsidePercentile(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 2 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 2", len(args)) + } + n, err := getNumber(args, "n", 1) + if err != nil { + return nil, err + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + var sws []seriesWithWeight + var lock sync.Mutex + f := nextSeriesConcurrentWrapper(nextSeries, func(s *series) (*series, error) { + avg := aggrAvg(s.Values) + lock.Lock() + sws = append(sws, seriesWithWeight{ + s: s, + v: avg, + }) + lock.Unlock() + return s, nil + }) + if _, err := drainAllSeries(f); err != nil { + return nil, err + } + avgs := make([]float64, len(sws)) + for i, sw := range sws { + avgs[i] = sw.v + } + if n > 50 { + n = 100 - n + } + lowPercentile := n + highPercentile := 100 - n + lowValue := newAggrFuncPercentile(lowPercentile)(avgs) + highValue := newAggrFuncPercentile(highPercentile)(avgs) + var ss []*series + for _, sw := range sws { + if sw.v < lowValue || sw.v > highValue { + s := sw.s + s.expr = fe + ss = append(ss, s) + } + } + return multiSeriesFunc(ss), nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.averageSeries +func transformAverageSeries(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + return aggregateSeriesGeneric(ec, fe, "average") +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.averageSeriesWithWildcards +func transformAverageSeriesWithWildcards(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + return aggregateSeriesWithWildcardsGeneric(ec, fe, "average") +} + +func aggregateSeriesWithWildcardsGeneric(ec *evalConfig, fe *graphiteql.FuncExpr, funcName string) (nextSeriesFunc, error) { + args := fe.Args + if len(args) < 1 { + return nil, fmt.Errorf("unexpected number of args; got %d; must be at least 1", len(args)) + } + positions, err := getInts(args[1:], 
"position") + if err != nil { + return nil, err + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + return aggregateSeriesWithWildcards(ec, fe, nextSeries, funcName, positions) +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.changed +func transformChanged(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 1 { + return nil, fmt.Errorf("expecting a single arg; got %d args", len(args)) + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + f := nextSeriesConcurrentWrapper(nextSeries, func(s *series) (*series, error) { + values := s.Values + prevValue := nan + for i, v := range values { + if math.IsNaN(prevValue) { + prevValue = v + values[i] = 0 + } else if !math.IsNaN(v) && prevValue != v { + prevValue = v + values[i] = 1 + } else { + values[i] = 0 + } + } + s.Name = fmt.Sprintf("changed(%s)", s.Name) + s.expr = fe + s.pathExpression = s.Name + return s, nil + }) + return f, nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.color +func transformColor(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 2 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 2", len(args)) + } + _, err := getString(args, "theColor", 1) + if err != nil { + return nil, err + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + f := nextSeriesSerialWrapper(nextSeries, func(s *series) (*series, error) { + s.expr = fe + return s, nil + }) + return f, nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.countSeries +func transformCountSeries(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + return aggregateSeriesGeneric(ec, fe, "count") +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.cumulative +func transformCumulative(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 1 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 1", len(args)) + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + return consolidateBy(ec, fe, nextSeries, "sum") +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.consolidateBy +func transformConsolidateBy(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 2 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 2", len(args)) + } + funcName, err := getString(args, "consolidationFunc", 1) + if err != nil { + return nil, err + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + return consolidateBy(ec, fe, nextSeries, funcName) +} + +func consolidateBy(ec *evalConfig, expr graphiteql.Expr, nextSeries nextSeriesFunc, funcName string) (nextSeriesFunc, error) { + consolidateFunc, err := getAggrFunc(funcName) + if err != nil { + return nil, err + } + f := nextSeriesConcurrentWrapper(nextSeries, func(s *series) (*series, error) { + s.consolidateFunc = consolidateFunc + s.Name = fmt.Sprintf("consolidateBy(%s,%s)", s.Name, graphiteql.QuoteString(funcName)) + s.Tags["consolidateBy"] = funcName + s.expr = expr + s.pathExpression = 
s.Name + return s, nil + }) + return f, nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.constantLine +func transformConstantLine(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 1 { + return nil, fmt.Errorf("expecting a single arg; got %d", len(args)) + } + n, err := getNumber(args, "value", 0) + if err != nil { + return nil, err + } + return constantLine(ec, fe, n), nil +} + +func constantLine(ec *evalConfig, expr graphiteql.Expr, n float64) nextSeriesFunc { + name := fmt.Sprintf("%g", n) + step := (ec.endTime - ec.startTime) / 2 + s := &series{ + Name: name, + Tags: unmarshalTags(name), + Timestamps: []int64{ec.startTime, ec.startTime + step, ec.startTime + 2*step}, + Values: []float64{n, n, n}, + expr: expr, + pathExpression: string(expr.AppendString(nil)), + step: step, + } + return singleSeriesFunc(s) +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.currentAbove +func transformCurrentAbove(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 2 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 2", len(args)) + } + n, err := getNumber(args, "n", 1) + if err != nil { + return nil, err + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + return filterSeriesGeneric(fe, nextSeries, "current", ">", n) +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.currentBelow +func transformCurrentBelow(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 2 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 2", len(args)) + } + n, err := getNumber(args, "n", 1) + if err != nil { + return nil, err + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + return filterSeriesGeneric(fe, nextSeries, "current", "<", n) +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.dashed +func transformDashed(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) < 1 || len(args) > 2 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 1 or 2", len(args)) + } + dashLength, err := getOptionalNumber(args, "dashLength", 1, 5) + if err != nil { + return nil, err + } + dashLengthStr := fmt.Sprintf("%g", dashLength) + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + f := nextSeriesSerialWrapper(nextSeries, func(s *series) (*series, error) { + s.Name = fmt.Sprintf("dashed(%s,%s)", s.Name, dashLengthStr) + s.Tags["dashed"] = dashLengthStr + s.expr = fe + return s, nil + }) + return f, nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.delay +func transformDelay(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 2 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 2", len(args)) + } + stepsFloat, err := getNumber(args, "steps", 1) + if err != nil { + return nil, err + } + steps := int(stepsFloat) + stepsStr := fmt.Sprintf("%d", steps) + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + f := nextSeriesSerialWrapper(nextSeries, func(s *series) (*series, error) { + values := 
s.Values + stepsLocal := steps + if stepsLocal < 0 { + stepsLocal = -stepsLocal + if stepsLocal > len(values) { + stepsLocal = len(values) + } + copy(values, values[stepsLocal:]) + for i := len(values) - 1; i >= len(values)-stepsLocal; i-- { + values[i] = nan + } + } else { + if stepsLocal > len(values) { + stepsLocal = len(values) + } + copy(values[stepsLocal:], values[:len(values)-stepsLocal]) + for i := 0; i < stepsLocal; i++ { + values[i] = nan + } + } + s.Tags["delay"] = stepsStr + s.Name = fmt.Sprintf("delay(%s,%d)", s.Name, steps) + s.expr = fe + s.pathExpression = s.Name + return s, nil + }) + return f, nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.derivative +func transformDerivative(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 1 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 1", len(args)) + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + f := nextSeriesConcurrentWrapper(nextSeries, func(s *series) (*series, error) { + values := s.Values + prevValue := nan + for i, v := range values { + if math.IsNaN(prevValue) || math.IsNaN(v) { + values[i] = nan + } else { + values[i] = v - prevValue + } + prevValue = v + } + s.Tags["derivative"] = "1" + s.Name = fmt.Sprintf("derivative(%s)", s.Name) + s.expr = fe + s.pathExpression = s.Name + return s, nil + }) + return f, nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.diffSeries +func transformDiffSeries(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + return aggregateSeriesGeneric(ec, fe, "diff") +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.divideSeries +func transformDivideSeries(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 2 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 2", len(args)) + } + nextDivisor, err := evalSeriesList(ec, args, "divisorSeries", 1) + if err != nil { + return nil, err + } + ssDivisors, stepDivisor, err := fetchNormalizedSeries(ec, nextDivisor, false) + if err != nil { + return nil, err + } + if len(ssDivisors) > 1 { + return nil, fmt.Errorf("unexpected number of divisorSeries; got %d; want 1", len(ssDivisors)) + } + nextDividend, err := evalSeriesList(ec, args, "dividendSeriesList", 0) + if err != nil { + return nil, err + } + if len(ssDivisors) == 0 { + f := nextSeriesConcurrentWrapper(nextDividend, func(s *series) (*series, error) { + values := s.Values + for i := range values { + values[i] = nan + } + s.Name = fmt.Sprintf("divideSeries(%s,MISSING)", s.Name) + s.expr = fe + s.pathExpression = s.Name + return s, nil + }) + return f, nil + } + sDivisor := ssDivisors[0] + divisorName := sDivisor.Name + divisorValues := sDivisor.Values + f := nextSeriesSerialWrapper(nextDividend, func(s *series) (*series, error) { + s.consolidate(ec, stepDivisor) + values := s.Values + for i, v := range values { + values[i] = v / divisorValues[i] + } + s.Name = fmt.Sprintf("divideSeries(%s,%s)", s.Name, divisorName) + s.expr = fe + s.pathExpression = s.Name + return s, nil + }) + return f, nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.divideSeriesLists +func transformDivideSeriesLists(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 2 { + 
return nil, fmt.Errorf("unexpected number of args; got %d; want 2", len(args)) + } + nextDividend, err := evalSeriesList(ec, args, "dividendSeriesList", 0) + if err != nil { + return nil, err + } + ssDividend, stepDividend, err := fetchNormalizedSeries(ec, nextDividend, false) + if err != nil { + return nil, err + } + nextDivisor, err := evalSeriesList(ec, args, "divisorSeriesList", 1) + if err != nil { + return nil, err + } + ssDivisor, stepDivisor, err := fetchNormalizedSeries(ec, nextDivisor, false) + if err != nil { + return nil, err + } + if len(ssDividend) != len(ssDivisor) { + return nil, fmt.Errorf("dividend and divisor must have equal number of series; got %d vs %d series", len(ssDividend), len(ssDivisor)) + } + if stepDividend != stepDivisor { + return nil, fmt.Errorf("step mismatch for dividend and divisor: %d vs %d", stepDividend, stepDivisor) + } + for i, s := range ssDividend { + sDivisor := ssDivisor[i] + values := s.Values + divisorValues := sDivisor.Values + for j, v := range values { + values[j] = v / divisorValues[j] + } + s.Name = fmt.Sprintf("divideSeries(%s,%s)", s.Name, sDivisor.Name) + s.expr = fe + s.pathExpression = s.Name + } + return multiSeriesFunc(ssDividend), nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.drawAsInfinite +func transformDrawAsInfinite(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 1 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 1", len(args)) + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + f := nextSeriesSerialWrapper(nextSeries, func(s *series) (*series, error) { + s.Tags["drawAsInfinite"] = "1" + s.Name = fmt.Sprintf("drawAsInfinite(%s)", s.Name) + s.expr = fe + return s, nil + }) + return f, nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.events +func transformEvents(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + var tags []string + for _, arg := range args { + se, ok := arg.Expr.(*graphiteql.StringExpr) + if !ok { + return nil, fmt.Errorf("expecting string tag; got %T", arg.Expr) + } + tags = append(tags, graphiteql.QuoteString(se.S)) + } + s := newNaNSeries(ec, ec.storageStep) + events := fmt.Sprintf("events(%s)", strings.Join(tags, ",")) + s.Name = events + s.Tags = map[string]string{"name": events} + s.expr = fe + s.pathExpression = s.Name + return singleSeriesFunc(s), nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.exclude +func transformExclude(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 2 { + return nil, fmt.Errorf("expecting two args; got %d args", len(args)) + } + pattern, err := getRegexp(args, "pattern", 1) + if err != nil { + return nil, err + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + f := nextSeriesSerialWrapper(nextSeries, func(s *series) (*series, error) { + if pattern.MatchString(s.Name) { + return nil, nil + } + s.expr = fe + return s, nil + }) + return f, nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.exp +func transformExp(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 1 { + return nil, fmt.Errorf("expecting one arg; got %d args", len(args)) + } + nextSeries, err 
:= evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + f := nextSeriesConcurrentWrapper(nextSeries, func(s *series) (*series, error) { + values := s.Values + for i, v := range values { + values[i] = math.Exp(v) + } + s.Tags["exp"] = "e" + s.Name = fmt.Sprintf("exp(%s)", s.Name) + s.expr = fe + s.pathExpression = s.Name + return s, nil + }) + return f, nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.exponentialMovingAverage +func transformExponentialMovingAverage(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 2 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 2", len(args)) + } + windowSizeArg, err := getArg(args, "windowSize", 1) + if err != nil { + return nil, err + } + windowSizeStr := string(windowSizeArg.Expr.AppendString(nil)) + var c float64 + var windowSize int64 + switch t := windowSizeArg.Expr.(type) { + case *graphiteql.StringExpr: + ws, err := parseInterval(t.S) + if err != nil { + return nil, fmt.Errorf("cannot parse windowSize: %w", err) + } + c = 2 / (float64(ws)/1000 + 1) + windowSize = ws + case *graphiteql.NumberExpr: + c = 2 / (t.N + 1) + windowSize = int64(t.N * float64(ec.storageStep)) + default: + return nil, fmt.Errorf("windowSize must be either string or number; got %T", t) + } + if windowSize < 0 { + windowSize = -windowSize + } + + ecCopy := *ec + ecCopy.startTime -= windowSize + nextSeries, err := evalSeriesList(&ecCopy, args, "seriesList", 0) + if err != nil { + return nil, err + } + f := nextSeriesConcurrentWrapper(nextSeries, func(s *series) (*series, error) { + timestamps := s.Timestamps + i := 0 + for i < len(timestamps) && timestamps[i] < ec.startTime { + i++ + } + ema := aggrAvg(s.Values[:i]) + if math.IsNaN(ema) { + ema = 0 + } + values := s.Values[i:] + timestamps = timestamps[i:] + for i, v := range values { + ema = c*v + (1-c)*ema + values[i] = ema + } + s.Timestamps = append([]int64{}, timestamps...) + s.Values = append([]float64{}, values...) 
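+ // The loop above implements the recurrence ema = c*v + (1-c)*ema with c = 2/(windowSize+1)
+ // (windowSize in points for the numeric form and in seconds for the string form), seeded with
+ // the average over the warm-up window preceding ec.startTime.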
+ s.Tags["exponentialMovingAverage"] = windowSizeStr + s.Name = fmt.Sprintf("exponentialMovingAverage(%s,%s)", s.Name, windowSizeStr) + s.expr = fe + s.pathExpression = s.Name + return s, nil + }) + return f, nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.fallbackSeries +func transformFallbackSeries(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 2 { + return nil, fmt.Errorf("unexpected number of arg; got %d; want 2", len(args)) + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + seriesFetched := 0 + fallbackUsed := false + f := func() (*series, error) { + for { + s, err := nextSeries() + if err != nil { + return nil, err + } + if s != nil { + seriesFetched++ + s.expr = fe + return s, nil + } + if fallbackUsed || seriesFetched > 0 { + return nil, nil + } + fallback, err := evalSeriesList(ec, args, "fallback", 1) + if err != nil { + return nil, err + } + nextSeries = fallback + fallbackUsed = true + } + } + return f, nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.filterSeries +func transformFilterSeries(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 4 { + return nil, fmt.Errorf("unexpected number of arg; got %d; want 4", len(args)) + } + funcName, err := getString(args, "func", 1) + if err != nil { + return nil, err + } + operator, err := getString(args, "operator", 2) + if err != nil { + return nil, err + } + threshold, err := getNumber(args, "threshold", 3) + if err != nil { + return nil, err + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + return filterSeriesGeneric(fe, nextSeries, funcName, operator, threshold) +} + +func filterSeriesGeneric(expr graphiteql.Expr, nextSeries nextSeriesFunc, funcName, operator string, threshold float64) (nextSeriesFunc, error) { + aggrFunc, err := getAggrFunc(funcName) + if err != nil { + _, _ = drainAllSeries(nextSeries) + return nil, err + } + operatorFunc, err := getOperatorFunc(operator) + if err != nil { + _, _ = drainAllSeries(nextSeries) + return nil, err + } + f := nextSeriesConcurrentWrapper(nextSeries, func(s *series) (*series, error) { + v := aggrFunc(s.Values) + if !operatorFunc(v, threshold) { + return nil, nil + } + s.expr = expr + return s, nil + }) + return f, nil +} + +func getOperatorFunc(operator string) (operatorFunc, error) { + switch operator { + case "=": + return operatorFuncEqual, nil + case "!=": + return operatorFuncNotEqual, nil + case ">": + return operatorFuncAbove, nil + case ">=": + return operatorFuncAboveEqual, nil + case "<": + return operatorFuncBelow, nil + case "<=": + return operatorFuncBelowEqual, nil + default: + return nil, fmt.Errorf("unknown operator %q", operator) + } +} + +type operatorFunc func(v, threshold float64) bool + +func operatorFuncEqual(v, threshold float64) bool { + return v == threshold +} + +func operatorFuncNotEqual(v, threshold float64) bool { + return v != threshold +} + +func operatorFuncAbove(v, threshold float64) bool { + return v > threshold +} + +func operatorFuncAboveEqual(v, threshold float64) bool { + return v >= threshold +} + +func operatorFuncBelow(v, threshold float64) bool { + return v < threshold +} + +func operatorFuncBelowEqual(v, threshold float64) bool { + return v <= threshold +} + +// See 
https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.grep +func transformGrep(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 2 { + return nil, fmt.Errorf("expecting two args; got %d args", len(args)) + } + pattern, err := getRegexp(args, "pattern", 1) + if err != nil { + return nil, err + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + f := nextSeriesSerialWrapper(nextSeries, func(s *series) (*series, error) { + if !pattern.MatchString(s.Name) { + return nil, nil + } + s.expr = fe + return s, nil + }) + return f, nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.group +func transformGroup(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + return groupSeriesLists(ec, fe.Args, fe) +} + +func groupSeriesLists(ec *evalConfig, args []*graphiteql.ArgExpr, expr graphiteql.Expr) (nextSeriesFunc, error) { + var nextSeriess []nextSeriesFunc + for i := 0; i < len(args); i++ { + nextSeries, err := evalSeriesList(ec, args, "seriesList", i) + if err != nil { + for _, f := range nextSeriess { + _, _ = drainAllSeries(f) + } + return nil, err + } + nextSeriess = append(nextSeriess, nextSeries) + } + return nextSeriesGroup(nextSeriess, expr), nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.groupByNode +func transformGroupByNode(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) < 2 || len(args) > 3 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 2 or 3", len(args)) + } + nodes, err := getNodes(args[1:2]) + if err != nil { + return nil, err + } + callback, err := getOptionalString(args, "callback", 2, "average") + if err != nil { + return nil, err + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + return groupByNodesGeneric(ec, fe, nextSeries, nodes, callback) +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.groupByNodes +func transformGroupByNodes(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) < 2 { + return nil, fmt.Errorf("unexpected number of args; got %d; want at least 2", len(args)) + } + callback, err := getString(args, "callback", 1) + if err != nil { + return nil, err + } + nodes, err := getNodes(args[2:]) + if err != nil { + return nil, err + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + return groupByNodesGeneric(ec, fe, nextSeries, nodes, callback) +} + +func groupByNodesGeneric(ec *evalConfig, expr graphiteql.Expr, nextSeries nextSeriesFunc, nodes []graphiteql.Expr, callback string) (nextSeriesFunc, error) { + keyFunc := func(name string, tags map[string]string) string { + return getNameFromNodes(name, tags, nodes) + } + return groupByKeyFunc(ec, expr, nextSeries, callback, keyFunc) +} + +func groupByKeyFunc(ec *evalConfig, expr graphiteql.Expr, nextSeries nextSeriesFunc, aggrFuncName string, + keyFunc func(name string, tags map[string]string) string) (nextSeriesFunc, error) { + step, err := nextSeries.peekStep(ec.storageStep) + if err != nil { + return nil, err + } + nextSeriesWrapper := getNextSeriesWrapperForAggregateFunc(aggrFuncName) + type x struct { + as aggrState + tags map[string]string + seriesExpressions []string + } + m := 
make(map[string]*x) + var mLock sync.Mutex + f := nextSeriesWrapper(nextSeries, func(s *series) (*series, error) { + s.consolidate(ec, step) + key := keyFunc(s.Name, s.Tags) + mLock.Lock() + defer mLock.Unlock() + e := m[key] + if e == nil { + as, err := newAggrState(ec.pointsLen(step), aggrFuncName) + if err != nil { + return nil, err + } + e = &x{ + as: as, + tags: s.Tags, + } + m[key] = e + } else { + for k, v := range e.tags { + if v != s.Tags[k] { + delete(e.tags, k) + } + } + } + e.as.Update(s.Values) + e.seriesExpressions = append(e.seriesExpressions, s.pathExpression) + return s, nil + }) + if _, err := drainAllSeries(f); err != nil { + return nil, err + } + var ss []*series + for key, e := range m { + tags := e.tags + if tags["name"] == "" { + funcName := strings.TrimSuffix(aggrFuncName, "Series") + tags["name"] = fmt.Sprintf("%sSeries(%s)", funcName, formatPathsFromSeriesExpressions(e.seriesExpressions, true)) + } + tags["aggregatedBy"] = aggrFuncName + s := &series{ + Name: key, + Tags: tags, + Timestamps: ec.newTimestamps(step), + Values: e.as.Finalize(ec.xFilesFactor), + expr: expr, + pathExpression: tags["name"], + step: step, + } + ss = append(ss, s) + } + return multiSeriesFunc(ss), nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.groupByTags +func transformGroupByTags(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) < 2 { + return nil, fmt.Errorf("unexpected number of args; got %d; want at least 2", len(args)) + } + callback, err := getString(args, "callback", 1) + if err != nil { + return nil, err + } + tagKeys := make(map[string]struct{}) + for _, arg := range args[2:] { + se, ok := arg.Expr.(*graphiteql.StringExpr) + if !ok { + return nil, fmt.Errorf("unexpected tag type: %T; expecting string", arg.Expr) + } + tagKeys[se.S] = struct{}{} + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + keyFunc := func(name string, tags map[string]string) string { + return formatKeyFromTags(tags, tagKeys, callback) + } + return groupByKeyFunc(ec, fe, nextSeries, callback, keyFunc) +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.highest +func transformHighest(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) < 1 || len(args) > 3 { + return nil, fmt.Errorf("unexpected number of args; got %d; want from 1 to 3", len(args)) + } + n, err := getOptionalNumber(args, "n", 1, 1) + if err != nil { + return nil, err + } + funcName, err := getOptionalString(args, "func", 2, "average") + if err != nil { + return nil, err + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + return highestGeneric(ec, fe, nextSeries, n, funcName) +} + +func highestGeneric(ec *evalConfig, expr graphiteql.Expr, nextSeries nextSeriesFunc, n float64, funcName string) (nextSeriesFunc, error) { + aggrFunc, err := getAggrFunc(funcName) + if err != nil { + _, _ = drainAllSeries(nextSeries) + return nil, err + } + nextSeriesWrapper := getNextSeriesWrapperForAggregateFunc(funcName) + var topSeries maxSeriesHeap + var topSeriesLock sync.Mutex + f := nextSeriesWrapper(nextSeries, func(s *series) (*series, error) { + v := aggrFunc(s.Values) + topSeriesLock.Lock() + defer topSeriesLock.Unlock() + if len(topSeries) < int(n) { + heap.Push(&topSeries, &seriesWithWeight{ + v: v, + s: s, + }) + } else if v > topSeries[0].v { + 
topSeries[0] = &seriesWithWeight{ + v: v, + s: s, + } + heap.Fix(&topSeries, 0) + } + return s, nil + }) + if _, err := drainAllSeries(f); err != nil { + return nil, err + } + sort.Slice(topSeries, func(i, j int) bool { + return topSeries[i].v < topSeries[j].v + }) + var ss []*series + for _, x := range topSeries { + s := x.s + s.expr = expr + ss = append(ss, s) + } + return multiSeriesFunc(ss), nil +} + +type seriesWithWeight struct { + v float64 + s *series +} + +type minSeriesHeap []*seriesWithWeight + +func (h *minSeriesHeap) Len() int { return len(*h) } +func (h *minSeriesHeap) Less(i, j int) bool { + a := *h + return a[i].v > a[j].v +} +func (h *minSeriesHeap) Swap(i, j int) { + a := *h + a[i], a[j] = a[j], a[i] +} +func (h *minSeriesHeap) Push(x interface{}) { + *h = append(*h, x.(*seriesWithWeight)) +} +func (h *minSeriesHeap) Pop() interface{} { + a := *h + x := a[len(a)-1] + *h = a[:len(a)-1] + return x +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.highestAverage +func transformHighestAverage(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 2 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 2", len(args)) + } + n, err := getNumber(args, "n", 1) + if err != nil { + return nil, err + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + return highestGeneric(ec, fe, nextSeries, n, "average") +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.highestCurrent +func transformHighestCurrent(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 2 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 2", len(args)) + } + n, err := getNumber(args, "n", 1) + if err != nil { + return nil, err + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + return highestGeneric(ec, fe, nextSeries, n, "current") +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.highestMax +func transformHighestMax(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 2 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 2", len(args)) + } + n, err := getNumber(args, "n", 1) + if err != nil { + return nil, err + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + return highestGeneric(ec, fe, nextSeries, n, "max") +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.hitcount +func transformHitcount(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) < 2 || len(args) > 3 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 2 or 3", len(args)) + } + intervalString, err := getString(args, "intervalString", 1) + if err != nil { + return nil, err + } + interval, err := parseInterval(intervalString) + if err != nil { + return nil, err + } + if interval <= 0 { + return nil, fmt.Errorf("interval must be positive; got %dms", interval) + } + alignToInterval, err := getOptionalBool(args, "alignToInterval", 2, false) + if err != nil { + return nil, err + } + ecCopy := *ec + if alignToInterval { + startTime := ecCopy.startTime + tz := ecCopy.currentTime.Location() + t := time.Unix(startTime/1e3, (startTime%1000)*1e6).In(tz) + if 
interval >= 24*3600*1000 { + t = time.Date(t.Year(), t.Month(), t.Day(), 0, 0, 0, 0, tz) + } else if interval >= 3600*1000 { + t = time.Date(t.Year(), t.Month(), t.Day(), t.Hour(), 0, 0, 0, tz) + } else if interval >= 60*1000 { + t = time.Date(t.Year(), t.Month(), t.Day(), t.Hour(), t.Minute(), 0, 0, tz) + } + ecCopy.startTime = t.UnixNano() / 1e6 + } + nextSeries, err := evalSeriesList(&ecCopy, args, "seriesList", 0) + if err != nil { + return nil, err + } + f := nextSeriesConcurrentWrapper(nextSeries, func(s *series) (*series, error) { + ts := ecCopy.startTime + timestamps := s.Timestamps + values := s.Values + var dstTimestamps []int64 + var dstValues []float64 + i := 0 + vPrev := float64(0) + for ts < ecCopy.endTime { + tsPrev := ts + hitcount := float64(0) + if i < len(timestamps) && !math.IsNaN(vPrev) { + hitcount = vPrev * float64(timestamps[i]-tsPrev) / 1000 + } + tsEnd := ts + interval + for i < len(timestamps) { + tsCurr := timestamps[i] + if tsCurr >= tsEnd { + break + } + v := values[i] + if !math.IsNaN(v) { + hitcount += v * (float64(tsCurr-tsPrev) / 1000) + } + tsPrev = tsCurr + vPrev = v + i++ + } + if hitcount == 0 { + hitcount = nan + } + dstValues = append(dstValues, hitcount) + dstTimestamps = append(dstTimestamps, ts) + ts = tsEnd + } + s.Timestamps = dstTimestamps + s.Values = dstValues + s.Tags["hitcount"] = intervalString + if alignToInterval { + s.Name = fmt.Sprintf("hitcount(%s,%s,true)", s.Name, graphiteql.QuoteString(intervalString)) + } else { + s.Name = fmt.Sprintf("hitcount(%s,%s)", s.Name, graphiteql.QuoteString(intervalString)) + } + s.expr = fe + s.pathExpression = s.Name + return s, nil + }) + return f, nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.identity +func transformIdentity(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 1 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 1", len(args)) + } + name, err := getString(args, "name", 0) + if err != nil { + return nil, err + } + const step = 60e3 + var dstValues []float64 + var dstTimestamps []int64 + ts := ec.startTime + for ts < ec.endTime { + dstValues = append(dstValues, float64(ts)/1000) + dstTimestamps = append(dstTimestamps, ts) + ts += step + } + s := &series{ + Name: name, + Tags: unmarshalTags(name), + Timestamps: dstTimestamps, + Values: dstValues, + expr: fe, + pathExpression: name, + step: step, + } + return singleSeriesFunc(s), nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.integral +func transformIntegral(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 1 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 1", len(args)) + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + f := nextSeriesConcurrentWrapper(nextSeries, func(s *series) (*series, error) { + values := s.Values + sum := float64(0) + for i, v := range values { + if math.IsNaN(v) { + continue + } + sum += v + values[i] = sum + } + s.Tags["integral"] = "1" + s.Name = fmt.Sprintf("integral(%s)", s.Name) + s.expr = fe + s.pathExpression = s.Name + return s, nil + }) + return f, nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.integralByInterval +func transformIntegralByInterval(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 2 { + 
return nil, fmt.Errorf("unexpected number of args; got %d; want 2", len(args))
+ }
+ intervalUnit, err := getString(args, "intervalUnit", 1)
+ if err != nil {
+ return nil, err
+ }
+ interval, err := parseInterval(intervalUnit)
+ if err != nil {
+ return nil, err
+ }
+ nextSeries, err := evalSeriesList(ec, args, "seriesList", 0)
+ if err != nil {
+ return nil, err
+ }
+ f := nextSeriesConcurrentWrapper(nextSeries, func(s *series) (*series, error) {
+ values := s.Values
+ timestamps := s.Timestamps
+ sum := float64(0)
+ dtPrev := int64(0)
+ for i, v := range values {
+ if math.IsNaN(v) {
+ continue
+ }
+ dt := timestamps[i] / interval
+ if dt != dtPrev {
+ sum = 0
+ dtPrev = dt
+ }
+ sum += v
+ values[i] = sum
+ }
+ s.Tags["integralByInterval"] = "1"
+ s.Name = fmt.Sprintf("integralByInterval(%s,%s)", s.Name, graphiteql.QuoteString(intervalUnit))
+ s.expr = fe
+ s.pathExpression = s.Name
+ return s, nil
+ })
+ return f, nil
+}
+
+// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.interpolate
+func transformInterpolate(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) {
+ args := fe.Args
+ if len(args) < 1 || len(args) > 2 {
+ return nil, fmt.Errorf("unexpected number of args; got %d; want 1 or 2", len(args))
+ }
+ limit, err := getOptionalNumber(args, "limit", 1, math.Inf(1))
+ if err != nil {
+ return nil, err
+ }
+ nextSeries, err := evalSeriesList(ec, args, "seriesList", 0)
+ if err != nil {
+ return nil, err
+ }
+ f := nextSeriesConcurrentWrapper(nextSeries, func(s *series) (*series, error) {
+ values := s.Values
+ nansCount := float64(0)
+ prevValue := nan
+ for i, v := range values {
+ if math.IsNaN(v) {
+ nansCount++
+ continue
+ }
+ // Fill the preceding gap of up to limit NaN points by linear interpolation between prevValue and v.
+ if nansCount > 0 && nansCount <= limit {
+ delta := (v - prevValue) / (nansCount + 1)
+ for j := i - int(nansCount); j < i; j++ {
+ prevValue += delta
+ values[j] = prevValue
+ }
+ }
+ nansCount = 0
+ prevValue = v
+ }
+ s.Name = fmt.Sprintf("interpolate(%s)", s.Name)
+ s.expr = fe
+ s.pathExpression = s.Name
+ return s, nil
+ })
+ return f, nil
+}
+
+// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.invert
+func transformInvert(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) {
+ args := fe.Args
+ if len(args) != 1 {
+ return nil, fmt.Errorf("unexpected number of args; got %d; want 1", len(args))
+ }
+ nextSeries, err := evalSeriesList(ec, args, "seriesList", 0)
+ if err != nil {
+ return nil, err
+ }
+ f := nextSeriesConcurrentWrapper(nextSeries, func(s *series) (*series, error) {
+ values := s.Values
+ for i, v := range values {
+ values[i] = 1 / v
+ }
+ s.Tags["invert"] = "1"
+ s.Name = fmt.Sprintf("invert(%s)", s.Name)
+ s.expr = fe
+ s.pathExpression = s.Name
+ return s, nil
+ })
+ return f, nil
+}
+
+// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.keepLastValue
+func transformKeepLastValue(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) {
+ args := fe.Args
+ if len(args) < 1 || len(args) > 2 {
+ return nil, fmt.Errorf("unexpected number of args; got %d; want 1 or 2", len(args))
+ }
+ limit, err := getOptionalNumber(args, "limit", 1, math.Inf(1))
+ if err != nil {
+ return nil, err
+ }
+ nextSeries, err := evalSeriesList(ec, args, "seriesList", 0)
+ if err != nil {
+ return nil, err
+ }
+ f := nextSeriesConcurrentWrapper(nextSeries, func(s *series) (*series, error) {
+ values := s.Values
+ nansCount := float64(0)
+ prevValue := nan
+ for i, v := range values {
+ if !math.IsNaN(v) {
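+ // A real value ends the current gap: remember it as the fill value and reset the NaN counter.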
+ nansCount = 0 + prevValue = v + continue + } + nansCount++ + if nansCount <= limit { + values[i] = prevValue + } + } + s.Name = fmt.Sprintf("keepLastValue(%s)", s.Name) + s.expr = fe + s.pathExpression = s.Name + return s, nil + }) + return f, nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.limit +func transformLimit(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 2 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 2", len(args)) + } + n, err := getNumber(args, "n", 1) + if err != nil { + return nil, err + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + seriesFetched := 0 + f := nextSeriesSerialWrapper(nextSeries, func(s *series) (*series, error) { + if seriesFetched >= int(n) { + return nil, nil + } + seriesFetched++ + s.expr = fe + return s, nil + }) + return f, nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.lineWidth +func transformLineWidth(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 2 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 2", len(args)) + } + _, err := getNumber(args, "width", 1) + if err != nil { + return nil, err + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + f := nextSeriesSerialWrapper(nextSeries, func(s *series) (*series, error) { + s.expr = fe + return s, nil + }) + return f, nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.logarithm +func transformLogarithm(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) < 1 || len(args) > 2 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 1 or 2", len(args)) + } + base, err := getOptionalNumber(args, "base", 1, 10) + if err != nil { + return nil, err + } + baseStr := fmt.Sprintf("%g", base) + baseLog := math.Log(base) + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + f := nextSeriesConcurrentWrapper(nextSeries, func(s *series) (*series, error) { + values := s.Values + for i, v := range values { + values[i] = math.Log(v) / baseLog + } + s.Tags["log"] = baseStr + s.Name = fmt.Sprintf("log(%s,%s)", s.Name, baseStr) + s.expr = fe + s.pathExpression = s.Name + return s, nil + }) + return f, nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.logit +func transformLogit(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 1 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 1", len(args)) + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + f := nextSeriesConcurrentWrapper(nextSeries, func(s *series) (*series, error) { + values := s.Values + for i, v := range values { + values[i] = math.Log(v / (1 - v)) + } + s.Tags["logit"] = "logit" + s.Name = fmt.Sprintf("logit(%s)", s.Name) + s.expr = fe + s.pathExpression = s.Name + return s, nil + }) + return f, nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.lowest +func transformLowest(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) < 1 || len(args) > 3 { + return nil, 
fmt.Errorf("unexpected number of args; got %d; want from 1 to 3", len(args)) + } + n, err := getOptionalNumber(args, "n", 1, 1) + if err != nil { + return nil, err + } + funcName, err := getOptionalString(args, "func", 2, "average") + if err != nil { + return nil, err + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + return lowestGeneric(ec, fe, nextSeries, n, funcName) +} + +func lowestGeneric(ec *evalConfig, expr graphiteql.Expr, nextSeries nextSeriesFunc, n float64, funcName string) (nextSeriesFunc, error) { + aggrFunc, err := getAggrFunc(funcName) + if err != nil { + _, _ = drainAllSeries(nextSeries) + return nil, err + } + nextSeriesWrapper := getNextSeriesWrapperForAggregateFunc(funcName) + var minSeries minSeriesHeap + var minSeriesLock sync.Mutex + f := nextSeriesWrapper(nextSeries, func(s *series) (*series, error) { + v := aggrFunc(s.Values) + minSeriesLock.Lock() + defer minSeriesLock.Unlock() + if len(minSeries) < int(n) { + heap.Push(&minSeries, &seriesWithWeight{ + v: v, + s: s, + }) + } else if v < minSeries[0].v { + minSeries[0] = &seriesWithWeight{ + v: v, + s: s, + } + heap.Fix(&minSeries, 0) + } + return s, nil + }) + if _, err := drainAllSeries(f); err != nil { + return nil, err + } + sort.Slice(minSeries, func(i, j int) bool { + return minSeries[i].v > minSeries[j].v + }) + var ss []*series + for _, x := range minSeries { + s := x.s + s.expr = expr + ss = append(ss, s) + } + return multiSeriesFunc(ss), nil +} + +type maxSeriesHeap []*seriesWithWeight + +func (h *maxSeriesHeap) Len() int { return len(*h) } +func (h *maxSeriesHeap) Less(i, j int) bool { + a := *h + return a[i].v < a[j].v +} +func (h *maxSeriesHeap) Swap(i, j int) { + a := *h + a[i], a[j] = a[j], a[i] +} +func (h *maxSeriesHeap) Push(x interface{}) { + *h = append(*h, x.(*seriesWithWeight)) +} +func (h *maxSeriesHeap) Pop() interface{} { + a := *h + x := a[len(a)-1] + *h = a[:len(a)-1] + return x +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.lowestAverage +func transformLowestAverage(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 2 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 2", len(args)) + } + n, err := getNumber(args, "n", 1) + if err != nil { + return nil, err + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + return lowestGeneric(ec, fe, nextSeries, n, "average") +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.lowestCurrent +func transformLowestCurrent(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 2 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 2", len(args)) + } + n, err := getNumber(args, "n", 1) + if err != nil { + return nil, err + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + return lowestGeneric(ec, fe, nextSeries, n, "current") +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.maxSeries +func transformMaxSeries(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + return aggregateSeriesGeneric(ec, fe, "max") +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.maximumAbove +func transformMaximumAbove(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { 
+ args := fe.Args + if len(args) != 2 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 2", len(args)) + } + n, err := getNumber(args, "n", 1) + if err != nil { + return nil, err + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + return filterSeriesGeneric(fe, nextSeries, "max", ">", n) +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.maximumBelow +func transformMaximumBelow(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 2 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 2", len(args)) + } + n, err := getNumber(args, "n", 1) + if err != nil { + return nil, err + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + return filterSeriesGeneric(fe, nextSeries, "max", "<", n) +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.minMax +func transformMinMax(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 1 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 1", len(args)) + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + f := nextSeriesConcurrentWrapper(nextSeries, func(s *series) (*series, error) { + values := s.Values + min := aggrMin(values) + if math.IsNaN(min) { + min = 0 + } + max := aggrMax(values) + if math.IsNaN(max) { + max = 0 + } + vRange := max - min + for i, v := range values { + v = (v - min) / vRange + if math.IsInf(v, 0) { + v = 0 + } + values[i] = v + } + s.Name = fmt.Sprintf("minMax(%s)", s.Name) + s.expr = fe + s.pathExpression = s.Name + return s, nil + }) + return f, nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.minSeries +func transformMinSeries(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + return aggregateSeriesGeneric(ec, fe, "min") +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.minimumAbove +func transformMinimumAbove(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 2 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 2", len(args)) + } + n, err := getNumber(args, "n", 1) + if err != nil { + return nil, err + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + return filterSeriesGeneric(fe, nextSeries, "min", ">", n) +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.minimumBelow +func transformMinimumBelow(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 2 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 2", len(args)) + } + n, err := getNumber(args, "n", 1) + if err != nil { + return nil, err + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + return filterSeriesGeneric(fe, nextSeries, "min", "<", n) +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.mostDeviant +func transformMostDeviant(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 2 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 2", len(args)) + } + n, err := 
getNumber(args, "n", 1) + if err != nil { + return nil, err + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + return highestGeneric(ec, fe, nextSeries, n, "stddev") +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.movingAverage +func transformMovingAverage(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + return movingWindowGeneric(ec, fe, "average") +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.movingMax +func transformMovingMax(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + return movingWindowGeneric(ec, fe, "max") +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.movingMedian +func transformMovingMedian(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + return movingWindowGeneric(ec, fe, "median") +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.movingMin +func transformMovingMin(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + return movingWindowGeneric(ec, fe, "min") +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.movingSum +func transformMovingSum(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + return movingWindowGeneric(ec, fe, "sum") +} + +func movingWindowGeneric(ec *evalConfig, fe *graphiteql.FuncExpr, funcName string) (nextSeriesFunc, error) { + args := fe.Args + if len(args) < 2 || len(args) > 3 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 2 or 3", len(args)) + } + windowSizeArg, err := getArg(args, "windowSize", 1) + if err != nil { + return nil, err + } + xFilesFactor, err := getOptionalNumber(args, "xFilesFactor", 2, ec.xFilesFactor) + if err != nil { + return nil, err + } + seriesListArg, err := getArg(args, "seriesList", 0) + if err != nil { + return nil, err + } + return movingWindow(ec, fe, seriesListArg, windowSizeArg, funcName, xFilesFactor) +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.movingWindow +func transformMovingWindow(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) < 2 || len(args) > 4 { + return nil, fmt.Errorf("unexpected number of args; got %d; want from 2 to 4", len(args)) + } + windowSizeArg, err := getArg(args, "windowSize", 1) + if err != nil { + return nil, err + } + funcName, err := getOptionalString(args, "func", 2, "avg") + if err != nil { + return nil, err + } + xFilesFactor, err := getOptionalNumber(args, "xFilesFactor", 3, ec.xFilesFactor) + if err != nil { + return nil, err + } + seriesListArg, err := getArg(args, "seriesList", 0) + if err != nil { + return nil, err + } + return movingWindow(ec, fe, seriesListArg, windowSizeArg, funcName, xFilesFactor) +} + +func movingWindow(ec *evalConfig, fe *graphiteql.FuncExpr, seriesListArg, windowSizeArg *graphiteql.ArgExpr, funcName string, xFilesFactor float64) (nextSeriesFunc, error) { + windowSize, stepsCount, err := getWindowSize(ec, windowSizeArg) + if err != nil { + return nil, err + } + windowSizeStr := string(windowSizeArg.Expr.AppendString(nil)) + aggrFunc, err := getAggrFunc(funcName) + if err != nil { + return nil, err + } + ecCopy := *ec + ecCopy.startTime -= windowSize + nextSeries, err := evalExpr(&ecCopy, seriesListArg.Expr) + if err != nil { + return nil, err + } + step, err 
:= nextSeries.peekStep(ec.storageStep) + if err != nil { + return nil, err + } + if stepsCount > 0 && step != ec.storageStep { + // The inner function call changes the step and the moving* function refers to it. + // Adjust the startTime and re-calculate the inner function on the adjusted time range. + if _, err := drainAllSeries(nextSeries); err != nil { + return nil, err + } + windowSize = int64(stepsCount * float64(step)) + ecCopy = *ec + ecCopy.startTime -= windowSize + nextSeries, err = evalExpr(&ecCopy, seriesListArg.Expr) + if err != nil { + return nil, err + } + } + tagName := "moving" + strings.Title(funcName) + f := nextSeriesConcurrentWrapper(nextSeries, func(s *series) (*series, error) { + timestamps := s.Timestamps + values := s.Values + var dstTimestamps []int64 + var dstValues []float64 + tsEnd := ecCopy.startTime + windowSize + i := 0 + j := 0 + for tsEnd <= ecCopy.endTime { + tsStart := tsEnd - windowSize + for i < len(timestamps) && timestamps[i] < tsStart { + i++ + } + if i > j { + j = i + } + for j < len(timestamps) && timestamps[j] < tsEnd { + j++ + } + v := aggrFunc.apply(xFilesFactor, values[i:j]) + dstTimestamps = append(dstTimestamps, tsEnd) + dstValues = append(dstValues, v) + tsEnd += step + } + s.Timestamps = dstTimestamps + s.Values = dstValues + s.Tags[tagName] = windowSizeStr + s.Name = fmt.Sprintf("%s(%s,%s)", tagName, s.Name, windowSizeStr) + s.expr = fe + s.pathExpression = s.Name + return s, nil + }) + return f, nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.multiplySeries +func transformMultiplySeries(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + return aggregateSeriesGeneric(ec, fe, "multiply") +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.multiplySeriesWithWildcards +func transformMultiplySeriesWithWildcards(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + return aggregateSeriesWithWildcardsGeneric(ec, fe, "multiply") +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.percentileOfSeries +func transformPercentileOfSeries(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) < 2 || len(args) > 3 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 2 or 3", len(args)) + } + n, err := getNumber(args, "n", 1) + if err != nil { + return nil, err + } + // TODO: properly use interpolate + if _, err := getOptionalBool(args, "interpolate", 2, false); err != nil { + return nil, err + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + step, err := nextSeries.peekStep(ec.storageStep) + if err != nil { + return nil, err + } + as := newAggrStatePercentile(ec.pointsLen(step), n) + var lock sync.Mutex + var seriesExpressions []string + f := nextSeriesConcurrentWrapper(nextSeries, func(s *series) (*series, error) { + s.consolidate(ec, step) + lock.Lock() + as.Update(s.Values) + + seriesExpressions = append(seriesExpressions, s.pathExpression) + lock.Unlock() + return s, nil + }) + if _, err := drainAllSeries(f); err != nil { + return nil, err + } + if len(seriesExpressions) == 0 { + return multiSeriesFunc(nil), nil + } + // peek first expr as graphite does. 
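+ // Sorting makes the result name deterministic: the series is named after the lexicographically smallest path expression.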
+ sort.Strings(seriesExpressions) + name := fmt.Sprintf("percentileOfSeries(%s,%g)", seriesExpressions[0], n) + s := &series{ + Name: name, + Tags: map[string]string{"name": name}, + Timestamps: ec.newTimestamps(step), + Values: as.Finalize(ec.xFilesFactor), + expr: fe, + pathExpression: name, + step: step, + } + return singleSeriesFunc(s), nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.pow +func transformPow(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 2 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 2", len(args)) + } + factor, err := getNumber(args, "factor", 1) + if err != nil { + return nil, err + } + factorStr := fmt.Sprintf("%g", factor) + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + f := nextSeriesConcurrentWrapper(nextSeries, func(s *series) (*series, error) { + values := s.Values + for i, v := range values { + values[i] = math.Pow(v, factor) + } + s.Tags["pow"] = factorStr + s.Name = fmt.Sprintf("pow(%s,%s)", s.Name, factorStr) + s.expr = fe + s.pathExpression = s.Name + return s, nil + }) + return f, nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.powSeries +func transformPowSeries(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + return aggregateSeriesGeneric(ec, fe, "pow") +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.randomWalk +func transformRandomWalk(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) < 1 || len(args) > 2 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 1 or 2", len(args)) + } + name, err := getString(args, "name", 0) + if err != nil { + return nil, err + } + step, err := getOptionalNumber(args, "step", 1, 60) + if err != nil { + return nil, err + } + if step <= 0 { + return nil, fmt.Errorf("step must be positive; got %g", step) + } + stepMsecs := int64(step * 1000) + var dstValues []float64 + var dstTimestamps []int64 + ts := ec.startTime + v := float64(0) + for ts < ec.endTime { + dstValues = append(dstValues, v) + dstTimestamps = append(dstTimestamps, ts) + v += rand.Float64() - 0.5 + ts += stepMsecs + } + s := &series{ + Name: name, + Tags: unmarshalTags(name), + Timestamps: dstTimestamps, + Values: dstValues, + expr: fe, + pathExpression: name, + step: stepMsecs, + } + return singleSeriesFunc(s), nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.rangeOfSeries +func transformRangeOfSeries(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + return aggregateSeriesGeneric(ec, fe, "rangeOf") +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.removeAbovePercentile +func transformRemoveAbovePercentile(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 2 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 2", len(args)) + } + n, err := getNumber(args, "n", 1) + if err != nil { + return nil, err + } + aggrFunc := newAggrFuncPercentile(n) + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + f := nextSeriesConcurrentWrapper(nextSeries, func(s *series) (*series, error) { + values := s.Values + max := aggrFunc(values) + for i, v := range values { + if v > max { + 
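+ // Every point above the n-th percentile of the series is replaced with NaN.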
values[i] = nan + } + } + s.Name = fmt.Sprintf("removeAbovePercentile(%s,%g)", s.Name, n) + s.expr = fe + s.pathExpression = s.Name + return s, nil + }) + return f, nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.removeAboveValue +func transformRemoveAboveValue(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 2 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 2", len(args)) + } + n, err := getNumber(args, "n", 1) + if err != nil { + return nil, err + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + f := nextSeriesConcurrentWrapper(nextSeries, func(s *series) (*series, error) { + values := s.Values + for i, v := range values { + if v > n { + values[i] = nan + } + } + s.Name = fmt.Sprintf("removeAboveValue(%s,%g)", s.Name, n) + s.expr = fe + s.pathExpression = s.Name + return s, nil + }) + return f, nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.removeBelowPercentile +func transformRemoveBelowPercentile(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 2 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 2", len(args)) + } + n, err := getNumber(args, "n", 1) + if err != nil { + return nil, err + } + aggrFunc := newAggrFuncPercentile(n) + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + f := nextSeriesConcurrentWrapper(nextSeries, func(s *series) (*series, error) { + values := s.Values + min := aggrFunc(values) + for i, v := range values { + if v < min { + values[i] = nan + } + } + s.Name = fmt.Sprintf("removeBelowPercentile(%s,%g)", s.Name, n) + s.expr = fe + s.pathExpression = s.Name + return s, nil + }) + return f, nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.removeBelowValue +func transformRemoveBelowValue(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 2 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 2", len(args)) + } + n, err := getNumber(args, "n", 1) + if err != nil { + return nil, err + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + f := nextSeriesConcurrentWrapper(nextSeries, func(s *series) (*series, error) { + values := s.Values + for i, v := range values { + if v < n { + values[i] = nan + } + } + s.Name = fmt.Sprintf("removeBelowValue(%s,%g)", s.Name, n) + s.expr = fe + s.pathExpression = s.Name + return s, nil + }) + return f, nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.removeBetweenPercentile +func transformRemoveBetweenPercentile(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 2 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 2", len(args)) + } + n, err := getNumber(args, "n", 1) + if err != nil { + return nil, err + } + if n > 50 { + n = 100 - n + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + step, err := nextSeries.peekStep(ec.storageStep) + if err != nil { + return nil, err + } + var ss []*series + asLow := newAggrStatePercentile(ec.pointsLen(step), n) + asHigh := newAggrStatePercentile(ec.pointsLen(step), 100-n) + var lock sync.Mutex + f := 
nextSeriesConcurrentWrapper(nextSeries, func(s *series) (*series, error) { + s.consolidate(ec, step) + lock.Lock() + asLow.Update(s.Values) + asHigh.Update(s.Values) + ss = append(ss, s) + lock.Unlock() + return s, nil + }) + if _, err := drainAllSeries(f); err != nil { + return nil, err + } + lows := asLow.Finalize(ec.xFilesFactor) + highs := asHigh.Finalize(ec.xFilesFactor) + var ssDst []*series + for _, s := range ss { + values := s.Values + for i, v := range values { + if v < lows[i] || v > highs[i] { + s.expr = fe + ssDst = append(ssDst, s) + break + } + } + } + return multiSeriesFunc(ssDst), nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.removeEmptySeries +func transformRemoveEmptySeries(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) < 1 || len(args) > 2 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 1 or 2", len(args)) + } + xFilesFactor, err := getOptionalNumber(args, "xFilesFactor", 1, ec.xFilesFactor) + if err != nil { + return nil, err + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + f := nextSeriesConcurrentWrapper(nextSeries, func(s *series) (*series, error) { + xff := s.xFilesFactor + if xff == 0 { + xff = xFilesFactor + } + n := aggrCount(s.Values) + if n/float64(len(s.Values)) < xff { + return nil, nil + } + s.expr = fe + return s, nil + }) + return f, nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.roundFunction +func transformRoundFunction(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) < 1 || len(args) > 2 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 1 or 2", len(args)) + } + precision, err := getOptionalNumber(args, "precision", 1, 0) + if err != nil { + return nil, err + } + precisionProduct := math.Pow10(int(precision)) + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + f := nextSeriesConcurrentWrapper(nextSeries, func(s *series) (*series, error) { + values := s.Values + for i, v := range values { + values[i] = math.Round(v*precisionProduct) / precisionProduct + } + if precision == 0 { + s.Name = fmt.Sprintf("round(%s)", s.Name) + } else { + s.Name = fmt.Sprintf("round(%s,%g)", s.Name, precision) + } + s.expr = fe + s.pathExpression = s.Name + return s, nil + }) + return f, nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.scale +func transformScale(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 2 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 2", len(args)) + } + factor, err := getNumber(args, "factor", 1) + if err != nil { + return nil, err + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + f := nextSeriesConcurrentWrapper(nextSeries, func(s *series) (*series, error) { + values := s.Values + for i, v := range values { + values[i] = v * factor + } + s.Name = fmt.Sprintf("scale(%s,%g)", s.Name, factor) + s.expr = fe + s.pathExpression = s.Name + return s, nil + }) + return f, nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.seriesByTag +func transformSeriesByTag(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) == 0 { + return 
nil, fmt.Errorf("at least one tagExpression must be passed to seriesByTag") + } + var tagExpressions []string + for i := 0; i < len(args); i++ { + te, err := getString(args, "tagExpressions", i) + if err != nil { + return nil, err + } + tagExpressions = append(tagExpressions, te) + } + sq, err := getSearchQueryForExprs(ec.currentTime, ec.etfs, tagExpressions, *maxGraphiteSeries) + if err != nil { + return nil, err + } + sq.MinTimestamp = ec.startTime + sq.MaxTimestamp = ec.endTime + return newNextSeriesForSearchQuery(ec, sq, fe) +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.setXFilesFactor +func transformSetXFilesFactor(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 2 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 2", len(args)) + } + xFilesFactor, err := getNumber(args, "xFilesFactor", 1) + if err != nil { + return nil, err + } + ecCopy := *ec + ecCopy.xFilesFactor = xFilesFactor + nextSeries, err := evalSeriesList(&ecCopy, args, "seriesList", 0) + if err != nil { + return nil, err + } + xFilesFactorStr := fmt.Sprintf("%g", xFilesFactor) + f := nextSeriesSerialWrapper(nextSeries, func(s *series) (*series, error) { + s.xFilesFactor = xFilesFactor + s.Tags["xFilesFactor"] = xFilesFactorStr + s.expr = fe + return s, nil + }) + return f, nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.sumSeriesWithWildcards +func transformSumSeriesWithWildcards(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + return aggregateSeriesWithWildcardsGeneric(ec, fe, "sum") +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.summarize +func transformSummarize(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) < 2 || len(args) > 4 { + return nil, fmt.Errorf("unexpected number of args; got %d; want from 2 to 4", len(args)) + } + intervalString, err := getString(args, "intervalString", 1) + if err != nil { + return nil, err + } + interval, err := parseInterval(intervalString) + if err != nil { + return nil, fmt.Errorf("cannot parse intervalString: %w", err) + } + if interval <= 0 { + return nil, fmt.Errorf("interval must be positive; got %dms", interval) + } + funcName, err := getOptionalString(args, "func", 2, "sum") + if err != nil { + return nil, err + } + aggrFunc, err := getAggrFunc(funcName) + if err != nil { + return nil, err + } + alignToFrom, err := getOptionalBool(args, "alignToFrom", 3, false) + if err != nil { + return nil, err + } + ecCopy := *ec + if !alignToFrom { + ecCopy.startTime -= ecCopy.startTime % interval + ecCopy.endTime += interval - ecCopy.endTime%interval + } + nextSeries, err := evalSeriesList(&ecCopy, args, "seriesList", 0) + if err != nil { + return nil, err + } + f := nextSeriesConcurrentWrapper(nextSeries, func(s *series) (*series, error) { + s.summarize(aggrFunc, ecCopy.startTime, ecCopy.endTime, interval, s.xFilesFactor) + s.Tags["summarize"] = intervalString + s.Tags["summarizeFunction"] = funcName + if alignToFrom { + s.Name = fmt.Sprintf("summarize(%s,%s,%s,true)", s.Name, graphiteql.QuoteString(intervalString), graphiteql.QuoteString(funcName)) + } else { + s.Name = fmt.Sprintf("summarize(%s,%s,%s)", s.Name, graphiteql.QuoteString(intervalString), graphiteql.QuoteString(funcName)) + } + s.expr = fe + s.pathExpression = s.Name + return s, nil + }) + return f, nil +} + +// See 
https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.weightedAverage
+func transformWeightedAverage(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) {
+ args := fe.Args
+ if len(args) < 2 {
+ return nil, fmt.Errorf("unexpected number of args; got %d; want at least 2", len(args))
+ }
+ nodes, err := getNodes(args[2:])
+ if err != nil {
+ return nil, err
+ }
+ avgSeries, err := evalSeriesList(ec, args, "seriesListAvg", 0)
+ if err != nil {
+ return nil, err
+ }
+ ss, stepAvg, err := fetchNormalizedSeries(ec, avgSeries, false)
+ if err != nil {
+ return nil, err
+ }
+ weightSeries, err := evalSeriesList(ec, args, "seriesListWeight", 1)
+ if err != nil {
+ return nil, err
+ }
+ ssWeight, stepWeight, err := fetchNormalizedSeries(ec, weightSeries, false)
+ if err != nil {
+ return nil, err
+ }
+ if len(ss) != len(ssWeight) {
+ return nil, fmt.Errorf("series count mismatch; seriesListAvg: %d, seriesListWeight: %d", len(ss), len(ssWeight))
+ }
+ if stepAvg != stepWeight {
+ return nil, fmt.Errorf("step mismatch for seriesListAvg and seriesListWeight: %d vs %d", stepAvg, stepWeight)
+ }
+ // Multiply every avg series by the weight series sharing the same node key.
+ mAvg := groupSeriesByNodes(ss, nodes)
+ mWeight := groupSeriesByNodes(ssWeight, nodes)
+ var ssProduct []*series
+ for k, ss := range mAvg {
+ wss := mWeight[k]
+ if len(wss) == 0 {
+ continue
+ }
+ s := ss[len(ss)-1]
+ ws := wss[len(wss)-1]
+ values := s.Values
+ valuesWeight := ws.Values
+ for i, v := range values {
+ values[i] = v * valuesWeight[i]
+ }
+ ssProduct = append(ssProduct, s)
+ }
+ if len(ssProduct) == 0 {
+ return multiSeriesFunc(nil), nil
+ }
+
+ step := stepAvg
+ as := newAggrStateSum(ec.pointsLen(step))
+ for _, s := range ssProduct {
+ as.Update(s.Values)
+ }
+ values := as.Finalize(ec.xFilesFactor)
+
+ asWeight := newAggrStateSum(ec.pointsLen(step))
+ for _, s := range ssWeight {
+ asWeight.Update(s.Values)
+ }
+ valuesWeight := asWeight.Finalize(ec.xFilesFactor)
+
+ // Divide the summed products by the summed weights in order to get the weighted average.
+ for i, v := range values {
+ values[i] = v / valuesWeight[i]
+ }
+
+ var nodesStr []string
+ for _, node := range nodes {
+ nodesStr = append(nodesStr, string(node.AppendString(nil)))
+ }
+ name := fmt.Sprintf("weightedAverage(%s,%s,%s)",
+ formatPathsFromSeries(ss),
+ formatPathsFromSeries(ssWeight),
+ strings.Join(nodesStr, ","),
+ )
+ sResult := &series{
+ Name: name,
+ Tags: map[string]string{"name": name},
+ Timestamps: ec.newTimestamps(step),
+ Values: values,
+ expr: fe,
+ pathExpression: name,
+ step: step,
+ }
+ return singleSeriesFunc(sResult), nil
+}
+
+// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.timeFunction
+func transformTimeFunction(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) {
+ args := fe.Args
+ if len(args) < 1 || len(args) > 2 {
+ return nil, fmt.Errorf("unexpected number of args: %d; expecting 1 or 2 args", len(args))
+ }
+ name, err := getString(args, "name", 0)
+ if err != nil {
+ return nil, err
+ }
+ step, err := getOptionalNumber(args, "step", 1, 60)
+ if err != nil {
+ return nil, err
+ }
+ stepMsecs := int64(step * 1000)
+ var values []float64
+ var timestamps []int64
+ ts := ec.startTime
+ for ts <= ec.endTime {
+ timestamps = append(timestamps, ts)
+ values = append(values, float64(ts/1000))
+ ts += stepMsecs
+ }
+ s := &series{
+ Name: name,
+ Tags: unmarshalTags(name),
+ Timestamps: timestamps,
+ Values: values,
+ expr: fe,
+ pathExpression: name,
+ step: stepMsecs,
+ }
+ return singleSeriesFunc(s), nil
+}
+
+func getWindowSize(ec *evalConfig, windowSizeArg *graphiteql.ArgExpr) 
(windowSize int64, stepsCount float64, err error) { + switch t := windowSizeArg.Expr.(type) { + case *graphiteql.NumberExpr: + stepsCount = t.N + windowSize = int64(t.N * float64(ec.storageStep)) + case *graphiteql.StringExpr: + ws, err := parseInterval(t.S) + if err != nil { + return 0, 0, fmt.Errorf("cannot parse windowSize: %w", err) + } + windowSize = ws + default: + return 0, 0, fmt.Errorf("unexpected type for windowSize arg: %T; expecting number or string", windowSizeArg.Expr) + } + if windowSize <= 0 { + return 0, 0, fmt.Errorf("windowSize must be positive; got %dms", windowSize) + } + return windowSize, stepsCount, nil +} + +func getArg(args []*graphiteql.ArgExpr, name string, index int) (*graphiteql.ArgExpr, error) { + for _, arg := range args { + if arg.Name == name { + return arg, nil + } + } + if index >= len(args) { + return nil, fmt.Errorf("missing arg %q at position %d", name, index) + } + arg := args[index] + if arg.Name != "" { + return nil, fmt.Errorf("unexpected named arg at position %d: %q", index, arg.Name) + } + return arg, nil +} + +func getOptionalArg(args []*graphiteql.ArgExpr, name string, index int) *graphiteql.ArgExpr { + for _, arg := range args { + if arg.Name == name { + return arg + } + } + if index >= len(args) { + return nil + } + arg := args[index] + if arg.Name != "" { + return nil + } + return arg +} + +func evalSeriesList(ec *evalConfig, args []*graphiteql.ArgExpr, name string, index int) (nextSeriesFunc, error) { + arg, err := getArg(args, name, index) + if err != nil { + return nil, err + } + nextSeries, err := evalExpr(ec, arg.Expr) + if err != nil { + return nil, fmt.Errorf("cannot evaluate arg %q at position %d: %w", name, index, err) + } + return nextSeries, nil +} + +func getInts(args []*graphiteql.ArgExpr, name string) ([]int, error) { + var ns []int + for i := range args { + n, err := getNumber(args, name, i) + if err != nil { + return nil, err + } + ns = append(ns, int(n)) + } + return ns, nil +} + +func getNumber(args []*graphiteql.ArgExpr, name string, index int) (float64, error) { + arg, err := getArg(args, name, index) + if err != nil { + return 0, err + } + ne, ok := arg.Expr.(*graphiteql.NumberExpr) + if !ok { + return 0, fmt.Errorf("arg %q at position %d must be a number; got %T", name, index, arg.Expr) + } + return ne.N, nil +} + +func getOptionalNumber(args []*graphiteql.ArgExpr, name string, index int, defaultValue float64) (float64, error) { + arg := getOptionalArg(args, name, index) + if arg == nil { + return defaultValue, nil + } + if _, ok := arg.Expr.(*graphiteql.NoneExpr); ok { + return defaultValue, nil + } + ne, ok := arg.Expr.(*graphiteql.NumberExpr) + if !ok { + return 0, fmt.Errorf("arg %q at position %d must be a number; got %T", name, index, arg.Expr) + } + return ne.N, nil +} + +func getString(args []*graphiteql.ArgExpr, name string, index int) (string, error) { + arg, err := getArg(args, name, index) + if err != nil { + return "", err + } + se, ok := arg.Expr.(*graphiteql.StringExpr) + if !ok { + return "", fmt.Errorf("arg %q at position %d must be a string; got %T", name, index, arg.Expr) + } + return se.S, nil +} + +func getOptionalString(args []*graphiteql.ArgExpr, name string, index int, defaultValue string) (string, error) { + arg := getOptionalArg(args, name, index) + if arg == nil { + return defaultValue, nil + } + if _, ok := arg.Expr.(*graphiteql.NoneExpr); ok { + return defaultValue, nil + } + se, ok := arg.Expr.(*graphiteql.StringExpr) + if !ok { + return "", fmt.Errorf("arg %q at position %d must be a 
string; got %T", name, index, arg.Expr) + } + return se.S, nil +} + +func getOptionalBool(args []*graphiteql.ArgExpr, name string, index int, defaultValue bool) (bool, error) { + arg := getOptionalArg(args, name, index) + if arg == nil { + return defaultValue, nil + } + if _, ok := arg.Expr.(*graphiteql.NoneExpr); ok { + return defaultValue, nil + } + be, ok := arg.Expr.(*graphiteql.BoolExpr) + if !ok { + return false, fmt.Errorf("arg %q at position %d must be a bool; got %T", name, index, arg.Expr) + } + return be.B, nil +} + +func getRegexp(args []*graphiteql.ArgExpr, name string, index int) (*regexp.Regexp, error) { + search, err := getString(args, name, index) + if err != nil { + return nil, err + } + re, err := regexp.Compile(search) + if err != nil { + return nil, fmt.Errorf("cannot compile search regexp %q: %w", search, err) + } + return re, nil +} + +func getRegexpReplacement(args []*graphiteql.ArgExpr, name string, index int) (string, error) { + replace, err := getString(args, name, index) + if err != nil { + return "", err + } + return graphiteToGolangRegexpReplace(replace), nil +} + +func graphiteToGolangRegexpReplace(replace string) string { + return graphiteToGolangRe.ReplaceAllString(replace, "$$${1}") +} + +var graphiteToGolangRe = regexp.MustCompile(`\\(\d+)`) + +func getNodes(args []*graphiteql.ArgExpr) ([]graphiteql.Expr, error) { + var nodes []graphiteql.Expr + for i := 0; i < len(args); i++ { + expr := args[i].Expr + switch expr.(type) { + case *graphiteql.NumberExpr, *graphiteql.StringExpr: + default: + return nil, fmt.Errorf("unexpected arg type for `nodes`; got %T; expecting number or string", expr) + } + nodes = append(nodes, expr) + } + return nodes, nil +} + +func fetchNormalizedSeriesByNodes(ec *evalConfig, nextSeries nextSeriesFunc, nodes []graphiteql.Expr) (map[string][]*series, int64, error) { + step, err := nextSeries.peekStep(ec.storageStep) + if err != nil { + return nil, 0, err + } + f := nextSeriesConcurrentWrapper(nextSeries, func(s *series) (*series, error) { + s.consolidate(ec, step) + return s, nil + }) + ss, err := fetchAllSeries(f) + if err != nil { + return nil, 0, err + } + return groupSeriesByNodes(ss, nodes), step, nil +} + +func groupSeriesByNodes(ss []*series, nodes []graphiteql.Expr) map[string][]*series { + m := make(map[string][]*series) + for _, s := range ss { + key := getNameFromNodes(s.Name, s.Tags, nodes) + m[key] = append(m[key], s) + } + return m +} + +func getNameFromNodes(name string, tags map[string]string, nodes []graphiteql.Expr) string { + if len(nodes) == 0 { + return "" + } + path := getPathFromName(name) + parts := strings.Split(path, ".") + var dstParts []string + for _, node := range nodes { + switch t := node.(type) { + case *graphiteql.NumberExpr: + if n := int(t.N); n >= 0 && n < len(parts) { + dstParts = append(dstParts, parts[n]) + } + case *graphiteql.StringExpr: + if v := tags[t.S]; v != "" { + dstParts = append(dstParts, v) + } + } + } + return strings.Join(dstParts, ".") +} + +func getPathFromName(s string) string { + expr, err := graphiteql.Parse(s) + if err != nil { + return s + } + for { + switch t := expr.(type) { + case *graphiteql.MetricExpr: + return t.Query + case *graphiteql.FuncExpr: + for _, arg := range t.Args { + if me, ok := arg.Expr.(*graphiteql.MetricExpr); ok { + return me.Query + } + } + if len(t.Args) == 0 { + return s + } + expr = t.Args[0].Expr + case *graphiteql.StringExpr: + return t.S + case *graphiteql.NumberExpr: + return string(t.AppendString(nil)) + case *graphiteql.BoolExpr: + return 
strconv.FormatBool(t.B) + default: + return s + } + } +} + +func fetchNormalizedSeries(ec *evalConfig, nextSeries nextSeriesFunc, isConcurrent bool) ([]*series, int64, error) { + step, err := nextSeries.peekStep(ec.storageStep) + if err != nil { + return nil, 0, err + } + nextSeriesWrapper := getNextSeriesWrapper(isConcurrent) + f := nextSeriesWrapper(nextSeries, func(s *series) (*series, error) { + s.consolidate(ec, step) + return s, nil + }) + ss, err := fetchAllSeries(f) + if err != nil { + return nil, 0, err + } + return ss, step, nil +} + +func fetchAllSeries(nextSeries nextSeriesFunc) ([]*series, error) { + var ss []*series + for { + s, err := nextSeries() + if err != nil { + return nil, err + } + if s == nil { + return ss, nil + } + ss = append(ss, s) + } +} + +func drainAllSeries(nextSeries nextSeriesFunc) (int, error) { + seriesCount := 0 + for { + s, err := nextSeries() + if err != nil { + return seriesCount, err + } + if s == nil { + return seriesCount, nil + } + seriesCount++ + } +} + +func singleSeriesFunc(s *series) nextSeriesFunc { + return multiSeriesFunc([]*series{s}) +} + +func multiSeriesFunc(ss []*series) nextSeriesFunc { + for _, s := range ss { + if s == nil { + panic(fmt.Errorf("BUG: all the series passed to multiSeriesFunc must be non-nil")) + } + } + f := func() (*series, error) { + if len(ss) == 0 { + return nil, nil + } + s := ss[0] + ss = ss[1:] + return s, nil + } + return f +} + +func nextSeriesGroup(nextSeriess []nextSeriesFunc, expr graphiteql.Expr) nextSeriesFunc { + f := func() (*series, error) { + for { + if len(nextSeriess) == 0 { + return nil, nil + } + nextSeries := nextSeriess[0] + s, err := nextSeries() + if err != nil { + for _, f := range nextSeriess[1:] { + _, _ = drainAllSeries(f) + } + nextSeriess = nil + return nil, err + } + if s != nil { + if expr != nil { + s.expr = expr + } + return s, nil + } + nextSeriess = nextSeriess[1:] + } + } + return f +} + +func getNextSeriesWrapperForAggregateFunc(funcName string) func(nextSeriesFunc, func(s *series) (*series, error)) nextSeriesFunc { + isConcurrent := !isSerialFunc(funcName) + return getNextSeriesWrapper(isConcurrent) +} + +func isSerialFunc(funcName string) bool { + switch funcName { + case "diff", "first", "last", "current", "pow": + return true + } + return false +} + +func getNextSeriesWrapper(isConcurrent bool) func(nextSeriesFunc, func(s *series) (*series, error)) nextSeriesFunc { + if isConcurrent { + return nextSeriesConcurrentWrapper + } + return nextSeriesSerialWrapper +} + +// nextSeriesSerialWrapper serially fetches series from nextSeries and passes them to f. +// +// see nextSeriesConcurrentWrapper for CPU-bound f. +// +// If f returns (nil, nil), then the current series is skipped. +// If f returns non-nil error, then nextSeries is drained with drainAllSeries. +func nextSeriesSerialWrapper(nextSeries nextSeriesFunc, f func(s *series) (*series, error)) nextSeriesFunc { + wrapper := func() (*series, error) { + for { + s, err := nextSeries() + if err != nil { + return nil, err + } + if s == nil { + return nil, nil + } + sNew, err := f(s) + if err != nil { + _, _ = drainAllSeries(nextSeries) + return nil, err + } + if sNew != nil { + return sNew, nil + } + } + } + return wrapper +} + +// nextSeriesConcurrentWrapper fetches multiple series from nextSeries and calls f on these series from concurrent goroutines. +// +// This function is useful for parallelizing CPU-bound f across available CPU cores. +// f must be goroutine-safe, since it is called from multiple concurrent goroutines. 
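+// The work is fanned out to one goroutine per available CPU, so the order of the returned series is not guaranteed.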
+// See nextSeriesSerialWrapper for serial calls to f. +// +// If f returns (nil, nil), then the current series is skipped. +// If f returns non-nil error, then nextSeries is drained. +// +// nextSeries is called serially. +func nextSeriesConcurrentWrapper(nextSeries nextSeriesFunc, f func(s *series) (*series, error)) nextSeriesFunc { + goroutines := cgroup.AvailableCPUs() + type result struct { + s *series + err error + } + resultCh := make(chan *result, goroutines) + seriesCh := make(chan *series, goroutines) + errCh := make(chan error, 1) + var wg sync.WaitGroup + wg.Add(goroutines) + go func() { + var err error + for { + s, e := nextSeries() + if e != nil || s == nil { + err = e + break + } + seriesCh <- s + } + close(seriesCh) + wg.Wait() + close(resultCh) + errCh <- err + close(errCh) + }() + var skipProcessing uint32 + for i := 0; i < goroutines; i++ { + go func() { + defer wg.Done() + for s := range seriesCh { + if atomic.LoadUint32(&skipProcessing) != 0 { + continue + } + sNew, err := f(s) + if err != nil { + // Drain the rest of series and do not call f for them in order to conserve CPU time. + atomic.StoreUint32(&skipProcessing, 1) + resultCh <- &result{ + err: err, + } + } else if sNew != nil { + resultCh <- &result{ + s: sNew, + } + } + } + }() + } + wrapper := func() (*series, error) { + r := <-resultCh + if r == nil { + err := <-errCh + return nil, err + } + if r.err != nil { + // Drain the rest of series before returning the error. + for range resultCh { + } + <-errCh + return nil, r.err + } + if r.s == nil { + panic(fmt.Errorf("BUG: r.s must be non-nil")) + } + return r.s, nil + } + return wrapper +} + +func newZeroSeriesFunc() nextSeriesFunc { + f := func() (*series, error) { + return nil, nil + } + return f +} + +func unmarshalTags(s string) map[string]string { + if len(s) == 0 { + return make(map[string]string) + } + tmp := strings.Split(s, ";") + m := make(map[string]string, len(tmp)) + m["name"] = tmp[0] + for _, x := range tmp[1:] { + kv := strings.SplitN(x, "=", 2) + if len(kv) == 2 { + m[kv[0]] = kv[1] + } + } + return m +} + +func marshalTags(m map[string]string) string { + parts := make([]string, 0, len(m)) + parts = append(parts, m["name"]) + for k, v := range m { + if k != "name" { + parts = append(parts, k+"="+v) + } + } + sort.Strings(parts[1:]) + return strings.Join(parts, ";") +} + +func formatKeyFromTags(tags map[string]string, tagKeys map[string]struct{}, defaultName string) string { + newTags := make(map[string]string) + for key := range tagKeys { + newTags[key] = tags[key] + } + if _, ok := tagKeys["name"]; !ok { + newTags["name"] = defaultName + } + return marshalTags(newTags) +} + +func formatPathsFromSeries(ss []*series) string { + seriesExpressions := make([]string, len(ss)) + for i, s := range ss { + seriesExpressions[i] = s.pathExpression + } + return formatPathsFromSeriesExpressions(seriesExpressions, true) +} + +func formatAggrFuncForPercentSeriesNames(funcName string, seriesNames []string) string { + if len(seriesNames) == 0 { + return "None" + } + if len(seriesNames) == 1 { + return seriesNames[0] + } + return formatAggrFuncForSeriesNames(funcName, seriesNames) +} + +func formatAggrFuncForSeriesNames(funcName string, seriesNames []string) string { + if len(seriesNames) == 0 { + return "None" + } + sortPaths := !isSerialFunc(funcName) + return fmt.Sprintf("%sSeries(%s)", funcName, formatPathsFromSeriesExpressions(seriesNames, sortPaths)) +} + +func formatPathsFromSeriesExpressions(seriesExpressions []string, sortPaths bool) string { + if 
len(seriesExpressions) == 0 { + return "" + } + paths := make([]string, 0, len(seriesExpressions)) + visitedPaths := make(map[string]struct{}) + for _, path := range seriesExpressions { + if _, ok := visitedPaths[path]; ok { + continue + } + visitedPaths[path] = struct{}{} + paths = append(paths, path) + } + if sortPaths { + sort.Strings(paths) + } + return strings.Join(paths, ",") +} + +func newNaNSeries(ec *evalConfig, step int64) *series { + values := make([]float64, ec.pointsLen(step)) + for i := 0; i < len(values); i++ { + values[i] = nan + } + return &series{ + Tags: map[string]string{}, + Timestamps: ec.newTimestamps(step), + Values: values, + step: step, + } +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.verticalLine +func transformVerticalLine(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) < 1 || len(args) > 3 { + return nil, fmt.Errorf("unexpected number of args: %d; expecting 1, 2 or 3 args", len(args)) + } + tsArg, err := getString(args, "ts", 0) + if err != nil { + return nil, err + } + ts, err := parseTime(ec.currentTime, tsArg) + if err != nil { + return nil, err + } + name, err := getOptionalString(args, "label", 1, "") + if err != nil { + return nil, err + } + start := ec.startTime + if ts < start { + return nil, fmt.Errorf("verticalLine(): timestamp %d exists before start of range: %d", ts, start) + } + end := ec.endTime + if ts > end { + return nil, fmt.Errorf("verticalLine(): timestamp %d exists after end of range: %d", ts, end) + } + s := &series{ + Name: name, + Tags: unmarshalTags(name), + Timestamps: []int64{ts, ts}, + Values: []float64{1.0, 1.0}, + expr: fe, + pathExpression: name, + step: ec.endTime - ec.startTime, + } + return singleSeriesFunc(s), nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.useSeriesAbove +func transformUseSeriesAbove(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 4 { + return nil, fmt.Errorf("unexpected number of args: %d; expecting 4 args", len(args)) + } + value, err := getNumber(args, "value", 1) + if err != nil { + return nil, err + } + searchRe, err := getRegexp(args, "search", 2) + if err != nil { + return nil, err + } + replace, err := getRegexpReplacement(args, "replace", 3) + if err != nil { + return nil, err + } + ss, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + + var seriesNames []string + var lock sync.Mutex + f := nextSeriesConcurrentWrapper(ss, func(s *series) (*series, error) { + for _, v := range s.Values { + if v <= value { + continue + } + newName := searchRe.ReplaceAllString(s.Name, replace) + lock.Lock() + seriesNames = append(seriesNames, newName) + lock.Unlock() + break + } + return s, nil + }) + if _, err = drainAllSeries(f); err != nil { + return nil, err + } + query := fmt.Sprintf("group(%s)", strings.Join(seriesNames, ",")) + return execExpr(ec, query) +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.unique +func transformUnique(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + var uniqSeries []nextSeriesFunc + uniq := make(map[string]struct{}) + for i := range args { + nextSS, err := evalSeriesList(ec, args, "seriesList", i) + if err != nil { + for _, s := range uniqSeries { + _, _ = drainAllSeries(s) + } + return nil, err + } + // Use nextSeriesSerialWrapper in order to guarantee 
that the first series among duplicate series is returned. + nextUniq := nextSeriesSerialWrapper(nextSS, func(s *series) (*series, error) { + name := s.Name + if _, ok := uniq[name]; !ok { + uniq[name] = struct{}{} + return s, nil + } + return nil, nil + }) + uniqSeries = append(uniqSeries, nextUniq) + } + return nextSeriesGroup(uniqSeries, fe), nil +} + +// https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.transformNull +func transformTransformNull(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) < 1 || len(args) > 3 { + return nil, fmt.Errorf("unexpected number of args: %d; expecting 1,2 or 3 args", len(args)) + } + defaultValue, err := getOptionalNumber(args, "default", 1, 0) + if err != nil { + return nil, err + } + defaultStr := fmt.Sprintf("%g", defaultValue) + referenceSeries := getOptionalArg(args, "referenceSeries", 2) + if referenceSeries == nil { + // referenceSeries isn't set. Replace all NaNs with defaultValue. + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + f := nextSeriesConcurrentWrapper(nextSeries, func(s *series) (*series, error) { + values := s.Values + for i, v := range values { + if math.IsNaN(v) { + values[i] = defaultValue + } + } + s.Tags["transformNull"] = defaultStr + s.Name = fmt.Sprintf("transformNull(%s,%s)", s.Name, defaultStr) + s.expr = fe + s.pathExpression = s.Name + return s, nil + }) + return f, nil + } + + // referenceSeries is set. Replace NaNs with defaultValue only if referenceSeries has non-NaN value at the given point. + // Series must be normalized in order to match referenceSeries points. + nextRefSeries, err := evalExpr(ec, referenceSeries.Expr) + if err != nil { + return nil, fmt.Errorf("cannot evaluate referenceSeries: %w", err) + } + ssRef, step, err := fetchNormalizedSeries(ec, nextRefSeries, true) + if err != nil { + return nil, err + } + replaceNan := make([]bool, ec.pointsLen(step)) + for i := range replaceNan { + for _, sRef := range ssRef { + if !math.IsNaN(sRef.Values[i]) { + replaceNan[i] = true + break + } + } + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + f := nextSeriesConcurrentWrapper(nextSeries, func(s *series) (*series, error) { + s.consolidate(ec, step) + values := s.Values + for i, v := range values { + if replaceNan[i] && math.IsNaN(v) { + values[i] = defaultValue + } + } + s.Tags["transformNull"] = defaultStr + s.Tags["referenceSeries"] = "1" + s.Name = fmt.Sprintf("transformNull(%s,%s,referenceSeries)", s.Name, defaultStr) + s.expr = fe + s.pathExpression = s.Name + return s, nil + }) + return f, nil +} + +// https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.timeStack +func transformTimeStack(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) < 1 || len(args) > 4 { + return nil, fmt.Errorf("unexpected number of args: %d; expecting from 1 to 4 args", len(args)) + } + timeShiftUnit, err := getOptionalString(args, "timeShiftUnit", 1, "1d") + if err != nil { + return nil, err + } + delta, err := parseInterval(timeShiftUnit) + if err != nil { + return nil, err + } + if delta > 0 && !strings.HasPrefix(timeShiftUnit, "+") { + delta = -delta + } + start, err := getOptionalNumber(args, "timeShiftStart", 2, 0) + if err != nil { + return nil, err + } + end, err := getOptionalNumber(args, "timeShiftEnd", 3, 7) + if err != nil { + return nil, err + } + if 
end < start { + return nil, fmt.Errorf("timeShiftEnd=%g cannot be smaller than timeShiftStart=%g", end, start) + } + var allSeries []nextSeriesFunc + for shift := int64(start); shift <= int64(end); shift++ { + innerDelta := delta * shift + ecCopy := *ec + ecCopy.startTime = ecCopy.startTime + innerDelta + ecCopy.endTime = ecCopy.endTime + innerDelta + nextSS, err := evalSeriesList(&ecCopy, args, "seriesList", 0) + if err != nil { + for _, f := range allSeries { + _, _ = drainAllSeries(f) + } + return nil, err + } + shiftStr := fmt.Sprintf("%d", shift) + f := nextSeriesConcurrentWrapper(nextSS, func(s *series) (*series, error) { + timestamps := s.Timestamps + for i := range timestamps { + timestamps[i] -= innerDelta + } + s.Tags["timeShiftUnit"] = timeShiftUnit + s.Tags["timeShift"] = shiftStr + s.Name = fmt.Sprintf("timeShift(%s,%s,%s)", s.Name, timeShiftUnit, shiftStr) + s.expr = fe + s.pathExpression = s.Name + + return s, nil + }) + allSeries = append(allSeries, f) + } + return nextSeriesGroup(allSeries, fe), nil +} + +// https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.timeSlice +func transformTimeSlice(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) < 2 || len(args) > 3 { + return nil, fmt.Errorf("unexpected number of args: %d; expecting 2 or 3 args", len(args)) + } + startStr, err := getString(args, "startSliceAt", 1) + if err != nil { + return nil, err + } + start, err := parseTime(ec.currentTime, startStr) + if err != nil { + return nil, err + } + endStr, err := getOptionalString(args, "endSliceAt", 2, "now") + if err != nil { + return nil, err + } + end, err := parseTime(ec.currentTime, endStr) + if err != nil { + return nil, err + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + startSecsStr := fmt.Sprintf("%d", start/1000) + endSecsStr := fmt.Sprintf("%d", end/1000) + f := nextSeriesConcurrentWrapper(nextSeries, func(s *series) (*series, error) { + values := s.Values + timestamps := s.Timestamps + for i := range values { + if timestamps[i] < start || timestamps[i] > end { + values[i] = nan + } + } + s.Tags["timeSliceStart"] = startSecsStr + s.Tags["timeSliceEnd"] = endSecsStr + s.Name = fmt.Sprintf("timeSlice(%s,%s,%s)", s.Name, startSecsStr, endSecsStr) + + s.expr = fe + return s, nil + }) + return f, nil +} + +// https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.timeShift +func transformTimeShift(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) < 2 || len(args) > 4 { + return nil, fmt.Errorf("unexpected number of args: %d; expecting from 2 to 4 args", len(args)) + } + timeShiftStr, err := getString(args, "timeShift", 1) + if err != nil { + return nil, err + } + timeShift, err := parseInterval(timeShiftStr) + if err != nil { + return nil, err + } + if timeShift > 0 && !strings.HasPrefix(timeShiftStr, "+") { + timeShift = -timeShift + } + resetEnd, err := getOptionalBool(args, "resetEnd", 2, true) + if err != nil { + return nil, err + } + _, err = getOptionalBool(args, "alignDST", 3, false) + if err != nil { + return nil, err + } + // TODO: properly use alignDST + + ecCopy := *ec + ecCopy.startTime += timeShift + ecCopy.endTime += timeShift + nextSeries, err := evalSeriesList(&ecCopy, args, "seriesList", 0) + if err != nil { + return nil, err + } + f := nextSeriesConcurrentWrapper(nextSeries, func(s *series) (*series, error) { + if resetEnd { + for i, ts 
:= range s.Timestamps { + if ts > ec.endTime { + s.Timestamps = s.Timestamps[:i] + s.Values = s.Values[:i] + break + } + } + } + timestamps := s.Timestamps + for i := range timestamps { + timestamps[i] -= timeShift + } + s.Tags["timeShift"] = timeShiftStr + s.Name = fmt.Sprintf(`timeShift(%s,%s)`, s.Name, graphiteql.QuoteString(timeShiftStr)) + s.expr = fe + return s, nil + }) + return f, nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.nPercentile +func transformNPercentile(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 2 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 2", len(args)) + } + n, err := getNumber(args, "n", 1) + if err != nil { + return nil, err + } + nStr := fmt.Sprintf("%g", n) + aggrFunc := newAggrFuncPercentile(n) + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + f := nextSeriesConcurrentWrapper(nextSeries, func(s *series) (*series, error) { + values := s.Values + percentile := aggrFunc(values) + for i := range values { + values[i] = percentile + } + s.Tags["nPercentile"] = nStr + s.Name = fmt.Sprintf("nPercentile(%s,%s)", s.Name, nStr) + s.expr = fe + s.pathExpression = s.Name + return s, nil + }) + return f, nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.nonNegativeDerivative +func transformNonNegativeDerivative(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) < 1 || len(args) > 3 { + return nil, fmt.Errorf("unexpected number of args; got %d; want from 1 to 3", len(args)) + } + maxValue, err := getOptionalNumber(args, "maxValue", 1, nan) + if err != nil { + return nil, err + } + minValue, err := getOptionalNumber(args, "minValue", 2, nan) + if err != nil { + return nil, err + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + f := nextSeriesConcurrentWrapper(nextSeries, func(s *series) (*series, error) { + prev := nan + var delta float64 + values := s.Values + for i, v := range values { + delta, prev = nonNegativeDelta(v, prev, maxValue, minValue) + values[i] = delta + } + s.Tags["nonNegativeDerivative"] = "1" + s.Name = fmt.Sprintf(`nonNegativeDerivative(%s)`, s.Name) + s.expr = fe + s.pathExpression = s.Name + return s, nil + }) + return f, nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.offset +func transformOffset(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 2 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 2", len(args)) + } + factor, err := getNumber(args, "factor", 1) + if err != nil { + return nil, err + } + factorStr := fmt.Sprintf("%g", factor) + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + f := nextSeriesConcurrentWrapper(nextSeries, func(s *series) (*series, error) { + values := s.Values + for i, v := range values { + if !math.IsNaN(v) { + values[i] = v + factor + } + } + s.Tags["offset"] = factorStr + s.Name = fmt.Sprintf("offset(%s,%s)", s.Name, factorStr) + s.expr = fe + s.pathExpression = s.Name + return s, nil + }) + return f, nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.offsetToZero +func transformOffsetToZero(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + 
args := fe.Args + if len(args) != 1 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 1", len(args)) + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + f := nextSeriesConcurrentWrapper(nextSeries, func(s *series) (*series, error) { + values := s.Values + min := aggrMin(values) + for i, v := range values { + values[i] = v - min + } + s.Tags["offsetToZero"] = fmt.Sprintf("%g", min) + s.Name = fmt.Sprintf("offsetToZero(%s)", s.Name) + s.expr = fe + s.pathExpression = s.Name + return s, nil + }) + return f, nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.perSecond +func transformPerSecond(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) < 1 || len(args) > 3 { + return nil, fmt.Errorf("unexpected number of args: %d; expecting from 1 to 3 args", len(args)) + } + maxValue, err := getOptionalNumber(args, "maxValue", 1, nan) + if err != nil { + return nil, err + } + minValue, err := getOptionalNumber(args, "minValue", 2, nan) + if err != nil { + return nil, err + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + f := nextSeriesConcurrentWrapper(nextSeries, func(s *series) (*series, error) { + prev := nan + var delta float64 + values := s.Values + timestamps := s.Timestamps + for i, v := range values { + delta, prev = nonNegativeDelta(v, prev, maxValue, minValue) + stepSecs := nan + if i > 0 { + stepSecs = float64(timestamps[i]-timestamps[i-1]) / 1000 + } + values[i] = delta / stepSecs + } + s.Tags["perSecond"] = "1" + s.Name = fmt.Sprintf(`perSecond(%s)`, s.Name) + s.expr = fe + s.pathExpression = s.Name + return s, nil + }) + return f, nil +} + +func nonNegativeDelta(curr, prev, max, min float64) (float64, float64) { + if !math.IsNaN(max) && curr > max { + return nan, nan + } + if !math.IsNaN(min) && curr < min { + return nan, nan + } + if math.IsNaN(curr) || math.IsNaN(prev) { + return nan, curr + } + if curr >= prev { + return curr - prev, curr + } + if !math.IsNaN(max) { + if math.IsNaN(min) { + min = float64(0) + } + return max + 1 + curr - prev - min, curr + } + if !math.IsNaN(min) { + return curr - min, curr + } + return nan, curr +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.threshold +func transformThreshold(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) < 1 || len(args) > 3 { + return nil, fmt.Errorf("unexpected number of args: %d; expecting from 1 to 3 args", len(args)) + } + value, err := getNumber(args, "value", 0) + if err != nil { + return nil, err + } + label, err := getOptionalString(args, "label", 1, "") + if err != nil { + return nil, err + } + _, err = getOptionalString(args, "color", 2, "") + if err != nil { + return nil, err + } + nextSeries := constantLine(ec, fe, value) + if label == "" { + return nextSeries, nil + } + f := nextSeriesSerialWrapper(nextSeries, func(s *series) (*series, error) { + s.Name = label + s.expr = fe + return s, nil + }) + return f, nil +} + +// https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.sumSeries +func transformSumSeries(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + return aggregateSeriesGeneric(ec, fe, "sum") +} + +// https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.substr +func transformSubstr(ec *evalConfig, fe 
*graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) < 1 || len(args) > 3 { + return nil, fmt.Errorf("unexpected number of args: %d; expecting from 1 to 3 args", len(args)) + } + startf, err := getOptionalNumber(args, "start", 1, 0) + if err != nil { + return nil, err + } + stopf, err := getOptionalNumber(args, "stop", 2, 0) + if err != nil { + return nil, err + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + f := nextSeriesSerialWrapper(nextSeries, func(s *series) (*series, error) { + path := getPathFromName(s.Name) + splitName := strings.Split(path, ".") + start := int(startf) + stop := int(stopf) + if start > len(splitName) { + start = len(splitName) + } else if start < 0 { + start = len(splitName) + start + if start < 0 { + start = 0 + } + } + if stop == 0 { + stop = len(splitName) + } else if stop > len(splitName) { + stop = len(splitName) + } else if stop < 0 { + stop = len(splitName) + stop + if stop < 0 { + stop = 0 + } + } + if stop < start { + stop = start + } + s.Name = strings.Join(splitName[start:stop], ".") + s.expr = fe + return s, nil + }) + return f, nil +} + +// https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.stdev +func transformStdev(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) < 2 || len(args) > 3 { + return nil, fmt.Errorf("unexpected number of args: %d; expecting from 2 to 3 args", len(args)) + } + pointsf, err := getNumber(args, "points", 1) + if err != nil { + return nil, err + } + points := int(pointsf) + pointsStr := fmt.Sprintf("%d", points) + windowTolerance, err := getOptionalNumber(args, "windowTolerance", 2, 0.1) + if err != nil { + return nil, err + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + f := nextSeriesConcurrentWrapper(nextSeries, func(s *series) (*series, error) { + var sum, sum2 float64 + var n int + values := s.Values + dstValues := make([]float64, len(values)) + for i, v := range values { + if !math.IsNaN(v) { + n++ + sum += v + sum2 += v * v + } + if i >= points { + v = values[i-points] + if !math.IsNaN(v) { + n-- + sum -= v + sum2 -= v * v + } + } + stddev := nan + if n > 0 && float64(n)/pointsf >= windowTolerance { + stddev = math.Sqrt(float64(n)*sum2-sum*sum) / float64(n) + } + dstValues[i] = stddev + } + s.Values = dstValues + s.Tags["stdev"] = pointsStr + s.Name = fmt.Sprintf("stdev(%s,%s)", s.Name, pointsStr) + s.expr = fe + return s, nil + }) + return f, nil +} + +// https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.stddevSeries +func transformStddevSeries(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + return aggregateSeriesGeneric(ec, fe, "stddev") +} + +// https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.stacked +func transformStacked(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) < 1 || len(args) > 2 { + return nil, fmt.Errorf("unexpected number of args: %d; expecting from 1 to 2 args", len(args)) + } + stackName, err := getOptionalString(args, "stackName", 1, "__DEFAULT__") + if err != nil { + return nil, err + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + step, err := nextSeries.peekStep(ec.storageStep) + if err != nil { + return nil, err + } + totalStack := make([]float64, ec.pointsLen(step)) + // 
Use nextSeriesSerialWrapper instead of nextSeriesConcurrentWrapper for preserving the original order of series. + f := nextSeriesSerialWrapper(nextSeries, func(s *series) (*series, error) { + // Consolidation is needed in order to align points in time. Otherwise stacking has little sense. + s.consolidate(ec, step) + values := s.Values + for i, v := range values { + if !math.IsNaN(v) { + totalStack[i] += v + values[i] = totalStack[i] + } + } + if stackName == "__DEFAULT__" { + s.Tags["stacked"] = stackName + s.Name = fmt.Sprintf("stacked(%s)", s.Name) + } + s.expr = fe + s.pathExpression = s.Name + return s, nil + }) + return f, nil +} + +// https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.squareRoot +func transformSquareRoot(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 1 { + return nil, fmt.Errorf("unexpected number of args: %d; expecting 1 arg", len(args)) + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + f := nextSeriesConcurrentWrapper(nextSeries, func(s *series) (*series, error) { + values := s.Values + for i, v := range values { + values[i] = math.Pow(v, 0.5) + } + s.Tags["squareRoot"] = "1" + s.Name = fmt.Sprintf("squareRoot(%s)", s.Name) + s.expr = fe + s.pathExpression = s.Name + return s, nil + }) + return f, nil +} + +// https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.sortByTotal +func transformSortByTotal(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 1 { + return nil, fmt.Errorf("unexpected number of args: %d; expecting 1 arg", len(args)) + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + return sortByGeneric(ec, fe, nextSeries, "sum", true) +} + +// https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.sortBy +func transformSortBy(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) < 1 || len(args) > 3 { + return nil, fmt.Errorf("unexpected number of args: %d; expecting from 1 to 3 args", len(args)) + } + funcName, err := getOptionalString(args, "func", 1, "average") + if err != nil { + return nil, err + } + reverse, err := getOptionalBool(args, "reverse", 2, false) + if err != nil { + return nil, err + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + return sortByGeneric(ec, fe, nextSeries, funcName, reverse) +} + +func sortByGeneric(ec *evalConfig, fe *graphiteql.FuncExpr, nextSeries nextSeriesFunc, funcName string, reverse bool) (nextSeriesFunc, error) { + aggrFunc, err := getAggrFunc(funcName) + if err != nil { + _, _ = drainAllSeries(nextSeries) + return nil, err + } + var sws []seriesWithWeight + var ssLock sync.Mutex + f := nextSeriesConcurrentWrapper(nextSeries, func(s *series) (*series, error) { + v := aggrFunc(s.Values) + if math.IsNaN(v) { + v = math.Inf(-1) + } + s.expr = fe + ssLock.Lock() + sws = append(sws, seriesWithWeight{ + s: s, + v: v, + }) + ssLock.Unlock() + return s, nil + }) + if _, err := drainAllSeries(f); err != nil { + return nil, err + } + sort.Slice(sws, func(i, j int) bool { + left := sws[i].v + right := sws[j].v + if reverse { + left, right = right, left + } + return left < right + }) + ss := make([]*series, len(sws)) + for i, sw := range sws { + ss[i] = sw.s + } + return multiSeriesFunc(ss), nil +} + +// 
https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.sortByName +func transformSortByName(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) < 1 || len(args) > 3 { + return nil, fmt.Errorf("unexpected number of args: %d; expecting from 1 to 3 args", len(args)) + } + natural, err := getOptionalBool(args, "natural", 1, false) + if err != nil { + return nil, err + } + reverse, err := getOptionalBool(args, "reverse", 2, false) + if err != nil { + return nil, err + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + type seriesWithName struct { + s *series + name string + } + var sns []seriesWithName + f := nextSeriesSerialWrapper(nextSeries, func(s *series) (*series, error) { + name := s.Name + sns = append(sns, seriesWithName{ + s: s, + name: name, + }) + s.expr = fe + return s, nil + }) + if _, err := drainAllSeries(f); err != nil { + return nil, err + } + sort.Slice(sns, func(i, j int) bool { + left := sns[i].name + right := sns[j].name + if reverse { + left, right = right, left + } + if natural { + return naturalLess(left, right) + } + return left < right + }) + ss := make([]*series, len(sns)) + for i, sn := range sns { + ss[i] = sn.s + } + return multiSeriesFunc(ss), nil +} + +// https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.sortByMinima +func transformSortByMinima(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 1 { + return nil, fmt.Errorf("unexpected number of args: %d; expecting 1 arg", len(args)) + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + // Filter out series with all the values smaller than 0 + f := nextSeriesConcurrentWrapper(nextSeries, func(s *series) (*series, error) { + max := aggrMax(s.Values) + if math.IsNaN(max) || max <= 0 { + return nil, nil + } + return s, nil + }) + return sortByGeneric(ec, fe, f, "min", false) +} + +// https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.sortByMaxima +func transformSortByMaxima(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 1 { + return nil, fmt.Errorf("unexpected number of args: %d; expecting 1 arg", len(args)) + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + return sortByGeneric(ec, fe, nextSeries, "max", true) +} + +// https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.smartSummarize +func transformSmartSummarize(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) < 2 || len(args) > 4 { + return nil, fmt.Errorf("unexpected number of args; got %d; want from 2 to 4", len(args)) + } + intervalString, err := getString(args, "intervalString", 1) + if err != nil { + return nil, err + } + interval, err := parseInterval(intervalString) + if err != nil { + return nil, fmt.Errorf("cannot parse intervalString: %w", err) + } + if interval <= 0 { + return nil, fmt.Errorf("interval must be positive; got %dms", interval) + } + funcName, err := getOptionalString(args, "func", 2, "sum") + if err != nil { + return nil, err + } + aggrFunc, err := getAggrFunc(funcName) + if err != nil { + return nil, err + } + alignTo, err := getOptionalString(args, "alignTo", 3, "") + if err != nil { + return nil, err + } + ecCopy := *ec + if alignTo != "" { + 
ecCopy.startTime, err = alignTimeUnit(ecCopy.startTime, alignTo, ec.currentTime.Location()) + if err != nil { + return nil, err + } + } + nextSeries, err := evalSeriesList(&ecCopy, args, "seriesList", 0) + if err != nil { + return nil, err + } + f := nextSeriesConcurrentWrapper(nextSeries, func(s *series) (*series, error) { + s.summarize(aggrFunc, ecCopy.startTime, ecCopy.endTime, interval, s.xFilesFactor) + s.Tags["smartSummarize"] = intervalString + s.Tags["smartSummarizeFunction"] = funcName + s.Name = fmt.Sprintf("smartSummarize(%s,%s,%s)", s.Name, graphiteql.QuoteString(intervalString), graphiteql.QuoteString(funcName)) + s.expr = fe + s.pathExpression = s.Name + return s, nil + }) + return f, nil +} + +// alignTimeUnit aligns startTime down to the nearest boundary of the given time unit s in the given timezone tz. +func alignTimeUnit(startTime int64, s string, tz *time.Location) (int64, error) { + t := time.Unix(startTime/1e3, (startTime%1000)*1e6).In(tz) + switch { + case strings.HasPrefix(s, "ms"): + t = time.Date(t.Year(), t.Month(), t.Day(), t.Hour(), t.Minute(), t.Second(), (t.Nanosecond()/1e6)*1e6, tz) + case strings.HasPrefix(s, "s"): + t = time.Date(t.Year(), t.Month(), t.Day(), t.Hour(), t.Minute(), t.Second(), 0, tz) + case strings.HasPrefix(s, "min"): + t = time.Date(t.Year(), t.Month(), t.Day(), t.Hour(), t.Minute(), 0, 0, tz) + case strings.HasPrefix(s, "h"): + t = time.Date(t.Year(), t.Month(), t.Day(), t.Hour(), 0, 0, 0, tz) + case strings.HasPrefix(s, "d"): + t = time.Date(t.Year(), t.Month(), t.Day(), 0, 0, 0, 0, tz) + case strings.HasPrefix(s, "w"): + // Check for an optional trailing ISO weekday digit to align to (e.g. 3 means Wednesday); Monday is used by default. + weekday := s[len(s)-1] + isoWeekDayAlignTo := 1 + if weekday >= '0' && weekday <= '9' { + isoWeekDayAlignTo = int(weekday - '0') + } + daysToSubtract := int(t.Weekday()) - isoWeekDayAlignTo + if daysToSubtract < 0 { + daysToSubtract += 7 + } + t = time.Date(t.Year(), t.Month(), t.Day(), 0, 0, 0, 0, tz).Add(-time.Hour * 24 * time.Duration(daysToSubtract)) + case strings.HasPrefix(s, "mon"): + // Align to the first day of the current month. Day 0 would be normalized by time.Date to the last day of the previous month. + t = time.Date(t.Year(), t.Month(), 1, 0, 0, 0, 0, tz) + case strings.HasPrefix(s, "y"): + // Align to January 1 of the current year. + t = time.Date(t.Year(), time.January, 1, 0, 0, 0, 0, tz) + default: + return 0, fmt.Errorf("unsupported interval %q", s) + } + return t.UnixNano() / 1e6, nil +} + +// https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.sinFunction +func transformSinFunction(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) < 1 || len(args) > 3 { + return nil, fmt.Errorf("unexpected number of args: %d; expecting from 1 to 3 args", len(args)) + } + name, err := getString(args, "name", 0) + if err != nil { + return nil, err + } + amplitude, err := getOptionalNumber(args, "amplitude", 1, 1) + if err != nil { + return nil, err + } + step, err := getOptionalNumber(args, "step", 2, 60) + if err != nil { + return nil, err + } + if step <= 0 { + return nil, fmt.Errorf("step must be positive; got %g", step) + } + stepMsecs := int64(step * 1000) + values := make([]float64, 0, ec.pointsLen(stepMsecs)) + timestamps := make([]int64, 0, ec.pointsLen(stepMsecs)) + ts := ec.startTime + for ts < ec.endTime { + v := amplitude * math.Sin(float64(ts)/1000) + values = append(values, v) + timestamps = append(timestamps, ts) + ts += stepMsecs + } + s := &series{ + Name: name, + Tags: unmarshalTags(name), + Timestamps: timestamps, + Values: values, + expr: fe, + pathExpression: name, + step: stepMsecs, + } + return singleSeriesFunc(s), nil +} + +// https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.sigmoid +func transformSigmoid(ec *evalConfig, fe 
*graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 1 { + return nil, fmt.Errorf("unexpected number of args: %d; expecting 1 arg", len(args)) + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + f := nextSeriesConcurrentWrapper(nextSeries, func(s *series) (*series, error) { + values := s.Values + for i, v := range values { + values[i] = 1 / (1 + math.Exp(-v)) + } + s.Tags["sigmoid"] = "sigmoid" + s.Name = fmt.Sprintf("sigmoid(%s)", s.Name) + s.expr = fe + s.pathExpression = s.Name + return s, nil + }) + return f, nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.scaleToSeconds +func transformScaleToSeconds(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 2 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 2", len(args)) + } + seconds, err := getNumber(args, "seconds", 1) + if err != nil { + return nil, err + } + secondsStr := fmt.Sprintf("%g", seconds) + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + f := nextSeriesConcurrentWrapper(nextSeries, func(s *series) (*series, error) { + timestamps := s.Timestamps + values := s.Values + step := nan + if len(timestamps) > 1 { + step = float64(timestamps[1]-timestamps[0]) / 1000 + } + for i, v := range values { + if i > 0 { + step = float64(timestamps[i]-timestamps[i-1]) / 1000 + } + values[i] = v * (seconds / step) + } + s.Tags["scaleToSeconds"] = secondsStr + s.Name = fmt.Sprintf("scaleToSeconds(%s,%s)", s.Name, secondsStr) + s.expr = fe + s.pathExpression = s.Name + return s, nil + }) + return f, nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.secondYAxis +func transformSecondYAxis(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 1 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 1", len(args)) + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + f := nextSeriesConcurrentWrapper(nextSeries, func(s *series) (*series, error) { + s.Tags["secondYAxis"] = "1" + s.Name = fmt.Sprintf("secondYAxis(%s)", s.Name) + s.expr = fe + return s, nil + }) + return f, nil +} + +// See https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.isNonNull +func transformIsNonNull(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) != 1 { + return nil, fmt.Errorf("unexpected number of args; got %d; want 1", len(args)) + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + f := nextSeriesConcurrentWrapper(nextSeries, func(s *series) (*series, error) { + values := s.Values + for i, v := range values { + if math.IsNaN(v) { + values[i] = 0 + } else { + values[i] = 1 + } + } + s.Tags["isNonNull"] = "1" + s.Name = fmt.Sprintf("isNonNull(%s)", s.Name) + s.expr = fe + s.pathExpression = s.Name + return s, nil + }) + return f, nil +} + +// https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.linearRegression +func transformLinearRegression(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) < 1 || len(args) > 3 { + return nil, fmt.Errorf("unexpected number of args: %d; expecting from 1 to 3 args", len(args)) + } + + nextSeries, err := 
evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + ss, _, err := fetchNormalizedSeries(ec, nextSeries, false) + if err != nil { + return nil, err + } + startSourceAt := getOptionalArg(args, "startSourceAt", 1) + endSourceAt := getOptionalArg(args, "endSourceAt", 2) + + if startSourceAt == nil && endSourceAt == nil { + // fast path, calculate for series with the same time range. + return linearRegressionForSeries(ec, fe, ss, ss) + } + ecCopy := *ec + ecCopy.startTime, err = getTimeFromArgExpr(ecCopy.startTime, ecCopy.currentTime, startSourceAt) + if err != nil { + return nil, err + } + ecCopy.endTime, err = getTimeFromArgExpr(ecCopy.endTime, ecCopy.currentTime, endSourceAt) + if err != nil { + return nil, err + } + nextSourceSeries, err := evalSeriesList(&ecCopy, args, "seriesList", 0) + if err != nil { + return nil, err + } + sourceSeries, _, err := fetchNormalizedSeries(&ecCopy, nextSourceSeries, false) + if err != nil { + return nil, err + } + return linearRegressionForSeries(&ecCopy, fe, ss, sourceSeries) +} + +func linearRegressionForSeries(ec *evalConfig, fe *graphiteql.FuncExpr, ss, sourceSeries []*series) (nextSeriesFunc, error) { + var resp []*series + for i := 0; i < len(ss); i++ { + source := sourceSeries[i] + s := ss[i] + s.Tags["linearRegressions"] = fmt.Sprintf("%d, %d", ec.startTime/1e3, ec.endTime/1e3) + s.Tags["name"] = s.Name + s.Name = fmt.Sprintf("linearRegression(%s, %d, %d)", s.Name, ec.startTime/1e3, ec.endTime/1e3) + s.expr = fe + s.pathExpression = s.Name + ok, factor, offset := linearRegressionAnalysis(source, float64(s.step)) + // skip + if !ok { + continue + } + values := s.Values + for j := 0; j < len(values); j++ { + values[j] = offset + (float64(int(s.Timestamps[0])+j*int(s.step)))*factor + } + resp = append(resp, s) + } + return multiSeriesFunc(resp), nil +} + +func getTimeFromArgExpr(originT int64, currentT time.Time, expr *graphiteql.ArgExpr) (int64, error) { + if expr == nil { + return originT, nil + } + switch data := expr.Expr.(type) { + case *graphiteql.NoneExpr: + case *graphiteql.StringExpr: + t, err := parseTime(currentT, data.S) + if err != nil { + return originT, err + } + originT = t + case *graphiteql.NumberExpr: + originT = int64(data.N * 1e3) + } + return originT, nil +} + +// Returns is_not_none, factor and offset of linear regression function by least squares method. +// https://en.wikipedia.org/wiki/Linear_least_squares +// https://github.com/graphite-project/graphite-web/blob/master/webapp/graphite/render/functions.py#L4158 +func linearRegressionAnalysis(s *series, step float64) (bool, float64, float64) { + if step == 0 { + return false, 0, 0 + } + var sumI, sumII int + var sumV, sumIV float64 + values := s.Values + for i, v := range values { + if math.IsNaN(v) { + continue + } + sumI += i + sumII += i * i + sumIV += float64(i) * v + sumV += v + } + denominator := float64(len(values)*sumII - sumI*sumI) + if denominator == 0 { + return false, 0.0, 0.0 + } + factor := (float64(len(values))*sumIV - float64(sumI)*sumV) / denominator / step + // safe to take index, denominator cannot be non zero in case of empty array. 
+ offset := (float64(sumII)*sumV-sumIV*float64(sumI))/denominator - factor*float64(s.Timestamps[0]) + return true, factor, offset +} + +// https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.holtWintersConfidenceBands +func transformHoltWintersConfidenceBands(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) < 1 || len(args) > 4 { + return nil, fmt.Errorf("unexpected number of args: %d; expecting from 1 to 4 args", len(args)) + } + resultSeries, err := holtWinterConfidenceBands(ec, fe, args) + if err != nil { + return nil, err + } + return multiSeriesFunc(resultSeries), nil +} + +func holtWinterConfidenceBands(ec *evalConfig, fe *graphiteql.FuncExpr, args []*graphiteql.ArgExpr) ([]*series, error) { + delta, err := getOptionalNumber(args, "delta", 1, 3) + if err != nil { + return nil, err + } + bootstrapInterval, err := getOptionalString(args, "bootstrapInterval", 2, "7d") + if err != nil { + return nil, err + } + bootstrapMs, err := parseInterval(bootstrapInterval) + if err != nil { + return nil, err + } + seasonality, err := getOptionalString(args, "seasonality", 3, "1d") + if err != nil { + return nil, err + } + + seasonalityMs, err := parseInterval(seasonality) + if err != nil { + return nil, err + } + ecCopy := *ec + ecCopy.startTime = ecCopy.startTime - bootstrapMs + nextSeries, err := evalSeriesList(&ecCopy, args, "seriesList", 0) + if err != nil { + return nil, err + } + step, err := nextSeries.peekStep(ec.storageStep) + if err != nil { + return nil, err + } + trimWindowPoints := ecCopy.pointsLen(step) - ec.pointsLen(step) + var resultSeries []*series + var resultSeriesLock sync.Mutex + f := nextSeriesConcurrentWrapper(nextSeries, func(s *series) (*series, error) { + s.consolidate(&ecCopy, step) + timeStamps := s.Timestamps[trimWindowPoints:] + analysis := holtWintersAnalysis(&ecCopy, s, seasonalityMs) + forecastValues := analysis.predictions.Values[trimWindowPoints:] + deviationValues := analysis.deviations.Values[trimWindowPoints:] + valuesLen := len(forecastValues) + upperBand := make([]float64, 0, valuesLen) + lowerBand := make([]float64, 0, valuesLen) + for i := 0; i < valuesLen; i++ { + forecastItem := forecastValues[i] + deviationItem := deviationValues[i] + if math.IsNaN(forecastItem) || math.IsNaN(deviationItem) { + upperBand = append(upperBand, nan) + lowerBand = append(lowerBand, nan) + } else { + scaledDeviation := delta * deviationItem + upperBand = append(upperBand, forecastItem+scaledDeviation) + lowerBand = append(lowerBand, forecastItem-scaledDeviation) + } + } + name := fmt.Sprintf("holtWintersConfidenceUpper(%s)", s.Name) + upperSeries := &series{ + Timestamps: timeStamps, + Values: upperBand, + Name: name, + Tags: map[string]string{"holtWintersConfidenceUpper": "1", "name": s.Name}, + expr: fe, + pathExpression: name, + step: step, + } + name = fmt.Sprintf("holtWintersConfidenceLower(%s)", s.Name) + lowerSeries := &series{ + Timestamps: timeStamps, + Values: lowerBand, + Name: name, + Tags: map[string]string{"holtWintersConfidenceLower": "1", "name": s.Name}, + expr: fe, + pathExpression: name, + step: step, + } + resultSeriesLock.Lock() + resultSeries = append(resultSeries, upperSeries, lowerSeries) + resultSeriesLock.Unlock() + return s, nil + }) + if _, err := drainAllSeries(f); err != nil { + return nil, err + } + return resultSeries, nil +} + +// https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.holtWintersConfidenceArea +func 
transformHoltWintersConfidenceArea(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) < 1 || len(args) > 4 { + return nil, fmt.Errorf("unexpected number of args: %d; expecting from 1 to 4 args", len(args)) + } + bands, err := holtWinterConfidenceBands(ec, fe, args) + if err != nil { + return nil, err + } + if len(bands) != 2 { + return nil, fmt.Errorf("expecting exactly two series; got %d series", len(bands)) + } + for _, s := range bands { + s.Name = fmt.Sprintf("areaBetween(%s)", s.Name) + s.Tags["areaBetween"] = "1" + } + return multiSeriesFunc(bands), nil +} + +// https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.holtWintersAberration +func transformHoltWintersAberration(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) < 1 || len(args) > 4 { + return nil, fmt.Errorf("unexpected number of args: %d; expecting from 1 to 4 args", len(args)) + } + bands, err := holtWinterConfidenceBands(ec, fe, args) + if err != nil { + return nil, err + } + confidenceBands := make(map[string][]float64) + for _, s := range bands { + confidenceBands[s.Name] = s.Values + } + nextSeries, err := evalSeriesList(ec, args, "seriesList", 0) + if err != nil { + return nil, err + } + step, err := nextSeries.peekStep(ec.storageStep) + if err != nil { + return nil, err + } + f := nextSeriesConcurrentWrapper(nextSeries, func(s *series) (*series, error) { + s.consolidate(ec, step) + values := s.Values + lowerBand := confidenceBands[fmt.Sprintf("holtWintersConfidenceLower(%s)", s.Name)] + upperBand := confidenceBands[fmt.Sprintf("holtWintersConfidenceUpper(%s)", s.Name)] + if len(values) != len(lowerBand) || len(values) != len(upperBand) { + return nil, fmt.Errorf("BUG: series length mismatch: got %d values in the series, %d values in upperBand and %d values in lowerBand", len(values), len(upperBand), len(lowerBand)) + } + aberration := make([]float64, 0, len(values)) + for i := 0; i < len(values); i++ { + v := values[i] + upperValue := upperBand[i] + lowerValue := lowerBand[i] + if math.IsNaN(v) { + aberration = append(aberration, 0) + continue + } + if !math.IsNaN(upperValue) && v > upperValue { + aberration = append(aberration, v-upperValue) + continue + } + if !math.IsNaN(lowerValue) && v < lowerValue { + aberration = append(aberration, v-lowerValue) + continue + } + aberration = append(aberration, 0) + } + s.Tags["holtWintersAberration"] = "1" + s.Name = fmt.Sprintf("holtWintersAberration(%s)", s.Name) + s.Values = aberration + s.expr = fe + s.pathExpression = s.Name + return s, nil + }) + return f, nil +} + +// https://graphite.readthedocs.io/en/stable/functions.html#graphite.render.functions.holtWintersForecast +func transformHoltWintersForecast(ec *evalConfig, fe *graphiteql.FuncExpr) (nextSeriesFunc, error) { + args := fe.Args + if len(args) < 1 || len(args) > 3 { + return nil, fmt.Errorf("unexpected number of args: %d; expecting from 1 to 3 args", len(args)) + } + bootstrapInterval, err := getOptionalString(args, "bootstrapInterval", 1, "7d") + if err != nil { + return nil, err + } + bootstrapMs, err := parseInterval(bootstrapInterval) + if err != nil { + return nil, err + } + seasonality, err := getOptionalString(args, "seasonality", 2, "1d") + if err != nil { + return nil, err + } + seasonalityMs, err := parseInterval(seasonality) + if err != nil { + return nil, err + } + + ecCopy := *ec + ecCopy.startTime = ecCopy.startTime - bootstrapMs + nextSeries, err := evalSeriesList(&ecCopy, args, "seriesList", 0) + if err 
!= nil { + return nil, err + } + step, err := nextSeries.peekStep(ec.storageStep) + if err != nil { + return nil, err + } + trimWindowPoints := ecCopy.pointsLen(step) - ec.pointsLen(step) + f := nextSeriesConcurrentWrapper(nextSeries, func(s *series) (*series, error) { + s.consolidate(&ecCopy, step) + analysis := holtWintersAnalysis(&ecCopy, s, seasonalityMs) + predictions := analysis.predictions + + s.Tags["holtWintersForecast"] = "1" + s.Values = predictions.Values[trimWindowPoints:] + s.Timestamps = predictions.Timestamps[trimWindowPoints:] + newName := fmt.Sprintf("holtWintersForecast(%s)", s.Name) + s.Name = newName + s.Tags["name"] = newName + s.expr = fe + s.pathExpression = s.Name + + return s, nil + }) + return f, nil +} + +func holtWintersAnalysis(ec *evalConfig, s *series, seasonality int64) holtWintersAnalysisResult { + // Smoothing coefficients for triple exponential smoothing (Holt-Winters). + alpha := 0.1 + gamma := alpha + beta := 0.0035 + + seasonLength := seasonality / s.step + + var intercept, seasonal, deviation, slope float64 + + intercepts := make([]float64, 0, len(s.Values)) + predictions := make([]float64, 0, len(s.Values)) + slopes := make([]float64, 0, len(s.Values)) + seasonals := make([]float64, 0, len(s.Values)) + deviations := make([]float64, 0, len(s.Values)) + + getLastSeasonal := func(i int64) float64 { + j := i - seasonLength + if j >= 0 { + return seasonals[j] + } + return 0 + } + + getLastDeviation := func(i int64) float64 { + j := i - seasonLength + if j >= 0 { + return deviations[j] + } + return 0 + } + var lastSeasonal, lastSeasonalDev, nextLastSeasonal float64 + nextPred := nan + + for i, v := range s.Values { + if math.IsNaN(v) { + intercepts = append(intercepts, 0) + slopes = append(slopes, 0) + seasonals = append(seasonals, 0) + predictions = append(predictions, nextPred) + deviations = append(deviations, 0) + nextPred = nan + continue + } + + var lastIntercept, lastSlope, prediction float64 + if i == 0 { + lastIntercept = v + lastSlope = 0 + prediction = v + } else { + lastIntercept = intercepts[len(intercepts)-1] + lastSlope = slopes[len(slopes)-1] + if math.IsNaN(lastIntercept) { + lastIntercept = v + } + prediction = nextPred + } + + lastSeasonal = getLastSeasonal(int64(i)) + nextLastSeasonal = getLastSeasonal(int64(i + 1)) + lastSeasonalDev = getLastDeviation(int64(i)) + + intercept = holtWintersIntercept(alpha, v, lastSeasonal, lastIntercept, lastSlope) + slope = holtWintersSlope(beta, intercept, lastIntercept, lastSlope) + seasonal = holtWintersSeasonal(gamma, v, intercept, lastSeasonal) + + nextPred = intercept + slope + nextLastSeasonal + deviation = holtWintersDeviation(gamma, v, prediction, lastSeasonalDev) + + intercepts = append(intercepts, intercept) + slopes = append(slopes, slope) + seasonals = append(seasonals, seasonal) + predictions = append(predictions, prediction) + deviations = append(deviations, deviation) + } + forecastSeries := &series{ + Timestamps: s.Timestamps, + Values: predictions, + Name: fmt.Sprintf("holtWintersForecast(%s)", s.Name), + step: s.step, + } + deviationsSS := &series{ + Timestamps: s.Timestamps, + Values: deviations, + Name: fmt.Sprintf("holtWintersDeviation(%s)", s.Name), + step: s.step, + } + + return holtWintersAnalysisResult{ + deviations: deviationsSS, + predictions: forecastSeries, + } +} + +type holtWintersAnalysisResult struct { + predictions *series + deviations *series +} + +func holtWintersIntercept(alpha, actual, lastSeason, lastIntercept, lastSlope float64) float64 { + return alpha*(actual-lastSeason) + (1-alpha)*(lastIntercept+lastSlope) +} + +func 
holtWintersSlope(beta, intercept, lastIntercept, lastSlope float64) float64 { + return beta*(intercept-lastIntercept) + (1-beta)*lastSlope +} +func holtWintersSeasonal(gamma, actual, intercept, lastSeason float64) float64 { + return gamma*(actual-intercept) + (1-gamma)*lastSeason +} + +func holtWintersDeviation(gamma, actual, prediction, lastSeasonalDev float64) float64 { + if math.IsNaN(prediction) { + prediction = 0 + } + return gamma*math.Abs(actual-prediction) + (1-gamma)*lastSeasonalDev +} + +func (nsf *nextSeriesFunc) peekStep(step int64) (int64, error) { + nextSeries := *nsf + s, err := nextSeries() + if err != nil { + return 0, err + } + if s != nil { + step = s.step + } + calls := uint64(0) + *nsf = func() (*series, error) { + if atomic.AddUint64(&calls, 1) == 1 { + return s, nil + } + return nextSeries() + } + return step, nil +} diff --git a/app/vmselect/graphite/transform_test.go b/app/vmselect/graphite/transform_test.go new file mode 100644 index 000000000..388d84b78 --- /dev/null +++ b/app/vmselect/graphite/transform_test.go @@ -0,0 +1,81 @@ +package graphite + +import ( + "reflect" + "testing" +) + +func TestUnmarshalTags(t *testing.T) { + f := func(s string, tagsExpected map[string]string) { + t.Helper() + tags := unmarshalTags(s) + if !reflect.DeepEqual(tags, tagsExpected) { + t.Fatalf("unexpected tags unmarshaled for s=%q\ngot\n%s\nwant\n%s", s, tags, tagsExpected) + } + } + f("", map[string]string{}) + f("foo.bar", map[string]string{ + "name": "foo.bar", + }) + f("foo;bar=baz", map[string]string{ + "name": "foo", + "bar": "baz", + }) + f("foo.bar;bar;x=aa;baz=aaa;x=y", map[string]string{ + "name": "foo.bar", + "baz": "aaa", + "x": "y", + }) +} + +func TestMarshalTags(t *testing.T) { + f := func(s, sExpected string) { + t.Helper() + tags := unmarshalTags(s) + sMarshaled := marshalTags(tags) + if sMarshaled != sExpected { + t.Fatalf("unexpected marshaled tags for s=%q\ngot\n%s\nwant\n%s", s, sMarshaled, sExpected) + } + } + f("", "") + f("foo", "foo") + f("foo;bar=baz", "foo;bar=baz") + f("foo.bar;baz;xx=yy;a=b", "foo.bar;a=b;xx=yy") + f("foo.bar;a=bb;a=ccc;d=a.b.c", "foo.bar;a=ccc;d=a.b.c") +} + +func TestGetPathFromName(t *testing.T) { + f := func(name, pathExpected string) { + t.Helper() + path := getPathFromName(name) + if path != pathExpected { + t.Fatalf("unexpected path extracted from name %q; got %q; want %q", name, path, pathExpected) + } + } + f("", "") + f("foo", "foo") + f("foo.bar", "foo.bar") + f("foo.bar,baz.aa", "foo.bar,baz.aa") + f("foo(bar.baz,aa.bb)", "bar.baz") + f("foo(1, 'foo', aaa )", "aaa") + f("foo|bar(baz)", "foo") + f("a(b(c.d.e))", "c.d.e") + f("foo()", "foo()") + f("123", "123") + f("foo(123)", "123") + f("fo(bar", "fo(bar") +} + +func TestGraphiteToGolangRegexpReplace(t *testing.T) { + f := func(s, replaceExpected string) { + t.Helper() + replace := graphiteToGolangRegexpReplace(s) + if replace != replaceExpected { + t.Fatalf("unexpected result for graphiteToGolangRegexpReplace(%q); got %q; want %q", s, replace, replaceExpected) + } + } + f("", "") + f("foo", "foo") + f(`a\d+`, `a\d+`) + f(`\1f\\oo\2`, `$1f\\oo$2`) +} diff --git a/app/vmselect/main.go b/app/vmselect/main.go index 7b6f33af1..8f4d43129 100644 --- a/app/vmselect/main.go +++ b/app/vmselect/main.go @@ -224,9 +224,23 @@ func RequestHandler(w http.ResponseWriter, r *http.Request) bool { return true } if strings.HasPrefix(path, "/functions") { - graphiteFunctionsRequests.Inc() - w.Header().Set("Content-Type", "application/json") - fmt.Fprintf(w, "%s", `{}`) + funcName := 
path[len("/functions"):] + funcName = strings.TrimPrefix(funcName, "/") + if funcName == "" { + graphiteFunctionsRequests.Inc() + if err := graphite.FunctionsHandler(startTime, w, r); err != nil { + graphiteFunctionsErrors.Inc() + httpserver.Errorf(w, r, "%s", err) + return true + } + return true + } + graphiteFunctionDetailsRequests.Inc() + if err := graphite.FunctionDetailsHandler(startTime, funcName, w, r); err != nil { + graphiteFunctionDetailsErrors.Inc() + httpserver.Errorf(w, r, "%s", err) + return true + } return true } @@ -437,6 +451,14 @@ func RequestHandler(w http.ResponseWriter, r *http.Request) bool { return true } return true + case "/render": + graphiteRenderRequests.Inc() + if err := graphite.RenderHandler(startTime, w, r); err != nil { + graphiteRenderErrors.Inc() + httpserver.Errorf(w, r, "error in %q: %s", r.URL.Path, err) + return true + } + return true case "/metric-relabel-debug": promscrapeMetricRelabelDebugRequests.Inc() promscrape.WriteMetricRelabelDebug(w, r) @@ -611,10 +633,17 @@ var ( graphiteTagsDelSeriesRequests = metrics.NewCounter(`vm_http_requests_total{path="/tags/delSeries"}`) graphiteTagsDelSeriesErrors = metrics.NewCounter(`vm_http_request_errors_total{path="/tags/delSeries"}`) + graphiteRenderRequests = metrics.NewCounter(`vm_http_requests_total{path="/render"}`) + graphiteRenderErrors = metrics.NewCounter(`vm_http_request_errors_total{path="/render"}`) + promscrapeMetricRelabelDebugRequests = metrics.NewCounter(`vm_http_requests_total{path="/metric-relabel-debug"}`) promscrapeTargetRelabelDebugRequests = metrics.NewCounter(`vm_http_requests_total{path="/target-relabel-debug"}`) graphiteFunctionsRequests = metrics.NewCounter(`vm_http_requests_total{path="/functions"}`) + graphiteFunctionsErrors = metrics.NewCounter(`vm_http_request_errors_total{path="/functions"}`) + + graphiteFunctionDetailsRequests = metrics.NewCounter(`vm_http_requests_total{path="/functions/"}`) + graphiteFunctionDetailsErrors = metrics.NewCounter(`vm_http_request_errors_total{path="/functions/"}`) expandWithExprsRequests = metrics.NewCounter(`vm_http_requests_total{path="/expand-with-exprs"}`) diff --git a/docs/CHANGELOG.md b/docs/CHANGELOG.md index d7927ce2d..69517fced 100644 --- a/docs/CHANGELOG.md +++ b/docs/CHANGELOG.md @@ -21,6 +21,7 @@ created by v1.90.0 or newer versions. The solution is to upgrade to v1.90.0 or n * SECURITY: upgrade base docker image (alpine) from 3.17.2 to 3.17.3. See [alpine 3.17.3 release notes](https://alpinelinux.org/posts/Alpine-3.17.3-released.html). +* FEATURE: open source [Graphite Render API](https://docs.victoriametrics.com/#graphite-render-api-usage). This API allows using VictoriaMetrics as a drop-in replacement for Graphite at both data ingestion and querying sides and reducing infrastructure costs by up to 10x comparing to Graphite. See [this case study](https://docs.victoriametrics.com/CaseStudies.html#grammarly) as an example. * FEATURE: release Windows binaries for [single-node VictoriaMetrics](https://docs.victoriametrics.com/), [VictoriaMetrics cluster](https://docs.victoriametrics.com/Cluster-VictoriaMetrics.html), [vmbackup](https://docs.victoriametrics.com/vmbackup.html) and [vmrestore](https://docs.victoriametrics.com/vmrestore.html). See [this](https://github.com/VictoriaMetrics/VictoriaMetrics/issues/3236), [this](https://github.com/VictoriaMetrics/VictoriaMetrics/issues/3821) and [this](https://github.com/VictoriaMetrics/VictoriaMetrics/issues/70) issues. 
* FEATURE: log metrics with truncated labels if the length of label value in the ingested metric exceeds `-maxLabelValueLen`. This should simplify debugging for this case. * FEATURE: [vmagent](https://docs.victoriametrics.com/vmagent.html): show target URL when debugging [target relabeling](https://docs.victoriametrics.com/vmagent.html#relabel-debug). This should simplify target relabel debugging a bit. See [this pull request](https://github.com/VictoriaMetrics/VictoriaMetrics/pull/3882). diff --git a/docs/Cluster-VictoriaMetrics.md b/docs/Cluster-VictoriaMetrics.md index 8b8105681..cca5a7327 100644 --- a/docs/Cluster-VictoriaMetrics.md +++ b/docs/Cluster-VictoriaMetrics.md @@ -359,7 +359,7 @@ Check practical examples of VictoriaMetrics API [here](https://docs.victoriametr - URLs for [Graphite Metrics API](https://graphite-api.readthedocs.io/en/latest/api.html#the-metrics-api): `http://:8481/select//graphite/`, where: - `` is an arbitrary number identifying data namespace for query (aka tenant) - `` may have the following values: - - `render` - implements Graphite Render API. See [these docs](https://graphite.readthedocs.io/en/stable/render_api.html). This functionality is available in [Enterprise package](https://docs.victoriametrics.com/enterprise.html). Enterprise binaries can be downloaded and evaluated for free from [the releases page](https://github.com/VictoriaMetrics/VictoriaMetrics/releases). + - `render` - implements Graphite Render API. See [these docs](https://graphite.readthedocs.io/en/stable/render_api.html). - `metrics/find` - searches Graphite metrics. See [these docs](https://graphite-api.readthedocs.io/en/latest/api.html#metrics-find). - `metrics/expand` - expands Graphite metrics. See [these docs](https://graphite-api.readthedocs.io/en/latest/api.html#metrics-expand). - `metrics/index.json` - returns all the metric names. See [these docs](https://graphite-api.readthedocs.io/en/latest/api.html#metrics-index-json). @@ -1128,9 +1128,9 @@ Below is the output for `/path/to/vmselect -help`: -search.disableCache Whether to disable response caching. This may be useful during data backfilling -search.graphiteMaxPointsPerSeries int - The maximum number of points per series Graphite render API can return. This flag is available only in VictoriaMetrics enterprise. See https://docs.victoriametrics.com/enterprise.html (default 1000000) + The maximum number of points per series Graphite render API can return (default 1000000) -search.graphiteStorageStep duration - The interval between datapoints stored in the database. It is used at Graphite Render API handler for normalizing the interval between datapoints in case it isn't normalized. It can be overridden by sending 'storage_step' query arg to /render API or by sending the desired interval via 'Storage-Step' http header during querying /render API. This flag is available only in VictoriaMetrics enterprise. See https://docs.victoriametrics.com/enterprise.html (default 10s) + The interval between datapoints stored in the database. It is used at Graphite Render API handler for normalizing the interval between datapoints in case it isn't normalized. It can be overridden by sending 'storage_step' query arg to /render API or by sending the desired interval via 'Storage-Step' http header during querying /render API (default 10s) -search.latencyOffset duration The time when data points become visible in query results after the collection. It can be overridden on per-query basis via latency_offset arg. 
Too small value can result in incomplete last points for query results (default 30s) -search.logQueryMemoryUsage size @@ -1147,7 +1147,7 @@ Below is the output for `/path/to/vmselect -help`: -search.maxFederateSeries int The maximum number of time series, which can be returned from /federate. This option allows limiting memory usage (default 1000000) -search.maxGraphiteSeries int - The maximum number of time series, which can be scanned during queries to Graphite Render API. See https://docs.victoriametrics.com/#graphite-render-api-usage . This flag is available only in VictoriaMetrics enterprise. See https://docs.victoriametrics.com/enterprise.html (default 300000) + The maximum number of time series, which can be scanned during queries to Graphite Render API. See https://docs.victoriametrics.com/#graphite-render-api-usage (default 300000) -search.maxLookback duration Synonym to -search.lookback-delta from Prometheus. The value is dynamically detected from interval between time series datapoints if not set. It can be overridden on per-query basis via max_lookback arg. See also '-search.maxStalenessInterval' flag, which has the same meaning due to historical reasons -search.maxMemoryPerQuery size diff --git a/docs/README.md b/docs/README.md index 667a87195..2744c41b8 100644 --- a/docs/README.md +++ b/docs/README.md @@ -41,7 +41,8 @@ VictoriaMetrics has the following prominent features: * It can be used as long-term storage for Prometheus. See [these docs](#prometheus-setup) for details. * It can be used as a drop-in replacement for Prometheus in Grafana, because it supports [Prometheus querying API](#prometheus-querying-api-usage). * It can be used as a drop-in replacement for Graphite in Grafana, because it supports [Graphite API](#graphite-api-usage). -* It features easy setup and operation: + VictoriaMetrics allows reducing infrastructure costs by more than 10x compared to Graphite - see [this case study](https://docs.victoriametrics.com/CaseStudies.html#grammarly). +* It is easy to set up and operate: * VictoriaMetrics consists of a single [small executable](https://medium.com/@valyala/stripping-dependency-bloat-in-victoriametrics-docker-image-983fb5912b0d) without external dependencies. * All the configuration is done via explicit command-line flags with reasonable defaults. @@ -628,7 +629,6 @@ The `__graphite__` pseudo-label supports e.g. alternate regexp filters such as ` VictoriaMetrics also supports Graphite query language - see [these docs](#graphite-render-api-usage). - ## How to send data from OpenTSDB-compatible agents VictoriaMetrics supports [telnet put protocol](http://opentsdb.net/docs/build/html/api_telnet/put.html) @@ -830,10 +830,10 @@ VictoriaMetrics supports `__graphite__` pseudo-label for filtering time series w ### Graphite Render API usage -[VictoriaMetrics Enterprise](https://docs.victoriametrics.com/enterprise.html) supports [Graphite Render API](https://graphite.readthedocs.io/en/stable/render_api.html) subset +VictoriaMetrics supports [Graphite Render API](https://graphite.readthedocs.io/en/stable/render_api.html) subset at `/render` endpoint, which is used by [Graphite datasource in Grafana](https://grafana.com/docs/grafana/latest/datasources/graphite/). -When configuring Graphite datasource in Grafana, the `Storage-Step` http request header must be set to a step between Graphite data points stored in VictoriaMetrics. For example, `Storage-Step: 10s` would mean 10 seconds distance between Graphite datapoints stored in VictoriaMetrics.
-Enterprise binaries can be downloaded and evaluated for free from [the releases page](https://github.com/VictoriaMetrics/VictoriaMetrics/releases). +When configuring Graphite datasource in Grafana, the `Storage-Step` http request header must be set to a step between Graphite data points +stored in VictoriaMetrics. For example, `Storage-Step: 10s` would mean 10 seconds distance between Graphite datapoints stored in VictoriaMetrics. ### Graphite Metrics API usage @@ -2439,9 +2439,9 @@ Pass `-help` to VictoriaMetrics in order to see the list of supported command-li -search.disableCache Whether to disable response caching. This may be useful during data backfilling -search.graphiteMaxPointsPerSeries int - The maximum number of points per series Graphite render API can return. This flag is available only in VictoriaMetrics enterprise. See https://docs.victoriametrics.com/enterprise.html (default 1000000) + The maximum number of points per series Graphite render API can return (default 1000000) -search.graphiteStorageStep duration - The interval between datapoints stored in the database. It is used at Graphite Render API handler for normalizing the interval between datapoints in case it isn't normalized. It can be overridden by sending 'storage_step' query arg to /render API or by sending the desired interval via 'Storage-Step' http header during querying /render API. This flag is available only in VictoriaMetrics enterprise. See https://docs.victoriametrics.com/enterprise.html (default 10s) + The interval between datapoints stored in the database. It is used at Graphite Render API handler for normalizing the interval between datapoints in case it isn't normalized. It can be overridden by sending 'storage_step' query arg to /render API or by sending the desired interval via 'Storage-Step' http header during querying /render API (default 10s) -search.latencyOffset duration The time when data points become visible in query results after the collection. It can be overridden on per-query basis via latency_offset arg. Too small value can result in incomplete last points for query results (default 30s) -search.logQueryMemoryUsage size @@ -2458,7 +2458,7 @@ Pass `-help` to VictoriaMetrics in order to see the list of supported command-li -search.maxFederateSeries int The maximum number of time series, which can be returned from /federate. This option allows limiting memory usage (default 1000000) -search.maxGraphiteSeries int - The maximum number of time series, which can be scanned during queries to Graphite Render API. See https://docs.victoriametrics.com/#graphite-render-api-usage . This flag is available only in VictoriaMetrics enterprise. See https://docs.victoriametrics.com/enterprise.html (default 300000) + The maximum number of time series, which can be scanned during queries to Graphite Render API. See https://docs.victoriametrics.com/#graphite-render-api-usage (default 300000) -search.maxLookback duration Synonym to -search.lookback-delta from Prometheus. The value is dynamically detected from interval between time series datapoints if not set. It can be overridden on per-query basis via max_lookback arg. 
See also '-search.maxStalenessInterval' flag, which has the same meaning due to historical reasons -search.maxMemoryPerQuery size diff --git a/docs/Single-server-VictoriaMetrics.md b/docs/Single-server-VictoriaMetrics.md index c3b3aef06..18b1dff11 100644 --- a/docs/Single-server-VictoriaMetrics.md +++ b/docs/Single-server-VictoriaMetrics.md @@ -44,7 +44,8 @@ VictoriaMetrics has the following prominent features: * It can be used as long-term storage for Prometheus. See [these docs](#prometheus-setup) for details. * It can be used as a drop-in replacement for Prometheus in Grafana, because it supports [Prometheus querying API](#prometheus-querying-api-usage). * It can be used as a drop-in replacement for Graphite in Grafana, because it supports [Graphite API](#graphite-api-usage). -* It features easy setup and operation: + VictoriaMetrics allows reducing infrastructure costs by more than 10x compared to Graphite - see [this case study](https://docs.victoriametrics.com/CaseStudies.html#grammarly). +* It is easy to set up and operate: * VictoriaMetrics consists of a single [small executable](https://medium.com/@valyala/stripping-dependency-bloat-in-victoriametrics-docker-image-983fb5912b0d) without external dependencies. * All the configuration is done via explicit command-line flags with reasonable defaults. @@ -631,7 +632,6 @@ The `__graphite__` pseudo-label supports e.g. alternate regexp filters such as ` VictoriaMetrics also supports Graphite query language - see [these docs](#graphite-render-api-usage). - ## How to send data from OpenTSDB-compatible agents VictoriaMetrics supports [telnet put protocol](http://opentsdb.net/docs/build/html/api_telnet/put.html) @@ -833,10 +833,10 @@ VictoriaMetrics supports `__graphite__` pseudo-label for filtering time series w ### Graphite Render API usage -[VictoriaMetrics Enterprise](https://docs.victoriametrics.com/enterprise.html) supports [Graphite Render API](https://graphite.readthedocs.io/en/stable/render_api.html) subset +VictoriaMetrics supports [Graphite Render API](https://graphite.readthedocs.io/en/stable/render_api.html) subset at `/render` endpoint, which is used by [Graphite datasource in Grafana](https://grafana.com/docs/grafana/latest/datasources/graphite/). -When configuring Graphite datasource in Grafana, the `Storage-Step` http request header must be set to a step between Graphite data points stored in VictoriaMetrics. For example, `Storage-Step: 10s` would mean 10 seconds distance between Graphite datapoints stored in VictoriaMetrics. -Enterprise binaries can be downloaded and evaluated for free from [the releases page](https://github.com/VictoriaMetrics/VictoriaMetrics/releases). +When configuring Graphite datasource in Grafana, the `Storage-Step` http request header must be set to a step between Graphite data points +stored in VictoriaMetrics. For example, `Storage-Step: 10s` would mean 10 seconds distance between Graphite datapoints stored in VictoriaMetrics. ### Graphite Metrics API usage @@ -2442,9 +2442,9 @@ Pass `-help` to VictoriaMetrics in order to see the list of supported command-li -search.disableCache Whether to disable response caching. This may be useful during data backfilling -search.graphiteMaxPointsPerSeries int - The maximum number of points per series Graphite render API can return. This flag is available only in VictoriaMetrics enterprise.
See https://docs.victoriametrics.com/enterprise.html (default 1000000) + The maximum number of points per series Graphite render API can return (default 1000000) -search.graphiteStorageStep duration - The interval between datapoints stored in the database. It is used at Graphite Render API handler for normalizing the interval between datapoints in case it isn't normalized. It can be overridden by sending 'storage_step' query arg to /render API or by sending the desired interval via 'Storage-Step' http header during querying /render API. This flag is available only in VictoriaMetrics enterprise. See https://docs.victoriametrics.com/enterprise.html (default 10s) + The interval between datapoints stored in the database. It is used at Graphite Render API handler for normalizing the interval between datapoints in case it isn't normalized. It can be overridden by sending 'storage_step' query arg to /render API or by sending the desired interval via 'Storage-Step' http header during querying /render API (default 10s) -search.latencyOffset duration The time when data points become visible in query results after the collection. It can be overridden on per-query basis via latency_offset arg. Too small value can result in incomplete last points for query results (default 30s) -search.logQueryMemoryUsage size @@ -2461,7 +2461,7 @@ Pass `-help` to VictoriaMetrics in order to see the list of supported command-li -search.maxFederateSeries int The maximum number of time series, which can be returned from /federate. This option allows limiting memory usage (default 1000000) -search.maxGraphiteSeries int - The maximum number of time series, which can be scanned during queries to Graphite Render API. See https://docs.victoriametrics.com/#graphite-render-api-usage . This flag is available only in VictoriaMetrics enterprise. See https://docs.victoriametrics.com/enterprise.html (default 300000) + The maximum number of time series, which can be scanned during queries to Graphite Render API. See https://docs.victoriametrics.com/#graphite-render-api-usage (default 300000) -search.maxLookback duration Synonym to -search.lookback-delta from Prometheus. The value is dynamically detected from interval between time series datapoints if not set. It can be overridden on per-query basis via max_lookback arg. See also '-search.maxStalenessInterval' flag, which has the same meaning due to historical reasons -search.maxMemoryPerQuery size diff --git a/docs/enterprise.md b/docs/enterprise.md index fb63b3b75..cfadb77f1 100644 --- a/docs/enterprise.md +++ b/docs/enterprise.md @@ -34,10 +34,6 @@ plus the following additional features: by specifying different retentions to different datasets. - [Automatic discovery of vmstorage nodes](https://docs.victoriametrics.com/Cluster-VictoriaMetrics.html#automatic-vmstorage-discovery) - this feature allows updating the list of `vmstorage` nodes at `vminsert` and `vmselect` without the need to restart these services. -- [Graphite querying](https://docs.victoriametrics.com/#graphite-render-api-usage) - this feature allows seamless - transition from Graphite to VictoriaMetrics without the need to modify queries at dashboards and alerts. - VictoriaMetrics allows reducing infrastructure costs by more than 10x comparing to Graphite - - see [this case study](https://docs.victoriametrics.com/CaseStudies.html#grammarly). - [Backup automation](https://docs.victoriametrics.com/vmbackupmanager.html). - [Advanced per-tenant stats](https://docs.victoriametrics.com/PerTenantStatistic.html).
- [Advanced auth and rate limiter](https://docs.victoriametrics.com/vmgateway.html).
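
Reviewer note on the new app/vmselect/graphite/transform.go code: the holtWinters* helpers in this patch are the per-step update equations of additive triple exponential smoothing (Holt-Winters). Below is a minimal, self-contained Go sketch of how they might compose into one pass over a series. The driver loop, the alpha/beta/gamma constants, seasonLength, the sample values and the simplified initialization are illustrative assumptions, not part of the patch; only the three helper bodies mirror the patched code.

package main

import (
	"fmt"
	"math"
)

// The three helpers below mirror the patched transform.go code.
func holtWintersSlope(beta, intercept, lastIntercept, lastSlope float64) float64 {
	return beta*(intercept-lastIntercept) + (1-beta)*lastSlope
}

func holtWintersSeasonal(gamma, actual, intercept, lastSeason float64) float64 {
	return gamma*(actual-intercept) + (1-gamma)*lastSeason
}

func holtWintersDeviation(gamma, actual, prediction, lastSeasonalDev float64) float64 {
	if math.IsNaN(prediction) {
		prediction = 0
	}
	return gamma*math.Abs(actual-prediction) + (1-gamma)*lastSeasonalDev
}

func main() {
	// Illustrative smoothing parameters and a short series with period 4.
	const alpha, beta, gamma = 0.1, 0.0035, 0.1
	const seasonLength = 4
	values := []float64{10, 20, 30, 40, 11, 22, 33, 44, 12, 24, 36, 48}

	// Simplified initialization: level starts at the first value,
	// trend at zero, seasonal components and deviations at zero.
	intercept, slope := values[0], 0.0
	seasonals := make([]float64, seasonLength)
	devs := make([]float64, seasonLength)

	for i, v := range values {
		si := i % seasonLength
		// The forecast for this step uses the previous level and trend
		// plus the seasonal component from one full season ago.
		prediction := intercept + slope + seasonals[si]
		lastIntercept, lastSeason := intercept, seasonals[si]
		// Level update: blend the deseasonalized observation with the
		// trend-extrapolated previous level.
		intercept = alpha*(v-lastSeason) + (1-alpha)*(intercept+slope)
		slope = holtWintersSlope(beta, intercept, lastIntercept, slope)
		seasonals[si] = holtWintersSeasonal(gamma, v, intercept, lastSeason)
		devs[si] = holtWintersDeviation(gamma, v, prediction, devs[si])
		fmt.Printf("i=%2d value=%5.1f predicted=%7.2f deviation=%6.2f\n", i, v, prediction, devs[si])
	}
}

Tracking the absolute forecast error per seasonal slot is what lets Graphite-style holtWintersConfidenceBands draw bands at multiples of the deviation around the prediction.

Separately, the docs updated above describe querying the now open-sourced /render endpoint with the Storage-Step header. Here is a hedged client sketch in Go, assuming a single-node VictoriaMetrics on the default localhost:8428 and a hypothetical target expression foo.bar.*; the /render path and the Storage-Step header come from the docs in this patch, while the target, from and format parameters follow the upstream Graphite Render API.

package main

import (
	"fmt"
	"io"
	"log"
	"net/http"
	"net/url"
)

func main() {
	// Assumed single-node VictoriaMetrics address; adjust as needed.
	base := "http://localhost:8428/render"
	params := url.Values{}
	params.Set("target", "foo.bar.*") // hypothetical Graphite target expression
	params.Set("from", "-1h")         // standard Graphite render time range
	params.Set("format", "json")

	req, err := http.NewRequest(http.MethodGet, base+"?"+params.Encode(), nil)
	if err != nil {
		log.Fatal(err)
	}
	// Per the docs above: tell VictoriaMetrics the interval between stored datapoints.
	req.Header.Set("Storage-Step", "10s")

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()
	body, err := io.ReadAll(resp.Body)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("status=%s body=%s\n", resp.Status, body)
}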