Compare commits

..

11 Commits

Author SHA1 Message Date
joshhunt 978c989f3e Dashboards: Lazy-load scripted dashboards dependencies 2026-01-06 11:34:29 +00:00
Peter Nguyen 217427e072 Loki Language Provider: Add missing interpolation to fetchLabelsByLabelsEndpoint (#114608)
* Plugins: Implement bug fix for loki label selectors w/ variable interpolation

* Chore: Add test to ensure result is interpolated

---------

Co-authored-by: Zoltán Bedi <zoltan.bedi@gmail.com>
2026-01-06 10:29:51 +00:00
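The fix above amounts to running variable interpolation over the stream selector before it is passed to Loki's labels endpoint, so that `$var` placeholders never reach the API. A minimal sketch of the idea, assuming a simple `$name`-style substitution (the function names and URL shape here are illustrative, not the actual Grafana language-provider API):

```typescript
// Hypothetical sketch: interpolate dashboard variables in a Loki stream
// selector before building the labels-endpoint request.
const interpolate = (expr: string, vars: Record<string, string>): string =>
  // Replace each $name placeholder with its variable value, if known.
  expr.replace(/\$(\w+)/g, (_match, name) => vars[name] ?? `$${name}`);

function buildLabelsQuery(streamSelector: string, vars: Record<string, string>): string {
  // Before the fix, the raw selector (still containing `$var`) was sent as-is.
  const interpolated = interpolate(streamSelector, vars);
  return `/loki/api/v1/labels?query=${encodeURIComponent(interpolated)}`;
}
```

With `vars = { job: "grafana" }`, a selector like `{job="$job"}` is rewritten to `{job="grafana"}` before being URL-encoded into the request, which is the behavior the added test in this commit pins down.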
grafana-pr-automation[bot] 585d24dafa I18n: Download translations from Crowdin (#115860)
New Crowdin translations by GitHub Action

Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
2026-01-06 10:01:38 +00:00
Josh Hunt fccece3ca0 Refactor: Remove jQuery from AppWrapper (#115842) 2026-01-06 09:58:42 +00:00
Juan Cabanas d44cab9eaf DashboardLibrary: Add validations to visualize community dashboards (#114562)
* dashboard library check added

* community dashboard section tests in progress

* tests added

* translations added

* pagination removed

* total pages removed

* test updated. pagination removed

* filters applied

* tracking event removed to be created in another pr

* slug added so url is correctly generated

* ui fix

* improvements after review

* improvements after review

* more tests added. new logic created

* fix

* changes applied

* tests removed. pattern updated

* preset of 6 elements applied

* Improve code comments and adjust variable name based on PR feedback

* Fix unit test and add extra case for regex pattern

* Fix interaction event; we were missing contentKind on the BasicProvisioned flow, and datasource types were not being sent

---------

Co-authored-by: nmarrs <nathanielmarrs@gmail.com>
Co-authored-by: alexandra vargas <alexa1866@gmail.com>
2026-01-06 10:38:15 +01:00
Saurabh Yadav 3d3b4dd213 Clean up packages/grafana-prometheus/src/dashboards (#115861)
* remove: dashboard JSON files

* removed: dashboards from packages/grafana-prometheus/src/dashboards
2026-01-06 09:26:04 +00:00
Larissa Wandzura 2947d41ea8 Docs: Fixed broken links for Cloudwatch (#115848)
* updates broken links and aliases

* fixed query editor links
2026-01-05 20:56:50 +00:00
Stephanie Hingtgen 0acb030f46 Revert: OSS Seeding (115729) (#115839) 2026-01-05 12:33:55 -06:00
Stephanie Hingtgen 658a1c8228 Dashboards: Allow editing provisioned dashboards if AllowUIUpdates is set (#115804) 2026-01-05 11:46:14 -06:00
Will Browne 618316a2f7 Revert "App Plugins: Allow to define experimental pages" (#115841)
Revert "App Plugins: Allow to define experimental pages (#114232)"

This reverts commit e1a2f178e7.
2026-01-05 17:04:07 +00:00
vesalaakso-oura a9c2117aa7 Transformers: Add smoothing transformer (#111077)
* Transformers: Add smoothing transformer

Added a smoothing transformer to help clean up noisy time series data.
It uses the ASAP algorithm to pick the most important data points while
keeping the overall shape and trends intact.

The transformer always keeps the first and last points so you get the
complete time range. I also added a test for it.

* Change category

Change category from Reformat to CalculateNewFields

* Remove first/last point preservation

* Fix operator recreation

* Simplify ASAP code

Include performance optimization as well

* Refactor interpolateFromSmoothedCurve

Break function into smaller focused functions and lift functions to the
top level

* Add isApplicable Check

Make sure the transformer is applicable for timeseries data

* Add tests for isApplicable check

* UI/UX improvements: Display effective resolution when limited by data points

Show "Effective: X" indicator when resolution is capped by the 2x data
points multiplier. Includes tooltip explaining the limit.

Memoizes calculation to prevent unnecessary recalculation on re-renders.

Example: With 72 data points and resolution set to 150, displays
"Effective: 144" since the limit is 72 x 2 = 144.

Plus added tests

* Improve discoverability by adding tags

* Preserve Original Data

Let's preserve the original data as well; it makes the UX much better.
Changed from appending "(smoothed)" to frame names to using a "Smoothed" frame name. This should match the pattern used by other transformers (e.g., regression).
Updated tests accordingly.
Updated tooltip note.

* Add asap tests

Basic functionality:
* returns valid DataPoint objects
* Maintain x-axis ordering

Edge cases:
* Empty array
* single data point
* filter NaN values
* all NaN values
* sort unsorted data
* negative values

* Update dark and light images

* Clear state cache

* Add feature toggle

* Conditionally add new transformation to the registry

* chore: update and regenerate feature toggles

* chore: update yarn.lock

* chore: fix transformers and imports
2026-01-05 17:53:45 +01:00
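The "Effective: X" behavior described in this commit message follows a simple rule: the requested resolution is capped at twice the number of input data points. A hedged sketch of that calculation (the function name is illustrative, not the actual transformer code):

```typescript
// Sketch of the effective-resolution rule from the commit message:
// the user-requested resolution is capped at 2x the input data point count.
function effectiveResolution(requested: number, dataPointCount: number): number {
  const cap = dataPointCount * 2;
  return Math.min(requested, cap);
}
```

This matches the worked example in the message: with 72 data points and a requested resolution of 150, the effective resolution is 72 × 2 = 144, and the UI shows the "Effective: 144" indicator only when this cap actually lowers the requested value.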
85 changed files with 3365 additions and 5391 deletions
@@ -1,11 +1,12 @@
---
aliases:
- ../data-sources/aws-CloudWatch/
- ../data-sources/aws-CloudWatch/preconfig-CloudWatch-dashboards/
- ../data-sources/aws-CloudWatch/provision-CloudWatch/
- CloudWatch/
- preconfig-CloudWatch-dashboards/
- provision-CloudWatch/
- ../../data-sources/aws-cloudwatch/configure/
- ../../data-sources/aws-cloudwatch/
- ../../data-sources/aws-cloudwatch/preconfig-cloudwatch-dashboards/
- ../../data-sources/aws-cloudwatch/provision-cloudwatch/
- ../cloudwatch/
- ../preconfig-cloudwatch-dashboards/
- ../provision-cloudwatch/
description: This document provides configuration instructions for the CloudWatch data source.
keywords:
- grafana
@@ -25,11 +26,6 @@ refs:
destination: /docs/grafana/<GRAFANA_VERSION>/panels-visualizations/visualizations/logs/
- pattern: /docs/grafana-cloud/
destination: /docs/grafana/<GRAFANA_VERSION>/panels-visualizations/visualizations/logs/
explore:
- pattern: /docs/grafana/
destination: /docs/grafana/<GRAFANA_VERSION>/explore/
- pattern: /docs/grafana-cloud/
destination: /docs/grafana/<GRAFANA_VERSION>/explore/
provisioning-data-sources:
- pattern: /docs/grafana/
destination: /docs/grafana/<GRAFANA_VERSION>/administration/provisioning/#data-sources
@@ -40,16 +36,6 @@ refs:
destination: /docs/grafana/<GRAFANA_VERSION>/setup-grafana/configure-grafana/#aws
- pattern: /docs/grafana-cloud/
destination: /docs/grafana/<GRAFANA_VERSION>/setup-grafana/configure-grafana/#aws
alerting:
- pattern: /docs/grafana/
destination: /docs/grafana/<GRAFANA_VERSION>/alerting/
- pattern: /docs/grafana-cloud/
destination: /docs/grafana-cloud/alerting-and-irm/alerting/
build-dashboards:
- pattern: /docs/grafana/
destination: /docs/grafana/<GRAFANA_VERSION>/dashboards/build-dashboards/
- pattern: /docs/grafana-cloud/
destination: /docs/grafana/<GRAFANA_VERSION>/dashboards/build-dashboards/
data-source-management:
- pattern: /docs/grafana/
destination: /docs/grafana/<GRAFANA_VERSION>/administration/data-source-management/
@@ -153,7 +139,7 @@ You must use both an access key ID and a secret access key to authenticate.
Grafana automatically creates a link to a trace in X-Ray data source if logs contain the `@xrayTraceId` field. To use this feature, you must already have an X-Ray data source configured. For details, see the [X-Ray data source docs](/grafana/plugins/grafana-X-Ray-datasource/). To view the X-Ray link, select the log row in either the Explore view or dashboard [Logs panel](ref:logs) to view the log details section.
To log the `@xrayTraceId`, refer to the [AWS X-Ray documentation](https://docs.amazonaws.cn/en_us/xray/latest/devguide/xray-services.html). To provide the field to Grafana, your log queries must also contain the `@xrayTraceId` field, for example by using the query `fields @message, @xrayTraceId`.
To log the `@xrayTraceId`, refer to the [AWS X-Ray documentation](https://docs.aws.amazon.com/xray/latest/devguide/xray-services.html). To provide the field to Grafana, your log queries must also contain the `@xrayTraceId` field, for example by using the query `fields @message, @xrayTraceId`.
**Private data source connect** - _Only for Grafana Cloud users._
@@ -34,11 +34,6 @@ refs:
destination: /docs/grafana/<GRAFANA_VERSION>/panels-visualizations/query-transform-data/#navigate-the-query-tab
- pattern: /docs/grafana-cloud/
destination: /docs/grafana/<GRAFANA_VERSION>/panels-visualizations/query-transform-data/#navigate-the-query-tab
explore:
- pattern: /docs/grafana/
destination: /docs/grafana/<GRAFANA_VERSION>/explore/
- pattern: /docs/grafana-cloud/
destination: /docs/grafana/<GRAFANA_VERSION>/explore/
alerting:
- pattern: /docs/grafana/
destination: /docs/grafana/<GRAFANA_VERSION>/alerting/
@@ -183,7 +178,7 @@ If you use the expression field to reference another query, such as `queryA * 2`
When you select `Builder` mode within the Metric search editor, a new Account field is displayed. Use the `Account` field to specify which of the linked monitoring accounts to target for the given query. By default, the `All` option is specified, which will target all linked accounts.
While in `Code` mode, you can specify any math expression. If the Monitoring account badge displays in the query editor header, all `SEARCH` expressions entered in this field will be cross-account by default and can query metrics from linked accounts. Note that while queries run cross-account, the autocomplete feature currently doesn't fetch cross-account resources, so you'll need to manually specify resource names when writing cross-account queries.
You can limit the search to one or a set of accounts, as documented in the [AWS documentation](http://docs.aws.amazon.com/AmazonCloudWatch/latest/monitoring/CloudWatch-Unified-Cross-Account.html).
You can limit the search to one or a set of accounts, as documented in the [AWS documentation](https://docs.aws.amazon.com/AmazonCloudWatch/latest/monitoring/CloudWatch-Unified-Cross-Account.html).
### Period macro
@@ -198,7 +193,7 @@ The link provided is valid for any account but displays the expected metrics onl
{{< figure src="/media/docs/cloudwatch/cloudwatch-deep-link-v12.1.png" caption="CloudWatch deep linking" >}}
This feature is not available for metrics based on [metric math expressions](#metric-math-expressions).
This feature is not available for metrics based on [metric math expressions](#use-metric-math-expressions).
### Use Metric Insights syntax
@@ -319,9 +314,9 @@ The CloudWatch plugin monitors and troubleshoots applications that span multiple
To enable cross-account observability, complete the following steps:
1. Go to the [Amazon CloudWatch documentation](http://docs.aws.amazon.com/AmazonCloudWatch/latest/monitoring/CloudWatch-Unified-Cross-Account.html) and follow the instructions for enabling cross-account observability.
1. Go to the [Amazon CloudWatch documentation](https://docs.aws.amazon.com/AmazonCloudWatch/latest/monitoring/CloudWatch-Unified-Cross-Account.html) and follow the instructions for enabling cross-account observability.
1. Add [two API actions](https://grafana.com//docs/grafana/latest/datasources/aws-cloudwatch/configure/#cross-account-observability-permissions) to the IAM policy attached to the role/user running the plugin.
1. Add [two API actions](https://grafana.com/docs/grafana/latest/datasources/aws-cloudwatch/configure/#cross-account-observability-permissions) to the IAM policy attached to the role/user running the plugin.
Cross-account querying is available in the plugin through the **Logs**, **Metric search**, and **Metric Insights** modes.
After you have configured it, you'll see a **Monitoring account** badge in the query editor header.
@@ -347,6 +347,7 @@
"date-fns": "4.1.0",
"debounce-promise": "3.1.2",
"diff": "^8.0.0",
"downsample": "1.4.0",
"fast-deep-equal": "^3.1.3",
"fast-json-patch": "3.1.1",
"file-saver": "2.0.5",
@@ -42,5 +42,6 @@ export enum DataTransformerID {
formatTime = 'formatTime',
formatString = 'formatString',
regression = 'regression',
smoothing = 'smoothing',
groupToNestedTable = 'groupToNestedTable',
}
@@ -1255,4 +1255,8 @@ export interface FeatureToggles {
* Enables support for variables whose values can have multiple properties
*/
multiPropsVariables?: boolean;
/**
* Enables the ASAP smoothing transformation for time series data
*/
smoothingTransformation?: boolean;
}
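The diffs above add both a `smoothing` entry to `DataTransformerID` and a `smoothingTransformation` feature toggle, and one of the commits conditionally registers the transformer behind that toggle. A simplified sketch of that gating pattern (the registry shape here is reduced to illustrate the idea, not Grafana's actual registration API):

```typescript
// Illustrative sketch: only include the smoothing transformer in the
// registry when the smoothingTransformation feature toggle is enabled.
interface FeatureToggles {
  smoothingTransformation?: boolean;
}

interface TransformerRegistration {
  id: string;
}

function standardTransformers(toggles: FeatureToggles): TransformerRegistration[] {
  const transformers: TransformerRegistration[] = [
    { id: 'regression' },
    { id: 'groupToNestedTable' },
  ];
  if (toggles.smoothingTransformation) {
    transformers.push({ id: 'smoothing' });
  }
  return transformers;
}
```

Gating at registration time means a disabled toggle hides the transformation everywhere it is surfaced (pickers, saved dashboards), rather than requiring per-call-site checks.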
File diff suppressed because it is too large
File diff suppressed because it is too large
@@ -1,834 +0,0 @@
{
"_comment": "Core Grafana history https://github.com/grafana/grafana/blob/v11.0.0-preview/public/app/plugins/datasource/prometheus/dashboards/prometheus_stats.json",
"__inputs": [
{
"name": "DS_GDEV-PROMETHEUS",
"label": "gdev-prometheus",
"description": "",
"type": "datasource",
"pluginId": "prometheus",
"pluginName": "Prometheus"
}
],
"__requires": [
{
"type": "grafana",
"id": "grafana",
"name": "Grafana",
"version": "8.1.0-pre"
},
{
"type": "datasource",
"id": "prometheus",
"name": "Prometheus",
"version": "1.0.0"
},
{
"type": "panel",
"id": "stat",
"name": "Stat",
"version": ""
},
{
"type": "panel",
"id": "text",
"name": "Text",
"version": ""
},
{
"type": "panel",
"id": "timeseries",
"name": "Time series",
"version": ""
}
],
"annotations": {
"list": [
{
"builtIn": 1,
"datasource": "-- Grafana --",
"enable": true,
"hide": true,
"iconColor": "rgba(0, 211, 255, 1)",
"name": "Annotations & Alerts",
"type": "dashboard"
}
]
},
"editable": true,
"gnetId": null,
"graphTooltip": 0,
"id": null,
"iteration": 1624859749459,
"links": [
{
"icon": "info",
"tags": [],
"targetBlank": true,
"title": "Grafana Docs",
"tooltip": "",
"type": "link",
"url": "https://grafana.com/docs/grafana/latest/"
},
{
"icon": "info",
"tags": [],
"targetBlank": true,
"title": "Prometheus Docs",
"type": "link",
"url": "http://prometheus.io/docs/introduction/overview/"
}
],
"panels": [
{
"cacheTimeout": null,
"datasource": "${DS_GDEV-PROMETHEUS}",
"fieldConfig": {
"defaults": {
"color": {
"mode": "thresholds"
},
"decimals": 1,
"mappings": [
{
"options": {
"match": "null",
"result": {
"text": "N/A"
}
},
"type": "special"
}
],
"thresholds": {
"mode": "absolute",
"steps": [
{
"color": "green",
"value": null
},
{
"color": "red",
"value": 80
}
]
},
"unit": "s"
},
"overrides": []
},
"gridPos": {
"h": 5,
"w": 6,
"x": 0,
"y": 0
},
"id": 5,
"interval": null,
"links": [],
"maxDataPoints": 100,
"options": {
"colorMode": "none",
"graphMode": "none",
"justifyMode": "auto",
"orientation": "horizontal",
"reduceOptions": {
"calcs": ["lastNotNull"],
"fields": "",
"values": false
},
"text": {},
"textMode": "auto"
},
"pluginVersion": "8.1.0-pre",
"targets": [
{
"expr": "(time() - process_start_time_seconds{job=\"prometheus\", instance=~\"$node\"})",
"intervalFactor": 2,
"refId": "A"
}
],
"title": "Uptime",
"type": "stat"
},
{
"cacheTimeout": null,
"datasource": "${DS_GDEV-PROMETHEUS}",
"fieldConfig": {
"defaults": {
"color": {
"fixedColor": "rgb(31, 120, 193)",
"mode": "fixed"
},
"mappings": [],
"thresholds": {
"mode": "absolute",
"steps": [
{
"color": "rgba(50, 172, 45, 0.97)",
"value": null
},
{
"color": "rgba(237, 129, 40, 0.89)",
"value": 1
},
{
"color": "rgba(245, 54, 54, 0.9)",
"value": 5
}
]
},
"unit": "none"
},
"overrides": []
},
"gridPos": {
"h": 5,
"w": 6,
"x": 6,
"y": 0
},
"id": 6,
"interval": null,
"links": [],
"maxDataPoints": 100,
"options": {
"colorMode": "none",
"graphMode": "area",
"justifyMode": "auto",
"orientation": "horizontal",
"reduceOptions": {
"calcs": ["lastNotNull"],
"fields": "",
"values": false
},
"text": {},
"textMode": "auto"
},
"pluginVersion": "8.1.0-pre",
"targets": [
{
"expr": "prometheus_local_storage_memory_series{instance=~\"$node\"}",
"intervalFactor": 2,
"refId": "A"
}
],
"title": "Local Storage Memory Series",
"type": "stat"
},
{
"cacheTimeout": null,
"datasource": "${DS_GDEV-PROMETHEUS}",
"fieldConfig": {
"defaults": {
"color": {
"mode": "thresholds"
},
"mappings": [
{
"options": {
"0": {
"text": "Empty"
}
},
"type": "value"
}
],
"thresholds": {
"mode": "absolute",
"steps": [
{
"color": "rgba(50, 172, 45, 0.97)",
"value": null
},
{
"color": "rgba(237, 129, 40, 0.89)",
"value": 500
},
{
"color": "rgba(245, 54, 54, 0.9)",
"value": 4000
}
]
},
"unit": "none"
},
"overrides": []
},
"gridPos": {
"h": 5,
"w": 6,
"x": 12,
"y": 0
},
"id": 7,
"interval": null,
"links": [],
"maxDataPoints": 100,
"options": {
"colorMode": "value",
"graphMode": "area",
"justifyMode": "auto",
"orientation": "horizontal",
"reduceOptions": {
"calcs": ["lastNotNull"],
"fields": "",
"values": false
},
"text": {},
"textMode": "auto"
},
"pluginVersion": "8.1.0-pre",
"targets": [
{
"expr": "prometheus_local_storage_indexing_queue_length{instance=~\"$node\"}",
"intervalFactor": 2,
"refId": "A"
}
],
"title": "Internal Storage Queue Length",
"type": "stat"
},
{
"datasource": null,
"editable": true,
"error": false,
"gridPos": {
"h": 5,
"w": 6,
"x": 18,
"y": 0
},
"id": 9,
"links": [],
"options": {
"content": "<span style=\"font-family: 'Open Sans', 'Helvetica Neue', Helvetica; font-size: 25px;vertical-align: text-top;color: #bbbfc2;margin-left: 10px;\">Prometheus</span>\n\n<p style=\"margin-top: 10px;\">You're using Prometheus, an open-source systems monitoring and alerting toolkit originally built at SoundCloud. For more information, check out the <a href=\"https://grafana.com/\">Grafana</a> and <a href=\"http://prometheus.io/\">Prometheus</a> projects.</p>",
"mode": "html"
},
"pluginVersion": "8.1.0-pre",
"style": {},
"transparent": true,
"type": "text"
},
{
"datasource": "${DS_GDEV-PROMETHEUS}",
"fieldConfig": {
"defaults": {
"color": {
"mode": "palette-classic"
},
"custom": {
"axisLabel": "",
"axisPlacement": "auto",
"barAlignment": 0,
"drawStyle": "line",
"fillOpacity": 10,
"gradientMode": "none",
"hideFrom": {
"legend": false,
"tooltip": false,
"viz": false
},
"lineInterpolation": "linear",
"lineWidth": 2,
"pointSize": 5,
"scaleDistribution": {
"type": "linear"
},
"showPoints": "never",
"spanNulls": false,
"stacking": {
"group": "A",
"mode": "none"
},
"thresholdsStyle": {
"mode": "off"
}
},
"links": [],
"mappings": [],
"thresholds": {
"mode": "absolute",
"steps": [
{
"color": "green",
"value": null
},
{
"color": "red",
"value": 80
}
]
},
"unit": "short"
},
"overrides": [
{
"matcher": {
"id": "byName",
"options": "prometheus"
},
"properties": [
{
"id": "color",
"value": {
"fixedColor": "#C15C17",
"mode": "fixed"
}
}
]
},
{
"matcher": {
"id": "byName",
"options": "{instance=\"localhost:9090\",job=\"prometheus\"}"
},
"properties": [
{
"id": "color",
"value": {
"fixedColor": "#C15C17",
"mode": "fixed"
}
}
]
}
]
},
"gridPos": {
"h": 6,
"w": 18,
"x": 0,
"y": 5
},
"id": 3,
"links": [],
"options": {
"legend": {
"calcs": [],
"displayMode": "list",
"placement": "bottom"
},
"tooltip": {
"mode": "single"
}
},
"pluginVersion": "8.1.0-pre",
"targets": [
{
"expr": "rate(prometheus_local_storage_ingested_samples_total{instance=~\"$node\"}[5m])",
"interval": "",
"intervalFactor": 2,
"legendFormat": "{{job}}",
"metric": "",
"refId": "A"
}
],
"timeFrom": null,
"timeShift": null,
"title": "Samples ingested (rate-5m)",
"type": "timeseries"
},
{
"datasource": null,
"editable": true,
"error": false,
"gridPos": {
"h": 6,
"w": 4,
"x": 18,
"y": 5
},
"id": 8,
"links": [],
"options": {
"content": "#### Samples Ingested\nThis graph displays the count of samples ingested by the Prometheus server, as measured over the last 5 minutes, per time series in the range vector. When troubleshooting an issue on IRC or GitHub, this is often the first stat requested by the Prometheus team. ",
"mode": "markdown"
},
"pluginVersion": "8.1.0-pre",
"style": {},
"transparent": true,
"type": "text"
},
{
"datasource": "${DS_GDEV-PROMETHEUS}",
"fieldConfig": {
"defaults": {
"color": {
"mode": "palette-classic"
},
"custom": {
"axisLabel": "",
"axisPlacement": "auto",
"barAlignment": 0,
"drawStyle": "line",
"fillOpacity": 10,
"gradientMode": "none",
"hideFrom": {
"legend": false,
"tooltip": false,
"viz": false
},
"lineInterpolation": "linear",
"lineWidth": 2,
"pointSize": 5,
"scaleDistribution": {
"type": "linear"
},
"showPoints": "never",
"spanNulls": false,
"stacking": {
"group": "A",
"mode": "none"
},
"thresholdsStyle": {
"mode": "off"
}
},
"links": [],
"mappings": [],
"thresholds": {
"mode": "absolute",
"steps": [
{
"color": "green",
"value": null
},
{
"color": "red",
"value": 80
}
]
},
"unit": "short"
},
"overrides": [
{
"matcher": {
"id": "byName",
"options": "prometheus"
},
"properties": [
{
"id": "color",
"value": {
"fixedColor": "#F9BA8F",
"mode": "fixed"
}
}
]
},
{
"matcher": {
"id": "byName",
"options": "{instance=\"localhost:9090\",interval=\"5s\",job=\"prometheus\"}"
},
"properties": [
{
"id": "color",
"value": {
"fixedColor": "#F9BA8F",
"mode": "fixed"
}
}
]
}
]
},
"gridPos": {
"h": 7,
"w": 10,
"x": 0,
"y": 11
},
"id": 2,
"links": [],
"options": {
"legend": {
"calcs": [],
"displayMode": "list",
"placement": "bottom"
},
"tooltip": {
"mode": "single"
}
},
"pluginVersion": "8.1.0-pre",
"targets": [
{
"expr": "rate(prometheus_target_interval_length_seconds_count{instance=~\"$node\"}[5m])",
"intervalFactor": 2,
"legendFormat": "{{job}}",
"refId": "A"
}
],
"timeFrom": null,
"timeShift": null,
"title": "Target Scrapes (last 5m)",
"type": "timeseries"
},
{
"datasource": "${DS_GDEV-PROMETHEUS}",
"fieldConfig": {
"defaults": {
"color": {
"mode": "palette-classic"
},
"custom": {
"axisLabel": "",
"axisPlacement": "auto",
"barAlignment": 0,
"drawStyle": "line",
"fillOpacity": 10,
"gradientMode": "none",
"hideFrom": {
"legend": false,
"tooltip": false,
"viz": false
},
"lineInterpolation": "linear",
"lineWidth": 2,
"pointSize": 5,
"scaleDistribution": {
"type": "linear"
},
"showPoints": "never",
"spanNulls": false,
"stacking": {
"group": "A",
"mode": "none"
},
"thresholdsStyle": {
"mode": "off"
}
},
"links": [],
"mappings": [],
"thresholds": {
"mode": "absolute",
"steps": [
{
"color": "green",
"value": null
},
{
"color": "red",
"value": 80
}
]
},
"unit": "short"
},
"overrides": []
},
"gridPos": {
"h": 7,
"w": 8,
"x": 10,
"y": 11
},
"id": 14,
"links": [],
"options": {
"legend": {
"calcs": [],
"displayMode": "list",
"placement": "bottom"
},
"tooltip": {
"mode": "single"
}
},
"pluginVersion": "8.1.0-pre",
"targets": [
{
"expr": "prometheus_target_interval_length_seconds{quantile!=\"0.01\", quantile!=\"0.05\",instance=~\"$node\"}",
"interval": "",
"intervalFactor": 2,
"legendFormat": "{{quantile}} ({{interval}})",
"metric": "",
"refId": "A"
}
],
"timeFrom": null,
"timeShift": null,
"title": "Scrape Duration",
"type": "timeseries"
},
{
"datasource": null,
"editable": true,
"error": false,
"gridPos": {
"h": 7,
"w": 6,
"x": 18,
"y": 11
},
"id": 11,
"links": [],
"options": {
"content": "#### Scrapes\nPrometheus scrapes metrics from instrumented jobs, either directly or via an intermediary push gateway for short-lived jobs. Target scrapes will show how frequently targets are scraped, as measured over the last 5 minutes, per time series in the range vector. Scrape Duration will show how long the scrapes are taking, with percentiles available as series. ",
"mode": "markdown"
},
"pluginVersion": "8.1.0-pre",
"style": {},
"transparent": true,
"type": "text"
},
{
"datasource": "${DS_GDEV-PROMETHEUS}",
"fieldConfig": {
"defaults": {
"color": {
"mode": "palette-classic"
},
"custom": {
"axisLabel": "",
"axisPlacement": "auto",
"barAlignment": 0,
"drawStyle": "line",
"fillOpacity": 10,
"gradientMode": "none",
"hideFrom": {
"legend": false,
"tooltip": false,
"viz": false
},
"lineInterpolation": "linear",
"lineWidth": 2,
"pointSize": 5,
"scaleDistribution": {
"type": "linear"
},
"showPoints": "never",
"spanNulls": false,
"stacking": {
"group": "A",
"mode": "none"
},
"thresholdsStyle": {
"mode": "off"
}
},
"links": [],
"mappings": [],
"thresholds": {
"mode": "absolute",
"steps": [
{
"color": "green",
"value": null
},
{
"color": "red",
"value": 80
}
]
},
"unit": "percentunit"
},
"overrides": []
},
"gridPos": {
"h": 7,
"w": 18,
"x": 0,
"y": 18
},
"id": 12,
"links": [],
"options": {
"legend": {
"calcs": [],
"displayMode": "list",
"placement": "bottom"
},
"tooltip": {
"mode": "single"
}
},
"pluginVersion": "8.1.0-pre",
"targets": [
{
"expr": "prometheus_evaluator_duration_seconds{quantile!=\"0.01\", quantile!=\"0.05\",instance=~\"$node\"}",
"interval": "",
"intervalFactor": 2,
"legendFormat": "{{quantile}}",
"refId": "A"
}
],
"timeFrom": null,
"timeShift": null,
"title": "Rule Eval Duration",
"type": "timeseries"
},
{
"datasource": null,
"editable": true,
"error": false,
"gridPos": {
"h": 7,
"w": 6,
"x": 18,
"y": 18
},
"id": 15,
"links": [],
"options": {
"content": "#### Rule Evaluation Duration\nThis graph panel plots the duration for all evaluations to execute. The 50th percentile, 90th percentile and 99th percentile are shown as three separate series to help identify outliers that may be skewing the data.",
"mode": "markdown"
},
"pluginVersion": "8.1.0-pre",
"style": {},
"transparent": true,
"type": "text"
}
],
"refresh": false,
"revision": "1.0",
"schemaVersion": 30,
"tags": ["prometheus"],
"templating": {
"list": [
{
"allValue": null,
"current": {},
"datasource": "${DS_GDEV-PROMETHEUS}",
"definition": "",
"description": null,
"error": null,
"hide": 0,
"includeAll": false,
"label": "HOST:",
"multi": false,
"name": "node",
"options": [],
"query": {
"query": "label_values(prometheus_build_info, instance)",
"refId": "gdev-prometheus-node-Variable-Query"
},
"refresh": 1,
"regex": "",
"skipUrlSync": false,
"sort": 1,
"tagValuesQuery": "",
"tagsQuery": "",
"type": "query",
"useTags": false
}
]
},
"time": {
"from": "now-5m",
"to": "now"
},
"timepicker": {
"now": true,
"refresh_intervals": ["5s", "10s", "30s", "1m", "5m", "15m", "30m", "1h", "2h", "1d"]
},
"timezone": "browser",
"title": "Prometheus Stats",
"uid": "rpfmFFz7z",
"version": 2
}
@@ -212,7 +212,6 @@ export const VizTooltipRow = ({
colorIndicator={colorIndicator}
position={ColorIndicatorPosition.Trailing}
lineStyle={lineStyle}
isHollow={isHiddenFromViz}
/>
)}
</div>
@@ -148,26 +148,18 @@ export const getContentItems = (
_restFields?.forEach((field) => {
if (!field.config.custom?.hideFrom?.tooltip) {
const valueIdx = dataIdxs[seriesIdx ?? 0];
const { colorIndicator, colorPlacement } = getIndicatorAndPlacement(field);
const display = field.display!(field.values[dataIdxs[0]!]);
if (valueIdx != null) {
const value = field.values[valueIdx];
if (value != null) {
const display = field.display!(value);
const { colorIndicator, colorPlacement } = getIndicatorAndPlacement(field);
rows.push({
label: field.state?.displayName ?? field.name,
value: formattedValueToString(display),
color: FALLBACK_COLOR,
colorIndicator,
colorPlacement,
lineStyle: field.config.custom?.lineStyle,
isHiddenFromViz: true,
});
}
}
rows.push({
label: field.state?.displayName ?? field.name,
value: formattedValueToString(display),
color: FALLBACK_COLOR,
colorIndicator,
colorPlacement,
lineStyle: field.config.custom?.lineStyle,
isHiddenFromViz: true,
});
}
});
@@ -1,7 +1,6 @@
package middleware
import (
"context"
"errors"
"net/http"
"net/url"
@@ -22,13 +21,6 @@ import (
"github.com/grafana/grafana/pkg/services/pluginsintegration/pluginstore"
"github.com/grafana/grafana/pkg/setting"
"github.com/grafana/grafana/pkg/web"
"github.com/open-feature/go-sdk/openfeature"
)
var openfeatureClient = openfeature.NewDefaultClient()
const (
pluginPageFeatureFlagPrefix = "plugin-page-visible."
)
type AuthOptions struct {
@@ -154,12 +146,6 @@ func RoleAppPluginAuth(accessControl ac.AccessControl, ps pluginstore.Store, log
return
}
if !PageIsFeatureToggleEnabled(c.Req.Context(), c.Req.URL.Path) {
logger.Debug("Forbidden experimental plugin page", "plugin", pluginID, "path", c.Req.URL.Path)
accessForbidden(c)
return
}
permitted := true
path := normalizeIncludePath(c.Req.URL.Path)
hasAccess := ac.HasAccess(accessControl, c)
@@ -308,18 +294,3 @@ func shouldForceLogin(c *contextmodel.ReqContext) bool {
return forceLogin
}
// PageIsFeatureToggleEnabled checks if a page is enabled via OpenFeature feature flags.
// It returns false if the feature flag is set and set to false.
// The feature flag key format is: "plugin-page-visible.<path>"
func PageIsFeatureToggleEnabled(ctx context.Context, path string) bool {
flagKey := pluginPageFeatureFlagPrefix + filepath.Clean(path)
enabled := openfeatureClient.Boolean(
ctx,
flagKey,
true,
openfeature.TransactionContext(ctx),
)
return enabled
}
@@ -1,17 +1,12 @@
package middleware
import (
"context"
"errors"
"fmt"
"net/http"
"net/http/httptest"
"sync"
"testing"
"github.com/open-feature/go-sdk/openfeature"
"github.com/open-feature/go-sdk/openfeature/memprovider"
oftesting "github.com/open-feature/go-sdk/openfeature/testing"
"github.com/stretchr/testify/assert"
"github.com/stretchr/testify/require"
@@ -33,8 +28,6 @@ import (
"github.com/grafana/grafana/pkg/web"
)
var openfeatureTestMutex sync.Mutex
func setupAuthMiddlewareTest(t *testing.T, identity *authn.Identity, authErr error) *contexthandler.ContextHandler {
return contexthandler.ProvideService(setting.NewCfg(), &authntest.FakeService{
ExpectedErr: authErr,
@@ -429,60 +422,6 @@ func TestCanAdminPlugin(t *testing.T) {
}
}
func TestPageIsFeatureToggleEnabled(t *testing.T) {
type testCase struct {
desc string
path string
flags map[string]bool
expectedResult bool
}
tests := []testCase{
{
desc: "returns true when feature flag is enabled",
path: "/a/my-plugin/settings",
flags: map[string]bool{
pluginPageFeatureFlagPrefix + "/a/my-plugin/settings": true,
},
expectedResult: true,
},
{
desc: "returns false when feature flag is disabled",
path: "/a/my-plugin/settings",
flags: map[string]bool{
pluginPageFeatureFlagPrefix + "/a/my-plugin/settings": false,
},
expectedResult: false,
},
{
desc: "returns false when feature flag is disabled with trailing slash",
path: "/a/my-plugin/settings/",
flags: map[string]bool{
pluginPageFeatureFlagPrefix + "/a/my-plugin/settings": false,
},
expectedResult: false,
},
{
desc: "returns true when feature flag does not exist",
path: "/a/my-plugin/settings",
flags: map[string]bool{},
expectedResult: true,
},
}
for _, tt := range tests {
t.Run(tt.desc, func(t *testing.T) {
ctx := context.Background()
setupTestProvider(t, tt.flags)
result := PageIsFeatureToggleEnabled(ctx, tt.path)
assert.Equal(t, tt.expectedResult, result)
})
}
}
func contextProvider(modifiers ...func(c *contextmodel.ReqContext)) web.Handler {
return func(c *web.Context) {
reqCtx := &contextmodel.ReqContext{
@@ -498,38 +437,3 @@ func contextProvider(modifiers ...func(c *contextmodel.ReqContext)) web.Handler
c.Req = c.Req.WithContext(ctxkey.Set(c.Req.Context(), reqCtx))
}
}
// setupTestProvider creates a test OpenFeature provider with the given flags.
// Uses a global lock to prevent concurrent provider changes across tests.
func setupTestProvider(t *testing.T, flags map[string]bool) oftesting.TestProvider {
t.Helper()
// Lock to prevent concurrent provider changes
openfeatureTestMutex.Lock()
testProvider := oftesting.NewTestProvider()
flagsMap := map[string]memprovider.InMemoryFlag{}
for key, value := range flags {
flagsMap[key] = memprovider.InMemoryFlag{
DefaultVariant: "defaultVariant",
Variants: map[string]any{
"defaultVariant": value,
},
}
}
testProvider.UsingFlags(t, flagsMap)
err := openfeature.SetProviderAndWait(testProvider)
require.NoError(t, err)
t.Cleanup(func() {
testProvider.Cleanup()
_ = openfeature.SetProviderAndWait(openfeature.NoopProvider{})
// Unlock after cleanup to allow other tests to run
openfeatureTestMutex.Unlock()
})
return testProvider
}
@@ -1,44 +0,0 @@
package acimpl
import (
"context"
"time"
"github.com/grafana/grafana/pkg/services/accesscontrol"
)
const (
ossBasicRoleSeedLockName = "oss-ac-basic-role-seeder"
ossBasicRoleSeedTimeout = 2 * time.Minute
)
// refreshBasicRolePermissionsInDB ensures basic role permissions are fully derived from in-memory registrations
func (s *Service) refreshBasicRolePermissionsInDB(ctx context.Context, rolesSnapshot map[string][]accesscontrol.Permission) error {
if s.sql == nil || s.seeder == nil {
return nil
}
run := func(ctx context.Context) error {
desired := map[accesscontrol.SeedPermission]struct{}{}
for role, permissions := range rolesSnapshot {
for _, permission := range permissions {
desired[accesscontrol.SeedPermission{BuiltInRole: role, Action: permission.Action, Scope: permission.Scope}] = struct{}{}
}
}
s.seeder.SetDesiredPermissions(desired)
return s.seeder.Seed(ctx)
}
if s.serverLock == nil {
return run(ctx)
}
var err error
errLock := s.serverLock.LockExecuteAndRelease(ctx, ossBasicRoleSeedLockName, ossBasicRoleSeedTimeout, func(ctx context.Context) {
err = run(ctx)
})
if errLock != nil {
return errLock
}
return err
}
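The loop at the top of refreshBasicRolePermissionsInDB flattens the per-role snapshot into a deduplicated set keyed by (builtin role, action, scope). A minimal self-contained sketch of that flattening, using local stand-ins for the accesscontrol types:

```go
package main

import "fmt"

// Local stand-ins for accesscontrol.Permission and accesscontrol.SeedPermission.
type permission struct{ Action, Scope string }
type seedPermission struct{ BuiltInRole, Action, Scope string }

// desiredSet flattens a role->permissions snapshot into a set, mirroring the
// loop in refreshBasicRolePermissionsInDB; duplicate tuples collapse to one key.
func desiredSet(snapshot map[string][]permission) map[seedPermission]struct{} {
	desired := map[seedPermission]struct{}{}
	for role, perms := range snapshot {
		for _, p := range perms {
			desired[seedPermission{BuiltInRole: role, Action: p.Action, Scope: p.Scope}] = struct{}{}
		}
	}
	return desired
}

func main() {
	snapshot := map[string][]permission{
		"Viewer": {
			{Action: "dashboards:read", Scope: "dashboards:*"},
			{Action: "dashboards:read", Scope: "dashboards:*"}, // duplicate
		},
	}
	fmt.Println(len(desiredSet(snapshot))) // duplicate collapses to one entry
}
```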
@@ -1,128 +0,0 @@
package acimpl
import (
"context"
"testing"
"time"
"github.com/stretchr/testify/require"
"github.com/grafana/grafana/pkg/infra/db"
"github.com/grafana/grafana/pkg/infra/localcache"
"github.com/grafana/grafana/pkg/infra/tracing"
"github.com/grafana/grafana/pkg/services/accesscontrol"
"github.com/grafana/grafana/pkg/services/accesscontrol/database"
"github.com/grafana/grafana/pkg/services/accesscontrol/permreg"
"github.com/grafana/grafana/pkg/services/accesscontrol/resourcepermissions"
"github.com/grafana/grafana/pkg/services/featuremgmt"
"github.com/grafana/grafana/pkg/services/org"
"github.com/grafana/grafana/pkg/setting"
"github.com/grafana/grafana/pkg/util/testutil"
)
func TestIntegration_OSSBasicRolePermissions_PersistAndRefreshOnRegisterFixedRoles(t *testing.T) {
testutil.SkipIntegrationTestInShortMode(t)
ctx := context.Background()
sql := db.InitTestDB(t)
store := database.ProvideService(sql)
svc := ProvideOSSService(
setting.NewCfg(),
store,
&resourcepermissions.FakeActionSetSvc{},
localcache.ProvideService(),
featuremgmt.WithFeatures(),
tracing.InitializeTracerForTest(),
sql,
permreg.ProvidePermissionRegistry(),
nil,
)
require.NoError(t, svc.DeclareFixedRoles(accesscontrol.RoleRegistration{
Role: accesscontrol.RoleDTO{
Name: "fixed:test:role",
Permissions: []accesscontrol.Permission{
{Action: "test:read", Scope: ""},
},
},
Grants: []string{string(org.RoleViewer)},
}))
require.NoError(t, svc.RegisterFixedRoles(ctx))
// verify permission is persisted to DB for basic:viewer
require.NoError(t, sql.WithDbSession(ctx, func(sess *db.Session) error {
var role accesscontrol.Role
ok, err := sess.Table("role").Where("uid = ?", accesscontrol.BasicRoleUIDPrefix+"viewer").Get(&role)
require.NoError(t, err)
require.True(t, ok)
var count int64
count, err = sess.Table("permission").Where("role_id = ? AND action = ? AND scope = ?", role.ID, "test:read", "").Count()
require.NoError(t, err)
require.Equal(t, int64(1), count)
return nil
}))
// ensure RegisterFixedRoles refreshes it back to defaults
require.NoError(t, sql.WithDbSession(ctx, func(sess *db.Session) error {
ts := time.Now()
var role accesscontrol.Role
ok, err := sess.Table("role").Where("uid = ?", accesscontrol.BasicRoleUIDPrefix+"viewer").Get(&role)
require.NoError(t, err)
require.True(t, ok)
_, err = sess.Exec("DELETE FROM permission WHERE role_id = ?", role.ID)
require.NoError(t, err)
p := accesscontrol.Permission{
RoleID: role.ID,
Action: "custom:keep",
Scope: "",
Created: ts,
Updated: ts,
}
p.Kind, p.Attribute, p.Identifier = accesscontrol.SplitScope(p.Scope)
_, err = sess.Table("permission").Insert(&p)
return err
}))
svc2 := ProvideOSSService(
setting.NewCfg(),
store,
&resourcepermissions.FakeActionSetSvc{},
localcache.ProvideService(),
featuremgmt.WithFeatures(),
tracing.InitializeTracerForTest(),
sql,
permreg.ProvidePermissionRegistry(),
nil,
)
require.NoError(t, svc2.DeclareFixedRoles(accesscontrol.RoleRegistration{
Role: accesscontrol.RoleDTO{
Name: "fixed:test:role",
Permissions: []accesscontrol.Permission{
{Action: "test:read", Scope: ""},
},
},
Grants: []string{string(org.RoleViewer)},
}))
require.NoError(t, svc2.RegisterFixedRoles(ctx))
require.NoError(t, sql.WithDbSession(ctx, func(sess *db.Session) error {
var role accesscontrol.Role
ok, err := sess.Table("role").Where("uid = ?", accesscontrol.BasicRoleUIDPrefix+"viewer").Get(&role)
require.NoError(t, err)
require.True(t, ok)
var count int64
count, err = sess.Table("permission").Where("role_id = ? AND action = ? AND scope = ?", role.ID, "test:read", "").Count()
require.NoError(t, err)
require.Equal(t, int64(1), count)
count, err = sess.Table("permission").Where("role_id = ? AND action = ?", role.ID, "custom:keep").Count()
require.NoError(t, err)
require.Equal(t, int64(0), count)
return nil
}))
}
@@ -30,7 +30,6 @@ import (
"github.com/grafana/grafana/pkg/services/accesscontrol/migrator"
"github.com/grafana/grafana/pkg/services/accesscontrol/permreg"
"github.com/grafana/grafana/pkg/services/accesscontrol/pluginutils"
"github.com/grafana/grafana/pkg/services/accesscontrol/seeding"
"github.com/grafana/grafana/pkg/services/dashboards"
"github.com/grafana/grafana/pkg/services/featuremgmt"
"github.com/grafana/grafana/pkg/services/folder"
@@ -97,12 +96,6 @@ func ProvideOSSService(
roles: accesscontrol.BuildBasicRoleDefinitions(),
store: store,
permRegistry: permRegistry,
sql: db,
serverLock: lock,
}
if backend, ok := store.(*database.AccessControlStore); ok {
s.seeder = seeding.New(log.New("accesscontrol.seeder"), backend, backend)
}
return s
@@ -119,11 +112,8 @@ type Service struct {
rolesMu sync.RWMutex
roles map[string]*accesscontrol.RoleDTO
store accesscontrol.Store
seeder *seeding.Seeder
permRegistry permreg.PermissionRegistry
isInitialized bool
sql db.DB
serverLock *serverlock.ServerLockService
}
func (s *Service) GetUsageStats(_ context.Context) map[string]any {
@@ -441,54 +431,17 @@ func (s *Service) RegisterFixedRoles(ctx context.Context) error {
defer span.End()
s.rolesMu.Lock()
registrations := s.registrations.Slice()
defer s.rolesMu.Unlock()
s.registrations.Range(func(registration accesscontrol.RoleRegistration) bool {
s.registerRolesLocked(registration)
return true
})
s.isInitialized = true
rolesSnapshot := s.getBasicRolePermissionsLocked()
s.rolesMu.Unlock()
if s.seeder != nil {
if err := s.seeder.SeedRoles(ctx, registrations); err != nil {
return err
}
if err := s.seeder.RemoveAbsentRoles(ctx); err != nil {
return err
}
}
if err := s.refreshBasicRolePermissionsInDB(ctx, rolesSnapshot); err != nil {
return err
}
return nil
}
// getBasicRolePermissionsLocked computes the desired basic role permissions from the
// current registration list, using the shared seeding registration logic.
//
// It must be called while holding the roles lock.
func (s *Service) getBasicRolePermissionsLocked() map[string][]accesscontrol.Permission {
desired := map[accesscontrol.SeedPermission]struct{}{}
s.registrations.Range(func(registration accesscontrol.RoleRegistration) bool {
seeding.AppendDesiredPermissions(desired, s.log, &registration.Role, registration.Grants, registration.Exclude, true)
return true
})
out := make(map[string][]accesscontrol.Permission)
for sp := range desired {
out[sp.BuiltInRole] = append(out[sp.BuiltInRole], accesscontrol.Permission{
Action: sp.Action,
Scope: sp.Scope,
})
}
return out
}
// registerRolesLocked processes a single role registration and adds permissions to basic roles.
// Must be called with s.rolesMu locked.
func (s *Service) registerRolesLocked(registration accesscontrol.RoleRegistration) {
@@ -521,7 +474,6 @@ func (s *Service) DeclarePluginRoles(ctx context.Context, ID, name string, regs
defer span.End()
acRegs := pluginutils.ToRegistrations(ID, name, regs)
updatedBasicRoles := false
for _, r := range acRegs {
if err := pluginutils.ValidatePluginRole(ID, r.Role); err != nil {
return err
@@ -548,23 +500,11 @@ func (s *Service) DeclarePluginRoles(ctx context.Context, ID, name string, regs
if initialized {
s.rolesMu.Lock()
s.registerRolesLocked(r)
updatedBasicRoles = true
s.rolesMu.Unlock()
s.cache.Flush()
}
}
if updatedBasicRoles {
s.rolesMu.RLock()
rolesSnapshot := s.getBasicRolePermissionsLocked()
s.rolesMu.RUnlock()
// Plugin roles can be declared after startup; keep the DB in sync.
if err := s.refreshBasicRolePermissionsInDB(ctx, rolesSnapshot); err != nil {
return err
}
}
return nil
}
@@ -1,623 +0,0 @@
package database
import (
"context"
"strings"
"time"
"github.com/grafana/grafana/pkg/infra/db"
"github.com/grafana/grafana/pkg/services/accesscontrol"
"github.com/grafana/grafana/pkg/services/accesscontrol/seeding"
"github.com/grafana/grafana/pkg/services/sqlstore/migrator"
"github.com/grafana/grafana/pkg/util/xorm/core"
)
const basicRolePermBatchSize = 500
// LoadRoles returns all fixed and plugin roles (global org) with permissions, indexed by role name.
func (s *AccessControlStore) LoadRoles(ctx context.Context) (map[string]*accesscontrol.RoleDTO, error) {
out := map[string]*accesscontrol.RoleDTO{}
err := s.sql.WithDbSession(ctx, func(sess *db.Session) error {
type roleRow struct {
ID int64 `xorm:"id"`
OrgID int64 `xorm:"org_id"`
Version int64 `xorm:"version"`
UID string `xorm:"uid"`
Name string `xorm:"name"`
DisplayName string `xorm:"display_name"`
Description string `xorm:"description"`
Group string `xorm:"group_name"`
Hidden bool `xorm:"hidden"`
Updated time.Time `xorm:"updated"`
Created time.Time `xorm:"created"`
}
roles := []roleRow{}
if err := sess.Table("role").
Where("org_id = ?", accesscontrol.GlobalOrgID).
Where("(name LIKE ? OR name LIKE ?)", accesscontrol.FixedRolePrefix+"%", accesscontrol.PluginRolePrefix+"%").
Find(&roles); err != nil {
return err
}
if len(roles) == 0 {
return nil
}
roleIDs := make([]any, 0, len(roles))
roleByID := make(map[int64]*accesscontrol.RoleDTO, len(roles))
for _, r := range roles {
dto := &accesscontrol.RoleDTO{
ID: r.ID,
OrgID: r.OrgID,
Version: r.Version,
UID: r.UID,
Name: r.Name,
DisplayName: r.DisplayName,
Description: r.Description,
Group: r.Group,
Hidden: r.Hidden,
Updated: r.Updated,
Created: r.Created,
}
out[dto.Name] = dto
roleByID[dto.ID] = dto
roleIDs = append(roleIDs, dto.ID)
}
type permRow struct {
RoleID int64 `xorm:"role_id"`
Action string `xorm:"action"`
Scope string `xorm:"scope"`
}
perms := []permRow{}
if err := sess.Table("permission").In("role_id", roleIDs...).Find(&perms); err != nil {
return err
}
for _, p := range perms {
dto := roleByID[p.RoleID]
if dto == nil {
continue
}
dto.Permissions = append(dto.Permissions, accesscontrol.Permission{
RoleID: p.RoleID,
Action: p.Action,
Scope: p.Scope,
})
}
return nil
})
return out, err
}
func (s *AccessControlStore) SetRole(ctx context.Context, existingRole *accesscontrol.RoleDTO, wantedRole accesscontrol.RoleDTO) error {
if existingRole == nil {
return nil
}
return s.sql.WithDbSession(ctx, func(sess *db.Session) error {
_, err := sess.Table("role").
Where("id = ? AND org_id = ?", existingRole.ID, accesscontrol.GlobalOrgID).
Update(map[string]any{
"display_name": wantedRole.DisplayName,
"description": wantedRole.Description,
"group_name": wantedRole.Group,
"hidden": wantedRole.Hidden,
"updated": time.Now(),
})
return err
})
}
func (s *AccessControlStore) SetPermissions(ctx context.Context, existingRole *accesscontrol.RoleDTO, wantedRole accesscontrol.RoleDTO) error {
if existingRole == nil {
return nil
}
type key struct{ Action, Scope string }
existing := map[key]struct{}{}
for _, p := range existingRole.Permissions {
existing[key{p.Action, p.Scope}] = struct{}{}
}
desired := map[key]struct{}{}
for _, p := range wantedRole.Permissions {
desired[key{p.Action, p.Scope}] = struct{}{}
}
toAdd := make([]accesscontrol.Permission, 0)
toRemove := make([]accesscontrol.SeedPermission, 0)
now := time.Now()
for k := range desired {
if _, ok := existing[k]; ok {
continue
}
perm := accesscontrol.Permission{
RoleID: existingRole.ID,
Action: k.Action,
Scope: k.Scope,
Created: now,
Updated: now,
}
perm.Kind, perm.Attribute, perm.Identifier = accesscontrol.SplitScope(perm.Scope)
toAdd = append(toAdd, perm)
}
for k := range existing {
if _, ok := desired[k]; ok {
continue
}
toRemove = append(toRemove, accesscontrol.SeedPermission{Action: k.Action, Scope: k.Scope})
}
if len(toAdd) == 0 && len(toRemove) == 0 {
return nil
}
return s.sql.WithTransactionalDbSession(ctx, func(sess *db.Session) error {
if len(toRemove) > 0 {
if err := DeleteRolePermissionTuples(sess, s.sql.GetDBType(), existingRole.ID, toRemove); err != nil {
return err
}
}
if len(toAdd) > 0 {
_, err := sess.InsertMulti(toAdd)
return err
}
return nil
})
}
func (s *AccessControlStore) CreateRole(ctx context.Context, role accesscontrol.RoleDTO) error {
now := time.Now()
uid := role.UID
if uid == "" && (strings.HasPrefix(role.Name, accesscontrol.FixedRolePrefix) || strings.HasPrefix(role.Name, accesscontrol.PluginRolePrefix)) {
uid = accesscontrol.PrefixedRoleUID(role.Name)
}
r := accesscontrol.Role{
OrgID: accesscontrol.GlobalOrgID,
Version: role.Version,
UID: uid,
Name: role.Name,
DisplayName: role.DisplayName,
Description: role.Description,
Group: role.Group,
Hidden: role.Hidden,
Created: now,
Updated: now,
}
if r.Version == 0 {
r.Version = 1
}
return s.sql.WithTransactionalDbSession(ctx, func(sess *db.Session) error {
if _, err := sess.Insert(&r); err != nil {
return err
}
if len(role.Permissions) == 0 {
return nil
}
// De-duplicate permissions on (action, scope) to avoid unique constraint violations.
// Some role definitions may accidentally include duplicates.
type permKey struct{ Action, Scope string }
seen := make(map[permKey]struct{}, len(role.Permissions))
perms := make([]accesscontrol.Permission, 0, len(role.Permissions))
for _, p := range role.Permissions {
k := permKey{Action: p.Action, Scope: p.Scope}
if _, ok := seen[k]; ok {
continue
}
seen[k] = struct{}{}
perm := accesscontrol.Permission{
RoleID: r.ID,
Action: p.Action,
Scope: p.Scope,
Created: now,
Updated: now,
}
perm.Kind, perm.Attribute, perm.Identifier = accesscontrol.SplitScope(perm.Scope)
perms = append(perms, perm)
}
_, err := sess.InsertMulti(perms)
return err
})
}
func (s *AccessControlStore) DeleteRoles(ctx context.Context, roleUIDs []string) error {
if len(roleUIDs) == 0 {
return nil
}
uids := make([]any, 0, len(roleUIDs))
for _, uid := range roleUIDs {
uids = append(uids, uid)
}
return s.sql.WithTransactionalDbSession(ctx, func(sess *db.Session) error {
type row struct {
ID int64 `xorm:"id"`
UID string `xorm:"uid"`
}
rows := []row{}
if err := sess.Table("role").
Where("org_id = ?", accesscontrol.GlobalOrgID).
In("uid", uids...).
Find(&rows); err != nil {
return err
}
if len(rows) == 0 {
return nil
}
roleIDs := make([]any, 0, len(rows))
for _, r := range rows {
roleIDs = append(roleIDs, r.ID)
}
// Remove permissions and assignments first to avoid FK issues (if enabled).
{
args := append([]any{"DELETE FROM permission WHERE role_id IN (?" + strings.Repeat(",?", len(roleIDs)-1) + ")"}, roleIDs...)
if _, err := sess.Exec(args...); err != nil {
return err
}
}
{
args := append([]any{"DELETE FROM user_role WHERE role_id IN (?" + strings.Repeat(",?", len(roleIDs)-1) + ")"}, roleIDs...)
if _, err := sess.Exec(args...); err != nil {
return err
}
}
{
args := append([]any{"DELETE FROM team_role WHERE role_id IN (?" + strings.Repeat(",?", len(roleIDs)-1) + ")"}, roleIDs...)
if _, err := sess.Exec(args...); err != nil {
return err
}
}
{
args := append([]any{"DELETE FROM builtin_role WHERE role_id IN (?" + strings.Repeat(",?", len(roleIDs)-1) + ")"}, roleIDs...)
if _, err := sess.Exec(args...); err != nil {
return err
}
}
args := append([]any{"DELETE FROM role WHERE org_id = ? AND uid IN (?" + strings.Repeat(",?", len(uids)-1) + ")", accesscontrol.GlobalOrgID}, uids...)
_, err := sess.Exec(args...)
return err
})
}
// OSS basic-role permission refresh uses seeding.Seeder.Seed() with a desired set computed in memory.
// These methods implement the permission seeding part of seeding.SeedingBackend against the current permission table.
func (s *AccessControlStore) LoadPrevious(ctx context.Context) (map[accesscontrol.SeedPermission]struct{}, error) {
var out map[accesscontrol.SeedPermission]struct{}
err := s.sql.WithDbSession(ctx, func(sess *db.Session) error {
rows, err := LoadBasicRoleSeedPermissions(sess)
if err != nil {
return err
}
out = make(map[accesscontrol.SeedPermission]struct{}, len(rows))
for _, r := range rows {
r.Origin = ""
out[r] = struct{}{}
}
return nil
})
return out, err
}
func (s *AccessControlStore) Apply(ctx context.Context, added, removed []accesscontrol.SeedPermission, updated map[accesscontrol.SeedPermission]accesscontrol.SeedPermission) error {
rolesToUpgrade := seeding.RolesToUpgrade(added, removed)
// Run the same OSS apply logic as ossBasicRoleSeedBackend.Apply inside a single transaction.
return s.sql.WithTransactionalDbSession(ctx, func(sess *db.Session) error {
defs := accesscontrol.BuildBasicRoleDefinitions()
builtinToRoleID, err := EnsureBasicRolesExist(sess, defs)
if err != nil {
return err
}
backend := &ossBasicRoleSeedBackend{
sess: sess,
now: time.Now(),
builtinToRoleID: builtinToRoleID,
desired: nil,
dbType: s.sql.GetDBType(),
}
if err := backend.Apply(ctx, added, removed, updated); err != nil {
return err
}
return BumpBasicRoleVersions(sess, rolesToUpgrade)
})
}
// EnsureBasicRolesExist ensures the built-in basic roles exist in the role table and are bound in builtin_role.
// It returns a mapping from builtin role name (for example "Admin") to role ID.
func EnsureBasicRolesExist(sess *db.Session, defs map[string]*accesscontrol.RoleDTO) (map[string]int64, error) {
uidToBuiltin := make(map[string]string, len(defs))
uids := make([]any, 0, len(defs))
for builtin, def := range defs {
uidToBuiltin[def.UID] = builtin
uids = append(uids, def.UID)
}
type roleRow struct {
ID int64 `xorm:"id"`
UID string `xorm:"uid"`
}
rows := []roleRow{}
if err := sess.Table("role").
Where("org_id = ?", accesscontrol.GlobalOrgID).
In("uid", uids...).
Find(&rows); err != nil {
return nil, err
}
ts := time.Now()
builtinToRoleID := make(map[string]int64, len(defs))
for _, r := range rows {
br, ok := uidToBuiltin[r.UID]
if !ok {
continue
}
builtinToRoleID[br] = r.ID
}
for builtin, def := range defs {
roleID, ok := builtinToRoleID[builtin]
if !ok {
role := accesscontrol.Role{
OrgID: def.OrgID,
Version: def.Version,
UID: def.UID,
Name: def.Name,
DisplayName: def.DisplayName,
Description: def.Description,
Group: def.Group,
Hidden: def.Hidden,
Created: ts,
Updated: ts,
}
if _, err := sess.Insert(&role); err != nil {
return nil, err
}
roleID = role.ID
builtinToRoleID[builtin] = roleID
}
has, err := sess.Table("builtin_role").
Where("role_id = ? AND role = ? AND org_id = ?", roleID, builtin, accesscontrol.GlobalOrgID).
Exist()
if err != nil {
return nil, err
}
if !has {
br := accesscontrol.BuiltinRole{
RoleID: roleID,
OrgID: accesscontrol.GlobalOrgID,
Role: builtin,
Created: ts,
Updated: ts,
}
if _, err := sess.Table("builtin_role").Insert(&br); err != nil {
return nil, err
}
}
}
return builtinToRoleID, nil
}
// DeleteRolePermissionTuples deletes permissions for a single role by (action, scope) pairs.
//
// It uses a row-constructor IN clause where supported (MySQL, Postgres, SQLite) and falls back
// to a WHERE ... OR ... form for MSSQL.
func DeleteRolePermissionTuples(sess *db.Session, dbType core.DbType, roleID int64, perms []accesscontrol.SeedPermission) error {
if len(perms) == 0 {
return nil
}
if dbType == migrator.MSSQL {
// MSSQL doesn't support (action, scope) IN ((?,?),(?,?)) row constructors.
where := make([]string, 0, len(perms))
args := make([]any, 0, 1+len(perms)*2)
args = append(args, roleID)
for _, p := range perms {
where = append(where, "(action = ? AND scope = ?)")
args = append(args, p.Action, p.Scope)
}
_, err := sess.Exec(
append([]any{
"DELETE FROM permission WHERE role_id = ? AND (" + strings.Join(where, " OR ") + ")",
}, args...)...,
)
return err
}
args := make([]any, 0, 1+len(perms)*2)
args = append(args, roleID)
for _, p := range perms {
args = append(args, p.Action, p.Scope)
}
sql := "DELETE FROM permission WHERE role_id = ? AND (action, scope) IN (" +
strings.Repeat("(?, ?),", len(perms)-1) + "(?, ?))"
_, err := sess.Exec(append([]any{sql}, args...)...)
return err
}
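The two deletion forms built by DeleteRolePermissionTuples differ only in how the WHERE clause is assembled: a row-constructor IN list for MySQL/Postgres/SQLite, and an OR chain for MSSQL. A self-contained sketch of just the placeholder construction (SQL text only, no database):

```go
package main

import (
	"fmt"
	"strings"
)

// rowConstructorSQL builds the (action, scope) IN ((?, ?),...) form used for
// MySQL, Postgres, and SQLite in DeleteRolePermissionTuples, for n tuples.
func rowConstructorSQL(n int) string {
	return "DELETE FROM permission WHERE role_id = ? AND (action, scope) IN (" +
		strings.Repeat("(?, ?),", n-1) + "(?, ?))"
}

// orFallbackSQL builds the OR-chain form used for MSSQL, which lacks
// row-constructor IN lists.
func orFallbackSQL(n int) string {
	where := make([]string, 0, n)
	for i := 0; i < n; i++ {
		where = append(where, "(action = ? AND scope = ?)")
	}
	return "DELETE FROM permission WHERE role_id = ? AND (" + strings.Join(where, " OR ") + ")"
}

func main() {
	fmt.Println(rowConstructorSQL(2))
	fmt.Println(orFallbackSQL(2))
}
```

In both forms the argument slice is the same: the role ID first, then each (action, scope) pair in order.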
type ossBasicRoleSeedBackend struct {
sess *db.Session
now time.Time
builtinToRoleID map[string]int64
desired map[accesscontrol.SeedPermission]struct{}
dbType core.DbType
}
func (b *ossBasicRoleSeedBackend) LoadPrevious(_ context.Context) (map[accesscontrol.SeedPermission]struct{}, error) {
rows, err := LoadBasicRoleSeedPermissions(b.sess)
if err != nil {
return nil, err
}
out := make(map[accesscontrol.SeedPermission]struct{}, len(rows))
for _, r := range rows {
// Ensure the key matches what OSS seeding uses (Origin is always empty for basic role refresh).
r.Origin = ""
out[r] = struct{}{}
}
return out, nil
}
func (b *ossBasicRoleSeedBackend) LoadDesired(_ context.Context) (map[accesscontrol.SeedPermission]struct{}, error) {
return b.desired, nil
}
func (b *ossBasicRoleSeedBackend) Apply(_ context.Context, added, removed []accesscontrol.SeedPermission, updated map[accesscontrol.SeedPermission]accesscontrol.SeedPermission) error {
// Delete removed permissions (this includes user-defined permissions that aren't in desired).
if len(removed) > 0 {
permsByRoleID := map[int64][]accesscontrol.SeedPermission{}
for _, p := range removed {
roleID, ok := b.builtinToRoleID[p.BuiltInRole]
if !ok {
continue
}
permsByRoleID[roleID] = append(permsByRoleID[roleID], p)
}
for roleID, perms := range permsByRoleID {
// Chunk to keep statement sizes and parameter counts bounded.
if err := batch(len(perms), basicRolePermBatchSize, func(start, end int) error {
return DeleteRolePermissionTuples(b.sess, b.dbType, roleID, perms[start:end])
}); err != nil {
return err
}
}
}
// Insert added permissions and updated-target permissions.
toInsertSeed := make([]accesscontrol.SeedPermission, 0, len(added)+len(updated))
toInsertSeed = append(toInsertSeed, added...)
for _, v := range updated {
toInsertSeed = append(toInsertSeed, v)
}
if len(toInsertSeed) == 0 {
return nil
}
// De-duplicate on (role_id, action, scope). This avoids unique constraint violations when:
// - the same permission appears in both added and updated
// - multiple plugin origins grant the same permission (Origin is not persisted in permission table)
type permKey struct {
RoleID int64
Action string
Scope string
}
seen := make(map[permKey]struct{}, len(toInsertSeed))
toInsert := make([]accesscontrol.Permission, 0, len(toInsertSeed))
for _, p := range toInsertSeed {
roleID, ok := b.builtinToRoleID[p.BuiltInRole]
if !ok {
continue
}
k := permKey{RoleID: roleID, Action: p.Action, Scope: p.Scope}
if _, ok := seen[k]; ok {
continue
}
seen[k] = struct{}{}
perm := accesscontrol.Permission{
RoleID: roleID,
Action: p.Action,
Scope: p.Scope,
Created: b.now,
Updated: b.now,
}
perm.Kind, perm.Attribute, perm.Identifier = accesscontrol.SplitScope(perm.Scope)
toInsert = append(toInsert, perm)
}
return batch(len(toInsert), basicRolePermBatchSize, func(start, end int) error {
// MySQL: ignore conflicts to make seeding idempotent under retries/concurrency.
// Conflicts can happen if the same permission already exists (unique on role_id, action, scope).
if b.dbType == migrator.MySQL {
args := make([]any, 0, (end-start)*8)
for i := start; i < end; i++ {
p := toInsert[i]
args = append(args, p.RoleID, p.Action, p.Scope, p.Kind, p.Attribute, p.Identifier, p.Updated, p.Created)
}
sql := append([]any{`INSERT IGNORE INTO permission (role_id, action, scope, kind, attribute, identifier, updated, created) VALUES ` +
strings.Repeat("(?, ?, ?, ?, ?, ?, ?, ?),", end-start-1) + "(?, ?, ?, ?, ?, ?, ?, ?)"}, args...)
_, err := b.sess.Exec(sql...)
return err
}
_, err := b.sess.InsertMulti(toInsert[start:end])
return err
})
}
func batch(count, size int, eachFn func(start, end int) error) error {
for i := 0; i < count; {
end := i + size
if end > count {
end = count
}
if err := eachFn(i, end); err != nil {
return err
}
i = end
}
return nil
}
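The batch helper walks the index range [0, count) in fixed-size chunks, stopping at the first error, which keeps SQL statement sizes and parameter counts bounded. Its chunking behavior can be checked standalone (the function body below copies the helper's shape):

```go
package main

import "fmt"

// batch invokes eachFn over [0, count) in chunks of at most size elements,
// returning the first error encountered.
func batch(count, size int, eachFn func(start, end int) error) error {
	for i := 0; i < count; {
		end := i + size
		if end > count {
			end = count // final chunk may be shorter
		}
		if err := eachFn(i, end); err != nil {
			return err
		}
		i = end
	}
	return nil
}

func main() {
	// Collect the chunk boundaries for 10 items in batches of 4.
	var chunks [][2]int
	_ = batch(10, 4, func(start, end int) error {
		chunks = append(chunks, [2]int{start, end})
		return nil
	})
	fmt.Println(chunks) // [[0 4] [4 8] [8 10]]
}
```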
// BumpBasicRoleVersions increments the role version for the given builtin basic roles (Viewer/Editor/Admin/Grafana Admin).
// Unknown role names are ignored.
func BumpBasicRoleVersions(sess *db.Session, basicRoles []string) error {
if len(basicRoles) == 0 {
return nil
}
defs := accesscontrol.BuildBasicRoleDefinitions()
uids := make([]any, 0, len(basicRoles))
for _, br := range basicRoles {
def, ok := defs[br]
if !ok {
continue
}
uids = append(uids, def.UID)
}
if len(uids) == 0 {
return nil
}
sql := "UPDATE role SET version = version + 1 WHERE org_id = ? AND uid IN (?" + strings.Repeat(",?", len(uids)-1) + ")"
_, err := sess.Exec(append([]any{sql, accesscontrol.GlobalOrgID}, uids...)...)
return err
}
// LoadBasicRoleSeedPermissions returns the current (builtin_role, action, scope) permissions granted to basic roles.
// It sets Origin to empty.
func LoadBasicRoleSeedPermissions(sess *db.Session) ([]accesscontrol.SeedPermission, error) {
rows := []accesscontrol.SeedPermission{}
err := sess.SQL(
`SELECT role.display_name AS builtin_role, p.action, p.scope, '' AS origin
FROM role INNER JOIN permission AS p ON p.role_id = role.id
WHERE role.org_id = ? AND role.name LIKE 'basic:%'`,
accesscontrol.GlobalOrgID,
).Find(&rows)
return rows, err
}
@@ -15,7 +15,6 @@ import (
"github.com/grafana/grafana/pkg/infra/db"
"github.com/grafana/grafana/pkg/infra/log"
"github.com/grafana/grafana/pkg/infra/serverlock"
"github.com/grafana/grafana/pkg/services/accesscontrol"
"github.com/grafana/grafana/pkg/services/authz/zanzana"
"github.com/grafana/grafana/pkg/services/featuremgmt"
"github.com/grafana/grafana/pkg/services/folder"
@@ -131,9 +130,6 @@ func (r *ZanzanaReconciler) Run(ctx context.Context) error {
// Reconcile schedules a job that will run and reconcile resources between
// legacy access control and zanzana.
func (r *ZanzanaReconciler) Reconcile(ctx context.Context) error {
// Ensure we don't reconcile an empty/partial RBAC state before OSS has seeded basic role permissions.
// This matters most during startup where fixed-role loading + basic-role permission refresh runs as another background service.
r.waitForBasicRolesSeeded(ctx)
r.reconcile(ctx)
// FIXME:
@@ -149,57 +145,6 @@ func (r *ZanzanaReconciler) Reconcile(ctx context.Context) error {
}
}
func (r *ZanzanaReconciler) hasBasicRolePermissions(ctx context.Context) bool {
var count int64
// Basic role permissions are stored on "basic:%" roles in the global org (0).
// In a fresh DB, this will be empty until fixed roles are registered and the basic role permission refresh runs.
type row struct {
Count int64 `xorm:"count"`
}
_ = r.store.WithDbSession(ctx, func(sess *db.Session) error {
var rr row
_, err := sess.SQL(
`SELECT COUNT(*) AS count
FROM role INNER JOIN permission AS p ON p.role_id = role.id
WHERE role.org_id = ? AND role.name LIKE ?`,
accesscontrol.GlobalOrgID,
accesscontrol.BasicRolePrefix+"%",
).Get(&rr)
if err != nil {
return err
}
count = rr.Count
return nil
})
return count > 0
}
func (r *ZanzanaReconciler) waitForBasicRolesSeeded(ctx context.Context) {
// Best-effort: don't block forever. If we can't observe basic roles, proceed anyway.
const (
maxWait = 15 * time.Second
interval = 1 * time.Second
)
deadline := time.NewTimer(maxWait)
defer deadline.Stop()
ticker := time.NewTicker(interval)
defer ticker.Stop()
for {
if r.hasBasicRolePermissions(ctx) {
return
}
select {
case <-ctx.Done():
return
case <-deadline.C:
return
case <-ticker.C:
}
}
}
func (r *ZanzanaReconciler) reconcile(ctx context.Context) {
run := func(ctx context.Context, namespace string) (ok bool) {
now := time.Now()
@@ -1,67 +0,0 @@
package dualwrite
import (
"context"
"testing"
"time"
"github.com/stretchr/testify/require"
"github.com/grafana/grafana/pkg/infra/db"
"github.com/grafana/grafana/pkg/services/accesscontrol"
)
func TestZanzanaReconciler_hasBasicRolePermissions(t *testing.T) {
env := setupTestEnv(t)
r := &ZanzanaReconciler{
store: env.db,
}
ctx := context.Background()
require.False(t, r.hasBasicRolePermissions(ctx))
err := env.db.WithDbSession(ctx, func(sess *db.Session) error {
now := time.Now()
_, err := sess.Exec(
`INSERT INTO role (org_id, uid, name, display_name, group_name, description, hidden, version, created, updated)
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?)`,
accesscontrol.GlobalOrgID,
"basic_viewer_uid_test",
accesscontrol.BasicRolePrefix+"viewer",
"Viewer",
"Basic",
"Viewer role",
false,
1,
now,
now,
)
if err != nil {
return err
}
var roleID int64
if _, err := sess.SQL(`SELECT id FROM role WHERE org_id = ? AND uid = ?`, accesscontrol.GlobalOrgID, "basic_viewer_uid_test").Get(&roleID); err != nil {
return err
}
_, err = sess.Exec(
`INSERT INTO permission (role_id, action, scope, kind, attribute, identifier, created, updated)
VALUES (?, ?, ?, ?, ?, ?, ?, ?)`,
roleID,
"dashboards:read",
"dashboards:*",
"",
"",
"",
now,
now,
)
return err
})
require.NoError(t, err)
require.True(t, r.hasBasicRolePermissions(ctx))
}
@@ -1,7 +1,6 @@
package accesscontrol
import (
"context"
"encoding/json"
"errors"
"fmt"
@@ -595,18 +594,3 @@ type QueryWithOrg struct {
OrgId *int64 `json:"orgId"`
Global bool `json:"global"`
}
type SeedPermission struct {
BuiltInRole string `xorm:"builtin_role"`
Action string `xorm:"action"`
Scope string `xorm:"scope"`
Origin string `xorm:"origin"`
}
type RoleStore interface {
LoadRoles(ctx context.Context) (map[string]*RoleDTO, error)
SetRole(ctx context.Context, existingRole *RoleDTO, wantedRole RoleDTO) error
SetPermissions(ctx context.Context, existingRole *RoleDTO, wantedRole RoleDTO) error
CreateRole(ctx context.Context, role RoleDTO) error
DeleteRoles(ctx context.Context, roleUIDs []string) error
}
@@ -1,451 +0,0 @@
package seeding
import (
"context"
"fmt"
"regexp"
"slices"
"strings"
"github.com/grafana/grafana/pkg/infra/log"
"github.com/grafana/grafana/pkg/services/accesscontrol"
"github.com/grafana/grafana/pkg/services/accesscontrol/pluginutils"
"github.com/grafana/grafana/pkg/services/pluginsintegration/pluginaccesscontrol"
)
type Seeder struct {
log log.Logger
roleStore accesscontrol.RoleStore
backend SeedingBackend
builtinsPermissions map[accesscontrol.SeedPermission]struct{}
seededFixedRoles map[string]bool
seededPluginRoles map[string]bool
seededPlugins map[string]bool
hasSeededAlready bool
}
// SeedingBackend provides the seed-set specific operations needed to seed.
type SeedingBackend interface {
// LoadPrevious returns the currently stored permissions for previously seeded roles.
LoadPrevious(ctx context.Context) (map[accesscontrol.SeedPermission]struct{}, error)
// Apply updates the database to match the desired permissions.
Apply(ctx context.Context,
added, removed []accesscontrol.SeedPermission,
updated map[accesscontrol.SeedPermission]accesscontrol.SeedPermission,
) error
}
func New(log log.Logger, roleStore accesscontrol.RoleStore, backend SeedingBackend) *Seeder {
return &Seeder{
log: log,
roleStore: roleStore,
backend: backend,
builtinsPermissions: map[accesscontrol.SeedPermission]struct{}{},
seededFixedRoles: map[string]bool{},
seededPluginRoles: map[string]bool{},
seededPlugins: map[string]bool{},
hasSeededAlready: false,
}
}
// SetDesiredPermissions replaces the in-memory desired permission set used by Seed().
func (s *Seeder) SetDesiredPermissions(desired map[accesscontrol.SeedPermission]struct{}) {
if desired == nil {
s.builtinsPermissions = map[accesscontrol.SeedPermission]struct{}{}
return
}
s.builtinsPermissions = desired
}
// Seed loads current and desired permissions, diffs them (including scope updates), applies changes, and bumps versions.
func (s *Seeder) Seed(ctx context.Context) error {
previous, err := s.backend.LoadPrevious(ctx)
if err != nil {
return err
}
// Filter the previously seeded set before diffing:
// - Do not remove plugin permissions when the plugin didn't register this run (Origin set but not in seededPlugins).
// - Preserve legacy plugin app access permissions in the persisted seed set (these are granted by default).
if len(previous) > 0 {
filtered := make(map[accesscontrol.SeedPermission]struct{}, len(previous))
for p := range previous {
if p.Action == pluginaccesscontrol.ActionAppAccess {
continue
}
if p.Origin != "" && !s.seededPlugins[p.Origin] {
continue
}
filtered[p] = struct{}{}
}
previous = filtered
}
added, removed, updated := s.permissionDiff(previous, s.builtinsPermissions)
if err := s.backend.Apply(ctx, added, removed, updated); err != nil {
return err
}
return nil
}
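The diff step inside Seed reduces to a set comparison between the previous and desired permission sets. A minimal sketch of the added/removed computation (ignoring the scope-update case that the real permissionDiff also handles; sorting is only for deterministic output):

```go
package main

import (
	"fmt"
	"sort"
)

// Local stand-in for accesscontrol.SeedPermission.
type seedPermission struct{ BuiltInRole, Action, Scope string }

// diff returns keys present only in desired (added) and only in previous (removed).
func diff(previous, desired map[seedPermission]struct{}) (added, removed []seedPermission) {
	for p := range desired {
		if _, ok := previous[p]; !ok {
			added = append(added, p)
		}
	}
	for p := range previous {
		if _, ok := desired[p]; !ok {
			removed = append(removed, p)
		}
	}
	sort.Slice(added, func(i, j int) bool { return added[i].Action < added[j].Action })
	sort.Slice(removed, func(i, j int) bool { return removed[i].Action < removed[j].Action })
	return added, removed
}

func main() {
	prev := map[seedPermission]struct{}{
		{"Viewer", "old:read", ""}:  {},
		{"Viewer", "keep:read", ""}: {},
	}
	want := map[seedPermission]struct{}{
		{"Viewer", "keep:read", ""}: {},
		{"Viewer", "new:read", ""}:  {},
	}
	added, removed := diff(prev, want)
	fmt.Println(added[0].Action, removed[0].Action) // new:read old:read
}
```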
// SeedRoles populates the database with the roles and their assignments.
// It will create roles that do not exist and update roles that have changed.
// Do not use for provisioning: validation is not enforced.
func (s *Seeder) SeedRoles(ctx context.Context, registrationList []accesscontrol.RoleRegistration) error {
roleMap, err := s.roleStore.LoadRoles(ctx)
if err != nil {
return err
}
missingRoles := make([]accesscontrol.RoleRegistration, 0, len(registrationList))
// Diff existing roles with the ones we want to seed.
// If a role is missing, we add it to the missingRoles list
for _, registration := range registrationList {
registration := registration
role, ok := roleMap[registration.Role.Name]
switch {
case registration.Role.IsFixed():
s.seededFixedRoles[registration.Role.Name] = true
case registration.Role.IsPlugin():
s.seededPluginRoles[registration.Role.Name] = true
// To be resilient to failed plugin loadings, we remember the plugins that have registered;
// later we'll ignore permissions and roles of other plugins
s.seededPlugins[pluginutils.PluginIDFromName(registration.Role.Name)] = true
}
s.rememberPermissionAssignments(&registration.Role, registration.Grants, registration.Exclude)
if !ok {
missingRoles = append(missingRoles, registration)
continue
}
if needsRoleUpdate(role, registration.Role) {
if err := s.roleStore.SetRole(ctx, role, registration.Role); err != nil {
return err
}
}
if needsPermissionsUpdate(role, registration.Role) {
if err := s.roleStore.SetPermissions(ctx, role, registration.Role); err != nil {
return err
}
}
}
for _, registration := range missingRoles {
if err := s.roleStore.CreateRole(ctx, registration.Role); err != nil {
return err
}
}
return nil
}
func needsPermissionsUpdate(existingRole *accesscontrol.RoleDTO, wantedRole accesscontrol.RoleDTO) bool {
if existingRole == nil {
return true
}
if len(existingRole.Permissions) != len(wantedRole.Permissions) {
return true
}
for _, p := range wantedRole.Permissions {
found := false
for _, ep := range existingRole.Permissions {
if ep.Action == p.Action && ep.Scope == p.Scope {
found = true
break
}
}
if !found {
return true
}
}
return false
}
func needsRoleUpdate(existingRole *accesscontrol.RoleDTO, wantedRole accesscontrol.RoleDTO) bool {
if existingRole == nil {
return true
}
if existingRole.Name != wantedRole.Name {
return false
}
if existingRole.DisplayName != wantedRole.DisplayName {
return true
}
if existingRole.Description != wantedRole.Description {
return true
}
if existingRole.Group != wantedRole.Group {
return true
}
if existingRole.Hidden != wantedRole.Hidden {
return true
}
return false
}
// Deprecated: SeedRole is deprecated and should not be used.
// SeedRoles only performs boot-up seeding and should not be used for runtime seeding.
func (s *Seeder) SeedRole(ctx context.Context, role accesscontrol.RoleDTO, builtInRoles []string) error {
addedPermissions := make(map[string]struct{}, len(role.Permissions))
permissions := make([]accesscontrol.Permission, 0, len(role.Permissions))
for _, p := range role.Permissions {
key := fmt.Sprintf("%s:%s", p.Action, p.Scope)
if _, ok := addedPermissions[key]; !ok {
addedPermissions[key] = struct{}{}
permissions = append(permissions, accesscontrol.Permission{Action: p.Action, Scope: p.Scope})
}
}
wantedRole := accesscontrol.RoleDTO{
OrgID: accesscontrol.GlobalOrgID,
Version: role.Version,
UID: role.UID,
Name: role.Name,
DisplayName: role.DisplayName,
Description: role.Description,
Group: role.Group,
Permissions: permissions,
Hidden: role.Hidden,
}
roleMap, err := s.roleStore.LoadRoles(ctx)
if err != nil {
return err
}
existingRole := roleMap[wantedRole.Name]
if existingRole == nil {
if err := s.roleStore.CreateRole(ctx, wantedRole); err != nil {
return err
}
} else {
if needsRoleUpdate(existingRole, wantedRole) {
if err := s.roleStore.SetRole(ctx, existingRole, wantedRole); err != nil {
return err
}
}
if needsPermissionsUpdate(existingRole, wantedRole) {
if err := s.roleStore.SetPermissions(ctx, existingRole, wantedRole); err != nil {
return err
}
}
}
// Remember seeded roles
if wantedRole.IsFixed() {
s.seededFixedRoles[wantedRole.Name] = true
}
isPluginRole := wantedRole.IsPlugin()
if isPluginRole {
s.seededPluginRoles[wantedRole.Name] = true
// To be resilient to failed plugin loadings, we remember the plugins that have registered;
// later we'll ignore permissions and roles of other plugins
s.seededPlugins[pluginutils.PluginIDFromName(role.Name)] = true
}
s.rememberPermissionAssignments(&wantedRole, builtInRoles, []string{})
return nil
}
func (s *Seeder) rememberPermissionAssignments(role *accesscontrol.RoleDTO, builtInRoles []string, excludedRoles []string) {
AppendDesiredPermissions(s.builtinsPermissions, s.log, role, builtInRoles, excludedRoles, true)
}
// AppendDesiredPermissions accumulates permissions from a role registration onto basic roles (Viewer/Editor/Admin/Grafana Admin).
// - It expands parents via accesscontrol.BuiltInRolesWithParents.
// - It can optionally ignore plugin app access permissions (which are granted by default).
func AppendDesiredPermissions(
out map[accesscontrol.SeedPermission]struct{},
logger log.Logger,
role *accesscontrol.RoleDTO,
builtInRoles []string,
excludedRoles []string,
ignorePluginAppAccess bool,
) {
if out == nil || role == nil {
return
}
for builtInRole := range accesscontrol.BuiltInRolesWithParents(builtInRoles) {
// Skip excluded grants
if slices.Contains(excludedRoles, builtInRole) {
continue
}
for _, perm := range role.Permissions {
if ignorePluginAppAccess && perm.Action == pluginaccesscontrol.ActionAppAccess {
logger.Debug("Role is attempting to grant access permission, but this permission is already granted by default and will be ignored",
"role", role.Name, "permission", perm.Action, "scope", perm.Scope)
continue
}
sp := accesscontrol.SeedPermission{
BuiltInRole: builtInRole,
Action: perm.Action,
Scope: perm.Scope,
}
if role.IsPlugin() {
sp.Origin = pluginutils.PluginIDFromName(role.Name)
}
out[sp] = struct{}{}
}
}
}
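For intuition, here is a minimal, self-contained sketch of the parent expansion that `accesscontrol.BuiltInRolesWithParents` is assumed to perform. The inheritance chain (Editor inherits Viewer, Admin inherits Editor) and the helper name are assumptions of this sketch, not taken from this file:

```go
package main

import "fmt"

// withParents is a hypothetical stand-in for accesscontrol.BuiltInRolesWithParents:
// a grant made to a basic role is assumed to also apply to every role that inherits it.
func withParents(roles []string) map[string]struct{} {
	// Assumed inheritance chain: Editor inherits Viewer, Admin inherits Editor.
	parent := map[string]string{"Viewer": "Editor", "Editor": "Admin"}
	out := map[string]struct{}{}
	for _, r := range roles {
		for r != "" {
			out[r] = struct{}{}
			r = parent[r]
		}
	}
	return out
}

func main() {
	// A permission granted to Viewer expands to Viewer, Editor and Admin.
	for role := range withParents([]string{"Viewer"}) {
		fmt.Println(role)
	}
}
```

Under these assumptions, a single Viewer grant produces three seeded permissions, one per basic role in the chain.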
// permissionDiff returns:
// - added: present in desired permissions, not in previous permissions
// - removed: present in previous permissions, not in desired permissions
// - updated: same role + action, but scope changed
func (s *Seeder) permissionDiff(previous, desired map[accesscontrol.SeedPermission]struct{}) (added, removed []accesscontrol.SeedPermission, updated map[accesscontrol.SeedPermission]accesscontrol.SeedPermission) {
addedSet := make(map[accesscontrol.SeedPermission]struct{}, 0)
for n := range desired {
if _, already := previous[n]; !already {
addedSet[n] = struct{}{}
} else {
delete(previous, n)
}
}
// Check if any of the new permissions is actually an old permission with an updated scope
updated = make(map[accesscontrol.SeedPermission]accesscontrol.SeedPermission, 0)
for n := range addedSet {
for p := range previous {
if n.BuiltInRole == p.BuiltInRole && n.Action == p.Action {
updated[p] = n
delete(addedSet, n)
}
}
}
for p := range addedSet {
added = append(added, p)
}
for p := range previous {
if p.Action == pluginaccesscontrol.ActionAppAccess &&
p.Scope != pluginaccesscontrol.ScopeProvider.GetResourceAllScope() {
// Preserve backward compatibility with plugins seeded before the grant-ignore rule was added
s.log.Info("This permission already existed, so it will not be removed",
"role", p.BuiltInRole, "permission", p.Action, "scope", p.Scope)
continue
}
removed = append(removed, p)
}
return added, removed, updated
}
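To make the diff semantics concrete, the following standalone sketch mirrors the three passes above on a stand-in struct (`seedPerm` is hypothetical; the real `accesscontrol.SeedPermission` also carries an Origin field). As in the loops above, a scope-updated permission's old entry stays in `previous` and therefore also surfaces in `removed`; the backend's Apply is assumed to reconcile that.

```go
package main

import "fmt"

// seedPerm is a hypothetical stand-in for accesscontrol.SeedPermission.
type seedPerm struct {
	Role, Action, Scope string
}

// diff mirrors Seeder.permissionDiff: a membership diff followed by a pass that
// reclassifies same-role, same-action pairs with a changed scope as updates.
func diff(previous, desired map[seedPerm]struct{}) (added, removed []seedPerm, updated map[seedPerm]seedPerm) {
	addedSet := map[seedPerm]struct{}{}
	for n := range desired {
		if _, ok := previous[n]; ok {
			delete(previous, n) // unchanged on both sides, drop it
		} else {
			addedSet[n] = struct{}{}
		}
	}
	updated = map[seedPerm]seedPerm{}
	for n := range addedSet {
		for p := range previous {
			if n.Role == p.Role && n.Action == p.Action {
				updated[p] = n // same role and action, scope changed
				delete(addedSet, n)
			}
		}
	}
	for p := range addedSet {
		added = append(added, p)
	}
	for p := range previous {
		removed = append(removed, p)
	}
	return added, removed, updated
}

func main() {
	prev := map[seedPerm]struct{}{
		{"Viewer", "dashboards:read", "dashboards:uid:old"}: {},
		{"Editor", "folders:write", "folders:*"}:            {},
	}
	want := map[seedPerm]struct{}{
		{"Viewer", "dashboards:read", "dashboards:uid:new"}: {},
		{"Editor", "folders:write", "folders:*"}:            {},
	}
	added, removed, updated := diff(prev, want)
	fmt.Println(len(added), len(removed), len(updated)) // 0 1 1
}
```

The unchanged Editor permission is dropped in the first pass; the Viewer scope change is reported as one update (and one removal of the old entry), not as an add.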
func (s *Seeder) ClearBasicRolesPluginPermissions(ID string) {
removable := []accesscontrol.SeedPermission{}
for key := range s.builtinsPermissions {
if matchPermissionByPluginID(key, ID) {
removable = append(removable, key)
}
}
for _, perm := range removable {
delete(s.builtinsPermissions, perm)
}
}
func matchPermissionByPluginID(perm accesscontrol.SeedPermission, pluginID string) bool {
if perm.Origin != pluginID {
return false
}
actionTemplate := regexp.MustCompile(fmt.Sprintf("%s[.:]", pluginID))
scopeTemplate := fmt.Sprintf(":%s", pluginID)
return actionTemplate.MatchString(perm.Action) || strings.HasSuffix(perm.Scope, scopeTemplate)
}
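The string rule above can be exercised on its own: an action matches when it contains `<id>.` or `<id>:`, a scope when it ends with `:<id>`. The plugin ID and permission strings below are invented for illustration:

```go
package main

import (
	"fmt"
	"regexp"
	"strings"
)

// matches mirrors the string checks in matchPermissionByPluginID
// (without the Origin guard that precedes them).
func matches(action, scope, pluginID string) bool {
	actionTemplate := regexp.MustCompile(fmt.Sprintf("%s[.:]", pluginID))
	return actionTemplate.MatchString(action) || strings.HasSuffix(scope, ":"+pluginID)
}

func main() {
	// Invented example permissions for a hypothetical plugin ID.
	fmt.Println(matches("grafana-oncall-app.alerts:read", "", "grafana-oncall-app"))               // true: action carries the plugin prefix
	fmt.Println(matches("dashboards:read", "plugins:id:grafana-oncall-app", "grafana-oncall-app")) // true: scope ends with the plugin ID
	fmt.Println(matches("dashboards:read", "dashboards:*", "grafana-oncall-app"))                  // false: unrelated permission
}
```

Note the regex is unanchored, so the plugin ID may appear anywhere in the action string as long as it is followed by `.` or `:`.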
// RolesToUpgrade returns the unique basic roles that should have their version incremented.
func RolesToUpgrade(added, removed []accesscontrol.SeedPermission) []string {
set := map[string]struct{}{}
for _, p := range added {
set[p.BuiltInRole] = struct{}{}
}
for _, p := range removed {
set[p.BuiltInRole] = struct{}{}
}
out := make([]string, 0, len(set))
for r := range set {
out = append(out, r)
}
return out
}
func (s *Seeder) ClearPluginRoles(ID string) {
expectedPrefix := fmt.Sprintf("%s%s:", accesscontrol.PluginRolePrefix, ID)
for roleName := range s.seededPluginRoles {
if strings.HasPrefix(roleName, expectedPrefix) {
delete(s.seededPluginRoles, roleName)
}
}
}
func (s *Seeder) MarkSeededAlready() {
s.hasSeededAlready = true
}
func (s *Seeder) HasSeededAlready() bool {
return s.hasSeededAlready
}
func (s *Seeder) RemoveAbsentRoles(ctx context.Context) error {
roleMap, errGet := s.roleStore.LoadRoles(ctx)
if errGet != nil {
s.log.Error("failed to get fixed roles from store", "err", errGet)
return errGet
}
toRemove := []string{}
for _, r := range roleMap {
if r == nil {
continue
}
if r.IsFixed() {
if !s.seededFixedRoles[r.Name] {
s.log.Info("role is not seeded anymore, mark it for deletion", "role", r.Name)
toRemove = append(toRemove, r.UID)
}
continue
}
if r.IsPlugin() {
if !s.seededPlugins[pluginutils.PluginIDFromName(r.Name)] {
// To be resilient to failed plugin loadings
// ignore stored roles related to plugins that have not registered this time
s.log.Debug("plugin role has not been registered on this run skipping its removal", "role", r.Name)
continue
}
if !s.seededPluginRoles[r.Name] {
s.log.Info("role is not seeded anymore, mark it for deletion", "role", r.Name)
toRemove = append(toRemove, r.UID)
}
}
}
if errDelete := s.roleStore.DeleteRoles(ctx, toRemove); errDelete != nil {
s.log.Error("failed to delete absent fixed and plugin roles", "err", errDelete)
return errDelete
}
return nil
}
@@ -294,6 +294,9 @@ type DashboardProvisioning struct {
ExternalID string `xorm:"external_id"`
CheckSum string
Updated int64
// Note: only used when writing metadata to unified storage resources; not saved in the legacy table.
AllowUIUpdates bool `xorm:"-"`
}
type DeleteDashboardCommand struct {
@@ -1942,6 +1942,7 @@ func (dr *DashboardServiceImpl) saveProvisionedDashboardThroughK8s(ctx context.C
// HOWEVER, maybe OK to leave this for now and "fix" it by using file provisioning for mode 4
m.Kind = utils.ManagerKindClassicFP // nolint:staticcheck
m.Identity = provisioning.Name
m.AllowsEdits = provisioning.AllowUIUpdates
s.Path = provisioning.ExternalID
s.Checksum = provisioning.CheckSum
s.TimestampMillis = time.Unix(provisioning.Updated, 0).UnixMilli()
@@ -2075,6 +2075,13 @@ var (
FrontendOnly: true,
Owner: grafanaDashboardsSquad,
},
{
Name: "smoothingTransformation",
Description: "Enables the ASAP smoothing transformation for time series data",
Stage: FeatureStageExperimental,
FrontendOnly: true,
Owner: grafanaDataProSquad,
},
}
)
@@ -281,3 +281,4 @@ rudderstackUpgrade,experimental,@grafana/grafana-frontend-platform,false,false,t
kubernetesAlertingHistorian,experimental,@grafana/alerting-squad,false,true,false
useMTPlugins,experimental,@grafana/plugins-platform-backend,false,false,true
multiPropsVariables,experimental,@grafana/dashboards-squad,false,false,true
smoothingTransformation,experimental,@grafana/datapro,false,false,true
@@ -3293,6 +3293,19 @@
"codeowner": "@grafana/dashboards-squad"
}
},
{
"metadata": {
"name": "smoothingTransformation",
"resourceVersion": "1767349656275",
"creationTimestamp": "2026-01-02T10:27:36Z"
},
"spec": {
"description": "Enables the ASAP smoothing transformation for time series data",
"stage": "experimental",
"codeowner": "@grafana/datapro",
"frontend": true
}
},
{
"metadata": {
"name": "sqlExpressions",
@@ -6,7 +6,6 @@ import (
"strconv"
"strings"
"github.com/grafana/grafana/pkg/middleware"
"github.com/grafana/grafana/pkg/plugins"
ac "github.com/grafana/grafana/pkg/services/accesscontrol"
contextmodel "github.com/grafana/grafana/pkg/services/contexthandler/model"
@@ -129,10 +128,6 @@ func (s *ServiceImpl) processAppPlugin(plugin pluginstore.Plugin, c *contextmode
}
if include.Type == "page" {
if !middleware.PageIsFeatureToggleEnabled(c.Req.Context(), include.Path) {
s.log.Debug("Skipping page", "plugin", plugin.ID, "path", include.Path)
continue
}
link := &navtree.NavLink{
Text: include.Name,
Icon: include.Icon,
@@ -358,6 +358,8 @@ func (fr *FileReader) saveDashboard(ctx context.Context, path string, folderID i
Name: fr.Cfg.Name,
Updated: resolvedFileInfo.ModTime().Unix(),
CheckSum: jsonFile.checkSum,
// Adds `grafana.app/managerAllowsEdits` to provisioned dashboards in unified storage; not used in legacy storage.
AllowUIUpdates: fr.Cfg.AllowUIUpdates,
}
_, err := fr.dashboardProvisioningService.SaveProvisionedDashboard(ctx, dash, dp)
if err != nil {
@@ -33,6 +33,8 @@ import (
)
func TestIntegrationFolderTreeZanzana(t *testing.T) {
// TODO: Add back OSS seeding and enable this test
t.Skip("Skipping folder tree test with Zanzana")
testutil.SkipIntegrationTestInShortMode(t)
runIntegrationFolderTree(t, testinfra.GrafanaOpts{
@@ -57,7 +57,7 @@ export class AppWrapper extends Component<AppWrapperProps, AppWrapperState> {
async componentDidMount() {
this.setState({ ready: true });
$('.preloader').remove();
this.removePreloader();
// clear any old icon caches
const cacheKeys = (await window.caches?.keys()) ?? [];
@@ -68,6 +68,15 @@ export class AppWrapper extends Component<AppWrapperProps, AppWrapperState> {
}
}
removePreloader() {
const preloader = document.querySelector('.preloader');
if (preloader) {
preloader.remove();
} else {
console.warn('Preloader element not found');
}
}
renderRoute = (route: RouteDescriptor) => {
return (
<Route
@@ -53,6 +53,14 @@ export interface GraphNGProps extends Themeable2 {
dataLinkPostProcessor?: DataLinkPostProcessor;
cursorSync?: DashboardCursorSync;
// Remove fields that are hidden from the visualization before rendering
// The fields will still be available for other things like data links
// This is a temporary hack that only works when:
// 1. renderLegend (above) does not render <PlotLegend>
// 2. there is no legend series toggle
// 3. all fields required for link/action generation are passed through (including those with hideFrom.viz)
omitHideFromViz?: boolean;
/**
* needed for propsToDiff to re-init the plot & config
* this is a generic approach to plot re-init, without having to specify which panel-level options
@@ -179,6 +187,15 @@ export class GraphNG extends Component<GraphNGProps, GraphNGState> {
};
}
if (props.omitHideFromViz) {
const nonHiddenFields = alignedFrameFinal.fields.filter((field) => field.config.custom?.hideFrom?.viz !== true);
alignedFrameFinal = {
...alignedFrameFinal,
fields: nonHiddenFields,
length: nonHiddenFields.length,
};
}
let config = this.state?.config;
if (withConfig) {
@@ -7,7 +7,7 @@ import { UPlotConfigBuilder, VizLayout, VizLegend, VizLegendItem } from '@grafan
import { GraphNG, GraphNGProps } from '../GraphNG/GraphNG';
import { getXAxisConfig } from '../TimeSeries/utils';
import { getSeriesAndRest, preparePlotConfigBuilder, TimelineMode } from './utils';
import { preparePlotConfigBuilder, TimelineMode } from './utils';
/**
* @alpha
@@ -56,10 +56,8 @@ export const TimelineChart = (props: TimelineProps) => {
const prepConfig = useCallback(
(alignedFrame: DataFrame, allFrames: DataFrame[], getTimeRange: () => TimeRange) => {
const { seriesFrame } = getSeriesAndRest(alignedFrame);
return preparePlotConfigBuilder({
frame: seriesFrame,
frame: alignedFrame,
getTimeRange,
allFrames: frames,
...props,
@@ -68,7 +66,7 @@ export const TimelineChart = (props: TimelineProps) => {
timeZones: Array.isArray(timeZone) ? timeZone : [timeZone],
// When there is only one row, use the full space
rowHeight: seriesFrame.fields.length > 2 ? rowHeight : 1,
rowHeight: alignedFrame.fields.length > 2 ? rowHeight : 1,
getValueColor: getValueColor,
hoverMulti: tooltip?.mode === TooltipDisplayMode.Multi,
@@ -107,6 +105,7 @@ export const TimelineChart = (props: TimelineProps) => {
prepConfig={prepConfig}
propsToDiff={propsToDiff}
renderLegend={renderLegend}
omitHideFromViz={true}
/>
);
};
@@ -18,7 +18,6 @@ import { preparePlotFrame } from '../GraphNG/utils';
import {
findNextStateIndex,
fmtDuration,
getSeriesAndRest,
getThresholdItems,
hasSpecialMappedValue,
makeFramePerSeries,
@@ -564,95 +563,3 @@ describe('hasSpecialMappedValue', () => {
expect(hasSpecialMappedValue(field, valueMatch)).toEqual(expected);
});
});
describe('getSeriesAndRest', () => {
it('should return all fields as series when none are hidden', () => {
const frame = toDataFrame({
fields: [
{ name: 'time', type: FieldType.time, values: [1, 2, 3] },
{ name: 'value1', type: FieldType.number, values: [10, 20, 30] },
{ name: 'value2', type: FieldType.string, values: ['a', 'b', 'c'] },
],
});
const result = getSeriesAndRest(frame);
expect(result.seriesFrame.fields).toHaveLength(3);
expect(result.restFields).toHaveLength(0);
expect(result.seriesFrame.fields.map((f) => f.name)).toEqual(['time', 'value1', 'value2']);
});
it('should separate hidden fields from visible fields', () => {
const frame = toDataFrame({
fields: [
{ name: 'time', type: FieldType.time, values: [1, 2, 3] },
{
name: 'visible1',
type: FieldType.number,
values: [10, 20, 30],
config: { custom: { hideFrom: { viz: false, legend: true } } },
},
{
name: 'hidden1',
type: FieldType.string,
values: ['a', 'b', 'c'],
config: { custom: { hideFrom: { viz: true } } },
},
{
name: 'visible2',
type: FieldType.number,
values: [100, 200, 300],
},
{
name: 'hidden2',
type: FieldType.string,
values: ['x', 'y', 'z'],
config: { custom: { hideFrom: { viz: true, tooltip: false } } },
},
],
});
const result = getSeriesAndRest(frame);
expect(result.seriesFrame.fields).toHaveLength(3);
expect(result.restFields).toHaveLength(2);
expect(result.seriesFrame.fields.map((f) => f.name)).toEqual(['time', 'visible1', 'visible2']);
expect(result.restFields.map((f) => f.name)).toEqual(['hidden1', 'hidden2']);
});
it('should handle all fields being hidden', () => {
const frame = toDataFrame({
fields: [
{
name: 'time',
type: FieldType.time,
values: [1, 2, 3],
config: { custom: { hideFrom: { viz: true } } },
},
{
name: 'value1',
type: FieldType.number,
values: [10, 20, 30],
config: { custom: { hideFrom: { viz: true } } },
},
],
});
const result = getSeriesAndRest(frame);
expect(result.seriesFrame.fields).toHaveLength(0);
expect(result.restFields).toHaveLength(2);
expect(result.restFields.map((f) => f.name)).toEqual(['time', 'value1']);
});
it('should handle empty frame', () => {
const frame = toDataFrame({
fields: [],
});
const result = getSeriesAndRest(frame);
expect(result.seriesFrame.fields).toHaveLength(0);
expect(result.restFields).toHaveLength(0);
});
});
@@ -756,26 +756,3 @@ export function fmtDuration(milliSeconds: number): string {
: '0'
).trim();
}
export function getSeriesAndRest(alignedFrame: DataFrame) {
const seriesFields: Field[] = [];
const restFields: Field[] = [];
alignedFrame.fields.forEach((field) => {
if (field.config.custom?.hideFrom?.viz) {
restFields.push(field);
} else {
seriesFields.push(field);
}
});
const seriesFrame: DataFrame = {
...alignedFrame,
fields: seriesFields,
};
return {
seriesFrame: seriesFrame,
restFields: restFields,
};
}
@@ -1,7 +1,15 @@
import { GrafanaConfig, locationUtil } from '@grafana/data';
import * as folderHooks from 'app/api/clients/folder/v1beta1/hooks';
import { backendSrv } from 'app/core/services/backend_srv';
import { AnnoKeyFolder, AnnoKeyMessage, AnnoReloadOnParamsChange } from 'app/features/apiserver/types';
import {
AnnoKeyFolder,
AnnoKeyManagerAllowsEdits,
AnnoKeyManagerKind,
AnnoKeyMessage,
AnnoKeySourcePath,
AnnoReloadOnParamsChange,
ManagerKind,
} from 'app/features/apiserver/types';
import { DashboardDataDTO } from 'app/types/dashboard';
import { DashboardWithAccessInfo } from './types';
@@ -215,6 +223,63 @@ describe('v1 dashboard API', () => {
expect(result.meta.reloadOnParamsChange).toBe(true);
});
describe('managed/provisioned dashboards', () => {
it('should not mark dashboard as provisioned when manager allows UI edits', async () => {
mockGet.mockResolvedValueOnce({
...mockDashboardDto,
metadata: {
...mockDashboardDto.metadata,
annotations: {
[AnnoKeyManagerKind]: ManagerKind.Terraform,
[AnnoKeyManagerAllowsEdits]: 'true',
[AnnoKeySourcePath]: 'dashboards/test.json',
},
},
});
const api = new K8sDashboardAPI();
const result = await api.getDashboardDTO('test');
expect(result.meta.provisioned).toBe(false);
expect(result.meta.provisionedExternalId).toBe('dashboards/test.json');
});
it('should mark dashboard as provisioned when manager does not allow UI edits', async () => {
mockGet.mockResolvedValueOnce({
...mockDashboardDto,
metadata: {
...mockDashboardDto.metadata,
annotations: {
[AnnoKeyManagerKind]: ManagerKind.Terraform,
[AnnoKeySourcePath]: 'dashboards/test.json',
},
},
});
const api = new K8sDashboardAPI();
const result = await api.getDashboardDTO('test');
expect(result.meta.provisioned).toBe(true);
expect(result.meta.provisionedExternalId).toBe('dashboards/test.json');
});
it('should not mark repository-managed dashboard as provisioned (locked)', async () => {
mockGet.mockResolvedValueOnce({
...mockDashboardDto,
metadata: {
...mockDashboardDto.metadata,
annotations: {
[AnnoKeyManagerKind]: ManagerKind.Repo,
[AnnoKeySourcePath]: 'dashboards/test.json',
},
},
});
const api = new K8sDashboardAPI();
const result = await api.getDashboardDTO('test');
expect(result.meta.provisioned).toBe(false);
expect(result.meta.provisionedExternalId).toBe('dashboards/test.json');
});
});
describe('saveDashboard', () => {
beforeEach(() => {
locationUtil.initialize({
@@ -164,7 +164,11 @@ export class K8sDashboardAPI implements DashboardAPI<DashboardDTO, Dashboard> {
const managerKind = annotations[AnnoKeyManagerKind];
if (managerKind) {
result.meta.provisioned = annotations[AnnoKeyManagerAllowsEdits] === 'true' || managerKind === ManagerKind.Repo;
// `meta.provisioned` is used by the save/delete UI to decide if a dashboard is locked
// (i.e. it can't be saved from the UI). This should match the legacy behavior where
// `allowUiUpdates: true` keeps the dashboard editable/savable.
const allowsEdits = annotations[AnnoKeyManagerAllowsEdits] === 'true';
result.meta.provisioned = !allowsEdits && managerKind !== ManagerKind.Repo;
result.meta.provisionedExternalId = annotations[AnnoKeySourcePath];
}
@@ -78,6 +78,7 @@ export const BasicProvisionedDashboardsEmptyPage = ({ datasourceUid }: Props) =>
sourceEntryPoint: SOURCE_ENTRY_POINTS.DATASOURCE_PAGE,
libraryItemId: dashboard.uid,
creationOrigin: CREATION_ORIGINS.DASHBOARD_LIBRARY_DATASOURCE_DASHBOARD,
contentKind: CONTENT_KINDS.DATASOURCE_DASHBOARD,
});
const templateUrl = `${DASHBOARD_LIBRARY_ROUTES.Template}?${params.toString()}`;
@@ -0,0 +1,125 @@
import { screen, waitFor } from '@testing-library/react';
import React from 'react';
import { render } from 'test/test-utils';
import { CommunityDashboardSection } from './CommunityDashboardSection';
import { fetchCommunityDashboards } from './api/dashboardLibraryApi';
import { GnetDashboard } from './types';
import { onUseCommunityDashboard } from './utils/communityDashboardHelpers';
jest.mock('./api/dashboardLibraryApi', () => ({
fetchCommunityDashboards: jest.fn(),
}));
jest.mock('./utils/communityDashboardHelpers', () => ({
...jest.requireActual('./utils/communityDashboardHelpers'),
onUseCommunityDashboard: jest.fn(),
}));
jest.mock('@grafana/runtime', () => ({
...jest.requireActual('@grafana/runtime'),
getDataSourceSrv: () => ({
getInstanceSettings: jest.fn((uid: string) => ({
uid,
name: `DataSource ${uid}`,
type: 'test',
})),
}),
}));
const mockFetchCommunityDashboards = fetchCommunityDashboards as jest.MockedFunction<typeof fetchCommunityDashboards>;
const mockOnUseCommunityDashboard = onUseCommunityDashboard as jest.MockedFunction<typeof onUseCommunityDashboard>;
const createMockGnetDashboard = (overrides: Partial<GnetDashboard> = {}): GnetDashboard => ({
id: 1,
name: 'Test Dashboard',
description: 'Test Description',
downloads: 2000,
datasource: 'Prometheus',
slug: 'test-dashboard',
...overrides,
});
const setup = async (
props: Partial<React.ComponentProps<typeof CommunityDashboardSection>> = {},
successScenario = true
) => {
const renderResult = render(
<CommunityDashboardSection onShowMapping={jest.fn()} datasourceType="test" {...props} />,
{
historyOptions: {
initialEntries: ['/test?dashboardLibraryDatasourceUid=test-datasource-uid'],
},
}
);
if (successScenario) {
await waitFor(() => {
expect(screen.getByText('Test Dashboard')).toBeInTheDocument();
});
}
return renderResult;
};
describe('CommunityDashboardSection', () => {
beforeEach(() => {
jest.clearAllMocks();
});
it('should render', async () => {
mockFetchCommunityDashboards.mockResolvedValue({
page: 1,
pages: 5,
items: [
createMockGnetDashboard(),
createMockGnetDashboard({ id: 2, name: 'Test Dashboard 2' }),
createMockGnetDashboard({ id: 3, name: 'Test Dashboard 3' }),
],
});
await setup();
await waitFor(() => {
expect(screen.getByText('Test Dashboard')).toBeInTheDocument();
expect(screen.getByText('Test Dashboard 2')).toBeInTheDocument();
expect(screen.getByText('Test Dashboard 3')).toBeInTheDocument();
});
});
it('should show error when fetching a specific community dashboard after clicking use dashboard button fails', async () => {
mockFetchCommunityDashboards.mockResolvedValue({
page: 1,
pages: 5,
items: [createMockGnetDashboard()],
});
mockOnUseCommunityDashboard.mockRejectedValue(new Error('Failed to use community dashboard'));
const { user } = await setup();
await waitFor(() => {
expect(screen.getByText('Test Dashboard')).toBeInTheDocument();
});
const useDashboardButton = screen.getByRole('button', { name: 'Use dashboard' });
await user.click(useDashboardButton);
await waitFor(() => {
expect(screen.getByText('Error loading community dashboard')).toBeInTheDocument();
});
});
it('should show error when fetching community dashboards list fails', async () => {
const consoleErrorSpy = jest.spyOn(console, 'error').mockImplementation();
mockFetchCommunityDashboards.mockRejectedValue(new Error('Failed to fetch community dashboards'));
await setup(undefined, false);
await waitFor(() => {
expect(screen.getByText('Error loading community dashboards')).toBeInTheDocument();
});
expect(consoleErrorSpy).toHaveBeenCalledWith('Error loading community dashboards', expect.any(Error));
consoleErrorSpy.mockRestore();
});
});
@@ -1,12 +1,12 @@
import { css } from '@emotion/css';
import { useEffect, useRef, useState } from 'react';
import { useSearchParams } from 'react-router-dom-v5-compat';
import { useAsync, useDebounce } from 'react-use';
import { useAsyncFn, useAsyncRetry, useDebounce } from 'react-use';
import { GrafanaTheme2 } from '@grafana/data';
import { Trans, t } from '@grafana/i18n';
import { getDataSourceSrv } from '@grafana/runtime';
import { Button, useStyles2, Stack, Grid, EmptyState, Alert, Pagination, FilterInput } from '@grafana/ui';
import { Button, useStyles2, Stack, Grid, EmptyState, Alert, FilterInput, Box } from '@grafana/ui';
import { DashboardCard } from './DashboardCard';
import { MappingContext } from './SuggestedDashboardsModal';
@@ -24,6 +24,8 @@ import {
getLogoUrl,
buildDashboardDetails,
onUseCommunityDashboard,
COMMUNITY_PAGE_SIZE_QUERY,
COMMUNITY_RESULT_SIZE,
} from './utils/communityDashboardHelpers';
interface Props {
@@ -31,8 +33,6 @@ interface Props {
datasourceType?: string;
}
// Constants for community dashboard pagination and API params
const COMMUNITY_PAGE_SIZE = 9;
const SEARCH_DEBOUNCE_MS = 500;
const DEFAULT_SORT_ORDER = 'downloads';
const DEFAULT_SORT_DIRECTION = 'desc';
@@ -42,7 +42,6 @@ const INCLUDE_SCREENSHOTS = true;
export const CommunityDashboardSection = ({ onShowMapping, datasourceType }: Props) => {
const [searchParams] = useSearchParams();
const datasourceUid = searchParams.get('dashboardLibraryDatasourceUid');
const [currentPage, setCurrentPage] = useState(1);
const [searchQuery, setSearchQuery] = useState('');
const hasTrackedLoaded = useRef(false);
@@ -55,18 +54,12 @@ export const CommunityDashboardSection = ({ onShowMapping, datasourceType }: Pro
[searchQuery]
);
// Reset to page 1 when debounced search query changes
useEffect(() => {
if (debouncedSearchQuery) {
setCurrentPage(1);
}
}, [debouncedSearchQuery]);
const {
value: response,
loading,
error,
} = useAsync(async () => {
retry,
} = useAsyncRetry(async () => {
if (!datasourceUid) {
return null;
}
@@ -80,8 +73,8 @@ export const CommunityDashboardSection = ({ onShowMapping, datasourceType }: Pro
const apiResponse = await fetchCommunityDashboards({
orderBy: DEFAULT_SORT_ORDER,
direction: DEFAULT_SORT_DIRECTION,
page: currentPage,
pageSize: COMMUNITY_PAGE_SIZE,
page: 1,
pageSize: COMMUNITY_PAGE_SIZE_QUERY,
includeLogo: INCLUDE_LOGO,
includeScreenshots: INCLUDE_SCREENSHOTS,
dataSourceSlugIn: ds.type,
@@ -100,15 +93,14 @@ export const CommunityDashboardSection = ({ onShowMapping, datasourceType }: Pro
}
return {
dashboards: apiResponse.items,
pages: apiResponse.pages,
dashboards: apiResponse.items.slice(0, COMMUNITY_RESULT_SIZE),
datasourceType: ds.type,
};
} catch (err) {
console.error('Error loading community dashboards', err);
throw err;
}
}, [datasourceUid, currentPage, debouncedSearchQuery]);
}, [datasourceUid, debouncedSearchQuery]);
// Track analytics only once on first successful load
useEffect(() => {
@@ -128,37 +120,49 @@ export const CommunityDashboardSection = ({ onShowMapping, datasourceType }: Pro
// Determine what to show in results area
const dashboards = Array.isArray(response?.dashboards) ? response.dashboards : [];
const totalPages = response?.pages || 1;
const showEmptyState = !loading && (!response?.dashboards || response.dashboards.length === 0);
const showError = !loading && error;
const onPreviewCommunityDashboard = (dashboard: GnetDashboard) => {
if (!response) {
return;
}
const [{ error: isPreviewDashboardError }, onPreviewCommunityDashboard] = useAsyncFn(
async (dashboard: GnetDashboard) => {
if (!response) {
return;
}
// Track item click
DashboardLibraryInteractions.itemClicked({
contentKind: CONTENT_KINDS.COMMUNITY_DASHBOARD,
datasourceTypes: [response.datasourceType],
libraryItemId: String(dashboard.id),
libraryItemTitle: dashboard.name,
sourceEntryPoint: SOURCE_ENTRY_POINTS.DATASOURCE_PAGE,
eventLocation: EVENT_LOCATIONS.MODAL_COMMUNITY_TAB,
discoveryMethod: debouncedSearchQuery.trim() ? DISCOVERY_METHODS.SEARCH : DISCOVERY_METHODS.BROWSE,
});
// Track item click
DashboardLibraryInteractions.itemClicked({
contentKind: CONTENT_KINDS.COMMUNITY_DASHBOARD,
datasourceTypes: [response.datasourceType],
libraryItemId: String(dashboard.id),
libraryItemTitle: dashboard.name,
sourceEntryPoint: SOURCE_ENTRY_POINTS.DATASOURCE_PAGE,
eventLocation: EVENT_LOCATIONS.MODAL_COMMUNITY_TAB,
discoveryMethod: debouncedSearchQuery.trim() ? DISCOVERY_METHODS.SEARCH : DISCOVERY_METHODS.BROWSE,
});
onUseCommunityDashboard({
dashboard,
datasourceUid: datasourceUid || '',
datasourceType: response.datasourceType,
eventLocation: EVENT_LOCATIONS.MODAL_COMMUNITY_TAB,
onShowMapping,
});
};
await onUseCommunityDashboard({
dashboard,
datasourceUid: datasourceUid || '',
datasourceType: response.datasourceType,
eventLocation: EVENT_LOCATIONS.MODAL_COMMUNITY_TAB,
onShowMapping,
});
},
[response, datasourceUid, debouncedSearchQuery, onShowMapping]
);
return (
<Stack direction="column" gap={2} height="100%">
{isPreviewDashboardError && (
<div>
<Alert
title={t('dashboard-library.community-error-title', 'Error loading community dashboard')}
severity="error"
>
<Trans i18nKey="dashboard-library.community-error-description">Failed to load community dashboard.</Trans>
</Alert>
</div>
)}
<FilterInput
placeholder={
datasourceType
@@ -183,7 +187,7 @@ export const CommunityDashboardSection = ({ onShowMapping, datasourceType }: Pro
lg: 3,
}}
>
-              {Array.from({ length: COMMUNITY_PAGE_SIZE }).map((_, i) => (
+              {Array.from({ length: COMMUNITY_RESULT_SIZE }).map((_, i) => (
<DashboardCard.Skeleton key={`skeleton-${i}`} />
))}
</Grid>
@@ -197,7 +201,7 @@ export const CommunityDashboardSection = ({ onShowMapping, datasourceType }: Pro
Failed to load community dashboards. Please try again.
</Trans>
</Alert>
-            <Button variant="secondary" onClick={() => setCurrentPage(1)}>
+            <Button variant="secondary" onClick={retry}>
<Trans i18nKey="dashboard-library.retry">Retry</Trans>
</Button>
</Stack>
@@ -233,42 +237,47 @@ export const CommunityDashboardSection = ({ onShowMapping, datasourceType }: Pro
)}
</EmptyState>
) : (
-            <Grid
-              gap={4}
-              columns={{
-                xs: 1,
-                sm: dashboards.length >= 2 ? 2 : 1,
-                lg: dashboards.length >= 3 ? 3 : dashboards.length >= 2 ? 2 : 1,
-              }}
-            >
-              {dashboards.map((dashboard) => {
-                const thumbnailUrl = getThumbnailUrl(dashboard);
-                const logoUrl = getLogoUrl(dashboard);
-                const imageUrl = thumbnailUrl || logoUrl;
-                const isLogo = !thumbnailUrl;
-                const details = buildDashboardDetails(dashboard);
+            <Stack direction="column" gap={2}>
+              <Grid
+                gap={4}
+                columns={{
+                  xs: 1,
+                  sm: dashboards.length >= 2 ? 2 : 1,
+                  lg: dashboards.length >= 3 ? 3 : dashboards.length >= 2 ? 2 : 1,
+                }}
+              >
+                {dashboards.map((dashboard) => {
+                  const thumbnailUrl = getThumbnailUrl(dashboard);
+                  const logoUrl = getLogoUrl(dashboard);
+                  const imageUrl = thumbnailUrl || logoUrl;
+                  const isLogo = !thumbnailUrl;
+                  const details = buildDashboardDetails(dashboard);
-                return (
-                  <DashboardCard
-                    key={dashboard.id}
-                    title={dashboard.name}
-                    imageUrl={imageUrl}
-                    dashboard={dashboard}
-                    onClick={() => onPreviewCommunityDashboard(dashboard)}
-                    isLogo={isLogo}
-                    details={details}
-                    kind="suggested_dashboard"
-                  />
-                );
-              })}
-            </Grid>
+                  return (
+                    <DashboardCard
+                      key={dashboard.id}
+                      title={dashboard.name}
+                      imageUrl={imageUrl}
+                      dashboard={dashboard}
+                      onClick={() => onPreviewCommunityDashboard(dashboard)}
+                      isLogo={isLogo}
+                      details={details}
+                      kind="suggested_dashboard"
+                    />
+                  );
+                })}
+              </Grid>
+              <Box display="flex" justifyContent="end" gap={2} paddingRight={1.5}>
+                <Button
+                  variant="secondary"
+                  onClick={() => window.open('https://grafana.com/grafana/dashboards/', '_blank')}
+                >
+                  <Trans i18nKey="dashboard-library.browse-grafana-com">Browse Grafana.com</Trans>
+                </Button>
+              </Box>
+            </Stack>
)}
</div>
{totalPages > 1 && (
<div className={styles.paginationWrapper}>
<Pagination currentPage={currentPage} numberOfPages={totalPages} onNavigate={setCurrentPage} />
</div>
)}
</Stack>
);
};
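The responsive `columns` props in this diff repeat the same ternary: never render more columns than there are cards. That logic reduces to a small pure helper; a sketch under that assumption (`gridColumns` is an illustrative name, not something the component exports):

```typescript
// Sketch of the Grid column logic used above: cap the column count
// at the number of dashboards so single cards do not stretch oddly.
function gridColumns(count: number): { xs: number; sm: number; lg: number } {
  return {
    xs: 1,
    sm: count >= 2 ? 2 : 1,
    lg: count >= 3 ? 3 : count >= 2 ? 2 : 1,
  };
}
```

Keeping this as an inline ternary in JSX is fine at two call sites; extracting it would only pay off if a third breakpoint appeared.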
@@ -277,18 +286,9 @@ function getStyles(theme: GrafanaTheme2) {
return {
resultsContainer: css({
width: '100%',
position: 'relative',
flex: 1,
overflow: 'auto',
}),
paginationWrapper: css({
position: 'sticky',
bottom: 0,
backgroundColor: theme.colors.background.primary,
padding: theme.spacing(2),
display: 'flex',
justifyContent: 'flex-end',
zIndex: 2,
paddingBottom: theme.spacing(2),
}),
};
}
@@ -1,41 +1,8 @@
import { screen } from '@testing-library/react';
import { render } from 'test/test-utils';
import { PluginDashboard } from 'app/types/plugins';
import { DashboardCard } from './DashboardCard';
import { GnetDashboard } from './types';
-// Helper functions for creating mock objects
-const createMockPluginDashboard = (overrides: Partial<PluginDashboard> = {}): PluginDashboard => ({
-  dashboardId: 1,
-  description: 'Test description',
-  imported: false,
-  importedRevision: 0,
-  importedUri: '',
-  importedUrl: '',
-  path: '',
-  pluginId: 'test-plugin',
-  removed: false,
-  revision: 1,
-  slug: 'test-dashboard',
-  title: 'Test Dashboard',
-  uid: 'test-uid',
-  ...overrides,
-});
-const createMockGnetDashboard = (overrides: Partial<GnetDashboard> = {}): GnetDashboard => ({
-  id: 123,
-  name: 'Test Dashboard',
-  description: 'Test description',
-  datasource: 'Prometheus',
-  orgName: 'Test Org',
-  userName: 'testuser',
-  publishedAt: '',
-  updatedAt: '',
-  downloads: 0,
-  ...overrides,
-});
+import { createMockGnetDashboard, createMockPluginDashboard } from './utils/test-utils';
const createMockDetails = (overrides = {}) => ({
id: '123',
@@ -0,0 +1,273 @@
import { screen, waitFor, within } from '@testing-library/react';
import { render } from 'test/test-utils';
import { locationService } from '@grafana/runtime';
import { DashboardLibrarySection } from './DashboardLibrarySection';
import { fetchProvisionedDashboards } from './api/dashboardLibraryApi';
import { DashboardLibraryInteractions } from './interactions';
import { createMockPluginDashboard } from './utils/test-utils';
jest.mock('./api/dashboardLibraryApi', () => ({
fetchProvisionedDashboards: jest.fn(),
}));
jest.mock('@grafana/runtime', () => ({
...jest.requireActual('@grafana/runtime'),
getDataSourceSrv: () => ({
getInstanceSettings: jest.fn((uid?: string) => {
if (uid) {
return {
uid,
name: `DataSource ${uid}`,
type: 'test-datasource',
};
}
return null;
}),
}),
locationService: {
push: jest.fn(),
getHistory: jest.fn(() => ({
listen: jest.fn(() => jest.fn()),
})),
},
}));
jest.mock('./interactions', () => ({
...jest.requireActual('./interactions'),
DashboardLibraryInteractions: {
loaded: jest.fn(),
itemClicked: jest.fn(),
},
}));
jest.mock('./DashboardCard', () => {
const DashboardCardComponent = ({ title, onClick }: { title: string; onClick: () => void }) => (
<div data-testid={`dashboard-card-${title}`} onClick={onClick}>
{title}
</div>
);
const DashboardCardSkeleton = () => <div data-testid="dashboard-card-skeleton">Skeleton</div>;
return {
DashboardCard: Object.assign(DashboardCardComponent, {
Skeleton: DashboardCardSkeleton,
}),
};
});
const mockFetchProvisionedDashboards = fetchProvisionedDashboards as jest.MockedFunction<
typeof fetchProvisionedDashboards
>;
const mockLocationServicePush = locationService.push as jest.MockedFunction<typeof locationService.push>;
const mockDashboardLibraryInteractionsLoaded = DashboardLibraryInteractions.loaded as jest.MockedFunction<
typeof DashboardLibraryInteractions.loaded
>;
const mockDashboardLibraryInteractionsItemClicked = DashboardLibraryInteractions.itemClicked as jest.MockedFunction<
typeof DashboardLibraryInteractions.itemClicked
>;
describe('DashboardLibrarySection', () => {
beforeEach(() => {
jest.clearAllMocks();
});
it('should render dashboards when they are available', async () => {
const dashboards = [
createMockPluginDashboard({ title: 'Dashboard 1', uid: 'uid-1' }),
createMockPluginDashboard({ title: 'Dashboard 2', uid: 'uid-2' }),
];
mockFetchProvisionedDashboards.mockResolvedValue(dashboards);
render(<DashboardLibrarySection />, {
historyOptions: {
initialEntries: ['/test?dashboardLibraryDatasourceUid=test-uid'],
},
});
await waitFor(() => {
expect(screen.getByTestId('dashboard-card-Dashboard 1')).toBeInTheDocument();
expect(screen.getByTestId('dashboard-card-Dashboard 2')).toBeInTheDocument();
});
});
it('should show empty state when there are no dashboards', async () => {
mockFetchProvisionedDashboards.mockResolvedValue([]);
render(<DashboardLibrarySection />, {
historyOptions: {
initialEntries: ['/test?dashboardLibraryDatasourceUid=test-uid'],
},
});
await waitFor(() => {
expect(screen.getByText('No test-datasource provisioned dashboards found')).toBeInTheDocument();
expect(
screen.getByText(
'Provisioned dashboards are provided by data source plugins. You can find more plugins on Grafana.com.'
)
).toBeInTheDocument();
const browseButton = screen.getByRole('button', { name: 'Browse plugins' });
expect(browseButton).toBeInTheDocument();
});
});
it('should show empty state without datasource type when datasourceUid is not provided', async () => {
mockFetchProvisionedDashboards.mockResolvedValue([]);
render(<DashboardLibrarySection />, {
historyOptions: {
initialEntries: ['/test'],
},
});
await waitFor(() => {
expect(screen.getByText('No provisioned dashboards found')).toBeInTheDocument();
});
});
it('should render pagination when there are more than 9 dashboards', async () => {
const dashboards = Array.from({ length: 18 }, (_, i) =>
createMockPluginDashboard({ title: `Dashboard ${i + 1}`, uid: `uid-${i + 1}` })
);
mockFetchProvisionedDashboards.mockResolvedValue(dashboards);
render(<DashboardLibrarySection />, {
historyOptions: {
initialEntries: ['/test?dashboardLibraryDatasourceUid=test-uid'],
},
});
await waitFor(() => {
const pagination = screen.getByRole('navigation');
expect(pagination).toBeInTheDocument();
expect(within(pagination).getByText('1')).toBeInTheDocument();
expect(within(pagination).getByText('2')).toBeInTheDocument();
});
});
it('should not render pagination when there are 9 or fewer dashboards', async () => {
const dashboards = Array.from({ length: 9 }, (_, i) =>
createMockPluginDashboard({ title: `Dashboard ${i + 1}`, uid: `uid-${i + 1}` })
);
mockFetchProvisionedDashboards.mockResolvedValue(dashboards);
render(<DashboardLibrarySection />, {
historyOptions: {
initialEntries: ['/test?dashboardLibraryDatasourceUid=test-uid'],
},
});
await waitFor(() => {
expect(screen.getByTestId('dashboard-card-Dashboard 1')).toBeInTheDocument();
});
const pagination = screen.queryByRole('navigation');
expect(pagination).not.toBeInTheDocument();
});
it('should navigate to template route when clicking on a dashboard', async () => {
const dashboard = createMockPluginDashboard({
title: 'Test Dashboard',
uid: 'test-uid-123',
pluginId: 'test-plugin',
path: 'test/path.json',
});
mockFetchProvisionedDashboards.mockResolvedValue([dashboard]);
render(<DashboardLibrarySection />, {
historyOptions: {
initialEntries: ['/test?dashboardLibraryDatasourceUid=test-uid'],
},
});
await waitFor(() => {
expect(screen.getByTestId('dashboard-card-Test Dashboard')).toBeInTheDocument();
});
const dashboardCard = screen.getByTestId('dashboard-card-Test Dashboard');
dashboardCard.click();
await waitFor(() => {
expect(mockLocationServicePush).toHaveBeenCalled();
const callArgs = mockLocationServicePush.mock.calls[0][0];
expect(callArgs).toContain('/dashboard/template');
expect(callArgs).toContain('datasource=test-uid');
expect(callArgs).toContain('title=Test+Dashboard');
expect(callArgs).toContain('pluginId=test-plugin');
expect(callArgs).toContain('path=test%2Fpath.json');
expect(callArgs).toContain('libraryItemId=test-uid-123');
});
});
it('should track analytics when dashboards are loaded', async () => {
const dashboards = [
createMockPluginDashboard({ title: 'Dashboard 1', uid: 'uid-1' }),
createMockPluginDashboard({ title: 'Dashboard 2', uid: 'uid-2' }),
];
mockFetchProvisionedDashboards.mockResolvedValue(dashboards);
render(<DashboardLibrarySection />, {
historyOptions: {
initialEntries: ['/test?dashboardLibraryDatasourceUid=test-uid'],
},
});
await waitFor(() => {
expect(screen.getByTestId('dashboard-card-Dashboard 1')).toBeInTheDocument();
});
await waitFor(() => {
expect(mockDashboardLibraryInteractionsLoaded).toHaveBeenCalledWith({
numberOfItems: 2,
contentKinds: ['datasource_dashboard'],
datasourceTypes: ['test-datasource'],
sourceEntryPoint: 'datasource_page',
eventLocation: 'suggested_dashboards_modal_provisioned_tab',
});
});
});
it('should track analytics when a dashboard is clicked', async () => {
const dashboard = createMockPluginDashboard({
title: 'Test Dashboard',
uid: 'test-uid-123',
pluginId: 'test-plugin',
});
mockFetchProvisionedDashboards.mockResolvedValue([dashboard]);
render(<DashboardLibrarySection />, {
historyOptions: {
initialEntries: ['/test?dashboardLibraryDatasourceUid=test-uid'],
},
});
await waitFor(() => {
expect(screen.getByTestId('dashboard-card-Test Dashboard')).toBeInTheDocument();
});
const dashboardCard = screen.getByTestId('dashboard-card-Test Dashboard');
dashboardCard.click();
await waitFor(() => {
expect(mockDashboardLibraryInteractionsItemClicked).toHaveBeenCalledWith({
contentKind: 'datasource_dashboard',
datasourceTypes: ['test-plugin'],
libraryItemId: 'test-uid-123',
libraryItemTitle: 'Test Dashboard',
sourceEntryPoint: 'datasource_page',
eventLocation: 'suggested_dashboards_modal_provisioned_tab',
discoveryMethod: 'browse',
});
});
});
});
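The navigation assertions above (`title=Test+Dashboard`, `path=test%2Fpath.json`) follow from `URLSearchParams` form encoding: spaces become `+` and `/` becomes `%2F`. A sketch of how such a `/dashboard/template` URL is assembled (the param names mirror the assertions; the real builder lives in the component under test):

```typescript
// Sketch: assemble the template-route URL whose encoding the test asserts on.
// URLSearchParams uses application/x-www-form-urlencoded rules:
// space -> '+', '/' -> '%2F', while '-' stays unescaped.
function buildTemplateUrl(args: {
  datasource: string;
  title: string;
  pluginId: string;
  path: string;
  libraryItemId: string;
}): string {
  const params = new URLSearchParams({
    datasource: args.datasource,
    title: args.title,
    pluginId: args.pluginId,
    path: args.path,
    libraryItemId: args.libraryItemId,
  });
  return `/dashboard/template?${params.toString()}`;
}
```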
@@ -0,0 +1,186 @@
import { screen, waitFor } from '@testing-library/react';
import { render } from 'test/test-utils';
import { SuggestedDashboards } from './SuggestedDashboards';
import { fetchCommunityDashboards, fetchProvisionedDashboards } from './api/dashboardLibraryApi';
import { createMockGnetDashboard, createMockPluginDashboard } from './utils/test-utils';
jest.mock('./api/dashboardLibraryApi', () => ({
fetchProvisionedDashboards: jest.fn(),
fetchCommunityDashboards: jest.fn(),
}));
jest.mock('./utils/communityDashboardHelpers', () => ({
...jest.requireActual('./utils/communityDashboardHelpers'),
onUseCommunityDashboard: jest.fn(),
}));
jest.mock('./SuggestedDashboardsModal', () => ({
SuggestedDashboardsModal: () => <div data-testid="suggested-dashboards-modal">Modal</div>,
}));
jest.mock('./DashboardCard', () => {
const DashboardCardComponent = ({ title, onClick }: { title: string; onClick: () => void }) => (
<div data-testid={`dashboard-card-${title}`} onClick={onClick}>
{title}
</div>
);
const DashboardCardSkeleton = () => <div data-testid="dashboard-card-skeleton">Skeleton</div>;
return {
DashboardCard: Object.assign(DashboardCardComponent, {
Skeleton: DashboardCardSkeleton,
}),
};
});
jest.mock('@grafana/runtime', () => ({
...jest.requireActual('@grafana/runtime'),
getDataSourceSrv: () => ({
getInstanceSettings: jest.fn((uid?: string) => {
if (uid) {
return {
uid,
name: `DataSource ${uid}`,
type: 'test-datasource',
};
}
return null;
}),
}),
}));
jest.mock('./interactions', () => ({
...jest.requireActual('./interactions'),
DashboardLibraryInteractions: {
loaded: jest.fn(),
itemClicked: jest.fn(),
},
}));
const mockFetchProvisionedDashboards = fetchProvisionedDashboards as jest.MockedFunction<
typeof fetchProvisionedDashboards
>;
const mockFetchCommunityDashboards = fetchCommunityDashboards as jest.MockedFunction<typeof fetchCommunityDashboards>;
describe('SuggestedDashboards', () => {
beforeEach(() => {
jest.clearAllMocks();
});
it('should render when there are dashboards', async () => {
mockFetchProvisionedDashboards.mockResolvedValue([createMockPluginDashboard()]);
mockFetchCommunityDashboards.mockResolvedValue({
page: 1,
pages: 1,
items: [createMockGnetDashboard()],
});
render(<SuggestedDashboards datasourceUid="test-uid" />);
await waitFor(() => {
expect(screen.getByTestId('suggested-dashboards')).toBeInTheDocument();
});
});
it('should not render when there are no dashboards', async () => {
mockFetchProvisionedDashboards.mockResolvedValue([]);
mockFetchCommunityDashboards.mockResolvedValue({
page: 1,
pages: 1,
items: [],
});
render(<SuggestedDashboards datasourceUid="test-uid" />);
await waitFor(() => {
expect(screen.queryByTestId('suggested-dashboards')).not.toBeInTheDocument();
});
});
it('should render provisioned dashboard cards', async () => {
const provisionedDashboard = createMockPluginDashboard({ title: 'Provisioned Dashboard 1' });
mockFetchProvisionedDashboards.mockResolvedValue([provisionedDashboard]);
mockFetchCommunityDashboards.mockResolvedValue({
page: 1,
pages: 1,
items: [],
});
render(<SuggestedDashboards datasourceUid="test-uid" />);
await waitFor(() => {
expect(screen.getByTestId('dashboard-card-Provisioned Dashboard 1')).toBeInTheDocument();
});
});
it('should render community dashboard cards', async () => {
const communityDashboard = createMockGnetDashboard({ name: 'Community Dashboard 1' });
mockFetchProvisionedDashboards.mockResolvedValue([]);
mockFetchCommunityDashboards.mockResolvedValue({
page: 1,
pages: 1,
items: [communityDashboard],
});
render(<SuggestedDashboards datasourceUid="test-uid" />);
await waitFor(() => {
expect(screen.getByTestId('dashboard-card-Community Dashboard 1')).toBeInTheDocument();
});
});
it('should show "View all" button when hasMoreDashboards is true', async () => {
mockFetchProvisionedDashboards.mockResolvedValue([
createMockPluginDashboard(),
createMockPluginDashboard({ title: 'Provisioned Dashboard 2' }),
]);
mockFetchCommunityDashboards.mockResolvedValue({
page: 1,
pages: 1,
items: [],
});
render(<SuggestedDashboards datasourceUid="test-uid" />);
await waitFor(() => {
expect(screen.getByRole('button', { name: 'View all' })).toBeInTheDocument();
});
});
it('should not show "View all" button when hasMoreDashboards is false', async () => {
mockFetchProvisionedDashboards.mockResolvedValue([createMockPluginDashboard()]);
mockFetchCommunityDashboards.mockResolvedValue({
page: 1,
pages: 1,
items: [createMockGnetDashboard()],
});
render(<SuggestedDashboards datasourceUid="test-uid" />);
await waitFor(() => {
expect(screen.queryByRole('button', { name: 'View all' })).not.toBeInTheDocument();
});
});
it('should render title and subtitle with datasource type when datasourceUid is provided', async () => {
mockFetchProvisionedDashboards.mockResolvedValue([createMockPluginDashboard()]);
mockFetchCommunityDashboards.mockResolvedValue({
page: 1,
pages: 1,
items: [],
});
render(<SuggestedDashboards datasourceUid="test-uid" />);
await waitFor(() => {
expect(
screen.getByText('Build a dashboard using suggested options for your test-datasource data source')
).toBeInTheDocument();
expect(
screen.getByText('Browse and select from data-source provided or community dashboards')
).toBeInTheDocument();
});
});
});
@@ -1,12 +1,12 @@
import { css } from '@emotion/css';
import { useEffect, useMemo, useRef, useState } from 'react';
import { useSearchParams } from 'react-router-dom-v5-compat';
-import { useAsync } from 'react-use';
+import { useAsync, useAsyncFn } from 'react-use';
import { GrafanaTheme2 } from '@grafana/data';
import { Trans, t } from '@grafana/i18n';
import { getDataSourceSrv, locationService } from '@grafana/runtime';
-import { Button, useStyles2, Grid } from '@grafana/ui';
+import { Button, useStyles2, Grid, Alert } from '@grafana/ui';
import { PluginDashboard } from 'app/types/plugins';
import { DashboardCard } from './DashboardCard';
@@ -26,6 +26,8 @@ import {
getLogoUrl,
buildDashboardDetails,
onUseCommunityDashboard,
COMMUNITY_PAGE_SIZE_QUERY,
COMMUNITY_RESULT_SIZE,
} from './utils/communityDashboardHelpers';
import { getProvisionedDashboardImageUrl } from './utils/provisionedDashboardHelpers';
@@ -43,7 +45,7 @@ type SuggestedDashboardsResult = {
};
// Constants for suggested dashboards API params
-const SUGGESTED_COMMUNITY_PAGE_SIZE = 2;
+const MAX_SUGGESTED_DASHBOARDS_PREVIEW = 2;
const DEFAULT_SORT_ORDER = 'downloads';
const DEFAULT_SORT_DIRECTION = 'desc';
const INCLUDE_SCREENSHOTS = true;
@@ -91,14 +93,14 @@ export const SuggestedDashboards = ({ datasourceUid }: Props) => {
orderBy: DEFAULT_SORT_ORDER,
direction: DEFAULT_SORT_DIRECTION,
page: 1,
-          pageSize: SUGGESTED_COMMUNITY_PAGE_SIZE,
+          pageSize: COMMUNITY_PAGE_SIZE_QUERY,
includeScreenshots: INCLUDE_SCREENSHOTS,
dataSourceSlugIn: ds.type,
includeLogo: INCLUDE_LOGO,
}),
]);
-        const community = communityResponse.items;
+        const community = communityResponse.items.slice(0, COMMUNITY_RESULT_SIZE);
// Mix: 1 provisioned + 2 community
const mixed: MixedDashboard[] = [];
@@ -130,7 +132,7 @@ export const SuggestedDashboards = ({ datasourceUid }: Props) => {
        // Determine if there are more dashboards available beyond what we're showing
        // Show "View all" if: more than 1 provisioned exists OR more community dashboards came back than the preview displays
-        const hasMoreDashboards = provisioned.length > 1 || community.length >= SUGGESTED_COMMUNITY_PAGE_SIZE;
+        const hasMoreDashboards = provisioned.length > 1 || community.length > MAX_SUGGESTED_DASHBOARDS_PREVIEW;
return { dashboards: mixed, hasMoreDashboards };
} catch (error) {
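The "View all" condition above is a pure predicate over the two counts. A minimal sketch under the diff's assumption that the preview shows at most 1 provisioned and `MAX_SUGGESTED_DASHBOARDS_PREVIEW` community dashboards (`MAX_PREVIEW` here is an illustrative stand-in for that constant):

```typescript
// Mirrors MAX_SUGGESTED_DASHBOARDS_PREVIEW from the diff above (illustrative copy).
const MAX_PREVIEW = 2;

// "View all" appears when more items exist than the preview can show:
// a second provisioned dashboard, or more community results than the preview cap.
function hasMoreDashboards(provisionedCount: number, communityCount: number): boolean {
  return provisionedCount > 1 || communityCount > MAX_PREVIEW;
}
```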
@@ -233,35 +235,38 @@ export const SuggestedDashboards = ({ datasourceUid }: Props) => {
locationService.push(`/dashboard/template?${params.toString()}`);
};
-  const onPreviewCommunityDashboard = (dashboard: GnetDashboard) => {
-    if (!datasourceUid) {
-      return;
-    }
+  const [{ error: isPreviewCommunityDashboardError }, onPreviewCommunityDashboard] = useAsyncFn(
+    async (dashboard: GnetDashboard) => {
+      if (!datasourceUid) {
+        return;
+      }
-    const ds = getDataSourceSrv().getInstanceSettings(datasourceUid);
-    if (!ds) {
-      return;
-    }
+      const ds = getDataSourceSrv().getInstanceSettings(datasourceUid);
+      if (!ds) {
+        return;
+      }
-    // Track item click
-    DashboardLibraryInteractions.itemClicked({
-      contentKind: CONTENT_KINDS.COMMUNITY_DASHBOARD,
-      datasourceTypes: [ds.type],
-      libraryItemId: String(dashboard.id),
-      libraryItemTitle: dashboard.name,
-      sourceEntryPoint: SOURCE_ENTRY_POINTS.DATASOURCE_PAGE,
-      eventLocation: EVENT_LOCATIONS.EMPTY_DASHBOARD,
-      discoveryMethod: DISCOVERY_METHODS.BROWSE,
-    });
+      // Track item click
+      DashboardLibraryInteractions.itemClicked({
+        contentKind: CONTENT_KINDS.COMMUNITY_DASHBOARD,
+        datasourceTypes: [ds.type],
+        libraryItemId: String(dashboard.id),
+        libraryItemTitle: dashboard.name,
+        sourceEntryPoint: SOURCE_ENTRY_POINTS.DATASOURCE_PAGE,
+        eventLocation: EVENT_LOCATIONS.EMPTY_DASHBOARD,
+        discoveryMethod: DISCOVERY_METHODS.BROWSE,
+      });
-    onUseCommunityDashboard({
-      dashboard,
-      datasourceUid,
-      datasourceType: ds.type,
-      eventLocation: EVENT_LOCATIONS.EMPTY_DASHBOARD,
-      onShowMapping: onShowMapping,
-    });
-  };
+      await onUseCommunityDashboard({
+        dashboard,
+        datasourceUid,
+        datasourceType: ds.type,
+        eventLocation: EVENT_LOCATIONS.EMPTY_DASHBOARD,
+        onShowMapping: onShowMapping,
+      });
+    },
+    [datasourceUid, onShowMapping]
+  );
  // Render nothing once loading has finished and no dashboards were found
if (!loading && (!result || result.dashboards.length === 0)) {
@@ -297,7 +302,16 @@ export const SuggestedDashboards = ({ datasourceUid }: Props) => {
</Button>
)}
</div>
{isPreviewCommunityDashboardError && (
<div>
<Alert
title={t('dashboard-library.community-error-title', 'Error loading community dashboard')}
severity="error"
>
<Trans i18nKey="dashboard-library.community-error-description">Failed to load community dashboard.</Trans>
</Alert>
</div>
)}
<Grid
gap={4}
columns={{
@@ -0,0 +1,101 @@
import { screen } from '@testing-library/react';
import { render } from 'test/test-utils';
import { DashboardJson } from 'app/features/manage-dashboards/types';
import { SuggestedDashboardsModal } from './SuggestedDashboardsModal';
import { CONTENT_KINDS, EVENT_LOCATIONS } from './interactions';
jest.mock('./DashboardLibrarySection', () => ({
DashboardLibrarySection: () => <div data-testid="dashboard-library-section">Dashboard Library Section</div>,
}));
jest.mock('./CommunityDashboardSection', () => ({
CommunityDashboardSection: () => <div data-testid="community-dashboard-section">Community Dashboard Section</div>,
}));
jest.mock('./CommunityDashboardMappingForm', () => ({
CommunityDashboardMappingForm: () => (
<div data-testid="community-dashboard-mapping-form">Community Dashboard Mapping Form</div>
),
}));
describe('SuggestedDashboardsModal', () => {
const defaultProps = {
isOpen: true,
onDismiss: jest.fn(),
};
beforeEach(() => {
jest.clearAllMocks();
});
it('should render when isOpen is true', () => {
render(<SuggestedDashboardsModal {...defaultProps} />);
expect(screen.getByRole('dialog')).toBeInTheDocument();
});
it('should not render when isOpen is false', () => {
render(<SuggestedDashboardsModal {...defaultProps} isOpen={false} />);
expect(screen.queryByRole('dialog')).not.toBeInTheDocument();
});
it('should render both tabs: Data-source provided and Community', () => {
render(<SuggestedDashboardsModal {...defaultProps} />);
expect(screen.getByRole('tab', { name: 'Data-source provided' })).toBeInTheDocument();
expect(screen.getByRole('tab', { name: 'Community' })).toBeInTheDocument();
});
it('should render tablist with both tabs', () => {
render(<SuggestedDashboardsModal {...defaultProps} />);
const tablist = screen.getByRole('tablist');
expect(tablist).toBeInTheDocument();
const tabs = screen.getAllByRole('tab');
expect(tabs).toHaveLength(2);
expect(tabs[0]).toHaveTextContent('Data-source provided');
expect(tabs[1]).toHaveTextContent('Community');
});
it('should render DashboardLibrarySection when activeView is datasource', () => {
render(<SuggestedDashboardsModal {...defaultProps} defaultTab="datasource" />);
expect(screen.getByTestId('dashboard-library-section')).toBeInTheDocument();
expect(screen.queryByTestId('community-dashboard-section')).not.toBeInTheDocument();
expect(screen.queryByTestId('community-dashboard-mapping-form')).not.toBeInTheDocument();
});
it('should render CommunityDashboardSection when activeView is community', () => {
render(<SuggestedDashboardsModal {...defaultProps} defaultTab="community" />);
expect(screen.getByTestId('community-dashboard-section')).toBeInTheDocument();
expect(screen.queryByTestId('dashboard-library-section')).not.toBeInTheDocument();
expect(screen.queryByTestId('community-dashboard-mapping-form')).not.toBeInTheDocument();
});
it('should render CommunityDashboardMappingForm when activeView is mapping', () => {
render(
<SuggestedDashboardsModal
{...defaultProps}
initialMappingContext={{
dashboardName: 'Test Dashboard',
dashboardJson: { title: 'Test Dashboard', panels: [], schemaVersion: 41 } as DashboardJson,
unmappedDsInputs: [],
constantInputs: [],
existingMappings: [],
onInterpolateAndNavigate: jest.fn(),
eventLocation: EVENT_LOCATIONS.MODAL_COMMUNITY_TAB,
contentKind: CONTENT_KINDS.COMMUNITY_DASHBOARD,
}}
/>
);
expect(screen.getByTestId('community-dashboard-mapping-form')).toBeInTheDocument();
expect(screen.queryByTestId('dashboard-library-section')).not.toBeInTheDocument();
expect(screen.queryByTestId('community-dashboard-section')).not.toBeInTheDocument();
});
});
@@ -3,6 +3,7 @@ import { DashboardJson } from 'app/features/manage-dashboards/types';
import { PluginDashboard } from 'app/types/plugins';
import { GnetDashboard } from '../types';
import { createMockGnetDashboard, createMockPluginDashboard } from '../utils/test-utils';
import {
fetchCommunityDashboard,
@@ -14,8 +15,16 @@ import {
jest.mock('@grafana/runtime', () => ({
getBackendSrv: jest.fn(),
reportInteraction: jest.fn(),
}));
jest.mock('../interactions', () => ({
...jest.requireActual('../interactions'),
DashboardLibraryInteractions: {
...jest.requireActual('../interactions').DashboardLibraryInteractions,
communityDashboardFiltered: jest.fn(),
},
}));
const mockGetBackendSrv = getBackendSrv as jest.MockedFunction<typeof getBackendSrv>;
// Helper to create mock BackendSrv
@@ -26,31 +35,9 @@ const createMockBackendSrv = (overrides: Partial<BackendSrv> = {}): BackendSrv =
}) as unknown as BackendSrv;
-// Helper functions for creating mock objects
-const createMockGnetDashboard = (overrides: Partial<GnetDashboard> = {}): GnetDashboard => ({
-  id: 1,
-  name: 'Test Dashboard',
-  description: 'Test Description',
-  downloads: 100,
-  datasource: 'Prometheus',
-  ...overrides,
-});
-const createMockPluginDashboard = (overrides: Partial<PluginDashboard> = {}): PluginDashboard => ({
-  dashboardId: 1,
-  uid: 'dash-uid',
-  title: 'Test Dashboard',
-  pluginId: 'prometheus',
-  path: 'dashboards/test.json',
-  description: 'Test plugin dashboard',
-  imported: false,
-  importedRevision: 0,
-  importedUri: '',
-  importedUrl: '',
-  removed: false,
-  revision: 1,
-  slug: 'test-dashboard',
-  ...overrides,
-});
+const createMockGnetDashboardWithDownloads = (overrides: Partial<GnetDashboard> = {}): GnetDashboard => {
+  return createMockGnetDashboard({ ...overrides, downloads: 10000 });
+};
const defaultFetchParams: FetchCommunityDashboardsParams = {
orderBy: 'downloads',
@@ -80,8 +67,54 @@ describe('dashboardLibraryApi', () => {
});
describe('fetchCommunityDashboards', () => {
describe('filterNotSafeDashboards', () => {
it('should filter out dashboards with panel types that can contain JavaScript code', async () => {
const safeDashboard = createMockGnetDashboardWithDownloads({ id: 1 });
const mockDashboards = [
safeDashboard,
createMockGnetDashboardWithDownloads({ id: 2, panelTypeSlugs: ['ae3e-plotly-panel'] }),
];
const mockResponse = {
page: 1,
pages: 5,
items: mockDashboards,
};
mockGet.mockResolvedValue(mockResponse);
const result = await fetchCommunityDashboards(defaultFetchParams);
expect(result).toEqual({
page: 1,
pages: 5,
items: [safeDashboard],
});
});
it('should filter out dashboards with low downloads', async () => {
const safeDashboard = createMockGnetDashboardWithDownloads({ id: 1 });
const mockDashboards = [safeDashboard, createMockGnetDashboard({ id: 2, downloads: 999 })];
const mockResponse = {
page: 1,
pages: 5,
items: mockDashboards,
};
mockGet.mockResolvedValue(mockResponse);
const result = await fetchCommunityDashboards(defaultFetchParams);
expect(result).toEqual({
page: 1,
pages: 5,
items: [safeDashboard],
});
});
});
it('should fetch community dashboards with correct query parameters', async () => {
const mockDashboards = [createMockGnetDashboard({ id: 1 }), createMockGnetDashboard({ id: 2 })];
const mockDashboards = [
createMockGnetDashboardWithDownloads({ id: 1 }),
createMockGnetDashboardWithDownloads({ id: 2 }),
];
const mockResponse = {
page: 1,
pages: 5,
@@ -93,7 +126,7 @@ describe('dashboardLibraryApi', () => {
const result = await fetchCommunityDashboards(defaultFetchParams);
expect(mockGet).toHaveBeenCalledWith(
-        '/api/gnet/dashboards?orderBy=downloads&direction=desc&page=1&pageSize=10&includeLogo=1&includeScreenshots=true',
+        '/api/gnet/dashboards?orderBy=downloads&direction=desc&page=1&pageSize=10&includeLogo=1&includeScreenshots=true&includePanelTypeSlugs=true',
undefined,
undefined,
{ showErrorAlert: false }
@@ -154,7 +187,7 @@ describe('dashboardLibraryApi', () => {
});
it('should use fallback values when page/pages are missing', async () => {
-      const items = [createMockGnetDashboard()];
+      const items = [createMockGnetDashboardWithDownloads()];
mockGet.mockResolvedValue({
items,
@@ -2,7 +2,35 @@ import { getBackendSrv } from '@grafana/runtime';
import { DashboardJson } from 'app/features/manage-dashboards/types';
import { PluginDashboard } from 'app/types/plugins';
-import { GnetDashboardsResponse, Link } from '../types';
+import { GnetDashboard, GnetDashboardsResponse, Link } from '../types';
/**
* Panel types that are known to allow JavaScript code execution.
* These panels are filtered out due to security concerns.
*/
const UNSAFE_PANEL_TYPE_SLUGS = [
'aceiot-svg-panel',
'ae3e-plotly-panel',
'gapit-htmlgraphics-panel',
'marcusolsson-dynamictext-panel',
'volkovlabs-echarts-panel',
'volkovlabs-form-panel',
];
/**
* Minimum number of downloads required for a community dashboard to be shown as a suggestion.
*
* Rationale:
* - Dashboards with higher download counts have been vetted by a larger community
* - This acts as a heuristic for quality and trustworthiness
* - Reduces risk of malicious or poorly-maintained dashboards
*
* Trade-offs:
* - May filter out legitimate but less popular dashboards
* - Newer dashboards with good content but low download counts won't be shown
* - The threshold of 10,000 is somewhat arbitrary and may need tuning based on ecosystem growth
*/
const MIN_DOWNLOADS_FILTER = 10000;
/**
* Parameters for fetching community dashboards from Grafana.com
@@ -56,6 +84,7 @@ export async function fetchCommunityDashboards(
pageSize: params.pageSize.toString(),
includeLogo: params.includeLogo ? '1' : '0',
includeScreenshots: params.includeScreenshots ? 'true' : 'false',
includePanelTypeSlugs: 'true',
});
if (params.dataSourceSlugIn) {
@@ -69,13 +98,13 @@ export async function fetchCommunityDashboards(
showErrorAlert: false,
});
  // Grafana.com API returns format: { page: number, pages: number, items: GnetDashboard[] }
  // We keep that shape, filtering out unsafe or low-download dashboards before returning
  if (result && Array.isArray(result.items)) {
+    const dashboards = filterNonSafeDashboards(result.items);
    return {
      page: result.page || params.page,
      pages: result.pages || 1,
-      items: result.items,
+      items: dashboards,
    };
};
}
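The request above serializes its parameters with `URLSearchParams`, which is what produces the exact query string the updated test asserts on. A sketch of that serialization, assuming the parameter names and order shown in the diff (`buildGnetQuery` is an illustrative helper, not an export of this module):

```typescript
// Sketch: build the /api/gnet/dashboards query string the way
// fetchCommunityDashboards does. URLSearchParams preserves insertion order.
function buildGnetQuery(params: {
  orderBy: string;
  direction: string;
  page: number;
  pageSize: number;
  includeLogo: boolean;
  includeScreenshots: boolean;
  dataSourceSlugIn?: string;
}): string {
  const qs = new URLSearchParams({
    orderBy: params.orderBy,
    direction: params.direction,
    page: params.page.toString(),
    pageSize: params.pageSize.toString(),
    includeLogo: params.includeLogo ? '1' : '0',
    includeScreenshots: params.includeScreenshots ? 'true' : 'false',
    // Always request panel type slugs so the safety filter can inspect them.
    includePanelTypeSlugs: 'true',
  });
  if (params.dataSourceSlugIn) {
    qs.set('dataSourceSlugIn', params.dataSourceSlugIn);
  }
  return qs.toString();
}
```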
@@ -109,3 +138,20 @@ export async function fetchProvisionedDashboards(datasourceType: string): Promis
return [];
}
}
// Drop dashboards that use panel types able to embed JavaScript, and dashboards
// with fewer than MIN_DOWNLOADS_FILTER downloads.
// Results are already ordered by download count.
const filterNonSafeDashboards = (dashboards: GnetDashboard[]): GnetDashboard[] => {
return dashboards.filter((item: GnetDashboard) => {
const hasUnsafePanelTypes = item.panelTypeSlugs?.some((slug: string) => UNSAFE_PANEL_TYPE_SLUGS.includes(slug));
const hasLowDownloads = typeof item.downloads === 'number' && item.downloads < MIN_DOWNLOADS_FILTER;
if (hasUnsafePanelTypes || hasLowDownloads) {
console.warn(
`Community dashboard ${item.id} "${item.name}" filtered out: download count ${item.downloads} is below the threshold, or it uses panel types (${item.panelTypeSlugs?.join(', ')}) that can embed JavaScript`
);
return false;
}
return true;
});
};
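In isolation, the combined download-count and panel-type filter behaves as sketched below (simplified types and made-up sample data for illustration, not the real `GnetDashboard` shape):

```typescript
interface Dash {
  id: number;
  downloads?: number;
  panelTypeSlugs?: string[];
}

// Subset of the unsafe slugs and the same threshold as above.
const UNSAFE_SLUGS = ['volkovlabs-echarts-panel', 'marcusolsson-dynamictext-panel'];
const MIN_DOWNLOADS = 10000;

// Mirrors filterNonSafeDashboards: drop anything with an unsafe panel type
// or a download count below the threshold.
function filterUnsafe(dashboards: Dash[]): Dash[] {
  return dashboards.filter((d) => {
    const unsafePanels = d.panelTypeSlugs?.some((s) => UNSAFE_SLUGS.includes(s)) ?? false;
    const lowDownloads = typeof d.downloads === 'number' && d.downloads < MIN_DOWNLOADS;
    return !unsafePanels && !lowDownloads;
  });
}

const sample: Dash[] = [
  { id: 1, downloads: 50000 },
  { id: 2, downloads: 50000, panelTypeSlugs: ['volkovlabs-echarts-panel'] },
  { id: 3, downloads: 12 },
];
console.log(filterUnsafe(sample).map((d) => d.id)); // [1]
```

Note that a dashboard with no `downloads` field at all passes the threshold check; only a numeric value below the cutoff is rejected.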
@@ -8,6 +8,7 @@ export const EVENT_LOCATIONS = {
MODAL_PROVISIONED_TAB: 'suggested_dashboards_modal_provisioned_tab',
MODAL_COMMUNITY_TAB: 'suggested_dashboards_modal_community_tab',
BROWSE_DASHBOARDS_PAGE: 'browse_dashboards_page',
COMMUNITY_DASHBOARD_LOADED: 'community_dashboard_loaded',
} as const;
export const CONTENT_KINDS = {
@@ -24,6 +24,7 @@ export interface GnetDashboard {
id: number;
name: string;
description: string;
slug: string;
downloads: number;
datasource: string;
screenshots?: Screenshot[];
@@ -38,6 +39,7 @@ export interface GnetDashboard {
orgSlug?: string;
userId?: number;
userName?: string;
panelTypeSlugs?: string[];
}
export interface GnetDashboardsResponse {
@@ -11,7 +11,6 @@ import { InputMapping, tryAutoMapDatasources, parseConstantInputs } from './auto
import {
buildDashboardDetails,
buildGrafanaComUrl,
getLogoUrl,
navigateToTemplate,
onUseCommunityDashboard,
@@ -27,6 +26,14 @@ jest.mock('./autoMapDatasources', () => ({
parseConstantInputs: jest.fn(),
}));
jest.mock('../interactions', () => ({
...jest.requireActual('../interactions'),
DashboardLibraryInteractions: {
...jest.requireActual('../interactions').DashboardLibraryInteractions,
communityDashboardFiltered: jest.fn(),
},
}));
// Mock function references
const mockFetchCommunityDashboard = fetchCommunityDashboard as jest.MockedFunction<typeof fetchCommunityDashboard>;
const mockTryAutoMapDatasources = tryAutoMapDatasources as jest.MockedFunction<typeof tryAutoMapDatasources>;
@@ -43,6 +50,7 @@ const createMockGnetDashboard = (overrides: Partial<GnetDashboard> = {}): GnetDa
publishedAt: '',
updatedAt: '2025-11-05T16:55:41.000Z',
downloads: 0,
slug: 'test-dashboard',
...overrides,
});
@@ -61,25 +69,11 @@ const createMockDashboardJson = (overrides: Partial<DashboardJson> = {}): Dashbo
}) as DashboardJson;
describe('communityDashboardHelpers', () => {
describe('buildGrafanaComUrl', () => {
it('should build a valid URL', () => {
const gnetDashboard = createMockGnetDashboard({
id: 1,
name: 'Test',
slug: 'test',
});
expect(buildGrafanaComUrl(gnetDashboard)).toBe('https://grafana.com/grafana/dashboards/1-test/');
@@ -91,6 +85,7 @@ describe('communityDashboardHelpers', () => {
const gnetDashboard = createMockGnetDashboard({
id: 1,
name: 'Test',
slug: 'test',
datasource: 'Test',
orgName: 'Org',
updatedAt: '2025-11-05T16:55:41.000Z',
@@ -170,6 +165,10 @@ describe('communityDashboardHelpers', () => {
});
describe('onUseCommunityDashboard', () => {
let consoleWarnSpy: jest.SpyInstance;
let consoleErrorSpy: jest.SpyInstance;
let locationServicePushSpy: jest.SpyInstance;
async function setup(options?: {
dashboard?: Partial<GnetDashboard>;
dashboardJson?: Partial<DashboardJson>;
@@ -206,7 +205,16 @@ describe('communityDashboardHelpers', () => {
}
beforeEach(() => {
consoleWarnSpy = jest.spyOn(console, 'warn').mockImplementation();
consoleErrorSpy = jest.spyOn(console, 'error').mockImplementation();
locationServicePushSpy = jest.spyOn(locationService, 'push').mockImplementation();
});
afterEach(() => {
jest.clearAllMocks();
consoleWarnSpy.mockRestore();
consoleErrorSpy.mockRestore();
locationServicePushSpy.mockRestore();
});
it('should navigate directly when all datasources are auto-mapped and no constants', async () => {
@@ -218,8 +226,8 @@ describe('communityDashboardHelpers', () => {
},
});
expect(locationServicePushSpy).toHaveBeenCalled();
expect(locationServicePushSpy).toHaveBeenCalledWith(
expect.objectContaining({
pathname: expect.any(String),
search: expect.stringContaining('gnetId=123'),
@@ -249,7 +257,7 @@ describe('communityDashboardHelpers', () => {
});
expect(mockOnShowMapping).toHaveBeenCalled();
expect(locationServicePushSpy).not.toHaveBeenCalled();
expect(mockOnShowMapping).toHaveBeenCalledWith(
expect.objectContaining({
dashboardName: 'Test Dashboard',
@@ -281,7 +289,7 @@ describe('communityDashboardHelpers', () => {
});
expect(mockOnShowMapping).toHaveBeenCalled();
expect(locationServicePushSpy).not.toHaveBeenCalled();
expect(mockOnShowMapping).toHaveBeenCalledWith(
expect.objectContaining({
dashboardName: 'Test Dashboard',
@@ -294,17 +302,312 @@ describe('communityDashboardHelpers', () => {
mockFetchCommunityDashboard.mockRejectedValue(new Error('API failed'));
await expect(
onUseCommunityDashboard({
dashboard: createMockGnetDashboard(),
datasourceUid: 'test-ds-uid',
datasourceType: 'prometheus',
eventLocation: 'empty_dashboard',
})
).rejects.toThrow('API failed');
expect(consoleErrorSpy).toHaveBeenCalledWith('Error loading community dashboard:', expect.any(Error));
expect(locationServicePushSpy).not.toHaveBeenCalled();
});
describe('when the dashboard contains JavaScript code', () => {
it('should throw an error if the dashboard contains JavaScript code in options', async () => {
const dashboardJson = createMockDashboardJson({
// eslint-disable-next-line @typescript-eslint/no-explicit-any
panels: [{ type: 'panel', options: { template: '{{ javascript:alert("XSS") }}' } } as any],
});
await expect(setup({ dashboardJson })).rejects.toThrow(
'Community dashboard 123 "Test Dashboard" might contain JavaScript code'
);
expect(consoleErrorSpy).toHaveBeenCalledWith('Error loading community dashboard:', expect.any(Error));
expect(locationServicePushSpy).not.toHaveBeenCalled();
});
it('should throw an error if the dashboard contains JavaScript code in targets/queries', async () => {
const dashboardJson = createMockDashboardJson({
panels: [
{
type: 'panel',
options: {},
targets: [
{
expr: 'function() { return bad(); }',
refId: 'A',
},
],
// eslint-disable-next-line @typescript-eslint/no-explicit-any
} as any,
],
});
await expect(setup({ dashboardJson })).rejects.toThrow(
'Community dashboard 123 "Test Dashboard" might contain JavaScript code'
);
expect(consoleErrorSpy).toHaveBeenCalledWith('Error loading community dashboard:', expect.any(Error));
expect(locationServicePushSpy).not.toHaveBeenCalled();
});
it('should throw an error if the dashboard contains JavaScript code in transformations', async () => {
const dashboardJson = createMockDashboardJson({
panels: [
{
type: 'panel',
options: {},
transformations: [
{
id: 'calculateField',
options: {
mode: 'binary',
binary: {
reducer: 'sum',
left: 'A',
right: 'B',
},
replaceFields: false,
alias: 'function() { alert("XSS"); }',
},
},
],
// eslint-disable-next-line @typescript-eslint/no-explicit-any
} as any,
],
});
await expect(setup({ dashboardJson })).rejects.toThrow(
'Community dashboard 123 "Test Dashboard" might contain JavaScript code'
);
expect(consoleErrorSpy).toHaveBeenCalledWith('Error loading community dashboard:', expect.any(Error));
expect(locationServicePushSpy).not.toHaveBeenCalled();
});
it('should throw an error if the dashboard contains JavaScript code in fieldConfig', async () => {
const dashboardJson = createMockDashboardJson({
panels: [
{
type: 'panel',
options: {},
fieldConfig: {
defaults: {
custom: {
displayMode: 'function() { return "bad"; }',
},
},
overrides: [],
},
// eslint-disable-next-line @typescript-eslint/no-explicit-any
} as any,
],
});
await expect(setup({ dashboardJson })).rejects.toThrow(
'Community dashboard 123 "Test Dashboard" might contain JavaScript code'
);
expect(consoleErrorSpy).toHaveBeenCalledWith('Error loading community dashboard:', expect.any(Error));
expect(locationServicePushSpy).not.toHaveBeenCalled();
});
it('should throw an error if the dashboard contains javascript: URLs in links', async () => {
const dashboardJson = createMockDashboardJson({
panels: [
{
type: 'panel',
options: {},
links: [
{
title: 'Bad Link',
url: 'javascript:alert("XSS")',
targetBlank: false,
},
],
// eslint-disable-next-line @typescript-eslint/no-explicit-any
} as any,
],
});
await expect(setup({ dashboardJson })).rejects.toThrow(
'Community dashboard 123 "Test Dashboard" might contain JavaScript code'
);
expect(consoleErrorSpy).toHaveBeenCalledWith('Error loading community dashboard:', expect.any(Error));
expect(locationServicePushSpy).not.toHaveBeenCalled();
});
it('should throw an error if the dashboard contains <script> tags in any property', async () => {
const dashboardJson = createMockDashboardJson({
panels: [
{
type: 'panel',
options: {
content: '<script>alert("XSS")</script>',
},
// eslint-disable-next-line @typescript-eslint/no-explicit-any
} as any,
],
});
await expect(setup({ dashboardJson })).rejects.toThrow(
'Community dashboard 123 "Test Dashboard" might contain JavaScript code'
);
expect(consoleErrorSpy).toHaveBeenCalledWith('Error loading community dashboard:', expect.any(Error));
expect(locationServicePushSpy).not.toHaveBeenCalled();
});
it('should throw an error if the dashboard contains arrow functions', async () => {
const dashboardJson = createMockDashboardJson({
panels: [
{
type: 'panel',
options: {
customCode: '() => { alert("XSS"); }',
},
// eslint-disable-next-line @typescript-eslint/no-explicit-any
} as any,
],
});
await expect(setup({ dashboardJson })).rejects.toThrow(
'Community dashboard 123 "Test Dashboard" might contain JavaScript code'
);
expect(consoleErrorSpy).toHaveBeenCalledWith('Error loading community dashboard:', expect.any(Error));
expect(locationServicePushSpy).not.toHaveBeenCalled();
});
it('should throw an error if the dashboard contains setTimeout or setInterval', async () => {
const dashboardJson = createMockDashboardJson({
panels: [
{
type: 'panel',
options: {
handler: 'setTimeout(() => alert("XSS"), 1000)',
},
// eslint-disable-next-line @typescript-eslint/no-explicit-any
} as any,
],
});
await expect(setup({ dashboardJson })).rejects.toThrow(
'Community dashboard 123 "Test Dashboard" might contain JavaScript code'
);
expect(consoleErrorSpy).toHaveBeenCalledWith('Error loading community dashboard:', expect.any(Error));
expect(locationServicePushSpy).not.toHaveBeenCalled();
});
it('should throw an error if the dashboard contains suspicious key names like beforeRender', async () => {
const dashboardJson = createMockDashboardJson({
panels: [
{
type: 'panel',
options: {},
beforeRender: 'alert("XSS")',
// eslint-disable-next-line @typescript-eslint/no-explicit-any
} as any,
],
});
await expect(setup({ dashboardJson })).rejects.toThrow(
'Community dashboard 123 "Test Dashboard" might contain JavaScript code'
);
expect(consoleErrorSpy).toHaveBeenCalledWith('Error loading community dashboard:', expect.any(Error));
expect(locationServicePushSpy).not.toHaveBeenCalled();
});
it('should throw an error if the dashboard contains suspicious key names like afterRender', async () => {
const dashboardJson = createMockDashboardJson({
panels: [
{
type: 'panel',
options: {},
afterRender: 'alert("XSS")',
// eslint-disable-next-line @typescript-eslint/no-explicit-any
} as any,
],
});
await expect(setup({ dashboardJson })).rejects.toThrow(
'Community dashboard 123 "Test Dashboard" might contain JavaScript code'
);
expect(consoleErrorSpy).toHaveBeenCalledWith('Error loading community dashboard:', expect.any(Error));
expect(locationServicePushSpy).not.toHaveBeenCalled();
});
it('should throw an error if the dashboard contains suspicious key names like handler', async () => {
const dashboardJson = createMockDashboardJson({
panels: [
{
type: 'panel',
options: {},
handler: 'alert("XSS")',
// eslint-disable-next-line @typescript-eslint/no-explicit-any
} as any,
],
});
await expect(setup({ dashboardJson })).rejects.toThrow(
'Community dashboard 123 "Test Dashboard" might contain JavaScript code'
);
expect(consoleErrorSpy).toHaveBeenCalledWith('Error loading community dashboard:', expect.any(Error));
expect(locationServicePushSpy).not.toHaveBeenCalled();
});
it('should throw an error if the dashboard contains return statements', async () => {
const dashboardJson = createMockDashboardJson({
panels: [
{
type: 'panel',
options: {
customLogic: 'function test() { return malicious(); }',
},
// eslint-disable-next-line @typescript-eslint/no-explicit-any
} as any,
],
});
await expect(setup({ dashboardJson })).rejects.toThrow(
'Community dashboard 123 "Test Dashboard" might contain JavaScript code'
);
expect(consoleErrorSpy).toHaveBeenCalledWith('Error loading community dashboard:', expect.any(Error));
expect(locationServicePushSpy).not.toHaveBeenCalled();
});
it('should throw an error if the dashboard contains event handlers like onclick', async () => {
const dashboardJson = createMockDashboardJson({
panels: [
{
type: 'panel',
options: {
html: '<div onclick="alert(\'XSS\')">Click me</div>',
},
// eslint-disable-next-line @typescript-eslint/no-explicit-any
} as any,
],
});
await expect(setup({ dashboardJson })).rejects.toThrow(
'Community dashboard 123 "Test Dashboard" might contain JavaScript code'
);
expect(consoleErrorSpy).toHaveBeenCalledWith('Error loading community dashboard:', expect.any(Error));
expect(locationServicePushSpy).not.toHaveBeenCalled();
});
});
});
});
@@ -1,5 +1,11 @@
import { PanelModel } from '@grafana/data';
import { t } from '@grafana/i18n';
import { locationService } from '@grafana/runtime';
import { notifyApp } from 'app/core/actions';
import { createErrorNotification } from 'app/core/copy/appNotification';
import { DataSourceInput } from 'app/features/manage-dashboards/state/reducers';
import { DashboardJson } from 'app/features/manage-dashboards/types';
import { dispatch } from 'app/types/store';
import { DASHBOARD_LIBRARY_ROUTES } from '../../types';
import { MappingContext } from '../SuggestedDashboardsModal';
@@ -9,6 +15,12 @@ import { GnetDashboard, Link } from '../types';
import { InputMapping, tryAutoMapDatasources, parseConstantInputs, isDataSourceInput } from './autoMapDatasources';
// Constants for community dashboard pagination and API params
// We want the 6 most-downloaded dashboards, but we first query 12
// to be sure the filters we then apply do not reduce the list below 6
export const COMMUNITY_PAGE_SIZE_QUERY = 12;
export const COMMUNITY_RESULT_SIZE = 6;
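The over-fetch pattern described in the comment above can be sketched in isolation (a minimal, self-contained sketch; the helper name is made up for illustration):

```typescript
const PAGE_SIZE_QUERY = 12; // how many items we request from the API
const RESULT_SIZE = 6; // how many items we actually display

// Apply a post-fetch filter, then trim to the display size. Querying 12
// means the filter can drop up to 6 items before the UI runs short.
function filterAndTrim<T>(fetched: T[], keep: (item: T) => boolean): T[] {
  return fetched.filter(keep).slice(0, RESULT_SIZE);
}

const fetched = Array.from({ length: PAGE_SIZE_QUERY }, (_, i) => i); // [0..11]
const shown = filterAndTrim(fetched, (n) => n % 2 === 0); // filter drops half
console.log(shown); // [0, 2, 4, 6, 8, 10]
```

If the filter removes more than half of the fetched items, the UI still degrades gracefully: it simply shows fewer than 6 suggestions.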
/**
* Extract thumbnail URL from dashboard screenshots
*/
@@ -39,21 +51,11 @@ export function formatDate(dateString?: string): string {
return date.toLocaleDateString(undefined, { year: 'numeric', month: 'short', day: 'numeric' });
}
/**
* Build Grafana.com URL for a dashboard
*/
export function buildGrafanaComUrl(dashboard: GnetDashboard): string {
return `https://grafana.com/grafana/dashboards/${dashboard.id}-${dashboard.slug}/`;
}
/**
@@ -121,12 +123,110 @@ interface UseCommunityDashboardParams {
onShowMapping?: (context: MappingContext) => void;
}
/**
* Check if a panel contains JavaScript code using heuristic pattern matching.
*
* IMPORTANT: This is a heuristic-based detection, not a perfect mechanism.
*
* Patterns checked:
* - HTML/Script tags: Direct XSS attack vectors
* - Event handlers: Common JS injection points (onclick, onload, etc.)
* - Function declarations: Actual executable code patterns
* - eval/Function constructor: Dynamic code execution
* - setTimeout/setInterval: Deferred code execution
*
* What we DON'T check:
* - Panel title and description are excluded (already sanitized by Grafana's rendering layer)
* - Only the panel's options and configuration are scanned
*
* @param panel - The panel model to check
* @returns true if the panel might contain JavaScript code, false otherwise
*/
function canPanelContainJS(panel: PanelModel): boolean {
// Create a copy of the panel without title and description, as they are already sanitized
// This reduces false positives while still checking all other properties for JavaScript code
const { title, description, ...panelWithoutSanitizedFields } = panel;
let panelJson: string;
try {
panelJson = JSON.stringify(panelWithoutSanitizedFields);
} catch (e) {
console.warn('Failed to stringify panel', e);
return true;
}
// Patterns that indicate actual JavaScript code in values
const valuePatterns = [
/<script\b/i, // HTML script tags
/\bon\w+\s*=\s*/i, // HTML event handlers: onclick=, onload=, etc.
/\bjavascript\s*:/i,
/\bfunction\s*\(/, // Anonymous function declarations: function(
/\bfunction\s+[\w$]+\s*\(/, // Named function declarations: function name(
/=>\s*\{[^}]*\breturn\b/, // Arrow function with return statement: () => { return ... }
/\beval\s*\(/i, // eval() calls
/\bnew\s+Function\s*\(/i, // new Function() constructor
/\bsetTimeout\s*\(/i, // setTimeout calls
/\bsetInterval\s*\(/i, // setInterval calls
];
// Patterns for suspicious JSON keys that might indicate JS hooks
const keyPatterns = [
/"on[a-zA-Z]+"\s*:/, // Event handlers as keys (both camelCase and lowercase): "onClick": or "onclick":
/"beforeRender"\s*:/i, // beforeRender hook as JSON key
/"afterRender"\s*:/i, // afterRender hook as JSON key
/"javascript"\s*:/i, // "javascript" as a key
/"customCode"\s*:/i, // Common pattern for custom code injection
/"script"\s*:/i, // "script" as a JSON key
/"handler"\s*:/i, // "handler" as a JSON key - common for event handlers
];
const hasSuspiciousValue = valuePatterns.some((pattern) => {
if (pattern.test(panelJson)) {
console.warn('Panel contains JavaScript code in value');
return true;
}
return false;
});
const hasSuspiciousKey = keyPatterns.some((pattern) => {
if (pattern.test(panelJson)) {
console.warn('Panel contains JavaScript code in key');
return true;
}
return false;
});
return hasSuspiciousValue || hasSuspiciousKey;
}
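As a standalone illustration, a couple of the same heuristics applied to stringified panel options (the regexes are copied from the lists above, but this sketch uses only a small subset, not the full pattern set):

```typescript
// Subset of the value patterns: script tags, javascript: URLs, eval() calls.
const valuePatterns = [/<script\b/i, /\bjavascript\s*:/i, /\beval\s*\(/i];
// Subset of the suspicious-key patterns.
const keyPatterns = [/"customCode"\s*:/i, /"handler"\s*:/i];

// Serialize the options and test both pattern lists against the JSON text.
function mightContainJS(options: unknown): boolean {
  const json = JSON.stringify(options);
  return valuePatterns.some((p) => p.test(json)) || keyPatterns.some((p) => p.test(json));
}

console.log(mightContainJS({ content: '<script>alert(1)</script>' })); // true
console.log(mightContainJS({ link: 'javascript:alert(1)' })); // true
console.log(mightContainJS({ content: 'plain markdown text' })); // false
```

Because the check runs over the serialized JSON rather than a parsed AST, it is intentionally coarse: it can produce false positives (e.g. documentation text that mentions `eval(`), which is the accepted trade-off for a security filter.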
function isPanelModel(panel: unknown): panel is PanelModel {
if (!panel || typeof panel !== 'object') {
return false;
}
return 'options' in panel && 'type' in panel;
}
/**
* Check if a dashboard contains JavaScript code. This is a heuristic rather than a perfect check, but it is good enough.
* Used as a second filter after the panel-type filter (see api/dashboardLibraryApi.ts)
*/
const canDashboardContainJS = (dashboard: DashboardJson): boolean => {
return (
  dashboard.panels?.some((panel) => {
    // Skip library panels - they don't have options/type and are already validated
    if (isPanelModel(panel)) {
      return canPanelContainJS(panel);
    }
    return false;
  }) ?? false
);
};
/**
* Handles the flow when a user selects a community dashboard:
* 1. Tracks analytics
* 2. Fetches full dashboard JSON with __inputs
* 3. Rejects dashboards that might contain JavaScript code, for security reasons
* 4. Attempts auto-mapping of datasources
* 5. Either navigates directly or shows mapping form
*/
export async function onUseCommunityDashboard({
dashboard,
@@ -142,6 +242,10 @@ export async function onUseCommunityDashboard({
const fullDashboard = await fetchCommunityDashboard(dashboard.id);
const dashboardJson = fullDashboard.json;
if (canDashboardContainJS(dashboardJson)) {
throw new Error(`Community dashboard ${dashboard.id} "${dashboard.name}" might contain JavaScript code`);
}
// Parse datasource requirements from __inputs
const dsInputs: DataSourceInput[] = dashboardJson.__inputs?.filter(isDataSourceInput) || [];
@@ -199,6 +303,11 @@ export async function onUseCommunityDashboard({
}
} catch (err) {
console.error('Error loading community dashboard:', err);
dispatch(
notifyApp(
createErrorNotification(t('dashboard-library.community-error-title', 'Error loading community dashboard'))
)
);
throw err;
}
}
@@ -0,0 +1,34 @@
import { PluginDashboard } from 'app/types/plugins';
import { GnetDashboard } from '../types';
export const createMockPluginDashboard = (overrides: Partial<PluginDashboard> = {}): PluginDashboard => ({
dashboardId: 1,
uid: 'dash-uid',
title: 'Test Provisioned Dashboard',
description: 'Test plugin dashboard',
path: 'dashboards/test.json',
pluginId: 'prometheus',
imported: false,
importedRevision: 0,
importedUri: '',
importedUrl: '',
removed: false,
revision: 1,
slug: 'test-dashboard',
...overrides,
});
export const createMockGnetDashboard = (overrides: Partial<GnetDashboard> = {}): GnetDashboard => ({
id: 123,
name: 'Test Dashboard',
description: 'Test description',
datasource: 'Prometheus',
orgName: 'Test Org',
userName: 'testuser',
publishedAt: '',
updatedAt: '',
downloads: 0,
slug: 'test-dashboard',
...overrides,
});
@@ -1,13 +1,10 @@
import { isFunction } from 'lodash'; // eslint-disable-line lodash/import-scope
import { AppEvents, dateMath, UrlQueryMap, UrlQueryValue } from '@grafana/data';
import { getBackendSrv, isFetchError, locationService } from '@grafana/runtime';
import { Spec as DashboardV2Spec } from '@grafana/schema/dist/esm/schema/dashboard/v2';
import { backendSrv } from 'app/core/services/backend_srv';
import impressionSrv from 'app/core/services/impression_srv';
import { getDashboardScenePageStateManager } from 'app/features/dashboard-scene/pages/DashboardScenePageStateManager';
import { getDatasourceSrv } from 'app/features/plugins/datasource_srv';
import { DashboardDTO } from 'app/types/dashboard';
@@ -68,7 +65,15 @@ abstract class DashboardLoaderSrvBase<T> implements DashboardLoaderSrvLike<T> {
);
}
private async executeScript(result: any) {
// Lazily load dependencies that only scripted dashboards use, so they stay out of the main bundle when not needed.
// These modules expose their API as a default export, so unwrap the module namespace objects.
const [jQuery, moment, lodash, kbn] = await Promise.all([
  import('jquery').then((m) => m.default),
  import('moment').then((m) => m.default),
  import('lodash').then((m) => m.default),
  import('app/core/utils/kbn').then((m) => m.default),
]);
const services = {
dashboardSrv: getDashboardSrv(),
datasourceSrv: getDatasourceSrv(),
@@ -90,12 +95,12 @@ abstract class DashboardLoaderSrvBase<T> implements DashboardLoaderSrvLike<T> {
locationService.getSearchObject(),
kbn,
dateMath,
lodash,
moment,
window,
document,
jQuery,
jQuery,
services
);
@@ -1612,6 +1612,53 @@ ${buildImageContent(
`;
},
},
smoothing: {
name: 'Smoothing',
getHelperDocs: function (imageRenderType: ImageRenderType = ImageRenderType.ShortcodeFigure) {
return `
Use this transformation to reduce noise in time series data through adaptive smoothing. This transformation creates smoother, cleaner visualizations while preserving all original time points and important trends and patterns in your data.
The smoothing transformation uses the ASAP (Automatic Smoothing for Attention Prioritization) algorithm internally to generate a smoothed curve, which is then interpolated back onto all original time points. This ensures your visualization maintains continuous lines without gaps while reducing noise.
#### Available options
- **Resolution** - Controls smoothing intensity (1-1000). Lower values create more aggressive smoothing, while higher values preserve more detail. The output preserves all original time points.
#### When to use smoothing
This transformation is useful for:
- Noisy time series data that obscures underlying trends
- Clearer trend analysis and pattern recognition
#### Example
Consider noisy sensor data with thousands of points:
**Before smoothing:**
| Time | Temperature |
| ------------------- | ----------- |
| 2020-07-07 10:00:00 | 23.1 |
| 2020-07-07 10:00:01 | 23.3 |
| 2020-07-07 10:00:02 | 22.9 |
| 2020-07-07 10:00:03 | 23.2 |
| ... (thousands more) | ... |
**After smoothing (Resolution: 100):**
| Time | Temperature (smoothed) |
| ------------------- | ---------------------- |
| 2020-07-07 10:00:00 | 23.1 |
| 2020-07-07 10:00:01 | 23.1 |
| 2020-07-07 10:00:02 | 23.0 |
| 2020-07-07 10:00:03 | 23.0 |
| ... (same count) | ... |
The transformation preserves all original time points while reducing noise, resulting in smoother curves that maintain continuous lines without gaps.
`;
},
},
};
function buildImageContent(source: string, imageRenderType: ImageRenderType, imageAltText: string) {
@@ -0,0 +1,72 @@
<svg width="114" height="48" viewBox="0 0 114 48" fill="none" xmlns="http://www.w3.org/2000/svg">
<g clip-path="url(#clip0_smoothing_dark)">
<path d="M0 0.611699V7.64012H7V0H0.601504C0.441975 0 0.28898 0.0644467 0.176176 0.179162C0.0633725 0.293878 0 0.449466 0 0.611699Z" fill="url(#paint0_linear_smoothing_dark)"/>
<path d="M8 0H15V7.64012H8V0Z" fill="url(#paint1_linear_smoothing_dark)"/>
<path d="M16 0H23V7.64012H16V0Z" fill="url(#paint2_linear_smoothing_dark)"/>
<path d="M24 0H31V7.64012H24V0Z" fill="url(#paint3_linear_smoothing_dark)"/>
<path d="M32 0H39V7.64012H32V0Z" fill="url(#paint4_linear_smoothing_dark)"/>
<path d="M40 7.64012H48V0.611699C48 0.449466 47.9366 0.293878 47.8238 0.179162C47.711 0.0644467 47.558 0 47.3985 0H40V7.64012Z" fill="url(#paint5_linear_smoothing_dark)"/>
<path d="M2 38L5 28L8 35L11 25L14 32L17 22L20 30L23 20L26 28L29 24L32 31L35 26L38 33L41 29L44 36L46 32" stroke="#84AFF1" stroke-width="2" fill="none" stroke-linecap="round" stroke-linejoin="round"/>
<path d="M2 38L5 28L8 35L11 25L14 32L17 22L20 30L23 20L26 28L29 24L32 31L35 26L38 33L41 29L44 36L46 32L46 48L2 48Z" fill="url(#paint6_linear_smoothing_dark)" opacity="0.3"/>
</g>
<path d="M57.91 30C58.6 30 64 26 64 24C64 22 58.7 18 57.91 18C57.08 18 56.4 18.5 56.4 19.48C56.4 20.45 59.91 22.92 59.91 22.92C59.91 22.92 52.25 22.25 52 22.92C51.75 23.59 51.75 24.41 52 25.08C52.25 25.75 59.91 25.08 59.91 25.08C59.91 25.08 56.4 27.75 56.4 28.53C56.4 29.31 57.21 30 57.91 30Z" fill="#CCCCDC"/>
<g clip-path="url(#clip1_smoothing_dark)">
<path d="M66 0.611699V7.64012H81V0H66.6015C66.442 0 66.289 0.0644467 66.1762 0.179162C66.0634 0.293878 66 0.449466 66 0.611699Z" fill="url(#paint7_linear_smoothing_dark)"/>
<path d="M82 0H97V7.64012H82V0Z" fill="url(#paint8_linear_smoothing_dark)"/>
<path d="M98 7.64012H114V0.611699C114 0.449466 113.937 0.293878 113.824 0.179162C113.711 0.0644467 113.558 0 113.399 0H98V7.64012Z" fill="url(#paint9_linear_smoothing_dark)"/>
<path d="M68 36Q77 28 86 24Q95 21 112 25" stroke="#84AFF1" stroke-width="2.5" fill="none" stroke-linecap="round"/>
<path d="M68 36Q77 28 86 24Q95 21 112 25L112 48L68 48Z" fill="url(#paint10_linear_smoothing_dark)" opacity="0.3"/>
</g>
<defs>
<linearGradient id="paint0_linear_smoothing_dark" x1="0" y1="3.82" x2="7" y2="3.82" gradientUnits="userSpaceOnUse">
<stop stop-color="#F2CC0C"/>
<stop offset="1" stop-color="#FF9830"/>
</linearGradient>
<linearGradient id="paint1_linear_smoothing_dark" x1="8" y1="3.82" x2="15" y2="3.82" gradientUnits="userSpaceOnUse">
<stop stop-color="#F2CC0C"/>
<stop offset="1" stop-color="#FF9830"/>
</linearGradient>
<linearGradient id="paint2_linear_smoothing_dark" x1="16" y1="3.82" x2="23" y2="3.82" gradientUnits="userSpaceOnUse">
<stop stop-color="#F2CC0C"/>
<stop offset="1" stop-color="#FF9830"/>
</linearGradient>
<linearGradient id="paint3_linear_smoothing_dark" x1="24" y1="3.82" x2="31" y2="3.82" gradientUnits="userSpaceOnUse">
<stop stop-color="#F2CC0C"/>
<stop offset="1" stop-color="#FF9830"/>
</linearGradient>
<linearGradient id="paint4_linear_smoothing_dark" x1="32" y1="3.82" x2="39" y2="3.82" gradientUnits="userSpaceOnUse">
<stop stop-color="#F2CC0C"/>
<stop offset="1" stop-color="#FF9830"/>
</linearGradient>
<linearGradient id="paint5_linear_smoothing_dark" x1="40" y1="3.82" x2="48" y2="3.82" gradientUnits="userSpaceOnUse">
<stop stop-color="#F2CC0C"/>
<stop offset="1" stop-color="#FF9830"/>
</linearGradient>
<linearGradient id="paint6_linear_smoothing_dark" x1="24" y1="48" x2="24" y2="20" gradientUnits="userSpaceOnUse">
<stop stop-color="#1F60C4" stop-opacity="0"/>
<stop offset="1" stop-color="#3865AB"/>
</linearGradient>
<linearGradient id="paint7_linear_smoothing_dark" x1="66" y1="3.82" x2="81" y2="3.82" gradientUnits="userSpaceOnUse">
<stop stop-color="#F2CC0C"/>
<stop offset="1" stop-color="#FF9830"/>
</linearGradient>
<linearGradient id="paint8_linear_smoothing_dark" x1="82" y1="3.82" x2="97" y2="3.82" gradientUnits="userSpaceOnUse">
<stop stop-color="#F2CC0C"/>
<stop offset="1" stop-color="#FF9830"/>
</linearGradient>
<linearGradient id="paint9_linear_smoothing_dark" x1="98" y1="3.82" x2="114" y2="3.82" gradientUnits="userSpaceOnUse">
<stop stop-color="#F2CC0C"/>
<stop offset="1" stop-color="#FF9830"/>
</linearGradient>
<linearGradient id="paint10_linear_smoothing_dark" x1="90" y1="48" x2="90" y2="21" gradientUnits="userSpaceOnUse">
<stop stop-color="#1F60C4" stop-opacity="0"/>
<stop offset="1" stop-color="#3865AB"/>
</linearGradient>
<clipPath id="clip0_smoothing_dark">
<rect width="48" height="48" fill="white"/>
</clipPath>
<clipPath id="clip1_smoothing_dark">
<rect width="48" height="48" fill="white" transform="translate(66)"/>
</clipPath>
</defs>
</svg>


@@ -0,0 +1,72 @@
<svg width="114" height="48" viewBox="0 0 114 48" fill="none" xmlns="http://www.w3.org/2000/svg">
<g clip-path="url(#clip0_smoothing_light)">
<path d="M0 0.611699V7.64012H7V0H0.601504C0.441975 0 0.28898 0.0644467 0.176176 0.179162C0.0633725 0.293878 0 0.449466 0 0.611699Z" fill="url(#paint0_linear_smoothing_light)"/>
<path d="M8 0H15V7.64012H8V0Z" fill="url(#paint1_linear_smoothing_light)"/>
<path d="M16 0H23V7.64012H16V0Z" fill="url(#paint2_linear_smoothing_light)"/>
<path d="M24 0H31V7.64012H24V0Z" fill="url(#paint3_linear_smoothing_light)"/>
<path d="M32 0H39V7.64012H32V0Z" fill="url(#paint4_linear_smoothing_light)"/>
<path d="M40 7.64012H48V0.611699C48 0.449466 47.9366 0.293878 47.8238 0.179162C47.711 0.0644467 47.558 0 47.3985 0H40V7.64012Z" fill="url(#paint5_linear_smoothing_light)"/>
<path d="M2 38L5 28L8 35L11 25L14 32L17 22L20 30L23 20L26 28L29 24L32 31L35 26L38 33L41 29L44 36L46 32" stroke="#84AFF1" stroke-width="2" fill="none" stroke-linecap="round" stroke-linejoin="round"/>
<path d="M2 38L5 28L8 35L11 25L14 32L17 22L20 30L23 20L26 28L29 24L32 31L35 26L38 33L41 29L44 36L46 32L46 48L2 48Z" fill="url(#paint6_linear_smoothing_light)" opacity="0.3"/>
</g>
<path d="M57.91 30C58.6 30 64 26 64 24C64 22 58.7 18 57.91 18C57.08 18 56.4 18.5 56.4 19.48C56.4 20.45 59.91 22.92 59.91 22.92C59.91 22.92 52.25 22.25 52 22.92C51.75 23.59 51.75 24.41 52 25.08C52.25 25.75 59.91 25.08 59.91 25.08C59.91 25.08 56.4 27.75 56.4 28.53C56.4 29.31 57.21 30 57.91 30Z" fill="#24292E"/>
<g clip-path="url(#clip1_smoothing_light)">
<path d="M66 0.611699V7.64012H81V0H66.6015C66.442 0 66.289 0.0644467 66.1762 0.179162C66.0634 0.293878 66 0.449466 66 0.611699Z" fill="url(#paint7_linear_smoothing_light)"/>
<path d="M82 0H97V7.64012H82V0Z" fill="url(#paint8_linear_smoothing_light)"/>
<path d="M98 7.64012H114V0.611699C114 0.449466 113.937 0.293878 113.824 0.179162C113.711 0.0644467 113.558 0 113.399 0H98V7.64012Z" fill="url(#paint9_linear_smoothing_light)"/>
<path d="M68 36Q77 28 86 24Q95 21 112 25" stroke="#84AFF1" stroke-width="2.5" fill="none" stroke-linecap="round"/>
<path d="M68 36Q77 28 86 24Q95 21 112 25L112 48L68 48Z" fill="url(#paint10_linear_smoothing_light)" opacity="0.3"/>
</g>
<defs>
<linearGradient id="paint0_linear_smoothing_light" x1="0" y1="3.82" x2="7" y2="3.82" gradientUnits="userSpaceOnUse">
<stop stop-color="#F2CC0C"/>
<stop offset="1" stop-color="#FF9830"/>
</linearGradient>
<linearGradient id="paint1_linear_smoothing_light" x1="8" y1="3.82" x2="15" y2="3.82" gradientUnits="userSpaceOnUse">
<stop stop-color="#F2CC0C"/>
<stop offset="1" stop-color="#FF9830"/>
</linearGradient>
<linearGradient id="paint2_linear_smoothing_light" x1="16" y1="3.82" x2="23" y2="3.82" gradientUnits="userSpaceOnUse">
<stop stop-color="#F2CC0C"/>
<stop offset="1" stop-color="#FF9830"/>
</linearGradient>
<linearGradient id="paint3_linear_smoothing_light" x1="24" y1="3.82" x2="31" y2="3.82" gradientUnits="userSpaceOnUse">
<stop stop-color="#F2CC0C"/>
<stop offset="1" stop-color="#FF9830"/>
</linearGradient>
<linearGradient id="paint4_linear_smoothing_light" x1="32" y1="3.82" x2="39" y2="3.82" gradientUnits="userSpaceOnUse">
<stop stop-color="#F2CC0C"/>
<stop offset="1" stop-color="#FF9830"/>
</linearGradient>
<linearGradient id="paint5_linear_smoothing_light" x1="40" y1="3.82" x2="48" y2="3.82" gradientUnits="userSpaceOnUse">
<stop stop-color="#F2CC0C"/>
<stop offset="1" stop-color="#FF9830"/>
</linearGradient>
<linearGradient id="paint6_linear_smoothing_light" x1="24" y1="48" x2="24" y2="20" gradientUnits="userSpaceOnUse">
<stop stop-color="#1F60C4" stop-opacity="0"/>
<stop offset="1" stop-color="#3865AB"/>
</linearGradient>
<linearGradient id="paint7_linear_smoothing_light" x1="66" y1="3.82" x2="81" y2="3.82" gradientUnits="userSpaceOnUse">
<stop stop-color="#F2CC0C"/>
<stop offset="1" stop-color="#FF9830"/>
</linearGradient>
<linearGradient id="paint8_linear_smoothing_light" x1="82" y1="3.82" x2="97" y2="3.82" gradientUnits="userSpaceOnUse">
<stop stop-color="#F2CC0C"/>
<stop offset="1" stop-color="#FF9830"/>
</linearGradient>
<linearGradient id="paint9_linear_smoothing_light" x1="98" y1="3.82" x2="114" y2="3.82" gradientUnits="userSpaceOnUse">
<stop stop-color="#F2CC0C"/>
<stop offset="1" stop-color="#FF9830"/>
</linearGradient>
<linearGradient id="paint10_linear_smoothing_light" x1="90" y1="48" x2="90" y2="21" gradientUnits="userSpaceOnUse">
<stop stop-color="#1F60C4" stop-opacity="0"/>
<stop offset="1" stop-color="#3865AB"/>
</linearGradient>
<clipPath id="clip0_smoothing_light">
<rect width="48" height="48" fill="white"/>
</clipPath>
<clipPath id="clip1_smoothing_light">
<rect width="48" height="48" fill="white" transform="translate(66)"/>
</clipPath>
</defs>
</svg>


@@ -0,0 +1,130 @@
import { asapSmooth, DataPoint, ASAPOptions } from './asap';
describe('asapSmooth', () => {
describe('Basic functionality', () => {
it('should return smoothed data with valid DataPoint objects', () => {
const data: DataPoint[] = [
{ x: 0, y: 0 },
{ x: 1, y: 1 },
{ x: 2, y: 2 },
{ x: 3, y: 3 },
{ x: 4, y: 4 },
];
const options: ASAPOptions = { resolution: 3 };
const result = asapSmooth(data, options);
expect(result.length).toBeGreaterThan(0);
result.forEach((point) => {
expect(point).toHaveProperty('x');
expect(point).toHaveProperty('y');
expect(typeof point.x).toBe('number');
expect(typeof point.y).toBe('number');
});
});
it('should maintain x-axis ordering', () => {
const data: DataPoint[] = Array.from({ length: 20 }, (_, i) => ({
x: i,
y: Math.random() * 100,
}));
const options: ASAPOptions = { resolution: 10 };
const result = asapSmooth(data, options);
// check that x values are in ascending order
for (let i = 1; i < result.length; i++) {
expect(result[i].x).toBeGreaterThanOrEqual(result[i - 1].x);
}
});
});
describe('Edge cases', () => {
it('should handle empty array', () => {
const data: DataPoint[] = [];
const options: ASAPOptions = { resolution: 10 };
const result = asapSmooth(data, options);
expect(result).toEqual([]);
});
it('should handle single data point', () => {
const data: DataPoint[] = [{ x: 1, y: 42 }];
const options: ASAPOptions = { resolution: 10 };
const result = asapSmooth(data, options);
expect(result.length).toBeGreaterThan(0);
expect(result[0].x).toBe(1);
expect(result[0].y).toBe(42);
});
it('should filter out NaN values', () => {
const data: DataPoint[] = [
{ x: 0, y: 0 },
{ x: 1, y: NaN },
{ x: 2, y: 2 },
{ x: 3, y: NaN },
{ x: 4, y: 4 },
];
const options: ASAPOptions = { resolution: 3 };
const result = asapSmooth(data, options);
expect(result.length).toBeGreaterThan(0);
result.forEach((point) => {
expect(isNaN(point.x)).toBe(false);
expect(isNaN(point.y)).toBe(false);
});
});
it('should return empty array when all values are NaN', () => {
const data: DataPoint[] = [
{ x: 0, y: NaN },
{ x: 1, y: NaN },
{ x: 2, y: NaN },
];
const options: ASAPOptions = { resolution: 3 };
const result = asapSmooth(data, options);
expect(result).toEqual([]);
});
it('should sort unsorted data', () => {
const data: DataPoint[] = [
{ x: 3, y: 3 },
{ x: 1, y: 1 },
{ x: 4, y: 4 },
{ x: 0, y: 0 },
{ x: 2, y: 2 },
];
const options: ASAPOptions = { resolution: 3 };
const result = asapSmooth(data, options);
expect(result.length).toBeGreaterThan(0);
// result should be sorted by x
for (let i = 1; i < result.length; i++) {
expect(result[i].x).toBeGreaterThanOrEqual(result[i - 1].x);
}
});
it('should handle negative values', () => {
const data: DataPoint[] = Array.from({ length: 10 }, (_, i) => ({
x: i,
y: -i * 2,
}));
const options: ASAPOptions = { resolution: 5 };
const result = asapSmooth(data, options);
expect(result.length).toBeGreaterThan(0);
result.forEach((point) => {
expect(isFinite(point.y)).toBe(true);
});
});
});
});
@@ -0,0 +1,40 @@
import { ASAP } from 'downsample';
export interface DataPoint {
x: number;
y: number;
}
export interface ASAPOptions {
resolution: number;
}
export function asapSmooth(data: DataPoint[], options: ASAPOptions): DataPoint[] {
const { resolution } = options;
if (!data || data.length === 0) {
return [];
}
// Filter invalid points and convert to tuple format for ASAP library
const inputData: Array<[number, number]> = data
.filter((point) => point != null && !isNaN(point.x) && !isNaN(point.y))
.map((point) => [point.x, point.y]);
if (inputData.length === 0) {
return [];
}
// sort by x before smoothing: ASAP expects ordered input, and this also prevents O(m×n) degradation downstream if inputData arrives unsorted
inputData.sort((a, b) => a[0] - b[0]);
// ASAP returns objects with x and y properties; filter defensively below anyway
const smoothedData = ASAP(inputData, resolution);
// Convert back to DataPoint format
const result: DataPoint[] = Array.from(smoothedData).filter(
(item): item is DataPoint => item !== null && typeof item === 'object' && 'x' in item && 'y' in item
);
return result;
}
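For context on the shape of `asapSmooth` above, the filter → sort → smooth pipeline can be sketched standalone. This is only an illustration: `movingAverage` below is a deliberately simple stand-in for the `ASAP` call from the `downsample` library, and all names here are hypothetical, not exports of the module.

```typescript
interface DataPoint {
  x: number;
  y: number;
}

// stand-in smoother: a trailing moving average, used here only to make the
// sketch self-contained; the real module delegates to ASAP from 'downsample'
function movingAverage(points: DataPoint[], window: number): DataPoint[] {
  return points.map((p, i) => {
    const start = Math.max(0, i - window + 1);
    const slice = points.slice(start, i + 1);
    const mean = slice.reduce((sum, q) => sum + q.y, 0) / slice.length;
    return { x: p.x, y: mean };
  });
}

// mirrors the shape of asapSmooth: drop null/NaN points, sort by x, then smooth
function smoothSketch(data: Array<DataPoint | null>, window: number): DataPoint[] {
  const valid = data.filter((p): p is DataPoint => p != null && !isNaN(p.x) && !isNaN(p.y));
  valid.sort((a, b) => a.x - b.x);
  return movingAverage(valid, window);
}

const out = smoothSketch(
  [{ x: 2, y: 4 }, null, { x: 0, y: 0 }, { x: 1, y: NaN }, { x: 4, y: 8 }],
  2
);
// null and NaN entries are dropped, and x values come back in ascending order
```

The point of the sketch is the invariants the tests above assert: every output point has finite numeric `x` and `y`, and output is ordered by `x` regardless of input order.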
@@ -0,0 +1,744 @@
import {
DataFrame,
DataTransformContext,
FieldType,
toDataFrame,
TransformationApplicabilityLevels,
} from '@grafana/data';
import { calculateMaxSourcePoints, getSmoothingTransformer, SmoothingTransformerOptions } from './smoothing';
describe('Smoothing transformer', () => {
const smoothingTransformer = getSmoothingTransformer();
const ctx: DataTransformContext = {
interpolate: (v: string) => v,
};
describe('isApplicable', () => {
it('should return Applicable for time series frames', () => {
const frames = [
toDataFrame({
name: 'time series',
fields: [
{ name: 'time', type: FieldType.time, values: [1000, 2000, 3000] },
{ name: 'value', type: FieldType.number, values: [10, 20, 15] },
],
}),
];
expect(smoothingTransformer.isApplicable!(frames)).toBe(TransformationApplicabilityLevels.Applicable);
});
it('should return NotApplicable for frames without time field', () => {
const frames = [
toDataFrame({
name: 'no time field',
fields: [
{ name: 'category', type: FieldType.string, values: ['A', 'B', 'C'] },
{ name: 'value', type: FieldType.number, values: [10, 20, 15] },
],
}),
];
expect(smoothingTransformer.isApplicable!(frames)).toBe(TransformationApplicabilityLevels.NotApplicable);
});
it('should return Applicable if at least one frame is a time series', () => {
const frames = [
toDataFrame({
name: 'not time series',
fields: [
{ name: 'category', type: FieldType.string, values: ['A', 'B', 'C'] },
{ name: 'label', type: FieldType.string, values: ['X', 'Y', 'Z'] },
],
}),
toDataFrame({
name: 'time series',
fields: [
{ name: 'time', type: FieldType.time, values: [1000, 2000, 3000] },
{ name: 'value', type: FieldType.number, values: [10, 20, 15] },
],
}),
];
expect(smoothingTransformer.isApplicable!(frames)).toBe(TransformationApplicabilityLevels.Applicable);
});
it('should return NotApplicable for empty data', () => {
const frames: DataFrame[] = [];
expect(smoothingTransformer.isApplicable!(frames)).toBe(TransformationApplicabilityLevels.NotApplicable);
});
});
describe('Basic functionality', () => {
it('should smooth time series data with default settings', () => {
const source = [
toDataFrame({
name: 'test data',
refId: 'A',
fields: [
{ name: 'time', type: FieldType.time, values: [1000, 2000, 3000, 4000, 5000] },
{ name: 'value', type: FieldType.number, values: [10, 20, 15, 25, 18] },
],
}),
];
const config: SmoothingTransformerOptions = {};
const result = smoothingTransformer.transformer(config, ctx)(source);
// should return both original and smoothed frames
expect(result).toHaveLength(2);
// first frame should be the original, unchanged
expect(result[0].name).toBe('test data');
expect(result[0].fields).toHaveLength(2);
expect(result[0].fields[0].name).toBe('time');
expect(result[0].fields[1].name).toBe('value');
expect(result[0].fields[1].values).toEqual([10, 20, 15, 25, 18]);
// second frame should be the smoothed version
expect(result[1].name).toBe('Smoothed');
expect(result[1].fields).toHaveLength(2);
expect(result[1].fields[0].name).toBe('time');
expect(result[1].fields[1].name).toBe('value');
// should preserve original time points
expect(result[1].fields[0].values).toEqual([1000, 2000, 3000, 4000, 5000]);
// should have corresponding smoothed values
expect(result[1].fields[1].values.length).toBe(5);
});
it('should handle multiple numeric fields', () => {
const source = [
toDataFrame({
name: 'multi field data',
refId: 'B',
fields: [
{ name: 'timestamp', type: FieldType.time, values: [1000, 2000, 3000, 4000] },
{ name: 'cpu', type: FieldType.number, values: [50, 75, 60, 80] },
{ name: 'memory', type: FieldType.number, values: [40, 55, 45, 65] },
{ name: 'label', type: FieldType.string, values: ['a', 'b', 'c', 'd'] },
],
}),
];
const config: SmoothingTransformerOptions = { resolution: 3 };
const result = smoothingTransformer.transformer(config, ctx)(source);
// should return both original and smoothed frames
expect(result).toHaveLength(2);
// first frame is original
expect(result[0].name).toBe('multi field data');
expect(result[0].fields[1].name).toBe('cpu');
expect(result[0].fields[2].name).toBe('memory');
// second frame is smoothed
expect(result[1].fields).toHaveLength(4);
expect(result[1].fields[0].name).toBe('timestamp');
expect(result[1].fields[1].name).toBe('cpu');
expect(result[1].fields[2].name).toBe('memory');
expect(result[1].fields[3].name).toBe('label');
// all numeric fields should be smoothed and preserve original time points
expect(result[1].fields[0].values.length).toBe(4);
expect(result[1].fields[1].values.length).toBe(4);
expect(result[1].fields[2].values.length).toBe(4);
});
it('should preserve non-numeric and non-time fields', () => {
const source = [
toDataFrame({
name: 'mixed data',
refId: 'C',
fields: [
{ name: 'time', type: FieldType.time, values: [1000, 2000, 3000] },
{ name: 'value', type: FieldType.number, values: [10, 20, 15] },
{ name: 'category', type: FieldType.string, values: ['A', 'B', 'C'] },
{ name: 'active', type: FieldType.boolean, values: [true, false, true] },
],
}),
];
const config: SmoothingTransformerOptions = { resolution: 2 };
const result = smoothingTransformer.transformer(config, ctx)(source);
// should return both original and smoothed frames
expect(result).toHaveLength(2);
// smoothed frame should preserve non-numeric fields
expect(result[1].fields[2].name).toBe('category');
expect(result[1].fields[2].type).toBe(FieldType.string);
expect(result[1].fields[3].name).toBe('active');
expect(result[1].fields[3].type).toBe(FieldType.boolean);
});
});
describe('Configuration options', () => {
it('should use default resolution when not specified', () => {
const source = [
toDataFrame({
name: 'default test',
refId: 'A',
fields: [
{ name: 'time', type: FieldType.time, values: Array.from({ length: 200 }, (_, i) => i * 1000) },
{ name: 'value', type: FieldType.number, values: Array.from({ length: 200 }, () => Math.random() * 100) },
],
}),
];
const config: SmoothingTransformerOptions = {};
const result = smoothingTransformer.transformer(config, ctx)(source);
// should return both original and smoothed frames
expect(result).toHaveLength(2);
// smoothed frame should preserve all original time points
expect(result[1].fields[0].values.length).toBe(200);
expect(result[1].fields[1].values.length).toBe(200);
});
it('should respect custom resolution settings', () => {
const source = [
toDataFrame({
name: 'resolution test',
refId: 'A',
fields: [
{ name: 'time', type: FieldType.time, values: Array.from({ length: 100 }, (_, i) => i * 1000) },
{ name: 'value', type: FieldType.number, values: Array.from({ length: 100 }, () => Math.random() * 100) },
],
}),
];
const config: SmoothingTransformerOptions = { resolution: 25 };
const result = smoothingTransformer.transformer(config, ctx)(source);
// should return both original and smoothed frames
expect(result).toHaveLength(2);
// smoothed frame should preserve all original time points regardless of resolution
expect(result[1].fields[0].values.length).toBe(100);
expect(result[1].fields[1].values.length).toBe(100);
});
it('should clamp resolution to minimum value', () => {
const source = [
toDataFrame({
name: 'small resolution test',
refId: 'A',
fields: [
{ name: 'time', type: FieldType.time, values: [1000, 2000, 3000, 4000, 5000] },
{ name: 'value', type: FieldType.number, values: [10, 20, 15, 25, 18] },
],
}),
];
// a very low resolution exercises the clamping path; output should still cover every time point
const config: SmoothingTransformerOptions = { resolution: 2 };
const result = smoothingTransformer.transformer(config, ctx)(source);
// should return both original and smoothed frames
expect(result).toHaveLength(2);
// smoothed frame should preserve all original time points even at minimal resolution
expect(result[1].fields[0].values.length).toBe(5);
expect(result[1].fields[1].values.length).toBe(5);
});
});
describe('Edge cases', () => {
it('should handle empty data frames', () => {
const source: DataFrame[] = [];
const config: SmoothingTransformerOptions = {};
const result = smoothingTransformer.transformer(config, ctx)(source);
expect(result).toEqual([]);
});
it('should handle frames without time fields', () => {
const source = [
toDataFrame({
name: 'no time field',
refId: 'A',
fields: [
{ name: 'category', type: FieldType.string, values: ['A', 'B', 'C'] },
{ name: 'value', type: FieldType.number, values: [10, 20, 15] },
],
}),
];
const config: SmoothingTransformerOptions = {};
const result = smoothingTransformer.transformer(config, ctx)(source);
// should return original frame unchanged
expect(result).toHaveLength(1);
expect(result[0]).toEqual(source[0]);
});
it('should handle frames without numeric fields', () => {
const source = [
toDataFrame({
name: 'no numeric fields',
refId: 'A',
fields: [
{ name: 'time', type: FieldType.time, values: [1000, 2000, 3000] },
{ name: 'category', type: FieldType.string, values: ['A', 'B', 'C'] },
],
}),
];
const config: SmoothingTransformerOptions = {};
const result = smoothingTransformer.transformer(config, ctx)(source);
// should return original frame unchanged
expect(result).toHaveLength(1);
expect(result[0]).toEqual(source[0]);
});
it('should filter out NaN values when smoothing', () => {
const source = [
toDataFrame({
name: 'data with NaN',
refId: 'A',
fields: [
{ name: 'time', type: FieldType.time, values: [1000, 2000, 3000, 4000, 5000] },
{ name: 'value', type: FieldType.number, values: [10, NaN, 15, 25, NaN] },
],
}),
];
const config: SmoothingTransformerOptions = { resolution: 3 };
const result = smoothingTransformer.transformer(config, ctx)(source);
// should return both original and smoothed frames
expect(result).toHaveLength(2);
// smoothed frame should preserve all time points
expect(result[1].fields[0].values.length).toBe(5);
expect(result[1].fields[1].values.length).toBe(5);
// all values should be interpolated from smoothed curve (no nulls)
const values = result[1].fields[1].values;
values.forEach((value) => {
expect(value).not.toBeNull();
expect(typeof value).toBe('number');
expect(isNaN(value)).toBe(false);
});
});
it('should handle data with all NaN values', () => {
const source = [
toDataFrame({
name: 'all NaN data',
refId: 'A',
fields: [
{ name: 'time', type: FieldType.time, values: [1000, 2000, 3000] },
{ name: 'value', type: FieldType.number, values: [NaN, NaN, NaN] },
],
}),
];
const config: SmoothingTransformerOptions = {};
const result = smoothingTransformer.transformer(config, ctx)(source);
// When all values are NaN, only original frame should be returned (no smoothed frame)
expect(result).toHaveLength(1);
expect(result[0].fields[1].name).toBe('value'); // field name unchanged
expect(result[0].fields[1].values).toEqual([NaN, NaN, NaN]);
expect(result[0].name).toBe('all NaN data'); // Original name preserved
});
it('should handle data with null values', () => {
const source = [
toDataFrame({
name: 'data with nulls',
refId: 'A',
fields: [
{ name: 'time', type: FieldType.time, values: [1000, 2000, 3000, 4000] },
{ name: 'value', type: FieldType.number, values: [10, null, 15, 25] },
],
}),
];
const config: SmoothingTransformerOptions = { resolution: 3 };
const result = smoothingTransformer.transformer(config, ctx)(source);
// should return both original and smoothed frames
expect(result).toHaveLength(2);
// smoothed frame should preserve all time points
expect(result[1].fields[0].values.length).toBe(4);
expect(result[1].fields[1].values.length).toBe(4);
// all values should be interpolated (no nulls in output)
const values = result[1].fields[1].values;
values.forEach((value) => {
expect(value).not.toBeNull();
expect(typeof value).toBe('number');
expect(isNaN(value)).toBe(false);
});
});
it('should handle single data point', () => {
const source = [
toDataFrame({
name: 'single point',
refId: 'A',
fields: [
{ name: 'time', type: FieldType.time, values: [1000] },
{ name: 'value', type: FieldType.number, values: [42] },
],
}),
];
const config: SmoothingTransformerOptions = {};
const result = smoothingTransformer.transformer(config, ctx)(source);
// should return both original and smoothed frames
expect(result).toHaveLength(2);
expect(result[1].fields[0].values).toHaveLength(1);
expect(result[1].fields[1].values).toHaveLength(1);
expect(result[1].fields[1].values[0]).toBe(42);
});
it('should handle empty numeric field values', () => {
const source = [
toDataFrame({
name: 'empty values',
refId: 'A',
fields: [
{ name: 'time', type: FieldType.time, values: [1000, 2000, 3000] },
{ name: 'value', type: FieldType.number, values: [] },
],
}),
];
const config: SmoothingTransformerOptions = {};
const result = smoothingTransformer.transformer(config, ctx)(source);
// should return original frame since no numeric data to smooth
expect(result[0]).toEqual(source[0]);
});
});
describe('Data integrity', () => {
it('should maintain time ordering in smoothed data', () => {
const source = [
toDataFrame({
name: 'ordered data',
refId: 'A',
fields: [
{ name: 'time', type: FieldType.time, values: [1000, 2000, 3000, 4000, 5000] },
{ name: 'value', type: FieldType.number, values: [10, 20, 15, 25, 18] },
],
}),
];
const config: SmoothingTransformerOptions = { resolution: 4 };
const result = smoothingTransformer.transformer(config, ctx)(source);
// check smoothed frame's time values
const timeValues = result[1].fields[0].values as number[];
// check that time values are in ascending order
for (let i = 1; i < timeValues.length; i++) {
expect(timeValues[i]).toBeGreaterThanOrEqual(timeValues[i - 1]);
}
});
it('should preserve original frame metadata', () => {
const source = [
toDataFrame({
name: 'original name',
refId: 'TEST',
meta: { custom: { test: 'value' } },
fields: [
{ name: 'time', type: FieldType.time, values: [1000, 2000, 3000] },
{ name: 'value', type: FieldType.number, values: [10, 20, 15] },
],
}),
];
const config: SmoothingTransformerOptions = {};
const result = smoothingTransformer.transformer(config, ctx)(source);
// should return both original and smoothed frames
expect(result).toHaveLength(2);
// original frame unchanged
expect(result[0].refId).toBe('TEST');
expect(result[0].meta).toEqual(source[0].meta);
expect(result[0].name).toBe('original name');
// smoothed frame preserves metadata
expect(result[1].refId).toBe('TEST');
expect(result[1].meta).toEqual(source[0].meta);
expect(result[1].name).toBe('Smoothed');
});
it('should handle frames with no name', () => {
const source = [
toDataFrame({
refId: 'A',
fields: [
{ name: 'time', type: FieldType.time, values: [1000, 2000, 3000] },
{ name: 'value', type: FieldType.number, values: [10, 20, 15] },
],
}),
];
const config: SmoothingTransformerOptions = {};
const result = smoothingTransformer.transformer(config, ctx)(source);
// should return both original and smoothed frames
expect(result).toHaveLength(2);
expect(result[1].name).toBe('Smoothed');
});
});
describe('Real-world scenarios', () => {
it('should handle sparse data with irregular intervals', () => {
// based on real user data with ~10 points over 30 minutes
const source = [
toDataFrame({
name: 'temperature',
refId: 'A',
fields: [
{
name: 'time',
type: FieldType.time,
values: [
1733999700000, 1733999790000, 1734000000000, 1734000210000, 1734000420000, 1734000630000, 1734000840000,
1734001050000, 1734001260000, 1734001470000,
],
},
{
name: 'value',
type: FieldType.number,
values: [31.1, 31.1, 30.2, 30.8, 29.8, 30.0, 29.3, 28.6, 29.6, 30.5],
},
],
}),
];
const config: SmoothingTransformerOptions = { resolution: 20 };
const result = smoothingTransformer.transformer(config, ctx)(source);
// should return both original and smoothed frames
expect(result).toHaveLength(2);
expect(result[1].fields[0].values.length).toBe(10);
expect(result[1].fields[1].values.length).toBe(10);
// all values should be non-null numbers
const values = result[1].fields[1].values;
values.forEach((value) => {
expect(value).not.toBeNull();
expect(typeof value).toBe('number');
expect(isNaN(value)).toBe(false);
});
});
});
describe('Multiple frames', () => {
it('should process multiple frames independently', () => {
const source = [
toDataFrame({
name: 'frame1',
refId: 'A',
fields: [
{ name: 'time', type: FieldType.time, values: [1000, 2000, 3000] },
{ name: 'value', type: FieldType.number, values: [10, 20, 15] },
],
}),
toDataFrame({
name: 'frame2',
refId: 'B',
fields: [
{ name: 'timestamp', type: FieldType.time, values: [4000, 5000, 6000] },
{ name: 'metric', type: FieldType.number, values: [30, 40, 35] },
],
}),
];
const config: SmoothingTransformerOptions = { resolution: 2 };
const result = smoothingTransformer.transformer(config, ctx)(source);
// should return original frames + smoothed frames (2 original + 2 smoothed = 4 total)
expect(result).toHaveLength(4);
// original frames first
expect(result[0].name).toBe('frame1');
expect(result[0].refId).toBe('A');
expect(result[1].name).toBe('frame2');
expect(result[1].refId).toBe('B');
// smoothed frames after
expect(result[2].name).toBe('Smoothed');
expect(result[2].refId).toBe('A');
expect(result[3].name).toBe('Smoothed');
expect(result[3].refId).toBe('B');
});
it('should handle mixed frame types', () => {
const source = [
toDataFrame({
name: 'valid frame',
refId: 'A',
fields: [
{ name: 'time', type: FieldType.time, values: [1000, 2000, 3000] },
{ name: 'value', type: FieldType.number, values: [10, 20, 15] },
],
}),
toDataFrame({
name: 'invalid frame',
refId: 'B',
fields: [
{ name: 'category', type: FieldType.string, values: ['A', 'B', 'C'] },
{ name: 'label', type: FieldType.string, values: ['X', 'Y', 'Z'] },
],
}),
];
const config: SmoothingTransformerOptions = {};
const result = smoothingTransformer.transformer(config, ctx)(source);
// should return 2 original frames + 1 smoothed frame (only valid frame gets smoothed)
expect(result).toHaveLength(3);
// original frames first
expect(result[0].name).toBe('valid frame');
expect(result[1]).toEqual(source[1]);
// smoothed frame after
expect(result[2].name).toBe('Smoothed');
});
});
describe('calculateMaxSourcePoints', () => {
it('should return 0 for empty frames', () => {
expect(calculateMaxSourcePoints([])).toBe(0);
});
it('should return 0 for frames without time fields', () => {
const frames = [
toDataFrame({
fields: [
{ name: 'category', type: FieldType.string, values: ['A', 'B', 'C'] },
{ name: 'value', type: FieldType.number, values: [10, 20, 15] },
],
}),
];
expect(calculateMaxSourcePoints(frames)).toBe(0);
});
it('should return 0 for frames without numeric fields', () => {
const frames = [
toDataFrame({
fields: [
{ name: 'time', type: FieldType.time, values: [1000, 2000, 3000] },
{ name: 'category', type: FieldType.string, values: ['A', 'B', 'C'] },
],
}),
];
expect(calculateMaxSourcePoints(frames)).toBe(0);
});
it('should count valid data points, filtering out null and NaN', () => {
const frames = [
toDataFrame({
fields: [
{ name: 'time', type: FieldType.time, values: [1000, 2000, 3000, 4000, 5000] },
{ name: 'value', type: FieldType.number, values: [10, null, 15, NaN, 18] },
],
}),
];
// Only 3 valid points: 10, 15, 18
expect(calculateMaxSourcePoints(frames)).toBe(3);
});
it('should return maximum across multiple numeric fields', () => {
const frames = [
toDataFrame({
fields: [
{ name: 'time', type: FieldType.time, values: [1000, 2000, 3000, 4000, 5000] },
{ name: 'cpu', type: FieldType.number, values: [10, null, 15] }, // 2 valid points
{ name: 'memory', type: FieldType.number, values: [20, 25, 30, 35] }, // 4 valid points
],
}),
];
expect(calculateMaxSourcePoints(frames)).toBe(4);
});
it('should return maximum across multiple frames', () => {
const frames = [
toDataFrame({
fields: [
{ name: 'time', type: FieldType.time, values: [1000, 2000, 3000] },
{ name: 'value', type: FieldType.number, values: [10, 20, 15] },
],
}),
toDataFrame({
fields: [
{ name: 'time', type: FieldType.time, values: [1000, 2000, 3000, 4000, 5000] },
{ name: 'metric', type: FieldType.number, values: [30, 40, 35, 45, 50] },
],
}),
];
expect(calculateMaxSourcePoints(frames)).toBe(5);
});
it('should handle frames with all valid points', () => {
const frames = [
toDataFrame({
fields: [
{ name: 'time', type: FieldType.time, values: [1000, 2000, 3000, 4000] },
{ name: 'value', type: FieldType.number, values: [10, 20, 30, 40] },
],
}),
];
expect(calculateMaxSourcePoints(frames)).toBe(4);
});
it('should handle frames with all null values', () => {
const frames = [
toDataFrame({
fields: [
{ name: 'time', type: FieldType.time, values: [1000, 2000, 3000] },
{ name: 'value', type: FieldType.number, values: [null, null, null] },
],
}),
];
expect(calculateMaxSourcePoints(frames)).toBe(0);
});
});
});
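The "interpolated from smoothed curve" behavior the tests above assert is plain linear interpolation between bracketing points, with out-of-range targets clamped to the curve's edge values rather than returning null. A minimal standalone sketch (function name illustrative, not part of the module):

```typescript
// evaluate a piecewise-linear curve at time t, clamping outside the covered range
function lerpAt(points: Array<{ x: number; y: number }>, t: number): number {
  // out-of-bounds targets: clamp to edge values instead of returning null
  if (t <= points[0].x) {
    return points[0].y;
  }
  const last = points[points.length - 1];
  if (t >= last.x) {
    return last.y;
  }
  // find the bracketing pair and interpolate between them
  for (let i = 0; i < points.length - 1; i++) {
    const a = points[i];
    const b = points[i + 1];
    if (a.x <= t && t <= b.x) {
      const ratio = (t - a.x) / (b.x - a.x);
      return a.y + ratio * (b.y - a.y);
    }
  }
  return last.y;
}

const curve = [{ x: 0, y: 0 }, { x: 10, y: 10 }];
// lerpAt(curve, 5) → 5; lerpAt(curve, -1) → 0 (clamped); lerpAt(curve, 99) → 10
```

The implementation that follows adds one refinement over this sketch: the bracket search resumes from the last matched index, so mapping an already-sorted list of target times stays linear instead of quadratic.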
@@ -0,0 +1,267 @@
import { map } from 'rxjs';
import {
DataFrame,
DataTransformerID,
FieldType,
SynchronousDataTransformerInfo,
isTimeSeriesFrame,
TransformationApplicabilityLevels,
} from '@grafana/data';
import { t } from '@grafana/i18n';
import { asapSmooth, DataPoint } from './asap';
export interface SmoothingTransformerOptions {
resolution?: number;
}
export const DEFAULTS = {
resolution: 100,
};
export const RESOLUTION_LIMITS = {
min: 1,
max: 1000,
};
const MAX_RESOLUTION_MULTIPLIER = 2;
// converts time and value arrays into valid DataPoints, filtering out null/NaN values
export const createDataPoints = (timeValues: number[], sourceField: Array<number | null | undefined>): DataPoint[] => {
return timeValues
.map((time, index) => ({
x: time,
y: sourceField[index],
}))
.filter((point): point is DataPoint => point.y != null && !isNaN(point.y));
};
// calculates effective resolution capped at 2x source points
export const calculateEffectiveResolution = (resolution: number, sourcePointCount: number): number => {
return Math.min(resolution, sourcePointCount * MAX_RESOLUTION_MULTIPLIER);
};
// calculates the maximum number of source points across all numeric fields in all frames
export const calculateMaxSourcePoints = (frames: DataFrame[]): number => {
let maxSourcePoints = 0;
for (const frame of frames) {
const timeField = frame.fields.find((f) => f.type === FieldType.time);
if (!timeField) {
continue;
}
for (const field of frame.fields) {
if (field.type === FieldType.number) {
const sourcePoints = createDataPoints(timeField.values, field.values);
if (sourcePoints.length > maxSourcePoints) {
maxSourcePoints = sourcePoints.length;
}
}
}
}
return maxSourcePoints;
};
// performs linear interpolation between two points
export const linearInterpolate = (leftPoint: DataPoint, rightPoint: DataPoint, targetTime: number): number => {
// exact match
if (leftPoint.x === targetTime) {
return leftPoint.y;
}
if (rightPoint.x === targetTime) {
return rightPoint.y;
}
// same point (shouldn't happen but handle gracefully)
if (leftPoint.x === rightPoint.x) {
return leftPoint.y;
}
// linear interpolation
const ratio = (targetTime - leftPoint.x) / (rightPoint.x - leftPoint.x);
return leftPoint.y + ratio * (rightPoint.y - leftPoint.y);
};
// finds the two points in smoothedData that bracket the targetTime
export const findBracketingPoints = (
smoothedData: DataPoint[],
targetTime: number,
lastIndex: number
): { leftPoint: DataPoint; rightPoint: DataPoint; newIndex: number } => {
// find the two points to interpolate between, starting from last known position
// if target is before our current search position, reset to beginning
let searchStart = Math.min(lastIndex, smoothedData.length - 2);
if (targetTime < smoothedData[searchStart].x) {
searchStart = 0;
}
let leftPoint = smoothedData[searchStart];
let rightPoint = smoothedData[searchStart + 1];
let newIndex = searchStart;
for (let i = searchStart; i < smoothedData.length - 1; i++) {
if (smoothedData[i].x <= targetTime && smoothedData[i + 1].x >= targetTime) {
leftPoint = smoothedData[i];
rightPoint = smoothedData[i + 1];
newIndex = i;
break;
}
}
return { leftPoint, rightPoint, newIndex };
};
// interpolates smoothed data back to original time points
export const interpolateToTimePoints = (smoothedData: DataPoint[], timeValues: number[]): number[] => {
const firstPoint = smoothedData[0];
const lastPoint = smoothedData[smoothedData.length - 1];
let lastIndex = 0;
return timeValues.map((targetTime) => {
// handle out-of-bounds targets: clamp to edge values instead of returning null
if (targetTime <= firstPoint.x) {
return firstPoint.y;
}
if (targetTime >= lastPoint.x) {
return lastPoint.y;
}
const { leftPoint, rightPoint, newIndex } = findBracketingPoints(smoothedData, targetTime, lastIndex);
lastIndex = newIndex;
return linearInterpolate(leftPoint, rightPoint, targetTime);
});
};
// smooths a time series by creating a smoothed curve and interpolating back to original time points
export const interpolateFromSmoothedCurve = (
sourceField: Array<number | null | undefined>,
timeValues: number[],
resolution: number
): Array<number | null> | null => {
const sourcePoints = createDataPoints(timeValues, sourceField);
// if no valid source points, return null to signal this field should not be smoothed
if (sourcePoints.length === 0) {
return null;
}
// smooth the source field's data with effective resolution
const effectiveFieldResolution = calculateEffectiveResolution(resolution, sourcePoints.length);
const smoothedData = asapSmooth(sourcePoints, { resolution: effectiveFieldResolution });
if (smoothedData.length === 0) {
return timeValues.map(() => null);
}
// handle single point case - return the same value for all time points
if (smoothedData.length === 1) {
const singleValue = smoothedData[0].y;
return timeValues.map(() => singleValue);
}
// this prevents O(m×n) degradation if asapSmooth returns unsorted data
smoothedData.sort((a, b) => a.x - b.x);
// interpolate smoothed data back to original time points
return interpolateToTimePoints(smoothedData, timeValues);
};
export const getSmoothingTransformer: () => SynchronousDataTransformerInfo<SmoothingTransformerOptions> = () => ({
id: DataTransformerID.smoothing,
name: t('transformers.smoothing.name', 'Smoothing'),
description: t(
'transformers.smoothing.description',
'Reduce noise in time series data through adaptive downsampling.'
),
isApplicable: (data) => {
for (const frame of data) {
if (isTimeSeriesFrame(frame)) {
return TransformationApplicabilityLevels.Applicable;
}
}
return TransformationApplicabilityLevels.NotApplicable;
},
isApplicableDescription: t(
'transformers.smoothing.is-applicable-description',
'The Smoothing transformation requires at least one time series frame to function. You currently have none.'
),
operator: (options, ctx) => {
const transformer = getSmoothingTransformer().transformer(options, ctx);
return (source) => source.pipe(map(transformer));
},
transformer: (options, ctx) => {
return (frames: DataFrame[]) => {
// clamp resolution to valid range to handle edge cases from API/plugins
const rawResolution = options.resolution ?? DEFAULTS.resolution;
const resolution = Math.max(RESOLUTION_LIMITS.min, Math.min(RESOLUTION_LIMITS.max, rawResolution));
if (frames.length === 0) {
return frames;
}
const smoothedFrames: DataFrame[] = [];
for (const frame of frames) {
const timeField = frame.fields.find((f) => f.type === FieldType.time);
if (!timeField) {
continue;
}
// check if there's at least one numeric field with valid data
const hasValidNumericField = frame.fields.some((f) => {
if (f.type !== FieldType.number || f.values.length === 0) {
return false;
}
return f.values.some((v) => v != null && !isNaN(v));
});
if (!hasValidNumericField) {
continue;
}
// create smoothed fields for all numeric fields
const smoothedFields = [timeField]; // keep original time field
let anyFieldSmoothed = false;
for (const field of frame.fields) {
if (field.type === FieldType.number) {
const smoothedValues = interpolateFromSmoothedCurve(field.values, timeField.values, resolution);
// if smoothing returned null (no valid data), skip this field
if (smoothedValues === null) {
continue;
}
anyFieldSmoothed = true;
smoothedFields.push({
...field,
values: smoothedValues,
state: undefined,
});
} else if (field.type !== FieldType.time) {
// include other non-numeric, non-time fields (like labels)
smoothedFields.push(field);
}
}
// only create a smoothed frame if at least one field was smoothed
if (anyFieldSmoothed) {
const smoothedFrame: DataFrame = {
...frame,
name: 'Smoothed',
fields: smoothedFields,
};
smoothedFrames.push(smoothedFrame);
}
}
// return original frames followed by smoothed frames
return [...frames, ...smoothedFrames];
};
},
});
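The resolution handling at the top of the transformer can be sketched in isolation. The limit values and the 2×-points cap below are assumptions for illustration (the cap matches what the editor tooltip describes for `calculateEffectiveResolution`), not the exported implementation:

```typescript
// Hypothetical constants/helpers mirroring the clamping described above.
// RESOLUTION_LIMITS values and the 2x-points cap are assumptions.
const RESOLUTION_LIMITS = { min: 2, max: 1000 };

// clamp a raw option value from the API/plugins into the valid range
const clampResolution = (raw: number): number =>
  Math.max(RESOLUTION_LIMITS.min, Math.min(RESOLUTION_LIMITS.max, raw));

// the effective resolution never exceeds twice the number of source points
const effectiveResolution = (resolution: number, points: number): number =>
  Math.min(resolution, points * 2);

console.log(clampResolution(-5));           // → 2 (clamped up to the minimum)
console.log(clampResolution(5000));         // → 1000 (clamped down to the maximum)
console.log(effectiveResolution(500, 100)); // → 200 (capped at 2x the point count)
```

This is also why the editor only shows the "Effective" suffix when the cap actually bites, i.e. when the effective value is lower than the configured one.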
@@ -0,0 +1,93 @@
import { css } from '@emotion/css';
import { useMemo } from 'react';
import { DataTransformerID, TransformerRegistryItem, TransformerUIProps, TransformerCategory } from '@grafana/data';
import { t } from '@grafana/i18n';
import { InlineField, InlineFieldRow, Tooltip, useTheme2 } from '@grafana/ui';
import { NumberInput } from 'app/core/components/OptionsUI/NumberInput';
import { getTransformationContent } from '../docs/getTransformationContent';
import darkImage from '../images/dark/smoothing.svg';
import lightImage from '../images/light/smoothing.svg';
import {
DEFAULTS,
RESOLUTION_LIMITS,
SmoothingTransformerOptions,
getSmoothingTransformer,
calculateEffectiveResolution,
calculateMaxSourcePoints,
} from './smoothing';
export const SmoothingTransformerEditor = ({
input,
options,
onChange,
}: TransformerUIProps<SmoothingTransformerOptions>) => {
const theme = useTheme2();
const resolution = options.resolution ?? DEFAULTS.resolution;
const maxSourcePoints = useMemo(() => calculateMaxSourcePoints(input), [input]);
const effectiveResolution = maxSourcePoints > 0 ? calculateEffectiveResolution(resolution, maxSourcePoints) : null;
const showEffectiveResolution = effectiveResolution !== null && effectiveResolution < resolution;
return (
<InlineFieldRow>
<InlineField
label={t('transformers.smoothing.resolution.label', 'Resolution')}
labelWidth={12}
tooltip={t(
'transformers.smoothing.resolution.tooltip',
'Controls smoothing intensity. Lower values create more aggressive smoothing. Both original and smoothed data are displayed.'
)}
>
<NumberInput
value={resolution}
onChange={(v) => onChange({ ...options, resolution: v })}
min={RESOLUTION_LIMITS.min}
max={RESOLUTION_LIMITS.max}
width={20}
suffix={
showEffectiveResolution ? (
<Tooltip
content={t(
'transformers.smoothing.effective-resolution-tooltip',
'Resolution is limited to 2× the number of data points ({{points}}).',
{ points: maxSourcePoints }
)}
>
<span
className={css({
marginLeft: '8px',
color: theme.colors.text.secondary,
fontSize: theme.typography.bodySmall.fontSize,
})}
>
{t('transformers.smoothing.effective-resolution', 'Effective: {{value}}', {
value: effectiveResolution,
})}
</span>
</Tooltip>
) : undefined
}
/>
</InlineField>
</InlineFieldRow>
);
};
export const getSmoothingTransformerRegistryItem: () => TransformerRegistryItem<SmoothingTransformerOptions> = () => {
const smoothingTransformer = getSmoothingTransformer();
return {
id: DataTransformerID.smoothing,
editor: SmoothingTransformerEditor,
transformation: smoothingTransformer,
name: smoothingTransformer.name,
description: smoothingTransformer.description,
categories: new Set([TransformerCategory.CalculateNewFields]),
imageDark: darkImage,
imageLight: lightImage,
help: getTransformationContent(DataTransformerID.smoothing).helperDocs,
tags: new Set(['ASAP', 'Autosmooth']),
};
};
@@ -1,4 +1,5 @@
import { TransformerRegistryItem } from '@grafana/data';
import { config } from '@grafana/runtime';
import { getFilterByValueTransformRegistryItem } from './FilterByValueTransformer/FilterByValueTransformerEditor';
import { getHeatmapTransformRegistryItem } from './calculateHeatmap/HeatmapTransformerEditor';
@@ -31,6 +32,7 @@ import { getPartitionByValuesTransformRegistryItem } from './partitionByValues/P
import { getPrepareTimeseriesTransformerRegistryItem } from './prepareTimeSeries/PrepareTimeSeriesEditor';
import { getRegressionTransformerRegistryItem } from './regression/regressionEditor';
import { getRowsToFieldsTransformRegistryItem } from './rowsToFields/RowsToFieldsTransformerEditor';
import { getSmoothingTransformerRegistryItem } from './smoothing/smoothingEditor';
import { getSpatialTransformRegistryItem } from './spatial/SpatialTransformerEditor';
import { getTimeSeriesTableTransformRegistryItem } from './timeSeriesTable/TimeSeriesTableTransformEditor';
@@ -66,6 +68,7 @@ export const getStandardTransformers = (): TransformerRegistryItem[] => {
getPartitionByValuesTransformRegistryItem(),
getFormatStringTransformerRegistryItem(),
getGroupToNestedTableTransformRegistryItem(),
...(config.featureToggles.smoothingTransformation ? [getSmoothingTransformerRegistryItem()] : []),
getFormatTimeTransformerRegistryItem(),
getTimeSeriesTableTransformRegistryItem(),
getTransposeTransformerRegistryItem(),
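The registration hunk above gates the new transformer behind the `smoothingTransformation` feature toggle by spreading a conditional singleton array into the list. A standalone sketch of the pattern, with a stand-in `featureToggles` object in place of Grafana's runtime config:

```typescript
// Stand-in types and toggle state; only the spread pattern is the point here.
type Item = { id: string };
const featureToggles: Record<string, boolean | undefined> = { smoothingTransformation: true };
const getSmoothingItem = (): Item => ({ id: 'smoothing' });

const items: Item[] = [
  { id: 'formatString' },
  // spreads to [item] when the toggle is on, or [] (nothing) when it is off
  ...(featureToggles.smoothingTransformation ? [getSmoothingItem()] : []),
  { id: 'formatTime' },
];
console.log(items.map((i) => i.id)); // smoothing sits between the unconditional items
```

The spread keeps the list literal flat, avoiding a mutable `push` behind an `if`.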
@@ -560,6 +560,23 @@ describe('Language completion provider', () => {
start: 1560153109000,
});
});
it('should interpolate variables in stream selector', async () => {
const datasource = setup({});
jest.spyOn(datasource, 'getTimeRangeParams').mockReturnValue({ start: 0, end: 1 });
jest
.spyOn(datasource, 'interpolateString')
.mockImplementation((string: string) => string.replace(/\$test_var/g, 'age'));
const languageProvider = new LanguageProvider(datasource);
languageProvider.request = jest.fn().mockResolvedValue([]);
await languageProvider.fetchLabels({ streamSelector: '{age="new", $test_var="new"}' });
expect(languageProvider.request).toHaveBeenCalledWith('labels', {
end: 1,
query: '{age="new", age="new"}',
start: 0,
});
});
});
it('should filter internal labels', async () => {
@@ -175,7 +175,8 @@ export default class LokiLanguageProvider extends LanguageProvider {
const { start, end } = this.datasource.getTimeRangeParams(range);
const params: Record<string, string | number> = { start, end };
if (options?.streamSelector && options?.streamSelector !== EMPTY_SELECTOR) {
params['query'] = options.streamSelector;
const interpolatedStreamSelector = this.datasource.interpolateString(options.streamSelector);
params['query'] = interpolatedStreamSelector;
}
const res = await this.request(url, params);
if (Array.isArray(res)) {
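The fix above routes the stream selector through `interpolateString` before it becomes the `query` param, matching what the new test asserts. A minimal sketch of why that matters, with a stand-in substitution function (the real method resolves Grafana template variables, not this toy regex):

```typescript
// Stand-in for datasource.interpolateString(): expand $var references in a
// Loki stream selector before it is sent to the labels endpoint. Without
// this step the label selector would contain literal "$test_var" text.
const interpolateString = (input: string, vars: Record<string, string>): string =>
  input.replace(/\$(\w+)/g, (match, name) => vars[name] ?? match);

const selector = '{age="new", $test_var="new"}';
console.log(interpolateString(selector, { test_var: 'age' }));
// → {age="new", age="new"}
```

Unknown variables fall through unchanged, so partially-resolved selectors still surface visibly instead of being silently dropped.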
@@ -14,7 +14,6 @@ import {
import { TimeRange2, TooltipHoverMode } from '@grafana/ui/internal';
import { TimelineChart } from 'app/core/components/TimelineChart/TimelineChart';
import {
getSeriesAndRest,
prepareTimelineFields,
prepareTimelineLegendItems,
TimelineMode,
@@ -99,13 +98,10 @@ export const StateTimelinePanel = ({
annotationLanes={options.annotations?.multiLane ? getXAnnotationFrames(data.annotations).length : undefined}
>
{(builder, alignedFrame) => {
// TODO: refactor frame prep not to do this here, should be memod at panel level once GraphNG is dissolved
const { seriesFrame, restFields } = getSeriesAndRest(alignedFrame);
return (
<>
{cursorSync !== DashboardCursorSync.Off && (
<EventBusPlugin config={builder} eventBus={eventBus} frame={seriesFrame} />
<EventBusPlugin config={builder} eventBus={eventBus} frame={alignedFrame} />
)}
<XAxisInteractionAreaPlugin config={builder} queryZoom={onChangeTimeRange} />
{options.tooltip.mode !== TooltipDisplayMode.None && (
@@ -118,7 +114,7 @@ export const StateTimelinePanel = ({
syncMode={cursorSync}
syncScope={eventsScope}
getDataLinks={(seriesIdx, dataIdx) =>
seriesFrame.fields[seriesIdx].getLinks?.({ valueRowIndex: dataIdx }) ?? []
alignedFrame.fields[seriesIdx].getLinks?.({ valueRowIndex: dataIdx }) ?? []
}
render={(u, dataIdxs, seriesIdx, isPinned, dismiss, timeRange2, viaSync, dataLinks) => {
if (enableAnnotationCreation && timeRange2 != null) {
@@ -136,7 +132,7 @@ export const StateTimelinePanel = ({
return (
<StateTimelineTooltip
series={seriesFrame}
series={alignedFrame}
dataIdxs={dataIdxs}
seriesIdx={seriesIdx}
mode={viaSync ? TooltipDisplayMode.Multi : options.tooltip.mode}
@@ -149,14 +145,13 @@ export const StateTimelinePanel = ({
replaceVariables={replaceVariables}
dataLinks={dataLinks}
canExecuteActions={userCanExecuteActions}
_rest={restFields}
/>
);
}}
maxWidth={options.tooltip.maxWidth}
/>
)}
{seriesFrame.fields[0].config.custom?.axisPlacement !== AxisPlacement.Hidden && (
{alignedFrame.fields[0].config.custom?.axisPlacement !== AxisPlacement.Hidden && (
<AnnotationsPlugin2
replaceVariables={replaceVariables}
multiLane={options.annotations?.multiLane}
@@ -35,7 +35,6 @@ export const StateTimelineTooltip = ({
maxHeight,
replaceVariables,
dataLinks,
_rest,
}: StateTimelineTooltipProps) => {
const pluginContext = usePluginContext();
const xField = series.fields[0];
@@ -46,17 +45,7 @@ export const StateTimelineTooltip = ({
mode = isPinned ? TooltipDisplayMode.Single : mode;
const contentItems = getContentItems(
series.fields,
xField,
dataIdxs,
seriesIdx,
mode,
sortOrder,
undefined,
undefined,
_rest
);
const contentItems = getContentItems(series.fields, xField, dataIdxs, seriesIdx, mode, sortOrder);
let endTime = null;
// append duration in single mode
@@ -15,7 +15,6 @@ import {
import { TimeRange2, TooltipHoverMode } from '@grafana/ui/internal';
import { TimelineChart } from 'app/core/components/TimelineChart/TimelineChart';
import {
getSeriesAndRest,
prepareTimelineFields,
prepareTimelineLegendItems,
TimelineMode,
@@ -114,13 +113,10 @@ export const StatusHistoryPanel = ({
annotationLanes={options.annotations?.multiLane ? getXAnnotationFrames(data.annotations).length : undefined}
>
{(builder, alignedFrame) => {
// TODO: refactor frame prep not to do this here, should be memod at panel level once GraphNG is dissolved
const { seriesFrame, restFields } = getSeriesAndRest(alignedFrame);
return (
<>
{cursorSync !== DashboardCursorSync.Off && (
<EventBusPlugin config={builder} eventBus={eventBus} frame={seriesFrame} />
<EventBusPlugin config={builder} eventBus={eventBus} frame={alignedFrame} />
)}
<XAxisInteractionAreaPlugin config={builder} queryZoom={onChangeTimeRange} />
{options.tooltip.mode !== TooltipDisplayMode.None && (
@@ -133,7 +129,7 @@ export const StatusHistoryPanel = ({
syncMode={cursorSync}
syncScope={eventsScope}
getDataLinks={(seriesIdx, dataIdx) =>
seriesFrame.fields[seriesIdx].getLinks?.({ valueRowIndex: dataIdx }) ?? []
alignedFrame.fields[seriesIdx].getLinks?.({ valueRowIndex: dataIdx }) ?? []
}
render={(u, dataIdxs, seriesIdx, isPinned, dismiss, timeRange2, viaSync, dataLinks) => {
if (enableAnnotationCreation && timeRange2 != null) {
@@ -151,7 +147,7 @@ export const StatusHistoryPanel = ({
return (
<StateTimelineTooltip
series={seriesFrame}
series={alignedFrame}
dataIdxs={dataIdxs}
seriesIdx={seriesIdx}
mode={viaSync ? TooltipDisplayMode.Multi : options.tooltip.mode}
@@ -164,14 +160,13 @@ export const StatusHistoryPanel = ({
replaceVariables={replaceVariables}
dataLinks={dataLinks}
canExecuteActions={userCanExecuteActions}
_rest={restFields}
/>
);
}}
maxWidth={options.tooltip.maxWidth}
/>
)}
{seriesFrame.fields[0].config.custom?.axisPlacement !== AxisPlacement.Hidden && (
{alignedFrame.fields[0].config.custom?.axisPlacement !== AxisPlacement.Hidden && (
<AnnotationsPlugin2
replaceVariables={replaceVariables}
multiLane={options.annotations?.multiLane}
@@ -3157,9 +3157,12 @@
"table": "Tabulka"
},
"welcome-header": {
"description-alert-activity": "",
"description-alert-rules": "Definujte podmínku, která musí být splněna před spuštěním pravidla výstrahy",
"description-configure-firing-alert-instances-routed-contact": "Nakonfigurujte způsob přesměrování instancí spouštění výstrah do kontaktních bodů",
"description-configure-receives-notifications": "Nakonfigurujte, kdo obdrží oznámení a jak jsou odesílána",
"href-text-alert-activity": "",
"title-alert-activity": "",
"title-alert-rules": "Pravidla výstrah",
"title-contact-points": "Kontaktní body",
"title-notification-policies": "Zásady oznamování"
@@ -14508,6 +14511,17 @@
"series-to-rows": "Řady na řádky"
}
},
"smoothing": {
"description": "",
"effective-resolution": "",
"effective-resolution-tooltip": "",
"is-applicable-description": "",
"name": "",
"resolution": {
"label": "",
"tooltip": ""
}
},
"sort-by-transformer-editor": {
"description": {
"sort-fields": "Seřadit pole v rámci."
@@ -3135,9 +3135,12 @@
"table": "Tabelle"
},
"welcome-header": {
"description-alert-activity": "",
"description-alert-rules": "Legen Sie die Bedingung fest, die erfüllt sein muss, bevor eine Warnregel ausgelöst wird",
"description-configure-firing-alert-instances-routed-contact": "Konfigurieren Sie, wie ausgelöste Warnungsinstanzen an Kontaktpunkte weitergeleitet werden",
"description-configure-receives-notifications": "Konfigurieren Sie, wer Benachrichtigungen erhält und wie sie gesendet werden",
"href-text-alert-activity": "",
"title-alert-activity": "",
"title-alert-rules": "Warnregeln",
"title-contact-points": "Kontaktpunkte",
"title-notification-policies": "Benachrichtigungsrichtlinien"
@@ -14396,6 +14399,17 @@
"series-to-rows": "Reihen zu Zeilen"
}
},
"smoothing": {
"description": "",
"effective-resolution": "",
"effective-resolution-tooltip": "",
"is-applicable-description": "",
"name": "",
"resolution": {
"label": "",
"tooltip": ""
}
},
"sort-by-transformer-editor": {
"description": {
"sort-fields": "Sortieren Sie Felder in einem Frame."
@@ -5799,7 +5799,8 @@
"community-empty-title": "No community dashboards found",
"community-empty-title-with-datasource": "No {{datasourceType}} community dashboards found",
"community-error": "Failed to load community dashboards. Please try again.",
"community-error-title": "Error loading community dashboards",
"community-error-description": "Failed to load community dashboard.",
"community-error-title": "Error loading community dashboard",
"community-mapping-form": {
"auto-mapped_one": "{{count}} datasource was automatically configured:",
"auto-mapped_other": "{{count}} datasources were automatically configured:",
@@ -14399,6 +14400,17 @@
"series-to-rows": "Series to rows"
}
},
"smoothing": {
"description": "Reduce noise in time series data through adaptive downsampling.",
"effective-resolution": "Effective: {{value}}",
"effective-resolution-tooltip": "Resolution is limited to 2× the number of data points ({{points}}).",
"is-applicable-description": "The Smoothing transformation requires at least one time series frame to function. You currently have none.",
"name": "Smoothing",
"resolution": {
"label": "Resolution",
"tooltip": "Controls smoothing intensity. Lower values create more aggressive smoothing. Both original and smoothed data are displayed."
}
},
"sort-by-transformer-editor": {
"description": {
"sort-fields": "Sort fields in a frame."
@@ -3135,9 +3135,12 @@
"table": "Tabla"
},
"welcome-header": {
"description-alert-activity": "",
"description-alert-rules": "Define la condición que debe cumplirse antes de que se active una regla de alerta",
"description-configure-firing-alert-instances-routed-contact": "Configurar cómo se enrutan las instancias de alerta de activación a los puntos de contacto",
"description-configure-receives-notifications": "Configurar quién recibe las notificaciones y cómo se envían",
"href-text-alert-activity": "",
"title-alert-activity": "",
"title-alert-rules": "Reglas de alerta",
"title-contact-points": "Puntos de contacto",
"title-notification-policies": "Políticas de notificación"
@@ -14396,6 +14399,17 @@
"series-to-rows": "Series a filas"
}
},
"smoothing": {
"description": "",
"effective-resolution": "",
"effective-resolution-tooltip": "",
"is-applicable-description": "",
"name": "",
"resolution": {
"label": "",
"tooltip": ""
}
},
"sort-by-transformer-editor": {
"description": {
"sort-fields": "Ordenar campos en un marco."
@@ -3135,9 +3135,12 @@
"table": "Tableau"
},
"welcome-header": {
"description-alert-activity": "",
"description-alert-rules": "Définir la condition qui doit être remplie avant qu'une règle d'alerte ne se déclenche",
"description-configure-firing-alert-instances-routed-contact": "Configurer la façon dont les instances d'alerte de déclenchement sont acheminées vers les points de contact",
"description-configure-receives-notifications": "Configurer les destinataires des notifications et la manière dont elles sont envoyées",
"href-text-alert-activity": "",
"title-alert-activity": "",
"title-alert-rules": "Règles d'alerte",
"title-contact-points": "Points de contact",
"title-notification-policies": "Règles de notification"
@@ -14396,6 +14399,17 @@
"series-to-rows": "Convertir la série en lignes"
}
},
"smoothing": {
"description": "",
"effective-resolution": "",
"effective-resolution-tooltip": "",
"is-applicable-description": "",
"name": "",
"resolution": {
"label": "",
"tooltip": ""
}
},
"sort-by-transformer-editor": {
"description": {
"sort-fields": "Triez les champs dans une trame."
@@ -3135,9 +3135,12 @@
"table": "Táblázat"
},
"welcome-header": {
"description-alert-activity": "",
"description-alert-rules": "Határozza meg azt a feltételt, amelynek teljesülnie kell, mielőtt egy riasztási szabály aktiválódik",
"description-configure-firing-alert-instances-routed-contact": "Konfigurálja, hogyan történjen az aktív riasztáspéldányok továbbítása a kapcsolattartási pontokhoz",
"description-configure-receives-notifications": "Állítsa be, hogy ki kapjon értesítéseket, és hogyan legyenek elküldve",
"href-text-alert-activity": "",
"title-alert-activity": "",
"title-alert-rules": "Riasztási szabályok",
"title-contact-points": "Kapcsolattartási pontok",
"title-notification-policies": "Értesítési irányelvek"
@@ -14396,6 +14399,17 @@
"series-to-rows": "Sorozatok sorokká alakítása"
}
},
"smoothing": {
"description": "",
"effective-resolution": "",
"effective-resolution-tooltip": "",
"is-applicable-description": "",
"name": "",
"resolution": {
"label": "",
"tooltip": ""
}
},
"sort-by-transformer-editor": {
"description": {
"sort-fields": "Mezők rendezése egy keretben."
@@ -3124,9 +3124,12 @@
"table": "Tabel"
},
"welcome-header": {
"description-alert-activity": "",
"description-alert-rules": "Tentukan syarat yang harus dipenuhi sebelum aturan peringatan menyala",
"description-configure-firing-alert-instances-routed-contact": "Konfigurasikan cara instans peringatan yang menyala dirutekan ke titik kontak",
"description-configure-receives-notifications": "Konfigurasikan penerima pemberitahuan dan cara pemberitahuan dikirim",
"href-text-alert-activity": "",
"title-alert-activity": "",
"title-alert-rules": "Aturan peringatan",
"title-contact-points": "Titik kontak",
"title-notification-policies": "Kebijakan pemberitahuan"
@@ -14340,6 +14343,17 @@
"series-to-rows": "Deret ke baris"
}
},
"smoothing": {
"description": "",
"effective-resolution": "",
"effective-resolution-tooltip": "",
"is-applicable-description": "",
"name": "",
"resolution": {
"label": "",
"tooltip": ""
}
},
"sort-by-transformer-editor": {
"description": {
"sort-fields": "Urutkan bidang dalam bingkai."
@@ -3135,9 +3135,12 @@
"table": "Tabella"
},
"welcome-header": {
"description-alert-activity": "",
"description-alert-rules": "Definisci la condizione che deve essere soddisfatta prima che venga attivata una regola di avviso",
"description-configure-firing-alert-instances-routed-contact": "Configura il modo in cui le istanze di avviso attivate vengono instradate ai punti di contatto",
"description-configure-receives-notifications": "Configura chi riceve le notifiche e come vengono inviate",
"href-text-alert-activity": "",
"title-alert-activity": "",
"title-alert-rules": "Regole di avviso",
"title-contact-points": "Punti di contatto",
"title-notification-policies": "Politiche di notifica"
@@ -14396,6 +14399,17 @@
"series-to-rows": "Serie a righe"
}
},
"smoothing": {
"description": "",
"effective-resolution": "",
"effective-resolution-tooltip": "",
"is-applicable-description": "",
"name": "",
"resolution": {
"label": "",
"tooltip": ""
}
},
"sort-by-transformer-editor": {
"description": {
"sort-fields": "Ordina i campi in un frame."
@@ -3124,9 +3124,12 @@
"table": "テーブル"
},
"welcome-header": {
"description-alert-activity": "",
"description-alert-rules": "アラートルールが発生される前に満たすべき条件を定義します",
"description-configure-firing-alert-instances-routed-contact": "発生中のアラートインスタンスを連絡先にルーティングする方法を設定",
"description-configure-receives-notifications": "通知の受信者と送信方法を設定",
"href-text-alert-activity": "",
"title-alert-activity": "",
"title-alert-rules": "アラートルール",
"title-contact-points": "コンタクトポイント",
"title-notification-policies": "通知ポリシー"
@@ -14340,6 +14343,17 @@
"series-to-rows": "系列を行に変換"
}
},
"smoothing": {
"description": "",
"effective-resolution": "",
"effective-resolution-tooltip": "",
"is-applicable-description": "",
"name": "",
"resolution": {
"label": "",
"tooltip": ""
}
},
"sort-by-transformer-editor": {
"description": {
"sort-fields": "フレーム内のフィールドを並べ替えます。"
@@ -3124,9 +3124,12 @@
"table": "표"
},
"welcome-header": {
"description-alert-activity": "",
"description-alert-rules": "경고 규칙이 발동되기 전에 충족되어야 하는 조건을 정의합니다",
"description-configure-firing-alert-instances-routed-contact": "경고 발생 인스턴스가 연락처로 라우팅되는 방식을 구성합니다",
"description-configure-receives-notifications": "알림을 받는 사람과 전송 방식을 구성합니다",
"href-text-alert-activity": "",
"title-alert-activity": "",
"title-alert-rules": "경고 규칙",
"title-contact-points": "연락처",
"title-notification-policies": "알림 정책"
@@ -14340,6 +14343,17 @@
"series-to-rows": "계열에서 행으로"
}
},
"smoothing": {
"description": "",
"effective-resolution": "",
"effective-resolution-tooltip": "",
"is-applicable-description": "",
"name": "",
"resolution": {
"label": "",
"tooltip": ""
}
},
"sort-by-transformer-editor": {
"description": {
"sort-fields": "프레임에서 필드를 정렬합니다."
@@ -3135,9 +3135,12 @@
"table": "Tabel"
},
"welcome-header": {
"description-alert-activity": "",
"description-alert-rules": "Definieer de voorwaarde waaraan moet worden voldaan voordat een waarschuwingsregel geactiveerd wordt",
"description-configure-firing-alert-instances-routed-contact": "Configureer hoe geactiveerde waarschuwingsinstanties worden gerouteerd naar contactpunten",
"description-configure-receives-notifications": "Configureer wie meldingen ontvangt en hoe deze worden verzonden",
"href-text-alert-activity": "",
"title-alert-activity": "",
"title-alert-rules": "Waarschuwingsregels",
"title-contact-points": "Contactpunten",
"title-notification-policies": "Meldingsbeleid"
@@ -14396,6 +14399,17 @@
"series-to-rows": "Reeksen naar rijen"
}
},
"smoothing": {
"description": "",
"effective-resolution": "",
"effective-resolution-tooltip": "",
"is-applicable-description": "",
"name": "",
"resolution": {
"label": "",
"tooltip": ""
}
},
"sort-by-transformer-editor": {
"description": {
"sort-fields": "Velden in een frame sorteren."
@@ -3157,9 +3157,12 @@
"table": "Tabela"
},
"welcome-header": {
"description-alert-activity": "",
"description-alert-rules": "Zdefiniuj warunek, który musi zostać spełniony przed uruchomieniem reguły alertu",
"description-configure-firing-alert-instances-routed-contact": "Skonfiguruj, w jaki sposób uruchamiane instancje alertów są kierowane do punktów kontaktu",
"description-configure-receives-notifications": "Skonfiguruj, kto otrzymuje powiadomienia i jak są one wysyłane",
"href-text-alert-activity": "",
"title-alert-activity": "",
"title-alert-rules": "Reguły alertu",
"title-contact-points": "Punkty kontaktowe",
"title-notification-policies": "Zasady powiadamiania"
@@ -14508,6 +14511,17 @@
"series-to-rows": "Szeregi do wierszy"
}
},
"smoothing": {
"description": "",
"effective-resolution": "",
"effective-resolution-tooltip": "",
"is-applicable-description": "",
"name": "",
"resolution": {
"label": "",
"tooltip": ""
}
},
"sort-by-transformer-editor": {
"description": {
"sort-fields": "Sortuj pola w ramce."
@@ -3135,9 +3135,12 @@
"table": "Tabela"
},
"welcome-header": {
"description-alert-activity": "",
"description-alert-rules": "Defina a condição que deve ser atendida antes que uma regra de alerta seja acionada",
"description-configure-firing-alert-instances-routed-contact": "Configure como as instâncias de alertas ativos são encaminhadas para os pontos de contato",
"description-configure-receives-notifications": "Configure quem recebe notificações e como elas são enviadas",
"href-text-alert-activity": "",
"title-alert-activity": "",
"title-alert-rules": "Regras de alerta",
"title-contact-points": "Pontos de contato",
"title-notification-policies": "Política de notificações"
@@ -14396,6 +14399,17 @@
"series-to-rows": "Série para fileiras"
}
},
"smoothing": {
"description": "",
"effective-resolution": "",
"effective-resolution-tooltip": "",
"is-applicable-description": "",
"name": "",
"resolution": {
"label": "",
"tooltip": ""
}
},
"sort-by-transformer-editor": {
"description": {
"sort-fields": "Classificar os campos em um quadro."
@@ -3135,9 +3135,12 @@
"table": "Tabela"
},
"welcome-header": {
"description-alert-activity": "",
"description-alert-rules": "Definir a condição que deve ser cumprida antes de uma regra de alerta ser acionada",
"description-configure-firing-alert-instances-routed-contact": "Configurar como as instâncias de alerta de ativação são encaminhadas para os pontos de contacto",
"description-configure-receives-notifications": "Configurar quem recebe notificações e como são enviadas",
"href-text-alert-activity": "",
"title-alert-activity": "",
"title-alert-rules": "Regras de alerta",
"title-contact-points": "Pontos de contacto",
"title-notification-policies": "Políticas de notificação"
@@ -14396,6 +14399,17 @@
"series-to-rows": "Série para linhas"
}
},
"smoothing": {
"description": "",
"effective-resolution": "",
"effective-resolution-tooltip": "",
"is-applicable-description": "",
"name": "",
"resolution": {
"label": "",
"tooltip": ""
}
},
"sort-by-transformer-editor": {
"description": {
"sort-fields": "Ordenar campos num quadro."
@@ -3157,9 +3157,12 @@
"table": "Таблица"
},
"welcome-header": {
"description-alert-activity": "",
"description-alert-rules": "Задайте условие, которое должно быть выполнено до того, как активируется правило оповещения.",
"description-configure-firing-alert-instances-routed-contact": "Установите способ направления активных экземпляров оповещений в точки контакта.",
"description-configure-receives-notifications": "Установите получателей уведомлений и способы их отправки.",
"href-text-alert-activity": "",
"title-alert-activity": "",
"title-alert-rules": "Правила оповещения",
"title-contact-points": "Точки контакта",
"title-notification-policies": "Политики уведомления"
@@ -14508,6 +14511,17 @@
"series-to-rows": "Ряды в строки"
}
},
"smoothing": {
"description": "",
"effective-resolution": "",
"effective-resolution-tooltip": "",
"is-applicable-description": "",
"name": "",
"resolution": {
"label": "",
"tooltip": ""
}
},
"sort-by-transformer-editor": {
"description": {
"sort-fields": "Сортируйте поля в фрейме."
@@ -3135,9 +3135,12 @@
"table": "Tabell"
},
"welcome-header": {
"description-alert-activity": "",
"description-alert-rules": "Definiera villkoret som måste uppfyllas innan en larmregel utlöses",
"description-configure-firing-alert-instances-routed-contact": "Konfigurera hur utlösta larminstanser dirigeras till kontaktpunkter",
"description-configure-receives-notifications": "Konfigurera vem som tar emot aviseringar och hur de skickas",
"href-text-alert-activity": "",
"title-alert-activity": "",
"title-alert-rules": "Varningsregler",
"title-contact-points": "Kontaktpunkter",
"title-notification-policies": "Aviseringspolicyer"
@@ -14396,6 +14399,17 @@
"series-to-rows": "Serie till rader"
}
},
"smoothing": {
"description": "",
"effective-resolution": "",
"effective-resolution-tooltip": "",
"is-applicable-description": "",
"name": "",
"resolution": {
"label": "",
"tooltip": ""
}
},
"sort-by-transformer-editor": {
"description": {
"sort-fields": "Sortera fält i en ram."
@@ -3135,9 +3135,12 @@
"table": "Tablo"
},
"welcome-header": {
"description-alert-activity": "",
"description-alert-rules": "Bir uyarı kuralının tetiklenmesi için karşılanması gereken koşulu tanımlayın",
"description-configure-firing-alert-instances-routed-contact": "Tetiklenen uyarı örneklerinin iletişim noktalarına nasıl yönlendirileceğini yapılandırın",
"description-configure-receives-notifications": "Bildirimlerin kime gönderileceğini ve nasıl gönderileceğini yapılandırın",
"href-text-alert-activity": "",
"title-alert-activity": "",
"title-alert-rules": "Uyarı kuralları",
"title-contact-points": "İletişim noktaları",
"title-notification-policies": "Bildirim politikaları"
@@ -14396,6 +14399,17 @@
"series-to-rows": "Seriden satırlara"
}
},
"smoothing": {
"description": "",
"effective-resolution": "",
"effective-resolution-tooltip": "",
"is-applicable-description": "",
"name": "",
"resolution": {
"label": "",
"tooltip": ""
}
},
"sort-by-transformer-editor": {
"description": {
"sort-fields": "Alanları bir çerçevede sıralayın."
@@ -3124,9 +3124,12 @@
"table": "表格"
},
"welcome-header": {
"description-alert-activity": "",
"description-alert-rules": "定义警报规则触发前必须满足的条件",
"description-configure-firing-alert-instances-routed-contact": "配置如何将触发的警报实例路由到联络点",
"description-configure-receives-notifications": "配置接收通知的人员以及通知发送方式",
"href-text-alert-activity": "",
"title-alert-activity": "",
"title-alert-rules": "警报规则",
"title-contact-points": "联络点",
"title-notification-policies": "通知策略"
@@ -14340,6 +14343,17 @@
"series-to-rows": "序列到行"
}
},
"smoothing": {
"description": "",
"effective-resolution": "",
"effective-resolution-tooltip": "",
"is-applicable-description": "",
"name": "",
"resolution": {
"label": "",
"tooltip": ""
}
},
"sort-by-transformer-editor": {
"description": {
"sort-fields": "对帧中的字段进行排序。"
@@ -3124,9 +3124,12 @@
"table": "表格"
},
"welcome-header": {
"description-alert-activity": "",
"description-alert-rules": "定義警報規則觸發前必須滿足的條件",
"description-configure-firing-alert-instances-routed-contact": "設定如何將觸發的警報執行個體傳送至聯絡點",
"description-configure-receives-notifications": "設定接收通知的對象以及傳送方式",
"href-text-alert-activity": "",
"title-alert-activity": "",
"title-alert-rules": "警報規則",
"title-contact-points": "聯絡點",
"title-notification-policies": "通知政策"
@@ -14340,6 +14343,17 @@
"series-to-rows": "將序列轉為列"
}
},
"smoothing": {
"description": "",
"effective-resolution": "",
"effective-resolution-tooltip": "",
"is-applicable-description": "",
"name": "",
"resolution": {
"label": "",
"tooltip": ""
}
},
"sort-by-transformer-editor": {
"description": {
"sort-fields": "對框架中的欄位進行排序。"
@@ -16497,6 +16497,13 @@ __metadata:
languageName: node
linkType: hard
"downsample@npm:1.4.0":
version: 1.4.0
resolution: "downsample@npm:1.4.0"
checksum: 10/ad0ab937e368546b577b564b13d7f39cd85a92bf29d56562aaa6ed10bac19e91ee75ab58f38050a9e8bf601c1abcfda942541880a84c89ba78d1775a229636d1
languageName: node
linkType: hard
"downshift@npm:^9.0.6":
version: 9.0.10
resolution: "downshift@npm:9.0.10"
@@ -19629,6 +19636,7 @@ __metadata:
date-fns: "npm:4.1.0"
debounce-promise: "npm:3.1.2"
diff: "npm:^8.0.0"
downsample: "npm:1.4.0"
enquirer: "npm:^2.4.1"
esbuild: "npm:0.25.8"
esbuild-loader: "npm:4.3.0"