Compare commits


53 Commits

Author SHA1 Message Date
alexandra vargas afdbc63250 Add dashboard extraction test 2026-01-14 12:09:58 +01:00
alexandra vargas f67bd022be refactor tests to be consistent with the rest, use require 2026-01-14 10:33:55 +01:00
alexandra vargas ad989ae200 Improve dashboard detection and add unit test 2026-01-14 10:17:47 +01:00
alexandra vargas fe6c2cdfee Create unit tests for parser.go 2026-01-13 17:04:28 +01:00
alexandra vargas 84b081ce37 Merge branch 'axelav/dash-validator-app-mvp' into axelav/dash-validator-app-prometheus-poc 2026-01-13 15:20:06 +01:00
alexandra vargas d4ae044801 Merge remote-tracking branch 'origin' into axelav/dash-validator-app-mvp 2026-01-13 14:58:15 +01:00
Misi c9a14f1774 IAM: Target resource authorization for TeamBinding (#116117)
* wip

* Review VerbGet vs VerbGetPermissions

* Fix tests
2026-01-13 14:45:18 +01:00
Will Browne d2b788eb53 Plugins: Remove angular details from meta API (#116194)
remove angular details from meta API
2026-01-13 13:41:40 +00:00
Ashley Harrison dffae66fdc Storybook: Add workflow to deploy canary storybook (#116138)
* add first attempt at storybook deploy action for canary

* don't run on push to main yet!

* add CODEOWNER
2026-01-13 13:10:04 +00:00
alerting-team[bot] 5dbbe8164b Alerting: Update alerting module to 98a49ed9557fd9b5f33ecb77cbaa0748f13dc568 (#116197)
* [create-pull-request] automated change

* update prometheus-alertmanager

---------

Co-authored-by: titolins <8942194+titolins@users.noreply.github.com>
Co-authored-by: Tito Lins <tito.linsesilva@grafana.com>
2026-01-13 12:27:35 +00:00
Tobias Skarhed d1064da4cd Scopes: Add RTK Query API client for caching (#115494)
* Scopes API client

* Initial RTK query commit

* Copy API client from generated enterprise folder

* Mock ScopesApiClient for integration tests

* Update e2e tests

* Handle group expansion for dashboard navigation

* Extract integration test mocks

* Move mock to only be for integration tests

* Update path for enterprise sync script

* Re-export mockData

* Disregard caching for search

* Leave name parameters empty

* Disable subscriptions for client requests

* Add functionality to reset cache between mocked requests

* Use grafana-test-utils for scopes integration tests

* Rollback mock setup

* Remove store from window object

* Remove cache helper

* Restore scopenode search functionality

* Improve request error handling

* Clean up subscription in case subscription: false lies

* Fix logging security risk

* Rewrite tests to cover RTK query usage and improve error catching

* Update USE_LIVE_DATA to be consistent

* Remove unused timeout parameter

* Fix error handling

* Make dashboard-navigation test pass
2026-01-13 13:09:08 +01:00
Tito Lins b57b8d4359 fix: handle go mod issues (#116187) 2026-01-13 12:48:16 +01:00
Mustafa Sencer Özcan 5219ccddb6 fix: improve resilience for unified storage and search service grpc clients (#116122)
* fix: reliability

* fix: resilience

* fix: add connection backoff

* fix: reduce backoff
2026-01-13 11:42:21 +00:00
Ashley Harrison c95e3da2d5 Theme: Convert themes to json and define schemas using zod (#116006)
* convert all theme files to json

* automatically discover extra themes in go backend

* use zod

* error tidy up

* error tidy up p2

* generate theme json schema from zod

* generate theme list at build time, don't do it at runtime

* make name and id required in the theme schema
2026-01-13 11:13:11 +00:00
Gareth 43d9fbc056 Tempo: Fix search streaming queries (#116136)
* Tempo: Fix search queries

* apply variables for metrics streaming queries
2026-01-13 19:47:44 +09:00
Konrad Lalik 7b80c44ac7 Alerting: Fix label value search not filtering results (#116133)
Fixes the issue where typing in the label value dropdown would display
all values instead of filtering them based on the search input.

The bug was in `createAsyncValuesLoader` which was ignoring the
`valueQuery` parameter and returning all combined values instead of
the filtered subset.

Changes:
- Rename `_inputValue` parameter to `valueQuery` to indicate it should be used
- Filter combined values based on case-insensitive search query
- Return only filtered values instead of all values

Tests:
- Add test to verify correct values are shown for each label key
- Add test to verify search filtering works correctly
- Improve test infrastructure with proper portal container and element mocking
  for virtualized dropdown rendering
2026-01-13 11:43:07 +01:00
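The fix described above is in TypeScript in the actual PR; a Go sketch of the intended filtering behavior, with `filterValues` as an illustrative stand-in for `createAsyncValuesLoader`, could look like:

```go
package main

import (
	"fmt"
	"strings"
)

// filterValues returns only the values matching the search query,
// case-insensitively. This mirrors the fix: the previous loader
// ignored the query and returned all combined values.
func filterValues(values []string, valueQuery string) []string {
	q := strings.ToLower(valueQuery)
	var out []string
	for _, v := range values {
		if strings.Contains(strings.ToLower(v), q) {
			out = append(out, v)
		}
	}
	return out
}

func main() {
	fmt.Println(filterValues([]string{"prod", "staging", "Dev"}, "d"))
}
```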
alexandra vargas 7038ced64e Merge branch 'axelav/dash-validator-app-mvp' into axelav/dash-validator-app-prometheus-poc 2026-01-13 11:38:03 +01:00
Rafael Bortolon Paulovic 98f271f345 chore(unified): remove unifiedStorageSearchSprinkles feature toggle (#116139)
chore: remove unifiedStorageSearchSprinkles feature flag

The feature flag is no longer needed because:
- OSS: usageinsights code doesn't exist in OSS builds
- Enterprise On-Prem: uses local SQL storage when enable_search=true
- Cloud: explicitly configures sprinkles_api_server URL

The sprinkles functionality now works automatically based on:
- enable_search config (enforced true for unified storage mode 5)
- sprinkles_api_server config (empty = local storage, set = remote API)
2026-01-13 11:24:13 +01:00
Vardan Torosyan 60c4fab063 [Docs] Add Synthetic Monitoring app to the list of RBAC supported apps (#116167)
* [Docs] Add Synthetic Monitoring app to the list of RBAC supported apps

* Run prettier
2026-01-13 11:23:33 +01:00
alexandra vargas f117691340 Merge remote-tracking branch 'origin' into axelav/dash-validator-app-mvp 2026-01-13 11:13:53 +01:00
alexandra vargas c99eb8c62e Add dashvalidator app to Dockerfile for Go workspace validation 2026-01-13 11:13:32 +01:00
Ihor Yeromin ce8663ac24 SQL Expressions: Filter Dashboard datasource queries from schema fetching (#116129)
* fix(sql expression): sql schema frontend datasources filtering

* add one more test
2026-01-13 10:26:33 +01:00
Yulia Shanyrova 5dd9a14903 Plugins: Fix the flaky configuration tab on the plugin details page for cloud instances (#114922)
Fix flaky configuration tab for plugin details page on cloud instances
2026-01-13 09:55:52 +01:00
Roberto Jiménez Sánchez 68bf19d840 Provisioning: handle resource version conflicts in connection CRUDL test (#116184)
fix: handle resource version conflicts in connection CRUDL test

After updating a connection resource, the controller may update the
resource status, changing the resource version. This causes the delete
operation to fail with a resource version conflict.

Add retry logic to handle conflicts gracefully by retrying the delete
operation when encountering resource version conflicts.
2026-01-13 08:53:54 +00:00
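The retry logic described in this commit can be sketched as below; `deleteWithRetry` and `errConflict` are hypothetical stand-ins for the test's actual helpers (Kubernetes clients commonly wrap this pattern in client-go's `retry.RetryOnConflict`):

```go
package main

import (
	"errors"
	"fmt"
)

var errConflict = errors.New("resource version conflict")

// deleteWithRetry retries the delete when the server reports a
// resource version conflict, since the controller may have bumped
// the resource version by writing status between update and delete.
func deleteWithRetry(del func() error, maxAttempts int) error {
	var err error
	for i := 0; i < maxAttempts; i++ {
		if err = del(); !errors.Is(err, errConflict) {
			return err // success, or a non-retryable error
		}
	}
	return err
}

func main() {
	calls := 0
	err := deleteWithRetry(func() error {
		calls++
		if calls < 3 {
			return errConflict
		}
		return nil
	}, 5)
	fmt.Println(calls, err)
}
```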
Costa Alexoglou 220c29de89 fix: 401 in grafana live spam (#116140) 2026-01-13 09:46:06 +01:00
Oscar Kilhed 91ab753368 Dynamic Dashboards: Fix navigation to repeated panels and update outline when lazy items repeat (#116030)
Dashboard Outline: Fix navigation to repeated panels and lazy-loaded repeats

- Remove cursor: not-allowed styling from repeated panels in outline
- Add RepeatsUpdatedEvent to notify when panel repeats are populated
- Subscribe to RepeatsUpdatedEvent in DashboardEditPane to refresh outline
- Remove memoization from visibleChildren to ensure outline updates on re-render
2026-01-13 08:43:50 +01:00
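The publish/subscribe pattern described above (panels publish `RepeatsUpdatedEvent` once repeats are populated; the edit pane subscribes and refreshes the outline) can be sketched as a minimal event bus. The real implementation lives in Grafana Scenes (TypeScript), so this Go version is illustrative only:

```go
package main

import "fmt"

// RepeatsUpdatedEvent signals that a panel's repeats were populated.
// The name mirrors the commit message; the struct fields are invented.
type RepeatsUpdatedEvent struct{ PanelID int }

type bus struct{ subs []func(RepeatsUpdatedEvent) }

func (b *bus) subscribe(fn func(RepeatsUpdatedEvent)) {
	b.subs = append(b.subs, fn)
}

func (b *bus) publish(e RepeatsUpdatedEvent) {
	for _, fn := range b.subs {
		fn(e)
	}
}

func main() {
	b := &bus{}
	// The edit pane's subscription: refresh the outline on repeat updates.
	b.subscribe(func(e RepeatsUpdatedEvent) {
		fmt.Println("refresh outline for panel", e.PanelID)
	})
	b.publish(RepeatsUpdatedEvent{PanelID: 7})
}
```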
Alex Khomenko 250ca7985f Provisioning: Add Connections page (#116060)
* Provisioning: Add connections page

* Provisioning: Add connections form

* Provisioning: Add connections form

* Update fields

* Fix generated name

* Update connection name

* Add edit page

* error handling

* Form validation

* Add Connections button

* Cleanup

* Extract ConnectionFormData type

* Add list test and separate empty states

* Add form test

* Update tests

* i18n

* Cleanup

* Use SecretTextArea from grafana-ui

* Fix breadcrumbs

* tweaks

* Add missing URL

* Switch to ShowConfirmModalEvent

* i18n

* redirect to list on success

* add timeout

* Fix tags invalidation
2026-01-13 08:25:40 +02:00
Hugo Häggmark b57ed32484 chore: remove app/core/config barrel files (#116068) 2026-01-13 06:23:21 +01:00
Galen Kistler d0217588a3 LogsDrilldown: Remove exploreLogsLimitedTimeRange flag (#116177)
chore: remove flag
2026-01-12 22:43:01 +00:00
Denis Vodopianov ce9ab6a89a Add non-boolean feature flags support to the StaticProvider (#115085)
* initial commit

* add support for integers

* finalise the static provider

* minor refactoring

* the rest

* revert:  the rest

* add new things

* more tests added

* add ff parsing tests to check if types are handled correctly

* update tests according to recent changes

* address golint issues

* Update pkg/setting/setting_feature_toggles.go

Co-authored-by: Dave Henderson <dave.henderson@grafana.com>

* fix rebase issues

* addressing review comments

* add test cases for enterprise

* handle enterprise cases

* minor refactoring to make api a bit easier to debug

* make test names a bit more precise

* fix linter

* add openfeature sdk to goleak ignore in testutil

* Remove only boolean check in ff gen tests

* add non-boolean types to the doc in default.ini and doc string in FeatureFlag type

* apply remarks, add docs to sample.ini

* reflect changes in feature flags in the public grafana configuration doc

* fix doc formatting

* apply suggestions to the doc file

---------

Co-authored-by: Dave Henderson <dave.henderson@grafana.com>
2026-01-12 22:53:23 +01:00
Will Assis 8c8efd2494 unified-storage: skip sqlkv/sqlbackend compatibility tests in sqlite (#116164) 2026-01-12 16:31:29 -05:00
alexandra vargas c7986976e4 Fix issue with remote prometheus api 2026-01-08 17:38:04 +01:00
alexandra vargas e4009a42a1 fix issue with raw results in the modal 2026-01-08 14:41:27 +01:00
alexandra vargas 06d11d739b Handle datasource variables in validator query grouping 2026-01-08 14:20:47 +01:00
alexandra vargas 74548dbb73 fix api endpoint with correct path, using getApiNamespace 2026-01-07 17:40:34 +01:00
alexandra vargas 751a399b03 Wire up compatibility check to CommunityDashboardSection 2026-01-07 16:02:57 +01:00
alexandra vargas c9e044b2c7 Add 'Check Compatibility' button to DashboardCard 2026-01-06 15:41:26 +01:00
alexandra vargas 92041e5a05 Create compatibility modal for mvp 2026-01-06 15:10:40 +01:00
alexandra vargas 6ee1a6ea7f remove file mistake 2026-01-06 14:10:42 +01:00
alexandra vargas 4f66b1df5a Implemented TypeScript API client for calling the dashboard validator backend from the Grafana frontend. 2026-01-06 14:09:29 +01:00
alexandra vargas 01f959be97 Improve error handling for validator
- Surface error codes for possible datasource errors (not found, unreachable, auth, timeout)
2026-01-06 13:27:34 +01:00
alexandra vargas f81deced02 Merge branch 'axelav/dash-validator-app-mvp' into axelav/dash-validator-app-prometheus-poc 2026-01-06 10:50:39 +01:00
alexandra vargas ca3bce54a8 Fix go workspace 2026-01-06 10:50:07 +01:00
alexandra vargas 3d3eeb4472 Merge remote-tracking branch 'origin' into axelav/dash-validator-app-mvp 2026-01-06 10:49:27 +01:00
alexandra vargas 2b6e2c5737 Merge branch 'axelav/dash-validator-app-mvp' into axelav/dash-validator-app-prometheus-poc 2026-01-05 16:55:59 +01:00
alexandra vargas 306aee16a5 Fix codeowners 2026-01-05 16:55:33 +01:00
alexandra vargas 8319f62ef4 Merge branch 'axelav/dash-validator-app-mvp' into axelav/dash-validator-app-prometheus-poc 2026-01-05 15:54:10 +01:00
alexandra vargas b8b792f78a Fix linting 2026-01-05 15:53:23 +01:00
alexandra vargas 8a0f2fa9f3 SuggestedDashboards: Extend dashvalidator POC with Prometheus validator 2025-12-31 17:06:49 +01:00
alexandra vargas 9d980a9244 Add 'check' endpoint to manifest, fix issue with grafana core dependencies 2025-12-31 15:14:46 +01:00
alexandra vargas e442720cdc Wire dashvalidator app with grafana 2025-12-31 14:39:45 +01:00
alexandra vargas e616d04010 Create app.go file and define app configuration, custom route and basic handler for that route 2025-12-31 13:48:48 +01:00
alexandra vargas 8e9675ce1c Create dashboard validator app
- Generate scaffolding
- Create DashboardCompatibilityScore Kind in CUE
2025-12-30 15:37:34 +01:00
297 changed files with 17187 additions and 3186 deletions
+3
@@ -101,6 +101,7 @@
/apps/example/ @grafana/grafana-app-platform-squad
/apps/logsdrilldown/ @grafana/observability-logs
/apps/annotation/ @grafana/grafana-backend-services-squad
/apps/dashvalidator/ @grafana/sharing-squad
/pkg/api/ @grafana/grafana-backend-group
/pkg/apis/ @grafana/grafana-app-platform-squad
/pkg/apis/query @grafana/grafana-datasources-core-services
@@ -1190,6 +1191,7 @@ embed.go @grafana/grafana-as-code
/pkg/registry/apps/advisor @grafana/plugins-platform-backend
/pkg/registry/apps/alerting @grafana/alerting-backend
/pkg/registry/apps/plugins @grafana/plugins-platform-backend
/pkg/registry/apps/dashvalidator @grafana/sharing-squad
/pkg/codegen/ @grafana/grafana-as-code
/pkg/codegen/generators @grafana/grafana-as-code
/pkg/kinds/*/*_gen.go @grafana/grafana-as-code
@@ -1275,6 +1277,7 @@ embed.go @grafana/grafana-as-code
/.github/workflows/i18n-crowdin-download.yml @grafana/grafana-frontend-platform
/.github/workflows/i18n-crowdin-create-tasks.yml @grafana/grafana-frontend-platform
/.github/workflows/i18n-verify.yml @grafana/grafana-frontend-platform
/.github/workflows/deploy-storybook.yml @grafana/grafana-frontend-platform
/.github/workflows/deploy-storybook-preview.yml @grafana/grafana-frontend-platform
/.github/workflows/scripts/crowdin/create-tasks.ts @grafana/grafana-frontend-platform
/.github/workflows/scripts/publish-frontend-metrics.mts @grafana/grafana-frontend-platform
+79
@@ -0,0 +1,79 @@
name: Deploy Storybook
on:
workflow_dispatch:
# push:
# branches:
# - main
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
permissions: {}
jobs:
detect-changes:
# Only run in grafana/grafana
if: github.repository == 'grafana/grafana'
name: Detect whether code changed
runs-on: ubuntu-latest
permissions:
contents: read
outputs:
changed-frontend-packages: ${{ steps.detect-changes.outputs.frontend-packages }}
steps:
- uses: actions/checkout@v5
with:
persist-credentials: true # required to get more history in the changed-files action
fetch-depth: 2
- name: Detect changes
id: detect-changes
uses: ./.github/actions/change-detection
with:
self: .github/workflows/deploy-storybook.yml
deploy-storybook:
name: Deploy Storybook
runs-on: ubuntu-latest
needs: detect-changes
# Only run in grafana/grafana
if: github.repository == 'grafana/grafana' && needs.detect-changes.outputs.changed-frontend-packages == 'true'
permissions:
contents: read
id-token: write
env:
BUCKET_NAME: grafana-storybook
steps:
- name: Checkout code
uses: actions/checkout@v5
with:
persist-credentials: false
- name: Setup Node.js
uses: ./.github/actions/setup-node
- name: Install dependencies
run: yarn install --immutable
- name: Build storybook
run: yarn storybook:build
# Create the GCS folder name
# Right now, this just returns "canary"
# But we'll expand this to work for "latest" as well in the future
- name: Create deploy name
id: create-deploy-name
run: |
echo "deploy-name=canary" >> "$GITHUB_OUTPUT"
- name: Upload Storybook
uses: grafana/shared-workflows/actions/push-to-gcs@main
with:
environment: prod
bucket: ${{ env.BUCKET_NAME }}
bucket_path: ${{ steps.create-deploy-name.outputs.deploy-name }}
path: packages/grafana-ui/dist/storybook
service_account: github-gf-storybook-deploy@grafanalabs-workload-identity.iam.gserviceaccount.com
parent: false
+1
@@ -107,6 +107,7 @@ COPY apps/scope apps/scope
COPY apps/logsdrilldown apps/logsdrilldown
COPY apps/advisor apps/advisor
COPY apps/dashboard apps/dashboard
COPY apps/dashvalidator apps/dashvalidator
COPY apps/folder apps/folder
COPY apps/iam apps/iam
COPY apps apps
+6 -1
@@ -135,7 +135,7 @@ i18n-extract-enterprise:
@echo "Skipping i18n extract for Enterprise: not enabled"
else
i18n-extract-enterprise:
@echo "Extracting i18n strings for Enterprise"
@echo "Extracting i18n strings for Enterprise"
cd public/locales/enterprise && yarn run i18next-cli extract --sync-primary
endif
@@ -227,6 +227,10 @@ fix-cue:
gen-jsonnet:
go generate ./devenv/jsonnet
.PHONY: gen-themes
gen-themes:
go generate ./pkg/services/preference
.PHONY: update-workspace
update-workspace: gen-go
@echo "updating workspace"
@@ -244,6 +248,7 @@ build-go-fast: ## Build all Go binaries without updating workspace.
.PHONY: build-backend
build-backend: ## Build Grafana backend.
@echo "build backend"
$(MAKE) gen-themes
$(GO) run build.go $(GO_BUILD_FLAGS) build-backend
.PHONY: build-air
+1 -1
@@ -4,7 +4,7 @@ go 1.25.5
require (
github.com/go-kit/log v0.2.1
github.com/grafana/alerting v0.0.0-20251231150637-b7821017d69f
github.com/grafana/alerting v0.0.0-20260112172717-98a49ed9557f
github.com/grafana/dskit v0.0.0-20250908063411-6b6da59b5cc4
github.com/grafana/grafana-app-sdk v0.48.7
github.com/grafana/grafana-app-sdk/logging v0.48.7
+2 -2
@@ -243,8 +243,8 @@ github.com/google/uuid v1.6.0 h1:NIvaJDMOsjHA8n1jAhLSgzrAzy1Hgr+hNrb57e+94F0=
github.com/google/uuid v1.6.0/go.mod h1:TIyPZe4MgqvfeYDBFedMoGGpEw/LqOeaOT+nhxU+yHo=
github.com/googleapis/gax-go/v2 v2.0.4/go.mod h1:0Wqv26UfaUD9n4G6kQubkQ+KchISgw+vpHVxEJEs9eg=
github.com/googleapis/gax-go/v2 v2.0.5/go.mod h1:DWXyrwAJ9X0FpwwEdw+IPEYBICEFu5mhpdKc/us6bOk=
github.com/grafana/alerting v0.0.0-20251231150637-b7821017d69f h1:Br4SaUL3dnVopKKNhDavCLgehw60jdtl/sIxdfzmVts=
github.com/grafana/alerting v0.0.0-20251231150637-b7821017d69f/go.mod h1:l7v67cgP7x72ajB9UPZlumdrHqNztpKoqQ52cU8T3LU=
github.com/grafana/alerting v0.0.0-20260112172717-98a49ed9557f h1:3bXOyht68qkfvD6Y8z8XoenFbytSSOIkr/s+AqRzj0o=
github.com/grafana/alerting v0.0.0-20260112172717-98a49ed9557f/go.mod h1:Ji0SfJChcwjgq8ljy6Y5CcYfHfAYKXjKYeysOoDS/6s=
github.com/grafana/dskit v0.0.0-20250908063411-6b6da59b5cc4 h1:jSojuc7njleS3UOz223WDlXOinmuLAIPI0z2vtq8EgI=
github.com/grafana/dskit v0.0.0-20250908063411-6b6da59b5cc4/go.mod h1:VahT+GtfQIM+o8ht2StR6J9g+Ef+C2Vokh5uuSmOD/4=
github.com/grafana/grafana-app-sdk v0.48.7 h1:9mF7nqkqP0QUYYDlznoOt+GIyjzj45wGfUHB32u2ZMo=
@@ -46,7 +46,7 @@
"x": 0,
"y": 0
},
"id": 23,
"id": 1,
"options": {
"content": "This dashboard demonstrates various monitoring components for application observability and performance metrics.\n",
"mode": "markdown"
@@ -77,7 +77,7 @@
"x": 0,
"y": 0
},
"id": 24,
"id": 23,
"panels": [],
"targets": [
{
@@ -31,6 +31,53 @@
"cursorSync": "Off",
"editable": false,
"elements": {
"panel-1": {
"kind": "Panel",
"spec": {
"id": 1,
"title": "Application Monitoring",
"description": "",
"links": [],
"data": {
"kind": "QueryGroup",
"spec": {
"queries": [
{
"kind": "PanelQuery",
"spec": {
"query": {
"kind": "prometheus",
"spec": {}
},
"datasource": {
"type": "prometheus",
"uid": "default-ds-uid"
},
"refId": "A",
"hidden": false
}
}
],
"transformations": [],
"queryOptions": {}
}
},
"vizConfig": {
"kind": "text",
"spec": {
"pluginVersion": "",
"options": {
"content": "This dashboard demonstrates various monitoring components for application observability and performance metrics.\n",
"mode": "markdown"
},
"fieldConfig": {
"defaults": {},
"overrides": []
}
}
}
}
},
"panel-10": {
"kind": "Panel",
"spec": {
@@ -977,53 +1024,6 @@
}
}
},
"panel-23": {
"kind": "Panel",
"spec": {
"id": 23,
"title": "Application Monitoring",
"description": "",
"links": [],
"data": {
"kind": "QueryGroup",
"spec": {
"queries": [
{
"kind": "PanelQuery",
"spec": {
"query": {
"kind": "prometheus",
"spec": {}
},
"datasource": {
"type": "prometheus",
"uid": "default-ds-uid"
},
"refId": "A",
"hidden": false
}
}
],
"transformations": [],
"queryOptions": {}
}
},
"vizConfig": {
"kind": "text",
"spec": {
"pluginVersion": "",
"options": {
"content": "This dashboard demonstrates various monitoring components for application observability and performance metrics.\n",
"mode": "markdown"
},
"fieldConfig": {
"defaults": {},
"overrides": []
}
}
}
}
},
"panel-6": {
"kind": "Panel",
"spec": {
@@ -1259,7 +1259,7 @@
"height": 3,
"element": {
"kind": "ElementReference",
"name": "panel-23"
"name": "panel-1"
}
}
}
@@ -32,6 +32,55 @@
"cursorSync": "Off",
"editable": false,
"elements": {
"panel-1": {
"kind": "Panel",
"spec": {
"id": 1,
"title": "Application Monitoring",
"description": "",
"links": [],
"data": {
"kind": "QueryGroup",
"spec": {
"queries": [
{
"kind": "PanelQuery",
"spec": {
"query": {
"kind": "DataQuery",
"group": "prometheus",
"version": "v0",
"datasource": {
"name": "default-ds-uid"
},
"spec": {}
},
"refId": "A",
"hidden": false
}
}
],
"transformations": [],
"queryOptions": {}
}
},
"vizConfig": {
"kind": "VizConfig",
"group": "text",
"version": "",
"spec": {
"options": {
"content": "This dashboard demonstrates various monitoring components for application observability and performance metrics.\n",
"mode": "markdown"
},
"fieldConfig": {
"defaults": {},
"overrides": []
}
}
}
}
},
"panel-10": {
"kind": "Panel",
"spec": {
@@ -1018,55 +1067,6 @@
}
}
},
"panel-23": {
"kind": "Panel",
"spec": {
"id": 23,
"title": "Application Monitoring",
"description": "",
"links": [],
"data": {
"kind": "QueryGroup",
"spec": {
"queries": [
{
"kind": "PanelQuery",
"spec": {
"query": {
"kind": "DataQuery",
"group": "prometheus",
"version": "v0",
"datasource": {
"name": "default-ds-uid"
},
"spec": {}
},
"refId": "A",
"hidden": false
}
}
],
"transformations": [],
"queryOptions": {}
}
},
"vizConfig": {
"kind": "VizConfig",
"group": "text",
"version": "",
"spec": {
"options": {
"content": "This dashboard demonstrates various monitoring components for application observability and performance metrics.\n",
"mode": "markdown"
},
"fieldConfig": {
"defaults": {},
"overrides": []
}
}
}
}
},
"panel-6": {
"kind": "Panel",
"spec": {
@@ -1310,7 +1310,7 @@
"height": 3,
"element": {
"kind": "ElementReference",
"name": "panel-23"
"name": "panel-1"
}
}
}
@@ -432,21 +432,6 @@ func getPanels(dashboard map[string]interface{}) []map[string]interface{} {
}
}
// Also get panels from rows
if rows, ok := dashboard["rows"].([]interface{}); ok {
for _, rowInterface := range rows {
if row, ok := rowInterface.(map[string]interface{}); ok {
if rowPanels, ok := row["panels"].([]interface{}); ok {
for _, panelInterface := range rowPanels {
if panel, ok := panelInterface.(map[string]interface{}); ok {
panels = append(panels, panel)
}
}
}
}
}
}
return panels
}
@@ -46,8 +46,7 @@ func upgradeToGridLayout(dashboard map[string]interface{}) {
widthFactor := gridColumnCount / 12.0
// Find max panel ID (lines 1014-1021 in TS)
// Also check top-level panels which may have been assigned IDs by ensurePanelsHaveUniqueIds
maxPanelID := getMaxPanelID(dashboard, rows)
maxPanelID := getMaxPanelID(rows)
nextRowID := maxPanelID + 1
// Match frontend: dashboard.panels already exists with top-level panels
@@ -270,25 +269,10 @@ func (r *rowArea) getPanelPosition(panelHeight int, panelWidth int) map[string]i
return r.getPanelPosition(panelHeight, panelWidth)
}
func getMaxPanelID(dashboard map[string]interface{}, rows []interface{}) int {
func getMaxPanelID(rows []interface{}) int {
maxID := 0
hasValidID := false
// Check top-level panels first (these may have been assigned IDs by ensurePanelsHaveUniqueIds)
if panels, ok := dashboard["panels"].([]interface{}); ok {
for _, panelInterface := range panels {
if panel, ok := panelInterface.(map[string]interface{}); ok {
if id := GetIntValue(panel, "id", 0); id > 0 {
hasValidID = true
if id > maxID {
maxID = id
}
}
}
}
}
// Also check panels inside rows
for _, rowInterface := range rows {
if row, ok := rowInterface.(map[string]interface{}); ok {
if panels, ok := row["panels"].([]interface{}); ok {
@@ -40,7 +40,7 @@
"x": 0,
"y": 0
},
"id": 23,
"id": 1,
"options": {
"content": "This dashboard demonstrates various monitoring components for application observability and performance metrics.\n",
"mode": "markdown"
@@ -71,7 +71,7 @@
"x": 0,
"y": 0
},
"id": 24,
"id": 23,
"panels": [],
"targets": [
{
@@ -35,7 +35,7 @@
"x": 0,
"y": 0
},
"id": 23,
"id": 1,
"options": {
"content": "This dashboard demonstrates various monitoring components for application observability and performance metrics.\n",
"mode": "markdown"
@@ -51,7 +51,7 @@
"x": 0,
"y": 0
},
"id": 24,
"id": 23,
"panels": [],
"title": "Application Service",
"type": "row"
+9
@@ -0,0 +1,9 @@
include ../sdk.mk
.PHONY: generate # Run Grafana App SDK code generation
generate: install-app-sdk update-app-sdk
@$(APP_SDK_BIN) generate \
--source=./kinds/ \
--gogenpath=./pkg/apis \
--grouping=group \
--defencoding=none
@@ -0,0 +1,228 @@
{
"kind": "CustomResourceDefinition",
"apiVersion": "apiextensions.k8s.io/v1",
"metadata": {
"name": "dashboardcompatibilityscores.dashvalidator.ext.grafana.com"
},
"spec": {
"group": "dashvalidator.ext.grafana.com",
"versions": [
{
"name": "v1alpha1",
"served": true,
"storage": true,
"schema": {
"openAPIV3Schema": {
"properties": {
"spec": {
"properties": {
"dashboardJson": {
"description": "Complete dashboard JSON object to validate.\nMust be a v1 dashboard schema (contains \"panels\" array).\nv2 dashboards (with \"elements\" structure) are not yet supported.",
"type": "object",
"x-kubernetes-preserve-unknown-fields": true
},
"datasourceMappings": {
"description": "Array of datasources to validate against.\nThe validator will check dashboard queries against each datasource\nand provide per-datasource compatibility results.\n\nMVP: Only single datasource supported (array length = 1), Prometheus type only.\nFuture: Will support multiple datasources for dashboards with mixed queries.",
"items": {
"description": "DataSourceMapping specifies a datasource to validate dashboard queries against.\nMaps logical datasource references in the dashboard to actual datasource instances.",
"properties": {
"name": {
"description": "Optional human-readable name for display in results.\nIf not provided, UID will be used in error messages.\nExample: \"Production Prometheus (US-West)\"",
"type": "string"
},
"type": {
"description": "Type of datasource plugin.\nMVP: Only \"prometheus\" supported.\nFuture: \"mysql\", \"postgres\", \"elasticsearch\", etc.",
"type": "string"
},
"uid": {
"description": "Unique identifier of the datasource instance.\nExample: \"prometheus-prod-us-west\"",
"type": "string"
}
},
"required": ["uid", "type"],
"type": "object"
},
"type": "array"
}
},
"required": ["dashboardJson", "datasourceMappings"],
"type": "object"
},
"status": {
"properties": {
"additionalFields": {
"description": "additionalFields is reserved for future use",
"type": "object",
"x-kubernetes-preserve-unknown-fields": true
},
"compatibilityScore": {
"description": "Overall compatibility score across all datasources (0-100).\nCalculated as: (total found metrics / total referenced metrics) * 100\n\nScore interpretation:\n- 100: Perfect compatibility, all queries will work\n- 80-99: Excellent, minor missing metrics\n- 50-79: Fair, significant missing metrics\n- 0-49: Poor, most queries will fail",
"type": "number"
},
"datasourceResults": {
"description": "Per-datasource validation results.\nArray length matches spec.datasourceMappings.\nEach element contains detailed metrics and query-level breakdown.",
"items": {
"description": "DataSourceResult contains validation results for a single datasource.\nProvides aggregate statistics and per-query breakdown of compatibility.",
"properties": {
"checkedQueries": {
"description": "Number of queries successfully validated.\nMay be less than totalQueries if some queries couldn't be parsed.",
"type": "integer"
},
"compatibilityScore": {
"description": "Overall compatibility score for this datasource (0-100).\nCalculated as: (foundMetrics / totalMetrics) * 100\nUsed to calculate the global compatibilityScore in status.",
"type": "number"
},
"foundMetrics": {
"description": "Number of metrics that exist in the datasource schema.\nfoundMetrics \u003c= totalMetrics",
"type": "integer"
},
"missingMetrics": {
"description": "Array of metric names that were referenced but don't exist.\nUseful for debugging why a dashboard shows \"no data\".\nExample for Prometheus: [\"http_requests_total\", \"api_latency_seconds\"]",
"items": {
"type": "string"
},
"type": "array"
},
"name": {
"description": "Optional display name (matches DataSourceMapping.name if provided)",
"type": "string"
},
"queryBreakdown": {
"description": "Per-query breakdown showing which specific queries have issues.\nOne entry per query target (refId: \"A\", \"B\", \"C\", etc.) in each panel.\nAllows pinpointing exactly which panel/query needs fixing.",
"items": {
"description": "QueryBreakdown provides compatibility details for a single query within a panel.\nGranular per-query results allow users to identify exactly which queries need fixing.\n\nNote: A panel can have multiple queries (refId: \"A\", \"B\", \"C\", etc.),\nso there may be multiple QueryBreakdown entries for the same panelID.",
"properties": {
"compatibilityScore": {
"description": "Compatibility percentage for this individual query (0-100).\nCalculated as: (foundMetrics / totalMetrics) * 100\n100 = query will work perfectly, 0 = query will return no data.",
"type": "number"
},
"foundMetrics": {
"description": "Number of those metrics that exist in the datasource.\nfoundMetrics \u003c= totalMetrics",
"type": "integer"
},
"missingMetrics": {
"description": "Array of missing metric names specific to this query.\nHelps identify exactly which part of a query expression will fail.\nEmpty array means query is fully compatible.",
"items": {
"type": "string"
},
"type": "array"
},
"panelID": {
"description": "Numeric panel ID from dashboard JSON.\nUsed to correlate with dashboard structure.",
"type": "integer"
},
"panelTitle": {
"description": "Human-readable panel title for context.\nExample: \"CPU Usage\", \"Request Rate\"",
"type": "string"
},
"queryRefId": {
"description": "Query identifier within the panel.\nValues: \"A\", \"B\", \"C\", etc. (from panel.targets[].refId)\nUniquely identifies which query in a multi-query panel this refers to.",
"type": "string"
},
"totalMetrics": {
"description": "Number of unique metrics referenced in this specific query.\nFor Prometheus: metrics extracted from the PromQL expr.\nExample: rate(http_requests_total[5m]) references 1 metric.",
"type": "integer"
}
},
"required": [
"panelTitle",
"panelID",
"queryRefId",
"totalMetrics",
"foundMetrics",
"missingMetrics",
"compatibilityScore"
],
"type": "object"
},
"type": "array"
},
"totalMetrics": {
"description": "Total number of unique metrics/identifiers referenced across all queries.\nFor Prometheus: metric names extracted from PromQL expressions.\nFor SQL datasources: table and column names.",
"type": "integer"
},
"totalQueries": {
"description": "Total number of queries in the dashboard targeting this datasource.\nIncludes all panel targets/queries that reference this datasource.",
"type": "integer"
},
"type": {
"description": "Datasource type (matches DataSourceMapping.type)",
"type": "string"
},
"uid": {
"description": "Datasource UID that was validated (matches DataSourceMapping.uid)",
"type": "string"
}
},
"required": [
"uid",
"type",
"totalQueries",
"checkedQueries",
"totalMetrics",
"foundMetrics",
"missingMetrics",
"queryBreakdown",
"compatibilityScore"
],
"type": "object"
},
"type": "array"
},
"lastChecked": {
"description": "ISO 8601 timestamp of when validation was last performed.\nExample: \"2024-01-15T10:30:00Z\"",
"type": "string"
},
"message": {
"description": "Human-readable summary of validation result.\nExamples: \"All queries compatible\", \"3 missing metrics found\"",
"type": "string"
},
"operatorStates": {
"additionalProperties": {
"properties": {
"descriptiveState": {
"description": "descriptiveState is an optional more descriptive state field which has no requirements on format",
"type": "string"
},
"details": {
"description": "details contains any extra information that is operator-specific",
"type": "object",
"x-kubernetes-preserve-unknown-fields": true
},
"lastEvaluation": {
"description": "lastEvaluation is the ResourceVersion last evaluated",
"type": "string"
},
"state": {
"description": "state describes the state of the lastEvaluation.\nIt is limited to three possible states for machine evaluation.",
"enum": ["success", "in_progress", "failed"],
"type": "string"
}
},
"required": ["lastEvaluation", "state"],
"type": "object"
},
"description": "operatorStates is a map of operator ID to operator state evaluations.\nAny operator which consumes this kind SHOULD add its state evaluation information to this field.",
"type": "object"
}
},
"required": ["compatibilityScore", "datasourceResults"],
"type": "object"
}
},
"required": ["spec"],
"type": "object"
}
},
"subresources": {
"status": {}
}
}
],
"names": {
"kind": "DashboardCompatibilityScore",
"plural": "dashboardcompatibilityscores"
},
"scope": "Namespaced"
}
}
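Putting the schema above together, a minimal `DashboardCompatibilityScore` resource might look like the following. This is an illustrative sketch only: it assumes the `dashvalidator.grafana.app` group from the manifest's `groupOverride`, and all field values are taken from the examples in the schema descriptions.

```json
{
  "apiVersion": "dashvalidator.grafana.app/v1alpha1",
  "kind": "DashboardCompatibilityScore",
  "metadata": { "name": "example-check", "namespace": "default" },
  "spec": {
    "dashboardJson": { "panels": [] },
    "datasourceMappings": [
      {
        "uid": "prometheus-prod-us-west",
        "type": "prometheus",
        "name": "Production Prometheus (US-West)"
      }
    ]
  },
  "status": {
    "compatibilityScore": 100,
    "datasourceResults": [],
    "message": "All queries compatible",
    "lastChecked": "2024-01-15T10:30:00Z"
  }
}
```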
@@ -0,0 +1,223 @@
{
"apiVersion": "apps.grafana.com/v1alpha1",
"kind": "AppManifest",
"metadata": {
"name": "dashvalidator"
},
"spec": {
"appName": "dashvalidator",
"group": "dashvalidator.ext.grafana.com",
"versions": [
{
"name": "v1alpha1",
"served": true,
"kinds": [
{
"kind": "DashboardCompatibilityScore",
"plural": "DashboardCompatibilityScores",
"scope": "Namespaced",
"schema": {
"spec": {
"properties": {
"dashboardJson": {
"description": "Complete dashboard JSON object to validate.\nMust be a v1 dashboard schema (contains \"panels\" array).\nv2 dashboards (with \"elements\" structure) are not yet supported.",
"type": "object",
"x-kubernetes-preserve-unknown-fields": true
},
"datasourceMappings": {
"description": "Array of datasources to validate against.\nThe validator will check dashboard queries against each datasource\nand provide per-datasource compatibility results.\n\nMVP: Only single datasource supported (array length = 1), Prometheus type only.\nFuture: Will support multiple datasources for dashboards with mixed queries.",
"items": {
"description": "DataSourceMapping specifies a datasource to validate dashboard queries against.\nMaps logical datasource references in the dashboard to actual datasource instances.",
"properties": {
"name": {
"description": "Optional human-readable name for display in results.\nIf not provided, UID will be used in error messages.\nExample: \"Production Prometheus (US-West)\"",
"type": "string"
},
"type": {
"description": "Type of datasource plugin.\nMVP: Only \"prometheus\" supported.\nFuture: \"mysql\", \"postgres\", \"elasticsearch\", etc.",
"type": "string"
},
"uid": {
"description": "Unique identifier of the datasource instance.\nExample: \"prometheus-prod-us-west\"",
"type": "string"
}
},
"required": ["uid", "type"],
"type": "object"
},
"type": "array"
}
},
"required": ["dashboardJson", "datasourceMappings"],
"type": "object"
},
"status": {
"properties": {
"additionalFields": {
"description": "additionalFields is reserved for future use",
"type": "object",
"x-kubernetes-preserve-unknown-fields": true
},
"compatibilityScore": {
"description": "Overall compatibility score across all datasources (0-100).\nCalculated as: (total found metrics / total referenced metrics) * 100\n\nScore interpretation:\n- 100: Perfect compatibility, all queries will work\n- 80-99: Excellent, minor missing metrics\n- 50-79: Fair, significant missing metrics\n- 0-49: Poor, most queries will fail",
"type": "number"
},
"datasourceResults": {
"description": "Per-datasource validation results.\nArray length matches spec.datasourceMappings.\nEach element contains detailed metrics and query-level breakdown.",
"items": {
"description": "DataSourceResult contains validation results for a single datasource.\nProvides aggregate statistics and per-query breakdown of compatibility.",
"properties": {
"checkedQueries": {
"description": "Number of queries successfully validated.\nMay be less than totalQueries if some queries couldn't be parsed.",
"type": "integer"
},
"compatibilityScore": {
"description": "Overall compatibility score for this datasource (0-100).\nCalculated as: (foundMetrics / totalMetrics) * 100\nUsed to calculate the global compatibilityScore in status.",
"type": "number"
},
"foundMetrics": {
"description": "Number of metrics that exist in the datasource schema.\nfoundMetrics \u003c= totalMetrics",
"type": "integer"
},
"missingMetrics": {
"description": "Array of metric names that were referenced but don't exist.\nUseful for debugging why a dashboard shows \"no data\".\nExample for Prometheus: [\"http_requests_total\", \"api_latency_seconds\"]",
"items": {
"type": "string"
},
"type": "array"
},
"name": {
"description": "Optional display name (matches DataSourceMapping.name if provided)",
"type": "string"
},
"queryBreakdown": {
"description": "Per-query breakdown showing which specific queries have issues.\nOne entry per query target (refId: \"A\", \"B\", \"C\", etc.) in each panel.\nAllows pinpointing exactly which panel/query needs fixing.",
"items": {
"description": "QueryBreakdown provides compatibility details for a single query within a panel.\nGranular per-query results allow users to identify exactly which queries need fixing.\n\nNote: A panel can have multiple queries (refId: \"A\", \"B\", \"C\", etc.),\nso there may be multiple QueryBreakdown entries for the same panelID.",
"properties": {
"compatibilityScore": {
"description": "Compatibility percentage for this individual query (0-100).\nCalculated as: (foundMetrics / totalMetrics) * 100\n100 = query will work perfectly, 0 = query will return no data.",
"type": "number"
},
"foundMetrics": {
"description": "Number of those metrics that exist in the datasource.\nfoundMetrics \u003c= totalMetrics",
"type": "integer"
},
"missingMetrics": {
"description": "Array of missing metric names specific to this query.\nHelps identify exactly which part of a query expression will fail.\nEmpty array means query is fully compatible.",
"items": {
"type": "string"
},
"type": "array"
},
"panelID": {
"description": "Numeric panel ID from dashboard JSON.\nUsed to correlate with dashboard structure.",
"type": "integer"
},
"panelTitle": {
"description": "Human-readable panel title for context.\nExample: \"CPU Usage\", \"Request Rate\"",
"type": "string"
},
"queryRefId": {
"description": "Query identifier within the panel.\nValues: \"A\", \"B\", \"C\", etc. (from panel.targets[].refId)\nUniquely identifies which query in a multi-query panel this refers to.",
"type": "string"
},
"totalMetrics": {
"description": "Number of unique metrics referenced in this specific query.\nFor Prometheus: metrics extracted from the PromQL expr.\nExample: rate(http_requests_total[5m]) references 1 metric.",
"type": "integer"
}
},
"required": [
"panelTitle",
"panelID",
"queryRefId",
"totalMetrics",
"foundMetrics",
"missingMetrics",
"compatibilityScore"
],
"type": "object"
},
"type": "array"
},
"totalMetrics": {
"description": "Total number of unique metrics/identifiers referenced across all queries.\nFor Prometheus: metric names extracted from PromQL expressions.\nFor SQL datasources: table and column names.",
"type": "integer"
},
"totalQueries": {
"description": "Total number of queries in the dashboard targeting this datasource.\nIncludes all panel targets/queries that reference this datasource.",
"type": "integer"
},
"type": {
"description": "Datasource type (matches DataSourceMapping.type)",
"type": "string"
},
"uid": {
"description": "Datasource UID that was validated (matches DataSourceMapping.uid)",
"type": "string"
}
},
"required": [
"uid",
"type",
"totalQueries",
"checkedQueries",
"totalMetrics",
"foundMetrics",
"missingMetrics",
"queryBreakdown",
"compatibilityScore"
],
"type": "object"
},
"type": "array"
},
"lastChecked": {
"description": "ISO 8601 timestamp of when validation was last performed.\nExample: \"2024-01-15T10:30:00Z\"",
"type": "string"
},
"message": {
"description": "Human-readable summary of validation result.\nExamples: \"All queries compatible\", \"3 missing metrics found\"",
"type": "string"
},
"operatorStates": {
"additionalProperties": {
"properties": {
"descriptiveState": {
"description": "descriptiveState is an optional more descriptive state field which has no requirements on format",
"type": "string"
},
"details": {
"description": "details contains any extra information that is operator-specific",
"type": "object",
"x-kubernetes-preserve-unknown-fields": true
},
"lastEvaluation": {
"description": "lastEvaluation is the ResourceVersion last evaluated",
"type": "string"
},
"state": {
"description": "state describes the state of the lastEvaluation.\nIt is limited to three possible states for machine evaluation.",
"enum": ["success", "in_progress", "failed"],
"type": "string"
}
},
"required": ["lastEvaluation", "state"],
"type": "object"
},
"description": "operatorStates is a map of operator ID to operator state evaluations.\nAny operator which consumes this kind SHOULD add its state evaluation information to this field.",
"type": "object"
}
},
"required": ["compatibilityScore", "datasourceResults"],
"type": "object"
}
},
"conversion": false
}
]
}
],
"preferredVersion": "v1alpha1"
}
}
@@ -0,0 +1,247 @@
module github.com/grafana/grafana/apps/dashvalidator

go 1.25.5

require (
github.com/grafana/grafana v0.0.0-00010101000000-000000000000
github.com/grafana/grafana-app-sdk v0.48.7
github.com/grafana/grafana-app-sdk/logging v0.48.7
github.com/prometheus/prometheus v0.303.1
k8s.io/apimachinery v0.34.3
k8s.io/kube-openapi v0.0.0-20251125145642-4e65d59e963e
)

require (
filippo.io/edwards25519 v1.1.0 // indirect
github.com/Machiel/slugify v1.0.1 // indirect
github.com/ProtonMail/go-crypto v1.1.6 // indirect
github.com/VividCortex/mysqlerr v0.0.0-20170204212430-6c6b55f8796f // indirect
github.com/antlr4-go/antlr/v4 v4.13.1 // indirect
github.com/apache/arrow-go/v18 v18.4.1 // indirect
github.com/armon/go-metrics v0.4.1 // indirect
github.com/aws/aws-sdk-go-v2 v1.39.1 // indirect
github.com/aws/aws-sdk-go-v2/credentials v1.18.14 // indirect
github.com/aws/aws-sdk-go-v2/internal/configsources v1.4.8 // indirect
github.com/aws/aws-sdk-go-v2/internal/endpoints/v2 v2.7.8 // indirect
github.com/aws/aws-sdk-go-v2/service/internal/accept-encoding v1.13.1 // indirect
github.com/aws/aws-sdk-go-v2/service/internal/presigned-url v1.13.8 // indirect
github.com/aws/aws-sdk-go-v2/service/sts v1.38.5 // indirect
github.com/aws/smithy-go v1.23.1 // indirect
github.com/barkimedes/go-deepcopy v0.0.0-20220514131651-17c30cfc62df // indirect
github.com/beorn7/perks v1.0.1 // indirect
github.com/blang/semver v3.5.1+incompatible // indirect
github.com/blang/semver/v4 v4.0.0 // indirect
github.com/bluele/gcache v0.0.2 // indirect
github.com/bradfitz/gomemcache v0.0.0-20250403215159-8d39553ac7cf // indirect
github.com/bwmarrin/snowflake v0.3.0 // indirect
github.com/cenkalti/backoff/v5 v5.0.3 // indirect
github.com/cespare/xxhash/v2 v2.3.0 // indirect
github.com/cheekybits/genny v1.0.0 // indirect
github.com/cloudflare/circl v1.6.1 // indirect
github.com/coreos/go-systemd/v22 v22.6.0 // indirect
github.com/davecgh/go-spew v1.1.2-0.20180830191138-d8f796af33cc // indirect
github.com/dennwc/varint v1.0.0 // indirect
github.com/diegoholiveira/jsonlogic/v3 v3.7.4 // indirect
github.com/dolthub/go-icu-regex v0.0.0-20250916051405-78a38d478790 // indirect
github.com/dustin/go-humanize v1.0.1 // indirect
github.com/emicklei/go-restful/v3 v3.13.0 // indirect
github.com/fatih/color v1.18.0 // indirect
github.com/fxamacker/cbor/v2 v2.9.0 // indirect
github.com/getkin/kin-openapi v0.133.0 // indirect
github.com/go-jose/go-jose/v4 v4.1.3 // indirect
github.com/go-kit/log v0.2.1 // indirect
github.com/go-logfmt/logfmt v0.6.1 // indirect
github.com/go-logr/logr v1.4.3 // indirect
github.com/go-logr/stdr v1.2.2 // indirect
github.com/go-openapi/jsonpointer v0.22.4 // indirect
github.com/go-openapi/jsonreference v0.21.4 // indirect
github.com/go-openapi/swag v0.25.4 // indirect
github.com/go-openapi/swag/cmdutils v0.25.4 // indirect
github.com/go-openapi/swag/conv v0.25.4 // indirect
github.com/go-openapi/swag/fileutils v0.25.4 // indirect
github.com/go-openapi/swag/jsonname v0.25.4 // indirect
github.com/go-openapi/swag/jsonutils v0.25.4 // indirect
github.com/go-openapi/swag/loading v0.25.4 // indirect
github.com/go-openapi/swag/mangling v0.25.4 // indirect
github.com/go-openapi/swag/netutils v0.25.4 // indirect
github.com/go-openapi/swag/stringutils v0.25.4 // indirect
github.com/go-openapi/swag/typeutils v0.25.4 // indirect
github.com/go-openapi/swag/yamlutils v0.25.4 // indirect
github.com/go-sql-driver/mysql v1.9.3 // indirect
github.com/go-stack/stack v1.8.1 // indirect
github.com/gobwas/glob v0.2.3 // indirect
github.com/goccy/go-json v0.10.5 // indirect
github.com/gogo/googleapis v1.4.1 // indirect
github.com/gogo/protobuf v1.3.2 // indirect
github.com/golang-jwt/jwt/v5 v5.3.0 // indirect
github.com/golang-migrate/migrate/v4 v4.7.0 // indirect
github.com/golang/protobuf v1.5.4 // indirect
github.com/google/btree v1.1.3 // indirect
github.com/google/flatbuffers v25.2.10+incompatible // indirect
github.com/google/gnostic-models v0.7.1 // indirect
github.com/google/go-cmp v0.7.0 // indirect
github.com/google/uuid v1.6.0 // indirect
github.com/grafana/alerting v0.0.0-20251231150637-b7821017d69f // indirect
github.com/grafana/authlib v0.0.0-20250930082137-a40e2c2b094f // indirect
github.com/grafana/authlib/types v0.0.0-20251119142549-be091cf2f4d4 // indirect
github.com/grafana/dataplane/sdata v0.0.9 // indirect
github.com/grafana/dskit v0.0.0-20250908063411-6b6da59b5cc4 // indirect
github.com/grafana/grafana-aws-sdk v1.3.0 // indirect
github.com/grafana/grafana-azure-sdk-go/v2 v2.3.1 // indirect
github.com/grafana/grafana-plugin-sdk-go v0.284.0 // indirect
github.com/grafana/grafana/pkg/apimachinery v0.0.0 // indirect
github.com/grafana/grafana/pkg/apiserver v0.0.0 // indirect
github.com/grafana/grafana/pkg/semconv v0.0.0-20250804150913-990f1c69ecc2 // indirect
github.com/grafana/otel-profiling-go v0.5.1 // indirect
github.com/grafana/pyroscope-go/godeltaprof v0.1.9 // indirect
github.com/grafana/regexp v0.0.0-20240518133315-a468a5bfb3bc // indirect
github.com/grafana/sqlds/v4 v4.2.7 // indirect
github.com/grpc-ecosystem/go-grpc-middleware/providers/prometheus v1.1.0 // indirect
github.com/grpc-ecosystem/go-grpc-middleware/v2 v2.3.3 // indirect
github.com/grpc-ecosystem/grpc-gateway/v2 v2.27.3 // indirect
github.com/hashicorp/errwrap v1.1.0 // indirect
github.com/hashicorp/go-hclog v1.6.3 // indirect
github.com/hashicorp/go-immutable-radix v1.3.1 // indirect
github.com/hashicorp/go-metrics v0.5.4 // indirect
github.com/hashicorp/go-msgpack/v2 v2.1.2 // indirect
github.com/hashicorp/go-multierror v1.1.1 // indirect
github.com/hashicorp/go-plugin v1.7.0 // indirect
github.com/hashicorp/go-sockaddr v1.0.7 // indirect
github.com/hashicorp/golang-lru v1.0.2 // indirect
github.com/hashicorp/golang-lru/v2 v2.0.7 // indirect
github.com/hashicorp/memberlist v0.5.2 // indirect
github.com/hashicorp/yamux v0.1.2 // indirect
github.com/jaegertracing/jaeger-idl v0.5.0 // indirect
github.com/jmespath-community/go-jmespath v1.1.1 // indirect
github.com/jmoiron/sqlx v1.4.0 // indirect
github.com/josharian/intern v1.0.0 // indirect
github.com/jpillora/backoff v1.0.0 // indirect
github.com/json-iterator/go v1.1.12 // indirect
github.com/jszwedko/go-datemath v0.1.1-0.20230526204004-640a500621d6 // indirect
github.com/klauspost/compress v1.18.0 // indirect
github.com/klauspost/cpuid/v2 v2.3.0 // indirect
github.com/lib/pq v1.10.9 // indirect
github.com/mailru/easyjson v0.9.1 // indirect
github.com/mattetti/filebuffer v1.0.1 // indirect
github.com/mattn/go-colorable v0.1.14 // indirect
github.com/mattn/go-isatty v0.0.20 // indirect
github.com/mattn/go-runewidth v0.0.16 // indirect
github.com/mattn/go-sqlite3 v1.14.32 // indirect
github.com/mdlayher/socket v0.4.1 // indirect
github.com/mdlayher/vsock v1.2.1 // indirect
github.com/miekg/dns v1.1.63 // indirect
github.com/mitchellh/go-homedir v1.1.0 // indirect
github.com/mithrandie/csvq v1.18.1 // indirect
github.com/mithrandie/csvq-driver v1.7.0 // indirect
github.com/mithrandie/go-file/v2 v2.1.0 // indirect
github.com/mithrandie/go-text v1.6.0 // indirect
github.com/mithrandie/ternary v1.1.1 // indirect
github.com/modern-go/concurrent v0.0.0-20180306012644-bacd9c7ef1dd // indirect
github.com/modern-go/reflect2 v1.0.3-0.20250322232337-35a7c28c31ee // indirect
github.com/mohae/deepcopy v0.0.0-20170929034955-c48cc78d4826 // indirect
github.com/munnerz/goautoneg v0.0.0-20191010083416-a7dc8b61c822 // indirect
github.com/mwitkow/go-conntrack v0.0.0-20190716064945-2f068394615f // indirect
github.com/ncruces/go-strftime v0.1.9 // indirect
github.com/nikunjy/rules v1.5.0 // indirect
github.com/oasdiff/yaml v0.0.0-20250309154309-f31be36b4037 // indirect
github.com/oasdiff/yaml3 v0.0.0-20250309153720-d2182401db90 // indirect
github.com/oklog/run v1.1.0 // indirect
github.com/oklog/ulid v1.3.1 // indirect
github.com/olekukonko/tablewriter v0.0.5 // indirect
github.com/open-feature/go-sdk v1.16.0 // indirect
github.com/open-feature/go-sdk-contrib/providers/go-feature-flag v0.2.6 // indirect
github.com/open-feature/go-sdk-contrib/providers/ofrep v0.1.6 // indirect
github.com/patrickmn/go-cache v2.1.0+incompatible // indirect
github.com/perimeterx/marshmallow v1.1.5 // indirect
github.com/pierrec/lz4/v4 v4.1.22 // indirect
github.com/pkg/errors v0.9.1 // indirect
github.com/pmezard/go-difflib v1.0.1-0.20181226105442-5d4384ee4fb2 // indirect
github.com/prometheus/alertmanager v0.28.2 // indirect
github.com/prometheus/client_golang v1.23.2 // indirect
github.com/prometheus/client_model v0.6.2 // indirect
github.com/prometheus/common v0.67.4 // indirect
github.com/prometheus/exporter-toolkit v0.14.0 // indirect
github.com/prometheus/procfs v0.19.2 // indirect
github.com/puzpuzpuz/xsync/v2 v2.5.1 // indirect
github.com/remyoudompheng/bigfft v0.0.0-20230129092748-24d4a6f8daec // indirect
github.com/rivo/uniseg v0.4.7 // indirect
github.com/sean-/seed v0.0.0-20170313163322-e2103e2c3529 // indirect
github.com/spf13/pflag v1.0.10 // indirect
github.com/stretchr/objx v0.5.2 // indirect
github.com/stretchr/testify v1.11.1 // indirect
github.com/thomaspoignant/go-feature-flag v1.42.0 // indirect
github.com/tjhop/slog-gokit v0.1.5 // indirect
github.com/woodsbury/decimal128 v1.4.0 // indirect
github.com/x448/float16 v0.8.4 // indirect
github.com/zeebo/xxh3 v1.0.2 // indirect
go.opentelemetry.io/auto/sdk v1.2.1 // indirect
go.opentelemetry.io/contrib/instrumentation/google.golang.org/grpc/otelgrpc v0.64.0 // indirect
go.opentelemetry.io/contrib/instrumentation/net/http/httptrace/otelhttptrace v0.63.0 // indirect
go.opentelemetry.io/contrib/propagators/jaeger v1.38.0 // indirect
go.opentelemetry.io/contrib/samplers/jaegerremote v0.32.0 // indirect
go.opentelemetry.io/otel v1.39.0 // indirect
go.opentelemetry.io/otel/exporters/jaeger v1.17.0 // indirect
go.opentelemetry.io/otel/exporters/otlp/otlptrace v1.39.0 // indirect
go.opentelemetry.io/otel/exporters/otlp/otlptrace/otlptracegrpc v1.39.0 // indirect
go.opentelemetry.io/otel/exporters/otlp/otlptrace/otlptracehttp v1.39.0 // indirect
go.opentelemetry.io/otel/metric v1.39.0 // indirect
go.opentelemetry.io/otel/sdk v1.39.0 // indirect
go.opentelemetry.io/otel/trace v1.39.0 // indirect
go.opentelemetry.io/proto/otlp v1.9.0 // indirect
go.uber.org/atomic v1.11.0 // indirect
go.uber.org/mock v0.6.0 // indirect
go.yaml.in/yaml/v2 v2.4.3 // indirect
go.yaml.in/yaml/v3 v3.0.4 // indirect
golang.org/x/crypto v0.46.0 // indirect
golang.org/x/exp v0.0.0-20251209150349-8475f28825e9 // indirect
golang.org/x/mod v0.31.0 // indirect
golang.org/x/net v0.48.0 // indirect
golang.org/x/oauth2 v0.34.0 // indirect
golang.org/x/sync v0.19.0 // indirect
golang.org/x/sys v0.39.0 // indirect
golang.org/x/telemetry v0.0.0-20251203150158-8fff8a5912fc // indirect
golang.org/x/term v0.38.0 // indirect
golang.org/x/text v0.32.0 // indirect
golang.org/x/time v0.14.0 // indirect
golang.org/x/tools v0.40.0 // indirect
golang.org/x/xerrors v0.0.0-20240903120638-7835f813f4da // indirect
gomodules.xyz/jsonpatch/v2 v2.5.0 // indirect
google.golang.org/genproto/googleapis/api v0.0.0-20251213004720-97cd9d5aeac2 // indirect
google.golang.org/genproto/googleapis/rpc v0.0.0-20251213004720-97cd9d5aeac2 // indirect
google.golang.org/grpc v1.77.0 // indirect
google.golang.org/protobuf v1.36.11 // indirect
gopkg.in/inf.v0 v0.9.1 // indirect
gopkg.in/ini.v1 v1.67.0 // indirect
gopkg.in/yaml.v2 v2.4.0 // indirect
gopkg.in/yaml.v3 v3.0.1 // indirect
k8s.io/api v0.34.3 // indirect
k8s.io/apiextensions-apiserver v0.34.3 // indirect
k8s.io/apiserver v0.34.3 // indirect
k8s.io/client-go v0.34.3 // indirect
k8s.io/component-base v0.34.3 // indirect
k8s.io/klog/v2 v2.130.1 // indirect
k8s.io/utils v0.0.0-20251002143259-bc988d571ff4 // indirect
modernc.org/libc v1.66.10 // indirect
modernc.org/mathutil v1.7.1 // indirect
modernc.org/memory v1.11.0 // indirect
modernc.org/sqlite v1.40.1 // indirect
sigs.k8s.io/json v0.0.0-20250730193827-2d320260d730 // indirect
sigs.k8s.io/randfill v1.0.0 // indirect
sigs.k8s.io/structured-merge-diff/v6 v6.3.1 // indirect
sigs.k8s.io/yaml v1.6.0 // indirect
xorm.io/builder v0.3.13 // indirect
)

// transitive dependencies that need to be replaced
// TODO: stop depending on grafana core
replace github.com/grafana/grafana => ../..
replace github.com/grafana/grafana/pkg/apimachinery => ../../pkg/apimachinery
replace github.com/grafana/grafana/pkg/apiserver => ../../pkg/apiserver
replace github.com/grafana/grafana/apps/dashboard => ../dashboard
replace github.com/grafana/grafana/apps/provisioning => ../provisioning
replace github.com/prometheus/alertmanager => github.com/grafana/prometheus-alertmanager v0.25.1-0.20250911094103-5456b6e45604
File diff suppressed because it is too large
@@ -0,0 +1,2 @@
module: "github.com/grafana/grafana/apps/dashvalidator/kinds"
language: version: "v0.8.2"
@@ -0,0 +1,157 @@
package kinds
// DashboardCompatibilityScore validates whether a dashboard's queries
// are compatible with the target datasource schema.
//
// This resource checks if metrics, tables, or other identifiers referenced
// in dashboard queries actually exist in the configured datasources,
// helping users identify dashboards that will show "no data" before deployment.
//
// MVP: Prometheus datasource only; architecture supports future datasource types.
dashboardcompatibilityscorev0alpha1: {
kind: "DashboardCompatibilityScore"
plural: "dashboardcompatibilityscores"
scope: "Namespaced"
schema: {
spec: {
// Complete dashboard JSON object to validate.
// Must be a v1 dashboard schema (contains "panels" array).
// v2 dashboards (with "elements" structure) are not yet supported.
dashboardJson: {...}
// Array of datasources to validate against.
// The validator will check dashboard queries against each datasource
// and provide per-datasource compatibility results.
//
// MVP: Only single datasource supported (array length = 1), Prometheus type only.
// Future: Will support multiple datasources for dashboards with mixed queries.
datasourceMappings: [...#DataSourceMapping]
}
status: {
// Overall compatibility score across all datasources (0-100).
// Calculated as: (total found metrics / total referenced metrics) * 100
//
// Score interpretation:
// - 100: Perfect compatibility, all queries will work
// - 80-99: Excellent, minor missing metrics
// - 50-79: Fair, significant missing metrics
// - 0-49: Poor, most queries will fail
compatibilityScore: float64
// Per-datasource validation results.
// Array length matches spec.datasourceMappings.
// Each element contains detailed metrics and query-level breakdown.
datasourceResults: [...#DataSourceResult]
// ISO 8601 timestamp of when validation was last performed.
// Example: "2024-01-15T10:30:00Z"
lastChecked?: string
// Human-readable summary of validation result.
// Examples: "All queries compatible", "3 missing metrics found"
message?: string
}
}
}
// DataSourceMapping specifies a datasource to validate dashboard queries against.
// Maps logical datasource references in the dashboard to actual datasource instances.
#DataSourceMapping: {
// Unique identifier of the datasource instance.
// Example: "prometheus-prod-us-west"
uid: string
// Type of datasource plugin.
// MVP: Only "prometheus" supported.
// Future: "mysql", "postgres", "elasticsearch", etc.
type: string
// Optional human-readable name for display in results.
// If not provided, UID will be used in error messages.
// Example: "Production Prometheus (US-West)"
name?: string
}
// DataSourceResult contains validation results for a single datasource.
// Provides aggregate statistics and per-query breakdown of compatibility.
#DataSourceResult: {
// Datasource UID that was validated (matches DataSourceMapping.uid)
uid: string
// Datasource type (matches DataSourceMapping.type)
type: string
// Optional display name (matches DataSourceMapping.name if provided)
name?: string
// Total number of queries in the dashboard targeting this datasource.
// Includes all panel targets/queries that reference this datasource.
totalQueries: int
// Number of queries successfully validated.
// May be less than totalQueries if some queries couldn't be parsed.
checkedQueries: int
// Total number of unique metrics/identifiers referenced across all queries.
// For Prometheus: metric names extracted from PromQL expressions.
// For SQL datasources: table and column names.
totalMetrics: int
// Number of metrics that exist in the datasource schema.
// foundMetrics <= totalMetrics
foundMetrics: int
// Array of metric names that were referenced but don't exist.
// Useful for debugging why a dashboard shows "no data".
// Example for Prometheus: ["http_requests_total", "api_latency_seconds"]
missingMetrics: [...string]
// Per-query breakdown showing which specific queries have issues.
// One entry per query target (refId: "A", "B", "C", etc.) in each panel.
// Allows pinpointing exactly which panel/query needs fixing.
queryBreakdown: [...#QueryBreakdown]
// Overall compatibility score for this datasource (0-100).
// Calculated as: (foundMetrics / totalMetrics) * 100
// Used to calculate the global compatibilityScore in status.
compatibilityScore: float64
}
// QueryBreakdown provides compatibility details for a single query within a panel.
// Granular per-query results allow users to identify exactly which queries need fixing.
//
// Note: A panel can have multiple queries (refId: "A", "B", "C", etc.),
// so there may be multiple QueryBreakdown entries for the same panelID.
#QueryBreakdown: {
// Human-readable panel title for context.
// Example: "CPU Usage", "Request Rate"
panelTitle: string
// Numeric panel ID from dashboard JSON.
// Used to correlate with dashboard structure.
panelID: int
// Query identifier within the panel.
// Values: "A", "B", "C", etc. (from panel.targets[].refId)
// Uniquely identifies which query in a multi-query panel this refers to.
queryRefId: string
// Number of unique metrics referenced in this specific query.
// For Prometheus: metrics extracted from the PromQL expr.
// Example: rate(http_requests_total[5m]) references 1 metric.
totalMetrics: int
// Number of those metrics that exist in the datasource.
// foundMetrics <= totalMetrics
foundMetrics: int
// Array of missing metric names specific to this query.
// Helps identify exactly which part of a query expression will fail.
// Empty array means query is fully compatible.
missingMetrics: [...string]
// Compatibility percentage for this individual query (0-100).
// Calculated as: (foundMetrics / totalMetrics) * 100
// 100 = query will work perfectly, 0 = query will return no data.
compatibilityScore: float64
}
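The scoring formula repeated throughout these kinds, `(foundMetrics / totalMetrics) * 100`, and the interpretation bands listed in the status schema can be sketched in Go as follows. This is a minimal illustration, not the validator's actual implementation; in particular, treating a query with zero referenced metrics as fully compatible is an assumption the schema does not spell out.

```go
package main

import "fmt"

// compatibilityScore implements the documented formula:
// (foundMetrics / totalMetrics) * 100.
// A query that references no metrics has nothing that can fail, so it is
// treated here as fully compatible (an assumption; the schema does not
// define the zero-metric case).
func compatibilityScore(foundMetrics, totalMetrics int) float64 {
	if totalMetrics == 0 {
		return 100
	}
	return float64(foundMetrics) / float64(totalMetrics) * 100
}

// interpret maps a score to the bands described in the status schema.
func interpret(score float64) string {
	switch {
	case score == 100:
		return "Perfect compatibility, all queries will work"
	case score >= 80:
		return "Excellent, minor missing metrics"
	case score >= 50:
		return "Fair, significant missing metrics"
	default:
		return "Poor, most queries will fail"
	}
}

func main() {
	fmt.Println(compatibilityScore(3, 4)) // 75
	fmt.Println(interpret(75))
}
```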
@@ -0,0 +1,110 @@
package kinds
manifest: {
// appName is the unique name of your app. It is used to reference the app from other config objects,
// and to generate the group used by your app in the app platform API.
appName: "dashvalidator"
// groupOverride can be used to specify a non-appName-based API group.
// By default, an app's API group is LOWER(REPLACE(appName, '-', '')).ext.grafana.com,
// but there are cases where this needs to be changed.
// Keep in mind that changing this after an app is deployed can cause problems with clients and/or kind data.
groupOverride: "dashvalidator.grafana.app"
// versions is a map of versions supported by your app. Version names should follow the format "v<integer>" or
// "v<integer>(alpha|beta)<integer>". Each version contains the kinds your app manages for that version.
// If your app needs access to kinds managed by another app, use permissions.accessKinds to allow your app access.
versions: {
"v1alpha1": v1alpha1
}
// extraPermissions contains any additional permissions your app may require to function.
// Your app will always have all permissions for each kind it manages (the items defined in 'kinds').
extraPermissions: {
// If your app needs access to additional kinds supplied by other apps, you can list them here
accessKinds: [
// Here is an example for your app accessing the playlist kind for reads and watch
// {
// group: "playlist.grafana.app"
// resource: "playlists"
// actions: ["get","list","watch"]
// }
]
}
}
// v1alpha1 is the v1alpha1 version of the app's API.
// It includes kinds which the v1alpha1 API serves, and (future) custom routes served globally from the v1alpha1 version.
v1alpha1: {
// kinds is the list of kinds served by this version
kinds: [dashboardcompatibilityscorev0alpha1]
// [OPTIONAL]
// served indicates whether this particular version is served by the API server.
// served should be set to false before a version is removed from the manifest entirely.
// served defaults to true if not present.
served: true
// [OPTIONAL]
// Codegen is a trait that tells the grafana-app-sdk, or other code generation tooling, how to process this kind.
// If not present, default values within the codegen trait are used.
// If you wish to specify codegen per-version, put this section in the version's object
	// (for example, v1alpha1) instead.
routes: {
namespaced: {
"/check": {
"POST": {
request: {
body: {
dashboardJson: {...}
datasourceMappings: [...{
uid: string
type: string
name?: string
}]
}
}
response: {
compatibilityScore: number
datasourceResults: [...{
uid: string
type: string
name?: string
totalQueries: int
checkedQueries: int
totalMetrics: int
foundMetrics: int
missingMetrics: [...string]
queryBreakdown: [...{
panelTitle: string
panelID: int
queryRefId: string
totalMetrics: int
foundMetrics: int
missingMetrics: [...string]
compatibilityScore: number
}]
compatibilityScore: number
}]
}
}
}
}
		cluster: {}
}
codegen: {
// [OPTIONAL]
// ts contains TypeScript code generation properties for the kind
ts: {
// [OPTIONAL]
// enabled indicates whether the CLI should generate front-end TypeScript code for the kind.
// Defaults to true if not present.
enabled: true
}
// [OPTIONAL]
// go contains go code generation properties for the kind
go: {
// [OPTIONAL]
// enabled indicates whether the CLI should generate back-end go code for the kind.
// Defaults to true if not present.
enabled: true
}
}
}
@@ -0,0 +1,18 @@
package v1alpha1
import "k8s.io/apimachinery/pkg/runtime/schema"
const (
// APIGroup is the API group used by all kinds in this package
APIGroup = "dashvalidator.grafana.app"
// APIVersion is the API version used by all kinds in this package
APIVersion = "v1alpha1"
)
var (
// GroupVersion is a schema.GroupVersion consisting of the Group and Version constants for this package
GroupVersion = schema.GroupVersion{
Group: APIGroup,
Version: APIVersion,
}
)
@@ -0,0 +1,27 @@
// Code generated - EDITING IS FUTILE. DO NOT EDIT.
package v1alpha1
type CreateCheckRequestBody struct {
DashboardJson map[string]any `json:"dashboardJson"`
DatasourceMappings []CreateCheckRequestV1alpha1BodyDatasourceMappings `json:"datasourceMappings"`
}
// NewCreateCheckRequestBody creates a new CreateCheckRequestBody object.
func NewCreateCheckRequestBody() *CreateCheckRequestBody {
return &CreateCheckRequestBody{
DashboardJson: map[string]any{},
DatasourceMappings: []CreateCheckRequestV1alpha1BodyDatasourceMappings{},
}
}
type CreateCheckRequestV1alpha1BodyDatasourceMappings struct {
Uid string `json:"uid"`
Type string `json:"type"`
Name *string `json:"name,omitempty"`
}
// NewCreateCheckRequestV1alpha1BodyDatasourceMappings creates a new CreateCheckRequestV1alpha1BodyDatasourceMappings object.
func NewCreateCheckRequestV1alpha1BodyDatasourceMappings() *CreateCheckRequestV1alpha1BodyDatasourceMappings {
return &CreateCheckRequestV1alpha1BodyDatasourceMappings{}
}
@@ -0,0 +1,56 @@
// Code generated - EDITING IS FUTILE. DO NOT EDIT.
package v1alpha1
// +k8s:openapi-gen=true
type CreateCheckBody struct {
CompatibilityScore float64 `json:"compatibilityScore"`
DatasourceResults []V1alpha1CreateCheckBodyDatasourceResults `json:"datasourceResults"`
}
// NewCreateCheckBody creates a new CreateCheckBody object.
func NewCreateCheckBody() *CreateCheckBody {
return &CreateCheckBody{
DatasourceResults: []V1alpha1CreateCheckBodyDatasourceResults{},
}
}
// +k8s:openapi-gen=true
type V1alpha1CreateCheckBodyDatasourceResultsQueryBreakdown struct {
PanelTitle string `json:"panelTitle"`
PanelID int64 `json:"panelID"`
QueryRefId string `json:"queryRefId"`
TotalMetrics int64 `json:"totalMetrics"`
FoundMetrics int64 `json:"foundMetrics"`
MissingMetrics []string `json:"missingMetrics"`
CompatibilityScore float64 `json:"compatibilityScore"`
}
// NewV1alpha1CreateCheckBodyDatasourceResultsQueryBreakdown creates a new V1alpha1CreateCheckBodyDatasourceResultsQueryBreakdown object.
func NewV1alpha1CreateCheckBodyDatasourceResultsQueryBreakdown() *V1alpha1CreateCheckBodyDatasourceResultsQueryBreakdown {
return &V1alpha1CreateCheckBodyDatasourceResultsQueryBreakdown{
MissingMetrics: []string{},
}
}
// +k8s:openapi-gen=true
type V1alpha1CreateCheckBodyDatasourceResults struct {
Uid string `json:"uid"`
Type string `json:"type"`
Name *string `json:"name,omitempty"`
TotalQueries int64 `json:"totalQueries"`
CheckedQueries int64 `json:"checkedQueries"`
TotalMetrics int64 `json:"totalMetrics"`
FoundMetrics int64 `json:"foundMetrics"`
MissingMetrics []string `json:"missingMetrics"`
QueryBreakdown []V1alpha1CreateCheckBodyDatasourceResultsQueryBreakdown `json:"queryBreakdown"`
CompatibilityScore float64 `json:"compatibilityScore"`
}
// NewV1alpha1CreateCheckBodyDatasourceResults creates a new V1alpha1CreateCheckBodyDatasourceResults object.
func NewV1alpha1CreateCheckBodyDatasourceResults() *V1alpha1CreateCheckBodyDatasourceResults {
return &V1alpha1CreateCheckBodyDatasourceResults{
MissingMetrics: []string{},
QueryBreakdown: []V1alpha1CreateCheckBodyDatasourceResultsQueryBreakdown{},
}
}
@@ -0,0 +1,37 @@
// Code generated - EDITING IS FUTILE. DO NOT EDIT.
package v1alpha1
import (
"github.com/grafana/grafana-app-sdk/resource"
metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
"k8s.io/apimachinery/pkg/runtime"
)
// +k8s:openapi-gen=true
type CreateCheck struct {
metav1.TypeMeta `json:",inline"`
CreateCheckBody `json:",inline"`
}
func NewCreateCheck() *CreateCheck {
return &CreateCheck{}
}
func (t *CreateCheckBody) DeepCopyInto(dst *CreateCheckBody) {
_ = resource.CopyObjectInto(dst, t)
}
func (o *CreateCheck) DeepCopyObject() runtime.Object {
dst := NewCreateCheck()
o.DeepCopyInto(dst)
return dst
}
func (o *CreateCheck) DeepCopyInto(dst *CreateCheck) {
dst.TypeMeta.APIVersion = o.TypeMeta.APIVersion
dst.TypeMeta.Kind = o.TypeMeta.Kind
o.CreateCheckBody.DeepCopyInto(&dst.CreateCheckBody)
}
var _ runtime.Object = NewCreateCheck()
@@ -0,0 +1,99 @@
package v1alpha1
import (
"context"
"github.com/grafana/grafana-app-sdk/resource"
metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
)
type DashboardCompatibilityScoreClient struct {
client *resource.TypedClient[*DashboardCompatibilityScore, *DashboardCompatibilityScoreList]
}
func NewDashboardCompatibilityScoreClient(client resource.Client) *DashboardCompatibilityScoreClient {
return &DashboardCompatibilityScoreClient{
client: resource.NewTypedClient[*DashboardCompatibilityScore, *DashboardCompatibilityScoreList](client, DashboardCompatibilityScoreKind()),
}
}
func NewDashboardCompatibilityScoreClientFromGenerator(generator resource.ClientGenerator) (*DashboardCompatibilityScoreClient, error) {
c, err := generator.ClientFor(DashboardCompatibilityScoreKind())
if err != nil {
return nil, err
}
return NewDashboardCompatibilityScoreClient(c), nil
}
func (c *DashboardCompatibilityScoreClient) Get(ctx context.Context, identifier resource.Identifier) (*DashboardCompatibilityScore, error) {
return c.client.Get(ctx, identifier)
}
func (c *DashboardCompatibilityScoreClient) List(ctx context.Context, namespace string, opts resource.ListOptions) (*DashboardCompatibilityScoreList, error) {
return c.client.List(ctx, namespace, opts)
}
func (c *DashboardCompatibilityScoreClient) ListAll(ctx context.Context, namespace string, opts resource.ListOptions) (*DashboardCompatibilityScoreList, error) {
resp, err := c.client.List(ctx, namespace, resource.ListOptions{
ResourceVersion: opts.ResourceVersion,
Limit: opts.Limit,
LabelFilters: opts.LabelFilters,
FieldSelectors: opts.FieldSelectors,
})
if err != nil {
return nil, err
}
for resp.GetContinue() != "" {
page, err := c.client.List(ctx, namespace, resource.ListOptions{
Continue: resp.GetContinue(),
ResourceVersion: opts.ResourceVersion,
Limit: opts.Limit,
LabelFilters: opts.LabelFilters,
FieldSelectors: opts.FieldSelectors,
})
if err != nil {
return nil, err
}
resp.SetContinue(page.GetContinue())
resp.SetResourceVersion(page.GetResourceVersion())
resp.SetItems(append(resp.GetItems(), page.GetItems()...))
}
return resp, nil
}
func (c *DashboardCompatibilityScoreClient) Create(ctx context.Context, obj *DashboardCompatibilityScore, opts resource.CreateOptions) (*DashboardCompatibilityScore, error) {
// Make sure apiVersion and kind are set
obj.APIVersion = GroupVersion.Identifier()
obj.Kind = DashboardCompatibilityScoreKind().Kind()
return c.client.Create(ctx, obj, opts)
}
func (c *DashboardCompatibilityScoreClient) Update(ctx context.Context, obj *DashboardCompatibilityScore, opts resource.UpdateOptions) (*DashboardCompatibilityScore, error) {
return c.client.Update(ctx, obj, opts)
}
func (c *DashboardCompatibilityScoreClient) Patch(ctx context.Context, identifier resource.Identifier, req resource.PatchRequest, opts resource.PatchOptions) (*DashboardCompatibilityScore, error) {
return c.client.Patch(ctx, identifier, req, opts)
}
func (c *DashboardCompatibilityScoreClient) UpdateStatus(ctx context.Context, identifier resource.Identifier, newStatus DashboardCompatibilityScoreStatus, opts resource.UpdateOptions) (*DashboardCompatibilityScore, error) {
return c.client.Update(ctx, &DashboardCompatibilityScore{
TypeMeta: metav1.TypeMeta{
Kind: DashboardCompatibilityScoreKind().Kind(),
APIVersion: GroupVersion.Identifier(),
},
ObjectMeta: metav1.ObjectMeta{
ResourceVersion: opts.ResourceVersion,
Namespace: identifier.Namespace,
Name: identifier.Name,
},
Status: newStatus,
}, resource.UpdateOptions{
Subresource: "status",
ResourceVersion: opts.ResourceVersion,
})
}
func (c *DashboardCompatibilityScoreClient) Delete(ctx context.Context, identifier resource.Identifier, opts resource.DeleteOptions) error {
return c.client.Delete(ctx, identifier, opts)
}
@@ -0,0 +1,28 @@
//
// Code generated by grafana-app-sdk. DO NOT EDIT.
//
package v1alpha1
import (
"encoding/json"
"io"
"github.com/grafana/grafana-app-sdk/resource"
)
// DashboardCompatibilityScoreJSONCodec is an implementation of resource.Codec for kubernetes JSON encoding
type DashboardCompatibilityScoreJSONCodec struct{}
// Read reads JSON-encoded bytes from `reader` and unmarshals them into `into`
func (*DashboardCompatibilityScoreJSONCodec) Read(reader io.Reader, into resource.Object) error {
return json.NewDecoder(reader).Decode(into)
}
// Write writes JSON-encoded bytes into `writer` marshaled from `from`
func (*DashboardCompatibilityScoreJSONCodec) Write(writer io.Writer, from resource.Object) error {
return json.NewEncoder(writer).Encode(from)
}
// Interface compliance checks
var _ resource.Codec = &DashboardCompatibilityScoreJSONCodec{}
@@ -0,0 +1,31 @@
// Code generated - EDITING IS FUTILE. DO NOT EDIT.
package v1alpha1
import (
time "time"
)
// metadata contains embedded CommonMetadata and can be extended with custom string fields
// TODO: use CommonMetadata instead of redefining here; currently needs to be defined here
// without external reference as using the CommonMetadata reference breaks thema codegen.
type DashboardCompatibilityScoreMetadata struct {
UpdateTimestamp time.Time `json:"updateTimestamp"`
CreatedBy string `json:"createdBy"`
Uid string `json:"uid"`
CreationTimestamp time.Time `json:"creationTimestamp"`
DeletionTimestamp *time.Time `json:"deletionTimestamp,omitempty"`
Finalizers []string `json:"finalizers"`
ResourceVersion string `json:"resourceVersion"`
Generation int64 `json:"generation"`
UpdatedBy string `json:"updatedBy"`
Labels map[string]string `json:"labels"`
}
// NewDashboardCompatibilityScoreMetadata creates a new DashboardCompatibilityScoreMetadata object.
func NewDashboardCompatibilityScoreMetadata() *DashboardCompatibilityScoreMetadata {
return &DashboardCompatibilityScoreMetadata{
Finalizers: []string{},
Labels: map[string]string{},
}
}
@@ -0,0 +1,326 @@
//
// Code generated by grafana-app-sdk. DO NOT EDIT.
//
package v1alpha1
import (
"fmt"
"github.com/grafana/grafana-app-sdk/resource"
metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
"k8s.io/apimachinery/pkg/runtime"
"k8s.io/apimachinery/pkg/runtime/schema"
"k8s.io/apimachinery/pkg/types"
"time"
)
// +k8s:openapi-gen=true
type DashboardCompatibilityScore struct {
metav1.TypeMeta `json:",inline" yaml:",inline"`
metav1.ObjectMeta `json:"metadata" yaml:"metadata"`
// Spec is the spec of the DashboardCompatibilityScore
Spec DashboardCompatibilityScoreSpec `json:"spec" yaml:"spec"`
Status DashboardCompatibilityScoreStatus `json:"status" yaml:"status"`
}
func NewDashboardCompatibilityScore() *DashboardCompatibilityScore {
return &DashboardCompatibilityScore{
Spec: *NewDashboardCompatibilityScoreSpec(),
Status: *NewDashboardCompatibilityScoreStatus(),
}
}
func (o *DashboardCompatibilityScore) GetSpec() any {
return o.Spec
}
func (o *DashboardCompatibilityScore) SetSpec(spec any) error {
cast, ok := spec.(DashboardCompatibilityScoreSpec)
if !ok {
return fmt.Errorf("cannot set spec type %#v, not of type Spec", spec)
}
o.Spec = cast
return nil
}
func (o *DashboardCompatibilityScore) GetSubresources() map[string]any {
return map[string]any{
"status": o.Status,
}
}
func (o *DashboardCompatibilityScore) GetSubresource(name string) (any, bool) {
switch name {
case "status":
return o.Status, true
default:
return nil, false
}
}
func (o *DashboardCompatibilityScore) SetSubresource(name string, value any) error {
switch name {
case "status":
cast, ok := value.(DashboardCompatibilityScoreStatus)
if !ok {
return fmt.Errorf("cannot set status type %#v, not of type DashboardCompatibilityScoreStatus", value)
}
o.Status = cast
return nil
default:
return fmt.Errorf("subresource '%s' does not exist", name)
}
}
func (o *DashboardCompatibilityScore) GetStaticMetadata() resource.StaticMetadata {
gvk := o.GroupVersionKind()
return resource.StaticMetadata{
Name: o.ObjectMeta.Name,
Namespace: o.ObjectMeta.Namespace,
Group: gvk.Group,
Version: gvk.Version,
Kind: gvk.Kind,
}
}
func (o *DashboardCompatibilityScore) SetStaticMetadata(metadata resource.StaticMetadata) {
o.Name = metadata.Name
o.Namespace = metadata.Namespace
o.SetGroupVersionKind(schema.GroupVersionKind{
Group: metadata.Group,
Version: metadata.Version,
Kind: metadata.Kind,
})
}
func (o *DashboardCompatibilityScore) GetCommonMetadata() resource.CommonMetadata {
dt := o.DeletionTimestamp
var deletionTimestamp *time.Time
if dt != nil {
deletionTimestamp = &dt.Time
}
// Legacy ExtraFields support
extraFields := make(map[string]any)
if o.Annotations != nil {
extraFields["annotations"] = o.Annotations
}
if o.ManagedFields != nil {
extraFields["managedFields"] = o.ManagedFields
}
if o.OwnerReferences != nil {
extraFields["ownerReferences"] = o.OwnerReferences
}
return resource.CommonMetadata{
UID: string(o.UID),
ResourceVersion: o.ResourceVersion,
Generation: o.Generation,
Labels: o.Labels,
CreationTimestamp: o.CreationTimestamp.Time,
DeletionTimestamp: deletionTimestamp,
Finalizers: o.Finalizers,
UpdateTimestamp: o.GetUpdateTimestamp(),
CreatedBy: o.GetCreatedBy(),
UpdatedBy: o.GetUpdatedBy(),
ExtraFields: extraFields,
}
}
func (o *DashboardCompatibilityScore) SetCommonMetadata(metadata resource.CommonMetadata) {
o.UID = types.UID(metadata.UID)
o.ResourceVersion = metadata.ResourceVersion
o.Generation = metadata.Generation
o.Labels = metadata.Labels
o.CreationTimestamp = metav1.NewTime(metadata.CreationTimestamp)
if metadata.DeletionTimestamp != nil {
dt := metav1.NewTime(*metadata.DeletionTimestamp)
o.DeletionTimestamp = &dt
} else {
o.DeletionTimestamp = nil
}
o.Finalizers = metadata.Finalizers
if o.Annotations == nil {
o.Annotations = make(map[string]string)
}
if !metadata.UpdateTimestamp.IsZero() {
o.SetUpdateTimestamp(metadata.UpdateTimestamp)
}
if metadata.CreatedBy != "" {
o.SetCreatedBy(metadata.CreatedBy)
}
if metadata.UpdatedBy != "" {
o.SetUpdatedBy(metadata.UpdatedBy)
}
// Legacy support for setting Annotations, ManagedFields, and OwnerReferences via ExtraFields
if metadata.ExtraFields != nil {
if annotations, ok := metadata.ExtraFields["annotations"]; ok {
if cast, ok := annotations.(map[string]string); ok {
o.Annotations = cast
}
}
if managedFields, ok := metadata.ExtraFields["managedFields"]; ok {
if cast, ok := managedFields.([]metav1.ManagedFieldsEntry); ok {
o.ManagedFields = cast
}
}
if ownerReferences, ok := metadata.ExtraFields["ownerReferences"]; ok {
if cast, ok := ownerReferences.([]metav1.OwnerReference); ok {
o.OwnerReferences = cast
}
}
}
}
func (o *DashboardCompatibilityScore) GetCreatedBy() string {
if o.ObjectMeta.Annotations == nil {
o.ObjectMeta.Annotations = make(map[string]string)
}
return o.ObjectMeta.Annotations["grafana.com/createdBy"]
}
func (o *DashboardCompatibilityScore) SetCreatedBy(createdBy string) {
if o.ObjectMeta.Annotations == nil {
o.ObjectMeta.Annotations = make(map[string]string)
}
o.ObjectMeta.Annotations["grafana.com/createdBy"] = createdBy
}
func (o *DashboardCompatibilityScore) GetUpdateTimestamp() time.Time {
if o.ObjectMeta.Annotations == nil {
o.ObjectMeta.Annotations = make(map[string]string)
}
parsed, _ := time.Parse(time.RFC3339, o.ObjectMeta.Annotations["grafana.com/updateTimestamp"])
return parsed
}
func (o *DashboardCompatibilityScore) SetUpdateTimestamp(updateTimestamp time.Time) {
if o.ObjectMeta.Annotations == nil {
o.ObjectMeta.Annotations = make(map[string]string)
}
o.ObjectMeta.Annotations["grafana.com/updateTimestamp"] = updateTimestamp.Format(time.RFC3339)
}
func (o *DashboardCompatibilityScore) GetUpdatedBy() string {
if o.ObjectMeta.Annotations == nil {
o.ObjectMeta.Annotations = make(map[string]string)
}
return o.ObjectMeta.Annotations["grafana.com/updatedBy"]
}
func (o *DashboardCompatibilityScore) SetUpdatedBy(updatedBy string) {
if o.ObjectMeta.Annotations == nil {
o.ObjectMeta.Annotations = make(map[string]string)
}
o.ObjectMeta.Annotations["grafana.com/updatedBy"] = updatedBy
}
func (o *DashboardCompatibilityScore) Copy() resource.Object {
return resource.CopyObject(o)
}
func (o *DashboardCompatibilityScore) DeepCopyObject() runtime.Object {
return o.Copy()
}
func (o *DashboardCompatibilityScore) DeepCopy() *DashboardCompatibilityScore {
cpy := &DashboardCompatibilityScore{}
o.DeepCopyInto(cpy)
return cpy
}
func (o *DashboardCompatibilityScore) DeepCopyInto(dst *DashboardCompatibilityScore) {
dst.TypeMeta.APIVersion = o.TypeMeta.APIVersion
dst.TypeMeta.Kind = o.TypeMeta.Kind
o.ObjectMeta.DeepCopyInto(&dst.ObjectMeta)
o.Spec.DeepCopyInto(&dst.Spec)
o.Status.DeepCopyInto(&dst.Status)
}
// Interface compliance compile-time check
var _ resource.Object = &DashboardCompatibilityScore{}
// +k8s:openapi-gen=true
type DashboardCompatibilityScoreList struct {
metav1.TypeMeta `json:",inline" yaml:",inline"`
metav1.ListMeta `json:"metadata" yaml:"metadata"`
Items []DashboardCompatibilityScore `json:"items" yaml:"items"`
}
func (o *DashboardCompatibilityScoreList) DeepCopyObject() runtime.Object {
return o.Copy()
}
func (o *DashboardCompatibilityScoreList) Copy() resource.ListObject {
cpy := &DashboardCompatibilityScoreList{
TypeMeta: o.TypeMeta,
Items: make([]DashboardCompatibilityScore, len(o.Items)),
}
o.ListMeta.DeepCopyInto(&cpy.ListMeta)
for i := 0; i < len(o.Items); i++ {
if item, ok := o.Items[i].Copy().(*DashboardCompatibilityScore); ok {
cpy.Items[i] = *item
}
}
return cpy
}
func (o *DashboardCompatibilityScoreList) GetItems() []resource.Object {
items := make([]resource.Object, len(o.Items))
for i := 0; i < len(o.Items); i++ {
items[i] = &o.Items[i]
}
return items
}
func (o *DashboardCompatibilityScoreList) SetItems(items []resource.Object) {
o.Items = make([]DashboardCompatibilityScore, len(items))
for i := 0; i < len(items); i++ {
o.Items[i] = *items[i].(*DashboardCompatibilityScore)
}
}
func (o *DashboardCompatibilityScoreList) DeepCopy() *DashboardCompatibilityScoreList {
cpy := &DashboardCompatibilityScoreList{}
o.DeepCopyInto(cpy)
return cpy
}
func (o *DashboardCompatibilityScoreList) DeepCopyInto(dst *DashboardCompatibilityScoreList) {
resource.CopyObjectInto(dst, o)
}
// Interface compliance compile-time check
var _ resource.ListObject = &DashboardCompatibilityScoreList{}
// Copy methods for all subresource types
// DeepCopy creates a full deep copy of Spec
func (s *DashboardCompatibilityScoreSpec) DeepCopy() *DashboardCompatibilityScoreSpec {
cpy := &DashboardCompatibilityScoreSpec{}
s.DeepCopyInto(cpy)
return cpy
}
// DeepCopyInto deep copies Spec into another Spec object
func (s *DashboardCompatibilityScoreSpec) DeepCopyInto(dst *DashboardCompatibilityScoreSpec) {
resource.CopyObjectInto(dst, s)
}
// DeepCopy creates a full deep copy of DashboardCompatibilityScoreStatus
func (s *DashboardCompatibilityScoreStatus) DeepCopy() *DashboardCompatibilityScoreStatus {
cpy := &DashboardCompatibilityScoreStatus{}
s.DeepCopyInto(cpy)
return cpy
}
// DeepCopyInto deep copies DashboardCompatibilityScoreStatus into another DashboardCompatibilityScoreStatus object
func (s *DashboardCompatibilityScoreStatus) DeepCopyInto(dst *DashboardCompatibilityScoreStatus) {
resource.CopyObjectInto(dst, s)
}
@@ -0,0 +1,34 @@
//
// Code generated by grafana-app-sdk. DO NOT EDIT.
//
package v1alpha1
import (
"github.com/grafana/grafana-app-sdk/resource"
)
// schema is unexported to prevent accidental overwrites
var (
schemaDashboardCompatibilityScore = resource.NewSimpleSchema("dashvalidator.grafana.app", "v1alpha1", NewDashboardCompatibilityScore(), &DashboardCompatibilityScoreList{}, resource.WithKind("DashboardCompatibilityScore"),
resource.WithPlural("dashboardcompatibilityscores"), resource.WithScope(resource.NamespacedScope))
kindDashboardCompatibilityScore = resource.Kind{
Schema: schemaDashboardCompatibilityScore,
Codecs: map[resource.KindEncoding]resource.Codec{
resource.KindEncodingJSON: &DashboardCompatibilityScoreJSONCodec{},
},
}
)
// Kind returns a resource.Kind for this Schema with a JSON codec
func DashboardCompatibilityScoreKind() resource.Kind {
return kindDashboardCompatibilityScore
}
// Schema returns a resource.SimpleSchema representation of DashboardCompatibilityScore
func DashboardCompatibilityScoreSchema() *resource.SimpleSchema {
return schemaDashboardCompatibilityScore
}
// Interface compliance checks
var _ resource.Schema = kindDashboardCompatibilityScore
@@ -0,0 +1,48 @@
// Code generated - EDITING IS FUTILE. DO NOT EDIT.
package v1alpha1
// DataSourceMapping specifies a datasource to validate dashboard queries against.
// Maps logical datasource references in the dashboard to actual datasource instances.
// +k8s:openapi-gen=true
type DashboardCompatibilityScoreDataSourceMapping struct {
// Unique identifier of the datasource instance.
// Example: "prometheus-prod-us-west"
Uid string `json:"uid"`
// Type of datasource plugin.
// MVP: Only "prometheus" supported.
// Future: "mysql", "postgres", "elasticsearch", etc.
Type string `json:"type"`
// Optional human-readable name for display in results.
// If not provided, UID will be used in error messages.
// Example: "Production Prometheus (US-West)"
Name *string `json:"name,omitempty"`
}
// NewDashboardCompatibilityScoreDataSourceMapping creates a new DashboardCompatibilityScoreDataSourceMapping object.
func NewDashboardCompatibilityScoreDataSourceMapping() *DashboardCompatibilityScoreDataSourceMapping {
return &DashboardCompatibilityScoreDataSourceMapping{}
}
// +k8s:openapi-gen=true
type DashboardCompatibilityScoreSpec struct {
// Complete dashboard JSON object to validate.
// Must be a v1 dashboard schema (contains "panels" array).
// v2 dashboards (with "elements" structure) are not yet supported.
DashboardJson map[string]interface{} `json:"dashboardJson"`
// Array of datasources to validate against.
// The validator will check dashboard queries against each datasource
// and provide per-datasource compatibility results.
//
// MVP: Only single datasource supported (array length = 1), Prometheus type only.
// Future: Will support multiple datasources for dashboards with mixed queries.
DatasourceMappings []DashboardCompatibilityScoreDataSourceMapping `json:"datasourceMappings"`
}
// NewDashboardCompatibilityScoreSpec creates a new DashboardCompatibilityScoreSpec object.
func NewDashboardCompatibilityScoreSpec() *DashboardCompatibilityScoreSpec {
return &DashboardCompatibilityScoreSpec{
DashboardJson: map[string]interface{}{},
DatasourceMappings: []DashboardCompatibilityScoreDataSourceMapping{},
}
}
@@ -0,0 +1,151 @@
// Code generated - EDITING IS FUTILE. DO NOT EDIT.
package v1alpha1
// DataSourceResult contains validation results for a single datasource.
// Provides aggregate statistics and per-query breakdown of compatibility.
// +k8s:openapi-gen=true
type DashboardCompatibilityScoreDataSourceResult struct {
// Datasource UID that was validated (matches DataSourceMapping.uid)
Uid string `json:"uid"`
// Datasource type (matches DataSourceMapping.type)
Type string `json:"type"`
// Optional display name (matches DataSourceMapping.name if provided)
Name *string `json:"name,omitempty"`
// Total number of queries in the dashboard targeting this datasource.
// Includes all panel targets/queries that reference this datasource.
TotalQueries int64 `json:"totalQueries"`
// Number of queries successfully validated.
// May be less than totalQueries if some queries couldn't be parsed.
CheckedQueries int64 `json:"checkedQueries"`
// Total number of unique metrics/identifiers referenced across all queries.
// For Prometheus: metric names extracted from PromQL expressions.
// For SQL datasources: table and column names.
TotalMetrics int64 `json:"totalMetrics"`
// Number of metrics that exist in the datasource schema.
// foundMetrics <= totalMetrics
FoundMetrics int64 `json:"foundMetrics"`
// Array of metric names that were referenced but don't exist.
// Useful for debugging why a dashboard shows "no data".
// Example for Prometheus: ["http_requests_total", "api_latency_seconds"]
MissingMetrics []string `json:"missingMetrics"`
// Per-query breakdown showing which specific queries have issues.
// One entry per query target (refId: "A", "B", "C", etc.) in each panel.
// Allows pinpointing exactly which panel/query needs fixing.
QueryBreakdown []DashboardCompatibilityScoreQueryBreakdown `json:"queryBreakdown"`
// Overall compatibility score for this datasource (0-100).
// Calculated as: (foundMetrics / totalMetrics) * 100
// Used to calculate the global compatibilityScore in status.
CompatibilityScore float64 `json:"compatibilityScore"`
}
// NewDashboardCompatibilityScoreDataSourceResult creates a new DashboardCompatibilityScoreDataSourceResult object.
func NewDashboardCompatibilityScoreDataSourceResult() *DashboardCompatibilityScoreDataSourceResult {
return &DashboardCompatibilityScoreDataSourceResult{
MissingMetrics: []string{},
QueryBreakdown: []DashboardCompatibilityScoreQueryBreakdown{},
}
}
// QueryBreakdown provides compatibility details for a single query within a panel.
// Granular per-query results allow users to identify exactly which queries need fixing.
//
// Note: A panel can have multiple queries (refId: "A", "B", "C", etc.),
// so there may be multiple QueryBreakdown entries for the same panelID.
// +k8s:openapi-gen=true
type DashboardCompatibilityScoreQueryBreakdown struct {
// Human-readable panel title for context.
// Example: "CPU Usage", "Request Rate"
PanelTitle string `json:"panelTitle"`
// Numeric panel ID from dashboard JSON.
// Used to correlate with dashboard structure.
PanelID int64 `json:"panelID"`
// Query identifier within the panel.
// Values: "A", "B", "C", etc. (from panel.targets[].refId)
// Uniquely identifies which query in a multi-query panel this refers to.
QueryRefId string `json:"queryRefId"`
// Number of unique metrics referenced in this specific query.
// For Prometheus: metrics extracted from the PromQL expr.
// Example: rate(http_requests_total[5m]) references 1 metric.
TotalMetrics int64 `json:"totalMetrics"`
// Number of those metrics that exist in the datasource.
// foundMetrics <= totalMetrics
FoundMetrics int64 `json:"foundMetrics"`
// Array of missing metric names specific to this query.
// Helps identify exactly which part of a query expression will fail.
// Empty array means query is fully compatible.
MissingMetrics []string `json:"missingMetrics"`
// Compatibility percentage for this individual query (0-100).
// Calculated as: (foundMetrics / totalMetrics) * 100
// 100 = query will work perfectly, 0 = query will return no data.
CompatibilityScore float64 `json:"compatibilityScore"`
}
// NewDashboardCompatibilityScoreQueryBreakdown creates a new DashboardCompatibilityScoreQueryBreakdown object.
func NewDashboardCompatibilityScoreQueryBreakdown() *DashboardCompatibilityScoreQueryBreakdown {
return &DashboardCompatibilityScoreQueryBreakdown{
MissingMetrics: []string{},
}
}
// +k8s:openapi-gen=true
type DashboardCompatibilityScorestatusOperatorState struct {
// lastEvaluation is the ResourceVersion last evaluated
LastEvaluation string `json:"lastEvaluation"`
// state describes the state of the lastEvaluation.
// It is limited to three possible states for machine evaluation.
State DashboardCompatibilityScoreStatusOperatorStateState `json:"state"`
// descriptiveState is an optional more descriptive state field which has no requirements on format
DescriptiveState *string `json:"descriptiveState,omitempty"`
// details contains any extra information that is operator-specific
Details map[string]interface{} `json:"details,omitempty"`
}
// NewDashboardCompatibilityScorestatusOperatorState creates a new DashboardCompatibilityScorestatusOperatorState object.
func NewDashboardCompatibilityScorestatusOperatorState() *DashboardCompatibilityScorestatusOperatorState {
return &DashboardCompatibilityScorestatusOperatorState{}
}
// +k8s:openapi-gen=true
type DashboardCompatibilityScoreStatus struct {
// Overall compatibility score across all datasources (0-100).
// Calculated as: (total found metrics / total referenced metrics) * 100
//
// Score interpretation:
// - 100: Perfect compatibility, all queries will work
// - 80-99: Excellent, minor missing metrics
// - 50-79: Fair, significant missing metrics
// - 0-49: Poor, most queries will fail
CompatibilityScore float64 `json:"compatibilityScore"`
// Per-datasource validation results.
// Array length matches spec.datasourceMappings.
// Each element contains detailed metrics and query-level breakdown.
DatasourceResults []DashboardCompatibilityScoreDataSourceResult `json:"datasourceResults"`
// ISO 8601 timestamp of when validation was last performed.
// Example: "2024-01-15T10:30:00Z"
LastChecked *string `json:"lastChecked,omitempty"`
// operatorStates is a map of operator ID to operator state evaluations.
// Any operator which consumes this kind SHOULD add its state evaluation information to this field.
OperatorStates map[string]DashboardCompatibilityScorestatusOperatorState `json:"operatorStates,omitempty"`
// Human-readable summary of validation result.
// Examples: "All queries compatible", "3 missing metrics found"
Message *string `json:"message,omitempty"`
// additionalFields is reserved for future use
AdditionalFields map[string]interface{} `json:"additionalFields,omitempty"`
}
// NewDashboardCompatibilityScoreStatus creates a new DashboardCompatibilityScoreStatus object.
func NewDashboardCompatibilityScoreStatus() *DashboardCompatibilityScoreStatus {
return &DashboardCompatibilityScoreStatus{
DatasourceResults: []DashboardCompatibilityScoreDataSourceResult{},
}
}
// +k8s:openapi-gen=true
type DashboardCompatibilityScoreStatusOperatorStateState string
const (
DashboardCompatibilityScoreStatusOperatorStateStateSuccess DashboardCompatibilityScoreStatusOperatorStateState = "success"
DashboardCompatibilityScoreStatusOperatorStateStateInProgress DashboardCompatibilityScoreStatusOperatorStateState = "in_progress"
DashboardCompatibilityScoreStatusOperatorStateStateFailed DashboardCompatibilityScoreStatusOperatorStateState = "failed"
)
@@ -0,0 +1,360 @@
package app
import (
"context"
"encoding/json"
"fmt"
"io"
"net/http"
"github.com/grafana/grafana-app-sdk/app"
"github.com/grafana/grafana-app-sdk/logging"
"github.com/grafana/grafana-app-sdk/resource"
"github.com/grafana/grafana-app-sdk/simple"
"k8s.io/apimachinery/pkg/runtime/schema"
validatorv1alpha1 "github.com/grafana/grafana/apps/dashvalidator/pkg/apis/dashvalidator/v1alpha1"
"github.com/grafana/grafana/apps/dashvalidator/pkg/validator"
_ "github.com/grafana/grafana/apps/dashvalidator/pkg/validator/prometheus" // Register prometheus validator via init()
"github.com/grafana/grafana/pkg/infra/httpclient"
"github.com/grafana/grafana/pkg/services/datasources"
"github.com/grafana/grafana/pkg/services/pluginsintegration/plugincontext"
"strings"
)
type DashValidatorConfig struct {
DatasourceSvc datasources.DataSourceService
PluginCtx *plugincontext.Provider
HTTPClientProvider httpclient.Provider
}
// checkRequest matches the CUE schema for POST /check request
type checkRequest struct {
DashboardJSON map[string]interface{} `json:"dashboardJson"`
DatasourceMappings []datasourceMapping `json:"datasourceMappings"`
}
// datasourceMapping represents a datasource to validate against
type datasourceMapping struct {
UID string `json:"uid"`
Type string `json:"type"`
Name *string `json:"name,omitempty"`
}
// checkResponse matches the CUE schema for POST /check response
type checkResponse struct {
CompatibilityScore float64 `json:"compatibilityScore"`
DatasourceResults []datasourceResult `json:"datasourceResults"`
}
// datasourceResult contains validation results for a single datasource
type datasourceResult struct {
UID string `json:"uid"`
Type string `json:"type"`
Name *string `json:"name,omitempty"`
TotalQueries int `json:"totalQueries"`
CheckedQueries int `json:"checkedQueries"`
TotalMetrics int `json:"totalMetrics"`
FoundMetrics int `json:"foundMetrics"`
MissingMetrics []string `json:"missingMetrics"`
QueryBreakdown []queryResult `json:"queryBreakdown"`
CompatibilityScore float64 `json:"compatibilityScore"`
}
// queryResult contains validation results for a single query
type queryResult struct {
PanelTitle string `json:"panelTitle"`
PanelID int `json:"panelID"`
QueryRefID string `json:"queryRefId"`
TotalMetrics int `json:"totalMetrics"`
FoundMetrics int `json:"foundMetrics"`
MissingMetrics []string `json:"missingMetrics"`
CompatibilityScore float64 `json:"compatibilityScore"`
}
func New(cfg app.Config) (app.App, error) {
specificConfig, ok := cfg.SpecificConfig.(*DashValidatorConfig)
if !ok {
return nil, fmt.Errorf("invalid config type: expected DashValidatorConfig")
}
log := logging.DefaultLogger.With("app", "dashvalidator")
// configure our app
simpleConfig := simple.AppConfig{
Name: "dashvalidator",
KubeConfig: cfg.KubeConfig,
// Define our custom route
VersionedCustomRoutes: map[string]simple.AppVersionRouteHandlers{
"v1alpha1": {
{
Namespaced: true,
Path: "check",
Method: "POST",
}: handleCheckRoute(log, specificConfig.DatasourceSvc, specificConfig.PluginCtx, specificConfig.HTTPClientProvider),
},
},
}
a, err := simple.NewApp(simpleConfig)
if err != nil {
return nil, fmt.Errorf("failed to create app: %w", err)
}
return a, nil
}
// custom route handler to check dashboard compatibility
func handleCheckRoute(
log logging.Logger,
datasourceSvc datasources.DataSourceService,
pluginCtx *plugincontext.Provider,
httpClientProvider httpclient.Provider,
) func(context.Context, app.CustomRouteResponseWriter, *app.CustomRouteRequest) error {
return func(ctx context.Context, w app.CustomRouteResponseWriter, r *app.CustomRouteRequest) error {
logger := log.WithContext(ctx)
logger.Info("Received compatibility check request")
// Step 1: Parse request body
body, err := io.ReadAll(r.Body)
if err != nil {
logger.Error("Failed to read request body", "error", err)
w.WriteHeader(http.StatusBadRequest)
return json.NewEncoder(w).Encode(map[string]string{
"error": "failed to read request body",
})
}
var req checkRequest
if err := json.Unmarshal(body, &req); err != nil {
logger.Error("Failed to parse request JSON", "error", err)
w.WriteHeader(http.StatusBadRequest)
return json.NewEncoder(w).Encode(map[string]string{
"error": "invalid JSON in request body",
})
}
// MVP: Only support single datasource validation
if len(req.DatasourceMappings) != 1 {
logger.Error("MVP only supports single datasource validation", "numDatasources", len(req.DatasourceMappings))
w.WriteHeader(http.StatusBadRequest)
return json.NewEncoder(w).Encode(map[string]string{
"error": fmt.Sprintf("MVP only supports single datasource validation, got %d datasources", len(req.DatasourceMappings)),
"code": "invalid_request",
})
}
// Step 2: Build validator request
validatorReq := validator.DashboardCompatibilityRequest{
DashboardJSON: req.DashboardJSON,
DatasourceMappings: make([]validator.DatasourceMapping, 0, len(req.DatasourceMappings)),
}
logger.Info("Processing request", "dashboardTitle", req.DashboardJSON["title"], "numMappings", len(req.DatasourceMappings))
// Get namespace from request (needed for datasource lookup)
// Namespace format is typically "org-{orgID}"
namespace := r.ResourceIdentifier.Namespace
// Extract orgID from namespace for logging context
orgID := extractOrgIDFromNamespace(namespace)
logger = logger.With("orgID", orgID, "namespace", namespace)
for _, dsMapping := range req.DatasourceMappings {
dsLogger := logger.With("datasourceUID", dsMapping.UID, "datasourceType", dsMapping.Type)
// Convert optional name pointer to string
name := ""
if dsMapping.Name != nil {
name = *dsMapping.Name
dsLogger = dsLogger.With("datasourceName", name)
}
// Fetch datasource from Grafana using app-platform method
// Parameters: namespace, name (UID), group (datasource type)
ds, err := datasourceSvc.GetDataSourceInNamespace(ctx, namespace, dsMapping.UID, dsMapping.Type)
if err != nil {
dsLogger.Error("Failed to get datasource from namespace", "error", err)
// Check if it's a not found error vs other errors
errMsg := err.Error()
statusCode := http.StatusInternalServerError
userMsg := fmt.Sprintf("failed to retrieve datasource: %s", dsMapping.UID)
if strings.Contains(errMsg, "not found") || strings.Contains(errMsg, "does not exist") {
statusCode = http.StatusNotFound
userMsg = fmt.Sprintf("datasource not found: %s (type: %s)", dsMapping.UID, dsMapping.Type)
dsLogger.Warn("Datasource not found in namespace")
}
w.WriteHeader(statusCode)
return json.NewEncoder(w).Encode(map[string]string{
"error": userMsg,
"code": "datasource_error",
})
}
dsLogger.Info("Retrieved datasource", "url", ds.URL, "actualType", ds.Type)
// Validate that the datasource type matches the expected type
if ds.Type != dsMapping.Type {
dsLogger.Error("Datasource type mismatch",
"expectedType", dsMapping.Type,
"actualType", ds.Type)
w.WriteHeader(http.StatusBadRequest)
return json.NewEncoder(w).Encode(map[string]string{
"error": fmt.Sprintf("datasource %s has type %s, expected %s", dsMapping.UID, ds.Type, dsMapping.Type),
"code": "datasource_wrong_type",
})
}
// Validate that this is a supported datasource type
// For MVP, we only support Prometheus
if !isSupportedDatasourceType(ds.Type) {
dsLogger.Error("Unsupported datasource type", "type", ds.Type)
w.WriteHeader(http.StatusBadRequest)
return json.NewEncoder(w).Encode(map[string]string{
"error": fmt.Sprintf("datasource type '%s' is not supported (currently only 'prometheus' is supported)", ds.Type),
"code": "datasource_unsupported_type",
})
}
// Get authenticated HTTP transport for this datasource
transport, err := datasourceSvc.GetHTTPTransport(ctx, ds, httpClientProvider)
if err != nil {
dsLogger.Error("Failed to get HTTP transport for datasource", "error", err)
w.WriteHeader(http.StatusInternalServerError)
return json.NewEncoder(w).Encode(map[string]string{
"error": fmt.Sprintf("failed to configure authentication for datasource: %s", dsMapping.UID),
"code": "datasource_config_error",
})
}
// Create HTTP client with authenticated transport
httpClient := &http.Client{
Transport: transport,
}
validatorReq.DatasourceMappings = append(validatorReq.DatasourceMappings, validator.DatasourceMapping{
UID: dsMapping.UID,
Type: dsMapping.Type,
Name: name,
URL: ds.URL,
HTTPClient: httpClient, // Pass authenticated client
})
dsLogger.Debug("Datasource configured successfully for validation")
}
// Step 3: Validate dashboard compatibility
result, err := validator.ValidateDashboardCompatibility(ctx, validatorReq)
if err != nil {
logger.Error("Validation failed", "error", err)
// Check if it's a structured ValidationError with a specific status code
statusCode := http.StatusInternalServerError
errorCode := "validation_error"
errorMsg := fmt.Sprintf("validation failed: %v", err)
if validationErr := validator.GetValidationError(err); validationErr != nil {
statusCode = validationErr.StatusCode
errorCode = string(validationErr.Code)
errorMsg = validationErr.Message
// Log additional context from the error
for key, value := range validationErr.Details {
logger.Error("Validation error detail", key, value)
}
}
w.WriteHeader(statusCode)
return json.NewEncoder(w).Encode(map[string]string{
"error": errorMsg,
"code": errorCode,
})
}
// Step 4: Convert result to response format
response := convertToCheckResponse(result)
// Step 5: Return response
w.WriteHeader(http.StatusOK)
return json.NewEncoder(w).Encode(response)
}
}
// convertToCheckResponse converts validator result to API response format
func convertToCheckResponse(result *validator.DashboardCompatibilityResult) checkResponse {
response := checkResponse{
CompatibilityScore: result.CompatibilityScore,
DatasourceResults: make([]datasourceResult, 0, len(result.DatasourceResults)),
}
for _, dsResult := range result.DatasourceResults {
// Convert name string to pointer
var name *string
if dsResult.Name != "" {
name = &dsResult.Name
}
// Convert query results
queryBreakdown := make([]queryResult, 0, len(dsResult.QueryBreakdown))
for _, qr := range dsResult.QueryBreakdown {
queryBreakdown = append(queryBreakdown, queryResult{
PanelTitle: qr.PanelTitle,
PanelID: qr.PanelID,
QueryRefID: qr.QueryRefID,
TotalMetrics: qr.TotalMetrics,
FoundMetrics: qr.FoundMetrics,
MissingMetrics: qr.MissingMetrics,
CompatibilityScore: qr.CompatibilityScore,
})
}
response.DatasourceResults = append(response.DatasourceResults, datasourceResult{
UID: dsResult.UID,
Type: dsResult.Type,
Name: name,
TotalQueries: dsResult.TotalQueries,
CheckedQueries: dsResult.CheckedQueries,
TotalMetrics: dsResult.TotalMetrics,
FoundMetrics: dsResult.FoundMetrics,
MissingMetrics: dsResult.MissingMetrics,
QueryBreakdown: queryBreakdown,
CompatibilityScore: dsResult.CompatibilityScore,
})
}
return response
}
// extractOrgIDFromNamespace extracts the org ID from a namespace string
// Namespace format is typically "org-{orgID}"
func extractOrgIDFromNamespace(namespace string) string {
parts := strings.Split(namespace, "-")
if len(parts) >= 2 && parts[0] == "org" {
return parts[1]
}
return "unknown"
}
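// Illustrative only: the "org-{orgID}" namespace convention handled by
// extractOrgIDFromNamespace above, with the split logic duplicated so the
// sketch stands alone. exampleNamespaceOrgIDs is hypothetical, not part of
// the handler.
func exampleNamespaceOrgIDs() map[string]string {
	inputs := []string{"org-3", "default", "org-42-extra"}
	out := make(map[string]string, len(inputs))
	for _, ns := range inputs {
		parts := strings.Split(ns, "-")
		if len(parts) >= 2 && parts[0] == "org" {
			out[ns] = parts[1] // any segments after the orgID are ignored
		} else {
			out[ns] = "unknown"
		}
	}
	return out
}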
// isSupportedDatasourceType checks if a datasource type is supported
// For MVP, we only support Prometheus
func isSupportedDatasourceType(dsType string) bool {
supportedTypes := map[string]bool{
"prometheus": true,
}
return supportedTypes[strings.ToLower(dsType)]
}
func GetKinds() map[schema.GroupVersion][]resource.Kind {
gv := schema.GroupVersion{
Group: "dashvalidator.grafana.com",
Version: "v1alpha1",
}
return map[schema.GroupVersion][]resource.Kind{
gv: {validatorv1alpha1.DashboardCompatibilityScoreKind()},
}
}
@@ -0,0 +1,18 @@
package v1alpha1
import "k8s.io/apimachinery/pkg/runtime/schema"
const (
// APIGroup is the API group used by all kinds in this package
APIGroup = "dashvalidator.ext.grafana.com"
// APIVersion is the API version used by all kinds in this package
APIVersion = "v1alpha1"
)
var (
// GroupVersion is a schema.GroupVersion consisting of the Group and Version constants for this package
GroupVersion = schema.GroupVersion{
Group: APIGroup,
Version: APIVersion,
}
)
@@ -0,0 +1,99 @@
package v1alpha1
import (
"context"
"github.com/grafana/grafana-app-sdk/resource"
metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
)
type DashboardCompatibilityScoreClient struct {
client *resource.TypedClient[*DashboardCompatibilityScore, *DashboardCompatibilityScoreList]
}
func NewDashboardCompatibilityScoreClient(client resource.Client) *DashboardCompatibilityScoreClient {
return &DashboardCompatibilityScoreClient{
client: resource.NewTypedClient[*DashboardCompatibilityScore, *DashboardCompatibilityScoreList](client, Kind()),
}
}
func NewDashboardCompatibilityScoreClientFromGenerator(generator resource.ClientGenerator) (*DashboardCompatibilityScoreClient, error) {
c, err := generator.ClientFor(Kind())
if err != nil {
return nil, err
}
return NewDashboardCompatibilityScoreClient(c), nil
}
func (c *DashboardCompatibilityScoreClient) Get(ctx context.Context, identifier resource.Identifier) (*DashboardCompatibilityScore, error) {
return c.client.Get(ctx, identifier)
}
func (c *DashboardCompatibilityScoreClient) List(ctx context.Context, namespace string, opts resource.ListOptions) (*DashboardCompatibilityScoreList, error) {
return c.client.List(ctx, namespace, opts)
}
func (c *DashboardCompatibilityScoreClient) ListAll(ctx context.Context, namespace string, opts resource.ListOptions) (*DashboardCompatibilityScoreList, error) {
resp, err := c.client.List(ctx, namespace, resource.ListOptions{
ResourceVersion: opts.ResourceVersion,
Limit: opts.Limit,
LabelFilters: opts.LabelFilters,
FieldSelectors: opts.FieldSelectors,
})
if err != nil {
return nil, err
}
for resp.GetContinue() != "" {
page, err := c.client.List(ctx, namespace, resource.ListOptions{
Continue: resp.GetContinue(),
ResourceVersion: opts.ResourceVersion,
Limit: opts.Limit,
LabelFilters: opts.LabelFilters,
FieldSelectors: opts.FieldSelectors,
})
if err != nil {
return nil, err
}
resp.SetContinue(page.GetContinue())
resp.SetResourceVersion(page.GetResourceVersion())
resp.SetItems(append(resp.GetItems(), page.GetItems()...))
}
return resp, nil
}
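// The continue-token loop in ListAll above is a general pagination pattern:
// fetch a first page, then keep fetching with the returned continue token
// until it is empty. A self-contained sketch of that shape (examplePage and
// exampleListAll are hypothetical, not SDK types):
type examplePage struct {
	Items    []string
	Continue string
}

func exampleListAll(fetch func(continueToken string) examplePage) []string {
	page := fetch("")
	items := page.Items
	for page.Continue != "" {
		page = fetch(page.Continue)
		items = append(items, page.Items...)
	}
	return items
}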
func (c *DashboardCompatibilityScoreClient) Create(ctx context.Context, obj *DashboardCompatibilityScore, opts resource.CreateOptions) (*DashboardCompatibilityScore, error) {
// Make sure apiVersion and kind are set
obj.APIVersion = GroupVersion.Identifier()
obj.Kind = Kind().Kind()
return c.client.Create(ctx, obj, opts)
}
func (c *DashboardCompatibilityScoreClient) Update(ctx context.Context, obj *DashboardCompatibilityScore, opts resource.UpdateOptions) (*DashboardCompatibilityScore, error) {
return c.client.Update(ctx, obj, opts)
}
func (c *DashboardCompatibilityScoreClient) Patch(ctx context.Context, identifier resource.Identifier, req resource.PatchRequest, opts resource.PatchOptions) (*DashboardCompatibilityScore, error) {
return c.client.Patch(ctx, identifier, req, opts)
}
func (c *DashboardCompatibilityScoreClient) UpdateStatus(ctx context.Context, identifier resource.Identifier, newStatus Status, opts resource.UpdateOptions) (*DashboardCompatibilityScore, error) {
return c.client.Update(ctx, &DashboardCompatibilityScore{
TypeMeta: metav1.TypeMeta{
Kind: Kind().Kind(),
APIVersion: GroupVersion.Identifier(),
},
ObjectMeta: metav1.ObjectMeta{
ResourceVersion: opts.ResourceVersion,
Namespace: identifier.Namespace,
Name: identifier.Name,
},
Status: newStatus,
}, resource.UpdateOptions{
Subresource: "status",
ResourceVersion: opts.ResourceVersion,
})
}
func (c *DashboardCompatibilityScoreClient) Delete(ctx context.Context, identifier resource.Identifier, opts resource.DeleteOptions) error {
return c.client.Delete(ctx, identifier, opts)
}
@@ -0,0 +1,28 @@
//
// Code generated by grafana-app-sdk. DO NOT EDIT.
//
package v1alpha1
import (
"encoding/json"
"io"
"github.com/grafana/grafana-app-sdk/resource"
)
// JSONCodec is an implementation of resource.Codec for kubernetes JSON encoding
type JSONCodec struct{}
// Read reads JSON-encoded bytes from `reader` and unmarshals them into `into`
func (*JSONCodec) Read(reader io.Reader, into resource.Object) error {
return json.NewDecoder(reader).Decode(into)
}
// Write writes JSON-encoded bytes into `writer` marshaled from `from`
func (*JSONCodec) Write(writer io.Writer, from resource.Object) error {
return json.NewEncoder(writer).Encode(from)
}
// Interface compliance checks
var _ resource.Codec = &JSONCodec{}
@@ -0,0 +1,31 @@
// Code generated - EDITING IS FUTILE. DO NOT EDIT.
package v1alpha1
import (
time "time"
)
// metadata contains embedded CommonMetadata and can be extended with custom string fields
// TODO: use CommonMetadata instead of redefining here; currently needs to be defined here
// without external reference as using the CommonMetadata reference breaks thema codegen.
type Metadata struct {
UpdateTimestamp time.Time `json:"updateTimestamp"`
CreatedBy string `json:"createdBy"`
Uid string `json:"uid"`
CreationTimestamp time.Time `json:"creationTimestamp"`
DeletionTimestamp *time.Time `json:"deletionTimestamp,omitempty"`
Finalizers []string `json:"finalizers"`
ResourceVersion string `json:"resourceVersion"`
Generation int64 `json:"generation"`
UpdatedBy string `json:"updatedBy"`
Labels map[string]string `json:"labels"`
}
// NewMetadata creates a new Metadata object.
func NewMetadata() *Metadata {
return &Metadata{
Finalizers: []string{},
Labels: map[string]string{},
}
}
@@ -0,0 +1,326 @@
//
// Code generated by grafana-app-sdk. DO NOT EDIT.
//
package v1alpha1
import (
"fmt"
"github.com/grafana/grafana-app-sdk/resource"
metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
"k8s.io/apimachinery/pkg/runtime"
"k8s.io/apimachinery/pkg/runtime/schema"
"k8s.io/apimachinery/pkg/types"
"time"
)
// +k8s:openapi-gen=true
type DashboardCompatibilityScore struct {
metav1.TypeMeta `json:",inline" yaml:",inline"`
metav1.ObjectMeta `json:"metadata" yaml:"metadata"`
// Spec is the spec of the DashboardCompatibilityScore
Spec Spec `json:"spec" yaml:"spec"`
Status Status `json:"status" yaml:"status"`
}
func NewDashboardCompatibilityScore() *DashboardCompatibilityScore {
return &DashboardCompatibilityScore{
Spec: *NewSpec(),
Status: *NewStatus(),
}
}
func (o *DashboardCompatibilityScore) GetSpec() any {
return o.Spec
}
func (o *DashboardCompatibilityScore) SetSpec(spec any) error {
cast, ok := spec.(Spec)
if !ok {
return fmt.Errorf("cannot set spec type %#v, not of type Spec", spec)
}
o.Spec = cast
return nil
}
func (o *DashboardCompatibilityScore) GetSubresources() map[string]any {
return map[string]any{
"status": o.Status,
}
}
func (o *DashboardCompatibilityScore) GetSubresource(name string) (any, bool) {
switch name {
case "status":
return o.Status, true
default:
return nil, false
}
}
func (o *DashboardCompatibilityScore) SetSubresource(name string, value any) error {
switch name {
case "status":
cast, ok := value.(Status)
if !ok {
return fmt.Errorf("cannot set status type %#v, not of type Status", value)
}
o.Status = cast
return nil
default:
return fmt.Errorf("subresource '%s' does not exist", name)
}
}
func (o *DashboardCompatibilityScore) GetStaticMetadata() resource.StaticMetadata {
gvk := o.GroupVersionKind()
return resource.StaticMetadata{
Name: o.ObjectMeta.Name,
Namespace: o.ObjectMeta.Namespace,
Group: gvk.Group,
Version: gvk.Version,
Kind: gvk.Kind,
}
}
func (o *DashboardCompatibilityScore) SetStaticMetadata(metadata resource.StaticMetadata) {
o.Name = metadata.Name
o.Namespace = metadata.Namespace
o.SetGroupVersionKind(schema.GroupVersionKind{
Group: metadata.Group,
Version: metadata.Version,
Kind: metadata.Kind,
})
}
func (o *DashboardCompatibilityScore) GetCommonMetadata() resource.CommonMetadata {
dt := o.DeletionTimestamp
var deletionTimestamp *time.Time
if dt != nil {
deletionTimestamp = &dt.Time
}
// Legacy ExtraFields support
extraFields := make(map[string]any)
if o.Annotations != nil {
extraFields["annotations"] = o.Annotations
}
if o.ManagedFields != nil {
extraFields["managedFields"] = o.ManagedFields
}
if o.OwnerReferences != nil {
extraFields["ownerReferences"] = o.OwnerReferences
}
return resource.CommonMetadata{
UID: string(o.UID),
ResourceVersion: o.ResourceVersion,
Generation: o.Generation,
Labels: o.Labels,
CreationTimestamp: o.CreationTimestamp.Time,
DeletionTimestamp: deletionTimestamp,
Finalizers: o.Finalizers,
UpdateTimestamp: o.GetUpdateTimestamp(),
CreatedBy: o.GetCreatedBy(),
UpdatedBy: o.GetUpdatedBy(),
ExtraFields: extraFields,
}
}
func (o *DashboardCompatibilityScore) SetCommonMetadata(metadata resource.CommonMetadata) {
o.UID = types.UID(metadata.UID)
o.ResourceVersion = metadata.ResourceVersion
o.Generation = metadata.Generation
o.Labels = metadata.Labels
o.CreationTimestamp = metav1.NewTime(metadata.CreationTimestamp)
if metadata.DeletionTimestamp != nil {
dt := metav1.NewTime(*metadata.DeletionTimestamp)
o.DeletionTimestamp = &dt
} else {
o.DeletionTimestamp = nil
}
o.Finalizers = metadata.Finalizers
if o.Annotations == nil {
o.Annotations = make(map[string]string)
}
if !metadata.UpdateTimestamp.IsZero() {
o.SetUpdateTimestamp(metadata.UpdateTimestamp)
}
if metadata.CreatedBy != "" {
o.SetCreatedBy(metadata.CreatedBy)
}
if metadata.UpdatedBy != "" {
o.SetUpdatedBy(metadata.UpdatedBy)
}
// Legacy support for setting Annotations, ManagedFields, and OwnerReferences via ExtraFields
if metadata.ExtraFields != nil {
if annotations, ok := metadata.ExtraFields["annotations"]; ok {
if cast, ok := annotations.(map[string]string); ok {
o.Annotations = cast
}
}
if managedFields, ok := metadata.ExtraFields["managedFields"]; ok {
if cast, ok := managedFields.([]metav1.ManagedFieldsEntry); ok {
o.ManagedFields = cast
}
}
if ownerReferences, ok := metadata.ExtraFields["ownerReferences"]; ok {
if cast, ok := ownerReferences.([]metav1.OwnerReference); ok {
o.OwnerReferences = cast
}
}
}
}
func (o *DashboardCompatibilityScore) GetCreatedBy() string {
if o.ObjectMeta.Annotations == nil {
o.ObjectMeta.Annotations = make(map[string]string)
}
return o.ObjectMeta.Annotations["grafana.com/createdBy"]
}
func (o *DashboardCompatibilityScore) SetCreatedBy(createdBy string) {
if o.ObjectMeta.Annotations == nil {
o.ObjectMeta.Annotations = make(map[string]string)
}
o.ObjectMeta.Annotations["grafana.com/createdBy"] = createdBy
}
func (o *DashboardCompatibilityScore) GetUpdateTimestamp() time.Time {
if o.ObjectMeta.Annotations == nil {
o.ObjectMeta.Annotations = make(map[string]string)
}
parsed, _ := time.Parse(time.RFC3339, o.ObjectMeta.Annotations["grafana.com/updateTimestamp"])
return parsed
}
func (o *DashboardCompatibilityScore) SetUpdateTimestamp(updateTimestamp time.Time) {
if o.ObjectMeta.Annotations == nil {
o.ObjectMeta.Annotations = make(map[string]string)
}
o.ObjectMeta.Annotations["grafana.com/updateTimestamp"] = updateTimestamp.Format(time.RFC3339)
}
func (o *DashboardCompatibilityScore) GetUpdatedBy() string {
if o.ObjectMeta.Annotations == nil {
o.ObjectMeta.Annotations = make(map[string]string)
}
return o.ObjectMeta.Annotations["grafana.com/updatedBy"]
}
func (o *DashboardCompatibilityScore) SetUpdatedBy(updatedBy string) {
if o.ObjectMeta.Annotations == nil {
o.ObjectMeta.Annotations = make(map[string]string)
}
o.ObjectMeta.Annotations["grafana.com/updatedBy"] = updatedBy
}
func (o *DashboardCompatibilityScore) Copy() resource.Object {
return resource.CopyObject(o)
}
func (o *DashboardCompatibilityScore) DeepCopyObject() runtime.Object {
return o.Copy()
}
func (o *DashboardCompatibilityScore) DeepCopy() *DashboardCompatibilityScore {
cpy := &DashboardCompatibilityScore{}
o.DeepCopyInto(cpy)
return cpy
}
func (o *DashboardCompatibilityScore) DeepCopyInto(dst *DashboardCompatibilityScore) {
dst.TypeMeta.APIVersion = o.TypeMeta.APIVersion
dst.TypeMeta.Kind = o.TypeMeta.Kind
o.ObjectMeta.DeepCopyInto(&dst.ObjectMeta)
o.Spec.DeepCopyInto(&dst.Spec)
o.Status.DeepCopyInto(&dst.Status)
}
// Interface compliance compile-time check
var _ resource.Object = &DashboardCompatibilityScore{}
// +k8s:openapi-gen=true
type DashboardCompatibilityScoreList struct {
metav1.TypeMeta `json:",inline" yaml:",inline"`
metav1.ListMeta `json:"metadata" yaml:"metadata"`
Items []DashboardCompatibilityScore `json:"items" yaml:"items"`
}
func (o *DashboardCompatibilityScoreList) DeepCopyObject() runtime.Object {
return o.Copy()
}
func (o *DashboardCompatibilityScoreList) Copy() resource.ListObject {
cpy := &DashboardCompatibilityScoreList{
TypeMeta: o.TypeMeta,
Items: make([]DashboardCompatibilityScore, len(o.Items)),
}
o.ListMeta.DeepCopyInto(&cpy.ListMeta)
for i := 0; i < len(o.Items); i++ {
if item, ok := o.Items[i].Copy().(*DashboardCompatibilityScore); ok {
cpy.Items[i] = *item
}
}
return cpy
}
func (o *DashboardCompatibilityScoreList) GetItems() []resource.Object {
items := make([]resource.Object, len(o.Items))
for i := 0; i < len(o.Items); i++ {
items[i] = &o.Items[i]
}
return items
}
func (o *DashboardCompatibilityScoreList) SetItems(items []resource.Object) {
o.Items = make([]DashboardCompatibilityScore, len(items))
for i := 0; i < len(items); i++ {
o.Items[i] = *items[i].(*DashboardCompatibilityScore)
}
}
func (o *DashboardCompatibilityScoreList) DeepCopy() *DashboardCompatibilityScoreList {
cpy := &DashboardCompatibilityScoreList{}
o.DeepCopyInto(cpy)
return cpy
}
func (o *DashboardCompatibilityScoreList) DeepCopyInto(dst *DashboardCompatibilityScoreList) {
resource.CopyObjectInto(dst, o)
}
// Interface compliance compile-time check
var _ resource.ListObject = &DashboardCompatibilityScoreList{}
// Copy methods for all subresource types
// DeepCopy creates a full deep copy of Spec
func (s *Spec) DeepCopy() *Spec {
cpy := &Spec{}
s.DeepCopyInto(cpy)
return cpy
}
// DeepCopyInto deep copies Spec into another Spec object
func (s *Spec) DeepCopyInto(dst *Spec) {
resource.CopyObjectInto(dst, s)
}
// DeepCopy creates a full deep copy of Status
func (s *Status) DeepCopy() *Status {
cpy := &Status{}
s.DeepCopyInto(cpy)
return cpy
}
// DeepCopyInto deep copies Status into another Status object
func (s *Status) DeepCopyInto(dst *Status) {
resource.CopyObjectInto(dst, s)
}
@@ -0,0 +1,34 @@
//
// Code generated by grafana-app-sdk. DO NOT EDIT.
//
package v1alpha1
import (
"github.com/grafana/grafana-app-sdk/resource"
)
// schema is unexported to prevent accidental overwrites
var (
schemaDashboardCompatibilityScore = resource.NewSimpleSchema("dashvalidator.ext.grafana.com", "v1alpha1", NewDashboardCompatibilityScore(), &DashboardCompatibilityScoreList{}, resource.WithKind("DashboardCompatibilityScore"),
resource.WithPlural("dashboardcompatibilityscores"), resource.WithScope(resource.NamespacedScope))
kindDashboardCompatibilityScore = resource.Kind{
Schema: schemaDashboardCompatibilityScore,
Codecs: map[resource.KindEncoding]resource.Codec{
resource.KindEncodingJSON: &JSONCodec{},
},
}
)
// Kind returns a resource.Kind for this Schema with a JSON codec
func Kind() resource.Kind {
return kindDashboardCompatibilityScore
}
// Schema returns a resource.SimpleSchema representation of DashboardCompatibilityScore
func Schema() *resource.SimpleSchema {
return schemaDashboardCompatibilityScore
}
// Interface compliance checks
var _ resource.Schema = kindDashboardCompatibilityScore
@@ -0,0 +1,48 @@
// Code generated - EDITING IS FUTILE. DO NOT EDIT.
package v1alpha1
// DataSourceMapping specifies a datasource to validate dashboard queries against.
// Maps logical datasource references in the dashboard to actual datasource instances.
// +k8s:openapi-gen=true
type DataSourceMapping struct {
// Unique identifier of the datasource instance.
// Example: "prometheus-prod-us-west"
Uid string `json:"uid"`
// Type of datasource plugin.
// MVP: Only "prometheus" supported.
// Future: "mysql", "postgres", "elasticsearch", etc.
Type string `json:"type"`
// Optional human-readable name for display in results.
// If not provided, UID will be used in error messages.
// Example: "Production Prometheus (US-West)"
Name *string `json:"name,omitempty"`
}
// NewDataSourceMapping creates a new DataSourceMapping object.
func NewDataSourceMapping() *DataSourceMapping {
return &DataSourceMapping{}
}
// +k8s:openapi-gen=true
type Spec struct {
// Complete dashboard JSON object to validate.
// Must be a v1 dashboard schema (contains "panels" array).
// v2 dashboards (with "elements" structure) are not yet supported.
DashboardJson map[string]interface{} `json:"dashboardJson"`
// Array of datasources to validate against.
// The validator will check dashboard queries against each datasource
// and provide per-datasource compatibility results.
//
// MVP: Only single datasource supported (array length = 1), Prometheus type only.
// Future: Will support multiple datasources for dashboards with mixed queries.
DatasourceMappings []DataSourceMapping `json:"datasourceMappings"`
}
// NewSpec creates a new Spec object.
func NewSpec() *Spec {
return &Spec{
DashboardJson: map[string]interface{}{},
DatasourceMappings: []DataSourceMapping{},
}
}
@@ -0,0 +1,151 @@
// Code generated - EDITING IS FUTILE. DO NOT EDIT.
package v1alpha1
// DataSourceResult contains validation results for a single datasource.
// Provides aggregate statistics and per-query breakdown of compatibility.
// +k8s:openapi-gen=true
type DataSourceResult struct {
// Datasource UID that was validated (matches DataSourceMapping.uid)
Uid string `json:"uid"`
// Datasource type (matches DataSourceMapping.type)
Type string `json:"type"`
// Optional display name (matches DataSourceMapping.name if provided)
Name *string `json:"name,omitempty"`
// Total number of queries in the dashboard targeting this datasource.
// Includes all panel targets/queries that reference this datasource.
TotalQueries int64 `json:"totalQueries"`
// Number of queries successfully validated.
// May be less than totalQueries if some queries couldn't be parsed.
CheckedQueries int64 `json:"checkedQueries"`
// Total number of unique metrics/identifiers referenced across all queries.
// For Prometheus: metric names extracted from PromQL expressions.
// For SQL datasources: table and column names.
TotalMetrics int64 `json:"totalMetrics"`
// Number of metrics that exist in the datasource schema.
// foundMetrics <= totalMetrics
FoundMetrics int64 `json:"foundMetrics"`
// Array of metric names that were referenced but don't exist.
// Useful for debugging why a dashboard shows "no data".
// Example for Prometheus: ["http_requests_total", "api_latency_seconds"]
MissingMetrics []string `json:"missingMetrics"`
// Per-query breakdown showing which specific queries have issues.
// One entry per query target (refId: "A", "B", "C", etc.) in each panel.
// Allows pinpointing exactly which panel/query needs fixing.
QueryBreakdown []QueryBreakdown `json:"queryBreakdown"`
// Overall compatibility score for this datasource (0-100).
// Calculated as: (foundMetrics / totalMetrics) * 100
// Used to calculate the global compatibilityScore in status.
CompatibilityScore float64 `json:"compatibilityScore"`
}
// NewDataSourceResult creates a new DataSourceResult object.
func NewDataSourceResult() *DataSourceResult {
return &DataSourceResult{
MissingMetrics: []string{},
QueryBreakdown: []QueryBreakdown{},
}
}
// QueryBreakdown provides compatibility details for a single query within a panel.
// Granular per-query results allow users to identify exactly which queries need fixing.
//
// Note: A panel can have multiple queries (refId: "A", "B", "C", etc.),
// so there may be multiple QueryBreakdown entries for the same panelID.
// +k8s:openapi-gen=true
type QueryBreakdown struct {
// Human-readable panel title for context.
// Example: "CPU Usage", "Request Rate"
PanelTitle string `json:"panelTitle"`
// Numeric panel ID from dashboard JSON.
// Used to correlate with dashboard structure.
PanelID int64 `json:"panelID"`
// Query identifier within the panel.
// Values: "A", "B", "C", etc. (from panel.targets[].refId)
// Uniquely identifies which query in a multi-query panel this refers to.
QueryRefId string `json:"queryRefId"`
// Number of unique metrics referenced in this specific query.
// For Prometheus: metrics extracted from the PromQL expr.
// Example: rate(http_requests_total[5m]) references 1 metric.
TotalMetrics int64 `json:"totalMetrics"`
// Number of those metrics that exist in the datasource.
// foundMetrics <= totalMetrics
FoundMetrics int64 `json:"foundMetrics"`
// Array of missing metric names specific to this query.
// Helps identify exactly which part of a query expression will fail.
// Empty array means query is fully compatible.
MissingMetrics []string `json:"missingMetrics"`
// Compatibility percentage for this individual query (0-100).
// Calculated as: (foundMetrics / totalMetrics) * 100
// 100 = query will work perfectly, 0 = query will return no data.
CompatibilityScore float64 `json:"compatibilityScore"`
}
// NewQueryBreakdown creates a new QueryBreakdown object.
func NewQueryBreakdown() *QueryBreakdown {
return &QueryBreakdown{
MissingMetrics: []string{},
}
}
// +k8s:openapi-gen=true
type StatusOperatorState struct {
// lastEvaluation is the ResourceVersion last evaluated
LastEvaluation string `json:"lastEvaluation"`
// state describes the state of the lastEvaluation.
// It is limited to three possible states for machine evaluation.
State StatusOperatorStateState `json:"state"`
// descriptiveState is an optional more descriptive state field which has no requirements on format
DescriptiveState *string `json:"descriptiveState,omitempty"`
// details contains any extra information that is operator-specific
Details map[string]interface{} `json:"details,omitempty"`
}
// NewStatusOperatorState creates a new StatusOperatorState object.
func NewStatusOperatorState() *StatusOperatorState {
return &StatusOperatorState{}
}
// +k8s:openapi-gen=true
type Status struct {
// Overall compatibility score across all datasources (0-100).
// Calculated as: (total found metrics / total referenced metrics) * 100
//
// Score interpretation:
// - 100: Perfect compatibility, all queries will work
// - 80-99: Excellent, minor missing metrics
// - 50-79: Fair, significant missing metrics
// - 0-49: Poor, most queries will fail
CompatibilityScore float64 `json:"compatibilityScore"`
// Per-datasource validation results.
// Array length matches spec.datasourceMappings.
// Each element contains detailed metrics and query-level breakdown.
DatasourceResults []DataSourceResult `json:"datasourceResults"`
// ISO 8601 timestamp of when validation was last performed.
// Example: "2024-01-15T10:30:00Z"
LastChecked *string `json:"lastChecked,omitempty"`
// operatorStates is a map of operator ID to operator state evaluations.
// Any operator which consumes this kind SHOULD add its state evaluation information to this field.
OperatorStates map[string]StatusOperatorState `json:"operatorStates,omitempty"`
// Human-readable summary of validation result.
// Examples: "All queries compatible", "3 missing metrics found"
Message *string `json:"message,omitempty"`
// additionalFields is reserved for future use
AdditionalFields map[string]interface{} `json:"additionalFields,omitempty"`
}
// NewStatus creates a new Status object.
func NewStatus() *Status {
return &Status{
DatasourceResults: []DataSourceResult{},
}
}
// +k8s:openapi-gen=true
type StatusOperatorStateState string
const (
StatusOperatorStateStateSuccess StatusOperatorStateState = "success"
StatusOperatorStateStateInProgress StatusOperatorStateState = "in_progress"
StatusOperatorStateStateFailed StatusOperatorStateState = "failed"
)
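The score bands documented on Status can be sketched as a standalone helper. This is a minimal illustration of the documented formula; `compatibilityScore` and `interpret` are illustrative names, not part of this API:

```go
package main

import "fmt"

// compatibilityScore applies the documented formula:
// (total found metrics / total referenced metrics) * 100.
// A dashboard with no referenced metrics is treated as fully compatible.
func compatibilityScore(found, total int) float64 {
	if total == 0 {
		return 100
	}
	return float64(found) / float64(total) * 100
}

// interpret maps a score onto the bands from the Status doc comment.
func interpret(score float64) string {
	switch {
	case score == 100:
		return "Perfect compatibility, all queries will work"
	case score >= 80:
		return "Excellent, minor missing metrics"
	case score >= 50:
		return "Fair, significant missing metrics"
	default:
		return "Poor, most queries will fail"
	}
}

func main() {
	s := compatibilityScore(8, 10)
	fmt.Println(s, "->", interpret(s)) // 80 -> Excellent, minor missing metrics
}
```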
File diff suppressed because one or more lines are too long
@@ -0,0 +1,568 @@
package validator
import (
"context"
"fmt"
"net/http"
"strings"
)
// DashboardCompatibilityRequest contains the dashboard and datasource mappings to validate
type DashboardCompatibilityRequest struct {
DashboardJSON map[string]interface{} // Dashboard JSON structure
DatasourceMappings []DatasourceMapping // List of datasources to validate against
}
// DatasourceMapping maps a datasource UID to its type and optionally name/URL
type DatasourceMapping struct {
UID string // Datasource UID
Type string // Datasource type (prometheus, mysql, etc.)
Name string // Optional: Datasource name
URL string // Datasource URL
HTTPClient *http.Client // Authenticated HTTP client
}
// DashboardCompatibilityResult contains the validation results for a dashboard
type DashboardCompatibilityResult struct {
CompatibilityScore float64 // Overall compatibility (0.0 - 1.0)
DatasourceResults []DatasourceValidationResult // Per-datasource results
}
// DatasourceValidationResult contains validation results for one datasource
type DatasourceValidationResult struct {
UID string
Type string
Name string
TotalQueries int
CheckedQueries int
TotalMetrics int
FoundMetrics int
MissingMetrics []string
QueryBreakdown []QueryResult
CompatibilityScore float64
}
// ValidateDashboardCompatibility is the main entry point for validating dashboard compatibility
// It extracts queries from the dashboard, validates them against each datasource, and returns aggregated results
func ValidateDashboardCompatibility(ctx context.Context, req DashboardCompatibilityRequest) (*DashboardCompatibilityResult, error) {
// MVP: Only support single datasource validation
if len(req.DatasourceMappings) != 1 {
return nil, fmt.Errorf("MVP only supports single datasource validation, got %d datasources", len(req.DatasourceMappings))
}
singleDatasource := req.DatasourceMappings[0]
result := &DashboardCompatibilityResult{
DatasourceResults: make([]DatasourceValidationResult, 0, len(req.DatasourceMappings)),
}
// Step 1: Extract queries from dashboard JSON
queries, err := extractQueriesFromDashboard(req.DashboardJSON)
if err != nil {
return nil, fmt.Errorf("failed to extract queries from dashboard: %w", err)
}
fmt.Printf("[DEBUG] Extracted %d queries from dashboard\n", len(queries))
for i, q := range queries {
fmt.Printf("[DEBUG] Query %d: DS=%s, RefID=%s, Query=%s\n", i, q.DatasourceUID, q.RefID, q.QueryText)
}
// Step 2: Group queries by datasource UID (with variable resolution for MVP)
queriesByDatasource := groupQueriesByDatasource(queries, singleDatasource.UID, req.DashboardJSON)
fmt.Printf("[DEBUG] Grouped queries by %d datasources\n", len(queriesByDatasource))
for dsUID, dsQueries := range queriesByDatasource {
fmt.Printf("[DEBUG] Datasource %s has %d queries\n", dsUID, len(dsQueries))
}
// Step 3: Validate each datasource
var totalCompatibility float64
validatedCount := 0
for _, dsMapping := range req.DatasourceMappings {
fmt.Printf("[DEBUG] Processing datasource mapping: UID=%s, Type=%s, URL=%s\n", dsMapping.UID, dsMapping.Type, dsMapping.URL)
// Get queries for this datasource
dsQueries, ok := queriesByDatasource[dsMapping.UID]
if !ok || len(dsQueries) == 0 {
// No queries for this datasource, skip
fmt.Printf("[DEBUG] No queries found for datasource %s, skipping\n", dsMapping.UID)
continue
}
fmt.Printf("[DEBUG] Found %d queries for datasource %s\n", len(dsQueries), dsMapping.UID)
// Get validator for this datasource type
v, err := GetValidator(dsMapping.Type)
if err != nil {
// Unsupported datasource type, skip but log
fmt.Printf("[DEBUG] Failed to get validator for type %s: %v\n", dsMapping.Type, err)
continue
}
fmt.Printf("[DEBUG] Got validator for type %s, starting validation\n", dsMapping.Type)
// Build Datasource struct
ds := Datasource{
UID: dsMapping.UID,
Type: dsMapping.Type,
Name: dsMapping.Name,
URL: dsMapping.URL,
HTTPClient: dsMapping.HTTPClient,
}
// Validate queries
validationResult, err := v.ValidateQueries(ctx, dsQueries, ds)
if err != nil {
// Validation failed for this datasource - return error to caller
// This could be a connection error, auth error, or other critical failure
return nil, fmt.Errorf("validation failed for datasource %s: %w", dsMapping.UID, err)
}
// Convert to DatasourceValidationResult
dsResult := DatasourceValidationResult{
UID: dsMapping.UID,
Type: dsMapping.Type,
Name: dsMapping.Name,
TotalQueries: validationResult.TotalQueries,
CheckedQueries: validationResult.CheckedQueries,
TotalMetrics: validationResult.TotalMetrics,
FoundMetrics: validationResult.FoundMetrics,
MissingMetrics: validationResult.MissingMetrics,
QueryBreakdown: validationResult.QueryBreakdown,
CompatibilityScore: validationResult.CompatibilityScore,
}
result.DatasourceResults = append(result.DatasourceResults, dsResult)
totalCompatibility += validationResult.CompatibilityScore
validatedCount++
}
// Step 4: Calculate overall compatibility score
if validatedCount > 0 {
result.CompatibilityScore = totalCompatibility / float64(validatedCount)
} else {
result.CompatibilityScore = 1.0 // no datasources were validated; report full compatibility
}
return result, nil
}
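Step 4 above reduces to a plain mean over the per-datasource scores, with the no-validated-datasources fallback. A self-contained sketch of that aggregation (`averageScore` is an illustrative name):

```go
package main

import "fmt"

// averageScore mirrors Step 4: the overall score is the mean of the
// validated per-datasource scores, and 1.0 when nothing was validated.
func averageScore(scores []float64) float64 {
	if len(scores) == 0 {
		return 1.0
	}
	var sum float64
	for _, s := range scores {
		sum += s
	}
	return sum / float64(len(scores))
}

func main() {
	fmt.Println(averageScore([]float64{1.0, 0.5})) // 0.75
}
```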
// extractQueriesFromDashboard parses the dashboard JSON and extracts all queries.
// MVP: only the v1 (legacy) format with a top-level "panels" array is supported;
// v2 dashboards return an error.
func extractQueriesFromDashboard(dashboardJSON map[string]interface{}) ([]DashboardQuery, error) {
var queries []DashboardQuery
// Debug: Print what keys we have
fmt.Printf("[DEBUG] Dashboard JSON keys: ")
for key := range dashboardJSON {
fmt.Printf("%s, ", key)
}
fmt.Printf("\n")
// Detect dashboard version (v1 has a top-level "panels" array, v2 uses "elements"/"layout")
// For MVP, we only support v1 (legacy format with panels array)
if !isV1Dashboard(dashboardJSON) {
fmt.Printf("[DEBUG] isV1Dashboard returned false, 'panels' key exists: %v\n", dashboardJSON["panels"] != nil)
return nil, fmt.Errorf("unsupported dashboard format: only v1 dashboards are supported in MVP")
}
// Extract panels array
panels, ok := dashboardJSON["panels"].([]interface{})
if !ok {
// No panels in dashboard, return empty array
return queries, nil
}
// Iterate through all panels
for _, panelInterface := range panels {
panel, ok := panelInterface.(map[string]interface{})
if !ok {
continue
}
// Extract queries from this panel
panelQueries := extractQueriesFromPanel(panel)
queries = append(queries, panelQueries...)
// Handle nested panels in collapsed rows
nestedPanels, hasNested := panel["panels"].([]interface{})
if hasNested {
for _, nestedPanelInterface := range nestedPanels {
nestedPanel, ok := nestedPanelInterface.(map[string]interface{})
if !ok {
continue
}
nestedQueries := extractQueriesFromPanel(nestedPanel)
queries = append(queries, nestedQueries...)
}
}
}
return queries, nil
}
// isV1Dashboard checks if a dashboard is in v1 (legacy) format
// v1 dashboards have a "panels" array at the top level
// v2 dashboards have "elements" map and "layout" structure
//
// This follows Grafana's official dashboard conversion logic which uses
// type-safe assertions to distinguish between formats.
// Reference: apps/dashboard/pkg/migration/conversion/v1beta1_to_v2alpha1.go:450
func isV1Dashboard(dashboard map[string]interface{}) bool {
// Check for v2 indicators first (positive identification)
// v2 dashboards use a map of elements, not an array
if _, hasElements := dashboard["elements"].(map[string]interface{}); hasElements {
return false // Definitely v2
}
// v2 dashboards also have a layout structure
if _, hasLayout := dashboard["layout"]; hasLayout {
return false // v2 has layout field
}
// Check for v1 panels with type assertion (must be an array)
// This is type-safe: `{"panels": "string"}` would fail this check and return false
_, hasPanels := dashboard["panels"].([]interface{})
return hasPanels
}
// extractQueriesFromPanel extracts all queries/targets from a single panel
func extractQueriesFromPanel(panel map[string]interface{}) []DashboardQuery {
var queries []DashboardQuery
// Get panel info for context
panelTitle := getStringValue(panel, "title", "Untitled Panel")
panelID := getIntValue(panel, "id", 0)
// Extract targets array (queries)
targets, hasTargets := panel["targets"].([]interface{})
if !hasTargets {
return queries
}
// Iterate through each target/query
for _, targetInterface := range targets {
target, ok := targetInterface.(map[string]interface{})
if !ok {
continue
}
// Extract datasource UID
datasourceUID := extractDatasourceUID(target, panel)
if datasourceUID == "" {
// Skip queries without datasource
continue
}
// Extract query text (different fields for different datasources)
queryText := extractQueryText(target)
if queryText == "" {
// Skip empty queries
continue
}
// Extract refId (A, B, C, etc.)
refID := getStringValue(target, "refId", "")
// Build DashboardQuery
query := DashboardQuery{
DatasourceUID: datasourceUID,
RefID: refID,
QueryText: queryText,
PanelTitle: panelTitle,
PanelID: panelID,
}
queries = append(queries, query)
}
return queries
}
// extractDatasourceUID gets the datasource UID from a target, falling back to panel datasource
func extractDatasourceUID(target map[string]interface{}, panel map[string]interface{}) string {
// Try target-level datasource first
if ds, ok := target["datasource"]; ok {
if uid := getDatasourceUIDFromValue(ds); uid != "" {
return uid
}
}
// Fall back to panel-level datasource
if ds, ok := panel["datasource"]; ok {
if uid := getDatasourceUIDFromValue(ds); uid != "" {
return uid
}
}
return ""
}
// getDatasourceUIDFromValue extracts UID from datasource value (can be string or object)
func getDatasourceUIDFromValue(ds interface{}) string {
switch v := ds.(type) {
case string:
// Direct UID string
return v
case map[string]interface{}:
// Structured datasource reference { uid: "...", type: "..." }
return getStringValue(v, "uid", "")
default:
return ""
}
}
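The two shapes this type switch handles come straight from JSON decoding: a bare UID string (older panels) or a structured reference object. A minimal sketch of the same dispatch (`uidOf` is an illustrative stand-in for getDatasourceUIDFromValue):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// uidOf mirrors the type switch above: a bare string is the UID itself,
// while an object carries the UID under its "uid" key.
func uidOf(ds interface{}) string {
	switch v := ds.(type) {
	case string:
		return v
	case map[string]interface{}:
		if s, ok := v["uid"].(string); ok {
			return s
		}
	}
	return ""
}

func main() {
	for _, raw := range []string{`"prom-123"`, `{"uid":"prom-123","type":"prometheus"}`} {
		var ds interface{}
		if err := json.Unmarshal([]byte(raw), &ds); err != nil {
			panic(err)
		}
		fmt.Println(uidOf(ds)) // prom-123 for both shapes
	}
}
```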
// isVariableReference checks if a string is a template variable reference
// Matches patterns: ${varname}, $varname, [[varname]]
// Follows Grafana's frontend regex: /\$(\w+)|\[\[(\w+?)(?::(\w+))?\]\]|\${(\w+)(?:\.([^:^\}]+))?(?::([^\}]+))?}/g
// where \w = [A-Za-z0-9_] (alphanumeric + underscore, NO dashes)
func isVariableReference(uid string) bool {
if uid == "" {
return false
}
// Match ${...} pattern - requires at least one \w character inside braces
if len(uid) > 3 && uid[0] == '$' && uid[1] == '{' && uid[len(uid)-1] == '}' {
// Extract content between ${ and }
content := uid[2 : len(uid)-1]
if len(content) == 0 {
return false // Empty braces ${} not allowed
}
// Check if content starts with \w+ (before any . or :)
for i, ch := range content {
if ch == '.' || ch == ':' {
// Found delimiter, check if we had at least one \w before it
return i > 0
}
// Must be alphanumeric or underscore
if !((ch >= 'a' && ch <= 'z') || (ch >= 'A' && ch <= 'Z') ||
(ch >= '0' && ch <= '9') || ch == '_') {
return false
}
}
return true // All characters were valid \w
}
// Match $varname pattern - requires at least one \w character after $
// \w = alphanumeric + underscore (digits ARE allowed, dashes are NOT)
if uid[0] == '$' && len(uid) > 1 {
for i := 1; i < len(uid); i++ {
ch := uid[i]
// \w = [A-Za-z0-9_] only (NO dashes)
if !((ch >= 'a' && ch <= 'z') || (ch >= 'A' && ch <= 'Z') ||
(ch >= '0' && ch <= '9') || ch == '_') {
return false
}
}
return true
}
// Match [[varname]] pattern - requires at least one \w character inside brackets
// Also supports [[varname:format]] syntax
if len(uid) > 4 && uid[0] == '[' && uid[1] == '[' &&
uid[len(uid)-2] == ']' && uid[len(uid)-1] == ']' {
// Extract content between [[ and ]]
content := uid[2 : len(uid)-2]
if len(content) == 0 {
return false // Empty brackets [[]] not allowed
}
// Check if content starts with \w+ (before any :)
for i, ch := range content {
if ch == ':' {
// Found format delimiter, check if we had at least one \w before it
return i > 0
}
// Must be alphanumeric or underscore
if !((ch >= 'a' && ch <= 'z') || (ch >= 'A' && ch <= 'Z') ||
(ch >= '0' && ch <= '9') || ch == '_') {
return false
}
}
return true // All characters were valid \w
}
return false
}
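Anchored as a whole-string match, the hand-rolled checks above correspond roughly to the frontend variable grammar cited in the comment. The pattern below is an assumed approximation for illustration, not the exact frontend regex:

```go
package main

import (
	"fmt"
	"regexp"
)

// Whole-string version of the variable-reference grammar: $name, ${name},
// ${name.field}, ${name:format}, [[name]] and [[name:format]], where the
// name itself is \w+ (alphanumerics and underscore, no dashes).
var varRef = regexp.MustCompile(`^(?:\$\w+|\$\{\w+(?:\.[^:}]+)?(?::[^}]+)?\}|\[\[\w+(?::\w+)?\]\])$`)

func main() {
	for _, s := range []string{
		"$datasource",      // true
		"${DS_PROMETHEUS}", // true
		"[[var:text]]",     // true
		"$ds-name",         // false: dash is not \w
		"${}",              // false: empty name
	} {
		fmt.Println(s, varRef.MatchString(s))
	}
}
```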
// extractVariableName extracts the variable name from a variable reference
// Returns only the name part, excluding fieldPath (after .) and format (after :)
// Examples: ${var.field} -> "var", [[var:text]] -> "var", $datasource -> "datasource"
func extractVariableName(varRef string) string {
if !isVariableReference(varRef) {
return ""
}
// Handle ${varname} pattern - may include .fieldPath or :format
if len(varRef) > 3 && varRef[0] == '$' && varRef[1] == '{' && varRef[len(varRef)-1] == '}' {
content := varRef[2 : len(varRef)-1]
// Extract only up to . or :
for i, ch := range content {
if ch == '.' || ch == ':' {
return content[:i]
}
}
return content
}
// Handle $varname pattern - no modifiers possible
if varRef[0] == '$' && len(varRef) > 1 {
return varRef[1:]
}
// Handle [[varname]] pattern - may include :format
if len(varRef) > 4 && varRef[0] == '[' && varRef[1] == '[' {
content := varRef[2 : len(varRef)-2]
// Extract only up to :
for i, ch := range content {
if ch == ':' {
return content[:i]
}
}
return content
}
return ""
}
// isPrometheusVariable checks if a variable reference points to a Prometheus datasource
// Looks in dashboard.__inputs for the datasource type
func isPrometheusVariable(varRef string, dashboardJSON map[string]interface{}) bool {
if !isVariableReference(varRef) {
return false
}
varName := extractVariableName(varRef)
if varName == "" {
return false
}
// Look for __inputs array in dashboard
inputs, hasInputs := dashboardJSON["__inputs"].([]interface{})
if !hasInputs {
// No __inputs present: assume Prometheus, since the MVP validates
// against a single Prometheus datasource anyway
return true
}
// Search for this variable in __inputs
for _, inputInterface := range inputs {
input, ok := inputInterface.(map[string]interface{})
if !ok {
continue
}
// Check if this input matches our variable name
inputName := getStringValue(input, "name", "")
inputType := getStringValue(input, "type", "")
inputPluginID := getStringValue(input, "pluginId", "")
// Match by name (case-insensitive; also accept the input name appearing
// inside the variable name, e.g. input "prometheus" in var "ds_prometheus").
// varName is already known to be non-empty at this point.
if inputName != "" &&
(strings.EqualFold(inputName, varName) ||
strings.Contains(strings.ToLower(varName), strings.ToLower(inputName))) {
// Only datasource inputs backed by the prometheus plugin count
if inputType == "datasource" && inputPluginID == "prometheus" {
return true
}
}
}
// Not found or not Prometheus
return false
}
// resolveDatasourceUID resolves a datasource UID, handling variable references (MVP: single datasource)
// For MVP, all Prometheus variables resolve to the single datasource UID
func resolveDatasourceUID(uid string, singleDatasourceUID string, dashboardJSON map[string]interface{}) string {
// If not a variable, return as-is (concrete UID)
if !isVariableReference(uid) {
return uid
}
// Check if it's a Prometheus variable
if isPrometheusVariable(uid, dashboardJSON) {
fmt.Printf("[DEBUG] Resolved Prometheus variable %s to %s\n", uid, singleDatasourceUID)
return singleDatasourceUID
}
// Non-Prometheus variable, return as-is (will be ignored in grouping)
fmt.Printf("[DEBUG] Variable %s is not a Prometheus variable, skipping\n", uid)
return uid
}
// extractQueryText extracts the query text from a target
// Different datasources use different field names (expr, query, rawSql, etc.)
func extractQueryText(target map[string]interface{}) string {
// Try common query field names
queryFields := []string{"expr", "query", "rawSql", "rawQuery", "target", "measurement"}
for _, field := range queryFields {
if queryText := getStringValue(target, field, ""); queryText != "" {
return queryText
}
}
return ""
}
// getStringValue safely extracts a string value from a map
func getStringValue(m map[string]interface{}, key string, defaultValue string) string {
if value, ok := m[key]; ok {
if s, ok := value.(string); ok {
return s
}
}
return defaultValue
}
// getIntValue safely extracts an int value from a map
func getIntValue(m map[string]interface{}, key string, defaultValue int) int {
if value, ok := m[key]; ok {
switch v := value.(type) {
case int:
return v
case float64:
return int(v)
case int64:
return int(v)
}
}
return defaultValue
}
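The float64 branch in getIntValue exists because encoding/json decodes every JSON number in a map[string]interface{} as float64, never as int. A quick demonstration (`decodedIDType` is an illustrative helper):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// decodedIDType reports the dynamic Go type that json.Unmarshal assigns
// to the "id" field when decoding into map[string]interface{}.
func decodedIDType(raw string) string {
	var m map[string]interface{}
	if err := json.Unmarshal([]byte(raw), &m); err != nil {
		return err.Error()
	}
	return fmt.Sprintf("%T", m["id"])
}

func main() {
	fmt.Println(decodedIDType(`{"id": 42}`)) // float64
}
```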
// DashboardQuery represents a query extracted from a dashboard panel
type DashboardQuery struct {
DatasourceUID string // Which datasource this query belongs to
RefID string // Query reference ID
QueryText string // The actual query
PanelTitle string // Panel title
PanelID int // Panel ID
}
// groupQueriesByDatasource groups dashboard queries by their datasource UID
// For MVP: resolves Prometheus template variables to the single datasource UID
func groupQueriesByDatasource(queries []DashboardQuery, singleDatasourceUID string, dashboardJSON map[string]interface{}) map[string][]Query {
grouped := make(map[string][]Query)
for _, dq := range queries {
q := Query{
RefID: dq.RefID,
QueryText: dq.QueryText,
PanelTitle: dq.PanelTitle,
PanelID: dq.PanelID,
}
// Resolve datasource UID (handles both concrete UIDs and variables)
resolvedUID := resolveDatasourceUID(dq.DatasourceUID, singleDatasourceUID, dashboardJSON)
// Only add to grouping if we got a valid resolved UID
if resolvedUID != "" {
grouped[resolvedUID] = append(grouped[resolvedUID], q)
}
}
return grouped
}
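The grouping pass above reduces to bucketing by resolved UID and dropping entries whose UID could not be resolved. A standalone sketch under those assumptions (`entry` and `groupByUID` are illustrative names):

```go
package main

import "fmt"

type entry struct{ UID, Expr string }

// groupByUID mirrors groupQueriesByDatasource: entries with an empty
// (unresolved) UID are skipped, the rest are bucketed by UID.
func groupByUID(entries []entry) map[string][]string {
	grouped := map[string][]string{}
	for _, e := range entries {
		if e.UID == "" {
			continue // unresolved variable or missing datasource
		}
		grouped[e.UID] = append(grouped[e.UID], e.Expr)
	}
	return grouped
}

func main() {
	g := groupByUID([]entry{{"prom-1", "up"}, {"", "orphan"}, {"prom-1", "down"}})
	fmt.Println(g["prom-1"]) // [up down]
}
```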
@@ -0,0 +1,604 @@
package validator
import (
"testing"
"github.com/stretchr/testify/require"
)
// Note: extractQueryText() uses a hardcoded field priority list because
// Grafana doesn't expose datasource query schemas at runtime.
// When Grafana adds new datasource types, update the list in dashboard.go
// and add corresponding test cases here.
// =============================================================================
// Category 1: extractQueryText Tests
// Tests verify the hardcoded field priority list works correctly.
// =============================================================================
func TestExtractQueryText(t *testing.T) {
tests := []struct {
name string
target map[string]interface{}
expected string
}{
{
name: "prometheus_expr_field",
target: map[string]interface{}{
"expr": "up",
},
expected: "up",
},
{
name: "mysql_rawSql_field",
target: map[string]interface{}{
"rawSql": "SELECT * FROM users LIMIT 100",
},
expected: "SELECT * FROM users LIMIT 100",
},
{
name: "generic_query_field",
target: map[string]interface{}{
"query": "show measurements",
},
expected: "show measurements",
},
{
name: "field_priority_order",
target: map[string]interface{}{
"expr": "rate(cpu[5m])", // First priority
"query": "show metrics", // Second priority
},
expected: "rate(cpu[5m])", // Should return expr, not query
},
{
name: "missing_query_fields",
target: map[string]interface{}{"refId": "A", "hide": false},
expected: "",
},
{
name: "empty_string_value",
target: map[string]interface{}{
"expr": "",
},
expected: "",
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
result := extractQueryText(tt.target)
require.Equal(t, tt.expected, result)
})
}
}
// =============================================================================
// Category 2: getDatasourceUIDFromValue Tests (4 tests)
// =============================================================================
func TestGetDatasourceUIDFromValue(t *testing.T) {
tests := []struct {
name string
value interface{}
expected string
}{
{
name: "string_datasource_uid",
value: "prom-123",
expected: "prom-123",
},
{
name: "object_datasource_with_uid",
value: map[string]interface{}{
"uid": "prom-123",
"type": "prometheus",
},
expected: "prom-123",
},
{
name: "variable_reference_passed_through",
value: "${DS_PROMETHEUS}",
expected: "${DS_PROMETHEUS}",
},
{
name: "nil_value",
value: nil,
expected: "",
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
result := getDatasourceUIDFromValue(tt.value)
require.Equal(t, tt.expected, result)
})
}
}
// =============================================================================
// Category 3: extractDatasourceUID Tests (5 tests)
// =============================================================================
func TestExtractDatasourceUID(t *testing.T) {
tests := []struct {
name string
target map[string]interface{}
panel map[string]interface{}
expected string
}{
{
name: "target_level_datasource_string",
target: map[string]interface{}{
"datasource": "target-ds-123",
},
panel: map[string]interface{}{},
expected: "target-ds-123",
},
{
name: "target_level_datasource_object",
target: map[string]interface{}{
"datasource": map[string]interface{}{
"uid": "target-ds-456",
"type": "prometheus",
},
},
panel: map[string]interface{}{},
expected: "target-ds-456",
},
{
name: "panel_level_fallback",
target: map[string]interface{}{},
panel: map[string]interface{}{
"datasource": "panel-ds-789",
},
expected: "panel-ds-789",
},
{
name: "target_level_takes_precedence",
target: map[string]interface{}{
"datasource": "target-ds",
},
panel: map[string]interface{}{
"datasource": "panel-ds",
},
expected: "target-ds",
},
{
name: "both_missing_returns_empty",
target: map[string]interface{}{},
panel: map[string]interface{}{},
expected: "",
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
result := extractDatasourceUID(tt.target, tt.panel)
require.Equal(t, tt.expected, result)
})
}
}
// =============================================================================
// Category 4: extractQueriesFromPanel Tests (8 tests)
// =============================================================================
func TestExtractQueriesFromPanel(t *testing.T) {
tests := []struct {
name string
panel map[string]interface{}
expected []DashboardQuery
}{
{
name: "panel_with_single_target",
panel: map[string]interface{}{
"id": 42,
"title": "CPU Usage",
"targets": []interface{}{
map[string]interface{}{
"refId": "A",
"expr": "rate(cpu[5m])",
"datasource": "prom-main",
},
},
},
expected: []DashboardQuery{
{
DatasourceUID: "prom-main",
RefID: "A",
QueryText: "rate(cpu[5m])",
PanelTitle: "CPU Usage",
PanelID: 42,
},
},
},
{
name: "panel_with_multiple_targets",
panel: map[string]interface{}{
"id": 10,
"title": "Metrics",
"targets": []interface{}{
map[string]interface{}{
"refId": "A",
"expr": "up",
"datasource": "prom-1",
},
map[string]interface{}{
"refId": "B",
"expr": "down",
"datasource": "prom-1",
},
},
},
expected: []DashboardQuery{
{
DatasourceUID: "prom-1",
RefID: "A",
QueryText: "up",
PanelTitle: "Metrics",
PanelID: 10,
},
{
DatasourceUID: "prom-1",
RefID: "B",
QueryText: "down",
PanelTitle: "Metrics",
PanelID: 10,
},
},
},
{
name: "panel_with_no_targets_field",
panel: map[string]interface{}{
"id": 1,
"title": "Text Panel",
},
expected: []DashboardQuery{},
},
{
name: "panel_with_empty_targets_array",
panel: map[string]interface{}{
"id": 2,
"title": "Empty",
"targets": []interface{}{},
},
expected: []DashboardQuery{},
},
{
name: "target_missing_datasource_skipped",
panel: map[string]interface{}{
"id": 3,
"title": "Incomplete",
"targets": []interface{}{
map[string]interface{}{
"refId": "A",
"expr": "up",
// No datasource field
},
},
},
expected: []DashboardQuery{}, // Empty because no datasource
},
{
name: "target_missing_query_text_skipped",
panel: map[string]interface{}{
"id": 4,
"title": "No Query",
"targets": []interface{}{
map[string]interface{}{
"refId": "A",
"datasource": "prom-1",
// No expr/query field
},
},
},
expected: []DashboardQuery{}, // Empty because no query text
},
{
name: "panel_metadata_extraction",
panel: map[string]interface{}{
"id": 999,
"title": "Custom Title",
"targets": []interface{}{
map[string]interface{}{
"refId": "Z",
"expr": "test_metric",
"datasource": "ds-abc",
},
},
},
expected: []DashboardQuery{
{
DatasourceUID: "ds-abc",
RefID: "Z",
QueryText: "test_metric",
PanelTitle: "Custom Title",
PanelID: 999,
},
},
},
{
name: "panel_id_as_float64",
panel: map[string]interface{}{
"id": float64(123), // JSON numbers parse as float64
"title": "Float ID Panel",
"targets": []interface{}{
map[string]interface{}{
"refId": "A",
"expr": "metric",
"datasource": "ds-1",
},
},
},
expected: []DashboardQuery{
{
DatasourceUID: "ds-1",
RefID: "A",
QueryText: "metric",
PanelTitle: "Float ID Panel",
PanelID: 123,
},
},
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
result := extractQueriesFromPanel(tt.panel)
if len(tt.expected) == 0 {
require.Empty(t, result)
} else {
require.Equal(t, tt.expected, result)
}
})
}
}
// =============================================================================
// Category 5: Helper Functions Tests
// =============================================================================
func TestGetStringValue(t *testing.T) {
tests := []struct {
name string
m map[string]interface{}
key string
defaultValue string
expected string
}{
{
name: "returns_value_if_exists",
m: map[string]interface{}{"name": "test"},
key: "name",
defaultValue: "default",
expected: "test",
},
{
name: "returns_default_if_missing",
m: map[string]interface{}{"other": "value"},
key: "name",
defaultValue: "default",
expected: "default",
},
{
name: "handles_non_string_type",
m: map[string]interface{}{"name": 123},
key: "name",
defaultValue: "default",
expected: "default",
},
{
name: "empty_map_returns_default",
m: map[string]interface{}{},
key: "name",
defaultValue: "default",
expected: "default",
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
result := getStringValue(tt.m, tt.key, tt.defaultValue)
require.Equal(t, tt.expected, result)
})
}
}
func TestGetIntValue(t *testing.T) {
tests := []struct {
name string
m map[string]interface{}
key string
defaultValue int
expected int
}{
{
name: "returns_int_value",
m: map[string]interface{}{"count": 42},
key: "count",
defaultValue: 0,
expected: 42,
},
{
name: "handles_float64_conversion",
m: map[string]interface{}{"count": float64(123)},
key: "count",
defaultValue: 0,
expected: 123,
},
{
name: "handles_int64_conversion",
m: map[string]interface{}{"count": int64(456)},
key: "count",
defaultValue: 0,
expected: 456,
},
{
name: "returns_default_for_missing",
m: map[string]interface{}{},
key: "count",
defaultValue: 99,
expected: 99,
},
{
name: "returns_default_for_invalid_type",
m: map[string]interface{}{"count": "not a number"},
key: "count",
defaultValue: 99,
expected: 99,
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
result := getIntValue(tt.m, tt.key, tt.defaultValue)
require.Equal(t, tt.expected, result)
})
}
}
// =============================================================================
// Category 6: Integration Tests
// Real-world dashboard panel structures
// =============================================================================
func TestRealisticPrometheusPanel(t *testing.T) {
// Realistic Prometheus panel taken from an actual Grafana dashboard
panel := map[string]interface{}{
"datasource": map[string]interface{}{
"type": "prometheus",
"uid": "prometheus-main",
},
"gridPos": map[string]interface{}{
"h": 8,
"w": 12,
"x": 0,
"y": 0,
},
"id": 28,
"title": "Request Rate",
"type": "timeseries",
"targets": []interface{}{
map[string]interface{}{
"datasource": map[string]interface{}{
"type": "prometheus",
"uid": "prometheus-main",
},
"expr": "rate(http_requests_total{job=\"api\"}[5m])",
"refId": "A",
"legendFormat": "{{method}} {{status}}",
"interval": "",
},
map[string]interface{}{
"datasource": map[string]interface{}{
"type": "prometheus",
"uid": "prometheus-main",
},
"expr": "rate(http_requests_total{job=\"worker\"}[5m])",
"refId": "B",
"legendFormat": "{{method}}",
},
},
}
result := extractQueriesFromPanel(panel)
require.Len(t, result, 2)
require.Equal(t, "prometheus-main", result[0].DatasourceUID)
require.Equal(t, "A", result[0].RefID)
require.Equal(t, "rate(http_requests_total{job=\"api\"}[5m])", result[0].QueryText)
require.Equal(t, "Request Rate", result[0].PanelTitle)
require.Equal(t, 28, result[0].PanelID)
require.Equal(t, "prometheus-main", result[1].DatasourceUID)
require.Equal(t, "B", result[1].RefID)
require.Equal(t, "rate(http_requests_total{job=\"worker\"}[5m])", result[1].QueryText)
}
func TestRealisticMySQLPanel(t *testing.T) {
// Realistic MySQL panel structure
panel := map[string]interface{}{
"id": 10,
"title": "Recent Users",
"type": "table",
"datasource": map[string]interface{}{
"type": "mysql",
"uid": "mysql-prod",
},
"targets": []interface{}{
map[string]interface{}{
"datasource": map[string]interface{}{
"type": "mysql",
"uid": "mysql-prod",
},
"refId": "A",
"rawSql": "SELECT id, username, email FROM users WHERE created_at > NOW() - INTERVAL 1 DAY ORDER BY created_at DESC LIMIT 100",
"format": "table",
},
},
}
result := extractQueriesFromPanel(panel)
require.Len(t, result, 1)
require.Equal(t, "mysql-prod", result[0].DatasourceUID)
require.Equal(t, "A", result[0].RefID)
require.Contains(t, result[0].QueryText, "SELECT id, username, email FROM users")
require.Equal(t, "Recent Users", result[0].PanelTitle)
require.Equal(t, 10, result[0].PanelID)
}
func TestMixedDatasourcesPanel(t *testing.T) {
// Panel with targets using different datasource types
panel := map[string]interface{}{
"id": 50,
"title": "Mixed Data",
"datasource": map[string]interface{}{
"type": "prometheus",
"uid": "default-prom",
},
"targets": []interface{}{
map[string]interface{}{
"datasource": map[string]interface{}{
"type": "prometheus",
"uid": "prom-1",
},
"refId": "A",
"expr": "up",
},
map[string]interface{}{
"datasource": map[string]interface{}{
"type": "elasticsearch",
"uid": "elastic-1",
},
"refId": "B",
"query": "status:200",
},
map[string]interface{}{
// Uses panel-level datasource (fallback)
"refId": "C",
"expr": "down",
},
},
}
result := extractQueriesFromPanel(panel)
require.Len(t, result, 3)
// Prometheus query
require.Equal(t, "prom-1", result[0].DatasourceUID)
require.Equal(t, "A", result[0].RefID)
require.Equal(t, "up", result[0].QueryText)
// Elasticsearch query
require.Equal(t, "elastic-1", result[1].DatasourceUID)
require.Equal(t, "B", result[1].RefID)
require.Equal(t, "status:200", result[1].QueryText)
// Query with panel-level datasource fallback
require.Equal(t, "default-prom", result[2].DatasourceUID)
require.Equal(t, "C", result[2].RefID)
require.Equal(t, "down", result[2].QueryText)
}
@@ -0,0 +1,197 @@
package validator
import (
"testing"
"github.com/stretchr/testify/require"
)
func TestIsV1Dashboard(t *testing.T) {
tests := []struct {
name string
dashboard map[string]interface{}
expected bool
}{
{
name: "v1 dashboard with panels array",
dashboard: map[string]interface{}{
"panels": []interface{}{
map[string]interface{}{
"id": 1,
"title": "Panel 1",
"type": "timeseries",
},
},
},
expected: true,
},
{
name: "v1 dashboard with empty panels",
dashboard: map[string]interface{}{
"panels": []interface{}{},
},
expected: true,
},
{
name: "v2 dashboard with elements map",
dashboard: map[string]interface{}{
"elements": map[string]interface{}{
"panel-1": map[string]interface{}{
"kind": "Panel",
"spec": map[string]interface{}{
"id": 1,
"title": "Panel 1",
},
},
},
},
expected: false,
},
{
name: "v2 dashboard with layout",
dashboard: map[string]interface{}{
"layout": map[string]interface{}{
"kind": "GridLayout",
"spec": map[string]interface{}{
"items": []interface{}{},
},
},
},
expected: false,
},
{
name: "v2 dashboard with both elements and layout",
dashboard: map[string]interface{}{
"elements": map[string]interface{}{
"panel-1": map[string]interface{}{
"kind": "Panel",
},
},
"layout": map[string]interface{}{
"kind": "GridLayout",
},
},
expected: false,
},
{
name: "empty dashboard",
dashboard: map[string]interface{}{},
expected: false,
},
{
name: "dashboard with wrong panels type (string instead of array)",
dashboard: map[string]interface{}{
"panels": "this-should-be-array-not-string",
},
expected: false,
},
{
name: "dashboard with other fields only",
dashboard: map[string]interface{}{
"title": "Test Dashboard",
"uid": "test-uid",
"tags": []string{"monitoring"},
},
expected: false,
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
result := isV1Dashboard(tt.dashboard)
require.Equal(t, tt.expected, result, "isV1Dashboard() returned unexpected result")
})
}
}
func TestExtractQueriesFromDashboard_VersionValidation(t *testing.T) {
tests := []struct {
name string
dashboard map[string]interface{}
expectError bool
errorContains string
}{
{
name: "valid v1 dashboard extracts queries successfully",
dashboard: map[string]interface{}{
"panels": []interface{}{
map[string]interface{}{
"id": 1,
"title": "CPU Usage",
"type": "timeseries",
"gridPos": map[string]interface{}{
"h": 8,
"w": 12,
"x": 0,
"y": 0,
},
"targets": []interface{}{
map[string]interface{}{
"datasource": map[string]interface{}{
"type": "prometheus",
"uid": "test-prometheus",
},
"expr": "rate(cpu_usage_total[5m])",
"refId": "A",
},
},
},
},
},
expectError: false,
},
{
name: "v2 dashboard returns unsupported format error",
dashboard: map[string]interface{}{
"elements": map[string]interface{}{
"panel-1": map[string]interface{}{
"kind": "Panel",
"spec": map[string]interface{}{
"id": 1,
"title": "Panel 1",
"data": map[string]interface{}{
"kind": "QueryGroup",
},
"vizConfig": map[string]interface{}{
"kind": "TimeSeriesVisualConfig",
"pluginId": "timeseries",
},
},
},
},
"layout": map[string]interface{}{
"kind": "GridLayout",
"spec": map[string]interface{}{
"items": []interface{}{},
},
},
},
expectError: true,
errorContains: "unsupported dashboard format",
},
{
name: "invalid dashboard (no panels or elements) returns error",
dashboard: map[string]interface{}{
"title": "Invalid Dashboard",
"description": "This dashboard has no panels or elements",
"tags": []string{"test"},
},
expectError: true,
errorContains: "unsupported dashboard format",
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
queries, err := extractQueriesFromDashboard(tt.dashboard)
if tt.expectError {
require.Error(t, err, "Expected error but got none")
require.Contains(t, err.Error(), tt.errorContains, "Error message doesn't contain expected substring")
} else {
require.NoError(t, err, "Expected no error but got: %v", err)
require.NotNil(t, queries, "Queries should not be nil for valid dashboard")
}
})
}
}
@@ -0,0 +1,173 @@
package validator
import (
"errors"
"fmt"
"net/http"
)
// ErrorCode represents the type of error that occurred
type ErrorCode string
const (
// Datasource-related errors
ErrCodeDatasourceNotFound ErrorCode = "datasource_not_found"
ErrCodeDatasourceWrongType ErrorCode = "datasource_wrong_type"
ErrCodeDatasourceUnreachable ErrorCode = "datasource_unreachable"
ErrCodeDatasourceAuth ErrorCode = "datasource_auth_failed"
ErrCodeDatasourceConfig ErrorCode = "datasource_config_error"
// API-related errors
ErrCodeAPIUnavailable ErrorCode = "api_unavailable"
ErrCodeAPIInvalidResponse ErrorCode = "api_invalid_response"
ErrCodeAPIRateLimit ErrorCode = "api_rate_limit"
ErrCodeAPITimeout ErrorCode = "api_timeout"
// Validation errors
ErrCodeInvalidDashboard ErrorCode = "invalid_dashboard"
ErrCodeUnsupportedDashVersion ErrorCode = "unsupported_dashboard_version"
ErrCodeInvalidQuery ErrorCode = "invalid_query"
// Internal errors
ErrCodeInternal ErrorCode = "internal_error"
)
// ValidationError represents a structured error with context
type ValidationError struct {
Code ErrorCode
Message string
Details map[string]interface{}
StatusCode int
Cause error
}
// Error implements the error interface
func (e *ValidationError) Error() string {
if e.Cause != nil {
return fmt.Sprintf("%s: %s (caused by: %v)", e.Code, e.Message, e.Cause)
}
return fmt.Sprintf("%s: %s", e.Code, e.Message)
}
// Unwrap implements error unwrapping
func (e *ValidationError) Unwrap() error {
return e.Cause
}
// NewValidationError creates a new ValidationError
func NewValidationError(code ErrorCode, message string, statusCode int) *ValidationError {
return &ValidationError{
Code: code,
Message: message,
StatusCode: statusCode,
Details: make(map[string]interface{}),
}
}
// WithCause adds the underlying error cause
func (e *ValidationError) WithCause(err error) *ValidationError {
e.Cause = err
return e
}
// WithDetail adds contextual information
func (e *ValidationError) WithDetail(key string, value interface{}) *ValidationError {
e.Details[key] = value
return e
}
// Common error constructors
// NewDatasourceNotFoundError creates an error for datasource not found
func NewDatasourceNotFoundError(uid string, namespace string) *ValidationError {
return NewValidationError(
ErrCodeDatasourceNotFound,
fmt.Sprintf("datasource not found: %s", uid),
http.StatusNotFound,
).WithDetail("datasourceUID", uid).WithDetail("namespace", namespace)
}
// NewDatasourceWrongTypeError creates an error for wrong datasource type
func NewDatasourceWrongTypeError(uid string, expectedType string, actualType string) *ValidationError {
return NewValidationError(
ErrCodeDatasourceWrongType,
fmt.Sprintf("datasource %s has wrong type: expected %s, got %s", uid, expectedType, actualType),
http.StatusBadRequest,
).WithDetail("datasourceUID", uid).
WithDetail("expectedType", expectedType).
WithDetail("actualType", actualType)
}
// NewDatasourceUnreachableError creates an error for unreachable datasource
func NewDatasourceUnreachableError(uid string, url string, cause error) *ValidationError {
return NewValidationError(
ErrCodeDatasourceUnreachable,
fmt.Sprintf("datasource %s at %s is unreachable", uid, url),
http.StatusServiceUnavailable,
).WithDetail("datasourceUID", uid).
WithDetail("url", url).
WithCause(cause)
}
// NewAPIUnavailableError creates an error for unavailable API
func NewAPIUnavailableError(statusCode int, responseBody string, cause error) *ValidationError {
return NewValidationError(
ErrCodeAPIUnavailable,
fmt.Sprintf("Prometheus API returned status %d", statusCode),
http.StatusBadGateway,
).WithDetail("upstreamStatus", statusCode).
WithDetail("responseBody", responseBody).
WithCause(cause)
}
// NewAPIInvalidResponseError creates an error for invalid API response
func NewAPIInvalidResponseError(message string, cause error) *ValidationError {
return NewValidationError(
ErrCodeAPIInvalidResponse,
fmt.Sprintf("Prometheus API returned invalid response: %s", message),
http.StatusBadGateway,
).WithCause(cause)
}
// NewAPITimeoutError creates an error for API timeout
func NewAPITimeoutError(url string, cause error) *ValidationError {
return NewValidationError(
ErrCodeAPITimeout,
fmt.Sprintf("request to %s timed out", url),
http.StatusGatewayTimeout,
).WithDetail("url", url).
WithCause(cause)
}
// NewDatasourceAuthError creates an error for authentication failures
func NewDatasourceAuthError(uid string, statusCode int) *ValidationError {
return NewValidationError(
ErrCodeDatasourceAuth,
fmt.Sprintf("authentication failed for datasource %s (status %d)", uid, statusCode),
http.StatusUnauthorized,
).WithDetail("datasourceUID", uid).
WithDetail("upstreamStatus", statusCode)
}
// IsValidationError checks if an error is a ValidationError
func IsValidationError(err error) bool {
var validationErr *ValidationError
return errors.As(err, &validationErr)
}
// GetValidationError extracts a ValidationError from an error chain
func GetValidationError(err error) *ValidationError {
var validationErr *ValidationError
if errors.As(err, &validationErr) {
return validationErr
}
return nil
}
// GetHTTPStatusCode returns the appropriate HTTP status code for an error
func GetHTTPStatusCode(err error) int {
if validationErr := GetValidationError(err); validationErr != nil {
return validationErr.StatusCode
}
return http.StatusInternalServerError
}
@@ -0,0 +1,131 @@
package validator
import (
"errors"
"net/http"
"testing"
"github.com/stretchr/testify/require"
)
func TestNewDatasourceNotFoundError(t *testing.T) {
err := NewDatasourceNotFoundError("test-uid", "org-1")
require.Equal(t, ErrCodeDatasourceNotFound, err.Code)
require.Equal(t, http.StatusNotFound, err.StatusCode)
require.Equal(t, "test-uid", err.Details["datasourceUID"])
require.Equal(t, "org-1", err.Details["namespace"])
}
func TestNewDatasourceWrongTypeError(t *testing.T) {
err := NewDatasourceWrongTypeError("test-uid", "prometheus", "influxdb")
require.Equal(t, ErrCodeDatasourceWrongType, err.Code)
require.Equal(t, http.StatusBadRequest, err.StatusCode)
require.Equal(t, "prometheus", err.Details["expectedType"])
require.Equal(t, "influxdb", err.Details["actualType"])
}
func TestNewDatasourceUnreachableError(t *testing.T) {
cause := errors.New("connection refused")
err := NewDatasourceUnreachableError("test-uid", "http://localhost:9090", cause)
require.Equal(t, ErrCodeDatasourceUnreachable, err.Code)
require.Equal(t, http.StatusServiceUnavailable, err.StatusCode)
require.Equal(t, cause, err.Cause)
require.Equal(t, "http://localhost:9090", err.Details["url"])
}
func TestNewAPIUnavailableError(t *testing.T) {
err := NewAPIUnavailableError(503, "service unavailable", nil)
require.Equal(t, ErrCodeAPIUnavailable, err.Code)
require.Equal(t, http.StatusBadGateway, err.StatusCode)
require.Equal(t, 503, err.Details["upstreamStatus"])
}
func TestNewAPIInvalidResponseError(t *testing.T) {
cause := errors.New("invalid JSON")
err := NewAPIInvalidResponseError("missing data field", cause)
require.Equal(t, ErrCodeAPIInvalidResponse, err.Code)
require.Equal(t, http.StatusBadGateway, err.StatusCode)
require.Equal(t, cause, err.Cause)
}
func TestNewAPITimeoutError(t *testing.T) {
cause := errors.New("context deadline exceeded")
err := NewAPITimeoutError("http://localhost:9090/api/v1/query", cause)
require.Equal(t, ErrCodeAPITimeout, err.Code)
require.Equal(t, http.StatusGatewayTimeout, err.StatusCode)
require.Equal(t, cause, err.Cause)
}
func TestNewDatasourceAuthError(t *testing.T) {
err := NewDatasourceAuthError("test-uid", 401)
require.Equal(t, ErrCodeDatasourceAuth, err.Code)
require.Equal(t, http.StatusUnauthorized, err.StatusCode)
require.Equal(t, 401, err.Details["upstreamStatus"])
}
func TestValidationErrorChaining(t *testing.T) {
cause := errors.New("network error")
err := NewValidationError(ErrCodeInternal, "test error", http.StatusInternalServerError).
WithCause(cause).
WithDetail("key1", "value1").
WithDetail("key2", 123)
require.Equal(t, cause, err.Cause)
require.Equal(t, "value1", err.Details["key1"])
require.Equal(t, 123, err.Details["key2"])
}
func TestIsValidationError(t *testing.T) {
validationErr := NewDatasourceNotFoundError("test-uid", "org-1")
regularErr := errors.New("regular error")
require.True(t, IsValidationError(validationErr), "expected IsValidationError to return true for ValidationError")
require.False(t, IsValidationError(regularErr), "expected IsValidationError to return false for regular error")
}
func TestGetValidationError(t *testing.T) {
validationErr := NewDatasourceNotFoundError("test-uid", "org-1")
regularErr := errors.New("regular error")
retrieved := GetValidationError(validationErr)
require.NotNil(t, retrieved, "expected GetValidationError to return the ValidationError")
require.Equal(t, ErrCodeDatasourceNotFound, retrieved.Code)
retrieved = GetValidationError(regularErr)
require.Nil(t, retrieved, "expected GetValidationError to return nil for regular error")
}
func TestGetHTTPStatusCode(t *testing.T) {
validationErr := NewDatasourceNotFoundError("test-uid", "org-1")
regularErr := errors.New("regular error")
require.Equal(t, http.StatusNotFound, GetHTTPStatusCode(validationErr))
require.Equal(t, http.StatusInternalServerError, GetHTTPStatusCode(regularErr), "expected default status code for regular error")
}
func TestErrorUnwrap(t *testing.T) {
cause := errors.New("underlying error")
err := NewDatasourceUnreachableError("test-uid", "http://localhost:9090", cause)
require.Equal(t, cause, errors.Unwrap(err), "expected Unwrap to return the cause")
}
func TestErrorErrorMethod(t *testing.T) {
// Test without cause
err1 := NewDatasourceNotFoundError("test-uid", "org-1")
require.NotEmpty(t, err1.Error(), "expected non-empty error message")
// Test with cause
cause := errors.New("underlying error")
err2 := NewDatasourceUnreachableError("test-uid", "http://localhost:9090", cause)
errMsg2 := err2.Error()
require.NotEmpty(t, errMsg2, "expected non-empty error message")
require.Contains(t, errMsg2, "underlying error", "error message should include cause")
}
@@ -0,0 +1,142 @@
package prometheus
import (
"context"
"encoding/json"
"errors"
"fmt"
"io"
"net/http"
"net/url"
"path"
"strings"
"github.com/grafana/grafana/apps/dashvalidator/pkg/validator"
)
// Fetcher fetches available metrics from a Prometheus datasource
type Fetcher struct{}
// NewFetcher creates a new Prometheus metrics fetcher
func NewFetcher() *Fetcher {
return &Fetcher{}
}
// prometheusResponse represents the Prometheus API response structure
type prometheusResponse struct {
Status string `json:"status"`
Data []string `json:"data"`
Error string `json:"error,omitempty"`
}
// FetchMetrics queries Prometheus to get all available metric names
// It uses the /api/v1/label/__name__/values endpoint
// The provided HTTP client should have proper authentication configured
func (f *Fetcher) FetchMetrics(ctx context.Context, datasourceURL string, client *http.Client) ([]string, error) {
// Build the API URL
baseURL, err := url.Parse(datasourceURL)
if err != nil {
return nil, validator.NewValidationError(
validator.ErrCodeDatasourceConfig,
"invalid datasource URL",
http.StatusBadRequest,
).WithCause(err).WithDetail("url", datasourceURL)
}
// Append Prometheus API endpoint to base URL path using path.Join
// This correctly handles datasources with existing paths (e.g., /api/prom)
endpoint := "api/v1/label/__name__/values"
baseURL.Path = path.Join(baseURL.Path, endpoint)
// Create the request
req, err := http.NewRequestWithContext(ctx, http.MethodGet, baseURL.String(), nil)
if err != nil {
return nil, validator.NewValidationError(
validator.ErrCodeInternal,
"failed to create HTTP request",
http.StatusInternalServerError,
).WithCause(err)
}
// Execute the request using the provided authenticated client
resp, err := client.Do(req)
if err != nil {
// Check if it's a timeout error
if errors.Is(err, context.DeadlineExceeded) || strings.Contains(err.Error(), "timeout") {
return nil, validator.NewAPITimeoutError(baseURL.String(), err)
}
// Network or connection error - datasource is unreachable
return nil, validator.NewDatasourceUnreachableError("", datasourceURL, err)
}
defer resp.Body.Close()
// Read response body for error reporting
body, readErr := io.ReadAll(resp.Body)
if readErr != nil {
body = []byte("<unable to read response body>")
}
// Check HTTP status code
switch resp.StatusCode {
case http.StatusOK:
// Success - continue to parse response
case http.StatusUnauthorized, http.StatusForbidden:
// Authentication or authorization failure
return nil, validator.NewDatasourceAuthError("", resp.StatusCode).
WithDetail("url", baseURL.String()).
WithDetail("responseBody", string(body))
case http.StatusNotFound:
// Endpoint not found - might not be a valid Prometheus instance
return nil, validator.NewAPIUnavailableError(
resp.StatusCode,
string(body),
errors.New("endpoint not found - this may not be a valid Prometheus datasource"),
).WithDetail("url", baseURL.String())
case http.StatusTooManyRequests:
// Rate limiting
return nil, validator.NewValidationError(
validator.ErrCodeAPIRateLimit,
"Prometheus API rate limit exceeded",
http.StatusTooManyRequests,
).WithDetail("url", baseURL.String()).WithDetail("responseBody", string(body))
case http.StatusServiceUnavailable, http.StatusBadGateway, http.StatusGatewayTimeout:
// Upstream service is down or unavailable
return nil, validator.NewAPIUnavailableError(resp.StatusCode, string(body), nil).
WithDetail("url", baseURL.String())
default:
// Other error status codes
return nil, validator.NewAPIUnavailableError(resp.StatusCode, string(body), nil).
WithDetail("url", baseURL.String())
}
// Parse the response JSON
var promResp prometheusResponse
if err := json.Unmarshal(body, &promResp); err != nil {
return nil, validator.NewAPIInvalidResponseError(
"response is not valid JSON",
err,
).WithDetail("url", baseURL.String()).WithDetail("responseBody", string(body))
}
// Check Prometheus API status field
if promResp.Status != "success" {
errorMsg := promResp.Error
if errorMsg == "" {
errorMsg = "unknown error"
}
return nil, validator.NewAPIInvalidResponseError(
fmt.Sprintf("Prometheus API returned error status: %s", errorMsg),
nil,
).WithDetail("url", baseURL.String()).WithDetail("prometheusError", errorMsg)
}
// Validate that we got data
if promResp.Data == nil {
return nil, validator.NewAPIInvalidResponseError(
"response missing 'data' field",
nil,
).WithDetail("url", baseURL.String()).WithDetail("responseBody", string(body))
}
return promResp.Data, nil
}
@@ -0,0 +1,49 @@
package prometheus
import (
"fmt"
"github.com/prometheus/prometheus/promql/parser"
)
// Parser extracts metric names from PromQL queries
type Parser struct{}
// NewParser creates a new PromQL parser
func NewParser() *Parser {
return &Parser{}
}
// ExtractMetrics parses a PromQL query and extracts all metric names
// For example: "rate(http_requests_total[5m])" returns ["http_requests_total"]
func (p *Parser) ExtractMetrics(query string) ([]string, error) {
// Parse the PromQL expression
expr, err := parser.ParseExpr(query)
if err != nil {
return nil, fmt.Errorf("failed to parse PromQL query: %w", err)
}
// Extract metric names by walking the AST
metrics := make(map[string]bool) // Use map to deduplicate
parser.Inspect(expr, func(node parser.Node, _ []parser.Node) error {
// VectorSelector represents a metric selector like "up" or "up{job="foo"}".
// Inspect also walks into the VectorSelector inside a MatrixSelector
// (range queries like "up[5m]"), so one case covers both forms.
if vs, ok := node.(*parser.VectorSelector); ok && vs.Name != "" {
// Selectors such as {job="foo"} carry no metric name; skip empty names.
metrics[vs.Name] = true
}
return nil
})
// Convert map to slice
result := make([]string, 0, len(metrics))
for metric := range metrics {
result = append(result, metric)
}
return result, nil
}
@@ -0,0 +1,137 @@
package prometheus
import (
"testing"
"github.com/stretchr/testify/require"
)
func TestExtractMetrics(t *testing.T) {
parser := NewParser()
tests := []struct {
name string
query string
expected []string
expectError bool
errorContains string
}{
// Category 1: Basic Extraction (3 tests - covers AST node types)
{
name: "simple metric",
query: "up",
expected: []string{"up"},
},
{
name: "metric with labels",
query: `up{job="api"}`,
expected: []string{"up"},
},
{
name: "range selector",
query: "up[5m]",
expected: []string{"up"},
},
// Category 2: Function Composition (2 tests - nested complexity)
{
name: "single function",
query: "rate(http_requests_total[5m])",
expected: []string{"http_requests_total"},
},
{
name: "nested functions",
query: "sum(rate(requests[5m]))",
expected: []string{"requests"},
},
// Category 3: Binary Operations (2 tests - multiple metrics)
{
name: "two metrics",
query: "metric_a + metric_b",
expected: []string{"metric_a", "metric_b"},
},
{
name: "three metrics nested",
query: "(a + b) / c",
expected: []string{"a", "b", "c"},
},
// Category 4: Deduplication (1 test - critical behavior)
{
name: "duplicate metric",
query: "up + up",
expected: []string{"up"},
},
// Category 5: Edge Cases (3 tests - boundary behaviors)
{
name: "no metrics (literals only)",
query: "1 + 1",
expected: []string{},
},
{
name: "built-in function without metric",
query: "time()",
expected: []string{},
},
{
name: "comparison operator",
query: "a > 5",
expected: []string{"a"},
},
// Category 6: Real Dashboard Patterns (3 tests - production queries)
{
name: "binary op with function and labels",
query: `(time() - process_start_time_seconds{job="prometheus", instance=~"$node"})`,
expected: []string{"process_start_time_seconds"},
},
{
name: "rate with regex label matcher",
query: `rate(prometheus_local_storage_ingested_samples_total{instance=~"$node"}[5m])`,
expected: []string{"prometheus_local_storage_ingested_samples_total"},
},
{
name: "metric with negation and multiple labels",
query: `prometheus_target_interval_length_seconds{quantile!="0.01", quantile!="0.05", instance=~"$node"}`,
expected: []string{"prometheus_target_interval_length_seconds"},
},
// Category 7: Error Handling (2 tests - validation)
{
name: "empty string",
query: "",
expectError: true,
errorContains: "parse",
},
{
name: "malformed expression",
query: "{{invalid}}",
expectError: true,
errorContains: "parse",
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
result, err := parser.ExtractMetrics(tt.query)
// Check error expectation
if tt.expectError {
require.Error(t, err, "Expected error for query: %q", tt.query)
if tt.errorContains != "" {
require.ErrorContains(t, err, tt.errorContains,
"Error should contain %q for query: %q", tt.errorContains, tt.query)
}
return
}
require.NoError(t, err, "Unexpected error for query: %q", tt.query)
// Check result matches expected (order-independent for multiple metrics)
require.ElementsMatch(t, tt.expected, result,
"ExtractMetrics(%q) returned unexpected metrics", tt.query)
})
}
}
@@ -0,0 +1,149 @@
package prometheus
import (
"context"
"fmt"
"github.com/grafana/grafana/apps/dashvalidator/pkg/validator"
)
// Register Prometheus validator on package import
func init() {
validator.RegisterValidator("prometheus", func() validator.DatasourceValidator {
return NewValidator()
})
}
// Validator implements validator.DatasourceValidator for Prometheus datasources
type Validator struct {
parser *Parser
fetcher *Fetcher
}
// NewValidator creates a new Prometheus validator
func NewValidator() validator.DatasourceValidator {
return &Validator{
parser: NewParser(),
fetcher: NewFetcher(),
}
}
// ValidateQueries validates Prometheus queries against the datasource
func (v *Validator) ValidateQueries(ctx context.Context, queries []validator.Query, datasource validator.Datasource) (*validator.ValidationResult, error) {
fmt.Printf("[DEBUG PROM] Starting validation for %d queries against datasource %s\n", len(queries), datasource.URL)
result := &validator.ValidationResult{
TotalQueries: len(queries),
QueryBreakdown: make([]validator.QueryResult, 0, len(queries)),
}
// Step 1: Parse all queries to extract metrics
allMetrics := make(map[string]bool) // Use map to deduplicate
queryMetrics := make(map[int][]string)
for i, query := range queries {
fmt.Printf("[DEBUG PROM] Parsing query %d: %s\n", i, query.QueryText)
metrics, err := v.parser.ExtractMetrics(query.QueryText)
if err != nil {
// If we can't parse the query, we still continue with others
// but we don't count this query as "checked"
fmt.Printf("[DEBUG PROM] Failed to parse query %d: %v\n", i, err)
continue
}
fmt.Printf("[DEBUG PROM] Extracted %d metrics from query %d: %v\n", len(metrics), i, metrics)
result.CheckedQueries++
queryMetrics[i] = metrics
// Add to global metrics set
for _, metric := range metrics {
allMetrics[metric] = true
}
}
// Convert map to slice for fetcher
metricsToCheck := make([]string, 0, len(allMetrics))
for metric := range allMetrics {
metricsToCheck = append(metricsToCheck, metric)
}
result.TotalMetrics = len(metricsToCheck)
fmt.Printf("[DEBUG PROM] Total metrics to check: %d - %v\n", len(metricsToCheck), metricsToCheck)
// Step 2: Fetch available metrics from Prometheus
fmt.Printf("[DEBUG PROM] Fetching available metrics from %s\n", datasource.URL)
availableMetrics, err := v.fetcher.FetchMetrics(ctx, datasource.URL, datasource.HTTPClient)
if err != nil {
fmt.Printf("[DEBUG PROM] Failed to fetch metrics: %v\n", err)
return nil, fmt.Errorf("failed to fetch metrics from Prometheus: %w", err)
}
fmt.Printf("[DEBUG PROM] Fetched %d available metrics from Prometheus\n", len(availableMetrics))
// Build a set for O(1) lookup
availableSet := make(map[string]bool)
for _, metric := range availableMetrics {
availableSet[metric] = true
}
// Step 3: Calculate compatibility
missingMetricsMap := make(map[string]bool)
for _, metric := range metricsToCheck {
if !availableSet[metric] {
missingMetricsMap[metric] = true
}
}
result.FoundMetrics = result.TotalMetrics - len(missingMetricsMap)
// Convert missing metrics map to slice
result.MissingMetrics = make([]string, 0, len(missingMetricsMap))
for metric := range missingMetricsMap {
result.MissingMetrics = append(result.MissingMetrics, metric)
}
// Step 4: Build per-query breakdown
for i, query := range queries {
metrics, ok := queryMetrics[i]
if !ok {
// Query wasn't parsed successfully, skip
continue
}
queryResult := validator.QueryResult{
PanelTitle: query.PanelTitle,
PanelID: query.PanelID,
QueryRefID: query.RefID,
TotalMetrics: len(metrics),
}
// Check which metrics from this query are missing
queryMissing := make([]string, 0)
for _, metric := range metrics {
if missingMetricsMap[metric] {
queryMissing = append(queryMissing, metric)
}
}
queryResult.MissingMetrics = queryMissing
queryResult.FoundMetrics = queryResult.TotalMetrics - len(queryMissing)
// Calculate query-level compatibility score
if queryResult.TotalMetrics > 0 {
queryResult.CompatibilityScore = float64(queryResult.FoundMetrics) / float64(queryResult.TotalMetrics)
} else {
queryResult.CompatibilityScore = 1.0 // No metrics = perfect compatibility
}
result.QueryBreakdown = append(result.QueryBreakdown, queryResult)
}
// Step 5: Calculate overall compatibility score
if result.TotalMetrics > 0 {
result.CompatibilityScore = float64(result.FoundMetrics) / float64(result.TotalMetrics)
} else {
result.CompatibilityScore = 1.0 // No metrics = perfect compatibility
}
fmt.Printf("[DEBUG PROM] Validation complete! Score: %.2f, Found: %d/%d metrics\n",
result.CompatibilityScore, result.FoundMetrics, result.TotalMetrics)
return result, nil
}
@@ -0,0 +1,74 @@
package validator
import (
"context"
"fmt"
"net/http"
)
// DatasourceValidator validates dashboard queries against a datasource
// Implementations exist per datasource type (Prometheus, MySQL, etc.)
type DatasourceValidator interface {
// ValidateQueries checks if queries are compatible with the datasource
ValidateQueries(ctx context.Context, queries []Query, datasource Datasource) (*ValidationResult, error)
}
// Query represents a dashboard query to validate
type Query struct {
RefID string // Query reference ID (A, B, C, etc.)
QueryText string // The actual query text (PromQL, SQL, etc.)
PanelTitle string // Panel title for user-friendly reporting
PanelID int // Panel ID for reference
}
// Datasource contains connection information for a datasource
type Datasource struct {
UID string // Datasource UID from dashboard
Type string // Datasource type (prometheus, mysql, etc.)
Name string // Datasource name for reporting
URL string // Datasource URL for API calls
HTTPClient *http.Client // Authenticated HTTP client for making requests
}
// ValidationResult contains validation results for a datasource
type ValidationResult struct {
TotalQueries int // Total number of queries found
CheckedQueries int // Number of queries successfully checked
TotalMetrics int // Total metrics/entities referenced
FoundMetrics int // Metrics found in datasource
MissingMetrics []string // List of missing metrics
QueryBreakdown []QueryResult // Per-query results
CompatibilityScore float64 // Overall compatibility (0.0 - 1.0)
}
// QueryResult contains validation results for a single query
type QueryResult struct {
PanelTitle string // Panel title
PanelID int // Panel ID
QueryRefID string // Query reference ID
TotalMetrics int // Metrics in this query
FoundMetrics int // Metrics found
MissingMetrics []string // Missing metrics for this query
CompatibilityScore float64 // Query compatibility (0.0 - 1.0)
}
// validatorRegistry holds registered validator constructors
// Validators register themselves using RegisterValidator in their init() functions
var validatorRegistry = make(map[string]func() DatasourceValidator)
// RegisterValidator registers a validator constructor for a datasource type
// This is called by validator implementations in their init() functions
// Example: validator.RegisterValidator("prometheus", func() validator.DatasourceValidator { return NewValidator() })
func RegisterValidator(dsType string, constructor func() DatasourceValidator) {
validatorRegistry[dsType] = constructor
}
// GetValidator returns a validator for the given datasource type
// Returns an error if the datasource type is not supported
func GetValidator(dsType string) (DatasourceValidator, error) {
constructor, ok := validatorRegistry[dsType]
if !ok {
return nil, fmt.Errorf("unsupported datasource type: %s", dsType)
}
return constructor(), nil
}
@@ -0,0 +1,164 @@
package validator
import (
"testing"
"github.com/stretchr/testify/require"
)
func TestIsVariableReference(t *testing.T) {
tests := []struct {
name string
input string
expected bool
}{
{"dollar brace", "${prometheus}", true},
{"dollar simple", "$datasource", true},
{"double bracket", "[[prometheus]]", true},
{"concrete uid", "abcd1234", false},
{"empty string", "", false},
{"dollar only", "$", false},
{"empty braces", "${}", false},
{"number start", "$123", true}, // Changed: Grafana ACCEPTS digits (per \w+ regex)
{"all digits", "$999", true}, // New: All digits are valid per \w+
{"special chars dash", "$ds-name", false}, // Changed: Grafana REJECTS dashes (not in \w)
{"underscore", "$DS_PROMETHEUS", true},
{"complex variable", "${DS_PROMETHEUS}", true},
{"simple letter", "$p", true},
{"with fieldpath", "${var.field}", true}, // New: Test fieldPath syntax
{"with format", "[[var:text]]", true}, // New: Test format syntax
{"brace with format", "${var:json}", true}, // New: Test brace format syntax
{"digit in brackets", "[[123]]", true}, // New: Digits allowed in all patterns
{"empty brackets", "[[]]", false}, // New: Empty brackets rejected
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
result := isVariableReference(tt.input)
require.Equal(t, tt.expected, result, "isVariableReference(%q) returned unexpected result", tt.input)
})
}
}
func TestExtractVariableName(t *testing.T) {
tests := []struct {
name string
input string
expected string
}{
{"dollar brace", "${prometheus}", "prometheus"},
{"dollar simple", "$datasource", "datasource"},
{"double bracket", "[[prometheus]]", "prometheus"},
{"not variable", "concrete-uid", ""},
{"empty", "", ""},
{"complex name", "${DS_PROMETHEUS}", "DS_PROMETHEUS"},
{"with underscore", "$DS_NAME", "DS_NAME"},
{"digit variable", "$123", "123"}, // New: Digits are valid
{"with fieldpath", "${var.field}", "var"}, // Changed: Extract only name, not fieldPath
{"with format brace", "${var:json}", "var"}, // Changed: Extract only name, not format
{"with format bracket", "[[var:text]]", "var"}, // Changed: Extract only name, not format
{"fieldpath and format", "${var.field:json}", "var"}, // New: Extract only name from complex syntax
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
result := extractVariableName(tt.input)
require.Equal(t, tt.expected, result, "extractVariableName(%q) returned unexpected result", tt.input)
})
}
}
func TestIsPrometheusVariable(t *testing.T) {
// Dashboard with Prometheus __inputs
dashboardWithPrometheus := map[string]interface{}{
"__inputs": []interface{}{
map[string]interface{}{
"name": "DS_PROMETHEUS",
"type": "datasource",
"pluginId": "prometheus",
},
},
}
// Dashboard with MySQL __inputs
dashboardWithMySQL := map[string]interface{}{
"__inputs": []interface{}{
map[string]interface{}{
"name": "DS_MYSQL",
"type": "datasource",
"pluginId": "mysql",
},
},
}
// Dashboard without __inputs
dashboardWithoutInputs := map[string]interface{}{
"title": "Test Dashboard",
}
tests := []struct {
name string
varRef string
dashboard map[string]interface{}
expected bool
}{
{"prometheus variable with inputs", "${DS_PROMETHEUS}", dashboardWithPrometheus, true},
{"prometheus simple var", "$DS_PROMETHEUS", dashboardWithPrometheus, true},
{"mysql variable", "${DS_MYSQL}", dashboardWithMySQL, false},
{"not variable", "concrete-uid", dashboardWithPrometheus, false},
{"variable without inputs", "${prometheus}", dashboardWithoutInputs, true}, // Fallback to true for MVP
{"wrong variable name", "${OTHER}", dashboardWithPrometheus, false},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
result := isPrometheusVariable(tt.varRef, tt.dashboard)
require.Equal(t, tt.expected, result, "isPrometheusVariable(%q, dashboard) returned unexpected result", tt.varRef)
})
}
}
func TestResolveDatasourceUID(t *testing.T) {
singleUID := "prom-uid-123"
dashboardWithPrometheus := map[string]interface{}{
"__inputs": []interface{}{
map[string]interface{}{
"name": "DS_PROMETHEUS",
"type": "datasource",
"pluginId": "prometheus",
},
},
}
dashboardWithMySQL := map[string]interface{}{
"__inputs": []interface{}{
map[string]interface{}{
"name": "DS_MYSQL",
"type": "datasource",
"pluginId": "mysql",
},
},
}
tests := []struct {
name string
uid string
dashboard map[string]interface{}
expectedUID string
description string
}{
{"concrete uid", "concrete-123", dashboardWithPrometheus, "concrete-123", "should return concrete UID as-is"},
{"prometheus variable", "${DS_PROMETHEUS}", dashboardWithPrometheus, singleUID, "should resolve to single datasource UID"},
{"prometheus simple var", "$DS_PROMETHEUS", dashboardWithPrometheus, singleUID, "should resolve simple $ syntax"},
{"mysql variable", "${DS_MYSQL}", dashboardWithMySQL, "${DS_MYSQL}", "should return non-Prometheus variable as-is"},
{"empty uid", "", dashboardWithPrometheus, "", "should return empty string as-is"},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
result := resolveDatasourceUID(tt.uid, singleUID, tt.dashboard)
require.Equal(t, tt.expectedUID, result, "resolveDatasourceUID(%q, %q, dashboard): %s", tt.uid, singleUID, tt.description)
})
}
}
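A sketch of resolution logic that satisfies the expectations above, with a simplified variable-name check standing in for the parser's real one. Names and structure here are assumptions for illustration, not code from parser.go.

```go
package main

import (
	"fmt"
	"strings"
)

// varName strips $name, ${name...}, or [[name...]] syntax (simplified stand-in).
// It returns "" for concrete UIDs and empty strings.
func varName(ref string) string {
	s := strings.TrimSuffix(strings.TrimPrefix(ref, "${"), "}")
	s = strings.TrimSuffix(strings.TrimPrefix(s, "[["), "]]")
	s = strings.TrimPrefix(s, "$")
	if s == ref || s == "" {
		return "" // no variable syntax found
	}
	// Drop fieldPath/format suffixes such as ".field" or ":json".
	if i := strings.IndexAny(s, ".:"); i >= 0 {
		s = s[:i]
	}
	return s
}

// isPrometheusVariable reports whether varRef names a Prometheus datasource
// input; when __inputs is absent it falls back to true (MVP behavior).
func isPrometheusVariable(varRef string, dashboard map[string]interface{}) bool {
	name := varName(varRef)
	if name == "" {
		return false
	}
	inputs, ok := dashboard["__inputs"].([]interface{})
	if !ok {
		return true // MVP fallback: assume Prometheus without __inputs
	}
	for _, in := range inputs {
		if m, ok := in.(map[string]interface{}); ok &&
			m["name"] == name && m["pluginId"] == "prometheus" {
			return true
		}
	}
	return false
}

// resolveDatasourceUID swaps Prometheus variables for the single configured
// UID; concrete UIDs and non-Prometheus variables pass through unchanged.
func resolveDatasourceUID(uid, singleUID string, dashboard map[string]interface{}) string {
	if isPrometheusVariable(uid, dashboard) {
		return singleUID
	}
	return uid
}

func main() {
	dash := map[string]interface{}{
		"__inputs": []interface{}{
			map[string]interface{}{"name": "DS_PROMETHEUS", "type": "datasource", "pluginId": "prometheus"},
		},
	}
	fmt.Println(resolveDatasourceUID("${DS_PROMETHEUS}", "prom-uid-123", dash)) // prints "prom-uid-123"
}
```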
@@ -0,0 +1,49 @@
/*
* This file was generated by grafana-app-sdk. DO NOT EDIT.
*/
import { Spec } from './types.spec.gen';
import { Status } from './types.status.gen';
export interface Metadata {
name: string;
namespace: string;
generateName?: string;
selfLink?: string;
uid?: string;
resourceVersion?: string;
generation?: number;
creationTimestamp?: string;
deletionTimestamp?: string;
deletionGracePeriodSeconds?: number;
labels?: Record<string, string>;
annotations?: Record<string, string>;
ownerReferences?: OwnerReference[];
finalizers?: string[];
managedFields?: ManagedFieldsEntry[];
}
export interface OwnerReference {
apiVersion: string;
kind: string;
name: string;
uid: string;
controller?: boolean;
blockOwnerDeletion?: boolean;
}
export interface ManagedFieldsEntry {
manager?: string;
operation?: string;
apiVersion?: string;
time?: string;
fieldsType?: string;
subresource?: string;
}
export interface DashboardCompatibilityScore {
kind: string;
apiVersion: string;
metadata: Metadata;
spec: Spec;
status: Status;
}
@@ -0,0 +1,30 @@
// Code generated - EDITING IS FUTILE. DO NOT EDIT.
// metadata contains embedded CommonMetadata and can be extended with custom string fields
// TODO: use CommonMetadata instead of redefining here; currently needs to be defined here
// without external reference as using the CommonMetadata reference breaks thema codegen.
export interface Metadata {
updateTimestamp: string;
createdBy: string;
uid: string;
creationTimestamp: string;
deletionTimestamp?: string;
finalizers: string[];
resourceVersion: string;
generation: number;
updatedBy: string;
labels: Record<string, string>;
}
export const defaultMetadata = (): Metadata => ({
updateTimestamp: "",
createdBy: "",
uid: "",
creationTimestamp: "",
finalizers: [],
resourceVersion: "",
generation: 0,
updatedBy: "",
labels: {},
});
@@ -0,0 +1,42 @@
// Code generated - EDITING IS FUTILE. DO NOT EDIT.
// DataSourceMapping specifies a datasource to validate dashboard queries against.
// Maps logical datasource references in the dashboard to actual datasource instances.
export interface DataSourceMapping {
// Unique identifier of the datasource instance.
// Example: "prometheus-prod-us-west"
uid: string;
// Type of datasource plugin.
// MVP: Only "prometheus" supported.
// Future: "mysql", "postgres", "elasticsearch", etc.
type: string;
// Optional human-readable name for display in results.
// If not provided, UID will be used in error messages.
// Example: "Production Prometheus (US-West)"
name?: string;
}
export const defaultDataSourceMapping = (): DataSourceMapping => ({
uid: "",
type: "",
});
export interface Spec {
// Complete dashboard JSON object to validate.
// Must be a v1 dashboard schema (contains "panels" array).
// v2 dashboards (with "elements" structure) are not yet supported.
dashboardJson: Record<string, any>;
// Array of datasources to validate against.
// The validator will check dashboard queries against each datasource
// and provide per-datasource compatibility results.
//
// MVP: Only single datasource supported (array length = 1), Prometheus type only.
// Future: Will support multiple datasources for dashboards with mixed queries.
datasourceMappings: DataSourceMapping[];
}
export const defaultSpec = (): Spec => ({
dashboardJson: {},
datasourceMappings: [],
});
@@ -0,0 +1,142 @@
// Code generated - EDITING IS FUTILE. DO NOT EDIT.
// DataSourceResult contains validation results for a single datasource.
// Provides aggregate statistics and per-query breakdown of compatibility.
export interface DataSourceResult {
// Datasource UID that was validated (matches DataSourceMapping.uid)
uid: string;
// Datasource type (matches DataSourceMapping.type)
type: string;
// Optional display name (matches DataSourceMapping.name if provided)
name?: string;
// Total number of queries in the dashboard targeting this datasource.
// Includes all panel targets/queries that reference this datasource.
totalQueries: number;
// Number of queries successfully validated.
// May be less than totalQueries if some queries couldn't be parsed.
checkedQueries: number;
// Total number of unique metrics/identifiers referenced across all queries.
// For Prometheus: metric names extracted from PromQL expressions.
// For SQL datasources: table and column names.
totalMetrics: number;
// Number of metrics that exist in the datasource schema.
// foundMetrics <= totalMetrics
foundMetrics: number;
// Array of metric names that were referenced but don't exist.
// Useful for debugging why a dashboard shows "no data".
// Example for Prometheus: ["http_requests_total", "api_latency_seconds"]
missingMetrics: string[];
// Per-query breakdown showing which specific queries have issues.
// One entry per query target (refId: "A", "B", "C", etc.) in each panel.
// Allows pinpointing exactly which panel/query needs fixing.
queryBreakdown: QueryBreakdown[];
// Overall compatibility score for this datasource (0-100).
// Calculated as: (foundMetrics / totalMetrics) * 100
// Used to calculate the global compatibilityScore in status.
compatibilityScore: number;
}
export const defaultDataSourceResult = (): DataSourceResult => ({
uid: "",
type: "",
totalQueries: 0,
checkedQueries: 0,
totalMetrics: 0,
foundMetrics: 0,
missingMetrics: [],
queryBreakdown: [],
compatibilityScore: 0,
});
// QueryBreakdown provides compatibility details for a single query within a panel.
// Granular per-query results allow users to identify exactly which queries need fixing.
//
// Note: A panel can have multiple queries (refId: "A", "B", "C", etc.),
// so there may be multiple QueryBreakdown entries for the same panelID.
export interface QueryBreakdown {
// Human-readable panel title for context.
// Example: "CPU Usage", "Request Rate"
panelTitle: string;
// Numeric panel ID from dashboard JSON.
// Used to correlate with dashboard structure.
panelID: number;
// Query identifier within the panel.
// Values: "A", "B", "C", etc. (from panel.targets[].refId)
// Uniquely identifies which query in a multi-query panel this refers to.
queryRefId: string;
// Number of unique metrics referenced in this specific query.
// For Prometheus: metrics extracted from the PromQL expr.
// Example: rate(http_requests_total[5m]) references 1 metric.
totalMetrics: number;
// Number of those metrics that exist in the datasource.
// foundMetrics <= totalMetrics
foundMetrics: number;
// Array of missing metric names specific to this query.
// Helps identify exactly which part of a query expression will fail.
// Empty array means query is fully compatible.
missingMetrics: string[];
// Compatibility percentage for this individual query (0-100).
// Calculated as: (foundMetrics / totalMetrics) * 100
// 100 = query will work perfectly, 0 = query will return no data.
compatibilityScore: number;
}
export const defaultQueryBreakdown = (): QueryBreakdown => ({
panelTitle: "",
panelID: 0,
queryRefId: "",
totalMetrics: 0,
foundMetrics: 0,
missingMetrics: [],
compatibilityScore: 0,
});
export interface OperatorState {
// lastEvaluation is the ResourceVersion last evaluated
lastEvaluation: string;
// state describes the state of the lastEvaluation.
// It is limited to three possible states for machine evaluation.
state: "success" | "in_progress" | "failed";
// descriptiveState is an optional more descriptive state field which has no requirements on format
descriptiveState?: string;
// details contains any extra information that is operator-specific
details?: Record<string, any>;
}
export const defaultOperatorState = (): OperatorState => ({
lastEvaluation: "",
state: "success",
});
export interface Status {
// Overall compatibility score across all datasources (0-100).
// Calculated as: (total found metrics / total referenced metrics) * 100
//
// Score interpretation:
// - 100: Perfect compatibility, all queries will work
// - 80-99: Excellent, minor missing metrics
// - 50-79: Fair, significant missing metrics
// - 0-49: Poor, most queries will fail
compatibilityScore: number;
// Per-datasource validation results.
// Array length matches spec.datasourceMappings.
// Each element contains detailed metrics and query-level breakdown.
datasourceResults: DataSourceResult[];
// ISO 8601 timestamp of when validation was last performed.
// Example: "2024-01-15T10:30:00Z"
lastChecked?: string;
// operatorStates is a map of operator ID to operator state evaluations.
// Any operator which consumes this kind SHOULD add its state evaluation information to this field.
operatorStates?: Record<string, OperatorState>;
// Human-readable summary of validation result.
// Examples: "All queries compatible", "3 missing metrics found"
message?: string;
// additionalFields is reserved for future use
additionalFields?: Record<string, any>;
}
export const defaultStatus = (): Status => ({
compatibilityScore: 0,
datasourceResults: [],
});
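The global score described in the Status comments can be sketched as follows. This is a minimal Go illustration of the documented formula (total found metrics / total referenced metrics * 100), not code from this change; the struct is a reduced stand-in for the generated type.

```go
package main

import "fmt"

// DataSourceResult mirrors only the scoring-relevant fields of the generated
// per-datasource result type (illustrative subset, not the full struct).
type DataSourceResult struct {
	TotalMetrics int
	FoundMetrics int
}

// overallScore computes the global compatibilityScore as documented:
// (total found metrics / total referenced metrics) * 100.
func overallScore(results []DataSourceResult) float64 {
	var found, total int
	for _, r := range results {
		found += r.FoundMetrics
		total += r.TotalMetrics
	}
	if total == 0 {
		return 0 // assumption: no referenced metrics yields a score of 0
	}
	return float64(found) / float64(total) * 100
}

func main() {
	fmt.Println(overallScore([]DataSourceResult{{TotalMetrics: 4, FoundMetrics: 3}})) // prints 75
}
```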
@@ -97,7 +97,7 @@ require (
github.com/google/gnostic-models v0.7.1 // indirect
github.com/google/go-cmp v0.7.0 // indirect
github.com/google/uuid v1.6.0 // indirect
-github.com/grafana/alerting v0.0.0-20251231150637-b7821017d69f // indirect
+github.com/grafana/alerting v0.0.0-20260112172717-98a49ed9557f // indirect
github.com/grafana/authlib v0.0.0-20250930082137-a40e2c2b094f // indirect
github.com/grafana/authlib/types v0.0.0-20251119142549-be091cf2f4d4 // indirect
github.com/grafana/dataplane/sdata v0.0.9 // indirect
@@ -215,8 +215,8 @@ github.com/google/uuid v1.6.0 h1:NIvaJDMOsjHA8n1jAhLSgzrAzy1Hgr+hNrb57e+94F0=
github.com/google/uuid v1.6.0/go.mod h1:TIyPZe4MgqvfeYDBFedMoGGpEw/LqOeaOT+nhxU+yHo=
github.com/gorilla/websocket v1.5.4-0.20250319132907-e064f32e3674 h1:JeSE6pjso5THxAzdVpqr6/geYxZytqFMBCOtn/ujyeo=
github.com/gorilla/websocket v1.5.4-0.20250319132907-e064f32e3674/go.mod h1:r4w70xmWCQKmi1ONH4KIaBptdivuRPyosB9RmPlGEwA=
-github.com/grafana/alerting v0.0.0-20251231150637-b7821017d69f h1:Br4SaUL3dnVopKKNhDavCLgehw60jdtl/sIxdfzmVts=
-github.com/grafana/alerting v0.0.0-20251231150637-b7821017d69f/go.mod h1:l7v67cgP7x72ajB9UPZlumdrHqNztpKoqQ52cU8T3LU=
+github.com/grafana/alerting v0.0.0-20260112172717-98a49ed9557f h1:3bXOyht68qkfvD6Y8z8XoenFbytSSOIkr/s+AqRzj0o=
+github.com/grafana/alerting v0.0.0-20260112172717-98a49ed9557f/go.mod h1:Ji0SfJChcwjgq8ljy6Y5CcYfHfAYKXjKYeysOoDS/6s=
github.com/grafana/authlib v0.0.0-20250930082137-a40e2c2b094f h1:Cbm6OKkOcJ+7CSZsGsEJzktC/SIa5bxVeYKQLuYK86o=
github.com/grafana/authlib v0.0.0-20250930082137-a40e2c2b094f/go.mod h1:axY0cdOg3q0TZHwpHnIz5x16xZ8ZBxJHShsSHHXcHQg=
github.com/grafana/authlib/types v0.0.0-20251119142549-be091cf2f4d4 h1:Muoy+FMGrHj3GdFbvsMzUT7eusgii9PKf9L1ZaXDDbY=
@@ -18,9 +18,6 @@ metaV0Alpha1: {
type?: "grafana" | "commercial" | "community" | "private" | "private-glob"
org?: string
}
-angular?: {
-	detected: bool
-}
translations?: [string]: string
// +listType=atomic
children?: [...string]
@@ -215,7 +215,6 @@ type MetaSpec struct {
Module *MetaV0alpha1SpecModule `json:"module,omitempty"`
BaseURL *string `json:"baseURL,omitempty"`
Signature *MetaV0alpha1SpecSignature `json:"signature,omitempty"`
-Angular *MetaV0alpha1SpecAngular `json:"angular,omitempty"`
Translations map[string]string `json:"translations,omitempty"`
// +listType=atomic
Children []string `json:"children,omitempty"`
@@ -461,16 +460,6 @@ func NewMetaV0alpha1SpecSignature() *MetaV0alpha1SpecSignature {
return &MetaV0alpha1SpecSignature{}
}
-// +k8s:openapi-gen=true
-type MetaV0alpha1SpecAngular struct {
-	Detected bool `json:"detected"`
-}
-// NewMetaV0alpha1SpecAngular creates a new MetaV0alpha1SpecAngular object.
-func NewMetaV0alpha1SpecAngular() *MetaV0alpha1SpecAngular {
-	return &MetaV0alpha1SpecAngular{}
-}
// +k8s:openapi-gen=true
type MetaJSONDataType string
File diff suppressed because one or more lines are too long
@@ -565,10 +565,6 @@ func pluginStorePluginToMeta(plugin pluginstore.Plugin, loadingStrategy plugins.
metaSpec.Children = plugin.Children
}
-metaSpec.Angular = &pluginsv0alpha1.MetaV0alpha1SpecAngular{
-	Detected: plugin.Angular.Detected,
-}
if len(plugin.Translations) > 0 {
metaSpec.Translations = plugin.Translations
}
@@ -668,10 +664,6 @@ func pluginToMetaSpec(plugin *plugins.Plugin) pluginsv0alpha1.MetaSpec {
metaSpec.Children = children
}
-metaSpec.Angular = &pluginsv0alpha1.MetaV0alpha1SpecAngular{
-	Detected: plugin.Angular.Detected,
-}
if len(plugin.Translations) > 0 {
metaSpec.Translations = plugin.Translations
}
@@ -712,8 +704,7 @@ type grafanaComPluginVersionMeta struct {
Rel string `json:"rel"`
Href string `json:"href"`
} `json:"links"`
-AngularDetected bool     `json:"angularDetected"`
-Scopes          []string `json:"scopes"`
+Scopes []string `json:"scopes"`
}
// grafanaComPluginVersionMetaToMetaSpec converts a grafanaComPluginVersionMeta to a pluginsv0alpha1.MetaSpec.
@@ -753,10 +744,5 @@ func grafanaComPluginVersionMetaToMetaSpec(gcomMeta grafanaComPluginVersionMeta)
metaSpec.Signature = signature
}
-// Set angular info
-metaSpec.Angular = &pluginsv0alpha1.MetaV0alpha1SpecAngular{
-	Detected: gcomMeta.AngularDetected,
-}
return metaSpec
}
@@ -336,7 +336,7 @@ rudderstack_data_plane_url =
rudderstack_sdk_url =
# Rudderstack v3 SDK, optional, defaults to false. If set, Rudderstack v3 SDK will be used instead of v1
-rudderstack_v3_sdk_url =
+rudderstack_v3_sdk_url =
# Rudderstack Config url, optional, used by Rudderstack SDK to fetch source config
rudderstack_config_url =
@@ -2079,8 +2079,14 @@ enable =
# To enable features by default, set `Expression: "true"` in:
# https://github.com/grafana/grafana/blob/main/pkg/services/featuremgmt/registry.go
# The feature_toggles section supports feature flags of a number of types,
# including boolean, string, integer, float, and structured values, following the OpenFeature specification.
#
# feature1 = true
# feature2 = false
# feature3 = "foobar"
# feature4 = 1.5
# feature5 = { "foo": "bar" }
[feature_toggles.openfeature]
# This is EXPERIMENTAL. Please, do not use this section
@@ -323,7 +323,7 @@
;rudderstack_sdk_url =
# Rudderstack v3 SDK, optional, defaults to false. If set, Rudderstack v3 SDK will be used instead of v1
-;rudderstack_v3_sdk_url =
+;rudderstack_v3_sdk_url =
# Rudderstack Config url, optional, used by Rudderstack SDK to fetch source config
;rudderstack_config_url =
@@ -1913,7 +1913,7 @@ default_datasource_uid =
# client_queue_max_size is the maximum size in bytes of the client queue
# for Live connections. Defaults to 4MB.
-;client_queue_max_size =
+;client_queue_max_size =
#################################### Grafana Image Renderer Plugin ##########################
[plugin.grafana-image-renderer]
@@ -1996,9 +1996,14 @@ default_datasource_uid =
;enable = feature1,feature2
# The feature_toggles section supports feature flags of a number of types,
# including boolean, string, integer, float, and structured values, following the OpenFeature specification.
;feature1 = true
;feature2 = false
;feature3 = "foobar"
;feature4 = 1.5
;feature5 = { "foo": "bar" }
[date_formats]
# For information on what formatting patterns that are supported https://momentjs.com/docs/#/displaying/
@@ -66,17 +66,18 @@ Please refer to plugin documentation to see what RBAC permissions the plugin has
The following list contains app plugins that have fine-grained RBAC support.
- | App plugin | App plugin ID | App plugin permission documentation |
- | --- | --- | --- |
- | [Access policies](https://grafana.com/docs/grafana-cloud/account-management/authentication-and-permissions/access-policies/) | `grafana-auth-app` | [RBAC actions for Access Policies](ref:cloud-access-policies-action-definitions) |
- | [Adaptive Metrics](https://grafana.com/docs/grafana-cloud/cost-management-and-billing/reduce-costs/metrics-costs/control-metrics-usage-via-adaptive-metrics/adaptive-metrics-plugin/) | `grafana-adaptive-metrics-app` | [RBAC actions for Adaptive Metrics](ref:adaptive-metrics-permissions) |
- | [Cloud Provider](https://grafana.com/docs/grafana-cloud/monitor-infrastructure/monitor-cloud-provider/) | `grafana-csp-app` | [Cloud Provider Observability role-based access control](https://grafana.com/docs/grafana-cloud/monitor-infrastructure/monitor-cloud-provider/rbac/) |
- | [Incident](https://grafana.com/docs/grafana-cloud/alerting-and-irm/irm/incident/) | `grafana-incident-app` | n/a |
- | [Kubernetes Monitoring](/docs/grafana-cloud/monitor-infrastructure/kubernetes-monitoring/) | `grafana-k8s-app` | [Kubernetes Monitoring role-based access control](/docs/grafana-cloud/monitor-infrastructure/kubernetes-monitoring/configuration/control-access/#precision-access-with-rbac-custom-plugin-roles) |
- | [OnCall](https://grafana.com/docs/grafana-cloud/alerting-and-irm/irm/oncall/) | `grafana-oncall-app` | [Configure RBAC for OnCall](https://grafana.com/docs/grafana-cloud/alerting-and-irm/irm/oncall/manage/user-and-team-management/#manage-users-and-teams-for-grafana-oncall) |
- | [Performance Testing (K6)](https://grafana.com/docs/grafana-cloud/testing/k6/) | `k6-app` | [Configure RBAC for K6](https://grafana.com/docs/grafana-cloud/testing/k6/projects-and-users/configure-rbac/) |
- | [Private data source connect (PDC)](https://grafana.com/docs/grafana-cloud/connect-externally-hosted/private-data-source-connect/) | `grafana-pdc-app` | n/a |
- | [Service Level Objective (SLO)](https://grafana.com/docs/grafana-cloud/alerting-and-irm/slo/) | `grafana-slo-app` | [Configure RBAC for SLO](https://grafana.com/docs/grafana-cloud/alerting-and-irm/slo/set-up/rbac/) |
+ | App plugin | App plugin ID | App plugin permission documentation |
+ | --- | --- | --- |
+ | [Access policies](https://grafana.com/docs/grafana-cloud/account-management/authentication-and-permissions/access-policies/) | `grafana-auth-app` | [RBAC actions for Access Policies](ref:cloud-access-policies-action-definitions) |
+ | [Adaptive Metrics](https://grafana.com/docs/grafana-cloud/cost-management-and-billing/reduce-costs/metrics-costs/control-metrics-usage-via-adaptive-metrics/adaptive-metrics-plugin/) | `grafana-adaptive-metrics-app` | [RBAC actions for Adaptive Metrics](ref:adaptive-metrics-permissions) |
+ | [Cloud Provider](https://grafana.com/docs/grafana-cloud/monitor-infrastructure/monitor-cloud-provider/) | `grafana-csp-app` | [Cloud Provider Observability role-based access control](https://grafana.com/docs/grafana-cloud/monitor-infrastructure/monitor-cloud-provider/rbac/) |
+ | [Incident](https://grafana.com/docs/grafana-cloud/alerting-and-irm/irm/incident/) | `grafana-incident-app` | n/a |
+ | [Kubernetes Monitoring](/docs/grafana-cloud/monitor-infrastructure/kubernetes-monitoring/) | `grafana-k8s-app` | [Kubernetes Monitoring role-based access control](/docs/grafana-cloud/monitor-infrastructure/kubernetes-monitoring/configuration/control-access/#precision-access-with-rbac-custom-plugin-roles) |
+ | [OnCall](https://grafana.com/docs/grafana-cloud/alerting-and-irm/irm/oncall/) | `grafana-oncall-app` | [Configure RBAC for OnCall](https://grafana.com/docs/grafana-cloud/alerting-and-irm/irm/oncall/manage/user-and-team-management/#manage-users-and-teams-for-grafana-oncall) |
+ | [Performance Testing (K6)](https://grafana.com/docs/grafana-cloud/testing/k6/) | `k6-app` | [Configure RBAC for K6](https://grafana.com/docs/grafana-cloud/testing/k6/projects-and-users/configure-rbac/) |
+ | [Private data source connect (PDC)](https://grafana.com/docs/grafana-cloud/connect-externally-hosted/private-data-source-connect/) | `grafana-pdc-app` | n/a |
+ | [Service Level Objective (SLO)](https://grafana.com/docs/grafana-cloud/alerting-and-irm/slo/) | `grafana-slo-app` | [Configure RBAC for SLO](https://grafana.com/docs/grafana-cloud/alerting-and-irm/slo/set-up/rbac/) |
+ | [Synthetic Monitoring](https://grafana.com/docs/grafana-cloud/testing/synthetic-monitoring/) | `grafana-synthetic-monitoring-app` | [Configure RBAC for Synthetic Monitoring](https://grafana.com/docs/grafana-cloud/testing/synthetic-monitoring/user-and-team-management/) |
### Revoke fine-grained access from app plugins
@@ -2836,9 +2836,11 @@ For more information about Grafana Enterprise, refer to [Grafana Enterprise](../
Keys of features to enable, separated by space.
-#### `FEATURE_TOGGLE_NAME = false`
+#### `FEATURE_NAME = <value>`
-Some feature toggles for stable features are on by default. Use this setting to disable an on-by-default feature toggle with the name FEATURE_TOGGLE_NAME, for example, `exploreMixedDatasource = false`.
+Use a key-value pair to set feature flag values explicitly, overriding any default values. A few different types are supported, following the OpenFeature specification. See the defaults.ini file for more details.
+For example, to disable an on-by-default feature toggle named `exploreMixedDatasource`, specify `exploreMixedDatasource = false`.
<hr>
@@ -1,6 +1,7 @@
import { test, expect } from '@grafana/plugin-e2e';
-import { setScopes } from '../utils/scope-helpers';
+import { setScopes, setupScopeRoutes } from '../utils/scope-helpers';
import { testScopes } from '../utils/scopes';
import {
getAdHocFilterOptionValues,
@@ -13,6 +14,7 @@ import {
} from './cuj-selectors';
import { prepareAPIMocks } from './utils';
+const USE_LIVE_DATA = Boolean(process.env.API_CONFIG_PATH);
const DASHBOARD_UNDER_TEST = 'cuj-dashboard-1';
test.use({
@@ -34,6 +36,11 @@ test.describe(
const adHocFilterPills = getAdHocFilterPills(page);
const scopesSelectorInput = getScopesSelectorInput(page);
+// Set up routes before any navigation (only for mocked mode)
+if (!USE_LIVE_DATA) {
+  await setupScopeRoutes(page, testScopes());
+}
await test.step('1.Apply filtering to a whole dashboard', async () => {
const dashboardPage = await gotoDashboardPage({ uid: DASHBOARD_UNDER_TEST });
@@ -66,6 +66,17 @@ export function getScopesDashboards(page: Page) {
return page.locator('[data-testid^="scopes-dashboards-"][role="treeitem"]');
}
+/**
+ * Clicks the first available dashboard in the scopes dashboard list.
+ */
+export async function clickFirstScopesDashboard(page: Page) {
+  const dashboards = getScopesDashboards(page);
+  // Wait for at least one dashboard to be visible
+  await expect(dashboards.first()).toBeVisible({ timeout: 10000 });
+  // Click - Playwright will automatically wait for the element to be actionable
+  await dashboards.first().click();
+}
export function getScopesDashboardsSearchInput(page: Page) {
return page.getByTestId('scopes-dashboards-search');
}
@@ -1,8 +1,10 @@
import { test, expect } from '@grafana/plugin-e2e';
-import { setScopes } from '../utils/scope-helpers';
+import { setScopes, setupScopeRoutes } from '../utils/scope-helpers';
import { testScopes } from '../utils/scopes';
import {
+clickFirstScopesDashboard,
getAdHocFilterPills,
getGroupByInput,
getGroupByValues,
@@ -21,6 +23,7 @@ test.use({
},
});
+const USE_LIVE_DATA = Boolean(process.env.API_CONFIG_PATH);
const DASHBOARD_UNDER_TEST = 'cuj-dashboard-1';
const DASHBOARD_UNDER_TEST_2 = 'cuj-dashboard-2';
const NAVIGATE_TO = 'cuj-dashboard-3';
@@ -38,6 +41,11 @@ test.describe(
const adhocFilterPills = getAdHocFilterPills(page);
const groupByValues = getGroupByValues(page);
+// Set up routes before any navigation (only for mocked mode)
+if (!USE_LIVE_DATA) {
+  await setupScopeRoutes(page, testScopes());
+}
await test.step('1.Search dashboard', async () => {
await gotoDashboardPage({ uid: DASHBOARD_UNDER_TEST });
@@ -74,7 +82,7 @@ test.describe(
await expect(markdownContent).toContainText(`now-12h`);
-await scopesDashboards.first().click();
+await clickFirstScopesDashboard(page);
await page.waitForURL('**/d/**');
await expect(markdownContent).toBeVisible();
@@ -117,10 +125,10 @@ test.describe(
await groupByVariable.press('Enter');
await groupByVariable.press('Escape');
await expect(scopesDashboards.first()).toBeVisible();
const { getRequests, waitForExpectedRequests } = await trackDashboardReloadRequests(page);
-await scopesDashboards.first().click();
+await clickFirstScopesDashboard(page);
await page.waitForURL('**/d/**');
await waitForExpectedRequests();
await page.waitForLoadState('networkidle');
@@ -158,8 +166,7 @@ test.describe(
const oldFilters = `GroupByVar: ${selectedValues}\n\nAdHocVar: ${processedPills}`;
await expect(markdownContent).toContainText(oldFilters);
-await expect(scopesDashboards.first()).toBeVisible();
-await scopesDashboards.first().click();
+await clickFirstScopesDashboard(page);
await page.waitForURL('**/d/**');
const newPillCount = await adhocFilterPills.count();
@@ -165,9 +165,8 @@ test.describe(
await refreshBtn.click();
await page.waitForLoadState('networkidle');
-expect(await panelContent.textContent()).not.toBe(panelContents);
+// Wait for the panel content to change (not just for network to complete)
+await expect(panelContent).not.toHaveText(panelContents!, { timeout: 10000 });
});
await test.step('6.Turn off refresh', async () => {
@@ -9,6 +9,7 @@ import {
openScopesSelector,
searchScopes,
selectScope,
+setupScopeRoutes,
} from '../utils/scope-helpers';
import { testScopes } from '../utils/scopes';
@@ -36,32 +37,37 @@ test.describe(
const scopesSelector = getScopesSelectorInput(page);
const recentScopesSelector = getRecentScopesSelector(page);
const scopeTreeCheckboxes = getScopeTreeCheckboxes(page);
+const scopes = testScopes();
+// Set up routes once before any navigation (only for mocked mode)
+if (!USE_LIVE_DATA) {
+  await setupScopeRoutes(page, scopes);
+}
await test.step('1.View and select any scope', async () => {
await gotoDashboardPage({ uid: DASHBOARD_UNDER_TEST });
expect.soft(scopesSelector).toHaveAttribute('data-value', '');
-const scopes = testScopes();
-await openScopesSelector(page, USE_LIVE_DATA ? undefined : scopes); //used only in mocked scopes version
+await openScopesSelector(page, USE_LIVE_DATA ? undefined : scopes);
let scopeName = await getScopeTreeName(page, 0);
-const firstLevelScopes = scopes[0].children!; //used only in mocked scopes version
+const firstLevelScopes = scopes[0].children!;
await expandScopesSelection(page, scopeName, USE_LIVE_DATA ? undefined : firstLevelScopes);
scopeName = await getScopeTreeName(page, 1);
-const secondLevelScopes = firstLevelScopes[0].children!; //used only in mocked scopes version
+const secondLevelScopes = firstLevelScopes[0].children!;
await expandScopesSelection(page, scopeName, USE_LIVE_DATA ? undefined : secondLevelScopes);
-const selectedScopes = [secondLevelScopes[0]]; //used only in mocked scopes version
+const selectedScopes = [secondLevelScopes[0]];
scopeName = await getScopeLeafName(page, 0);
let scopeTitle = await getScopeLeafTitle(page, 0);
await selectScope(page, scopeName, USE_LIVE_DATA ? undefined : selectedScopes[0]);
-await applyScopes(page, USE_LIVE_DATA ? undefined : selectedScopes); //used only in mocked scopes version
+await applyScopes(page, USE_LIVE_DATA ? undefined : selectedScopes);
expect.soft(scopesSelector).toHaveAttribute('data-value', scopeTitle);
});
@@ -70,28 +76,27 @@ test.describe(
await gotoDashboardPage({ uid: DASHBOARD_UNDER_TEST });
expect.soft(scopesSelector).toHaveAttribute('data-value', '');
-const scopes = testScopes();
-await openScopesSelector(page, USE_LIVE_DATA ? undefined : scopes); //used only in mocked scopes version
+await openScopesSelector(page, USE_LIVE_DATA ? undefined : scopes);
let scopeName = await getScopeTreeName(page, 0);
-const firstLevelScopes = scopes[0].children!; //used only in mocked scopes version
+const firstLevelScopes = scopes[0].children!;
await expandScopesSelection(page, scopeName, USE_LIVE_DATA ? undefined : firstLevelScopes);
scopeName = await getScopeTreeName(page, 1);
-const secondLevelScopes = firstLevelScopes[0].children!; //used only in mocked scopes version
+const secondLevelScopes = firstLevelScopes[0].children!;
await expandScopesSelection(page, scopeName, USE_LIVE_DATA ? undefined : secondLevelScopes);
const scopeTitles: string[] = [];
-const selectedScopes = [secondLevelScopes[0], secondLevelScopes[1]]; //used only in mocked scopes version
+const selectedScopes = [secondLevelScopes[0], secondLevelScopes[1]];
for (let i = 0; i < selectedScopes.length; i++) {
scopeName = await getScopeLeafName(page, i);
scopeTitles.push(await getScopeLeafTitle(page, i));
-await selectScope(page, scopeName, USE_LIVE_DATA ? undefined : selectedScopes[i]); //used only in mocked scopes version
+await selectScope(page, scopeName, USE_LIVE_DATA ? undefined : selectedScopes[i]);
}
-await applyScopes(page, USE_LIVE_DATA ? undefined : selectedScopes); //used only in mocked scopes version
+await applyScopes(page, USE_LIVE_DATA ? undefined : selectedScopes);
await expect.soft(scopesSelector).toHaveAttribute('data-value', scopeTitles.join(' + '));
});
@@ -102,8 +107,7 @@ test.describe(
expect.soft(scopesSelector).toHaveAttribute('data-value', '');
-const scopes = testScopes();
-await openScopesSelector(page, USE_LIVE_DATA ? undefined : scopes); //used only in mocked scopes version
+await openScopesSelector(page, USE_LIVE_DATA ? undefined : scopes);
await recentScopesSelector.click();
@@ -121,26 +125,25 @@ test.describe(
expect.soft(scopesSelector).toHaveAttribute('data-value', '');
const scopes = testScopes();
await openScopesSelector(page, USE_LIVE_DATA ? undefined : scopes);
let scopeName = await getScopeTreeName(page, 1);
const firstLevelScopes = scopes[2].children!; //used only in mocked scopes version
const firstLevelScopes = scopes[2].children!;
await expandScopesSelection(page, scopeName, USE_LIVE_DATA ? undefined : firstLevelScopes);
scopeName = await getScopeTreeName(page, 1);
const secondLevelScopes = firstLevelScopes[0].children!; //used only in mocked scopes version
const secondLevelScopes = firstLevelScopes[0].children!;
await expandScopesSelection(page, scopeName, USE_LIVE_DATA ? undefined : secondLevelScopes);
const selectedScopes = [secondLevelScopes[0]]; //used only in mocked scopes version
const selectedScopes = [secondLevelScopes[0]];
scopeName = await getScopeLeafName(page, 0);
let scopeTitle = await getScopeLeafTitle(page, 0);
await selectScope(page, scopeName, USE_LIVE_DATA ? undefined : selectedScopes[0]);
await applyScopes(page, USE_LIVE_DATA ? undefined : []); //used only in mocked scopes version
await applyScopes(page, USE_LIVE_DATA ? undefined : []);
expect.soft(scopesSelector).toHaveAttribute('data-value', new RegExp(`^${scopeTitle}`));
});
@@ -148,17 +151,16 @@ test.describe(
await test.step('5.View pre-completed production entity values as I type', async () => {
await gotoDashboardPage({ uid: DASHBOARD_UNDER_TEST });
const scopes = testScopes();
await openScopesSelector(page, USE_LIVE_DATA ? undefined : scopes); //used only in mocked scopes version
await openScopesSelector(page, USE_LIVE_DATA ? undefined : scopes);
let scopeName = await getScopeTreeName(page, 0);
const firstLevelScopes = scopes[0].children!; //used only in mocked scopes version
const firstLevelScopes = scopes[0].children!;
await expandScopesSelection(page, scopeName, USE_LIVE_DATA ? undefined : firstLevelScopes);
scopeName = await getScopeTreeName(page, 1);
const secondLevelScopes = firstLevelScopes[0].children!; //used only in mocked scopes version
const secondLevelScopes = firstLevelScopes[0].children!;
await expandScopesSelection(page, scopeName, USE_LIVE_DATA ? undefined : secondLevelScopes);
const scopeSearchOne = await getScopeLeafTitle(page, 0);
@@ -1,6 +1,6 @@
import { test, expect } from '@grafana/plugin-e2e';
import { applyScopes, openScopesSelector, selectScope } from '../utils/scope-helpers';
import { applyScopes, openScopesSelector, selectScope, setupScopeRoutes } from '../utils/scope-helpers';
import { testScopesWithRedirect } from '../utils/scopes';
test.use({
@@ -16,8 +16,13 @@ test.describe('Scope Redirect Functionality', () => {
test('should redirect to custom URL when scope has redirectUrl', async ({ page, gotoDashboardPage }) => {
const scopes = testScopesWithRedirect();
await test.step('Navigate to dashboard and open scopes selector', async () => {
await test.step('Set up routes and navigate to dashboard', async () => {
// Set up routes BEFORE navigation to ensure all requests are mocked
await setupScopeRoutes(page, scopes);
await gotoDashboardPage({ uid: 'cuj-dashboard-1' });
});
await test.step('Open scopes selector', async () => {
await openScopesSelector(page, scopes);
});
@@ -40,8 +45,12 @@ test.describe('Scope Redirect Functionality', () => {
test('should prioritize redirectUrl over scope navigation fallback', async ({ page, gotoDashboardPage }) => {
const scopes = testScopesWithRedirect();
await test.step('Navigate to dashboard and open scopes selector', async () => {
await test.step('Set up routes and navigate to dashboard', async () => {
await setupScopeRoutes(page, scopes);
await gotoDashboardPage({ uid: 'cuj-dashboard-1' });
});
await test.step('Open scopes selector', async () => {
await openScopesSelector(page, scopes);
});
@@ -68,8 +77,12 @@ test.describe('Scope Redirect Functionality', () => {
}) => {
const scopes = testScopesWithRedirect();
await test.step('Navigate to dashboard and select scope', async () => {
await test.step('Set up routes and navigate to dashboard', async () => {
await setupScopeRoutes(page, scopes);
await gotoDashboardPage({ uid: 'cuj-dashboard-1' });
});
await test.step('Select and apply scope', async () => {
await openScopesSelector(page, scopes);
await selectScope(page, 'sn-redirect-fallback', scopes[1]);
await applyScopes(page, [scopes[1]]);
@@ -112,8 +125,12 @@ test.describe('Scope Redirect Functionality', () => {
}) => {
const scopes = testScopesWithRedirect();
await test.step('Navigate to dashboard and select scope', async () => {
await test.step('Set up routes and navigate to dashboard', async () => {
await setupScopeRoutes(page, scopes);
await gotoDashboardPage({ uid: 'cuj-dashboard-1' });
});
await test.step('Select and apply scope', async () => {
await openScopesSelector(page, scopes);
await selectScope(page, 'sn-redirect-fallback', scopes[1]);
await applyScopes(page, [scopes[1]]);
@@ -151,9 +168,13 @@ test.describe('Scope Redirect Functionality', () => {
test('should not redirect to redirectPath when on active scope navigation', async ({ page, gotoDashboardPage }) => {
const scopes = testScopesWithRedirect();
await test.step('Set up routes and navigate to dashboard', async () => {
await setupScopeRoutes(page, scopes);
await gotoDashboardPage({ uid: 'cuj-dashboard-1' });
});
await test.step('Set up scope navigation to dashboard-1', async () => {
// First, apply a scope that creates scope navigation to dashboard-1 (without redirectPath)
await gotoDashboardPage({ uid: 'cuj-dashboard-1' });
await openScopesSelector(page, scopes);
await selectScope(page, 'sn-redirect-setup', scopes[2]);
await applyScopes(page, [scopes[2]]);
+183 -15
@@ -6,7 +6,150 @@ import { Resource } from '../../public/app/features/apiserver/types';
import { testScopes } from './scopes';
const USE_LIVE_DATA = Boolean(process.env.API_CALLS_CONFIG_PATH);
const USE_LIVE_DATA = Boolean(process.env.API_CONFIG_PATH);
/**
* Sets up all scope-related API routes before navigation.
* This ensures that ALL scope API requests (including those made during initial page load)
* are intercepted by the mocks, preventing RTK Query from caching real API responses.
*
* Call this BEFORE navigating to a page (e.g., before gotoDashboardPage).
*/
export async function setupScopeRoutes(page: Page, scopes: TestScope[]): Promise<void> {
// Route for scope node children (tree structure)
await page.route(`**/apis/scope.grafana.app/v0alpha1/namespaces/*/find/scope_node_children*`, async (route) => {
const url = new URL(route.request().url());
const parentParam = url.searchParams.get('parent');
const queryParam = url.searchParams.get('query');
// Find the appropriate scopes based on parent
let scopesToReturn = scopes;
if (parentParam) {
// Find nested scopes based on parent name
const findChildren = (items: TestScope[]): TestScope[] => {
for (const item of items) {
if (item.name === parentParam && item.children) {
return item.children;
}
if (item.children) {
const found = findChildren(item.children);
if (found.length > 0) {
return found;
}
}
}
return [];
};
scopesToReturn = findChildren(scopes);
if (scopesToReturn.length === 0) {
scopesToReturn = scopes; // Fallback to root scopes
}
}
// Filter by search query if provided
if (queryParam) {
const query = queryParam.toLowerCase();
const filterByQuery = (items: TestScope[]): TestScope[] => {
const results: TestScope[] = [];
for (const item of items) {
// Exact name/title match first, then substring containment
if (item.name.toLowerCase() === query || item.title.toLowerCase() === query) {
results.push(item);
} else if (item.name.toLowerCase().includes(query) || item.title.toLowerCase().includes(query)) {
results.push(item);
}
// Also search in children
if (item.children) {
results.push(...filterByQuery(item.children));
}
}
return results;
};
scopesToReturn = filterByQuery(scopesToReturn);
}
await route.fulfill({
status: 200,
contentType: 'application/json',
body: JSON.stringify({
apiVersion: 'scope.grafana.app/v0alpha1',
kind: 'FindScopeNodeChildrenResults',
metadata: {},
items: scopesToReturn.map((scope) => ({
kind: 'ScopeNode',
apiVersion: 'scope.grafana.app/v0alpha1',
metadata: {
name: scope.name,
namespace: 'default',
},
spec: {
title: scope.title,
description: scope.title,
disableMultiSelect: scope.disableMultiSelect ?? false,
nodeType: scope.children ? 'container' : 'leaf',
...(parentParam && { parentName: parentParam }),
...((scope.addLinks || scope.children) && {
linkType: 'scope',
linkId: `scope-${scope.name}`,
}),
...(scope.redirectPath && { redirectPath: scope.redirectPath }),
},
})),
}),
});
});
// Route for individual scope fetching
await page.route(`**/apis/scope.grafana.app/v0alpha1/namespaces/*/scopes/*`, async (route) => {
const url = route.request().url();
const scopeName = url.split('/scopes/')[1]?.split('?')[0];
// Find the scope in the test data
const findScope = (items: TestScope[]): TestScope | undefined => {
for (const item of items) {
if (`scope-${item.name}` === scopeName) {
return item;
}
if (item.children) {
const found = findScope(item.children);
if (found) {
return found;
}
}
}
return undefined;
};
const scope = findScope(scopes);
if (scope) {
await route.fulfill({
status: 200,
contentType: 'application/json',
body: JSON.stringify({
kind: 'Scope',
apiVersion: 'scope.grafana.app/v0alpha1',
metadata: {
name: `scope-${scope.name}`,
namespace: 'default',
},
spec: {
title: scope.title,
description: '',
filters: scope.filters,
category: scope.category,
type: scope.type,
},
}),
});
} else {
await route.fulfill({ status: 404 });
}
});
// Note: Dashboard bindings and navigations routes are set up dynamically in applyScopes()
// with scope-specific URL patterns to avoid cache issues. They are not set up here.
}
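The parent lookup inside the mocked `scope_node_children` route can be exercised on its own. A minimal standalone version of that recursive search (the `TestScope` shape is reduced to just the fields used here):

```typescript
type TestScope = { name: string; title: string; children?: TestScope[] };

// Depth-first search for the node named `parent`; returns its children,
// or an empty array when no such node exists (the route handler then
// falls back to the root scopes).
function findChildren(items: TestScope[], parent: string): TestScope[] {
  for (const item of items) {
    if (item.name === parent && item.children) {
      return item.children;
    }
    if (item.children) {
      const found = findChildren(item.children, parent);
      if (found.length > 0) {
        return found;
      }
    }
  }
  return [];
}
```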
export type TestScope = {
name: string;
@@ -24,6 +167,9 @@ export type TestScope = {
type ScopeDashboardBinding = Resource<ScopeDashboardBindingSpec, ScopeDashboardBindingStatus, 'ScopeDashboardBinding'>;
/**
* Sets up a route for scope node children requests and waits for the response.
*/
export async function scopeNodeChildrenRequest(
page: Page,
scopes: TestScope[],
@@ -68,10 +214,13 @@ export async function scopeNodeChildrenRequest(
return page.waitForResponse((response) => response.url().includes(`/find/scope_node_children`));
}
/**
* Opens the scopes selector dropdown and waits for the tree to load.
*/
export async function openScopesSelector(page: Page, scopes?: TestScope[]) {
const click = async () => await page.getByTestId('scopes-selector-input').click();
if (!scopes) {
if (!scopes || USE_LIVE_DATA) {
await click();
return;
}
@@ -82,10 +231,13 @@ export async function openScopesSelector(page: Page, scopes?: TestScope[]) {
await responsePromise;
}
/**
* Expands a scope tree node and waits for children to load.
*/
export async function expandScopesSelection(page: Page, parentScope: string, scopes?: TestScope[]) {
const click = async () => await page.getByTestId(`scopes-tree-${parentScope}-expand`).click();
if (!scopes) {
if (!scopes || USE_LIVE_DATA) {
await click();
return;
}
@@ -96,6 +248,9 @@ export async function expandScopesSelection(page: Page, parentScope: string, sco
await responsePromise;
}
/**
* Sets up a route for individual scope requests and waits for the response.
*/
export async function scopeSelectRequest(page: Page, selectedScope: TestScope): Promise<Response> {
await page.route(
`**/apis/scope.grafana.app/v0alpha1/namespaces/*/scopes/scope-${selectedScope.name}`,
@@ -125,6 +280,9 @@ export async function scopeSelectRequest(page: Page, selectedScope: TestScope):
return page.waitForResponse((response) => response.url().includes(`/scopes/scope-${selectedScope.name}`));
}
/**
* Selects a scope in the tree.
*/
export async function selectScope(page: Page, scopeName: string, selectedScope?: TestScope) {
const click = async () => {
const element = page.locator(
@@ -134,7 +292,7 @@ export async function selectScope(page: Page, scopeName: string, selectedScope?:
await element.click({ force: true });
};
if (!selectedScope) {
if (!selectedScope || USE_LIVE_DATA) {
await click();
return;
}
@@ -145,14 +303,22 @@ export async function selectScope(page: Page, scopeName: string, selectedScope?:
await responsePromise;
}
/**
* Applies the selected scopes and waits for the selector to close and page to settle.
* Sets up routes dynamically with scope-specific URL patterns to avoid cache issues.
*/
export async function applyScopes(page: Page, scopes?: TestScope[]) {
const click = async () => {
await page.getByTestId('scopes-selector-apply').scrollIntoViewIfNeeded();
await page.getByTestId('scopes-selector-apply').click({ force: true });
};
if (!scopes) {
if (!scopes || USE_LIVE_DATA) {
await click();
// Wait for the apply button to disappear (selector closed)
await page.waitForSelector('[data-testid="scopes-selector-apply"]', { state: 'hidden', timeout: 5000 });
// Wait for any resulting API calls (dashboard bindings, etc.) to complete
await page.waitForLoadState('networkidle');
return;
}
@@ -166,7 +332,7 @@ export async function applyScopes(page: Page, scopes?: TestScope[]) {
const groups: string[] = ['Most relevant', 'Dashboards', 'Something else', ''];
// Mock scope_dashboard_bindings endpoint
// Mock scope_dashboard_bindings endpoint with scope-specific URL pattern
await page.route(dashboardBindingsUrl, async (route) => {
await route.fulfill({
status: 200,
@@ -220,7 +386,7 @@ export async function applyScopes(page: Page, scopes?: TestScope[]) {
});
});
// Mock scope_navigations endpoint
// Mock scope_navigations endpoint with scope-specific URL pattern
await page.route(scopeNavigationsUrl, async (route) => {
await route.fulfill({
status: 200,
@@ -266,21 +432,23 @@ export async function applyScopes(page: Page, scopes?: TestScope[]) {
(response) =>
response.url().includes(`/find/scope_dashboard_bindings`) || response.url().includes(`/find/scope_navigations`)
);
const scopeRequestPromises: Array<Promise<Response>> = [];
for (const scope of scopes) {
scopeRequestPromises.push(scopeSelectRequest(page, scope));
}
await click();
await responsePromise;
await Promise.all(scopeRequestPromises);
// Wait for the apply button to disappear (selector closed)
await page.waitForSelector('[data-testid="scopes-selector-apply"]', { state: 'hidden', timeout: 5000 });
// Wait for any resulting API calls to complete
await page.waitForLoadState('networkidle');
}
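The apply flow above registers one response wait per selected scope before clicking, then awaits them all; the ordering matters because the listener must be in place before the requests fire. A minimal sketch of that pattern, with hypothetical callback parameters standing in for the Playwright calls:

```typescript
// Sketch: start waiting for each mocked scope response, then trigger the
// apply click, then await every pending wait. Registering the waits only
// after the click could miss responses that resolve immediately.
async function applyAndAwait<T>(
  scopeNames: string[],
  waitForScope: (name: string) => Promise<T>,
  clickApply: () => Promise<void>
): Promise<T[]> {
  const pending = scopeNames.map((name) => waitForScope(name));
  await clickApply();
  return Promise.all(pending);
}
```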
export async function searchScopes(page: Page, value: string, resultScopes: TestScope[]) {
/**
* Searches for scopes in the tree and waits for results.
* Sets up a route dynamically with filtered results to return only matching scopes.
*/
export async function searchScopes(page: Page, value: string, resultScopes?: TestScope[]) {
const click = async () => await page.getByTestId('scopes-tree-search').fill(value);
if (!resultScopes) {
if (!resultScopes || USE_LIVE_DATA) {
await click();
return;
}
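The query filtering in the mocked search route boils down to case-insensitive containment over names and titles, recursing into children. A simplified standalone version (collapsing the exact-match and substring branches above, which push the same item either way):

```typescript
type TestScope = { name: string; title: string; children?: TestScope[] };

// Case-insensitive substring match on name or title; matching parents
// are kept and children are searched regardless, as in the mocked route.
function filterByQuery(items: TestScope[], query: string): TestScope[] {
  const q = query.toLowerCase();
  const results: TestScope[] = [];
  for (const item of items) {
    if (item.name.toLowerCase().includes(q) || item.title.toLowerCase().includes(q)) {
      results.push(item);
    }
    if (item.children) {
      results.push(...filterByQuery(item.children, query));
    }
  }
  return results;
}
```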
-1
@@ -3,7 +3,6 @@
[feature_toggles]
unifiedStorageSearchUI = true
grafanaAPIServerWithExperimentalAPIs = true
unifiedStorageSearchSprinkles = true
[unified_storage]
enable_search = true
-1
@@ -3,7 +3,6 @@
[feature_toggles]
unifiedStorageSearchUI = true
grafanaAPIServerWithExperimentalAPIs = true
unifiedStorageSearchSprinkles = true
[unified_storage]
enable_search = true
@@ -3,7 +3,6 @@
[feature_toggles]
unifiedStorageSearchUI = false
grafanaAPIServerWithExperimentalAPIs = true
unifiedStorageSearchSprinkles = true
[unified_storage]
enable_search = true
-1
@@ -3,7 +3,6 @@
[feature_toggles]
unifiedStorageSearchUI = true
grafanaAPIServerWithExperimentalAPIs = true
unifiedStorageSearchSprinkles = true
[unified_storage]
enable_search = true
-1
@@ -3,7 +3,6 @@
[feature_toggles]
unifiedStorageSearchUI = true
grafanaAPIServerWithExperimentalAPIs = true
unifiedStorageSearchSprinkles = true
[unified_storage]
enable_search = true
-1
@@ -3,7 +3,6 @@
[feature_toggles]
unifiedStorageSearchUI = true
grafanaAPIServerWithExperimentalAPIs = true
unifiedStorageSearchSprinkles = true
[unified_storage]
enable_search = true
-1
@@ -3,7 +3,6 @@
[feature_toggles]
unifiedStorageSearchUI = true
grafanaAPIServerWithExperimentalAPIs = true
unifiedStorageSearchSprinkles = true
[unified_storage]
enable_search = true
-5
@@ -1156,11 +1156,6 @@
"count": 2
}
},
"public/app/core/config.ts": {
"no-barrel-files/no-barrel-files": {
"count": 2
}
},
"public/app/core/navigation/types.ts": {
"@typescript-eslint/no-explicit-any": {
"count": 1
+15 -8
@@ -32,14 +32,14 @@ require (
github.com/armon/go-radix v1.0.0 // @grafana/grafana-app-platform-squad
github.com/aws/aws-sdk-go v1.55.7 // @grafana/aws-datasources
github.com/aws/aws-sdk-go-v2 v1.40.0 // @grafana/aws-datasources
github.com/aws/aws-sdk-go-v2/credentials v1.18.21 // @grafana/grafana-operator-experience-squad
github.com/aws/aws-sdk-go-v2/credentials v1.18.21 // indirect; @grafana/grafana-operator-experience-squad
github.com/aws/aws-sdk-go-v2/service/cloudwatch v1.45.3 // @grafana/aws-datasources
github.com/aws/aws-sdk-go-v2/service/cloudwatchlogs v1.51.0 // @grafana/aws-datasources
github.com/aws/aws-sdk-go-v2/service/ec2 v1.225.2 // @grafana/aws-datasources
github.com/aws/aws-sdk-go-v2/service/oam v1.18.3 // @grafana/aws-datasources
github.com/aws/aws-sdk-go-v2/service/resourcegroupstaggingapi v1.26.6 // @grafana/aws-datasources
github.com/aws/aws-sdk-go-v2/service/secretsmanager v1.40.1 // @grafana/grafana-operator-experience-squad
github.com/aws/aws-sdk-go-v2/service/sts v1.39.1 // @grafana/grafana-operator-experience-squad
github.com/aws/aws-sdk-go-v2/service/sts v1.39.1 // indirect; @grafana/grafana-operator-experience-squad
github.com/aws/smithy-go v1.23.2 // @grafana/aws-datasources
github.com/beevik/etree v1.4.1 // @grafana/grafana-backend-group
github.com/benbjohnson/clock v1.3.5 // @grafana/alerting-backend
@@ -82,14 +82,14 @@ require (
github.com/golang/protobuf v1.5.4 // @grafana/grafana-backend-group
github.com/golang/snappy v1.0.0 // @grafana/alerting-backend
github.com/google/go-cmp v0.7.0 // @grafana/grafana-backend-group
github.com/google/go-github/v70 v70.0.0 // indirect; @grafana/grafana-git-ui-sync-team
github.com/google/go-github/v70 v70.0.0 // @grafana/grafana-git-ui-sync-team
github.com/google/go-querystring v1.1.0 // indirect; @grafana/oss-big-tent
github.com/google/uuid v1.6.0 // @grafana/grafana-backend-group
github.com/google/wire v0.7.0 // @grafana/grafana-backend-group
github.com/googleapis/gax-go/v2 v2.15.0 // @grafana/grafana-backend-group
github.com/gorilla/mux v1.8.1 // @grafana/grafana-backend-group
github.com/gorilla/websocket v1.5.4-0.20250319132907-e064f32e3674 // @grafana/grafana-app-platform-squad
github.com/grafana/alerting v0.0.0-20251231150637-b7821017d69f // @grafana/alerting-backend
github.com/grafana/alerting v0.0.0-20260112172717-98a49ed9557f // @grafana/alerting-backend
github.com/grafana/authlib v0.0.0-20250930082137-a40e2c2b094f // @grafana/identity-access-team
github.com/grafana/authlib/types v0.0.0-20251119142549-be091cf2f4d4 // @grafana/identity-access-team
github.com/grafana/dataplane/examples v0.0.1 // @grafana/observability-metrics
@@ -113,6 +113,7 @@ require (
github.com/grafana/otel-profiling-go v0.5.1 // @grafana/grafana-backend-group
github.com/grafana/pyroscope-go/godeltaprof v0.1.9 // @grafana/observability-traces-and-profiling
github.com/grafana/pyroscope/api v1.2.1-0.20251118081820-ace37f973a0f // @grafana/observability-traces-and-profiling
github.com/grafana/tempo v1.5.1-0.20250529124718-87c2dc380cec // @grafana/observability-traces-and-profiling
github.com/grpc-ecosystem/go-grpc-middleware v1.4.0 // @grafana/grafana-search-and-storage
github.com/grpc-ecosystem/go-grpc-middleware/providers/prometheus v1.1.0 // @grafana/plugins-platform-backend
github.com/grpc-ecosystem/go-grpc-middleware/v2 v2.3.3 // @grafana/grafana-backend-group
@@ -260,12 +261,13 @@ require (
github.com/grafana/grafana/pkg/aggregator v0.0.0 // @grafana/grafana-app-platform-squad
github.com/grafana/grafana/pkg/apimachinery v0.0.0 // @grafana/grafana-app-platform-squad
github.com/grafana/grafana/pkg/apiserver v0.0.0 // @grafana/grafana-app-platform-squad
github.com/grafana/grafana/pkg/plugins v0.0.0 // @grafana/plugins-platform-backend
// This needs to be here for other projects that import grafana/grafana
// For local development grafana/grafana will always use the local files
// Check go.work file for details
github.com/grafana/grafana/pkg/promlib v0.0.8 // @grafana/oss-big-tent
github.com/grafana/grafana/pkg/semconv v0.0.0-20250804150913-990f1c69ecc2 // @grafana/grafana-app-platform-squad
github.com/grafana/grafana/pkg/semconv v0.0.0 // @grafana/grafana-app-platform-squad
)
// Replace the workspace versions
@@ -294,6 +296,8 @@ replace (
github.com/grafana/grafana/pkg/aggregator => ./pkg/aggregator
github.com/grafana/grafana/pkg/apimachinery => ./pkg/apimachinery
github.com/grafana/grafana/pkg/apiserver => ./pkg/apiserver
github.com/grafana/grafana/pkg/plugins => ./pkg/plugins
github.com/grafana/grafana/pkg/semconv => ./pkg/semconv
)
require (
@@ -652,11 +656,12 @@ require (
sigs.k8s.io/yaml v1.6.0 // indirect
)
require github.com/grafana/tempo v1.5.1-0.20250529124718-87c2dc380cec // @grafana/observability-traces-and-profiling
require (
github.com/Azure/go-ansiterm v0.0.0-20250102033503-faa5f7b0171c // indirect
github.com/IBM/pgxpoolprometheus v1.1.2 // indirect
github.com/Machiel/slugify v1.0.1 // indirect
github.com/ProtonMail/go-crypto v1.3.0 // indirect
github.com/cloudflare/circl v1.6.1 // indirect
github.com/containerd/log v0.1.0 // indirect
github.com/containerd/platforms v0.2.1 // indirect
github.com/cpuguy83/dockercfg v0.3.2 // indirect
@@ -676,6 +681,8 @@ require (
github.com/google/gnostic v0.7.1 // indirect
github.com/gophercloud/gophercloud/v2 v2.9.0 // indirect
github.com/grafana/sqlds/v5 v5.0.3 // indirect
github.com/hashicorp/go-secure-stdlib/plugincontainer v0.4.2 // indirect
github.com/joshlf/go-acl v0.0.0-20200411065538-eae00ae38531 // indirect
github.com/lufia/plan9stats v0.0.0-20240909124753-873cd0166683 // indirect
github.com/magiconair/properties v1.8.10 // indirect
github.com/moby/go-archive v0.1.0 // indirect
@@ -697,7 +704,7 @@ require (
replace github.com/crewjam/saml => github.com/grafana/saml v0.4.15-0.20240917091248-ae3bbdad8a56
// Use our fork of the upstream Alertmanager.
replace github.com/prometheus/alertmanager => github.com/grafana/prometheus-alertmanager v0.25.1-0.20250911094103-5456b6e45604
replace github.com/prometheus/alertmanager => github.com/grafana/prometheus-alertmanager v0.25.1-0.20260112162805-d29cc9cf7f0f
exclude github.com/mattn/go-sqlite3 v2.0.3+incompatible
+17 -6
@@ -680,6 +680,7 @@ github.com/Azure/azure-storage-blob-go v0.15.0 h1:rXtgp8tN1p29GvpGgfJetavIG0V7Og
github.com/Azure/azure-storage-blob-go v0.15.0/go.mod h1:vbjsVbX0dlxnRc4FFMPsS9BsJWPcne7GB7onqlPvz58=
github.com/Azure/go-ansiterm v0.0.0-20170929234023-d6e3b3328b78/go.mod h1:LmzpDX56iTiv29bbRTIsUNlaFfuhWRQBWjQdVyAevI8=
github.com/Azure/go-ansiterm v0.0.0-20250102033503-faa5f7b0171c h1:udKWzYgxTojEKWjV8V+WSxDXJ4NFATAsZjh8iIbsQIg=
github.com/Azure/go-ansiterm v0.0.0-20250102033503-faa5f7b0171c/go.mod h1:xomTg63KZ2rFqZQzSB4Vz2SUXa1BpHTVz9L5PTmPC4E=
github.com/Azure/go-autorest v11.2.8+incompatible/go.mod h1:r+4oMnoxhatjLLJ6zxSWATqVooLgysK6ZNox3g/xq24=
github.com/Azure/go-autorest v14.2.0+incompatible h1:V5VMDjClD3GiElqLWO7mz2MxNAK/vTfRHdAubSIPRgs=
github.com/Azure/go-autorest v14.2.0+incompatible/go.mod h1:r+4oMnoxhatjLLJ6zxSWATqVooLgysK6ZNox3g/xq24=
@@ -737,6 +738,8 @@ github.com/HdrHistogram/hdrhistogram-go v1.1.2/go.mod h1:yDgFjdqOqDEKOvasDdhWNXY
github.com/IBM/pgxpoolprometheus v1.1.2 h1:sHJwxoL5Lw4R79Zt+H4Uj1zZ4iqXJLdk7XDE7TPs97U=
github.com/IBM/pgxpoolprometheus v1.1.2/go.mod h1:+vWzISN6S9ssgurhUNmm6AlXL9XLah3TdWJktquKTR8=
github.com/JohnCGriffin/overflow v0.0.0-20211019200055-46fa312c352c/go.mod h1:X0CRv0ky0k6m906ixxpzmDRLvX58TFUKS2eePweuyxk=
github.com/Machiel/slugify v1.0.1 h1:EfWSlRWstMadsgzmiV7d0yVd2IFlagWH68Q+DcYCm4E=
github.com/Machiel/slugify v1.0.1/go.mod h1:fTFGn5uWEynW4CUMG7sWkYXOf1UgDxyTM3DbR6Qfg3k=
github.com/Masterminds/goutils v1.1.1 h1:5nUrii3FMTL5diU80unEVvNevw1nH4+ZV4DSLVJLSYI=
github.com/Masterminds/goutils v1.1.1/go.mod h1:8cTjp+g8YejhMuvIA5y2vz3BpJxksy863GQaJW2MFNU=
github.com/Masterminds/semver v1.5.0 h1:H65muMkzWKEuNDnfl9d70GUjFniHKHRbFPGBuZ3QEww=
@@ -759,6 +762,8 @@ github.com/Nvveen/Gotty v0.0.0-20120604004816-cd527374f1e5/go.mod h1:lmUJ/7eu/Q8
github.com/OneOfOne/xxhash v1.2.2/go.mod h1:HSdplMjZKSmBqAxg5vPj2TmRDmfkzw+cTzAElWljhcU=
github.com/OneOfOne/xxhash v1.2.5 h1:zl/OfRA6nftbBK9qTohYBJ5xvw6C/oNKizR7cZGl3cI=
github.com/OneOfOne/xxhash v1.2.5/go.mod h1:eZbhyaAYD41SGSSsnmcpxVoRiQ/MPUTjUdIIOT9Um7Q=
github.com/ProtonMail/go-crypto v1.3.0 h1:ILq8+Sf5If5DCpHQp4PbZdS1J7HDFRXz/+xKBiRGFrw=
github.com/ProtonMail/go-crypto v1.3.0/go.mod h1:9whxjD8Rbs29b4XWbB8irEcE8KHMqaR2e7GWU1R+/PE=
github.com/PuerkitoBio/purell v1.0.0/go.mod h1:c11w/QuzBsJSee3cPx9rAFu61PvFxuPbtSwDGJws/X0=
github.com/PuerkitoBio/purell v1.1.0/go.mod h1:c11w/QuzBsJSee3cPx9rAFu61PvFxuPbtSwDGJws/X0=
github.com/PuerkitoBio/purell v1.1.1/go.mod h1:c11w/QuzBsJSee3cPx9rAFu61PvFxuPbtSwDGJws/X0=
@@ -1026,6 +1031,8 @@ github.com/chzyer/test v0.0.0-20180213035817-a1ea475d72b1/go.mod h1:Q3SI9o4m/ZMn
github.com/circonus-labs/circonus-gometrics v2.3.1+incompatible/go.mod h1:nmEj6Dob7S7YxXgwXpfOuvO54S+tGdZdw9fuRZt25Ag=
github.com/circonus-labs/circonusllhist v0.1.3/go.mod h1:kMXHVDlOchFAehlya5ePtbp5jckzBHf4XRpQvBOLI+I=
github.com/client9/misspell v0.3.4/go.mod h1:qj6jICC3Q7zFZvVWo7KLAzC3yx5G7kyvSDkc90ppPyw=
github.com/cloudflare/circl v1.6.1 h1:zqIqSPIndyBh1bjLVVDHMPpVKqp8Su/V+6MeDzzQBQ0=
github.com/cloudflare/circl v1.6.1/go.mod h1:uddAzsPgqdMAYatqJ0lsjX1oECcQLIlRpzZh3pJrofs=
github.com/cncf/udpa/go v0.0.0-20191209042840-269d4d468f6f/go.mod h1:M8M6+tZqaGXZJjfX53e64911xZQV5JYwmTeXPW+k8Sc=
github.com/cncf/udpa/go v0.0.0-20200629203442-efcf912fb354/go.mod h1:WmhPx2Nbnhtbo57+VJT5O0JRkEi1Wbu0z5j0R8u5Hbk=
github.com/cncf/udpa/go v0.0.0-20201120205902-5459f2c99403/go.mod h1:WmhPx2Nbnhtbo57+VJT5O0JRkEi1Wbu0z5j0R8u5Hbk=
@@ -1620,8 +1627,8 @@ github.com/gorilla/sessions v1.2.1 h1:DHd3rPN5lE3Ts3D8rKkQ8x/0kqfeNmBAaiSi+o7Fsg
github.com/gorilla/sessions v1.2.1/go.mod h1:dk2InVEVJ0sfLlnXv9EAgkf6ecYs/i80K/zI+bUmuGM=
github.com/gorilla/websocket v1.5.4-0.20250319132907-e064f32e3674 h1:JeSE6pjso5THxAzdVpqr6/geYxZytqFMBCOtn/ujyeo=
github.com/gorilla/websocket v1.5.4-0.20250319132907-e064f32e3674/go.mod h1:r4w70xmWCQKmi1ONH4KIaBptdivuRPyosB9RmPlGEwA=
github.com/grafana/alerting v0.0.0-20251231150637-b7821017d69f h1:Br4SaUL3dnVopKKNhDavCLgehw60jdtl/sIxdfzmVts=
github.com/grafana/alerting v0.0.0-20251231150637-b7821017d69f/go.mod h1:l7v67cgP7x72ajB9UPZlumdrHqNztpKoqQ52cU8T3LU=
github.com/grafana/alerting v0.0.0-20260112172717-98a49ed9557f h1:3bXOyht68qkfvD6Y8z8XoenFbytSSOIkr/s+AqRzj0o=
github.com/grafana/alerting v0.0.0-20260112172717-98a49ed9557f/go.mod h1:Ji0SfJChcwjgq8ljy6Y5CcYfHfAYKXjKYeysOoDS/6s=
github.com/grafana/authlib v0.0.0-20250930082137-a40e2c2b094f h1:Cbm6OKkOcJ+7CSZsGsEJzktC/SIa5bxVeYKQLuYK86o=
github.com/grafana/authlib v0.0.0-20250930082137-a40e2c2b094f/go.mod h1:axY0cdOg3q0TZHwpHnIz5x16xZ8ZBxJHShsSHHXcHQg=
github.com/grafana/authlib/types v0.0.0-20251119142549-be091cf2f4d4 h1:Muoy+FMGrHj3GdFbvsMzUT7eusgii9PKf9L1ZaXDDbY=
@@ -1664,8 +1671,6 @@ github.com/grafana/grafana/apps/quotas v0.0.0-20251209183543-1013d74f13f2 h1:rDP
github.com/grafana/grafana/apps/quotas v0.0.0-20251209183543-1013d74f13f2/go.mod h1:M7bV60iRB61y0ISPG1HX/oNLZtlh0ZF22rUYwNkAKjo=
github.com/grafana/grafana/pkg/promlib v0.0.8 h1:VUWsqttdf0wMI4j9OX9oNrykguQpZcruudDAFpJJVw0=
github.com/grafana/grafana/pkg/promlib v0.0.8/go.mod h1:U1ezG/MGaEPoThqsr3lymMPN5yIPdVTJnDZ+wcXT+ao=
github.com/grafana/grafana/pkg/semconv v0.0.0-20250804150913-990f1c69ecc2 h1:A65jWgLk4Re28gIuZcpC0aTh71JZ0ey89hKGE9h543s=
github.com/grafana/grafana/pkg/semconv v0.0.0-20250804150913-990f1c69ecc2/go.mod h1:2HRzUK/xQEYc+8d5If/XSusMcaYq9IptnBSHACiQcOQ=
github.com/grafana/jsonparser v0.0.0-20240425183733-ea80629e1a32 h1:NznuPwItog+rwdVg8hAuGKP29ndRSzJAwhxKldkP8oQ=
github.com/grafana/jsonparser v0.0.0-20240425183733-ea80629e1a32/go.mod h1:796sq+UcONnSlzA3RtlBZ+b/hrerkZXiEmO8oMjyRwY=
github.com/grafana/loki/pkg/push v0.0.0-20250823105456-332df2b20000 h1:/5LKSYgLmAhwA4m6iGUD4w1YkydEWWjazn9qxCFT8W0=
@@ -1676,8 +1681,8 @@ github.com/grafana/nanogit v0.3.0 h1:XNEef+4Vi+465ZITJs/g/xgnDRJbWhhJ7iQrAnWZ0oQ
github.com/grafana/nanogit v0.3.0/go.mod h1:6s6CCTpyMOHPpcUZaLGI+rgBEKdmxVbhqSGgCK13j7Y=
github.com/grafana/otel-profiling-go v0.5.1 h1:stVPKAFZSa7eGiqbYuG25VcqYksR6iWvF3YH66t4qL8=
github.com/grafana/otel-profiling-go v0.5.1/go.mod h1:ftN/t5A/4gQI19/8MoWurBEtC6gFw8Dns1sJZ9W4Tls=
github.com/grafana/prometheus-alertmanager v0.25.1-0.20250911094103-5456b6e45604 h1:aXfUhVN/Ewfpbko2CCtL65cIiGgwStOo4lWH2b6gw2U=
github.com/grafana/prometheus-alertmanager v0.25.1-0.20250911094103-5456b6e45604/go.mod h1:O/QP1BCm0HHIzbKvgMzqb5sSyH88rzkFk84F4TfJjBU=
github.com/grafana/prometheus-alertmanager v0.25.1-0.20260112162805-d29cc9cf7f0f h1:9tRhudagkQO2s61SLFLSziIdCm7XlkfypVKDxpcHokg=
github.com/grafana/prometheus-alertmanager v0.25.1-0.20260112162805-d29cc9cf7f0f/go.mod h1:AsVdCBeDFN9QbgpJg+8voDAcgsW0RmNvBd70ecMMdC0=
github.com/grafana/pyroscope-go/godeltaprof v0.1.9 h1:c1Us8i6eSmkW+Ez05d3co8kasnuOY813tbMN8i/a3Og=
github.com/grafana/pyroscope-go/godeltaprof v0.1.9/go.mod h1:2+l7K7twW49Ct4wFluZD3tZ6e0SjanjcUUBPVD/UuGU=
github.com/grafana/pyroscope/api v1.2.1-0.20251118081820-ace37f973a0f h1:fTlIj5n4x5dU63XHItug7GLjtnaeJdPqBlqg4zlABq0=
@@ -1753,6 +1758,8 @@ github.com/hashicorp/go-rootcerts v1.0.2 h1:jzhAVGtqPKbwpyCPELlgNWhE1znq+qwJtW5O
github.com/hashicorp/go-rootcerts v1.0.2/go.mod h1:pqUvnprVnM5bf7AOirdbb01K4ccR319Vf4pU3K5EGc8=
github.com/hashicorp/go-secure-stdlib/parseutil v0.2.0 h1:U+kC2dOhMFQctRfhK0gRctKAPTloZdMU5ZJxaesJ/VM=
github.com/hashicorp/go-secure-stdlib/parseutil v0.2.0/go.mod h1:Ll013mhdmsVDuoIXVfBtvgGJsXDYkTw1kooNcoCXuE0=
github.com/hashicorp/go-secure-stdlib/plugincontainer v0.4.2 h1:gCNiM4T5xEc4IpT8vM50CIO+AtElr5kO9l2Rxbq+Sz8=
github.com/hashicorp/go-secure-stdlib/plugincontainer v0.4.2/go.mod h1:6ZM4ZdwClyAsiU2uDBmRHCvq0If/03BMbF9U+U7G5pA=
github.com/hashicorp/go-secure-stdlib/strutil v0.1.2 h1:kes8mmyCpxJsI7FTwtzRqEy9CdjCtrXrXGuOpxEA7Ts=
github.com/hashicorp/go-secure-stdlib/strutil v0.1.2/go.mod h1:Gou2R9+il93BqX25LAKCLuM+y9U2T4hlwvT1yprcna4=
github.com/hashicorp/go-sockaddr v1.0.0/go.mod h1:7Xibr9yA9JjQq1JpNB2Vw7kxv8xerXegt+ozgdvDeDU=
@@ -1877,6 +1884,10 @@ github.com/jonboulle/clockwork v0.5.0 h1:Hyh9A8u51kptdkR+cqRpT1EebBwTn1oK9YfGYbd
github.com/jonboulle/clockwork v0.5.0/go.mod h1:3mZlmanh0g2NDKO5TWZVJAfofYk64M7XN3SzBPjZF60=
github.com/josharian/intern v1.0.0 h1:vlS4z54oSdjm0bgjRigI+G1HpF+tI+9rE5LLzOg8HmY=
github.com/josharian/intern v1.0.0/go.mod h1:5DoeVV0s6jJacbCEi61lwdGj/aVlrQvzHFFd8Hwg//Y=
github.com/joshlf/go-acl v0.0.0-20200411065538-eae00ae38531 h1:hgVxRoDDPtQE68PT4LFvNlPz2nBKd3OMlGKIQ69OmR4=
github.com/joshlf/go-acl v0.0.0-20200411065538-eae00ae38531/go.mod h1:fqTUQpVYBvhCNIsMXGl2GE9q6z94DIP6NtFKXCSTVbg=
github.com/joshlf/testutil v0.0.0-20170608050642-b5d8aa79d93d h1:J8tJzRyiddAFF65YVgxli+TyWBi0f79Sld6rJP6CBcY=
github.com/joshlf/testutil v0.0.0-20170608050642-b5d8aa79d93d/go.mod h1:b+Q3v8Yrg5o15d71PSUraUzYb+jWl6wQMSBXSGS/hv0=
github.com/jpillora/backoff v0.0.0-20180909062703-3050d21c67d7/go.mod h1:2iMrUgbbvHEiQClaW2NsSzMyGHqN+rDFqY705q49KG0=
github.com/jpillora/backoff v1.0.0 h1:uvFg412JmmHBHw7iwprIxkPMI+sGQ4kzOWsMeHnm2EA=
github.com/jpillora/backoff v1.0.0/go.mod h1:J/6gKK9jxlEcS3zixgDgUAsiuZ7yrSoa/FX5e0EB2j4=
+2 -1
@@ -14,6 +14,7 @@ use (
./apps/collections
./apps/correlations
./apps/dashboard
./apps/dashvalidator
./apps/example
./apps/folder
./apps/iam
@@ -38,6 +39,6 @@ use (
./pkg/semconv
)
replace github.com/prometheus/alertmanager => github.com/grafana/prometheus-alertmanager v0.25.1-0.20250911094103-5456b6e45604
replace github.com/prometheus/alertmanager => github.com/grafana/prometheus-alertmanager v0.25.1-0.20260112162805-d29cc9cf7f0f
replace github.com/crewjam/saml => github.com/grafana/saml v0.4.15-0.20240917091248-ae3bbdad8a56
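The two adjacent `replace` lines above are the old and new pins of Grafana's prometheus-alertmanager fork. In a `go.work` (or `go.mod`), a `replace` directive redirects every import of the left-hand module path to the right-hand module and version, so the pattern in this diff looks like (abbreviated sketch, not the full file):

```
// go.work (sketch): imports of prometheus/alertmanager resolve to the Grafana fork
use (
	./apps/dashboard
)

replace github.com/prometheus/alertmanager => github.com/grafana/prometheus-alertmanager v0.25.1-0.20260112162805-d29cc9cf7f0f
```

Bumping the fork is therefore a one-line change to the pinned pseudo-version, plus the matching `go.sum` entries seen in the previous file.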
+5 -1
@@ -280,7 +280,6 @@ github.com/Azure/go-amqp v0.17.0/go.mod h1:9YJ3RhxRT1gquYnzpZO1vcYMMpAdJT+QEg6fw
github.com/Azure/go-amqp v1.4.0 h1:Xj3caqi4comOF/L1Uc5iuBxR/pB6KumejC01YQOqOR4=
github.com/Azure/go-amqp v1.4.0/go.mod h1:vZAogwdrkbyK3Mla8m/CxSc/aKdnTZ4IbPxl51Y5WZE=
github.com/Azure/go-ansiterm v0.0.0-20210617225240-d185dfc1b5a1/go.mod h1:xomTg63KZ2rFqZQzSB4Vz2SUXa1BpHTVz9L5PTmPC4E=
github.com/Azure/go-ansiterm v0.0.0-20250102033503-faa5f7b0171c/go.mod h1:xomTg63KZ2rFqZQzSB4Vz2SUXa1BpHTVz9L5PTmPC4E=
github.com/Azure/go-autorest/autorest v0.11.18/go.mod h1:dSiJPy22c3u0OtOKDNttNgqpNFY/GeWa7GH/Pz56QRA=
github.com/Azure/go-autorest/autorest/azure/auth v0.5.13 h1:Ov8avRZi2vmrE2JcXw+tu5K/yB41r7xK9GZDiBF7NdM=
github.com/Azure/go-autorest/autorest/azure/auth v0.5.13/go.mod h1:5BAVfWLWXihP47vYrPuBKKf4cS0bXI+KM9Qx6ETDJYo=
@@ -906,6 +905,8 @@ github.com/gorilla/mux v1.8.0/go.mod h1:DVbg23sWSpFRCP0SfiEN6jmj59UnW/n46BH5rLB7
github.com/gorilla/websocket v1.4.2/go.mod h1:YR8l580nyteQvAITg2hZ9XVh4b55+EU/adAjf1fMHhE=
github.com/grafana/alerting v0.0.0-20250729175202-b4b881b7b263/go.mod h1:VKxaR93Gff0ZlO2sPcdPVob1a/UzArFEW5zx3Bpyhls=
github.com/grafana/alerting v0.0.0-20251009192429-9427c24835ae/go.mod h1:VGjS5gDwWEADPP6pF/drqLxEImgeuHlEW5u8E5EfIrM=
github.com/grafana/alerting v0.0.0-20260112110054-6c6f13659ad3 h1:KVncUdAc5YwY/OQmw6HgzJmbRKn6IwrhvtcBAd1yDHo=
github.com/grafana/alerting v0.0.0-20260112110054-6c6f13659ad3/go.mod h1:Oy4MthJqfErlieO14ryZXdukDrUACy8Lg56P3zP7S1k=
github.com/grafana/authlib v0.0.0-20250710201142-9542f2f28d43/go.mod h1:1fWkOiL+m32NBgRHZtlZGz2ji868tPZACYbqP3nBRJI=
github.com/grafana/authlib/types v0.0.0-20250710201142-9542f2f28d43/go.mod h1:qeWYbnWzaYGl88JlL9+DsP1GT2Cudm58rLtx13fKZdw=
github.com/grafana/authlib/types v0.0.0-20250926065801-df98203cff37/go.mod h1:qeWYbnWzaYGl88JlL9+DsP1GT2Cudm58rLtx13fKZdw=
@@ -996,6 +997,8 @@ github.com/grafana/prometheus-alertmanager v0.25.1-0.20250331083058-4563aec7a975
github.com/grafana/prometheus-alertmanager v0.25.1-0.20250331083058-4563aec7a975/go.mod h1:FGdGvhI40Dq+CTQaSzK9evuve774cgOUdGfVO04OXkw=
github.com/grafana/prometheus-alertmanager v0.25.1-0.20250604130045-92c8f6389b36 h1:AjZ58JRw1ZieFH/SdsddF5BXtsDKt5kSrKNPWrzYz3Y=
github.com/grafana/prometheus-alertmanager v0.25.1-0.20250604130045-92c8f6389b36/go.mod h1:O/QP1BCm0HHIzbKvgMzqb5sSyH88rzkFk84F4TfJjBU=
github.com/grafana/prometheus-alertmanager v0.25.1-0.20260112162805-d29cc9cf7f0f h1:9tRhudagkQO2s61SLFLSziIdCm7XlkfypVKDxpcHokg=
github.com/grafana/prometheus-alertmanager v0.25.1-0.20260112162805-d29cc9cf7f0f/go.mod h1:AsVdCBeDFN9QbgpJg+8voDAcgsW0RmNvBd70ecMMdC0=
github.com/grafana/pyroscope-go/godeltaprof v0.1.8/go.mod h1:2+l7K7twW49Ct4wFluZD3tZ6e0SjanjcUUBPVD/UuGU=
github.com/grafana/sqlds/v4 v4.2.4/go.mod h1:BQRjUG8rOqrBI4NAaeoWrIMuoNgfi8bdhCJ+5cgEfLU=
github.com/grafana/sqlds/v4 v4.2.7/go.mod h1:BQRjUG8rOqrBI4NAaeoWrIMuoNgfi8bdhCJ+5cgEfLU=
@@ -1911,6 +1914,7 @@ go.opentelemetry.io/otel/exporters/otlp/otlptrace/otlptracegrpc v1.37.0/go.mod h
go.opentelemetry.io/otel/exporters/otlp/otlptrace/otlptracehttp v1.19.0/go.mod h1:oVdCUtjq9MK9BlS7TtucsQwUcXcymNiEDjgDD2jMtZU=
go.opentelemetry.io/otel/exporters/otlp/otlptrace/otlptracehttp v1.22.0/go.mod h1:hYwym2nDEeZfG/motx0p7L7J1N1vyzIThemQsb4g2qY=
go.opentelemetry.io/otel/exporters/otlp/otlptrace/otlptracehttp v1.28.0/go.mod h1:Y5+XiUG4Emn1hTfciPzGPJaSI+RpDts6BnCIir0SLqk=
go.opentelemetry.io/otel/exporters/otlp/otlptrace/otlptracehttp v1.36.0/go.mod h1:r49hO7CgrxY9Voaj3Xe8pANWtr0Oq916d0XAmOoCZAQ=
go.opentelemetry.io/otel/exporters/prometheus v0.58.0/go.mod h1:7qo/4CLI+zYSNbv0GMNquzuss2FVZo3OYrGh96n4HNc=
go.opentelemetry.io/otel/exporters/stdout/stdoutmetric v1.36.0/go.mod h1:dowW6UsM9MKbJq5JTz2AMVp3/5iW5I/TStsk8S+CfHw=
go.opentelemetry.io/otel/exporters/stdout/stdouttrace v1.36.0/go.mod h1:PD57idA/AiFD5aqoxGxCvT/ILJPeHy3MjqU/NS7KogY=
+2 -4
@@ -62,8 +62,7 @@
"stats": "webpack --mode production --config scripts/webpack/webpack.prod.js --profile --json > compilation-stats.json",
"storybook": "yarn workspace @grafana/ui storybook --ci",
"storybook:build": "yarn workspace @grafana/ui storybook:build",
"themes-schema": "typescript-json-schema ./tsconfig.json NewThemeOptions --include 'packages/grafana-data/src/themes/createTheme.ts' --out public/app/features/theme-playground/schema.generated.json",
"themes-generate": "yarn themes-schema && esbuild --target=es6 ./scripts/cli/generateSassVariableFiles.ts --bundle --conditions=@grafana-app/source --platform=node --tsconfig=./scripts/cli/tsconfig.json | node",
"themes-generate": "yarn workspace @grafana/data themes-schema && esbuild --target=es6 ./scripts/cli/generateSassVariableFiles.ts --bundle --conditions=@grafana-app/source --platform=node --tsconfig=./scripts/cli/tsconfig.json | node",
"themes:usage": "eslint . --ignore-pattern '*.test.ts*' --ignore-pattern '*.spec.ts*' --cache --plugin '@grafana' --rule '{ @grafana/theme-token-usage: \"error\" }'",
"typecheck": "tsc --noEmit && yarn run packages:typecheck",
"plugins:build-bundled": "echo 'bundled plugins are no longer supported'",
@@ -254,7 +253,6 @@
"ts-jest": "29.4.0",
"ts-node": "10.9.2",
"typescript": "5.9.2",
"typescript-json-schema": "^0.65.1",
"webpack": "5.101.0",
"webpack-assets-manifest": "^5.1.0",
"webpack-cli": "6.0.1",
@@ -265,7 +263,7 @@
"webpackbar": "^7.0.0",
"yaml": "^2.0.0",
"yargs": "^18.0.0",
"zod": "^4.0.0"
"zod": "^4.3.0"
},
"dependencies": {
"@bsull/augurs": "^0.10.0",
@@ -34,6 +34,8 @@ export function createBaseQuery({ baseURL }: CreateBaseQueryOptions): BaseQueryF
getBackendSrv().fetch({
...requestOptions,
url: baseURL + requestOptions.url,
// Default to GET so backend_srv correctly skips success alerts for queries
method: requestOptions.method ?? 'GET',
showErrorAlert: requestOptions.showErrorAlert ?? false,
data: requestOptions.body,
headers,
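The hunk above defaults `method` to `'GET'` and `showErrorAlert` to `false` with nullish coalescing. A minimal sketch of that defaulting pattern (the `RequestOptions` type and `withDefaults` helper are illustrative, not the real backend_srv API): `??` only substitutes when the caller passed `null` or `undefined`, so explicit caller values are preserved.

```typescript
// Illustrative types only; not the actual backend_srv request options.
interface RequestOptions {
  url: string;
  method?: string;
  showErrorAlert?: boolean;
  body?: unknown;
}

// Apply the same defaults as the hunk above: GET unless specified,
// error alerts off unless the caller opts in.
function withDefaults(baseURL: string, requestOptions: RequestOptions) {
  return {
    ...requestOptions,
    url: baseURL + requestOptions.url,
    method: requestOptions.method ?? 'GET',
    showErrorAlert: requestOptions.showErrorAlert ?? false,
  };
}

const req = withDefaults('/apis/scope.grafana.app', { url: '/scopes' });
// req.method === 'GET', req.showErrorAlert === false
```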
+7 -3
@@ -47,11 +47,12 @@
"LICENSE_APACHE2"
],
"scripts": {
"build": "tsc -p ./tsconfig.build.json && rollup -c rollup.config.ts --configPlugin esbuild",
"build": "yarn themes-schema && tsc -p ./tsconfig.build.json && rollup -c rollup.config.ts --configPlugin esbuild",
"clean": "rimraf ./dist ./compiled ./unstable ./package.tgz",
"typecheck": "tsc --emitDeclarationOnly false --noEmit",
"prepack": "cp package.json package.json.bak && node ../../scripts/prepare-npm-package.js",
"postpack": "mv package.json.bak package.json"
"postpack": "mv package.json.bak package.json",
"themes-schema": "tsx ./src/themes/scripts/generateSchema.ts"
},
"dependencies": {
"@braintree/sanitize-url": "7.0.1",
@@ -81,10 +82,12 @@
"tinycolor2": "1.6.0",
"tslib": "2.8.1",
"uplot": "1.6.32",
"xss": "^1.0.14"
"xss": "^1.0.14",
"zod": "^4.3.0"
},
"devDependencies": {
"@grafana/scenes": "6.38.0",
"@rollup/plugin-json": "6.1.0",
"@rollup/plugin-node-resolve": "16.0.1",
"@testing-library/react": "16.3.0",
"@types/history": "4.7.11",
@@ -101,6 +104,7 @@
"rollup": "^4.22.4",
"rollup-plugin-esbuild": "6.2.1",
"rollup-plugin-node-externals": "^8.0.0",
"tsx": "^4.21.0",
"typescript": "5.9.2"
},
"peerDependencies": {
+3 -2
@@ -1,3 +1,4 @@
import json from '@rollup/plugin-json';
import { createRequire } from 'node:module';
import { entryPoint, plugins, esmOutput, cjsOutput } from '../rollup.config.parts';
@@ -8,13 +9,13 @@ const pkg = rq('./package.json');
export default [
{
input: entryPoint,
plugins,
plugins: [...plugins, json()],
output: [cjsOutput(pkg, 'grafana-data'), esmOutput(pkg, 'grafana-data')],
treeshake: false,
},
{
input: 'src/unstable.ts',
plugins,
plugins: [...plugins, json()],
output: [cjsOutput(pkg, 'grafana-data'), esmOutput(pkg, 'grafana-data')],
treeshake: false,
},
@@ -106,3 +106,4 @@ export { findNumericFieldMinMax } from '../field/fieldOverrides';
export { type PanelOptionsSupplier } from '../panel/PanelPlugin';
export { sanitize, sanitizeUrl } from '../text/sanitize';
export { type NestedValueAccess, type NestedPanelOptions, isNestedPanelOptions } from '../utils/OptionsUIBuilders';
export { NewThemeOptionsSchema } from '../themes/createTheme';
@@ -1,83 +1,103 @@
import { merge } from 'lodash';
import { z } from 'zod';
import { alpha, darken, emphasize, getContrastRatio, lighten } from './colorManipulator';
import { palette } from './palette';
import { DeepPartial, ThemeRichColor } from './types';
import { DeepRequired, ThemeRichColor, ThemeRichColorInputSchema } from './types';
const ThemeColorsModeSchema = z.enum(['light', 'dark']);
/** @internal */
export type ThemeColorsMode = 'light' | 'dark';
export type ThemeColorsMode = z.infer<typeof ThemeColorsModeSchema>;
const createThemeColorsBaseSchema = <TColor>(color: TColor) =>
z
.object({
mode: ThemeColorsModeSchema,
primary: color,
secondary: color,
info: color,
error: color,
success: color,
warning: color,
text: z.object({
primary: z.string().optional(),
secondary: z.string().optional(),
disabled: z.string().optional(),
link: z.string().optional(),
/** Used for auto white or dark text on colored backgrounds */
maxContrast: z.string().optional(),
}),
background: z.object({
/** Dashboard and body background */
canvas: z.string().optional(),
/** Primary content pane background (panels etc) */
primary: z.string().optional(),
/** Cards and elements that need to stand out on the primary background */
secondary: z.string().optional(),
/**
* For popovers and menu backgrounds. This is the same color as primary in most light themes but in dark
* themes it has a brighter shade to help give it contrast against the primary background.
**/
elevated: z.string().optional(),
}),
border: z.object({
weak: z.string().optional(),
medium: z.string().optional(),
strong: z.string().optional(),
}),
gradients: z.object({
brandVertical: z.string().optional(),
brandHorizontal: z.string().optional(),
}),
action: z.object({
/** Used for selected menu item / select option */
selected: z.string().optional(),
/**
* @alpha (Do not use from plugins)
* Used for selected items when background only change is not enough (Currently only used for FilterPill)
**/
selectedBorder: z.string().optional(),
/** Used for hovered menu item / select option */
hover: z.string().optional(),
/** Used for button/colored background hover opacity */
hoverOpacity: z.number().optional(),
/** Used for focused menu item / select option */
focus: z.string().optional(),
/** Used for disabled buttons and inputs */
disabledBackground: z.string().optional(),
/** Disabled text */
disabledText: z.string().optional(),
/** Disabled opacity */
disabledOpacity: z.number().optional(),
}),
hoverFactor: z.number(),
contrastThreshold: z.number(),
tonalOffset: z.number(),
})
.partial();
// Need to override the zod type to include the generic properly
/** @internal */
export interface ThemeColorsBase<TColor> {
mode: ThemeColorsMode;
export type ThemeColorsBase<TColor> = DeepRequired<
Omit<
z.infer<ReturnType<typeof createThemeColorsBaseSchema>>,
'primary' | 'secondary' | 'info' | 'error' | 'success' | 'warning'
>
> & {
primary: TColor;
secondary: TColor;
info: TColor;
error: TColor;
success: TColor;
warning: TColor;
text: {
primary: string;
secondary: string;
disabled: string;
link: string;
/** Used for auto white or dark text on colored backgrounds */
maxContrast: string;
};
background: {
/** Dashboard and body background */
canvas: string;
/** Primary content pane background (panels etc) */
primary: string;
/** Cards and elements that need to stand out on the primary background */
secondary: string;
/**
* For popovers and menu backgrounds. This is the same color as primary in most light themes but in dark
* themes it has a brighter shade to help give it contrast against the primary background.
**/
elevated: string;
};
border: {
weak: string;
medium: string;
strong: string;
};
gradients: {
brandVertical: string;
brandHorizontal: string;
};
action: {
/** Used for selected menu item / select option */
selected: string;
/**
* @alpha (Do not use from plugins)
* Used for selected items when background only change is not enough (Currently only used for FilterPill)
**/
selectedBorder: string;
/** Used for hovered menu item / select option */
hover: string;
/** Used for button/colored background hover opacity */
hoverOpacity: number;
/** Used for focused menu item / select option */
focus: string;
/** Used for disabled buttons and inputs */
disabledBackground: string;
/** Disabled text */
disabledText: string;
/** Disabled opacity */
disabledOpacity: number;
};
hoverFactor: number;
contrastThreshold: number;
tonalOffset: number;
}
};
export interface ThemeHoverStrengh {}
@@ -89,8 +109,10 @@ export interface ThemeColors extends ThemeColorsBase<ThemeRichColor> {
emphasize(color: string, amount?: number): string;
}
export const ThemeColorsInputSchema = createThemeColorsBaseSchema(ThemeRichColorInputSchema);
/** @internal */
export type ThemeColorsInput = DeepPartial<ThemeColorsBase<ThemeRichColor>>;
export type ThemeColorsInput = z.infer<typeof ThemeColorsInputSchema>;
class DarkColors implements ThemeColorsBase<Partial<ThemeRichColor>> {
mode: ThemeColorsMode = 'dark';
@@ -1,3 +1,5 @@
import { z } from 'zod';
/** @beta */
export interface ThemeShape {
/**
@@ -34,9 +36,12 @@ export interface Radii {
}
/** @internal */
export interface ThemeShapeInput {
borderRadius?: number;
}
export const ThemeShapeInputSchema = z.object({
borderRadius: z.int().nonnegative().optional(),
});
/** @internal */
export type ThemeShapeInput = z.infer<typeof ThemeShapeInputSchema>;
export function createShape(options: ThemeShapeInput): ThemeShape {
const baseBorderRadius = options.borderRadius ?? 6;
@@ -1,11 +1,15 @@
// Code based on Material UI
// The MIT License (MIT)
// Copyright (c) 2014 Call-Em-All
import { z } from 'zod';
/** @internal */
export type ThemeSpacingOptions = {
gridSize?: number;
};
export const ThemeSpacingOptionsSchema = z.object({
gridSize: z.int().positive().optional(),
});
/** @internal */
export type ThemeSpacingOptions = z.infer<typeof ThemeSpacingOptionsSchema>;
/** @internal */
export type ThemeSpacingArgument = number | string;
+24 -15
@@ -1,28 +1,37 @@
import * as z from 'zod';
import { createBreakpoints } from './breakpoints';
import { createColors, ThemeColorsInput } from './createColors';
import { createColors, ThemeColorsInputSchema } from './createColors';
import { createComponents } from './createComponents';
import { createShadows } from './createShadows';
import { createShape, ThemeShapeInput } from './createShape';
import { createSpacing, ThemeSpacingOptions } from './createSpacing';
import { createShape, ThemeShapeInputSchema } from './createShape';
import { createSpacing, ThemeSpacingOptionsSchema } from './createSpacing';
import { createTransitions } from './createTransitions';
import { createTypography, ThemeTypographyInput } from './createTypography';
import { createTypography, ThemeTypographyInputSchema } from './createTypography';
import { createV1Theme } from './createV1Theme';
import { createVisualizationColors, ThemeVisualizationColorsInput } from './createVisualizationColors';
import { createVisualizationColors, ThemeVisualizationColorsInputSchema } from './createVisualizationColors';
import { GrafanaTheme2 } from './types';
import { zIndex } from './zIndex';
/** @internal */
export interface NewThemeOptions {
name?: string;
colors?: ThemeColorsInput;
spacing?: ThemeSpacingOptions;
shape?: ThemeShapeInput;
typography?: ThemeTypographyInput;
visualization?: ThemeVisualizationColorsInput;
}
export const NewThemeOptionsSchema = z.object({
name: z.string(),
id: z.string(),
colors: ThemeColorsInputSchema.optional(),
spacing: ThemeSpacingOptionsSchema.optional(),
shape: ThemeShapeInputSchema.optional(),
typography: ThemeTypographyInputSchema.optional(),
visualization: ThemeVisualizationColorsInputSchema.optional(),
});
/** @internal */
export function createTheme(options: NewThemeOptions = {}): GrafanaTheme2 {
export type NewThemeOptions = z.infer<typeof NewThemeOptionsSchema>;
/** @internal */
export function createTheme(
options: Omit<NewThemeOptions, 'id' | 'name'> & {
name?: NewThemeOptions['name'];
} = {}
): GrafanaTheme2 {
const {
name,
colors: colorsInput = {},
@@ -1,6 +1,7 @@
// Code based on Material UI
// The MIT License (MIT)
// Copyright (c) 2014 Call-Em-All
import { z } from 'zod';
import { ThemeColors } from './createColors';
@@ -40,18 +41,20 @@ export interface ThemeTypographyVariant {
letterSpacing?: string;
}
export interface ThemeTypographyInput {
fontFamily?: string;
fontFamilyMonospace?: string;
fontSize?: number;
fontWeightLight?: number;
fontWeightRegular?: number;
fontWeightMedium?: number;
fontWeightBold?: number;
// hat's the font-size on the html element.
export const ThemeTypographyInputSchema = z.object({
fontFamily: z.string().optional(),
fontFamilyMonospace: z.string().optional(),
fontSize: z.number().positive().optional(),
fontWeightLight: z.number().positive().optional(),
fontWeightRegular: z.number().positive().optional(),
fontWeightMedium: z.number().positive().optional(),
fontWeightBold: z.number().positive().optional(),
// what's the font-size on the html element.
// 16px is the default font-size used by browsers.
htmlFontSize?: number;
}
htmlFontSize: z.number().positive().optional(),
});
export type ThemeTypographyInput = z.infer<typeof ThemeTypographyInputSchema>;
const defaultFontFamily = "'Inter', 'Helvetica', 'Arial', sans-serif";
const defaultFontFamilyMonospace = "'Roboto Mono', monospace";
@@ -1,3 +1,5 @@
import { z } from 'zod';
import { FALLBACK_COLOR } from '../types/fieldColor';
import { ThemeColors } from './createColors';
@@ -26,29 +28,44 @@ export interface ThemeVizColor<T extends ThemeVizColorName> {
type ThemeVizColorName = 'red' | 'orange' | 'yellow' | 'green' | 'blue' | 'purple';
type ThemeVizColorShadeName<T extends ThemeVizColorName> =
| `super-light-${T}`
| `light-${T}`
| T
| `semi-dark-${T}`
| `dark-${T}`;
const createShadeSchema = <T>(color: T extends ThemeVizColorName ? T : never) =>
z.enum([`super-light-${color}`, `light-${color}`, color, `semi-dark-${color}`, `dark-${color}`]);
type ThemeVizHueGeneric<T> = T extends ThemeVizColorName
? {
name: T;
shades: Array<ThemeVizColor<T>>;
}
: never;
type ThemeVizColorShadeName<T extends ThemeVizColorName> = z.infer<ReturnType<typeof createShadeSchema<T>>>;
const createHueSchema = <T>(color: T extends ThemeVizColorName ? T : never) =>
z.object({
name: z.literal(color),
shades: z.array(
z.object({
color: z.string(),
name: createShadeSchema(color),
aliases: z.array(z.string()).optional(),
primary: z.boolean().optional(),
})
),
});
const ThemeVizHueSchema = z.union([
createHueSchema('red'),
createHueSchema('orange'),
createHueSchema('yellow'),
createHueSchema('green'),
createHueSchema('blue'),
createHueSchema('purple'),
]);
/**
* @alpha
*/
export type ThemeVizHue = ThemeVizHueGeneric<ThemeVizColorName>;
export type ThemeVizHue = z.infer<typeof ThemeVizHueSchema>;
export type ThemeVisualizationColorsInput = {
hues?: ThemeVizHue[];
palette?: string[];
};
export const ThemeVisualizationColorsInputSchema = z.object({
hues: z.array(ThemeVizHueSchema).optional(),
palette: z.array(z.string()).optional(),
});
export type ThemeVisualizationColorsInput = z.infer<typeof ThemeVisualizationColorsInputSchema>;
/**
* @internal
+14 -11
@@ -1,6 +1,6 @@
import { Registry, RegistryItem } from '../utils/Registry';
import { createTheme } from './createTheme';
import { createTheme, NewThemeOptionsSchema } from './createTheme';
import * as extraThemes from './themeDefinitions';
import { GrafanaTheme2 } from './types';
@@ -42,9 +42,6 @@ export function getBuiltInThemes(allowedExtras: string[]) {
return sortedThemes;
}
/**
* There is also a backend list at pkg/services/preference/themes.go
*/
const themeRegistry = new Registry<ThemeRegistryItem>(() => {
return [
{ id: 'system', name: 'System preference', build: getSystemPreferenceTheme },
@@ -53,13 +50,19 @@ const themeRegistry = new Registry<ThemeRegistryItem>(() => {
];
});
for (const [id, theme] of Object.entries(extraThemes)) {
themeRegistry.register({
id,
name: theme.name ?? '',
build: () => createTheme(theme),
isExtra: true,
});
for (const [name, json] of Object.entries(extraThemes)) {
const result = NewThemeOptionsSchema.safeParse(json);
if (!result.success) {
console.error(`Invalid theme definition for theme ${name}: ${result.error.message}`);
} else {
const theme = result.data;
themeRegistry.register({
id: theme.id,
name: theme.name,
build: () => createTheme(theme),
isExtra: true,
});
}
}
function getSystemPreferenceTheme() {
@@ -0,0 +1,608 @@
{
"$schema": "http://json-schema.org/draft-07/schema#",
"type": "object",
"properties": {
"name": {
"type": "string"
},
"id": {
"type": "string"
},
"colors": {
"type": "object",
"properties": {
"mode": {
"type": "string",
"enum": ["light", "dark"]
},
"primary": {
"type": "object",
"properties": {
"name": {
"type": "string"
},
"main": {
"type": "string"
},
"shade": {
"type": "string"
},
"text": {
"type": "string"
},
"border": {
"type": "string"
},
"transparent": {
"type": "string"
},
"borderTransparent": {
"type": "string"
},
"contrastText": {
"type": "string"
}
},
"additionalProperties": false
},
"secondary": {
"type": "object",
"properties": {
"name": {
"type": "string"
},
"main": {
"type": "string"
},
"shade": {
"type": "string"
},
"text": {
"type": "string"
},
"border": {
"type": "string"
},
"transparent": {
"type": "string"
},
"borderTransparent": {
"type": "string"
},
"contrastText": {
"type": "string"
}
},
"additionalProperties": false
},
"info": {
"type": "object",
"properties": {
"name": {
"type": "string"
},
"main": {
"type": "string"
},
"shade": {
"type": "string"
},
"text": {
"type": "string"
},
"border": {
"type": "string"
},
"transparent": {
"type": "string"
},
"borderTransparent": {
"type": "string"
},
"contrastText": {
"type": "string"
}
},
"additionalProperties": false
},
"error": {
"type": "object",
"properties": {
"name": {
"type": "string"
},
"main": {
"type": "string"
},
"shade": {
"type": "string"
},
"text": {
"type": "string"
},
"border": {
"type": "string"
},
"transparent": {
"type": "string"
},
"borderTransparent": {
"type": "string"
},
"contrastText": {
"type": "string"
}
},
"additionalProperties": false
},
"success": {
"type": "object",
"properties": {
"name": {
"type": "string"
},
"main": {
"type": "string"
},
"shade": {
"type": "string"
},
"text": {
"type": "string"
},
"border": {
"type": "string"
},
"transparent": {
"type": "string"
},
"borderTransparent": {
"type": "string"
},
"contrastText": {
"type": "string"
}
},
"additionalProperties": false
},
"warning": {
"type": "object",
"properties": {
"name": {
"type": "string"
},
"main": {
"type": "string"
},
"shade": {
"type": "string"
},
"text": {
"type": "string"
},
"border": {
"type": "string"
},
"transparent": {
"type": "string"
},
"borderTransparent": {
"type": "string"
},
"contrastText": {
"type": "string"
}
},
"additionalProperties": false
},
"text": {
"type": "object",
"properties": {
"primary": {
"type": "string"
},
"secondary": {
"type": "string"
},
"disabled": {
"type": "string"
},
"link": {
"type": "string"
},
"maxContrast": {
"type": "string"
}
},
"additionalProperties": false
},
"background": {
"type": "object",
"properties": {
"canvas": {
"type": "string"
},
"primary": {
"type": "string"
},
"secondary": {
"type": "string"
},
"elevated": {
"type": "string"
}
},
"additionalProperties": false
},
"border": {
"type": "object",
"properties": {
"weak": {
"type": "string"
},
"medium": {
"type": "string"
},
"strong": {
"type": "string"
}
},
"additionalProperties": false
},
"gradients": {
"type": "object",
"properties": {
"brandVertical": {
"type": "string"
},
"brandHorizontal": {
"type": "string"
}
},
"additionalProperties": false
},
"action": {
"type": "object",
"properties": {
"selected": {
"type": "string"
},
"selectedBorder": {
"type": "string"
},
"hover": {
"type": "string"
},
"hoverOpacity": {
"type": "number"
},
"focus": {
"type": "string"
},
"disabledBackground": {
"type": "string"
},
"disabledText": {
"type": "string"
},
"disabledOpacity": {
"type": "number"
}
},
"additionalProperties": false
},
"hoverFactor": {
"type": "number"
},
"contrastThreshold": {
"type": "number"
},
"tonalOffset": {
"type": "number"
}
},
"additionalProperties": false
},
"spacing": {
"type": "object",
"properties": {
"gridSize": {
"type": "integer",
"exclusiveMinimum": 0,
"maximum": 9007199254740991
}
},
"additionalProperties": false
},
"shape": {
"type": "object",
"properties": {
"borderRadius": {
"type": "integer",
"minimum": 0,
"maximum": 9007199254740991
}
},
"additionalProperties": false
},
"typography": {
"type": "object",
"properties": {
"fontFamily": {
"type": "string"
},
"fontFamilyMonospace": {
"type": "string"
},
"fontSize": {
"type": "number",
"exclusiveMinimum": 0
},
"fontWeightLight": {
"type": "number",
"exclusiveMinimum": 0
},
"fontWeightRegular": {
"type": "number",
"exclusiveMinimum": 0
},
"fontWeightMedium": {
"type": "number",
"exclusiveMinimum": 0
},
"fontWeightBold": {
"type": "number",
"exclusiveMinimum": 0
},
"htmlFontSize": {
"type": "number",
"exclusiveMinimum": 0
}
},
"additionalProperties": false
},
"visualization": {
"type": "object",
"properties": {
"hues": {
"type": "array",
"items": {
"anyOf": [
{
"type": "object",
"properties": {
"name": {
"type": "string",
"const": "red"
},
"shades": {
"type": "array",
"items": {
"type": "object",
"properties": {
"color": {
"type": "string"
},
"name": {
"type": "string",
"enum": ["super-light-red", "light-red", "red", "semi-dark-red", "dark-red"]
},
"aliases": {
"type": "array",
"items": {
"type": "string"
}
},
"primary": {
"type": "boolean"
}
},
"required": ["color", "name"],
"additionalProperties": false
}
}
},
"required": ["name", "shades"],
"additionalProperties": false
},
{
"type": "object",
"properties": {
"name": {
"type": "string",
"const": "orange"
},
"shades": {
"type": "array",
"items": {
"type": "object",
"properties": {
"color": {
"type": "string"
},
"name": {
"type": "string",
"enum": ["super-light-orange", "light-orange", "orange", "semi-dark-orange", "dark-orange"]
},
"aliases": {
"type": "array",
"items": {
"type": "string"
}
},
"primary": {
"type": "boolean"
}
},
"required": ["color", "name"],
"additionalProperties": false
}
}
},
"required": ["name", "shades"],
"additionalProperties": false
},
{
"type": "object",
"properties": {
"name": {
"type": "string",
"const": "yellow"
},
"shades": {
"type": "array",
"items": {
"type": "object",
"properties": {
"color": {
"type": "string"
},
"name": {
"type": "string",
"enum": ["super-light-yellow", "light-yellow", "yellow", "semi-dark-yellow", "dark-yellow"]
},
"aliases": {
"type": "array",
"items": {
"type": "string"
}
},
"primary": {
"type": "boolean"
}
},
"required": ["color", "name"],
"additionalProperties": false
}
}
},
"required": ["name", "shades"],
"additionalProperties": false
},
{
"type": "object",
"properties": {
"name": {
"type": "string",
"const": "green"
},
"shades": {
"type": "array",
"items": {
"type": "object",
"properties": {
"color": {
"type": "string"
},
"name": {
"type": "string",
"enum": ["super-light-green", "light-green", "green", "semi-dark-green", "dark-green"]
},
"aliases": {
"type": "array",
"items": {
"type": "string"
}
},
"primary": {
"type": "boolean"
}
},
"required": ["color", "name"],
"additionalProperties": false
}
}
},
"required": ["name", "shades"],
"additionalProperties": false
},
{
"type": "object",
"properties": {
"name": {
"type": "string",
"const": "blue"
},
"shades": {
"type": "array",
"items": {
"type": "object",
"properties": {
"color": {
"type": "string"
},
"name": {
"type": "string",
"enum": ["super-light-blue", "light-blue", "blue", "semi-dark-blue", "dark-blue"]
},
"aliases": {
"type": "array",
"items": {
"type": "string"
}
},
"primary": {
"type": "boolean"
}
},
"required": ["color", "name"],
"additionalProperties": false
}
}
},
"required": ["name", "shades"],
"additionalProperties": false
},
{
"type": "object",
"properties": {
"name": {
"type": "string",
"const": "purple"
},
"shades": {
"type": "array",
"items": {
"type": "object",
"properties": {
"color": {
"type": "string"
},
"name": {
"type": "string",
"enum": ["super-light-purple", "light-purple", "purple", "semi-dark-purple", "dark-purple"]
},
"aliases": {
"type": "array",
"items": {
"type": "string"
}
},
"primary": {
"type": "boolean"
}
},
"required": ["color", "name"],
"additionalProperties": false
}
}
},
"required": ["name", "shades"],
"additionalProperties": false
}
]
}
},
"palette": {
"type": "array",
"items": {
"type": "string"
}
}
},
"additionalProperties": false
}
},
"required": ["name", "id"],
"additionalProperties": false
}

Some files were not shown because too many files have changed in this diff.