Compare commits


78 Commits

Author SHA1 Message Date
Ryan McKinley
ae0cf22b46 merge main 2025-12-11 09:47:23 +03:00
Ryan McKinley
2fc1210b38 Stars: Enable the collections apiserver (#115076) 2025-12-11 06:36:09 +00:00
Ryan McKinley
8542b2f6a2 Live: Move dashboard events from the raw http server to the apiserver (#115066) 2025-12-11 09:26:35 +03:00
Ryan McKinley
a6043deb33 UnifiedStorage: Include RV when fieldSelectors are processed in the backend (#115110) 2025-12-11 09:15:01 +03:00
Stephanie Hingtgen
3697c8dafc Dashboards: Fix logging for conversions (#115126) 2025-12-10 23:52:35 -06:00
Charandas
3a4022061d K8s: discourage nil authorizer return for APIBuilder as well (#115116) 2025-12-10 23:06:09 +00:00
beejeebus
2a65e0cdcb Revert "Wire up data source config metrics correctly"
This reverts commit e433cfa02d.
2025-12-10 17:38:00 -05:00
Paul Marbach
000c00aee9 Sparkline: Improve min/max logic to avoid issues for very narrow deltas (#115030)
* Sparkline: Prevent infinite loop when rendering a sparkline with a single value

* some tests for this case

* refactor out utils, experiment with getting highlightIndex working

* add comments throughout for #112977

* remove unused import

* Update Sparkline.test.tsx

* fix points mode rendering

* Sparkline: Improve min/max logic to avoid issues for very narrow deltas

* spread all config

* defaults deep

* delete unused import

* remove go.work.sum delta

* line break at end of file
2025-12-10 16:54:29 -05:00
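The Sparkline commits above guard against degenerate value ranges (a single value, or a very narrow delta) that could send tick calculation into an infinite loop. A minimal Go sketch of the underlying min/max padding idea — the epsilon, padding factor, and function name are illustrative, not the actual Sparkline implementation:

```go
package main

import (
	"fmt"
	"math"
)

// padRange widens a degenerate y-axis range so downstream tick
// calculation never operates on a zero (or near-zero) delta.
// The epsilon and 1% padding are illustrative choices.
func padRange(min, max float64) (float64, float64) {
	const eps = 1e-9
	if math.Abs(max-min) > eps {
		return min, max // normal range, leave untouched
	}
	if min == 0 {
		return -1, 1 // all-zero series: pick an arbitrary symmetric range
	}
	pad := math.Abs(min) * 0.01
	return min - pad, max + pad
}

func main() {
	lo, hi := padRange(5, 5) // single-value series
	fmt.Println(lo, hi)
}
```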
beejeebus
e433cfa02d Wire up data source config metrics correctly
Fix metrics for data source configuration CRUD.

Make sure to only create one histogram and only register it with prometheus once.
2025-12-10 16:16:22 -05:00
Will Assis
30045c02c0 unified-storage: add index on resource_history key_path column (#115113) 2025-12-10 16:03:58 -05:00
Paul Marbach
63bfc1596c Gauge: Updates to the design of the panel edit (#115097)
* Gauge: Updates to the design of the panel edit

* i18n

* remove unused const

* i18n

* fix gdev and migration tests

* reduce bar gauge glow factor
2025-12-10 21:03:44 +00:00
Charandas
da14be859e Authorization: panic when specific authorizer returns nil (#114982) 2025-12-10 13:01:34 -08:00
ismail simsek
30d3bb39c0 Chore: Remove deprecated language_provider methods in prometheus package (#114361)
* remove deprecated language provider and its methods

* remove more deprecated code

* yarn lint:prune
2025-12-10 21:07:18 +01:00
Yunwen Zheng
b3f98d4cc3 SaveProvisionedDashboard: Provisioned dashboard "Save as copy" flow (#114435)
* SaveProvisionedDashboard: Save as copy flow

* add copy tags toggle
2025-12-10 14:01:51 -05:00
Isabel Matwawana
f368139802 Docs: Add permissions information (#115107) 2025-12-10 18:24:36 +00:00
Adela Almasan
7ae9f94de7 Suggestions: Handle errors (#114868)
Co-authored-by: Paul Marbach <paul.marbach@grafana.com>
2025-12-10 18:17:24 +00:00
Alexander Akhmetov
a46f0a222e Alerting: Initialize rule routine with initial alert rule fingerprint (#114979)
Alerting: Initialize rule routine with initial fingerprint
2025-12-10 19:14:30 +01:00
Paul Marbach
1146ac790c Sparkline: Prevent infinite loop when rendering a sparkline with a single value (#114203)
* Sparkline: Prevent infinite loop when rendering a sparkline with a single value

* some tests for this case

* refactor out utils, experiment with getting highlightIndex working

* add comments throughout for #112977

* remove unused import

* Update Sparkline.test.tsx

* fix points mode rendering
2025-12-10 12:37:05 -05:00
Gábor Farkas
a847f36df2 datasources: querier: log caller (#115087) 2025-12-10 18:23:59 +01:00
Gabriel MABILLE
9e1fe16873 AuthZ: Remove automatic Admin grant for root folders and dashboards (#115098) 2025-12-10 10:00:24 -07:00
Tania
3ec1c27ad4 Chore: Migrate pluginsAutoUpdate flag to OpenFeature (#114404)
* Chore: Migrate pluginsAutoUpdate flag to OpenFeature

* Update workspace

* fixup! Chore: Migrate pluginsAutoUpdate flag to OpenFeature

* Add a test

* Refactor

* Apply suggestion from @hairyhenderson

Co-authored-by: Dave Henderson <dave.henderson@grafana.com>

* Apply suggestions

* Update pkg/services/updatemanager/plugins_test.go

Co-authored-by: Will Browne <wbrowne@users.noreply.github.com>

* Reorder code blocks

---------

Co-authored-by: Dave Henderson <dave.henderson@grafana.com>
Co-authored-by: Will Browne <wbrowne@users.noreply.github.com>
2025-12-10 17:40:30 +01:00
Yunwen Zheng
094b6a36dc Add feature flag: recentlyViewedDashboards (#115042) 2025-12-10 11:28:19 -05:00
Yuri Tseretyan
47f7b3e095 Alerting: Dedicated permission for Template testing API (#115032) 2025-12-10 10:56:29 -05:00
owensmallwood
5e7b900416 Unified Storage: Adds readme for setting up quotas/overrides (#115031)
* adds readme for setting up quotas/overrides

* updates namespace wording

* updates docs

* update test

* Revert "update test"

This reverts commit ad43e355ba.
2025-12-10 09:21:52 -06:00
Rafael Bortolon Paulovic
5eae7d4f22 feat: legacy ListIterator with batches (#115038)
* feat: legacy ListIterator with batches

* chore: address code review

* chore: remove nil check in nextBatch

* chore: move close before count check

* chore: add err field to batchingIterator for its own errors

* chore: remove unused import
2025-12-10 16:12:08 +01:00
Rafael Bortolon Paulovic
8c6ccdd1ab feat(dashboard): Org-aware cache for schema migration (#115025)
* fix: use dsIndexProvider cache on migrations

* chore: use same comment as before

* feat: org-aware TTL cache for schemaversion migration and warmup for single tenant

* chore: use LRU cache

* chore: change DefaultCacheTTL to 1 minute

* chore: address copilot reviews

* chore: use expirable cache

* chore: remove unused import
2025-12-10 16:09:16 +01:00
Cauê Marcondes
85c643ece9 Elasticsearch: Add default query mode config setting (#112540)
* elasticsearch: Add default query mode config setting

* doc

* syncing default query mode with url

* addressing PR comments
2025-12-10 15:07:22 +00:00
Galen Kistler
8272edda96 Logs: Default columns API (#114309)
* Logs Drilldown(app-platform): add LogsDrilldownDefaultColumns api

---------

Co-authored-by: L2D2Grafana <liza.detrick@grafana.com>
Co-authored-by: Austin Pond <austin.pond@grafana.com>
2025-12-10 09:02:14 -06:00
Todd Treece
e56c2c5156 Plugins App: Remove unused import (#115096) 2025-12-10 14:59:12 +00:00
Todd Treece
ac55fad1ba Plugins App: Switch to resource authorizer (#115019) 2025-12-10 09:12:26 -05:00
Will Browne
c4c1708e38 Plugins: Sync channel close for app installer readiness (#115078)
sync channel close
2025-12-10 14:07:55 +00:00
Sonia Aguilar
92a8dd8b53 Alerting: Add gh in CLAUDE.md (#114992)
add gh in CLAUDE.md
2025-12-10 14:52:48 +01:00
Ashley Harrison
cc1bba85e4 VizLegend: Always display header for screenreader users (#115003)
always display vizlegend header for screenreader users
2025-12-10 13:44:21 +00:00
Ashley Harrison
27482194e3 InteractiveTable: Improve accessibility and reenable tests (#115002)
* attempt at fixing some stuff

* tidy up

* prettier

* fix suppressions
2025-12-10 13:44:08 +00:00
Ida Štambuk
ea331dc0d3 Dashboards: Add variables with datasource to tracking (#114110) 2025-12-10 14:39:58 +01:00
Steve Simpson
baee9fb214 Alerting: Add historian.alerting app permissions to service identity. (#115082) 2025-12-10 13:05:10 +00:00
Will Browne
39f4b2a959 Plugins: Rename current meta provider to catalog provider (#114966)
rename cloud provider to catalog provider
2025-12-10 12:22:05 +00:00
Will Assis
755b479be4 unified-storage: make sql backend update key_path for kv store (#114879)
* unified-storage: update resource_history_update_rv.sql to populate key_path in resource_history
2025-12-10 07:06:06 -05:00
Dominik Prokop
532a2e5f4d VariablesEditableElement: Set margin correctly (#115079) 2025-12-10 13:04:42 +01:00
Matias Chomicki
a7bbca3451 Logs Panel: Emphasize log line, rename field (#114579)
* Logs: Rename attributes field

* LogLine: emphasize log line body

* LogLine: improve light mode

* Lint

* Update tests

* Only override colors if displayed fields are used

* Fix small font size ignored with displayed fields

* Fix types
2025-12-10 11:19:31 +01:00
dependabot[bot]
633332c750 deps(docker): bump alpine from 3.22.2 to 3.23.0 (#114816)
Bumps alpine from 3.22.2 to 3.23.0.

---
updated-dependencies:
- dependency-name: alpine
  dependency-version: 3.23.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-12-10 10:34:23 +01:00
Misi
8911785fdf Chore: Regenerate iam app objects with 0.48.5 sdk (#115035)
* Regenerate iam app with 0.48.5 sdk

* update ws

---------

Co-authored-by: Ryan McKinley <ryantxu@gmail.com>
2025-12-10 09:15:12 +00:00
Stephanie Hingtgen
4fd03bc05e Folders: Fix error handling for zanzana (#115056) 2025-12-10 11:45:22 +03:00
Hugo Häggmark
d1761606fb Plugins: Add PluginContext to plugins when scenes is disabled (#114989)
Plugins: Add PluginContext to plugins when scenes is disabled
2025-12-10 06:44:46 +01:00
grafana-pr-automation[bot]
4fe481ec81 I18n: Download translations from Crowdin (#115054)
New Crowdin translations by GitHub Action

Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
2025-12-10 00:42:27 +00:00
owensmallwood
6b50e2d730 Unified Storage: Update yaml decoding for quotas to accommodate top-level overrides key (#115049)
* update yaml decoding for quotas to accommodate top-level overrides key

* update test

* fix test indentation
2025-12-09 23:14:22 +00:00
Stephanie Hingtgen
ff43c175c8 Folders: Remove duplicate check; improve unit test (#115048) 2025-12-09 22:15:25 +00:00
Jacob Valdez
ae03b08c25 Docs: Creating upgrade guide content for annotation table bloat (#114883) 2025-12-09 16:12:49 -06:00
Todd Treece
0088e55b8f Plugins App: PluginMeta -> Meta (#115034) 2025-12-09 16:01:22 -05:00
Stephanie Hingtgen
d9fc183e39 Folders: Prevent circular dependencies on apis level (#115040) 2025-12-09 15:00:01 -06:00
Nathan Marrs
6bbb00d0a2 Short URL: Change default expiration to never (#115029)
* Short Links: Change default expiration to never expire (-1)

Previously, short links defaulted to expiring after 7 days. This change
updates the default to -1 (never expire) to prevent automatic deletion
of shared dashboard links.

Changes:
- conf/defaults.ini: Set expire_time = -1 and update comment
- conf/sample.ini: Set expire_time = -1 and update comment
- pkg/setting/setting.go: Update MustInt default from 7 to -1

The cleanup logic already handles -1 correctly (only runs when > 0),
so no changes needed there.

This unblocks progress on short URL feature improvements by ensuring
shared links remain accessible indefinitely by default.

* fix go

* update docs / comments

* update missed comment in sample.ini

* Revert "fix go"

This reverts commit e0d099ae31.

* chore: update workspace dependencies

Run 'make update-workspace' to sync Go workspace dependencies.
This updates go.mod and go.sum files to match the current workspace state.

* chore: add modowner for apps/quotas dependency

Assign @grafana/grafana-search-and-storage as owner for apps/quotas
dependency to satisfy modowners CI check.
2025-12-09 20:23:27 +00:00
Paul Marbach
83b0b14af6 Suggestions: Hook project up to auto-triaging (#114984) 2025-12-09 14:53:55 -05:00
Todd Treece
18280e1aa6 Chore: Add owner for quotas in go.mod (#115039) 2025-12-09 19:48:19 +00:00
Mihai Doarna
b3980eeec8 IAM: Add tracing for legacy stores (#114974) 2025-12-09 13:46:18 -06:00
owensmallwood
b8acfade21 Unified Storage: Adds debug logs for checking quotas (#115036)
* adds debug logs for checking quotas

* make update-workspace
2025-12-09 18:59:46 +00:00
Jesse David Peterson
1013d74f13 TimeRange: Additional keyboard shortcut t = to complement t + for zoom in (#115022)
feat(time-range): additional keyboard shortcut "t =" for zoom in
2025-12-09 14:35:43 -04:00
Marc M.
4b999cd943 FeatureToggles: Add multiPropsVariables (#115020) 2025-12-09 18:11:49 +01:00
Stephanie Hingtgen
747da28fe4 Docs: Remove unused feature toggle logsinfinitescrolling (#114983) 2025-12-09 10:52:01 -06:00
Matias Chomicki
3197892094 Log Line Details: Header options and inline icons improvements (#114479)
* LogLineDetailsHeader: introduce divider

* LogLineDetails: improve icons spacing

* useKeyBindings: close sidebar details with escape
2025-12-09 17:15:05 +01:00
Matias Chomicki
6e4de0f81c LogLine: add trailing spaces for copying (#114543)
* LogLine: add trailing spaces for copying

* Lint
2025-12-09 17:08:09 +01:00
Kristina Demeshchik
533ee1f078 Dashboard : Allow applying variable regex to display text (#114426)
* Ability to apply regex to display text

* Frontend tests

* scenes-react version

* lock file

* adjust tests input

* adjust inputs

* unused variable

* change data type

* unit tests

* bump scenes

* bump scenes

* Update docs

* V2->V1 conversion

* re-generate files

* update openai snapshots
2025-12-09 10:55:51 -05:00
Rafael Bortolon Paulovic
45e679eeba fix: use dsIndexProvider cache on schema migrations (#115018)
* fix: use dsIndexProvider cache on migrations

* chore: use same comment as before
2025-12-09 16:54:54 +01:00
owensmallwood
a3daf0e39d Unified storage: Add quotas app to apiserver (#114425)
* initial generation

* went through doc to add new resource

* added dummy kind so grafana will run

* added dummy handler and custom route

* fix app name

* gets custom route working - still a dummy route

* adds groupOverride to manifest

* adds quotas to grpc client and server

* WIP - trying to get api recognized - not working

* Gets route working

* fixes group and resource vars

* expects group and resource as separate params

* set content-type header on response

* removes Quotas kind and regens

* Update grafana-app-sdk to v0.48.5

* Update codegen

* updates manifest

* formatting

* updates grafana-app-sdk version to 0.48.5

* regen ResourceClient mocks

* adds tests

* remove commented code

* uncomment go mod tidy

* fix tests and make update workspace

* adds quotas app to codeowners

* formatting

* make gen-apps

* deletes temp file

* fix generated folder code

* make gofmt

* make gen-go

* make update-workspace

* add COPY apps/quotas to Dockerfile

* fix test mock

* fixes undefined NewFolderStatus()

* make gen-apps, and add func for NewFolderStatus

* make gen-apps again

* make update-workspace

* regen folder_object_gen.go

* gofmt

* fix linting

* apps/folder make update-workspace

* make gen-apps

* make gen-apps

* fixes enterprise_imports.go

* go get testcontainers

* adds feature toggle

* make update-workspace

* fix go mod

* fix another client mock

---------

Co-authored-by: Steve Simpson <steve@grafana.com>
2025-12-09 09:40:34 -06:00
Rafael Bortolon Paulovic
297e886e1b fix: remove dsIndexProvider from Convert_V2alpha1_to_V0 (#115017) 2025-12-09 16:33:43 +01:00
Mihai Doarna
8602ec7924 IAM: Add integration tests for team search (#114996)
add integration tests for team search
2025-12-09 17:31:38 +02:00
Renato Costa
83311049ad fix: create dashboard in legacy storage within transaction (#114808)
fix: create dashboard within transaction
2025-12-09 10:16:33 -05:00
Kristina Demeshchik
8c4b3d1702 Dashboard: Default dashboard folder to current folder when importing (#114929)
* Set default folder when importing dashboard

* console

* remove unused import
2025-12-09 09:59:21 -05:00
Santiago
b8ad272159 Alerting: Fix header precedence in the remote writer (#114999) 2025-12-09 15:38:57 +01:00
Torkel Ödegaard
b5f1573aef AppChrome: Increase header height from 40-48 (#115004)
AppChrome: Increase header height from 40-48
2025-12-09 15:19:50 +01:00
Jean-Philippe Quéméner
1f5fd1c0da chore(unified-storage): align how we do tracing (#114998) 2025-12-09 14:53:53 +01:00
Christian Simon
3e66c7ed21 CI: Add Docker Hub authentication to ephemeral instances workflow (#114851)
* CI: Add Docker Hub authentication to ephemeral instances workflow

Add Docker Hub login step to avoid unauthenticated image pull
rate-limiting in the ephemeral-instances-pr-comment workflow.

* Use the correct vault path
2025-12-09 13:15:15 +00:00
Alexander Akhmetov
c59d5d1c8e Alerting: Store instance annotations in alert rule state (#114975)
Alerting: Store annotations in alert instance state
2025-12-09 13:52:42 +01:00
Tobias Skarhed
6746c978b4 Scopes: Resolve path directly from leaf node bugfix (#114507)
* Resolve path directly from leaf node

* Add childrenLoaded field

* Add tests and remove parentNodeId from changeScopes

* Move parentNodeId parameter order

* Restore call order

* Undo superfluous change

* Add comments

* Make sure childrenLoaded state is properly set default to false

* Reference parent path

* Look for parent in state and fetch scopeNode if it is not available

* Check for undefined

* Add mock to test

* Set scopeNodeId with recent scopes

* Improve test selector

* Add scope node endpoint to mocks

* Never set childrenLoaded to true when inserting

* Remove unused import

* Pass on the already set childrenLoaded value

* Fix test
2025-12-09 13:44:00 +01:00
Oscar Kilhed
3cac27c611 Dashboards V2: Show banner when a dashboard has been converted from v2 to v1. (#114960)
* Show warning banner when converting from v2 to v1

* fix edit state

* Reset eslint-suppression.json

* Always show, update copy

* reset eslint-suppression.json

* Update copy

* reset eslint-suppression.json
2025-12-09 13:38:26 +01:00
Ryan McKinley
8cae4eb0af Investigations: avoid useoldmanifestkinds and remove unused status (#114903) 2025-12-09 15:11:07 +03:00
Ryan McKinley
3b6b828cd9 Merge remote-tracking branch 'origin/main' into remove-mysqlAnsiQuotes 2025-11-20 16:39:33 +03:00
Ryan McKinley
1ac1ebd057 fix tests 2025-11-18 23:01:44 +03:00
Ryan McKinley
ffd34a8bb6 remove mysqlAnsiQuotes FF 2025-11-18 22:40:04 +03:00
549 changed files with 15854 additions and 7705 deletions

.github/CODEOWNERS vendored

@@ -85,6 +85,7 @@
# Git Sync frontend owned by frontend team as a whole.
/apps/alerting/ @grafana/alerting-backend
/apps/quotas/ @grafana/grafana-search-and-storage
/apps/dashboard/ @grafana/grafana-app-platform-squad @grafana/dashboards-squad
/apps/folder/ @grafana/grafana-app-platform-squad
/apps/playlist/ @grafana/grafana-app-platform-squad


@@ -1226,5 +1226,13 @@
"addToProject": {
"url": "https://github.com/orgs/grafana/projects/69"
}
},
{
"type": "label",
"name": "area/suggestions",
"action": "addToProject",
"addToProject": {
"url": "https://github.com/orgs/grafana/projects/56"
}
}
]


@@ -469,5 +469,15 @@
"addToProject": {
"url": "https://github.com/orgs/grafana/projects/190"
}
},
{
"type": "changedfiles",
"matches": [
"public/app/features/panel/suggestions/**/*",
"public/app/plugins/panel/**/suggestions.ts",
"packages/grafana-data/src/types/suggestions*"
],
"action": "updateLabel",
"addLabel": "area/suggestions"
}
]


@@ -85,6 +85,7 @@ area/scenes
area/search
area/security
area/streaming
area/suggestions
area/templating/repeating
area/tooltip
area/transformations


@@ -33,6 +33,16 @@ jobs:
GCOM_TOKEN=ephemeral-instances-bot:gcom-token
REGISTRY=ephemeral-instances-bot:registry
GCP_SA_ACCOUNT_KEY_BASE64=ephemeral-instances-bot:sa-key
# Secrets placed in the ci/common/<path> path in Vault
common_secrets: |
DOCKERHUB_USERNAME=dockerhub:username
DOCKERHUB_PASSWORD=dockerhub:password
- name: Log in to Docker Hub to avoid unauthenticated image pull rate-limiting
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
with:
username: ${{ env.DOCKERHUB_USERNAME }}
password: ${{ env.DOCKERHUB_PASSWORD }}
- name: Generate a GitHub app installation token
id: generate_token


@@ -14,7 +14,7 @@ ARG JS_SRC=js-builder
# Dependabot cannot update dependencies listed in ARGs
# By using FROM instructions we can delegate dependency updates to dependabot
FROM alpine:3.22.2 AS alpine-base
FROM alpine:3.23.0 AS alpine-base
FROM ubuntu:22.04 AS ubuntu-base
FROM golang:1.25.5-alpine AS go-builder-base
FROM --platform=${JS_PLATFORM} node:24-alpine AS js-builder-base
@@ -93,6 +93,7 @@ COPY pkg/storage/unified/apistore pkg/storage/unified/apistore
COPY pkg/semconv pkg/semconv
COPY pkg/aggregator pkg/aggregator
COPY apps/playlist apps/playlist
COPY apps/quotas apps/quotas
COPY apps/plugins apps/plugins
COPY apps/shorturl apps/shorturl
COPY apps/annotation apps/annotation


@@ -224,6 +224,8 @@ github.com/grafana/alerting v0.0.0-20251204145817-de8c2bbf9eba h1:psKWNETD5nGxmF
github.com/grafana/alerting v0.0.0-20251204145817-de8c2bbf9eba/go.mod h1:l7v67cgP7x72ajB9UPZlumdrHqNztpKoqQ52cU8T3LU=
github.com/grafana/dskit v0.0.0-20250908063411-6b6da59b5cc4 h1:jSojuc7njleS3UOz223WDlXOinmuLAIPI0z2vtq8EgI=
github.com/grafana/dskit v0.0.0-20250908063411-6b6da59b5cc4/go.mod h1:VahT+GtfQIM+o8ht2StR6J9g+Ef+C2Vokh5uuSmOD/4=
github.com/grafana/grafana-app-sdk v0.48.5 h1:MS8l9fTZz+VbTfgApn09jw27GxhQ6fNOWGhC4ydvZmM=
github.com/grafana/grafana-app-sdk v0.48.5/go.mod h1:HJsMOSBmt/D/Ihs1SvagOwmXKi0coBMVHlfvdd+qe9Y=
github.com/grafana/grafana-app-sdk/logging v0.48.3 h1:72NUpGNiJXCNQz/on++YSsl38xuVYYBKv5kKQaOClX4=
github.com/grafana/grafana-app-sdk/logging v0.48.3/go.mod h1:Gh/nBWnspK3oDNWtiM5qUF/fardHzOIEez+SPI3JeHA=
github.com/grafana/loki/pkg/push v0.0.0-20250823105456-332df2b20000 h1:/5LKSYgLmAhwA4m6iGUD4w1YkydEWWjazn9qxCFT8W0=


@@ -8,9 +8,8 @@ import (
func (stars *StarsSpec) Add(group, kind, name string) {
for i, r := range stars.Resource {
if r.Group == group && r.Kind == kind {
r.Names = append(r.Names, name)
slices.Sort(r.Names)
stars.Resource[i].Names = slices.Compact(r.Names)
stars.Resource[i].Names = append(r.Names, name)
stars.Normalize()
return
}
}
@@ -46,8 +45,15 @@ func (stars *StarsSpec) Normalize() {
resources := make([]StarsResource, 0, len(stars.Resource))
for _, r := range stars.Resource {
if len(r.Names) > 0 {
slices.Sort(r.Names)
r.Names = slices.Compact(r.Names) // removes any duplicates
unique := make([]string, 0, len(r.Names))
found := make(map[string]bool, len(r.Names))
for _, name := range r.Names {
if !found[name] {
unique = append(unique, name)
found[name] = true
}
}
r.Names = unique
resources = append(resources, r)
}
}
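The Normalize diff above replaces `slices.Sort` + `slices.Compact` (which reordered names) with a first-occurrence dedup that keeps insertion order — which is why the updated test now expects "c" appended at the end rather than a sorted list. A minimal, self-contained sketch of that pattern, with a hypothetical helper name:

```go
package main

import "fmt"

// dedupPreserveOrder keeps the first occurrence of each name while
// preserving insertion order, mirroring the loop the diff adds to
// Normalize in place of slices.Sort + slices.Compact.
func dedupPreserveOrder(names []string) []string {
	unique := make([]string, 0, len(names))
	found := make(map[string]bool, len(names))
	for _, name := range names {
		if !found[name] {
			unique = append(unique, name)
			found[name] = true
		}
	}
	return unique
}

func main() {
	// prints [a b x c]: duplicates dropped, original order kept
	fmt.Println(dedupPreserveOrder([]string{"a", "b", "x", "b", "c"}))
}
```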


@@ -39,7 +39,7 @@ func TestStarsWrite(t *testing.T) {
Resource: []StarsResource{{
Group: "g",
Kind: "k",
Names: []string{"a", "b", "c", "x"}, // added "b" (and sorted)
Names: []string{"a", "b", "x", "c"}, // added c to the end
}},
},
}, {


@@ -57,6 +57,7 @@ require (
github.com/hashicorp/go-hclog v1.6.3 // indirect
github.com/hashicorp/go-multierror v1.1.1 // indirect
github.com/hashicorp/go-plugin v1.7.0 // indirect
github.com/hashicorp/golang-lru/v2 v2.0.7 // indirect
github.com/hashicorp/yamux v0.1.2 // indirect
github.com/jaegertracing/jaeger-idl v0.5.0 // indirect
github.com/josharian/intern v1.0.0 // indirect


@@ -112,6 +112,8 @@ github.com/hashicorp/go-multierror v1.1.1 h1:H5DkEtf6CXdFp0N0Em5UCwQpXMWke8IA0+l
github.com/hashicorp/go-multierror v1.1.1/go.mod h1:iw975J/qwKPdAO1clOe2L8331t/9/fmwbPZ6JB6eMoM=
github.com/hashicorp/go-plugin v1.7.0 h1:YghfQH/0QmPNc/AZMTFE3ac8fipZyZECHdDPshfk+mA=
github.com/hashicorp/go-plugin v1.7.0/go.mod h1:BExt6KEaIYx804z8k4gRzRLEvxKVb+kn0NMcihqOqb8=
github.com/hashicorp/golang-lru/v2 v2.0.7 h1:a+bsQ5rvGLjzHuww6tVxozPZFVghXaHOwFs4luLUK2k=
github.com/hashicorp/golang-lru/v2 v2.0.7/go.mod h1:QeFd9opnmA6QUJc5vARoKUSoFhyfM2/ZepoAG6RGpeM=
github.com/hashicorp/yamux v0.1.2 h1:XtB8kyFOyHXYVFnwT5C3+Bdo8gArse7j2AQ0DA0Uey8=
github.com/hashicorp/yamux v0.1.2/go.mod h1:C+zze2n6e/7wshOZep2A70/aQU6QBRWJO/G6FT1wIns=
github.com/jaegertracing/jaeger-idl v0.5.0 h1:zFXR5NL3Utu7MhPg8ZorxtCBjHrL3ReM1VoB65FOFGE=


@@ -768,6 +768,10 @@ VariableRefresh: *"never" | "onDashboardLoad" | "onTimeRangeChanged"
// Accepted values are `dontHide` (show label and value), `hideLabel` (show value only), `hideVariable` (show nothing).
VariableHide: *"dontHide" | "hideLabel" | "hideVariable"
// Determine whether regex applies to variable value or display text
// Accepted values are `value` (apply to value used in queries) or `text` (apply to display text shown to users)
VariableRegexApplyTo: *"value" | "text"
// Determine the origin of the adhoc variable filter
FilterOrigin: "dashboard"
@@ -803,6 +807,7 @@ QueryVariableSpec: {
datasource?: DataSourceRef
query: DataQueryKind
regex: string | *""
regexApplyTo?: VariableRegexApplyTo
sort: VariableSort
definition?: string
options: [...VariableOption] | *[]


@@ -772,6 +772,10 @@ VariableRefresh: *"never" | "onDashboardLoad" | "onTimeRangeChanged"
// Accepted values are `dontHide` (show label and value), `hideLabel` (show value only), `hideVariable` (show nothing), `inControlsMenu` (show in a drop-down menu).
VariableHide: *"dontHide" | "hideLabel" | "hideVariable" | "inControlsMenu"
// Determine whether regex applies to variable value or display text
// Accepted values are `value` (apply to value used in queries) or `text` (apply to display text shown to users)
VariableRegexApplyTo: *"value" | "text"
// Determine the origin of the adhoc variable filter
FilterOrigin: "dashboard"
@@ -806,6 +810,7 @@ QueryVariableSpec: {
description?: string
query: DataQueryKind
regex: string | *""
regexApplyTo?: VariableRegexApplyTo
sort: VariableSort
definition?: string
options: [...VariableOption] | *[]


@@ -222,6 +222,8 @@ lineage: schemas: [{
// Optional field, if you want to extract part of a series name or metric node segment.
// Named capture groups can be used to separate the display text and value.
regex?: string
// Determine whether regex applies to variable value or display text
regexApplyTo?: #VariableRegexApplyTo
// Additional static options for query variable
staticOptions?: [...#VariableOption]
// Ordering of static options in relation to options returned from data source for query variable
@@ -249,6 +251,10 @@ lineage: schemas: [{
// Accepted values are 0 (show label and value), 1 (show value only), 2 (show nothing), 3 (show under the controls dropdown menu).
#VariableHide: 0 | 1 | 2 | 3 @cuetsy(kind="enum",memberNames="dontHide|hideLabel|hideVariable|inControlsMenu") @grafana(TSVeneer="type")
// Determine whether regex applies to variable value or display text
// Accepted values are "value" (apply to value used in queries) or "text" (apply to display text shown to users)
#VariableRegexApplyTo: "value" | "text" @cuetsy(kind="type")
// Sort variable options
// Accepted values are:
// `0`: No sorting


@@ -222,6 +222,8 @@ lineage: schemas: [{
// Optional field, if you want to extract part of a series name or metric node segment.
// Named capture groups can be used to separate the display text and value.
regex?: string
// Determine whether regex applies to variable value or display text
regexApplyTo?: #VariableRegexApplyTo
// Additional static options for query variable
staticOptions?: [...#VariableOption]
// Ordering of static options in relation to options returned from data source for query variable
@@ -249,6 +251,10 @@ lineage: schemas: [{
// Accepted values are 0 (show label and value), 1 (show value only), 2 (show nothing), 3 (show under the controls dropdown menu).
#VariableHide: 0 | 1 | 2 | 3 @cuetsy(kind="enum",memberNames="dontHide|hideLabel|hideVariable|inControlsMenu") @grafana(TSVeneer="type")
// Determine whether regex applies to variable value or display text
// Accepted values are "value" (apply to value used in queries) or "text" (apply to display text shown to users)
#VariableRegexApplyTo: "value" | "text" @cuetsy(kind="type")
// Sort variable options
// Accepted values are:
// `0`: No sorting


@@ -772,6 +772,10 @@ VariableRefresh: *"never" | "onDashboardLoad" | "onTimeRangeChanged"
// Accepted values are `dontHide` (show label and value), `hideLabel` (show value only), `hideVariable` (show nothing).
VariableHide: *"dontHide" | "hideLabel" | "hideVariable"
// Determine whether regex applies to variable value or display text
// Accepted values are `value` (apply to value used in queries) or `text` (apply to display text shown to users)
VariableRegexApplyTo: *"value" | "text"
// Determine the origin of the adhoc variable filter
FilterOrigin: "dashboard"
@@ -807,6 +811,7 @@ QueryVariableSpec: {
datasource?: DataSourceRef
query: DataQueryKind
regex: string | *""
regexApplyTo?: VariableRegexApplyTo
sort: VariableSort
definition?: string
options: [...VariableOption] | *[]


@@ -1364,6 +1364,7 @@ type DashboardQueryVariableSpec struct {
Datasource *DashboardDataSourceRef `json:"datasource,omitempty"`
Query DashboardDataQueryKind `json:"query"`
Regex string `json:"regex"`
RegexApplyTo *DashboardVariableRegexApplyTo `json:"regexApplyTo,omitempty"`
Sort DashboardVariableSort `json:"sort"`
Definition *string `json:"definition,omitempty"`
Options []DashboardVariableOption `json:"options"`
@@ -1393,6 +1394,7 @@ func NewDashboardQueryVariableSpec() *DashboardQueryVariableSpec {
SkipUrlSync: false,
Query: *NewDashboardDataQueryKind(),
Regex: "",
RegexApplyTo: (func(input DashboardVariableRegexApplyTo) *DashboardVariableRegexApplyTo { return &input })(DashboardVariableRegexApplyToValue),
Options: []DashboardVariableOption{},
Multi: false,
IncludeAll: false,
@@ -1443,6 +1445,16 @@ const (
DashboardVariableRefreshOnTimeRangeChanged DashboardVariableRefresh = "onTimeRangeChanged"
)
// Determine whether regex applies to variable value or display text
// Accepted values are `value` (apply to value used in queries) or `text` (apply to display text shown to users)
// +k8s:openapi-gen=true
type DashboardVariableRegexApplyTo string
const (
DashboardVariableRegexApplyToValue DashboardVariableRegexApplyTo = "value"
DashboardVariableRegexApplyToText DashboardVariableRegexApplyTo = "text"
)
// Sort variable options
// Accepted values are:
// `disabled`: No sorting
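The new `DashboardVariableRegexApplyTo` type above lets a variable's regex target either the value used in queries or the display text shown to users, with `value` as the default. A hedged sketch of how such a branch might look — this is illustrative, not Grafana's actual variable option pipeline:

```go
package main

import (
	"fmt"
	"regexp"
)

// applyVariableRegex filters either the option's value or its display
// text, depending on applyTo ("value" is the default per the schema).
// If the pattern has a capture group, the first group is returned.
func applyVariableRegex(value, text, pattern, applyTo string) (string, error) {
	re, err := regexp.Compile(pattern)
	if err != nil {
		return "", err
	}
	target := value
	if applyTo == "text" {
		target = text
	}
	if m := re.FindStringSubmatch(target); m != nil {
		if len(m) > 1 {
			return m[1], nil // first capture group, if any
		}
		return m[0], nil
	}
	return "", nil
}

func main() {
	// Apply the regex to the display text instead of the query value.
	out, _ := applyVariableRegex("server-01.prod", "Server 01 (prod)", `\((\w+)\)`, "text")
	fmt.Println(out) // prints prod
}
```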


@@ -3646,6 +3646,12 @@ func schema_pkg_apis_dashboard_v2alpha1_DashboardQueryVariableSpec(ref common.Re
Format: "",
},
},
"regexApplyTo": {
SchemaProps: spec.SchemaProps{
Type: []string{"string"},
Format: "",
},
},
"sort": {
SchemaProps: spec.SchemaProps{
Default: "",


@@ -776,6 +776,10 @@ VariableRefresh: *"never" | "onDashboardLoad" | "onTimeRangeChanged"
// Accepted values are `dontHide` (show label and value), `hideLabel` (show value only), `hideVariable` (show nothing), `inControlsMenu` (show in a drop-down menu).
VariableHide: *"dontHide" | "hideLabel" | "hideVariable" | "inControlsMenu"
// Determine whether regex applies to variable value or display text
// Accepted values are `value` (apply to value used in queries) or `text` (apply to display text shown to users)
VariableRegexApplyTo: *"value" | "text"
// Determine the origin of the adhoc variable filter
FilterOrigin: "dashboard"
@@ -810,6 +814,7 @@ QueryVariableSpec: {
description?: string
query: DataQueryKind
regex: string | *""
regexApplyTo?: VariableRegexApplyTo
sort: VariableSort
definition?: string
options: [...VariableOption] | *[]


@@ -1367,6 +1367,7 @@ type DashboardQueryVariableSpec struct {
Description *string `json:"description,omitempty"`
Query DashboardDataQueryKind `json:"query"`
Regex string `json:"regex"`
RegexApplyTo *DashboardVariableRegexApplyTo `json:"regexApplyTo,omitempty"`
Sort DashboardVariableSort `json:"sort"`
Definition *string `json:"definition,omitempty"`
Options []DashboardVariableOption `json:"options"`
@@ -1396,6 +1397,7 @@ func NewDashboardQueryVariableSpec() *DashboardQueryVariableSpec {
SkipUrlSync: false,
Query: *NewDashboardDataQueryKind(),
Regex: "",
RegexApplyTo: (func(input DashboardVariableRegexApplyTo) *DashboardVariableRegexApplyTo { return &input })(DashboardVariableRegexApplyToValue),
Options: []DashboardVariableOption{},
Multi: false,
IncludeAll: false,
@@ -1447,6 +1449,16 @@ const (
DashboardVariableRefreshOnTimeRangeChanged DashboardVariableRefresh = "onTimeRangeChanged"
)
// Determine whether regex applies to variable value or display text
// Accepted values are `value` (apply to value used in queries) or `text` (apply to display text shown to users)
// +k8s:openapi-gen=true
type DashboardVariableRegexApplyTo string
const (
DashboardVariableRegexApplyToValue DashboardVariableRegexApplyTo = "value"
DashboardVariableRegexApplyToText DashboardVariableRegexApplyTo = "text"
)
// Sort variable options
// Accepted values are:
// `disabled`: No sorting

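The generated constructor above seeds the optional `RegexApplyTo` field with an immediately invoked function literal that takes the address of the enum constant. The idiom can be sketched in isolation; the type and constant names mirror the generated code, while the generic `ptr` helper is illustrative only (the generated code inlines the function literal instead):

```go
package main

import "fmt"

// Mirrors the generated string-backed enum.
type DashboardVariableRegexApplyTo string

const (
	DashboardVariableRegexApplyToValue DashboardVariableRegexApplyTo = "value"
	DashboardVariableRegexApplyToText  DashboardVariableRegexApplyTo = "text"
)

// ptr is an illustrative helper; the generated code inlines the same
// pattern as (func(input T) *T { return &input })(constant).
func ptr[T any](v T) *T { return &v }

func main() {
	// Optional fields are pointers so "unset" (nil, omitted by omitempty)
	// is distinguishable from the zero value; the constructor seeds the
	// documented default.
	regexApplyTo := ptr(DashboardVariableRegexApplyToValue)
	fmt.Println(*regexApplyTo) // value
}
```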
View File

@@ -3656,6 +3656,12 @@ func schema_pkg_apis_dashboard_v2beta1_DashboardQueryVariableSpec(ref common.Ref
Format: "",
},
},
"regexApplyTo": {
SchemaProps: spec.SchemaProps{
Type: []string{"string"},
Format: "",
},
},
"sort": {
SchemaProps: spec.SchemaProps{
Default: "",

File diff suppressed because one or more lines are too long

View File

@@ -12,13 +12,6 @@ import (
)
func RegisterConversions(s *runtime.Scheme, dsIndexProvider schemaversion.DataSourceIndexProvider, leIndexProvider schemaversion.LibraryElementIndexProvider) error {
// Wrap the provider once with 10s caching for all conversions.
// This prevents repeated DB queries across multiple conversion calls while allowing
// the cache to refresh periodically, making it suitable for long-lived singleton usage.
dsIndexProvider = schemaversion.WrapIndexProviderWithCache(dsIndexProvider)
// Wrap library element provider with caching as well
leIndexProvider = schemaversion.WrapLibraryElementProviderWithCache(leIndexProvider)
// v0 conversions
if err := s.AddConversionFunc((*dashv0.Dashboard)(nil), (*dashv1.Dashboard)(nil),
withConversionMetrics(dashv0.APIVERSION, dashv1.APIVERSION, func(a, b interface{}, scope conversion.Scope) error {
@@ -62,13 +55,13 @@ func RegisterConversions(s *runtime.Scheme, dsIndexProvider schemaversion.DataSo
// v2alpha1 conversions
if err := s.AddConversionFunc((*dashv2alpha1.Dashboard)(nil), (*dashv0.Dashboard)(nil),
withConversionMetrics(dashv2alpha1.APIVERSION, dashv0.APIVERSION, func(a, b interface{}, scope conversion.Scope) error {
return Convert_V2alpha1_to_V0(a.(*dashv2alpha1.Dashboard), b.(*dashv0.Dashboard), scope, dsIndexProvider)
return Convert_V2alpha1_to_V0(a.(*dashv2alpha1.Dashboard), b.(*dashv0.Dashboard), scope)
})); err != nil {
return err
}
if err := s.AddConversionFunc((*dashv2alpha1.Dashboard)(nil), (*dashv1.Dashboard)(nil),
withConversionMetrics(dashv2alpha1.APIVERSION, dashv1.APIVERSION, func(a, b interface{}, scope conversion.Scope) error {
return Convert_V2alpha1_to_V1beta1(a.(*dashv2alpha1.Dashboard), b.(*dashv1.Dashboard), scope, dsIndexProvider)
return Convert_V2alpha1_to_V1beta1(a.(*dashv2alpha1.Dashboard), b.(*dashv1.Dashboard), scope)
})); err != nil {
return err
}

View File

@@ -0,0 +1,454 @@
package conversion
import (
"context"
"sync/atomic"
"testing"
"time"
dashv0 "github.com/grafana/grafana/apps/dashboard/pkg/apis/dashboard/v0alpha1"
dashv1 "github.com/grafana/grafana/apps/dashboard/pkg/apis/dashboard/v1beta1"
dashv2alpha1 "github.com/grafana/grafana/apps/dashboard/pkg/apis/dashboard/v2alpha1"
dashv2beta1 "github.com/grafana/grafana/apps/dashboard/pkg/apis/dashboard/v2beta1"
"github.com/grafana/grafana/apps/dashboard/pkg/migration"
"github.com/grafana/grafana/apps/dashboard/pkg/migration/schemaversion"
common "github.com/grafana/grafana/pkg/apimachinery/apis/common/v0alpha1"
"github.com/stretchr/testify/assert"
"github.com/stretchr/testify/require"
metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
)
// countingDataSourceProvider tracks how many times Index() is called
type countingDataSourceProvider struct {
datasources []schemaversion.DataSourceInfo
callCount atomic.Int64
}
func newCountingDataSourceProvider(datasources []schemaversion.DataSourceInfo) *countingDataSourceProvider {
return &countingDataSourceProvider{
datasources: datasources,
}
}
func (p *countingDataSourceProvider) Index(_ context.Context) *schemaversion.DatasourceIndex {
p.callCount.Add(1)
return schemaversion.NewDatasourceIndex(p.datasources)
}
func (p *countingDataSourceProvider) getCallCount() int64 {
return p.callCount.Load()
}
// countingLibraryElementProvider tracks how many times GetLibraryElementInfo() is called
type countingLibraryElementProvider struct {
elements []schemaversion.LibraryElementInfo
callCount atomic.Int64
}
func newCountingLibraryElementProvider(elements []schemaversion.LibraryElementInfo) *countingLibraryElementProvider {
return &countingLibraryElementProvider{
elements: elements,
}
}
func (p *countingLibraryElementProvider) GetLibraryElementInfo(_ context.Context) []schemaversion.LibraryElementInfo {
p.callCount.Add(1)
return p.elements
}
func (p *countingLibraryElementProvider) getCallCount() int64 {
return p.callCount.Load()
}
// createTestV0Dashboard creates a minimal v0 dashboard for testing
// The dashboard has a datasource with UID only (no type) to force provider lookup
// and includes library panels to test library element provider caching
func createTestV0Dashboard(namespace, title string) *dashv0.Dashboard {
return &dashv0.Dashboard{
ObjectMeta: metav1.ObjectMeta{
Name: "test-dashboard",
Namespace: namespace,
},
Spec: common.Unstructured{
Object: map[string]interface{}{
"title": title,
"schemaVersion": schemaversion.LATEST_VERSION,
// Variables with datasource reference that requires lookup
"templating": map[string]interface{}{
"list": []interface{}{
map[string]interface{}{
"name": "query_var",
"type": "query",
"query": "label_values(up, job)",
// Datasource with UID only - type needs to be looked up
"datasource": map[string]interface{}{
"uid": "ds1",
// type is intentionally omitted to trigger provider lookup
},
},
},
},
"panels": []interface{}{
map[string]interface{}{
"id": 1,
"title": "Test Panel",
"type": "timeseries",
"targets": []interface{}{
map[string]interface{}{
// Datasource with UID only - type needs to be looked up
"datasource": map[string]interface{}{
"uid": "ds1",
},
},
},
},
// Library panel reference - triggers library element provider lookup
map[string]interface{}{
"id": 2,
"title": "Library Panel with Horizontal Repeat",
"type": "library-panel-ref",
"gridPos": map[string]interface{}{
"h": 8,
"w": 12,
"x": 0,
"y": 8,
},
"libraryPanel": map[string]interface{}{
"uid": "lib-panel-repeat-h",
"name": "Library Panel with Horizontal Repeat",
},
},
// Another library panel reference
map[string]interface{}{
"id": 3,
"title": "Library Panel without Repeat",
"type": "library-panel-ref",
"gridPos": map[string]interface{}{
"h": 3,
"w": 6,
"x": 0,
"y": 16,
},
"libraryPanel": map[string]interface{}{
"uid": "lib-panel-no-repeat",
"name": "Library Panel without Repeat",
},
},
},
},
},
}
}
// createTestV1Dashboard creates a minimal v1beta1 dashboard for testing
// The dashboard has a datasource with UID only (no type) to force provider lookup
// and includes library panels to test library element provider caching
func createTestV1Dashboard(namespace, title string) *dashv1.Dashboard {
return &dashv1.Dashboard{
ObjectMeta: metav1.ObjectMeta{
Name: "test-dashboard",
Namespace: namespace,
},
Spec: common.Unstructured{
Object: map[string]interface{}{
"title": title,
"schemaVersion": schemaversion.LATEST_VERSION,
// Variables with datasource reference that requires lookup
"templating": map[string]interface{}{
"list": []interface{}{
map[string]interface{}{
"name": "query_var",
"type": "query",
"query": "label_values(up, job)",
// Datasource with UID only - type needs to be looked up
"datasource": map[string]interface{}{
"uid": "ds1",
// type is intentionally omitted to trigger provider lookup
},
},
},
},
"panels": []interface{}{
map[string]interface{}{
"id": 1,
"title": "Test Panel",
"type": "timeseries",
"targets": []interface{}{
map[string]interface{}{
// Datasource with UID only - type needs to be looked up
"datasource": map[string]interface{}{
"uid": "ds1",
},
},
},
},
// Library panel reference - triggers library element provider lookup
map[string]interface{}{
"id": 2,
"title": "Library Panel with Vertical Repeat",
"type": "library-panel-ref",
"gridPos": map[string]interface{}{
"h": 4,
"w": 6,
"x": 0,
"y": 8,
},
"libraryPanel": map[string]interface{}{
"uid": "lib-panel-repeat-v",
"name": "Library Panel with Vertical Repeat",
},
},
// Another library panel reference
map[string]interface{}{
"id": 3,
"title": "Library Panel without Repeat",
"type": "library-panel-ref",
"gridPos": map[string]interface{}{
"h": 3,
"w": 6,
"x": 6,
"y": 8,
},
"libraryPanel": map[string]interface{}{
"uid": "lib-panel-no-repeat",
"name": "Library Panel without Repeat",
},
},
},
},
},
}
}
// TestConversionCaching_V0_to_V2alpha1 verifies caching works when converting V0 to V2alpha1
func TestConversionCaching_V0_to_V2alpha1(t *testing.T) {
datasources := []schemaversion.DataSourceInfo{
{UID: "ds1", Type: "prometheus", Name: "Prometheus", Default: true},
}
elements := []schemaversion.LibraryElementInfo{
{UID: "lib-panel-repeat-h", Name: "Library Panel with Horizontal Repeat", Type: "timeseries"},
{UID: "lib-panel-no-repeat", Name: "Library Panel without Repeat", Type: "graph"},
}
underlyingDS := newCountingDataSourceProvider(datasources)
underlyingLE := newCountingLibraryElementProvider(elements)
cachedDS := schemaversion.WrapIndexProviderWithCache(underlyingDS, time.Minute)
cachedLE := schemaversion.WrapLibraryElementProviderWithCache(underlyingLE, time.Minute)
migration.ResetForTesting()
migration.Initialize(cachedDS, cachedLE, migration.DefaultCacheTTL)
// Convert multiple dashboards in the same namespace
numDashboards := 5
namespace := "default"
for i := 0; i < numDashboards; i++ {
source := createTestV0Dashboard(namespace, "Dashboard "+string(rune('A'+i)))
target := &dashv2alpha1.Dashboard{}
err := Convert_V0_to_V2alpha1(source, target, nil, cachedDS, cachedLE)
require.NoError(t, err, "conversion %d should succeed", i)
require.NotNil(t, target.Spec)
}
// With caching, the underlying datasource provider should only be called once per namespace
// The test dashboard has datasources without type that require lookup
assert.Equal(t, int64(1), underlyingDS.getCallCount(),
"datasource provider should be called only once for %d conversions in same namespace", numDashboards)
// Library element provider should also be called only once per namespace due to caching
assert.Equal(t, int64(1), underlyingLE.getCallCount(),
"library element provider should be called only once for %d conversions in same namespace", numDashboards)
}
// TestConversionCaching_V0_to_V2beta1 verifies caching works when converting V0 to V2beta1
func TestConversionCaching_V0_to_V2beta1(t *testing.T) {
datasources := []schemaversion.DataSourceInfo{
{UID: "ds1", Type: "prometheus", Name: "Prometheus", Default: true},
}
elements := []schemaversion.LibraryElementInfo{
{UID: "lib-panel-repeat-h", Name: "Library Panel with Horizontal Repeat", Type: "timeseries"},
{UID: "lib-panel-no-repeat", Name: "Library Panel without Repeat", Type: "graph"},
}
underlyingDS := newCountingDataSourceProvider(datasources)
underlyingLE := newCountingLibraryElementProvider(elements)
cachedDS := schemaversion.WrapIndexProviderWithCache(underlyingDS, time.Minute)
cachedLE := schemaversion.WrapLibraryElementProviderWithCache(underlyingLE, time.Minute)
migration.ResetForTesting()
migration.Initialize(cachedDS, cachedLE, migration.DefaultCacheTTL)
numDashboards := 5
namespace := "default"
for i := 0; i < numDashboards; i++ {
source := createTestV0Dashboard(namespace, "Dashboard "+string(rune('A'+i)))
target := &dashv2beta1.Dashboard{}
err := Convert_V0_to_V2beta1(source, target, nil, cachedDS, cachedLE)
require.NoError(t, err, "conversion %d should succeed", i)
require.NotNil(t, target.Spec)
}
assert.Equal(t, int64(1), underlyingDS.getCallCount(),
"datasource provider should be called only once for %d conversions in same namespace", numDashboards)
assert.Equal(t, int64(1), underlyingLE.getCallCount(),
"library element provider should be called only once for %d conversions in same namespace", numDashboards)
}
// TestConversionCaching_V1beta1_to_V2alpha1 verifies caching works when converting V1beta1 to V2alpha1
func TestConversionCaching_V1beta1_to_V2alpha1(t *testing.T) {
datasources := []schemaversion.DataSourceInfo{
{UID: "ds1", Type: "prometheus", Name: "Prometheus", Default: true},
}
elements := []schemaversion.LibraryElementInfo{
{UID: "lib-panel-repeat-v", Name: "Library Panel with Vertical Repeat", Type: "timeseries"},
{UID: "lib-panel-no-repeat", Name: "Library Panel without Repeat", Type: "graph"},
}
underlyingDS := newCountingDataSourceProvider(datasources)
underlyingLE := newCountingLibraryElementProvider(elements)
cachedDS := schemaversion.WrapIndexProviderWithCache(underlyingDS, time.Minute)
cachedLE := schemaversion.WrapLibraryElementProviderWithCache(underlyingLE, time.Minute)
migration.ResetForTesting()
migration.Initialize(cachedDS, cachedLE, migration.DefaultCacheTTL)
numDashboards := 5
namespace := "default"
for i := 0; i < numDashboards; i++ {
source := createTestV1Dashboard(namespace, "Dashboard "+string(rune('A'+i)))
target := &dashv2alpha1.Dashboard{}
err := Convert_V1beta1_to_V2alpha1(source, target, nil, cachedDS, cachedLE)
require.NoError(t, err, "conversion %d should succeed", i)
require.NotNil(t, target.Spec)
}
assert.Equal(t, int64(1), underlyingDS.getCallCount(),
"datasource provider should be called only once for %d conversions in same namespace", numDashboards)
assert.Equal(t, int64(1), underlyingLE.getCallCount(),
"library element provider should be called only once for %d conversions in same namespace", numDashboards)
}
// TestConversionCaching_V1beta1_to_V2beta1 verifies caching works when converting V1beta1 to V2beta1
func TestConversionCaching_V1beta1_to_V2beta1(t *testing.T) {
datasources := []schemaversion.DataSourceInfo{
{UID: "ds1", Type: "prometheus", Name: "Prometheus", Default: true},
}
elements := []schemaversion.LibraryElementInfo{
{UID: "lib-panel-repeat-v", Name: "Library Panel with Vertical Repeat", Type: "timeseries"},
{UID: "lib-panel-no-repeat", Name: "Library Panel without Repeat", Type: "graph"},
}
underlyingDS := newCountingDataSourceProvider(datasources)
underlyingLE := newCountingLibraryElementProvider(elements)
cachedDS := schemaversion.WrapIndexProviderWithCache(underlyingDS, time.Minute)
cachedLE := schemaversion.WrapLibraryElementProviderWithCache(underlyingLE, time.Minute)
migration.ResetForTesting()
migration.Initialize(cachedDS, cachedLE, migration.DefaultCacheTTL)
numDashboards := 5
namespace := "default"
for i := 0; i < numDashboards; i++ {
source := createTestV1Dashboard(namespace, "Dashboard "+string(rune('A'+i)))
target := &dashv2beta1.Dashboard{}
err := Convert_V1beta1_to_V2beta1(source, target, nil, cachedDS, cachedLE)
require.NoError(t, err, "conversion %d should succeed", i)
require.NotNil(t, target.Spec)
}
assert.Equal(t, int64(1), underlyingDS.getCallCount(),
"datasource provider should be called only once for %d conversions in same namespace", numDashboards)
assert.Equal(t, int64(1), underlyingLE.getCallCount(),
"library element provider should be called only once for %d conversions in same namespace", numDashboards)
}
// TestConversionCaching_MultipleNamespaces verifies that different namespaces get separate cache entries
func TestConversionCaching_MultipleNamespaces(t *testing.T) {
datasources := []schemaversion.DataSourceInfo{
{UID: "ds1", Type: "prometheus", Name: "Prometheus", Default: true},
}
elements := []schemaversion.LibraryElementInfo{
{UID: "lib-panel-repeat-h", Name: "Library Panel with Horizontal Repeat", Type: "timeseries"},
{UID: "lib-panel-no-repeat", Name: "Library Panel without Repeat", Type: "graph"},
}
underlyingDS := newCountingDataSourceProvider(datasources)
underlyingLE := newCountingLibraryElementProvider(elements)
cachedDS := schemaversion.WrapIndexProviderWithCache(underlyingDS, time.Minute)
cachedLE := schemaversion.WrapLibraryElementProviderWithCache(underlyingLE, time.Minute)
migration.ResetForTesting()
migration.Initialize(cachedDS, cachedLE, migration.DefaultCacheTTL)
namespaces := []string{"default", "org-2", "org-3"}
numDashboardsPerNs := 3
for _, ns := range namespaces {
for i := 0; i < numDashboardsPerNs; i++ {
source := createTestV0Dashboard(ns, "Dashboard "+string(rune('A'+i)))
target := &dashv2alpha1.Dashboard{}
err := Convert_V0_to_V2alpha1(source, target, nil, cachedDS, cachedLE)
require.NoError(t, err, "conversion for namespace %s should succeed", ns)
}
}
// With caching, each namespace should result in one call to the underlying provider
expectedCalls := int64(len(namespaces))
assert.Equal(t, expectedCalls, underlyingDS.getCallCount(),
"datasource provider should be called once per namespace (%d namespaces)", len(namespaces))
assert.Equal(t, expectedCalls, underlyingLE.getCallCount(),
"library element provider should be called once per namespace (%d namespaces)", len(namespaces))
}
// TestConversionCaching_CacheDisabled verifies that TTL=0 disables caching
func TestConversionCaching_CacheDisabled(t *testing.T) {
datasources := []schemaversion.DataSourceInfo{
{UID: "ds1", Type: "prometheus", Name: "Prometheus", Default: true},
}
elements := []schemaversion.LibraryElementInfo{
{UID: "lib-panel-repeat-h", Name: "Library Panel with Horizontal Repeat", Type: "timeseries"},
{UID: "lib-panel-no-repeat", Name: "Library Panel without Repeat", Type: "graph"},
}
underlyingDS := newCountingDataSourceProvider(datasources)
underlyingLE := newCountingLibraryElementProvider(elements)
// TTL of 0 should disable caching - the wrapper returns the underlying provider directly
cachedDS := schemaversion.WrapIndexProviderWithCache(underlyingDS, 0)
cachedLE := schemaversion.WrapLibraryElementProviderWithCache(underlyingLE, 0)
migration.ResetForTesting()
migration.Initialize(cachedDS, cachedLE, migration.DefaultCacheTTL)
numDashboards := 3
namespace := "default"
for i := 0; i < numDashboards; i++ {
source := createTestV0Dashboard(namespace, "Dashboard "+string(rune('A'+i)))
target := &dashv2alpha1.Dashboard{}
err := Convert_V0_to_V2alpha1(source, target, nil, cachedDS, cachedLE)
require.NoError(t, err, "conversion %d should succeed", i)
}
// Without caching, each conversion calls the underlying provider multiple times
// (once for each datasource lookup needed - variables and panels)
// The key check is that the count is GREATER than 1 per conversion (no caching benefit)
assert.Greater(t, underlyingDS.getCallCount(), int64(numDashboards),
"with cache disabled, conversions should call datasource provider multiple times")
// Library element provider is also called for each conversion without caching
assert.GreaterOrEqual(t, underlyingLE.getCallCount(), int64(numDashboards),
"with cache disabled, conversions should call library element provider multiple times")
}

View File

@@ -488,7 +488,7 @@ func withConversionDataLossDetection(sourceFuncName, targetFuncName string, conv
// Detect if data was lost
if dataLossErr := detectConversionDataLoss(sourceStats, targetStats, sourceFuncName, targetFuncName); dataLossErr != nil {
logger.Error("Dashboard conversion data loss detected",
getLogger().Error("Dashboard conversion data loss detected",
"sourceFunc", sourceFuncName,
"targetFunc", targetFuncName,
"sourcePanels", sourceStats.panelCount,
@@ -504,7 +504,7 @@ func withConversionDataLossDetection(sourceFuncName, targetFuncName string, conv
return dataLossErr
}
logger.Debug("Dashboard conversion completed without data loss",
getLogger().Debug("Dashboard conversion completed without data loss",
"sourceFunc", sourceFuncName,
"targetFunc", targetFuncName,
"panels", targetStats.panelCount,

View File

@@ -829,7 +829,7 @@ func TestDataLossDetectionOnAllInputFiles(t *testing.T) {
// Initialize the migrator with a test data source provider
dsProvider := testutil.NewDataSourceProvider(testutil.StandardTestConfig)
leProvider := testutil.NewLibraryElementProvider()
migration.Initialize(dsProvider, leProvider)
migration.Initialize(dsProvider, leProvider, migration.DefaultCacheTTL)
// Set up conversion scheme
scheme := runtime.NewScheme()

View File

@@ -35,7 +35,7 @@ func TestConversionMatrixExist(t *testing.T) {
dsProvider := migrationtestutil.NewDataSourceProvider(migrationtestutil.StandardTestConfig)
// Use TestLibraryElementProvider for tests that need library panel models with repeat options
leProvider := migrationtestutil.NewTestLibraryElementProvider()
migration.Initialize(dsProvider, leProvider)
migration.Initialize(dsProvider, leProvider, migration.DefaultCacheTTL)
versions := []metav1.Object{
&dashv0.Dashboard{Spec: common.Unstructured{Object: map[string]any{"title": "dashboardV0"}}},
@@ -89,7 +89,7 @@ func TestDashboardConversionToAllVersions(t *testing.T) {
dsProvider := migrationtestutil.NewDataSourceProvider(migrationtestutil.StandardTestConfig)
// Use TestLibraryElementProvider for tests that need library panel models with repeat options
leProvider := migrationtestutil.NewTestLibraryElementProvider()
migration.Initialize(dsProvider, leProvider)
migration.Initialize(dsProvider, leProvider, migration.DefaultCacheTTL)
// Set up conversion scheme
scheme := runtime.NewScheme()
@@ -309,7 +309,7 @@ func TestMigratedDashboardsConversion(t *testing.T) {
dsProvider := migrationtestutil.NewDataSourceProvider(migrationtestutil.StandardTestConfig)
// Use TestLibraryElementProvider for tests that need library panel models with repeat options
leProvider := migrationtestutil.NewTestLibraryElementProvider()
migration.Initialize(dsProvider, leProvider)
migration.Initialize(dsProvider, leProvider, migration.DefaultCacheTTL)
// Set up conversion scheme
scheme := runtime.NewScheme()
@@ -428,7 +428,7 @@ func setupTestConversionScheme(t *testing.T) *runtime.Scheme {
t.Helper()
dsProvider := migrationtestutil.NewDataSourceProvider(migrationtestutil.StandardTestConfig)
leProvider := migrationtestutil.NewLibraryElementProvider()
migration.Initialize(dsProvider, leProvider)
migration.Initialize(dsProvider, leProvider, migration.DefaultCacheTTL)
scheme := runtime.NewScheme()
err := RegisterConversions(scheme, dsProvider, leProvider)
@@ -527,7 +527,7 @@ func TestConversionMetrics(t *testing.T) {
dsProvider := migrationtestutil.NewDataSourceProvider(migrationtestutil.StandardTestConfig)
// Use TestLibraryElementProvider for tests that need library panel models with repeat options
leProvider := migrationtestutil.NewTestLibraryElementProvider()
migration.Initialize(dsProvider, leProvider)
migration.Initialize(dsProvider, leProvider, migration.DefaultCacheTTL)
// Create a test registry for metrics
registry := prometheus.NewRegistry()
@@ -694,7 +694,7 @@ func TestConversionMetricsWrapper(t *testing.T) {
dsProvider := migrationtestutil.NewDataSourceProvider(migrationtestutil.StandardTestConfig)
// Use TestLibraryElementProvider for tests that need library panel models with repeat options
leProvider := migrationtestutil.NewTestLibraryElementProvider()
migration.Initialize(dsProvider, leProvider)
migration.Initialize(dsProvider, leProvider, migration.DefaultCacheTTL)
// Create a test registry for metrics
registry := prometheus.NewRegistry()
@@ -864,7 +864,7 @@ func TestSchemaVersionExtraction(t *testing.T) {
dsProvider := migrationtestutil.NewDataSourceProvider(migrationtestutil.StandardTestConfig)
// Use TestLibraryElementProvider for tests that need library panel models with repeat options
leProvider := migrationtestutil.NewTestLibraryElementProvider()
migration.Initialize(dsProvider, leProvider)
migration.Initialize(dsProvider, leProvider, migration.DefaultCacheTTL)
// Create a test registry for metrics
registry := prometheus.NewRegistry()
@@ -910,7 +910,7 @@ func TestConversionLogging(t *testing.T) {
dsProvider := migrationtestutil.NewDataSourceProvider(migrationtestutil.StandardTestConfig)
// Use TestLibraryElementProvider for tests that need library panel models with repeat options
leProvider := migrationtestutil.NewTestLibraryElementProvider()
migration.Initialize(dsProvider, leProvider)
migration.Initialize(dsProvider, leProvider, migration.DefaultCacheTTL)
// Create a test registry for metrics
registry := prometheus.NewRegistry()
@@ -1003,7 +1003,7 @@ func TestConversionLogLevels(t *testing.T) {
dsProvider := migrationtestutil.NewDataSourceProvider(migrationtestutil.StandardTestConfig)
// Use TestLibraryElementProvider for tests that need library panel models with repeat options
leProvider := migrationtestutil.NewTestLibraryElementProvider()
migration.Initialize(dsProvider, leProvider)
migration.Initialize(dsProvider, leProvider, migration.DefaultCacheTTL)
t.Run("log levels and structured fields verification", func(t *testing.T) {
// Create test wrapper to verify logging behavior
@@ -1076,7 +1076,7 @@ func TestConversionLoggingFields(t *testing.T) {
dsProvider := migrationtestutil.NewDataSourceProvider(migrationtestutil.StandardTestConfig)
// Use TestLibraryElementProvider for tests that need library panel models with repeat options
leProvider := migrationtestutil.NewTestLibraryElementProvider()
migration.Initialize(dsProvider, leProvider)
migration.Initialize(dsProvider, leProvider, migration.DefaultCacheTTL)
t.Run("verify all log fields are present", func(t *testing.T) {
// Test that the conversion wrapper includes all expected structured fields

View File

@@ -17,7 +17,9 @@ import (
"github.com/grafana/grafana/apps/dashboard/pkg/migration/schemaversion"
)
var logger = logging.DefaultLogger.With("logger", "dashboard.conversion")
func getLogger() logging.Logger {
return logging.DefaultLogger.With("logger", "dashboard.conversion")
}
// getErroredSchemaVersionFunc determines the schema version function that errored
func getErroredSchemaVersionFunc(err error) string {
@@ -197,9 +199,9 @@ func withConversionMetrics(sourceVersionAPI, targetVersionAPI string, conversion
)
if errorType == "schema_minimum_version_error" {
logger.Warn("Dashboard conversion failed", logFields...)
getLogger().Warn("Dashboard conversion failed", logFields...)
} else {
logger.Error("Dashboard conversion failed", logFields...)
getLogger().Error("Dashboard conversion failed", logFields...)
}
} else {
// Record success metrics
@@ -235,7 +237,7 @@ func withConversionMetrics(sourceVersionAPI, targetVersionAPI string, conversion
)
}
logger.Debug("Dashboard conversion succeeded", successLogFields...)
getLogger().Debug("Dashboard conversion succeeded", successLogFields...)
}
return nil
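The change in this file swaps a package-level `var logger = logging.DefaultLogger.With(...)` for a `getLogger()` call at each log site. The difference is binding time: a package-level variable captures the logger once at package init, while the getter re-reads `logging.DefaultLogger` on every call, so a backend configured after init is picked up. A toy sketch of the two patterns, with illustrative names standing in for the real `logging` package:

```go
package main

import "fmt"

// defaultSink stands in for a process-wide logging backend that may be
// replaced after this package's init has already run.
var defaultSink = "stderr"

// eagerLogger captures the sink at package init and never sees later changes.
var eagerLogger = defaultSink

// getLogger reads the sink at call time, so reconfiguration takes effect.
func getLogger() string { return defaultSink }

func main() {
	defaultSink = "json" // backend reconfigured after init
	fmt.Println(eagerLogger, getLogger()) // stderr json
}
```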

View File

@@ -76,9 +76,9 @@
"barGlow": false,
"centerGlow": false,
"rounded": true,
"spotlight": false
"spotlight": false,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -155,9 +155,9 @@
"barGlow": false,
"centerGlow": true,
"rounded": true,
"spotlight": false
"spotlight": false,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -234,9 +234,9 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": false
"spotlight": false,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -313,9 +313,9 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": true
"spotlight": true,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -392,9 +392,9 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": true
"spotlight": true,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -471,9 +471,9 @@
"barGlow": true,
"centerGlow": true,
"rounded": false,
"spotlight": true
"spotlight": true,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -550,9 +550,9 @@
"barGlow": true,
"centerGlow": true,
"rounded": false,
"spotlight": true
"spotlight": true,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -642,9 +642,9 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": true
"spotlight": true,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -721,9 +721,9 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": true
"spotlight": true,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -800,9 +800,9 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": true
"spotlight": true,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -879,9 +879,9 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": true
"spotlight": true,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -975,9 +975,9 @@
"barGlow": false,
"centerGlow": false,
"rounded": false,
"spotlight": false
"spotlight": false,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1054,9 +1054,9 @@
"barGlow": false,
"centerGlow": false,
"rounded": false,
"spotlight": false
"spotlight": false,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1133,9 +1133,9 @@
"barGlow": false,
"centerGlow": false,
"rounded": false,
"spotlight": false
"spotlight": false,
"gradient": true
},
"gradient": "auto",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1212,9 +1212,9 @@
"barGlow": false,
"centerGlow": false,
"rounded": false,
"spotlight": false
"spotlight": false,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1291,9 +1291,9 @@
"barGlow": false,
"centerGlow": false,
"rounded": false,
"spotlight": false
"spotlight": false,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1387,9 +1387,9 @@
"barGlow": false,
"centerGlow": false,
"rounded": false,
"spotlight": false
"spotlight": false,
"gradient": true
},
"gradient": "auto",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1470,9 +1470,9 @@
"barGlow": false,
"centerGlow": false,
"rounded": false,
"spotlight": false
"spotlight": false,
"gradient": true
},
"gradient": "auto",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1553,9 +1553,9 @@
"barGlow": false,
"centerGlow": false,
"rounded": false,
"spotlight": false
"spotlight": false,
"gradient": true
},
"gradient": "auto",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1645,10 +1645,10 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": true
"spotlight": true,
"gradient": true
},
"glow": "both",
"gradient": "auto",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1731,10 +1731,10 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": true
"spotlight": true,
"gradient": true
},
"glow": "both",
"gradient": "scheme",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1831,10 +1831,10 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": true
"spotlight": true,
"gradient": true
},
"glow": "both",
"gradient": "scheme",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1919,10 +1919,10 @@
"centerGlow": true,
"rounded": true,
"sparkline": false,
"spotlight": true
"spotlight": true,
"gradient": true
},
"glow": "both",
"gradient": "scheme",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -2005,10 +2005,10 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": true
"spotlight": true,
"gradient": true
},
"glow": "both",
"gradient": "hue",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -2091,10 +2091,10 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": true
"spotlight": true,
"gradient": true
},
"glow": "both",
"gradient": "hue",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -2147,4 +2147,4 @@
"title": "Panel tests - Gauge (new)",
"uid": "panel-tests-gauge-new",
"weekStart": ""
}
}

View File

@@ -956,9 +956,9 @@
"barGlow": false,
"centerGlow": false,
"rounded": false,
"spotlight": false
"spotlight": false,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1162,4 +1162,4 @@
"title": "Panel tests - Old gauge to new",
"uid": "panel-tests-old-gauge-to-new",
"weekStart": ""
}
}

View File

@@ -42,7 +42,7 @@
"regex": "",
"skipUrlSync": false,
"refresh": 1
},
},
{
"name": "query_var",
"type": "query",
@@ -81,6 +81,7 @@
"allValue": ".*",
"multi": true,
"regex": "/.*9090.*/",
"regexApplyTo": "text",
"skipUrlSync": false,
"refresh": 2,
"sort": 1,
@@ -107,7 +108,7 @@
},
{
"selected": false,
"text": "staging",
"text": "staging",
"value": "staging"
},
{
@@ -335,6 +336,7 @@
"allValue": "*",
"multi": true,
"regex": "/host[0-9]+/",
"regexApplyTo": "value",
"skipUrlSync": false,
"refresh": 1,
"sort": 2,
@@ -354,4 +356,4 @@
},
"links": []
}
}
}

View File

@@ -82,9 +82,9 @@
"barGlow": false,
"centerGlow": false,
"rounded": true,
"spotlight": false
"spotlight": false,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -161,9 +161,9 @@
"barGlow": false,
"centerGlow": true,
"rounded": true,
"spotlight": false
"spotlight": false,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -240,9 +240,9 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": false
"spotlight": false,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -319,9 +319,9 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": true
"spotlight": true,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -398,9 +398,9 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": true
"spotlight": true,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -477,9 +477,9 @@
"barGlow": true,
"centerGlow": true,
"rounded": false,
"spotlight": true
"spotlight": true,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -556,9 +556,9 @@
"barGlow": true,
"centerGlow": true,
"rounded": false,
"spotlight": true
"spotlight": true,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -648,9 +648,9 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": true
"spotlight": true,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -727,9 +727,9 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": true
"spotlight": true,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -806,9 +806,9 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": true
"spotlight": true,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -885,9 +885,9 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": true
"spotlight": true,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -981,9 +981,9 @@
"barGlow": false,
"centerGlow": false,
"rounded": false,
"spotlight": false
"spotlight": false,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1060,9 +1060,9 @@
"barGlow": false,
"centerGlow": false,
"rounded": false,
"spotlight": false
"spotlight": false,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1139,9 +1139,9 @@
"barGlow": false,
"centerGlow": false,
"rounded": false,
"spotlight": false
"spotlight": false,
"gradient": true
},
"gradient": "auto",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1218,9 +1218,9 @@
"barGlow": false,
"centerGlow": false,
"rounded": false,
"spotlight": false
"spotlight": false,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1297,9 +1297,9 @@
"barGlow": false,
"centerGlow": false,
"rounded": false,
"spotlight": false
"spotlight": false,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1393,9 +1393,9 @@
"barGlow": false,
"centerGlow": false,
"rounded": false,
"spotlight": false
"spotlight": false,
"gradient": true
},
"gradient": "auto",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1476,9 +1476,9 @@
"barGlow": false,
"centerGlow": false,
"rounded": false,
"spotlight": false
"spotlight": false,
"gradient": true
},
"gradient": "auto",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1559,9 +1559,9 @@
"barGlow": false,
"centerGlow": false,
"rounded": false,
"spotlight": false
"spotlight": false,
"gradient": true
},
"gradient": "auto",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1651,10 +1651,10 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": true
"spotlight": true,
"gradient": true
},
"glow": "both",
"gradient": "auto",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1737,10 +1737,10 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": true
"spotlight": true,
"gradient": true
},
"glow": "both",
"gradient": "scheme",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1837,10 +1837,10 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": true
"spotlight": true,
"gradient": true
},
"glow": "both",
"gradient": "scheme",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1925,10 +1925,10 @@
"centerGlow": true,
"rounded": true,
"sparkline": false,
"spotlight": true
"spotlight": true,
"gradient": true
},
"glow": "both",
"gradient": "scheme",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -2011,10 +2011,10 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": true
"spotlight": true,
"gradient": true
},
"glow": "both",
"gradient": "hue",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -2097,10 +2097,10 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": true
"spotlight": true,
"gradient": true
},
"glow": "both",
"gradient": "hue",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -2160,4 +2160,4 @@
"storedVersion": "v0alpha1"
}
}
}
}

View File

@@ -78,9 +78,9 @@
"barGlow": false,
"centerGlow": false,
"rounded": true,
"spotlight": false
"spotlight": false,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -171,10 +171,10 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": true
"spotlight": true,
"gradient": true
},
"glow": "both",
"gradient": "scheme",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -268,10 +268,10 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": true
"spotlight": true,
"gradient": true
},
"glow": "both",
"gradient": "scheme",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -367,10 +367,10 @@
"centerGlow": true,
"rounded": true,
"sparkline": false,
"spotlight": true
"spotlight": true,
"gradient": true
},
"glow": "both",
"gradient": "scheme",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -465,10 +465,10 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": true
"spotlight": true,
"gradient": true
},
"glow": "both",
"gradient": "hue",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -562,10 +562,10 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": true
"spotlight": true,
"gradient": true
},
"glow": "both",
"gradient": "hue",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -658,9 +658,9 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": true
"spotlight": true,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -750,9 +750,9 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": true
"spotlight": true,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -842,9 +842,9 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": true
"spotlight": true,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -934,9 +934,9 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": true
"spotlight": true,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1026,9 +1026,9 @@
"barGlow": true,
"centerGlow": true,
"rounded": false,
"spotlight": true
"spotlight": true,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1118,9 +1118,9 @@
"barGlow": true,
"centerGlow": true,
"rounded": false,
"spotlight": true
"spotlight": true,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1206,9 +1206,9 @@
"barGlow": false,
"centerGlow": false,
"rounded": false,
"spotlight": false
"spotlight": false,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1298,9 +1298,9 @@
"barGlow": false,
"centerGlow": false,
"rounded": false,
"spotlight": false
"spotlight": false,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1390,9 +1390,9 @@
"barGlow": false,
"centerGlow": false,
"rounded": false,
"spotlight": false
"spotlight": false,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1482,9 +1482,9 @@
"barGlow": false,
"centerGlow": false,
"rounded": false,
"spotlight": false
"spotlight": false,
"gradient": true
},
"gradient": "auto",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1578,9 +1578,9 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": false
"spotlight": false,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1666,9 +1666,9 @@
"barGlow": false,
"centerGlow": false,
"rounded": false,
"spotlight": false
"spotlight": false,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1758,9 +1758,9 @@
"barGlow": false,
"centerGlow": false,
"rounded": false,
"spotlight": false
"spotlight": false,
"gradient": true
},
"gradient": "auto",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1854,9 +1854,9 @@
"barGlow": false,
"centerGlow": false,
"rounded": false,
"spotlight": false
"spotlight": false,
"gradient": true
},
"gradient": "auto",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1950,9 +1950,9 @@
"barGlow": false,
"centerGlow": false,
"rounded": false,
"spotlight": false
"spotlight": false,
"gradient": true
},
"gradient": "auto",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -2050,9 +2050,9 @@
"barGlow": false,
"centerGlow": true,
"rounded": true,
"spotlight": false
"spotlight": false,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -2142,9 +2142,9 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": true
"spotlight": true,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -2234,9 +2234,9 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": true
"spotlight": true,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -2327,10 +2327,10 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": true
"spotlight": true,
"gradient": true
},
"glow": "both",
"gradient": "auto",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -2821,4 +2821,4 @@
"variables": []
},
"status": {}
}
}

View File

@@ -82,9 +82,9 @@
"barGlow": false,
"centerGlow": false,
"rounded": true,
"spotlight": false
"spotlight": false,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -178,10 +178,10 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": true
"spotlight": true,
"gradient": true
},
"glow": "both",
"gradient": "scheme",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -278,10 +278,10 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": true
"spotlight": true,
"gradient": true
},
"glow": "both",
"gradient": "scheme",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -380,10 +380,10 @@
"centerGlow": true,
"rounded": true,
"sparkline": false,
"spotlight": true
"spotlight": true,
"gradient": true
},
"glow": "both",
"gradient": "scheme",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -481,10 +481,10 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": true
"spotlight": true,
"gradient": true
},
"glow": "both",
"gradient": "hue",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -581,10 +581,10 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": true
"spotlight": true,
"gradient": true
},
"glow": "both",
"gradient": "hue",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -680,9 +680,9 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": true
"spotlight": true,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -775,9 +775,9 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": true
"spotlight": true,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -870,9 +870,9 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": true
"spotlight": true,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -965,9 +965,9 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": true
"spotlight": true,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1060,9 +1060,9 @@
"barGlow": true,
"centerGlow": true,
"rounded": false,
"spotlight": true
"spotlight": true,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1155,9 +1155,9 @@
"barGlow": true,
"centerGlow": true,
"rounded": false,
"spotlight": true
"spotlight": true,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1246,9 +1246,9 @@
"barGlow": false,
"centerGlow": false,
"rounded": false,
"spotlight": false
"spotlight": false,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1341,9 +1341,9 @@
"barGlow": false,
"centerGlow": false,
"rounded": false,
"spotlight": false
"spotlight": false,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1436,9 +1436,9 @@
"barGlow": false,
"centerGlow": false,
"rounded": false,
"spotlight": false
"spotlight": false,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1531,9 +1531,9 @@
"barGlow": false,
"centerGlow": false,
"rounded": false,
"spotlight": false
"spotlight": false,
"gradient": true
},
"gradient": "auto",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1630,9 +1630,9 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": false
"spotlight": false,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1721,9 +1721,9 @@
"barGlow": false,
"centerGlow": false,
"rounded": false,
"spotlight": false
"spotlight": false,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1816,9 +1816,9 @@
"barGlow": false,
"centerGlow": false,
"rounded": false,
"spotlight": false
"spotlight": false,
"gradient": true
},
"gradient": "auto",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1915,9 +1915,9 @@
"barGlow": false,
"centerGlow": false,
"rounded": false,
"spotlight": false
"spotlight": false,
"gradient": true
},
"gradient": "auto",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -2014,9 +2014,9 @@
"barGlow": false,
"centerGlow": false,
"rounded": false,
"spotlight": false
"spotlight": false,
"gradient": true
},
"gradient": "auto",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -2117,9 +2117,9 @@
"barGlow": false,
"centerGlow": true,
"rounded": true,
"spotlight": false
"spotlight": false,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -2212,9 +2212,9 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": true
"spotlight": true,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -2307,9 +2307,9 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": true
"spotlight": true,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -2403,10 +2403,10 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": true
"spotlight": true,
"gradient": true
},
"glow": "both",
"gradient": "auto",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -2897,4 +2897,4 @@
"variables": []
},
"status": {}
}
}

View File

@@ -962,9 +962,9 @@
"barGlow": false,
"centerGlow": false,
"rounded": false,
"spotlight": false
"spotlight": false,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1175,4 +1175,4 @@
"storedVersion": "v0alpha1"
}
}
}
}

View File

@@ -865,9 +865,9 @@
"barGlow": false,
"centerGlow": false,
"rounded": false,
"spotlight": false
"spotlight": false,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1615,4 +1615,4 @@
"variables": []
},
"status": {}
}
}

View File

@@ -902,9 +902,9 @@
"barGlow": false,
"centerGlow": false,
"rounded": false,
"spotlight": false
"spotlight": false,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1667,4 +1667,4 @@
"variables": []
},
"status": {}
}
}

View File

@@ -94,6 +94,7 @@
"query": "label_values(up, instance)",
"refresh": 2,
"regex": "/.*9090.*/",
"regexApplyTo": "text",
"skipUrlSync": false,
"sort": 1,
"tagValuesQuery": "",
@@ -362,6 +363,7 @@
},
"refresh": 1,
"regex": "/host[0-9]+/",
"regexApplyTo": "value",
"skipUrlSync": false,
"sort": 2,
"tagValuesQuery": "",

View File

@@ -110,6 +110,7 @@
}
},
"regex": "/.*9090.*/",
"regexApplyTo": "text",
"sort": "alphabeticalAsc",
"definition": "label_values(up, instance)",
"options": [
@@ -401,6 +402,7 @@
}
},
"regex": "/host[0-9]+/",
"regexApplyTo": "value",
"sort": "alphabeticalDesc",
"definition": "terms field:@host size:100",
"options": [],

View File

@@ -111,6 +111,7 @@
}
},
"regex": "/.*9090.*/",
"regexApplyTo": "text",
"sort": "alphabeticalAsc",
"definition": "label_values(up, instance)",
"options": [
@@ -404,6 +405,7 @@
}
},
"regex": "/host[0-9]+/",
"regexApplyTo": "value",
"sort": "alphabeticalDesc",
"definition": "terms field:@host size:100",
"options": [],

View File

@@ -20,7 +20,7 @@ func TestV0ConversionErrorHandling(t *testing.T) {
// Initialize the migrator with a test data source provider
dsProvider := migrationtestutil.NewDataSourceProvider(migrationtestutil.StandardTestConfig)
leProvider := migrationtestutil.NewLibraryElementProvider()
migration.Initialize(dsProvider, leProvider)
migration.Initialize(dsProvider, leProvider, migration.DefaultCacheTTL)
tests := []struct {
name string
@@ -132,7 +132,7 @@ func TestV0ConversionErrorPropagation(t *testing.T) {
// Initialize the migrator with a test data source provider
dsProvider := migrationtestutil.NewDataSourceProvider(migrationtestutil.StandardTestConfig)
leProvider := migrationtestutil.NewLibraryElementProvider()
migration.Initialize(dsProvider, leProvider)
migration.Initialize(dsProvider, leProvider, migration.DefaultCacheTTL)
t.Run("ConvertDashboard_V0_to_V1beta1 returns error on migration failure", func(t *testing.T) {
source := &dashv0.Dashboard{
@@ -206,7 +206,7 @@ func TestV0ConversionSuccessPaths(t *testing.T) {
// Initialize the migrator with a test data source provider
dsProvider := migrationtestutil.NewDataSourceProvider(migrationtestutil.StandardTestConfig)
leProvider := migrationtestutil.NewLibraryElementProvider()
migration.Initialize(dsProvider, leProvider)
migration.Initialize(dsProvider, leProvider, migration.DefaultCacheTTL)
t.Run("Convert_V0_to_V1beta1 success path returns nil", func(t *testing.T) {
source := &dashv0.Dashboard{
@@ -275,7 +275,7 @@ func TestV0ConversionSecondStepErrors(t *testing.T) {
// Initialize the migrator with a test data source provider
dsProvider := migrationtestutil.NewDataSourceProvider(migrationtestutil.StandardTestConfig)
leProvider := migrationtestutil.NewLibraryElementProvider()
migration.Initialize(dsProvider, leProvider)
migration.Initialize(dsProvider, leProvider, migration.DefaultCacheTTL)
t.Run("Convert_V0_to_V2alpha1 sets status on first step error", func(t *testing.T) {
// Create a dashboard that will fail v0->v1beta1 conversion

View File

@@ -19,7 +19,7 @@ func TestV1ConversionErrorHandling(t *testing.T) {
// Initialize the migrator with a test data source provider
dsProvider := migrationtestutil.NewDataSourceProvider(migrationtestutil.StandardTestConfig)
leProvider := migrationtestutil.NewLibraryElementProvider()
migration.Initialize(dsProvider, leProvider)
migration.Initialize(dsProvider, leProvider, migration.DefaultCacheTTL)
t.Run("Convert_V1beta1_to_V2alpha1 sets status on conversion error", func(t *testing.T) {
// Create a dashboard that will cause conversion to fail

View File

@@ -229,6 +229,16 @@ func getBoolField(m map[string]interface{}, key string, defaultValue bool) bool
return defaultValue
}
func getUnionField[T ~string](m map[string]interface{}, key string) *T {
if val, ok := m[key]; ok {
if str, ok := val.(string); ok && str != "" {
result := T(str)
return &result
}
}
return nil
}
// Helper function to create int64 pointer
func int64Ptr(i int64) *int64 {
return &i
@@ -1195,6 +1205,7 @@ func buildQueryVariable(ctx context.Context, varMap map[string]interface{}, comm
Refresh: transformVariableRefreshToEnum(varMap["refresh"]),
Sort: transformVariableSortToEnum(varMap["sort"]),
Regex: schemaversion.GetStringValue(varMap, "regex"),
RegexApplyTo: getUnionField[dashv2alpha1.DashboardVariableRegexApplyTo](varMap, "regexApplyTo"),
Query: buildDataQueryKindForVariable(varMap["query"], datasourceType),
AllowCustomValue: getBoolField(varMap, "allowCustomValue", true),
},
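The `getUnionField` helper added above reads a string-backed union value out of an unstructured map and returns a typed pointer, or nil when the key is absent, not a string, or empty. A standalone sketch exercising it — the helper body is taken verbatim from the diff; `RegexApplyTo` is a stand-in for the `dashv2alpha1.DashboardVariableRegexApplyTo` type:

```go
package main

import "fmt"

// getUnionField extracts an optional string-backed enum from an
// unstructured map, returning nil for missing, non-string, or empty values.
func getUnionField[T ~string](m map[string]interface{}, key string) *T {
	if val, ok := m[key]; ok {
		if str, ok := val.(string); ok && str != "" {
			result := T(str)
			return &result
		}
	}
	return nil
}

// RegexApplyTo stands in for the generated dashboard enum type.
type RegexApplyTo string

func main() {
	varMap := map[string]interface{}{"regexApplyTo": "text"}
	if v := getUnionField[RegexApplyTo](varMap, "regexApplyTo"); v != nil {
		fmt.Println(*v) // prints "text"
	}
	// Missing keys yield a nil pointer rather than a zero value.
	fmt.Println(getUnionField[RegexApplyTo](varMap, "missing")) // prints "<nil>"
}
```

The `~string` constraint is what lets one helper serve every generated enum type without a per-type wrapper.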

View File

@@ -19,7 +19,7 @@ func TestV1beta1ToV2alpha1(t *testing.T) {
// Initialize the migrator with test providers
dsProvider := migrationtestutil.NewDataSourceProvider(migrationtestutil.StandardTestConfig)
leProvider := migrationtestutil.NewLibraryElementProvider()
migration.Initialize(dsProvider, leProvider)
migration.Initialize(dsProvider, leProvider, migration.DefaultCacheTTL)
// Set up conversion scheme
scheme := runtime.NewScheme()

View File

@@ -11,10 +11,10 @@ import (
"github.com/grafana/grafana/apps/dashboard/pkg/migration/schemaversion"
)
func Convert_V2alpha1_to_V0(in *dashv2alpha1.Dashboard, out *dashv0.Dashboard, scope conversion.Scope, dsIndexProvider schemaversion.DataSourceIndexProvider) error {
func Convert_V2alpha1_to_V0(in *dashv2alpha1.Dashboard, out *dashv0.Dashboard, scope conversion.Scope) error {
// Convert v2alpha1 → v1beta1 first, then v1beta1 → v0
v1beta1 := &dashv1.Dashboard{}
if err := ConvertDashboard_V2alpha1_to_V1beta1(in, v1beta1, scope, dsIndexProvider); err != nil {
if err := ConvertDashboard_V2alpha1_to_V1beta1(in, v1beta1, scope); err != nil {
out.ObjectMeta = in.ObjectMeta
out.APIVersion = dashv0.APIVERSION
out.Kind = in.Kind
@@ -53,13 +53,13 @@ func Convert_V2alpha1_to_V0(in *dashv2alpha1.Dashboard, out *dashv0.Dashboard, s
return nil
}
func Convert_V2alpha1_to_V1beta1(in *dashv2alpha1.Dashboard, out *dashv1.Dashboard, scope conversion.Scope, dsIndexProvider schemaversion.DataSourceIndexProvider) error {
func Convert_V2alpha1_to_V1beta1(in *dashv2alpha1.Dashboard, out *dashv1.Dashboard, scope conversion.Scope) error {
out.ObjectMeta = in.ObjectMeta
out.APIVersion = dashv1.APIVERSION
out.Kind = in.Kind
// Convert the spec
if err := ConvertDashboard_V2alpha1_to_V1beta1(in, out, scope, dsIndexProvider); err != nil {
if err := ConvertDashboard_V2alpha1_to_V1beta1(in, out, scope); err != nil {
out.Status = dashv1.DashboardStatus{
Conversion: &dashv1.DashboardConversionStatus{
StoredVersion: ptr.To(dashv2alpha1.VERSION),
@@ -179,7 +179,7 @@ func Convert_V2beta1_to_V1beta1(in *dashv2beta1.Dashboard, out *dashv1.Dashboard
// Convert v2alpha1 → v1beta1
// Note: ConvertDashboard_V2alpha1_to_V1beta1 will set out.ObjectMeta from v2alpha1,
// but we've already set it from the original input, so it will be preserved
if err := ConvertDashboard_V2alpha1_to_V1beta1(v2alpha1, out, scope, dsIndexProvider); err != nil {
if err := ConvertDashboard_V2alpha1_to_V1beta1(v2alpha1, out, scope); err != nil {
out.Status = dashv1.DashboardStatus{
Conversion: &dashv1.DashboardConversionStatus{
StoredVersion: ptr.To(dashv2beta1.VERSION),

View File

@@ -18,7 +18,7 @@ func TestV2alpha1ConversionErrorHandling(t *testing.T) {
// Initialize the migrator with test data source and library element providers
dsProvider := migrationtestutil.NewDataSourceProvider(migrationtestutil.StandardTestConfig)
leProvider := migrationtestutil.NewLibraryElementProvider()
migration.Initialize(dsProvider, leProvider)
migration.Initialize(dsProvider, leProvider, migration.DefaultCacheTTL)
t.Run("Convert_V2alpha1_to_V1beta1 sets status on conversion", func(t *testing.T) {
// Create a dashboard for conversion
@@ -39,7 +39,7 @@ func TestV2alpha1ConversionErrorHandling(t *testing.T) {
}
target := &dashv1.Dashboard{}
err := Convert_V2alpha1_to_V1beta1(source, target, nil, dsProvider)
err := Convert_V2alpha1_to_V1beta1(source, target, nil)
// Convert_V2alpha1_to_V1beta1 doesn't return error, just sets status
require.NoError(t, err, "Convert_V2alpha1_to_V1beta1 doesn't return error")
@@ -90,7 +90,7 @@ func TestV2beta1ConversionErrorHandling(t *testing.T) {
// Initialize the migrator with test data source and library element providers
dsProvider := migrationtestutil.NewDataSourceProvider(migrationtestutil.StandardTestConfig)
leProvider := migrationtestutil.NewLibraryElementProvider()
migration.Initialize(dsProvider, leProvider)
migration.Initialize(dsProvider, leProvider, migration.DefaultCacheTTL)
t.Run("Convert_V2beta1_to_V1beta1 sets status on first step failure", func(t *testing.T) {
// Create a dashboard that might cause conversion to fail on first step (v2beta1 -> v2alpha1)

View File

@@ -1,14 +1,12 @@
package conversion
import (
"context"
"fmt"
"k8s.io/apimachinery/pkg/conversion"
dashv1 "github.com/grafana/grafana/apps/dashboard/pkg/apis/dashboard/v1beta1"
dashv2alpha1 "github.com/grafana/grafana/apps/dashboard/pkg/apis/dashboard/v2alpha1"
"github.com/grafana/grafana/apps/dashboard/pkg/migration/schemaversion"
"k8s.io/apimachinery/pkg/conversion"
)
// ConvertDashboard_V2alpha1_to_V1beta1 converts a v2alpha1 dashboard to v1beta1 format.
@@ -16,19 +14,13 @@ import (
// that represents the v1 dashboard JSON format.
// The dsIndexProvider is used to resolve default datasources when queries/variables/annotations
// don't have explicit datasource references.
func ConvertDashboard_V2alpha1_to_V1beta1(in *dashv2alpha1.Dashboard, out *dashv1.Dashboard, scope conversion.Scope, dsIndexProvider schemaversion.DataSourceIndexProvider) error {
func ConvertDashboard_V2alpha1_to_V1beta1(in *dashv2alpha1.Dashboard, out *dashv1.Dashboard, scope conversion.Scope) error {
out.ObjectMeta = in.ObjectMeta
out.APIVersion = dashv1.APIVERSION
out.Kind = in.Kind // Preserve the Kind from input (should be "Dashboard")
// Get datasource index for resolving default datasources
var dsIndex *schemaversion.DatasourceIndex
if dsIndexProvider != nil {
dsIndex = dsIndexProvider.Index(context.Background())
}
// Convert the spec to v1beta1 unstructured format
dashboardJSON, err := convertDashboardSpec_V2alpha1_to_V1beta1(&in.Spec, dsIndex)
dashboardJSON, err := convertDashboardSpec_V2alpha1_to_V1beta1(&in.Spec)
if err != nil {
return fmt.Errorf("failed to convert dashboard spec: %w", err)
}
@@ -39,7 +31,7 @@ func ConvertDashboard_V2alpha1_to_V1beta1(in *dashv2alpha1.Dashboard, out *dashv
return nil
}
func convertDashboardSpec_V2alpha1_to_V1beta1(in *dashv2alpha1.DashboardSpec, dsIndex *schemaversion.DatasourceIndex) (map[string]interface{}, error) {
func convertDashboardSpec_V2alpha1_to_V1beta1(in *dashv2alpha1.DashboardSpec) (map[string]interface{}, error) {
dashboard := make(map[string]interface{})
// Convert basic fields
@@ -75,7 +67,7 @@ func convertDashboardSpec_V2alpha1_to_V1beta1(in *dashv2alpha1.DashboardSpec, ds
}
// Convert panels from elements and layout
panels, err := convertPanelsFromElementsAndLayout(in.Elements, in.Layout, dsIndex)
panels, err := convertPanelsFromElementsAndLayout(in.Elements, in.Layout)
if err != nil {
return nil, fmt.Errorf("failed to convert panels: %w", err)
}
@@ -90,7 +82,7 @@ func convertDashboardSpec_V2alpha1_to_V1beta1(in *dashv2alpha1.DashboardSpec, ds
}
// Convert variables
variables := convertVariablesToV1(in.Variables, dsIndex)
variables := convertVariablesToV1(in.Variables)
if len(variables) > 0 {
dashboard["templating"] = map[string]interface{}{
"list": variables,
@@ -98,7 +90,7 @@ func convertDashboardSpec_V2alpha1_to_V1beta1(in *dashv2alpha1.DashboardSpec, ds
}
// Convert annotations - always include even if empty to prevent DashboardModel from adding built-in
annotations := convertAnnotationsToV1(in.Annotations, dsIndex)
annotations := convertAnnotationsToV1(in.Annotations)
dashboard["annotations"] = map[string]interface{}{
"list": annotations,
}
@@ -236,28 +228,28 @@ func countTotalPanels(panels []interface{}) int {
// - RowsLayout: Rows become row panels; nested structures are flattened
// - AutoGridLayout: Calculates gridPos based on column count and row height
// - TabsLayout: Tabs become expanded row panels; content is flattened
func convertPanelsFromElementsAndLayout(elements map[string]dashv2alpha1.DashboardElement, layout dashv2alpha1.DashboardGridLayoutKindOrRowsLayoutKindOrAutoGridLayoutKindOrTabsLayoutKind, dsIndex *schemaversion.DatasourceIndex) ([]interface{}, error) {
func convertPanelsFromElementsAndLayout(elements map[string]dashv2alpha1.DashboardElement, layout dashv2alpha1.DashboardGridLayoutKindOrRowsLayoutKindOrAutoGridLayoutKindOrTabsLayoutKind) ([]interface{}, error) {
if layout.GridLayoutKind != nil {
return convertGridLayoutToPanels(elements, layout.GridLayoutKind, dsIndex)
return convertGridLayoutToPanels(elements, layout.GridLayoutKind)
}
if layout.RowsLayoutKind != nil {
return convertRowsLayoutToPanels(elements, layout.RowsLayoutKind, dsIndex)
return convertRowsLayoutToPanels(elements, layout.RowsLayoutKind)
}
if layout.AutoGridLayoutKind != nil {
return convertAutoGridLayoutToPanels(elements, layout.AutoGridLayoutKind, dsIndex)
return convertAutoGridLayoutToPanels(elements, layout.AutoGridLayoutKind)
}
if layout.TabsLayoutKind != nil {
return convertTabsLayoutToPanels(elements, layout.TabsLayoutKind, dsIndex)
return convertTabsLayoutToPanels(elements, layout.TabsLayoutKind)
}
// No layout specified, return empty panels
return []interface{}{}, nil
}
func convertGridLayoutToPanels(elements map[string]dashv2alpha1.DashboardElement, gridLayout *dashv2alpha1.DashboardGridLayoutKind, dsIndex *schemaversion.DatasourceIndex) ([]interface{}, error) {
func convertGridLayoutToPanels(elements map[string]dashv2alpha1.DashboardElement, gridLayout *dashv2alpha1.DashboardGridLayoutKind) ([]interface{}, error) {
panels := make([]interface{}, 0, len(gridLayout.Spec.Items))
for _, item := range gridLayout.Spec.Items {
@@ -266,7 +258,7 @@ func convertGridLayoutToPanels(elements map[string]dashv2alpha1.DashboardElement
return nil, fmt.Errorf("panel with uid %s not found in the dashboard elements", item.Spec.Element.Name)
}
panel, err := convertPanelFromElement(&element, &item, dsIndex)
panel, err := convertPanelFromElement(&element, &item)
if err != nil {
return nil, fmt.Errorf("failed to convert panel %s: %w", item.Spec.Element.Name, err)
}
@@ -279,21 +271,21 @@ func convertGridLayoutToPanels(elements map[string]dashv2alpha1.DashboardElement
// convertRowsLayoutToPanels converts a RowsLayout to V1 panels.
// All nested structures (rows within rows, tabs within rows) are flattened to the root level.
// Each row becomes a row panel, and nested content is added sequentially after it.
-func convertRowsLayoutToPanels(elements map[string]dashv2alpha1.DashboardElement, rowsLayout *dashv2alpha1.DashboardRowsLayoutKind, dsIndex *schemaversion.DatasourceIndex) ([]interface{}, error) {
-return convertNestedLayoutToPanels(elements, rowsLayout, nil, dsIndex, 0)
+func convertRowsLayoutToPanels(elements map[string]dashv2alpha1.DashboardElement, rowsLayout *dashv2alpha1.DashboardRowsLayoutKind) ([]interface{}, error) {
+return convertNestedLayoutToPanels(elements, rowsLayout, nil, 0)
}
// convertNestedLayoutToPanels handles arbitrary nesting of RowsLayout and TabsLayout.
// It processes each row/tab in order, tracking Y position to ensure panels don't overlap.
// The function recursively flattens nested structures to produce a flat V1 panel array.
-func convertNestedLayoutToPanels(elements map[string]dashv2alpha1.DashboardElement, rowsLayout *dashv2alpha1.DashboardRowsLayoutKind, tabsLayout *dashv2alpha1.DashboardTabsLayoutKind, dsIndex *schemaversion.DatasourceIndex, yOffset int64) ([]interface{}, error) {
+func convertNestedLayoutToPanels(elements map[string]dashv2alpha1.DashboardElement, rowsLayout *dashv2alpha1.DashboardRowsLayoutKind, tabsLayout *dashv2alpha1.DashboardTabsLayoutKind, yOffset int64) ([]interface{}, error) {
panels := make([]interface{}, 0)
currentY := yOffset
// Process RowsLayout
if rowsLayout != nil {
for _, row := range rowsLayout.Spec.Rows {
-rowPanels, newY, err := processRowItem(elements, &row, dsIndex, currentY)
+rowPanels, newY, err := processRowItem(elements, &row, currentY)
if err != nil {
return nil, err
}
@@ -305,7 +297,7 @@ func convertNestedLayoutToPanels(elements map[string]dashv2alpha1.DashboardEleme
// Process TabsLayout (tabs are converted to rows)
if tabsLayout != nil {
for _, tab := range tabsLayout.Spec.Tabs {
-tabPanels, newY, err := processTabItem(elements, &tab, dsIndex, currentY)
+tabPanels, newY, err := processTabItem(elements, &tab, currentY)
if err != nil {
return nil, err
}
@@ -324,7 +316,7 @@ func convertNestedLayoutToPanels(elements map[string]dashv2alpha1.DashboardEleme
// - Collapsed row: Panels stored inside row.panels with absolute Y positions
// - Expanded row: Panels added to top level after the row panel
// - Nested layouts: Parent row is preserved; nested content is flattened after it
-func processRowItem(elements map[string]dashv2alpha1.DashboardElement, row *dashv2alpha1.DashboardRowsLayoutRowKind, dsIndex *schemaversion.DatasourceIndex, startY int64) ([]interface{}, int64, error) {
+func processRowItem(elements map[string]dashv2alpha1.DashboardElement, row *dashv2alpha1.DashboardRowsLayoutRowKind, startY int64) ([]interface{}, int64, error) {
panels := make([]interface{}, 0)
currentY := startY
@@ -354,7 +346,7 @@ func processRowItem(elements map[string]dashv2alpha1.DashboardElement, row *dash
}
// Then process nested rows
-nestedPanels, err := convertNestedLayoutToPanels(elements, row.Spec.Layout.RowsLayoutKind, nil, dsIndex, currentY)
+nestedPanels, err := convertNestedLayoutToPanels(elements, row.Spec.Layout.RowsLayoutKind, nil, currentY)
if err != nil {
return nil, 0, err
}
@@ -387,7 +379,7 @@ func processRowItem(elements map[string]dashv2alpha1.DashboardElement, row *dash
}
// Then process nested tabs
-nestedPanels, err := convertNestedLayoutToPanels(elements, nil, row.Spec.Layout.TabsLayoutKind, dsIndex, currentY)
+nestedPanels, err := convertNestedLayoutToPanels(elements, nil, row.Spec.Layout.TabsLayoutKind, currentY)
if err != nil {
return nil, 0, err
}
@@ -429,7 +421,7 @@ func processRowItem(elements map[string]dashv2alpha1.DashboardElement, row *dash
// Add collapsed panels if row is collapsed (panels use absolute Y positions)
if isCollapsed {
-collapsedPanels, err := extractCollapsedPanelsWithAbsoluteY(elements, &row.Spec.Layout, dsIndex, currentY+1)
+collapsedPanels, err := extractCollapsedPanelsWithAbsoluteY(elements, &row.Spec.Layout, currentY+1)
if err != nil {
return nil, 0, err
}
@@ -444,7 +436,7 @@ func processRowItem(elements map[string]dashv2alpha1.DashboardElement, row *dash
// Add panels from row layout (only for expanded rows or hidden header rows)
if !isCollapsed || isHiddenHeader {
-rowPanels, newY, err := extractExpandedPanels(elements, &row.Spec.Layout, dsIndex, currentY, isHiddenHeader, startY)
+rowPanels, newY, err := extractExpandedPanels(elements, &row.Spec.Layout, currentY, isHiddenHeader, startY)
if err != nil {
return nil, 0, err
}
@@ -459,7 +451,7 @@ func processRowItem(elements map[string]dashv2alpha1.DashboardElement, row *dash
// Each tab becomes an expanded row panel (collapsed=false) with an empty panels array.
// The tab's content is flattened and added to the top level after the row panel.
// Nested layouts within the tab are recursively processed.
-func processTabItem(elements map[string]dashv2alpha1.DashboardElement, tab *dashv2alpha1.DashboardTabsLayoutTabKind, dsIndex *schemaversion.DatasourceIndex, startY int64) ([]interface{}, int64, error) {
+func processTabItem(elements map[string]dashv2alpha1.DashboardElement, tab *dashv2alpha1.DashboardTabsLayoutTabKind, startY int64) ([]interface{}, int64, error) {
panels := make([]interface{}, 0)
currentY := startY
@@ -487,7 +479,7 @@ func processTabItem(elements map[string]dashv2alpha1.DashboardElement, tab *dash
// Handle nested layouts inside the tab
if tab.Spec.Layout.RowsLayoutKind != nil {
// Nested RowsLayout inside tab
-nestedPanels, err := convertNestedLayoutToPanels(elements, tab.Spec.Layout.RowsLayoutKind, nil, dsIndex, currentY)
+nestedPanels, err := convertNestedLayoutToPanels(elements, tab.Spec.Layout.RowsLayoutKind, nil, currentY)
if err != nil {
return nil, 0, err
}
@@ -495,7 +487,7 @@ func processTabItem(elements map[string]dashv2alpha1.DashboardElement, tab *dash
currentY = getMaxYFromPanels(nestedPanels, currentY)
} else if tab.Spec.Layout.TabsLayoutKind != nil {
// Nested TabsLayout inside tab
-nestedPanels, err := convertNestedLayoutToPanels(elements, nil, tab.Spec.Layout.TabsLayoutKind, dsIndex, currentY)
+nestedPanels, err := convertNestedLayoutToPanels(elements, nil, tab.Spec.Layout.TabsLayoutKind, currentY)
if err != nil {
return nil, 0, err
}
@@ -512,7 +504,7 @@ func processTabItem(elements map[string]dashv2alpha1.DashboardElement, tab *dash
adjustedItem := item
adjustedItem.Spec.Y = item.Spec.Y + currentY
-panel, err := convertPanelFromElement(&element, &adjustedItem, dsIndex)
+panel, err := convertPanelFromElement(&element, &adjustedItem)
if err != nil {
return nil, 0, fmt.Errorf("failed to convert panel %s: %w", item.Spec.Element.Name, err)
}
@@ -525,7 +517,7 @@ func processTabItem(elements map[string]dashv2alpha1.DashboardElement, tab *dash
}
} else if tab.Spec.Layout.AutoGridLayoutKind != nil {
// AutoGridLayout inside tab - convert with Y offset
-autoGridPanels, err := convertAutoGridLayoutToPanelsWithOffset(elements, tab.Spec.Layout.AutoGridLayoutKind, dsIndex, currentY)
+autoGridPanels, err := convertAutoGridLayoutToPanelsWithOffset(elements, tab.Spec.Layout.AutoGridLayoutKind, currentY)
if err != nil {
return nil, 0, err
}
@@ -540,7 +532,7 @@ func processTabItem(elements map[string]dashv2alpha1.DashboardElement, tab *dash
// Panels are positioned with absolute Y coordinates (baseY + relative Y).
// This matches V1 behavior where collapsed row panels store their children
// with Y positions as if the row were expanded at that location.
-func extractCollapsedPanelsWithAbsoluteY(elements map[string]dashv2alpha1.DashboardElement, layout *dashv2alpha1.DashboardGridLayoutKindOrAutoGridLayoutKindOrTabsLayoutKindOrRowsLayoutKind, dsIndex *schemaversion.DatasourceIndex, baseY int64) ([]interface{}, error) {
+func extractCollapsedPanelsWithAbsoluteY(elements map[string]dashv2alpha1.DashboardElement, layout *dashv2alpha1.DashboardGridLayoutKindOrAutoGridLayoutKindOrTabsLayoutKindOrRowsLayoutKind, baseY int64) ([]interface{}, error) {
panels := make([]interface{}, 0)
if layout.GridLayoutKind != nil {
@@ -552,7 +544,7 @@ func extractCollapsedPanelsWithAbsoluteY(elements map[string]dashv2alpha1.Dashbo
// Create a copy with adjusted Y position
adjustedItem := item
adjustedItem.Spec.Y = item.Spec.Y + baseY
-panel, err := convertPanelFromElement(&element, &adjustedItem, dsIndex)
+panel, err := convertPanelFromElement(&element, &adjustedItem)
if err != nil {
return nil, fmt.Errorf("failed to convert panel %s: %w", item.Spec.Element.Name, err)
}
@@ -561,7 +553,7 @@ func extractCollapsedPanelsWithAbsoluteY(elements map[string]dashv2alpha1.Dashbo
}
// Handle AutoGridLayout for collapsed rows with Y offset
if layout.AutoGridLayoutKind != nil {
-autoGridPanels, err := convertAutoGridLayoutToPanelsWithOffset(elements, layout.AutoGridLayoutKind, dsIndex, baseY)
+autoGridPanels, err := convertAutoGridLayoutToPanelsWithOffset(elements, layout.AutoGridLayoutKind, baseY)
if err != nil {
return nil, err
}
@@ -571,7 +563,7 @@ func extractCollapsedPanelsWithAbsoluteY(elements map[string]dashv2alpha1.Dashbo
if layout.RowsLayoutKind != nil {
currentY := baseY
for _, row := range layout.RowsLayoutKind.Spec.Rows {
-nestedPanels, err := extractCollapsedPanelsWithAbsoluteY(elements, &row.Spec.Layout, dsIndex, currentY)
+nestedPanels, err := extractCollapsedPanelsWithAbsoluteY(elements, &row.Spec.Layout, currentY)
if err != nil {
return nil, err
}
@@ -582,7 +574,7 @@ func extractCollapsedPanelsWithAbsoluteY(elements map[string]dashv2alpha1.Dashbo
if layout.TabsLayoutKind != nil {
currentY := baseY
for _, tab := range layout.TabsLayoutKind.Spec.Tabs {
-nestedPanels, err := extractCollapsedPanelsFromTabLayoutWithAbsoluteY(elements, &tab.Spec.Layout, dsIndex, currentY)
+nestedPanels, err := extractCollapsedPanelsFromTabLayoutWithAbsoluteY(elements, &tab.Spec.Layout, currentY)
if err != nil {
return nil, err
}
@@ -596,7 +588,7 @@ func extractCollapsedPanelsWithAbsoluteY(elements map[string]dashv2alpha1.Dashbo
// extractCollapsedPanelsFromTabLayoutWithAbsoluteY extracts panels from a tab layout with absolute Y.
// Similar to extractCollapsedPanelsWithAbsoluteY but handles the tab-specific layout type.
-func extractCollapsedPanelsFromTabLayoutWithAbsoluteY(elements map[string]dashv2alpha1.DashboardElement, layout *dashv2alpha1.DashboardGridLayoutKindOrRowsLayoutKindOrAutoGridLayoutKindOrTabsLayoutKind, dsIndex *schemaversion.DatasourceIndex, baseY int64) ([]interface{}, error) {
+func extractCollapsedPanelsFromTabLayoutWithAbsoluteY(elements map[string]dashv2alpha1.DashboardElement, layout *dashv2alpha1.DashboardGridLayoutKindOrRowsLayoutKindOrAutoGridLayoutKindOrTabsLayoutKind, baseY int64) ([]interface{}, error) {
panels := make([]interface{}, 0)
if layout.GridLayoutKind != nil {
@@ -607,7 +599,7 @@ func extractCollapsedPanelsFromTabLayoutWithAbsoluteY(elements map[string]dashv2
}
adjustedItem := item
adjustedItem.Spec.Y = item.Spec.Y + baseY
-panel, err := convertPanelFromElement(&element, &adjustedItem, dsIndex)
+panel, err := convertPanelFromElement(&element, &adjustedItem)
if err != nil {
return nil, fmt.Errorf("failed to convert panel %s: %w", item.Spec.Element.Name, err)
}
@@ -615,7 +607,7 @@ func extractCollapsedPanelsFromTabLayoutWithAbsoluteY(elements map[string]dashv2
}
}
if layout.AutoGridLayoutKind != nil {
-autoGridPanels, err := convertAutoGridLayoutToPanelsWithOffset(elements, layout.AutoGridLayoutKind, dsIndex, baseY)
+autoGridPanels, err := convertAutoGridLayoutToPanelsWithOffset(elements, layout.AutoGridLayoutKind, baseY)
if err != nil {
return nil, err
}
@@ -624,7 +616,7 @@ func extractCollapsedPanelsFromTabLayoutWithAbsoluteY(elements map[string]dashv2
if layout.RowsLayoutKind != nil {
currentY := baseY
for _, row := range layout.RowsLayoutKind.Spec.Rows {
-nestedPanels, err := extractCollapsedPanelsWithAbsoluteY(elements, &row.Spec.Layout, dsIndex, currentY)
+nestedPanels, err := extractCollapsedPanelsWithAbsoluteY(elements, &row.Spec.Layout, currentY)
if err != nil {
return nil, err
}
@@ -635,7 +627,7 @@ func extractCollapsedPanelsFromTabLayoutWithAbsoluteY(elements map[string]dashv2
if layout.TabsLayoutKind != nil {
currentY := baseY
for _, tab := range layout.TabsLayoutKind.Spec.Tabs {
-nestedPanels, err := extractCollapsedPanelsFromTabLayoutWithAbsoluteY(elements, &tab.Spec.Layout, dsIndex, currentY)
+nestedPanels, err := extractCollapsedPanelsFromTabLayoutWithAbsoluteY(elements, &tab.Spec.Layout, currentY)
if err != nil {
return nil, err
}
@@ -679,7 +671,7 @@ func getLayoutHeightFromTab(layout *dashv2alpha1.DashboardGridLayoutKindOrRowsLa
// - Explicit row: Add (currentY - 1) to relative Y for absolute positioning
//
// Returns the panels and the new Y position for the next row.
-func extractExpandedPanels(elements map[string]dashv2alpha1.DashboardElement, layout *dashv2alpha1.DashboardGridLayoutKindOrAutoGridLayoutKindOrTabsLayoutKindOrRowsLayoutKind, dsIndex *schemaversion.DatasourceIndex, currentY int64, isHiddenHeader bool, startY int64) ([]interface{}, int64, error) {
+func extractExpandedPanels(elements map[string]dashv2alpha1.DashboardElement, layout *dashv2alpha1.DashboardGridLayoutKindOrAutoGridLayoutKindOrTabsLayoutKindOrRowsLayoutKind, currentY int64, isHiddenHeader bool, startY int64) ([]interface{}, int64, error) {
panels := make([]interface{}, 0)
// For hidden headers, don't track Y changes (matches original behavior)
maxY := startY
@@ -700,7 +692,7 @@ func extractExpandedPanels(elements map[string]dashv2alpha1.DashboardElement, la
}
// For hidden headers: don't adjust Y, keep item.Spec.Y as-is
-panel, err := convertPanelFromElement(&element, &adjustedItem, dsIndex)
+panel, err := convertPanelFromElement(&element, &adjustedItem)
if err != nil {
return nil, 0, fmt.Errorf("failed to convert panel %s: %w", item.Spec.Element.Name, err)
}
@@ -725,7 +717,7 @@ func extractExpandedPanels(elements map[string]dashv2alpha1.DashboardElement, la
yOffset = currentY - 1
}
-autoGridPanels, err := convertAutoGridLayoutToPanelsWithOffset(elements, layout.AutoGridLayoutKind, dsIndex, yOffset)
+autoGridPanels, err := convertAutoGridLayoutToPanelsWithOffset(elements, layout.AutoGridLayoutKind, yOffset)
if err != nil {
return nil, 0, err
}
@@ -788,7 +780,7 @@ func getLayoutHeight(layout *dashv2alpha1.DashboardGridLayoutKindOrAutoGridLayou
// convertAutoGridLayoutToPanelsWithOffset converts AutoGridLayout with a Y offset.
// Same as convertAutoGridLayoutToPanels but starts at yOffset instead of 0.
// Used when AutoGridLayout appears inside rows or tabs.
-func convertAutoGridLayoutToPanelsWithOffset(elements map[string]dashv2alpha1.DashboardElement, autoGridLayout *dashv2alpha1.DashboardAutoGridLayoutKind, dsIndex *schemaversion.DatasourceIndex, yOffset int64) ([]interface{}, error) {
+func convertAutoGridLayoutToPanelsWithOffset(elements map[string]dashv2alpha1.DashboardElement, autoGridLayout *dashv2alpha1.DashboardAutoGridLayoutKind, yOffset int64) ([]interface{}, error) {
panels := make([]interface{}, 0, len(autoGridLayout.Spec.Items))
const (
@@ -850,7 +842,7 @@ func convertAutoGridLayoutToPanelsWithOffset(elements map[string]dashv2alpha1.Da
},
}
-panel, err := convertPanelFromElement(&element, &gridItem, dsIndex)
+panel, err := convertPanelFromElement(&element, &gridItem)
if err != nil {
return nil, fmt.Errorf("failed to convert panel %s: %w", item.Spec.Element.Name, err)
}
@@ -876,7 +868,7 @@ func convertAutoGridLayoutToPanelsWithOffset(elements map[string]dashv2alpha1.Da
//
// Width: 24 / maxColumnCount (default 3 columns = 8 units wide)
// Height: Predefined grid units per mode (see pixelsToGridUnits for custom)
-func convertAutoGridLayoutToPanels(elements map[string]dashv2alpha1.DashboardElement, autoGridLayout *dashv2alpha1.DashboardAutoGridLayoutKind, dsIndex *schemaversion.DatasourceIndex) ([]interface{}, error) {
+func convertAutoGridLayoutToPanels(elements map[string]dashv2alpha1.DashboardElement, autoGridLayout *dashv2alpha1.DashboardAutoGridLayoutKind) ([]interface{}, error) {
panels := make([]interface{}, 0, len(autoGridLayout.Spec.Items))
const (
@@ -963,7 +955,7 @@ func convertAutoGridLayoutToPanels(elements map[string]dashv2alpha1.DashboardEle
}
}
-panel, err := convertPanelFromElement(&element, &gridItem, dsIndex)
+panel, err := convertPanelFromElement(&element, &gridItem)
if err != nil {
return nil, fmt.Errorf("failed to convert panel %s: %w", item.Spec.Element.Name, err)
}
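The sizing rule documented above ("Width: 24 / maxColumnCount, default 3 columns = 8 units wide") is plain integer arithmetic on the 24-unit V1 grid. A sketch, where the helper name and the default of 3 columns are taken from the comment, not from code shown in this diff:

```go
package main

import "fmt"

const gridColumns = 24 // the V1 dashboard grid is 24 units wide

// autoGridItemWidth mirrors the documented rule: panel width is the full
// 24-unit grid divided by the configured max column count.
func autoGridItemWidth(maxColumnCount int64) int64 {
	if maxColumnCount <= 0 {
		maxColumnCount = 3 // assumed default per the comment above
	}
	return gridColumns / maxColumnCount
}

func main() {
	fmt.Println(autoGridItemWidth(0)) // default 3 columns → 8
	fmt.Println(autoGridItemWidth(4)) // 4 columns → 6
}
```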
@@ -984,11 +976,11 @@ func convertAutoGridLayoutToPanels(elements map[string]dashv2alpha1.DashboardEle
// V1 has no native tab concept, so tabs are converted to expanded row panels.
// Each tab becomes a row panel (collapsed=false, panels=[]) with its content
// flattened to the top level. Tab order is preserved in the output.
-func convertTabsLayoutToPanels(elements map[string]dashv2alpha1.DashboardElement, tabsLayout *dashv2alpha1.DashboardTabsLayoutKind, dsIndex *schemaversion.DatasourceIndex) ([]interface{}, error) {
-return convertNestedLayoutToPanels(elements, nil, tabsLayout, dsIndex, 0)
+func convertTabsLayoutToPanels(elements map[string]dashv2alpha1.DashboardElement, tabsLayout *dashv2alpha1.DashboardTabsLayoutKind) ([]interface{}, error) {
+return convertNestedLayoutToPanels(elements, nil, tabsLayout, 0)
}
-func convertPanelFromElement(element *dashv2alpha1.DashboardElement, layoutItem *dashv2alpha1.DashboardGridLayoutItemKind, dsIndex *schemaversion.DatasourceIndex) (map[string]interface{}, error) {
+func convertPanelFromElement(element *dashv2alpha1.DashboardElement, layoutItem *dashv2alpha1.DashboardGridLayoutItemKind) (map[string]interface{}, error) {
panel := make(map[string]interface{})
// Set grid position
@@ -1017,7 +1009,7 @@ func convertPanelFromElement(element *dashv2alpha1.DashboardElement, layoutItem
}
if element.PanelKind != nil {
-return convertPanelKindToV1(element.PanelKind, panel, dsIndex)
+return convertPanelKindToV1(element.PanelKind, panel)
}
if element.LibraryPanelKind != nil {
@@ -1027,7 +1019,7 @@ func convertPanelFromElement(element *dashv2alpha1.DashboardElement, layoutItem
return nil, fmt.Errorf("element has neither PanelKind nor LibraryPanelKind")
}
-func convertPanelKindToV1(panelKind *dashv2alpha1.DashboardPanelKind, panel map[string]interface{}, dsIndex *schemaversion.DatasourceIndex) (map[string]interface{}, error) {
+func convertPanelKindToV1(panelKind *dashv2alpha1.DashboardPanelKind, panel map[string]interface{}) (map[string]interface{}, error) {
spec := panelKind.Spec
panel["id"] = int(spec.Id)
@@ -1069,14 +1061,14 @@ func convertPanelKindToV1(panelKind *dashv2alpha1.DashboardPanelKind, panel map[
// Convert queries (targets)
targets := make([]map[string]interface{}, 0, len(spec.Data.Spec.Queries))
for _, query := range spec.Data.Spec.Queries {
-target := convertPanelQueryToV1(&query, dsIndex)
+target := convertPanelQueryToV1(&query)
targets = append(targets, target)
}
panel["targets"] = targets
// Detect mixed datasource - set panel.datasource to "mixed" if queries use different datasources
// This matches the frontend behavior in getPanelDataSource (layoutSerializers/utils.ts)
-if mixedDS := detectMixedDatasource(spec.Data.Spec.Queries, dsIndex); mixedDS != nil {
+if mixedDS := detectMixedDatasource(spec.Data.Spec.Queries); mixedDS != nil {
panel["datasource"] = mixedDS
}
@@ -1125,7 +1117,7 @@ func convertPanelKindToV1(panelKind *dashv2alpha1.DashboardPanelKind, panel map[
return panel, nil
}
-func convertPanelQueryToV1(query *dashv2alpha1.DashboardPanelQueryKind, dsIndex *schemaversion.DatasourceIndex) map[string]interface{} {
+func convertPanelQueryToV1(query *dashv2alpha1.DashboardPanelQueryKind) map[string]interface{} {
target := make(map[string]interface{})
// Copy query spec (excluding refId, hide, datasource which are handled separately)
@@ -1150,7 +1142,7 @@ func convertPanelQueryToV1(query *dashv2alpha1.DashboardPanelQueryKind, dsIndex
}
// Resolve datasource based on V2 input (reuse shared function)
-datasource := getDataSourceForQuery(query.Spec.Datasource, query.Spec.Query.Kind, nil)
+datasource := getDataSourceForQuery(query.Spec.Datasource, query.Spec.Query.Kind)
if datasource != nil {
target["datasource"] = datasource
}
@@ -1164,7 +1156,7 @@ func convertPanelQueryToV1(query *dashv2alpha1.DashboardPanelQueryKind, dsIndex
// - Else if queryKind (type) is non-empty → return {type} only
// - Else → return nil (no datasource)
// Used for variables and annotations. Panel queries use convertPanelQueryToV1Target.
-func getDataSourceForQuery(explicitDS *dashv2alpha1.DashboardDataSourceRef, queryKind string, _ *schemaversion.DatasourceIndex) map[string]interface{} {
+func getDataSourceForQuery(explicitDS *dashv2alpha1.DashboardDataSourceRef, queryKind string) map[string]interface{} {
// Case 1: Explicit datasource with UID provided
if explicitDS != nil && explicitDS.Uid != nil && *explicitDS.Uid != "" {
datasource := map[string]interface{}{
@@ -1195,7 +1187,7 @@ func getDataSourceForQuery(explicitDS *dashv2alpha1.DashboardDataSourceRef, quer
// Compares based on V2 input without runtime resolution:
// - If query has explicit datasource.uid → use that UID and type
// - Else → use query.Kind as type (empty UID)
-func detectMixedDatasource(queries []dashv2alpha1.DashboardPanelQueryKind, _ *schemaversion.DatasourceIndex) map[string]interface{} {
+func detectMixedDatasource(queries []dashv2alpha1.DashboardPanelQueryKind) map[string]interface{} {
if len(queries) == 0 {
return nil
}
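The comparison rule described above (explicit uid wins, otherwise the query kind is used as the type with an empty uid; any disagreement marks the panel as mixed) can be sketched in isolation. This is an illustrative stand-in, not the function from the diff; the "-- Mixed --" uid is the conventional Grafana sentinel and is assumed here:

```go
package main

import "fmt"

// dsRef is a simplified (uid, type) pair standing in for the V1 datasource ref.
type dsRef struct{ UID, Type string }

// detectMixed returns a synthetic "mixed" ref when the queries do not all
// resolve to the same (uid, type) pair, and nil when they agree (so the
// panel-level datasource is left untouched).
func detectMixed(refs []dsRef) *dsRef {
	if len(refs) == 0 {
		return nil
	}
	first := refs[0]
	for _, r := range refs[1:] {
		if r != first {
			// Assumed sentinel: Grafana's mixed-datasource ref.
			return &dsRef{UID: "-- Mixed --", Type: "datasource"}
		}
	}
	return nil
}

func main() {
	same := []dsRef{{"a", "prometheus"}, {"a", "prometheus"}}
	mixed := []dsRef{{"a", "prometheus"}, {"b", "loki"}}
	fmt.Println(detectMixed(same), detectMixed(mixed))
}
```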
@@ -1254,7 +1246,7 @@ func convertLibraryPanelKindToV1(libPanelKind *dashv2alpha1.DashboardLibraryPane
return panel, nil
}
-func convertVariablesToV1(variables []dashv2alpha1.DashboardVariableKind, dsIndex *schemaversion.DatasourceIndex) []map[string]interface{} {
+func convertVariablesToV1(variables []dashv2alpha1.DashboardVariableKind) []map[string]interface{} {
result := make([]map[string]interface{}, 0, len(variables))
for _, variable := range variables {
@@ -1262,7 +1254,7 @@ func convertVariablesToV1(variables []dashv2alpha1.DashboardVariableKind, dsInde
var err error
if variable.QueryVariableKind != nil {
-varMap, err = convertQueryVariableToV1(variable.QueryVariableKind, dsIndex)
+varMap, err = convertQueryVariableToV1(variable.QueryVariableKind)
} else if variable.DatasourceVariableKind != nil {
varMap, err = convertDatasourceVariableToV1(variable.DatasourceVariableKind)
} else if variable.CustomVariableKind != nil {
@@ -1274,9 +1266,9 @@ func convertVariablesToV1(variables []dashv2alpha1.DashboardVariableKind, dsInde
} else if variable.TextVariableKind != nil {
varMap, err = convertTextVariableToV1(variable.TextVariableKind)
} else if variable.GroupByVariableKind != nil {
-varMap, err = convertGroupByVariableToV1(variable.GroupByVariableKind, dsIndex)
+varMap, err = convertGroupByVariableToV1(variable.GroupByVariableKind)
} else if variable.AdhocVariableKind != nil {
-varMap, err = convertAdhocVariableToV1(variable.AdhocVariableKind, dsIndex)
+varMap, err = convertAdhocVariableToV1(variable.AdhocVariableKind)
} else if variable.SwitchVariableKind != nil {
varMap, err = convertSwitchVariableToV1(variable.SwitchVariableKind)
}
@@ -1289,7 +1281,7 @@ func convertVariablesToV1(variables []dashv2alpha1.DashboardVariableKind, dsInde
return result
}
-func convertQueryVariableToV1(variable *dashv2alpha1.DashboardQueryVariableKind, dsIndex *schemaversion.DatasourceIndex) (map[string]interface{}, error) {
+func convertQueryVariableToV1(variable *dashv2alpha1.DashboardQueryVariableKind) (map[string]interface{}, error) {
spec := variable.Spec
varMap := map[string]interface{}{
"name": spec.Name,
@@ -1320,6 +1312,9 @@ func convertQueryVariableToV1(variable *dashv2alpha1.DashboardQueryVariableKind,
if spec.Definition != nil {
varMap["definition"] = *spec.Definition
}
+if spec.RegexApplyTo != nil {
+varMap["regexApplyTo"] = string(*spec.RegexApplyTo)
+}
varMap["allowCustomValue"] = spec.AllowCustomValue
// Convert query - handle LEGACY_STRING_VALUE_KEY
@@ -1336,7 +1331,7 @@ func convertQueryVariableToV1(variable *dashv2alpha1.DashboardQueryVariableKind,
}
// Resolve datasource - use explicit datasource or resolve from query kind (datasource type)/default
-datasource := getDataSourceForQuery(spec.Datasource, spec.Query.Kind, dsIndex)
+datasource := getDataSourceForQuery(spec.Datasource, spec.Query.Kind)
if datasource != nil {
varMap["datasource"] = datasource
}
@@ -1486,7 +1481,7 @@ func convertTextVariableToV1(variable *dashv2alpha1.DashboardTextVariableKind) (
return varMap, nil
}
-func convertGroupByVariableToV1(variable *dashv2alpha1.DashboardGroupByVariableKind, dsIndex *schemaversion.DatasourceIndex) (map[string]interface{}, error) {
+func convertGroupByVariableToV1(variable *dashv2alpha1.DashboardGroupByVariableKind) (map[string]interface{}, error) {
spec := variable.Spec
varMap := map[string]interface{}{
"name": spec.Name,
@@ -1509,7 +1504,7 @@ func convertGroupByVariableToV1(variable *dashv2alpha1.DashboardGroupByVariableK
}
// Resolve datasource - GroupBy variables don't have a query kind, so use empty string (will fall back to default)
-datasource := getDataSourceForQuery(spec.Datasource, "", dsIndex)
+datasource := getDataSourceForQuery(spec.Datasource, "")
if datasource != nil {
varMap["datasource"] = datasource
}
@@ -1517,7 +1512,7 @@ func convertGroupByVariableToV1(variable *dashv2alpha1.DashboardGroupByVariableK
return varMap, nil
}
-func convertAdhocVariableToV1(variable *dashv2alpha1.DashboardAdhocVariableKind, dsIndex *schemaversion.DatasourceIndex) (map[string]interface{}, error) {
+func convertAdhocVariableToV1(variable *dashv2alpha1.DashboardAdhocVariableKind) (map[string]interface{}, error) {
spec := variable.Spec
varMap := map[string]interface{}{
"name": spec.Name,
@@ -1536,7 +1531,7 @@ func convertAdhocVariableToV1(variable *dashv2alpha1.DashboardAdhocVariableKind,
varMap["allowCustomValue"] = spec.AllowCustomValue
// Resolve datasource - Adhoc variables don't have a query kind, so use empty string (will fall back to default)
-datasource := getDataSourceForQuery(spec.Datasource, "", dsIndex)
+datasource := getDataSourceForQuery(spec.Datasource, "")
if datasource != nil {
varMap["datasource"] = datasource
}
@@ -1663,7 +1658,7 @@ func convertSwitchVariableToV1(variable *dashv2alpha1.DashboardSwitchVariableKin
return varMap, nil
}
-func convertAnnotationsToV1(annotations []dashv2alpha1.DashboardAnnotationQueryKind, dsIndex *schemaversion.DatasourceIndex) []map[string]interface{} {
+func convertAnnotationsToV1(annotations []dashv2alpha1.DashboardAnnotationQueryKind) []map[string]interface{} {
result := make([]map[string]interface{}, 0, len(annotations))
for _, annotation := range annotations {
@@ -1686,7 +1681,7 @@ func convertAnnotationsToV1(annotations []dashv2alpha1.DashboardAnnotationQueryK
if annotation.Spec.Query != nil {
queryKind = annotation.Spec.Query.Kind
}
-datasource := getDataSourceForQuery(annotation.Spec.Datasource, queryKind, dsIndex)
+datasource := getDataSourceForQuery(annotation.Spec.Datasource, queryKind)
if datasource != nil {
annotationMap["datasource"] = datasource
}


@@ -282,7 +282,7 @@ func TestV2alpha1ToV1beta1LayoutErrors(t *testing.T) {
// Initialize the migrator with test data source and library element providers
dsProvider := migrationtestutil.NewDataSourceProvider(migrationtestutil.StandardTestConfig)
leProvider := migrationtestutil.NewLibraryElementProvider()
-migration.Initialize(dsProvider, leProvider)
+migration.Initialize(dsProvider, leProvider, migration.DefaultCacheTTL)
// Set up conversion scheme
scheme := runtime.NewScheme()
@@ -498,7 +498,7 @@ func TestV2alpha1ToV1beta1BasicFields(t *testing.T) {
// Initialize the migrator with test data source and library element providers
dsProvider := migrationtestutil.NewDataSourceProvider(migrationtestutil.StandardTestConfig)
leProvider := migrationtestutil.NewLibraryElementProvider()
-migration.Initialize(dsProvider, leProvider)
+migration.Initialize(dsProvider, leProvider, migration.DefaultCacheTTL)
// Set up conversion scheme
scheme := runtime.NewScheme()


@@ -767,6 +767,7 @@ func convertQueryVariableSpec_V2alpha1_to_V2beta1(in *dashv2alpha1.DashboardQuer
out.SkipUrlSync = in.SkipUrlSync
out.Description = in.Description
out.Regex = in.Regex
+out.RegexApplyTo = (*dashv2beta1.DashboardVariableRegexApplyTo)(in.RegexApplyTo)
out.Sort = dashv2beta1.DashboardVariableSort(in.Sort)
out.Definition = in.Definition
out.Options = convertVariableOptions_V2alpha1_to_V2beta1(in.Options)


@@ -18,7 +18,7 @@ func TestV2alpha1ToV2beta1(t *testing.T) {
// Initialize the migrator with test providers
dsProvider := migrationtestutil.NewDataSourceProvider(migrationtestutil.StandardTestConfig)
leProvider := migrationtestutil.NewLibraryElementProvider()
-migration.Initialize(dsProvider, leProvider)
+migration.Initialize(dsProvider, leProvider, migration.DefaultCacheTTL)
// Set up conversion scheme
scheme := runtime.NewScheme()


@@ -806,6 +806,7 @@ func convertQueryVariableSpec_V2beta1_to_V2alpha1(in *dashv2beta1.DashboardQuery
out.SkipUrlSync = in.SkipUrlSync
out.Description = in.Description
out.Regex = in.Regex
+out.RegexApplyTo = (*dashv2alpha1.DashboardVariableRegexApplyTo)(in.RegexApplyTo)
out.Sort = dashv2alpha1.DashboardVariableSort(in.Sort)
out.Definition = in.Definition
out.Options = convertVariableOptions_V2beta1_to_V2alpha1(in.Options)


@@ -24,7 +24,7 @@ func TestV2beta1ToV2alpha1RoundTrip(t *testing.T) {
// Initialize the migrator with test providers
dsProvider := migrationtestutil.NewDataSourceProvider(migrationtestutil.StandardTestConfig)
leProvider := migrationtestutil.NewLibraryElementProvider()
-migration.Initialize(dsProvider, leProvider)
+migration.Initialize(dsProvider, leProvider, migration.DefaultCacheTTL)
// Set up conversion scheme
scheme := runtime.NewScheme()
@@ -107,7 +107,7 @@ func TestV2beta1ToV2alpha1FromOutputFiles(t *testing.T) {
// Initialize the migrator with test providers
dsProvider := migrationtestutil.NewDataSourceProvider(migrationtestutil.StandardTestConfig)
leProvider := migrationtestutil.NewLibraryElementProvider()
-migration.Initialize(dsProvider, leProvider)
+migration.Initialize(dsProvider, leProvider, migration.DefaultCacheTTL)
// Set up conversion scheme
scheme := runtime.NewScheme()
@@ -193,7 +193,7 @@ func TestV2beta1ToV2alpha1(t *testing.T) {
// Initialize the migrator with test providers
dsProvider := migrationtestutil.NewDataSourceProvider(migrationtestutil.StandardTestConfig)
leProvider := migrationtestutil.NewLibraryElementProvider()
-migration.Initialize(dsProvider, leProvider)
+migration.Initialize(dsProvider, leProvider, migration.DefaultCacheTTL)
// Set up conversion scheme
scheme := runtime.NewScheme()


@@ -4,13 +4,19 @@ import (
"context"
"fmt"
"sync"
+"time"
+"github.com/grafana/authlib/types"
+"github.com/grafana/grafana-app-sdk/logging"
"github.com/grafana/grafana/apps/dashboard/pkg/migration/schemaversion"
)
+// DefaultCacheTTL is the default TTL for the datasource and library element caches.
+const DefaultCacheTTL = time.Minute
// Initialize provides the migrator singleton with required dependencies and builds the map of migrations.
-func Initialize(dsIndexProvider schemaversion.DataSourceIndexProvider, leIndexProvider schemaversion.LibraryElementIndexProvider) {
-migratorInstance.init(dsIndexProvider, leIndexProvider)
+func Initialize(dsIndexProvider schemaversion.DataSourceIndexProvider, leIndexProvider schemaversion.LibraryElementIndexProvider, cacheTTL time.Duration) {
+migratorInstance.init(dsIndexProvider, leIndexProvider, cacheTTL)
}
// GetDataSourceIndexProvider returns the datasource index provider instance that was initialized.
@@ -38,6 +44,34 @@ func ResetForTesting() {
initOnce = sync.Once{}
}
// PreloadCache preloads the datasource and library element caches for the given namespaces.
func PreloadCache(ctx context.Context, nsInfos []types.NamespaceInfo) {
// Wait for initialization to complete
<-migratorInstance.ready
// Try to preload datasource cache
if preloadable, ok := migratorInstance.dsIndexProvider.(schemaversion.PreloadableCache); ok {
preloadable.Preload(ctx, nsInfos)
}
// Try to preload library element cache
if preloadable, ok := migratorInstance.leIndexProvider.(schemaversion.PreloadableCache); ok {
preloadable.Preload(ctx, nsInfos)
}
}
// PreloadCacheInBackground starts a goroutine that preloads the caches for the given namespaces.
func PreloadCacheInBackground(nsInfos []types.NamespaceInfo) {
go func() {
defer func() {
if r := recover(); r != nil {
logging.DefaultLogger.Error("panic during cache preloading", "error", r)
}
}()
PreloadCache(context.Background(), nsInfos)
}()
}
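The recover-inside-a-goroutine pattern used by `PreloadCacheInBackground` can be sketched in isolation with a stdlib-only example (the helper names here are illustrative, not part of the migration package): a panic during the warm-up is caught so it never crashes the process.

```go
package main

import (
	"fmt"
	"sync"
)

// warmInBackground runs fn on its own goroutine, recovering from any panic
// so a failed warm-up never takes down the process. The returned WaitGroup
// lets callers (e.g. tests) wait for completion; fire-and-forget callers,
// like PreloadCacheInBackground above, can simply ignore it.
func warmInBackground(fn func(), onPanic func(recovered any)) *sync.WaitGroup {
	var wg sync.WaitGroup
	wg.Add(1)
	go func() {
		defer wg.Done()
		// Deferred funcs run LIFO, so recover fires before wg.Done.
		defer func() {
			if r := recover(); r != nil {
				onPanic(r)
			}
		}()
		fn()
	}()
	return &wg
}

func main() {
	wg := warmInBackground(func() { panic("boom") }, func(r any) {
		fmt.Println("recovered:", r)
	})
	wg.Wait() // prints "recovered: boom"
}
```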
// Migrate migrates the given dashboard to the target version.
// This will block until the migrator is initialized.
func Migrate(ctx context.Context, dash map[string]interface{}, targetVersion int) error {
@@ -59,11 +93,15 @@ type migrator struct {
leIndexProvider schemaversion.LibraryElementIndexProvider
}
func (m *migrator) init(dsIndexProvider schemaversion.DataSourceIndexProvider, leIndexProvider schemaversion.LibraryElementIndexProvider) {
func (m *migrator) init(dsIndexProvider schemaversion.DataSourceIndexProvider, leIndexProvider schemaversion.LibraryElementIndexProvider, cacheTTL time.Duration) {
initOnce.Do(func() {
m.dsIndexProvider = dsIndexProvider
m.leIndexProvider = leIndexProvider
m.migrations = schemaversion.GetMigrations(dsIndexProvider, leIndexProvider)
// Wrap the provider with org-aware TTL caching for all conversions.
// This prevents repeated DB queries across multiple conversion calls while allowing
// the cache to refresh periodically, making it suitable for long-lived singleton usage.
m.dsIndexProvider = schemaversion.WrapIndexProviderWithCache(dsIndexProvider, cacheTTL)
// Wrap library element provider with caching as well
m.leIndexProvider = schemaversion.WrapLibraryElementProviderWithCache(leIndexProvider, cacheTTL)
m.migrations = schemaversion.GetMigrations(m.dsIndexProvider, m.leIndexProvider)
close(m.ready)
})
}


@@ -10,10 +10,13 @@ import (
"path/filepath"
"strconv"
"strings"
"sync/atomic"
"testing"
"github.com/prometheus/client_golang/prometheus"
"github.com/stretchr/testify/assert"
"github.com/stretchr/testify/require"
"k8s.io/apiserver/pkg/endpoints/request"
"github.com/grafana/grafana/apps/dashboard/pkg/migration/schemaversion"
migrationtestutil "github.com/grafana/grafana/apps/dashboard/pkg/migration/testutil"
@@ -31,7 +34,7 @@ func TestMigrate(t *testing.T) {
ResetForTesting()
dsProvider := migrationtestutil.NewDataSourceProvider(migrationtestutil.StandardTestConfig)
leProvider := migrationtestutil.NewLibraryElementProvider()
Initialize(dsProvider, leProvider)
Initialize(dsProvider, leProvider, DefaultCacheTTL)
t.Run("minimum version check", func(t *testing.T) {
err := Migrate(context.Background(), map[string]interface{}{
@@ -49,7 +52,7 @@ func TestMigrateSingleVersion(t *testing.T) {
// Use the same datasource provider as the frontend test to ensure consistency
dsProvider := migrationtestutil.NewDataSourceProvider(migrationtestutil.StandardTestConfig)
leProvider := migrationtestutil.NewLibraryElementProvider()
Initialize(dsProvider, leProvider)
Initialize(dsProvider, leProvider, DefaultCacheTTL)
runSingleVersionMigrationTests(t, SINGLE_VERSION_OUTPUT_DIR)
}
@@ -218,7 +221,7 @@ func TestSchemaMigrationMetrics(t *testing.T) {
// Initialize migration with test providers
dsProvider := migrationtestutil.NewDataSourceProvider(migrationtestutil.StandardTestConfig)
leProvider := migrationtestutil.NewLibraryElementProvider()
Initialize(dsProvider, leProvider)
Initialize(dsProvider, leProvider, DefaultCacheTTL)
// Create a test registry for metrics
registry := prometheus.NewRegistry()
@@ -304,7 +307,7 @@ func TestSchemaMigrationMetrics(t *testing.T) {
func TestSchemaMigrationLogging(t *testing.T) {
dsProvider := migrationtestutil.NewDataSourceProvider(migrationtestutil.StandardTestConfig)
leProvider := migrationtestutil.NewLibraryElementProvider()
Initialize(dsProvider, leProvider)
Initialize(dsProvider, leProvider, DefaultCacheTTL)
tests := []struct {
name string
@@ -423,7 +426,7 @@ func TestMigrateDevDashboards(t *testing.T) {
ResetForTesting()
dsProvider := migrationtestutil.NewDataSourceProvider(migrationtestutil.DevDashboardConfig)
leProvider := migrationtestutil.NewLibraryElementProvider()
Initialize(dsProvider, leProvider)
Initialize(dsProvider, leProvider, DefaultCacheTTL)
runDevDashboardMigrationTests(t, schemaversion.LATEST_VERSION, DEV_DASHBOARDS_OUTPUT_DIR)
}
@@ -449,3 +452,232 @@ func runDevDashboardMigrationTests(t *testing.T, targetVersion int, outputDir st
})
}
}
func TestMigrateWithCache(t *testing.T) {
// Reset the migration singleton before each test
ResetForTesting()
datasources := []schemaversion.DataSourceInfo{
{UID: "ds-uid-1", Type: "prometheus", Name: "Prometheus", Default: true, APIVersion: "v1"},
{UID: "ds-uid-2", Type: "loki", Name: "Loki", Default: false, APIVersion: "v1"},
{UID: "ds-uid-3", Type: "prometheus", Name: "Prometheus 2", Default: false, APIVersion: "v1"},
}
// Create a dashboard at schema version 32 for V33 and V36 migration with datasource references
dashboard1 := map[string]interface{}{
"schemaVersion": 32,
"title": "Test Dashboard 1",
"panels": []interface{}{
map[string]interface{}{
"id": 1,
"type": "timeseries",
// String datasource that V33 will migrate to object reference
"datasource": "Prometheus",
"targets": []interface{}{
map[string]interface{}{
"refId": "A",
"datasource": "Loki",
},
},
},
},
}
// Create a dashboard at schema version 35 for testing V36 migration with datasource references in annotations
dashboard2 := map[string]interface{}{
"schemaVersion": 35,
"title": "Test Dashboard 2",
"annotations": map[string]interface{}{
"list": []interface{}{
map[string]interface{}{
"name": "Test Annotation",
"datasource": "Prometheus 2", // String reference that V36 should convert
"enable": true,
},
},
},
}
t.Run("with datasources", func(t *testing.T) {
ResetForTesting()
dsProvider := newCountingProvider(datasources)
leProvider := newCountingLibraryProvider(nil)
// Initialize the migration system with our counting providers
Initialize(dsProvider, leProvider, DefaultCacheTTL)
// Verify initial call count is zero
assert.Equal(t, dsProvider.GetCallCount(), int64(0))
// Create a context with namespace (required for caching)
ctx := request.WithNamespace(context.Background(), "default")
// First migration - should invoke the provider once to build the cache
dash1 := deepCopyDashboard(dashboard1)
err := Migrate(ctx, dash1, schemaversion.LATEST_VERSION)
require.NoError(t, err)
assert.Equal(t, int64(1), dsProvider.GetCallCount())
// Verify datasource conversion from string to object reference
panels := dash1["panels"].([]interface{})
panel := panels[0].(map[string]interface{})
panelDS, ok := panel["datasource"].(map[string]interface{})
require.True(t, ok, "panel datasource should be converted to object")
assert.Equal(t, "ds-uid-1", panelDS["uid"])
assert.Equal(t, "prometheus", panelDS["type"])
// Verify target datasource conversion
targets := panel["targets"].([]interface{})
target := targets[0].(map[string]interface{})
targetDS, ok := target["datasource"].(map[string]interface{})
require.True(t, ok, "target datasource should be converted to object")
assert.Equal(t, "ds-uid-2", targetDS["uid"])
assert.Equal(t, "loki", targetDS["type"])
// Migration with V35 dashboard - should use the cached index from first migration
dash2 := deepCopyDashboard(dashboard2)
err = Migrate(ctx, dash2, schemaversion.LATEST_VERSION)
require.NoError(t, err, "second migration should succeed")
assert.Equal(t, int64(1), dsProvider.GetCallCount())
// Verify the annotation datasource was converted to object reference
annotations := dash2["annotations"].(map[string]interface{})
list := annotations["list"].([]interface{})
var testAnnotation map[string]interface{}
for _, a := range list {
ann := a.(map[string]interface{})
if ann["name"] == "Test Annotation" {
testAnnotation = ann
break
}
}
require.NotNil(t, testAnnotation, "Test Annotation should exist")
annotationDS, ok := testAnnotation["datasource"].(map[string]interface{})
require.True(t, ok, "annotation datasource should be converted to object")
assert.Equal(t, "ds-uid-3", annotationDS["uid"])
assert.Equal(t, "prometheus", annotationDS["type"])
})
// Tests that the cache isolates data per namespace
t.Run("with multiple orgs", func(t *testing.T) {
// Reset the migration singleton
ResetForTesting()
dsProvider := newCountingProvider(datasources)
leProvider := newCountingLibraryProvider(nil)
Initialize(dsProvider, leProvider, DefaultCacheTTL)
// Create contexts for different tenants using valid namespace formats
ctx1 := request.WithNamespace(context.Background(), "default") // org 1
ctx2 := request.WithNamespace(context.Background(), "stacks-2") // stack 2
// Migrate for org 1
err := Migrate(ctx1, deepCopyDashboard(dashboard1), schemaversion.LATEST_VERSION)
require.NoError(t, err)
callsAfterOrg1 := dsProvider.GetCallCount()
// Migrate for org 2 - should build separate cache
err = Migrate(ctx2, deepCopyDashboard(dashboard2), schemaversion.LATEST_VERSION)
require.NoError(t, err)
callsAfterOrg2 := dsProvider.GetCallCount()
assert.Greater(t, callsAfterOrg2, callsAfterOrg1,
"org 2 migration should have called provider (separate cache)")
// Migrate again for org 1 - should use cache
err = Migrate(ctx1, deepCopyDashboard(dashboard1), schemaversion.LATEST_VERSION)
require.NoError(t, err)
callsAfterOrg1Again := dsProvider.GetCallCount()
assert.Equal(t, callsAfterOrg2, callsAfterOrg1Again,
"second org 1 migration should use cache")
// Migrate again for org 2 - should use cache
err = Migrate(ctx2, deepCopyDashboard(dashboard1), schemaversion.LATEST_VERSION)
require.NoError(t, err)
callsAfterOrg2Again := dsProvider.GetCallCount()
assert.Equal(t, callsAfterOrg2, callsAfterOrg2Again,
"second org 2 migration should use cache")
})
}
// countingProvider wraps a datasource provider and counts calls to Index()
type countingProvider struct {
datasources []schemaversion.DataSourceInfo
callCount atomic.Int64
}
func newCountingProvider(datasources []schemaversion.DataSourceInfo) *countingProvider {
return &countingProvider{
datasources: datasources,
}
}
func (p *countingProvider) Index(_ context.Context) *schemaversion.DatasourceIndex {
p.callCount.Add(1)
return schemaversion.NewDatasourceIndex(p.datasources)
}
func (p *countingProvider) GetCallCount() int64 {
return p.callCount.Load()
}
// countingLibraryProvider wraps a library element provider and counts calls
type countingLibraryProvider struct {
elements []schemaversion.LibraryElementInfo
callCount atomic.Int64
}
func newCountingLibraryProvider(elements []schemaversion.LibraryElementInfo) *countingLibraryProvider {
return &countingLibraryProvider{
elements: elements,
}
}
func (p *countingLibraryProvider) GetLibraryElementInfo(_ context.Context) []schemaversion.LibraryElementInfo {
p.callCount.Add(1)
return p.elements
}
func (p *countingLibraryProvider) GetCallCount() int64 {
return p.callCount.Load()
}
// deepCopyDashboard creates a deep copy of a dashboard map
func deepCopyDashboard(dash map[string]interface{}) map[string]interface{} {
cpy := make(map[string]interface{})
for k, v := range dash {
switch val := v.(type) {
case []interface{}:
cpy[k] = deepCopySlice(val)
case map[string]interface{}:
cpy[k] = deepCopyMapForCache(val)
default:
cpy[k] = v
}
}
return cpy
}
func deepCopySlice(s []interface{}) []interface{} {
cpy := make([]interface{}, len(s))
for i, v := range s {
switch val := v.(type) {
case []interface{}:
cpy[i] = deepCopySlice(val)
case map[string]interface{}:
cpy[i] = deepCopyMapForCache(val)
default:
cpy[i] = v
}
}
return cpy
}
func deepCopyMapForCache(m map[string]interface{}) map[string]interface{} {
cpy := make(map[string]interface{})
for k, v := range m {
switch val := v.(type) {
case []interface{}:
cpy[k] = deepCopySlice(val)
case map[string]interface{}:
cpy[k] = deepCopyMapForCache(val)
default:
cpy[k] = v
}
}
return cpy
}
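As an aside, the recursive deep-copy helpers above could also be written as a JSON round-trip. A hedged sketch (with the caveat that JSON decoding normalizes every number to float64, which the manual version avoids):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// jsonDeepCopy deep-copies a dashboard map by marshalling to JSON and back.
// Simpler than the recursive helpers, but slower, and all numbers come back
// as float64 — tests comparing concrete int values need the manual version.
func jsonDeepCopy(dash map[string]interface{}) (map[string]interface{}, error) {
	b, err := json.Marshal(dash)
	if err != nil {
		return nil, err
	}
	var cpy map[string]interface{}
	err = json.Unmarshal(b, &cpy)
	return cpy, err
}

func main() {
	orig := map[string]interface{}{
		"title":  "Test",
		"panels": []interface{}{map[string]interface{}{"id": 1}},
	}
	cpy, _ := jsonDeepCopy(orig)
	cpy["panels"].([]interface{})[0].(map[string]interface{})["id"] = 99
	// Mutating the copy leaves the original untouched.
	fmt.Println(orig["panels"].([]interface{})[0].(map[string]interface{})["id"]) // prints 1
}
```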


@@ -0,0 +1,104 @@
package schemaversion
import (
"context"
"sync"
"time"
"github.com/grafana/authlib/types"
"github.com/grafana/grafana/pkg/infra/log"
"github.com/hashicorp/golang-lru/v2/expirable"
k8srequest "k8s.io/apiserver/pkg/endpoints/request"
"github.com/grafana/grafana/pkg/services/apiserver/endpoints/request"
)
const defaultCacheSize = 1000
// CacheProvider is a generic cache interface for schema version providers.
type CacheProvider[T any] interface {
// Get returns the cached value if it's still valid, otherwise calls fetch and caches the result.
Get(ctx context.Context) T
}
// PreloadableCache is an interface for providers that support preloading the cache.
type PreloadableCache interface {
// Preload loads data into the cache for the given namespaces.
Preload(ctx context.Context, nsInfos []types.NamespaceInfo)
}
// cachedProvider is a thread-safe TTL cache that wraps any fetch function.
type cachedProvider[T any] struct {
fetch func(context.Context) T
cache *expirable.LRU[string, T] // LRU cache: namespace to cache entry
inFlight sync.Map // map[string]*sync.Mutex - per-namespace fetch locks
logger log.Logger
}
// newCachedProvider creates a new cachedProvider.
// The fetch function should be able to handle context with different namespaces.
// A non-positive size turns the LRU mechanism off (the cache is unbounded).
// A non-positive cacheTTL disables TTL expiration.
func newCachedProvider[T any](fetch func(context.Context) T, size int, cacheTTL time.Duration, logger log.Logger) *cachedProvider[T] {
cacheProvider := &cachedProvider[T]{
fetch: fetch,
logger: logger,
}
cacheProvider.cache = expirable.NewLRU(size, func(key string, value T) {
cacheProvider.inFlight.Delete(key)
}, cacheTTL)
return cacheProvider
}
// Get returns the cached value if it's still valid, otherwise calls fetch and caches the result.
func (p *cachedProvider[T]) Get(ctx context.Context) T {
// Get namespace info from ctx
nsInfo, err := request.NamespaceInfoFrom(ctx, true)
if err != nil {
// No namespace, fall back to direct fetch call without caching
p.logger.Warn("Unable to get namespace info from context, skipping cache", "error", err)
return p.fetch(ctx)
}
namespace := nsInfo.Value
// Fast path: check if cache is still valid
if entry, ok := p.cache.Get(namespace); ok {
return entry
}
// Get or create a per-namespace lock for this fetch operation
// This ensures only one fetch happens per namespace at a time
lockInterface, _ := p.inFlight.LoadOrStore(namespace, &sync.Mutex{})
nsMutex := lockInterface.(*sync.Mutex)
// Lock this specific namespace - other namespaces can still proceed
nsMutex.Lock()
defer nsMutex.Unlock()
// Double-check: another goroutine might have already fetched while we waited
if entry, ok := p.cache.Get(namespace); ok {
return entry
}
// Fetch while holding only this namespace's lock; other namespaces proceed unblocked
p.logger.Debug("cache miss or expired, fetching new value", "namespace", namespace)
value := p.fetch(ctx)
// Update the cache for this namespace
p.cache.Add(namespace, value)
return value
}
// Preload loads data into the cache for the given namespaces.
func (p *cachedProvider[T]) Preload(ctx context.Context, nsInfos []types.NamespaceInfo) {
// Build the cache using a context with the namespace
p.logger.Info("preloading cache", "nsInfos", len(nsInfos))
startedAt := time.Now()
defer func() {
p.logger.Info("finished preloading cache", "nsInfos", len(nsInfos), "elapsed", time.Since(startedAt))
}()
for _, nsInfo := range nsInfos {
p.cache.Add(nsInfo.Value, p.fetch(k8srequest.WithNamespace(ctx, nsInfo.Value)))
}
}


@@ -0,0 +1,478 @@
package schemaversion
import (
"context"
"fmt"
"sync"
"sync/atomic"
"testing"
"time"
authlib "github.com/grafana/authlib/types"
"github.com/grafana/grafana/pkg/infra/log"
"github.com/stretchr/testify/assert"
"github.com/stretchr/testify/require"
"k8s.io/apiserver/pkg/endpoints/request"
)
// testProvider tracks how many times get() is called
type testProvider struct {
testData any
callCount atomic.Int64
}
func newTestProvider(testData any) *testProvider {
return &testProvider{
testData: testData,
}
}
func (p *testProvider) get(_ context.Context) any {
p.callCount.Add(1)
return p.testData
}
func (p *testProvider) getCallCount() int64 {
return p.callCount.Load()
}
func TestCachedProvider_CacheHit(t *testing.T) {
datasources := []DataSourceInfo{
{UID: "ds1", Type: "prometheus", Name: "Prometheus", Default: true},
{UID: "ds2", Type: "loki", Name: "Loki"},
}
underlying := newTestProvider(datasources)
// Test newCachedProvider directly instead of the wrapper
cached := newCachedProvider(underlying.get, defaultCacheSize, time.Minute, log.New("test"))
// Use "default" namespace (org 1) - this is the standard Grafana namespace format
ctx := request.WithNamespace(context.Background(), "default")
// First call should hit the underlying provider
idx1 := cached.Get(ctx)
require.NotNil(t, idx1)
assert.Equal(t, int64(1), underlying.getCallCount(), "first call should invoke underlying provider")
// Second call should use cache
idx2 := cached.Get(ctx)
require.NotNil(t, idx2)
assert.Equal(t, int64(1), underlying.getCallCount(), "second call should use cache, not invoke underlying provider")
// Both should return the same data
assert.Equal(t, idx1, idx2)
}
func TestCachedProvider_NamespaceIsolation(t *testing.T) {
datasources := []DataSourceInfo{
{UID: "ds1", Type: "prometheus", Name: "Prometheus", Default: true},
}
underlying := newTestProvider(datasources)
cached := newCachedProvider(underlying.get, defaultCacheSize, time.Minute, log.New("test"))
// Use "default" (org 1) and "org-2" (org 2) - standard Grafana namespace formats
ctx1 := request.WithNamespace(context.Background(), "default")
ctx2 := request.WithNamespace(context.Background(), "org-2")
// First call for org 1
idx1 := cached.Get(ctx1)
require.NotNil(t, idx1)
assert.Equal(t, int64(1), underlying.getCallCount(), "first org-1 call should invoke underlying provider")
// Call for org 2 should also invoke underlying provider (different namespace)
idx2 := cached.Get(ctx2)
require.NotNil(t, idx2)
assert.Equal(t, int64(2), underlying.getCallCount(), "org-2 call should invoke underlying provider (separate cache)")
// Second call for org 1 should use cache
idx3 := cached.Get(ctx1)
require.NotNil(t, idx3)
assert.Equal(t, int64(2), underlying.getCallCount(), "second org-1 call should use cache")
// Second call for org 2 should use cache
idx4 := cached.Get(ctx2)
require.NotNil(t, idx4)
assert.Equal(t, int64(2), underlying.getCallCount(), "second org-2 call should use cache")
}
func TestCachedProvider_NoNamespaceFallback(t *testing.T) {
datasources := []DataSourceInfo{
{UID: "ds1", Type: "prometheus", Name: "Prometheus", Default: true},
}
underlying := newTestProvider(datasources)
cached := newCachedProvider(underlying.get, defaultCacheSize, time.Minute, log.New("test"))
// Context without namespace - should fall back to direct provider call
ctx := context.Background()
idx1 := cached.Get(ctx)
require.NotNil(t, idx1)
assert.Equal(t, int64(1), underlying.getCallCount())
// Second call without namespace should also invoke underlying (no caching for unknown namespace)
idx2 := cached.Get(ctx)
require.NotNil(t, idx2)
assert.Equal(t, int64(2), underlying.getCallCount(), "without namespace, each call should invoke underlying provider")
}
func TestCachedProvider_ConcurrentAccess(t *testing.T) {
datasources := []DataSourceInfo{
{UID: "ds1", Type: "prometheus", Name: "Prometheus", Default: true},
}
underlying := newTestProvider(datasources)
cached := newCachedProvider(underlying.get, defaultCacheSize, time.Minute, log.New("test"))
// Use "default" namespace (org 1)
ctx := request.WithNamespace(context.Background(), "default")
var wg sync.WaitGroup
numGoroutines := 100
// Launch many goroutines that all try to access the cache simultaneously
for i := 0; i < numGoroutines; i++ {
wg.Add(1)
go func() {
defer wg.Done()
idx := cached.Get(ctx)
require.NotNil(t, idx)
}()
}
wg.Wait()
// Due to double-check locking, only 1 goroutine should have actually built the cache
// In practice, there might be a few more due to timing, but far fewer than numGoroutines
callCount := underlying.getCallCount()
assert.LessOrEqual(t, callCount, int64(5), "with proper locking, very few goroutines should invoke underlying provider; got %d", callCount)
}
func TestCachedProvider_ConcurrentNamespaces(t *testing.T) {
datasources := []DataSourceInfo{
{UID: "ds1", Type: "prometheus", Name: "Prometheus", Default: true},
}
underlying := newTestProvider(datasources)
cached := newCachedProvider(underlying.get, defaultCacheSize, time.Minute, log.New("test"))
var wg sync.WaitGroup
numOrgs := 10
callsPerOrg := 20
// Launch goroutines for multiple namespaces
// Use valid namespace formats: "default" for org 1, "org-N" for N > 1
namespaces := make([]string, numOrgs)
namespaces[0] = "default"
for i := 1; i < numOrgs; i++ {
namespaces[i] = fmt.Sprintf("org-%d", i+1)
}
for _, ns := range namespaces {
ctx := request.WithNamespace(context.Background(), ns)
for i := 0; i < callsPerOrg; i++ {
wg.Add(1)
go func(ctx context.Context) {
defer wg.Done()
idx := cached.Get(ctx)
require.NotNil(t, idx)
}(ctx)
}
}
wg.Wait()
// Each org should have at most a few calls (ideally 1, but timing can cause a few more)
callCount := underlying.getCallCount()
// With 10 orgs, we expect around 10 calls (one per org)
assert.LessOrEqual(t, callCount, int64(numOrgs), "expected roughly one call per org, got %d calls for %d orgs", callCount, numOrgs)
}
// Test that cache returns correct data for each namespace
func TestCachedProvider_CorrectDataPerNamespace(t *testing.T) {
// Provider that returns different data based on namespace
underlying := &namespaceAwareProvider{
datasourcesByNamespace: map[string][]DataSourceInfo{
"default": {{UID: "org1-ds", Type: "prometheus", Name: "Org1 DS", Default: true}},
"org-2": {{UID: "org2-ds", Type: "loki", Name: "Org2 DS", Default: true}},
},
}
cached := newCachedProvider(underlying.Index, defaultCacheSize, time.Minute, log.New("test"))
// Use valid namespace formats
ctx1 := request.WithNamespace(context.Background(), "default")
ctx2 := request.WithNamespace(context.Background(), "org-2")
idx1 := cached.Get(ctx1)
idx2 := cached.Get(ctx2)
assert.Equal(t, "org1-ds", idx1.GetDefault().UID, "org 1 should get org-1 datasources")
assert.Equal(t, "org2-ds", idx2.GetDefault().UID, "org 2 should get org-2 datasources")
// Subsequent calls should still return correct data
idx1Again := cached.Get(ctx1)
idx2Again := cached.Get(ctx2)
assert.Equal(t, "org1-ds", idx1Again.GetDefault().UID, "org 1 should still get org-1 datasources from cache")
assert.Equal(t, "org2-ds", idx2Again.GetDefault().UID, "org 2 should still get org-2 datasources from cache")
}
// TestCachedProvider_PreloadMultipleNamespaces verifies preloading multiple namespaces
func TestCachedProvider_PreloadMultipleNamespaces(t *testing.T) {
// Provider that returns different data based on namespace
underlying := &namespaceAwareProvider{
datasourcesByNamespace: map[string][]DataSourceInfo{
"default": {{UID: "org1-ds", Type: "prometheus", Name: "Org1 DS", Default: true}},
"org-2": {{UID: "org2-ds", Type: "loki", Name: "Org2 DS", Default: true}},
"org-3": {{UID: "org3-ds", Type: "tempo", Name: "Org3 DS", Default: true}},
},
}
cached := newCachedProvider(underlying.Index, defaultCacheSize, time.Minute, log.New("test"))
// Preload multiple namespaces
nsInfos := []authlib.NamespaceInfo{
createNamespaceInfo(1, 0, "default"),
createNamespaceInfo(2, 0, "org-2"),
createNamespaceInfo(3, 0, "org-3"),
}
cached.Preload(context.Background(), nsInfos)
// After preload, the underlying provider should have been called once per namespace
assert.Equal(t, 3, underlying.callCount, "preload should call underlying provider once per namespace")
// Access all namespaces - should use preloaded data and get correct data per namespace
expectedUIDs := map[string]string{
"default": "org1-ds",
"org-2": "org2-ds",
"org-3": "org3-ds",
}
for _, ns := range []string{"default", "org-2", "org-3"} {
ctx := request.WithNamespace(context.Background(), ns)
idx := cached.Get(ctx)
require.NotNil(t, idx, "index for namespace %s should not be nil", ns)
assert.Equal(t, expectedUIDs[ns], idx.GetDefault().UID, "namespace %s should get correct datasource", ns)
}
// The underlying provider should still have been called only 3 times (from preload)
assert.Equal(t, 3, underlying.callCount,
"access after preload should use cached data for all namespaces")
}
// namespaceAwareProvider returns different datasources based on namespace
type namespaceAwareProvider struct {
datasourcesByNamespace map[string][]DataSourceInfo
callCount int
}
func (p *namespaceAwareProvider) Index(ctx context.Context) *DatasourceIndex {
p.callCount++
ns := request.NamespaceValue(ctx)
if ds, ok := p.datasourcesByNamespace[ns]; ok {
return NewDatasourceIndex(ds)
}
return NewDatasourceIndex(nil)
}
// createNamespaceInfo creates a NamespaceInfo for testing
func createNamespaceInfo(orgID, stackID int64, value string) authlib.NamespaceInfo {
return authlib.NamespaceInfo{
OrgID: orgID,
StackID: stackID,
Value: value,
}
}
// Test DatasourceIndex functionality
func TestDatasourceIndex_Lookup(t *testing.T) {
datasources := []DataSourceInfo{
{UID: "ds-uid-1", Type: "prometheus", Name: "Prometheus DS", Default: true, APIVersion: "v1"},
{UID: "ds-uid-2", Type: "loki", Name: "Loki DS", Default: false, APIVersion: "v1"},
}
idx := NewDatasourceIndex(datasources)
t.Run("lookup by name", func(t *testing.T) {
ds := idx.Lookup("Prometheus DS")
require.NotNil(t, ds)
assert.Equal(t, "ds-uid-1", ds.UID)
})
t.Run("lookup by UID", func(t *testing.T) {
ds := idx.Lookup("ds-uid-2")
require.NotNil(t, ds)
assert.Equal(t, "Loki DS", ds.Name)
})
t.Run("lookup unknown returns nil", func(t *testing.T) {
ds := idx.Lookup("unknown")
assert.Nil(t, ds)
})
t.Run("get default", func(t *testing.T) {
ds := idx.GetDefault()
require.NotNil(t, ds)
assert.Equal(t, "ds-uid-1", ds.UID)
})
t.Run("lookup by UID directly", func(t *testing.T) {
ds := idx.LookupByUID("ds-uid-1")
require.NotNil(t, ds)
assert.Equal(t, "Prometheus DS", ds.Name)
})
t.Run("lookup by name directly", func(t *testing.T) {
ds := idx.LookupByName("Loki DS")
require.NotNil(t, ds)
assert.Equal(t, "ds-uid-2", ds.UID)
})
}
func TestDatasourceIndex_EmptyIndex(t *testing.T) {
idx := NewDatasourceIndex(nil)
assert.Nil(t, idx.GetDefault())
assert.Nil(t, idx.Lookup("anything"))
assert.Nil(t, idx.LookupByUID("anything"))
assert.Nil(t, idx.LookupByName("anything"))
}
// TestCachedProvider_TTLExpiration verifies that cache expires after TTL
func TestCachedProvider_TTLExpiration(t *testing.T) {
datasources := []DataSourceInfo{
{UID: "ds1", Type: "prometheus", Name: "Prometheus", Default: true},
}
underlying := newTestProvider(datasources)
// Use a very short TTL for testing
shortTTL := 50 * time.Millisecond
cached := newCachedProvider(underlying.get, defaultCacheSize, shortTTL, log.New("test"))
ctx := request.WithNamespace(context.Background(), "default")
// First call - should call underlying provider
idx1 := cached.Get(ctx)
require.NotNil(t, idx1)
assert.Equal(t, int64(1), underlying.getCallCount(), "first call should invoke underlying provider")
// Second call immediately - should use cache
idx2 := cached.Get(ctx)
require.NotNil(t, idx2)
assert.Equal(t, int64(1), underlying.getCallCount(), "second call should use cache")
// Wait for TTL to expire
time.Sleep(shortTTL + 20*time.Millisecond)
// Third call after TTL - should call underlying provider again
idx3 := cached.Get(ctx)
require.NotNil(t, idx3)
assert.Equal(t, int64(2), underlying.getCallCount(),
"after TTL expiration, underlying provider should be called again")
}
// TestCachedProvider_ParallelNamespacesFetch verifies that different namespaces can fetch in parallel
func TestCachedProvider_ParallelNamespacesFetch(t *testing.T) {
// Create a blocking provider that tracks concurrent executions
provider := &blockingProvider{
blockDuration: 100 * time.Millisecond,
datasources: []DataSourceInfo{
{UID: "ds1", Type: "prometheus", Name: "Prometheus", Default: true},
},
}
cached := newCachedProvider(provider.get, defaultCacheSize, time.Minute, log.New("test"))
numNamespaces := 5
var wg sync.WaitGroup
// Launch fetches for different namespaces simultaneously
startTime := time.Now()
for i := 0; i < numNamespaces; i++ {
wg.Add(1)
namespace := fmt.Sprintf("org-%d", i+1)
go func(ns string) {
defer wg.Done()
ctx := request.WithNamespace(context.Background(), ns)
idx := cached.Get(ctx)
require.NotNil(t, idx)
}(namespace)
}
wg.Wait()
elapsed := time.Since(startTime)
// Verify that all namespaces were called
assert.Equal(t, int64(numNamespaces), provider.callCount.Load())
// Verify max concurrent executions shows parallelism
maxConcurrent := provider.maxConcurrent.Load()
assert.Equal(t, int64(numNamespaces), maxConcurrent)
// If all namespaces had to wait sequentially, it would take numNamespaces * blockDuration
// With parallelism, it should be much faster (close to just blockDuration)
sequentialTime := time.Duration(numNamespaces) * provider.blockDuration
assert.Less(t, elapsed, sequentialTime)
}
// TestCachedProvider_SameNamespaceSerialFetch verifies that the same namespace doesn't fetch concurrently
func TestCachedProvider_SameNamespaceSerialFetch(t *testing.T) {
// Create a blocking provider that tracks concurrent executions
provider := &blockingProvider{
blockDuration: 100 * time.Millisecond,
datasources: []DataSourceInfo{
{UID: "ds1", Type: "prometheus", Name: "Prometheus", Default: true},
},
}
cached := newCachedProvider(provider.get, defaultCacheSize, time.Minute, log.New("test"))
numGoroutines := 10
var wg sync.WaitGroup
// Launch multiple fetches for the SAME namespace simultaneously
ctx := request.WithNamespace(context.Background(), "default")
for i := 0; i < numGoroutines; i++ {
wg.Add(1)
go func() {
defer wg.Done()
idx := cached.Get(ctx)
require.NotNil(t, idx)
}()
}
wg.Wait()
// Max concurrent should be 1 since all goroutines are for the same namespace
maxConcurrent := provider.maxConcurrent.Load()
assert.Equal(t, int64(1), maxConcurrent)
}
// blockingProvider is a test provider that simulates slow fetch operations
// and tracks concurrent executions
type blockingProvider struct {
blockDuration time.Duration
datasources []DataSourceInfo
callCount atomic.Int64
currentActive atomic.Int64
maxConcurrent atomic.Int64
}
func (p *blockingProvider) get(_ context.Context) any {
p.callCount.Add(1)
// Track concurrent executions
current := p.currentActive.Add(1)
// Update max concurrent if this is a new peak
for {
maxVal := p.maxConcurrent.Load()
if current <= maxVal {
break
}
if p.maxConcurrent.CompareAndSwap(maxVal, current) {
break
}
}
// Simulate slow operation
time.Sleep(p.blockDuration)
p.currentActive.Add(-1)
return p.datasources
}


@@ -2,8 +2,9 @@ package schemaversion
import (
"context"
"sync"
"time"
"github.com/grafana/grafana/pkg/infra/log"
)
// Shared utility functions for datasource migrations across different schema versions.
@@ -11,65 +12,41 @@ import (
// string names/UIDs to structured reference objects with uid, type, and apiVersion.
// cachedIndexProvider wraps a DataSourceIndexProvider with time-based caching.
// This prevents multiple DB queries and index builds during operations that may call
// provider.Index() multiple times (e.g., dashboard conversions with many datasource lookups).
// The cache expires after 10 seconds, allowing it to be used as a long-lived singleton
// while still refreshing periodically.
//
// Thread-safe: Uses sync.RWMutex to guarantee safe concurrent access.
type cachedIndexProvider struct {
provider DataSourceIndexProvider
mu sync.RWMutex
index *DatasourceIndex
cachedAt time.Time
cacheTTL time.Duration
*cachedProvider[*DatasourceIndex]
}
// Index returns the cached index if it's still valid (< 10s old), otherwise rebuilds it.
// Uses RWMutex for efficient concurrent reads when cache is valid.
// Index returns the cached index if it's still valid (< TTL old), otherwise rebuilds it.
func (p *cachedIndexProvider) Index(ctx context.Context) *DatasourceIndex {
// Fast path: check if cache is still valid using read lock
p.mu.RLock()
if p.index != nil && time.Since(p.cachedAt) < p.cacheTTL {
idx := p.index
p.mu.RUnlock()
return idx
}
p.mu.RUnlock()
// Slow path: cache expired or not yet built, acquire write lock
p.mu.Lock()
defer p.mu.Unlock()
// Double-check: another goroutine might have refreshed the cache
// while we were waiting for the write lock
if p.index != nil && time.Since(p.cachedAt) < p.cacheTTL {
return p.index
}
// Rebuild the cache
p.index = p.provider.Index(ctx)
p.cachedAt = time.Now()
return p.index
return p.Get(ctx)
}
// WrapIndexProviderWithCache wraps a provider to cache the index with a 10-second TTL.
// Useful for conversions or migrations that may call provider.Index() multiple times.
// The cache expires after 10 seconds, making it suitable for use as a long-lived singleton
// at the top level of dependency injection while still refreshing periodically.
//
// Example usage in dashboard conversion:
//
// cachedDsIndexProvider := schemaversion.WrapIndexProviderWithCache(dsIndexProvider)
// // Now all calls to cachedDsIndexProvider.Index(ctx) return the same cached index
// // for up to 10 seconds before refreshing
func WrapIndexProviderWithCache(provider DataSourceIndexProvider) DataSourceIndexProvider {
if provider == nil {
return nil
// cachedLibraryElementProvider wraps a LibraryElementIndexProvider with time-based caching.
type cachedLibraryElementProvider struct {
*cachedProvider[[]LibraryElementInfo]
}
func (p *cachedLibraryElementProvider) GetLibraryElementInfo(ctx context.Context) []LibraryElementInfo {
return p.Get(ctx)
}
// WrapIndexProviderWithCache wraps a DataSourceIndexProvider to cache indexes with a configurable TTL.
func WrapIndexProviderWithCache(provider DataSourceIndexProvider, cacheTTL time.Duration) DataSourceIndexProvider {
if provider == nil || cacheTTL <= 0 {
return provider
}
return &cachedIndexProvider{
provider: provider,
cacheTTL: 10 * time.Second,
newCachedProvider[*DatasourceIndex](provider.Index, defaultCacheSize, cacheTTL, log.New("schemaversion.dsindexprovider")),
}
}
// WrapLibraryElementProviderWithCache wraps a LibraryElementIndexProvider to cache library elements with a configurable TTL.
func WrapLibraryElementProviderWithCache(provider LibraryElementIndexProvider, cacheTTL time.Duration) LibraryElementIndexProvider {
if provider == nil || cacheTTL <= 0 {
return provider
}
return &cachedLibraryElementProvider{
newCachedProvider[[]LibraryElementInfo](provider.GetLibraryElementInfo, defaultCacheSize, cacheTTL, log.New("schemaversion.leindexprovider")),
}
}
@@ -216,60 +193,3 @@ func MigrateDatasourceNameToRef(nameOrRef interface{}, options map[string]bool,
return nil
}
// cachedLibraryElementProvider wraps a LibraryElementIndexProvider with time-based caching.
// This prevents multiple DB queries during operations that may call GetLibraryElementInfo()
// multiple times (e.g., dashboard conversions with many library panel lookups).
// The cache expires after 10 seconds, allowing it to be used as a long-lived singleton
// while still refreshing periodically.
//
// Thread-safe: Uses sync.RWMutex to guarantee safe concurrent access.
type cachedLibraryElementProvider struct {
provider LibraryElementIndexProvider
mu sync.RWMutex
elements []LibraryElementInfo
cachedAt time.Time
cacheTTL time.Duration
}
// GetLibraryElementInfo returns the cached library elements if they're still valid (< 10s old), otherwise rebuilds the cache.
// Uses RWMutex for efficient concurrent reads when cache is valid.
func (p *cachedLibraryElementProvider) GetLibraryElementInfo(ctx context.Context) []LibraryElementInfo {
// Fast path: check if cache is still valid using read lock
p.mu.RLock()
if p.elements != nil && time.Since(p.cachedAt) < p.cacheTTL {
elements := p.elements
p.mu.RUnlock()
return elements
}
p.mu.RUnlock()
// Slow path: cache expired or not yet built, acquire write lock
p.mu.Lock()
defer p.mu.Unlock()
// Double-check: another goroutine might have refreshed the cache
// while we were waiting for the write lock
if p.elements != nil && time.Since(p.cachedAt) < p.cacheTTL {
return p.elements
}
// Rebuild the cache
p.elements = p.provider.GetLibraryElementInfo(ctx)
p.cachedAt = time.Now()
return p.elements
}
// WrapLibraryElementProviderWithCache wraps a provider to cache library elements with a 10-second TTL.
// Useful for conversions or migrations that may call GetLibraryElementInfo() multiple times.
// The cache expires after 10 seconds, making it suitable for use as a long-lived singleton
// at the top level of dependency injection while still refreshing periodically.
func WrapLibraryElementProviderWithCache(provider LibraryElementIndexProvider) LibraryElementIndexProvider {
if provider == nil {
return nil
}
return &cachedLibraryElementProvider{
provider: provider,
cacheTTL: 10 * time.Second,
}
}

View File

@@ -76,9 +76,9 @@
"barGlow": false,
"centerGlow": false,
"rounded": true,
"spotlight": false
"spotlight": false,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -155,9 +155,9 @@
"barGlow": false,
"centerGlow": true,
"rounded": true,
"spotlight": false
"spotlight": false,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -234,9 +234,9 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": false
"spotlight": false,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -313,9 +313,9 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": true
"spotlight": true,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -392,9 +392,9 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": true
"spotlight": true,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -471,9 +471,9 @@
"barGlow": true,
"centerGlow": true,
"rounded": false,
"spotlight": true
"spotlight": true,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -550,9 +550,9 @@
"barGlow": true,
"centerGlow": true,
"rounded": false,
"spotlight": true
"spotlight": true,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -642,9 +642,9 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": true
"spotlight": true,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -721,9 +721,9 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": true
"spotlight": true,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -800,9 +800,9 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": true
"spotlight": true,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -879,9 +879,9 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": true
"spotlight": true,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -975,9 +975,9 @@
"barGlow": false,
"centerGlow": false,
"rounded": false,
"spotlight": false
"spotlight": false,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1054,9 +1054,9 @@
"barGlow": false,
"centerGlow": false,
"rounded": false,
"spotlight": false
"spotlight": false,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1133,9 +1133,9 @@
"barGlow": false,
"centerGlow": false,
"rounded": false,
"spotlight": false
"spotlight": false,
"gradient": true
},
"gradient": "auto",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1212,9 +1212,9 @@
"barGlow": false,
"centerGlow": false,
"rounded": false,
"spotlight": false
"spotlight": false,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1291,9 +1291,9 @@
"barGlow": false,
"centerGlow": false,
"rounded": false,
"spotlight": false
"spotlight": false,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1387,9 +1387,9 @@
"barGlow": false,
"centerGlow": false,
"rounded": false,
"spotlight": false
"spotlight": false,
"gradient": true
},
"gradient": "auto",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1470,9 +1470,9 @@
"barGlow": false,
"centerGlow": false,
"rounded": false,
"spotlight": false
"spotlight": false,
"gradient": true
},
"gradient": "auto",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1553,9 +1553,9 @@
"barGlow": false,
"centerGlow": false,
"rounded": false,
"spotlight": false
"spotlight": false,
"gradient": true
},
"gradient": "auto",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1645,10 +1645,10 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": true
"spotlight": true,
"gradient": true
},
"glow": "both",
"gradient": "auto",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1731,10 +1731,10 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": true
"spotlight": true,
"gradient": true
},
"glow": "both",
"gradient": "scheme",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1831,10 +1831,10 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": true
"spotlight": true,
"gradient": true
},
"glow": "both",
"gradient": "scheme",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1918,11 +1918,10 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"sparkline": false,
"spotlight": true
"spotlight": true,
"gradient": true
},
"glow": "both",
"gradient": "scheme",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -2005,10 +2004,10 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": true
"spotlight": true,
"gradient": true
},
"glow": "both",
"gradient": "hue",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -2091,10 +2090,10 @@
"barGlow": true,
"centerGlow": true,
"rounded": true,
"spotlight": true
"spotlight": true,
"gradient": true
},
"glow": "both",
"gradient": "hue",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -2147,4 +2146,4 @@
"title": "Panel tests - Gauge (new)",
"uid": "panel-tests-gauge-new",
"weekStart": ""
}
}

View File

@@ -956,9 +956,9 @@
"barGlow": false,
"centerGlow": false,
"rounded": false,
"spotlight": false
"spotlight": false,
"gradient": false
},
"gradient": "none",
"orientation": "auto",
"reduceOptions": {
"calcs": [
@@ -1162,4 +1162,4 @@
"title": "Panel tests - Old gauge to new",
"uid": "panel-tests-old-gauge-to-new",
"weekStart": ""
}
}

View File

@@ -34,7 +34,7 @@ manifest: {
v0alpha1: {
kinds: [examplev0alpha1]
// This is explicitly set to false to keep the example app disabled by default.
// It can be enabled via conf overrides, or by setting this value to true and regenerating.
served: false
}
@@ -48,14 +48,14 @@ v1alpha1: {
// served indicates whether this particular version is served by the API server.
// served should be set to false before a version is removed from the manifest entirely.
// served defaults to true if not present.
// This is explicitly set to false to keep the example app disabled by default.
// It can be enabled via conf overrides, or by setting this value to true and regenerating.
served: false
// routes contains resource routes for the version, which are split into 'namespaced' and 'cluster' scoped routes.
// This allows you to add additional non-storage- and non-kind- based handlers for your app.
// These should only be used if the behavior cannot be accomplished by reconciliation on storage events or subresource routes on a kind.
routes: {
// namespaced contains namespace-scoped resource routes for the version,
// which are exposed as HTTP handlers on '<version>/namespaces/<namespace>/<route>'.
namespaced: {
"/something": {
@@ -72,7 +72,7 @@ v1alpha1: {
}
}
}
// cluster contains cluster-scoped resource routes for the version,
// which are exposed as HTTP handlers on '<version>/<route>'.
cluster: {
"/other": {
@@ -113,4 +113,4 @@ v1alpha1: {
enabled: true
}
}
}
}

View File

@@ -499,8 +499,8 @@ github.com/dustin/go-humanize v1.0.1/go.mod h1:Mu1zIs6XwVuF/gI1OepvI0qD18qycQx+m
github.com/eapache/go-resiliency v1.1.0/go.mod h1:kFI+JgMyC7bLPUVY133qvEBtVayf5mFgVsvEsIPBvNs=
github.com/eapache/go-xerial-snappy v0.0.0-20180814174437-776d5712da21/go.mod h1:+020luEh2TKB4/GOp8oxxtq0Daoen/Cii55CzbTV6DU=
github.com/eapache/queue v1.1.0/go.mod h1:6eCeP0CKFpHLu8blIFXhExK/dRa7WDZfr6jVFPTqq+I=
github.com/ebitengine/purego v0.8.4 h1:CF7LEKg5FFOsASUj0+QwaXf8Ht6TlFxg09+S9wz0omw=
github.com/ebitengine/purego v0.8.4/go.mod h1:iIjxzd6CiRiOG0UyXP+V1+jWqUXVjPKLAI0mRfJZTmQ=
github.com/ebitengine/purego v0.8.2 h1:jPPGWs2sZ1UgOSgD2bClL0MJIqu58nOmIcBuXr62z1I=
github.com/ebitengine/purego v0.8.2/go.mod h1:iIjxzd6CiRiOG0UyXP+V1+jWqUXVjPKLAI0mRfJZTmQ=
github.com/edsrzf/mmap-go v0.0.0-20170320065105-0bce6a688712/go.mod h1:YO35OhQPt3KJa3ryjFM5Bs14WD66h8eGKpfaBNrHW5M=
github.com/edsrzf/mmap-go v1.2.0 h1:hXLYlkbaPzt1SaQk+anYwKSRNhufIDCchSPkUD6dD84=
github.com/edsrzf/mmap-go v1.2.0/go.mod h1:19H/e8pUPLicwkyNgOykDXkJ9F0MHE+Z52B8EIth78Q=
@@ -853,6 +853,8 @@ github.com/grafana/grafana-plugin-sdk-go v0.284.0 h1:1bK7eWsnPBLUWDcWJWe218Ik5ad
github.com/grafana/grafana-plugin-sdk-go v0.284.0/go.mod h1:lHPniaSxq3SL5MxDIPy04TYB1jnTp/ivkYO+xn5Rz3E=
github.com/grafana/grafana/apps/example v0.0.0-20251027162426-edef69fdc82b h1:6Bo65etvjQ4tStkaA5+N3A3ENbO4UAWj53TxF6g2Hdk=
github.com/grafana/grafana/apps/example v0.0.0-20251027162426-edef69fdc82b/go.mod h1:6+wASOCN8LWt6FJ8dc0oODUBIEY5XHaE6ABi8g0mR+k=
github.com/grafana/grafana/apps/quotas v0.0.0-20251209183543-1013d74f13f2 h1:rDPMdshj3QMvpXn+wK4T8awF9n2sd8i4YRiGqX2xTvg=
github.com/grafana/grafana/apps/quotas v0.0.0-20251209183543-1013d74f13f2/go.mod h1:M7bV60iRB61y0ISPG1HX/oNLZtlh0ZF22rUYwNkAKjo=
github.com/grafana/grafana/pkg/promlib v0.0.8 h1:VUWsqttdf0wMI4j9OX9oNrykguQpZcruudDAFpJJVw0=
github.com/grafana/grafana/pkg/promlib v0.0.8/go.mod h1:U1ezG/MGaEPoThqsr3lymMPN5yIPdVTJnDZ+wcXT+ao=
github.com/grafana/grafana/pkg/semconv v0.0.0-20250804150913-990f1c69ecc2 h1:A65jWgLk4Re28gIuZcpC0aTh71JZ0ey89hKGE9h543s=
@@ -1416,8 +1418,8 @@ github.com/sethvargo/go-retry v0.3.0 h1:EEt31A35QhrcRZtrYFDTBg91cqZVnFL2navjDrah
github.com/sethvargo/go-retry v0.3.0/go.mod h1:mNX17F0C/HguQMyMyJxcnU471gOZGxCLyYaFyAZraas=
github.com/shadowspore/fossil-delta v0.0.0-20241213113458-1d797d70cbe3 h1:/4/IJi5iyTdh6mqOUaASW148HQpujYiHl0Wl78dSOSc=
github.com/shadowspore/fossil-delta v0.0.0-20241213113458-1d797d70cbe3/go.mod h1:aJIMhRsunltJR926EB2MUg8qHemFQDreSB33pyto2Ps=
github.com/shirou/gopsutil/v4 v4.25.6 h1:kLysI2JsKorfaFPcYmcJqbzROzsBWEOAtw6A7dIfqXs=
github.com/shirou/gopsutil/v4 v4.25.6/go.mod h1:PfybzyydfZcN+JMMjkF6Zb8Mq1A/VcogFFg7hj50W9c=
github.com/shirou/gopsutil/v4 v4.25.3 h1:SeA68lsu8gLggyMbmCn8cmp97V1TI9ld9sVzAUcKcKE=
github.com/shirou/gopsutil/v4 v4.25.3/go.mod h1:xbuxyoZj+UsgnZrENu3lQivsngRR5BdjbJwf2fv4szA=
github.com/shopspring/decimal v0.0.0-20180709203117-cd690d0c9e24/go.mod h1:M+9NzErvs504Cn4c5DxATwIqPbtswREoFCre64PpcG4=
github.com/shopspring/decimal v1.4.0 h1:bxl37RwXBklmTi0C79JfXCEBD1cqqHt0bbgBAGFp81k=
github.com/shopspring/decimal v1.4.0/go.mod h1:gawqmDU56v4yIKSwfBSFip1HdCCXN8/+DMd9qYNcwME=

View File

@@ -23,6 +23,12 @@ type CoreRole struct {
Spec CoreRoleSpec `json:"spec" yaml:"spec"`
}
func NewCoreRole() *CoreRole {
return &CoreRole{
Spec: *NewCoreRoleSpec(),
}
}
func (o *CoreRole) GetSpec() any {
return o.Spec
}
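This generated constructor, used below in place of a `&CoreRole{}` literal when registering the schema, means the example object carries an initialized spec rather than zero values. A minimal sketch of why that matters — the types here are hypothetical stand-ins, not the real generated IAM kinds:

```go
package main

import "fmt"

// Hypothetical stand-in for a generated spec type.
type CoreRoleSpec struct {
	Permissions []string
}

// NewCoreRoleSpec mirrors a generated spec constructor that populates
// defaults (e.g. empty slices) instead of leaving zero values.
func NewCoreRoleSpec() *CoreRoleSpec {
	return &CoreRoleSpec{Permissions: []string{}}
}

// Hypothetical stand-in for the generated kind wrapper.
type CoreRole struct {
	Spec CoreRoleSpec
}

// NewCoreRole builds the object from its spec constructor, so callers
// that register the schema get defaults rather than a bare zero value.
func NewCoreRole() *CoreRole {
	return &CoreRole{Spec: *NewCoreRoleSpec()}
}

func main() {
	zero := &CoreRole{}
	built := NewCoreRole()
	// The literal leaves the slice nil; the constructor initializes it.
	fmt.Println(zero.Spec.Permissions == nil, built.Spec.Permissions == nil) // true false
}
```

The same literal-to-constructor swap repeats for every kind in this diff (ExternalGroupMapping, GlobalRole, Role, Team, User, and the rest), all following this one pattern.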

View File

@@ -10,7 +10,7 @@ import (
// schema is unexported to prevent accidental overwrites
var (
schemaCoreRole = resource.NewSimpleSchema("iam.grafana.app", "v0alpha1", &CoreRole{}, &CoreRoleList{}, resource.WithKind("CoreRole"),
schemaCoreRole = resource.NewSimpleSchema("iam.grafana.app", "v0alpha1", NewCoreRole(), &CoreRoleList{}, resource.WithKind("CoreRole"),
resource.WithPlural("coreroles"), resource.WithScope(resource.NamespacedScope))
kindCoreRole = resource.Kind{
Schema: schemaCoreRole,

View File

@@ -23,6 +23,12 @@ type ExternalGroupMapping struct {
Spec ExternalGroupMappingSpec `json:"spec" yaml:"spec"`
}
func NewExternalGroupMapping() *ExternalGroupMapping {
return &ExternalGroupMapping{
Spec: *NewExternalGroupMappingSpec(),
}
}
func (o *ExternalGroupMapping) GetSpec() any {
return o.Spec
}

View File

@@ -10,7 +10,7 @@ import (
// schema is unexported to prevent accidental overwrites
var (
schemaExternalGroupMapping = resource.NewSimpleSchema("iam.grafana.app", "v0alpha1", &ExternalGroupMapping{}, &ExternalGroupMappingList{}, resource.WithKind("ExternalGroupMapping"),
schemaExternalGroupMapping = resource.NewSimpleSchema("iam.grafana.app", "v0alpha1", NewExternalGroupMapping(), &ExternalGroupMappingList{}, resource.WithKind("ExternalGroupMapping"),
resource.WithPlural("externalgroupmappings"), resource.WithScope(resource.NamespacedScope))
kindExternalGroupMapping = resource.Kind{
Schema: schemaExternalGroupMapping,

View File

@@ -23,6 +23,12 @@ type GlobalRole struct {
Spec GlobalRoleSpec `json:"spec" yaml:"spec"`
}
func NewGlobalRole() *GlobalRole {
return &GlobalRole{
Spec: *NewGlobalRoleSpec(),
}
}
func (o *GlobalRole) GetSpec() any {
return o.Spec
}

View File

@@ -10,7 +10,7 @@ import (
// schema is unexported to prevent accidental overwrites
var (
schemaGlobalRole = resource.NewSimpleSchema("iam.grafana.app", "v0alpha1", &GlobalRole{}, &GlobalRoleList{}, resource.WithKind("GlobalRole"),
schemaGlobalRole = resource.NewSimpleSchema("iam.grafana.app", "v0alpha1", NewGlobalRole(), &GlobalRoleList{}, resource.WithKind("GlobalRole"),
resource.WithPlural("globalroles"), resource.WithScope(resource.NamespacedScope))
kindGlobalRole = resource.Kind{
Schema: schemaGlobalRole,

View File

@@ -23,6 +23,12 @@ type GlobalRoleBinding struct {
Spec GlobalRoleBindingSpec `json:"spec" yaml:"spec"`
}
func NewGlobalRoleBinding() *GlobalRoleBinding {
return &GlobalRoleBinding{
Spec: *NewGlobalRoleBindingSpec(),
}
}
func (o *GlobalRoleBinding) GetSpec() any {
return o.Spec
}

View File

@@ -10,7 +10,7 @@ import (
// schema is unexported to prevent accidental overwrites
var (
schemaGlobalRoleBinding = resource.NewSimpleSchema("iam.grafana.app", "v0alpha1", &GlobalRoleBinding{}, &GlobalRoleBindingList{}, resource.WithKind("GlobalRoleBinding"),
schemaGlobalRoleBinding = resource.NewSimpleSchema("iam.grafana.app", "v0alpha1", NewGlobalRoleBinding(), &GlobalRoleBindingList{}, resource.WithKind("GlobalRoleBinding"),
resource.WithPlural("globalrolebindings"), resource.WithScope(resource.NamespacedScope))
kindGlobalRoleBinding = resource.Kind{
Schema: schemaGlobalRoleBinding,

View File

@@ -23,6 +23,12 @@ type ResourcePermission struct {
Spec ResourcePermissionSpec `json:"spec" yaml:"spec"`
}
func NewResourcePermission() *ResourcePermission {
return &ResourcePermission{
Spec: *NewResourcePermissionSpec(),
}
}
func (o *ResourcePermission) GetSpec() any {
return o.Spec
}

View File

@@ -10,7 +10,7 @@ import (
// schema is unexported to prevent accidental overwrites
var (
schemaResourcePermission = resource.NewSimpleSchema("iam.grafana.app", "v0alpha1", &ResourcePermission{}, &ResourcePermissionList{}, resource.WithKind("ResourcePermission"),
schemaResourcePermission = resource.NewSimpleSchema("iam.grafana.app", "v0alpha1", NewResourcePermission(), &ResourcePermissionList{}, resource.WithKind("ResourcePermission"),
resource.WithPlural("resourcepermissions"), resource.WithScope(resource.NamespacedScope))
kindResourcePermission = resource.Kind{
Schema: schemaResourcePermission,

View File

@@ -23,6 +23,12 @@ type Role struct {
Spec RoleSpec `json:"spec" yaml:"spec"`
}
func NewRole() *Role {
return &Role{
Spec: *NewRoleSpec(),
}
}
func (o *Role) GetSpec() any {
return o.Spec
}

View File

@@ -10,7 +10,7 @@ import (
// schema is unexported to prevent accidental overwrites
var (
schemaRole = resource.NewSimpleSchema("iam.grafana.app", "v0alpha1", &Role{}, &RoleList{}, resource.WithKind("Role"),
schemaRole = resource.NewSimpleSchema("iam.grafana.app", "v0alpha1", NewRole(), &RoleList{}, resource.WithKind("Role"),
resource.WithPlural("roles"), resource.WithScope(resource.NamespacedScope))
kindRole = resource.Kind{
Schema: schemaRole,

View File

@@ -23,6 +23,12 @@ type RoleBinding struct {
Spec RoleBindingSpec `json:"spec" yaml:"spec"`
}
func NewRoleBinding() *RoleBinding {
return &RoleBinding{
Spec: *NewRoleBindingSpec(),
}
}
func (o *RoleBinding) GetSpec() any {
return o.Spec
}

View File

@@ -10,7 +10,7 @@ import (
// schema is unexported to prevent accidental overwrites
var (
schemaRoleBinding = resource.NewSimpleSchema("iam.grafana.app", "v0alpha1", &RoleBinding{}, &RoleBindingList{}, resource.WithKind("RoleBinding"),
schemaRoleBinding = resource.NewSimpleSchema("iam.grafana.app", "v0alpha1", NewRoleBinding(), &RoleBindingList{}, resource.WithKind("RoleBinding"),
resource.WithPlural("rolebindings"), resource.WithScope(resource.NamespacedScope))
kindRoleBinding = resource.Kind{
Schema: schemaRoleBinding,

View File

@@ -23,6 +23,12 @@ type ServiceAccount struct {
Spec ServiceAccountSpec `json:"spec" yaml:"spec"`
}
func NewServiceAccount() *ServiceAccount {
return &ServiceAccount{
Spec: *NewServiceAccountSpec(),
}
}
func (o *ServiceAccount) GetSpec() any {
return o.Spec
}

View File

@@ -10,7 +10,7 @@ import (
// schema is unexported to prevent accidental overwrites
var (
schemaServiceAccount = resource.NewSimpleSchema("iam.grafana.app", "v0alpha1", &ServiceAccount{}, &ServiceAccountList{}, resource.WithKind("ServiceAccount"),
schemaServiceAccount = resource.NewSimpleSchema("iam.grafana.app", "v0alpha1", NewServiceAccount(), &ServiceAccountList{}, resource.WithKind("ServiceAccount"),
resource.WithPlural("serviceaccounts"), resource.WithScope(resource.NamespacedScope))
kindServiceAccount = resource.Kind{
Schema: schemaServiceAccount,

View File

@@ -23,6 +23,12 @@ type Team struct {
Spec TeamSpec `json:"spec" yaml:"spec"`
}
func NewTeam() *Team {
return &Team{
Spec: *NewTeamSpec(),
}
}
func (o *Team) GetSpec() any {
return o.Spec
}

View File

@@ -10,7 +10,7 @@ import (
// schema is unexported to prevent accidental overwrites
var (
schemaTeam = resource.NewSimpleSchema("iam.grafana.app", "v0alpha1", &Team{}, &TeamList{}, resource.WithKind("Team"),
schemaTeam = resource.NewSimpleSchema("iam.grafana.app", "v0alpha1", NewTeam(), &TeamList{}, resource.WithKind("Team"),
resource.WithPlural("teams"), resource.WithScope(resource.NamespacedScope))
kindTeam = resource.Kind{
Schema: schemaTeam,

View File

@@ -23,6 +23,12 @@ type TeamBinding struct {
Spec TeamBindingSpec `json:"spec" yaml:"spec"`
}
func NewTeamBinding() *TeamBinding {
return &TeamBinding{
Spec: *NewTeamBindingSpec(),
}
}
func (o *TeamBinding) GetSpec() any {
return o.Spec
}

View File

@@ -10,7 +10,7 @@ import (
// schema is unexported to prevent accidental overwrites
var (
schemaTeamBinding = resource.NewSimpleSchema("iam.grafana.app", "v0alpha1", &TeamBinding{}, &TeamBindingList{}, resource.WithKind("TeamBinding"),
schemaTeamBinding = resource.NewSimpleSchema("iam.grafana.app", "v0alpha1", NewTeamBinding(), &TeamBindingList{}, resource.WithKind("TeamBinding"),
resource.WithPlural("teambindings"), resource.WithScope(resource.NamespacedScope))
kindTeamBinding = resource.Kind{
Schema: schemaTeamBinding,

View File

@@ -23,6 +23,12 @@ type User struct {
Spec UserSpec `json:"spec" yaml:"spec"`
}
func NewUser() *User {
return &User{
Spec: *NewUserSpec(),
}
}
func (o *User) GetSpec() any {
return o.Spec
}

View File

@@ -10,7 +10,7 @@ import (
// schema is unexported to prevent accidental overwrites
var (
schemaUser = resource.NewSimpleSchema("iam.grafana.app", "v0alpha1", &User{}, &UserList{}, resource.WithKind("User"),
schemaUser = resource.NewSimpleSchema("iam.grafana.app", "v0alpha1", NewUser(), &UserList{}, resource.WithKind("User"),
resource.WithPlural("users"), resource.WithScope(resource.NamespacedScope))
kindUser = resource.Kind{
Schema: schemaUser,

View File

@@ -1,8 +1,3 @@
//go:build !ignore_autogenerated
// +build !ignore_autogenerated
// Code generated by grafana-app-sdk. DO NOT EDIT.
package v0alpha1
import (

View File

@@ -109,6 +109,13 @@ var appManifestData = app.ManifestData{
"items": {
SchemaProps: spec.SchemaProps{
Type: []string{"array"},
Items: &spec.SchemaOrArray{
Schema: &spec.Schema{
SchemaProps: spec.SchemaProps{
Ref: spec.MustCreateRef("#/components/schemas/getGroupsExternalGroupMapping"),
}},
},
},
},
"kind": {
@@ -200,6 +207,13 @@ var appManifestData = app.ManifestData{
"hits": {
SchemaProps: spec.SchemaProps{
Type: []string{"array"},
Items: &spec.SchemaOrArray{
Schema: &spec.Schema{
SchemaProps: spec.SchemaProps{
Ref: spec.MustCreateRef("#/components/schemas/getSearchTeamsTeamHit"),
}},
},
},
},
"kind": {

View File

@@ -1,5 +1,10 @@
include ../sdk.mk
.PHONY: generate
.PHONY: generate # Run Grafana App SDK code generation
generate: install-app-sdk update-app-sdk
@$(APP_SDK_BIN) generate -g ./pkg/apis --grouping=group --postprocess --defencoding=none --useoldmanifestkinds
@$(APP_SDK_BIN) generate \
--source=./kinds/ \
--gogenpath=./pkg/apis \
--grouping=group \
--genoperatorstate=false \
--defencoding=none

View File

@@ -1,39 +1,18 @@
package investigations
// This is our Investigation definition, which contains metadata about the kind, and the kind's schema
investigation: {
investigationV0alpha1: {
kind: "Investigation"
group: "investigations.grafana.app"
apiResource: {
groupOverride: "investigations.grafana.app"
}
pluralName: "Investigations"
current: "v0alpha1"
versions: {
"v0alpha1": {
codegen: {
frontend: true
backend: true
options: {
generateObjectMeta: true
generateClient: true
k8sLike: true
package: "github.com/grafana/grafana/apps/investigations"
}
}
schema: {
// spec is the schema of our resource
spec: {
title: string
createdByProfile: #Person
hasCustomName: bool
isFavorite: bool
overviewNote: string
overviewNoteUpdatedAt: string
collectables: [...#Collectable] // +listType=atomic
viewMode: #ViewMode
}
}
schema: {
spec: {
title: string
createdByProfile: #Person
hasCustomName: bool
isFavorite: bool
overviewNote: string
overviewNoteUpdatedAt: string
collectables: [...#Collectable] // +listType=atomic
viewMode: #ViewMode
}
}
}

View File

@@ -1,37 +1,18 @@
package investigations
investigationIndex: {
investigationIndexV0alpha1: {
kind: "InvestigationIndex"
group: "investigations.grafana.app"
apiResource: {
groupOverride: "investigations.grafana.app"
}
pluralName: "InvestigationIndexes"
current: "v0alpha1"
versions: {
"v0alpha1": {
codegen: {
frontend: true
backend: true
options: {
generateObjectMeta: true
generateClient: true
k8sLike: true
package: "github.com/grafana/grafana/apps/investigations"
}
}
schema: {
spec: {
// Title of the index, e.g. 'Favorites' or 'My Investigations'
title: string
schema: {
spec: {
// Title of the index, e.g. 'Favorites' or 'My Investigations'
title: string
// The Person who owns this investigation index
owner: #Person
// The Person who owns this investigation index
owner: #Person
// Array of investigation summaries
investigationSummaries: [...#InvestigationSummary] // +listType=atomic
}
}
// Array of investigation summaries
investigationSummaries: [...#InvestigationSummary] // +listType=atomic
}
}
}

View File

@@ -3,8 +3,16 @@ package investigations
manifest: {
appName: "investigations"
groupOverride: "investigations.grafana.app"
kinds: [
investigation,
investigationIndex,
]
}
versions: {
"v0alpha1": {
codegen: {
ts: {enabled: false}
go: {enabled: true}
}
kinds: [
investigationV0alpha1,
investigationIndexV0alpha1,
]
}
}
}

View File

@@ -4,7 +4,6 @@ import (
"context"
"github.com/grafana/grafana-app-sdk/resource"
metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
)
type InvestigationClient struct {
@@ -76,24 +75,6 @@ func (c *InvestigationClient) Patch(ctx context.Context, identifier resource.Ide
return c.client.Patch(ctx, identifier, req, opts)
}
func (c *InvestigationClient) UpdateStatus(ctx context.Context, identifier resource.Identifier, newStatus InvestigationStatus, opts resource.UpdateOptions) (*Investigation, error) {
return c.client.Update(ctx, &Investigation{
TypeMeta: metav1.TypeMeta{
Kind: InvestigationKind().Kind(),
APIVersion: GroupVersion.Identifier(),
},
ObjectMeta: metav1.ObjectMeta{
ResourceVersion: opts.ResourceVersion,
Namespace: identifier.Namespace,
Name: identifier.Name,
},
Status: newStatus,
}, resource.UpdateOptions{
Subresource: "status",
ResourceVersion: opts.ResourceVersion,
})
}
func (c *InvestigationClient) Delete(ctx context.Context, identifier resource.Identifier, opts resource.DeleteOptions) error {
return c.client.Delete(ctx, identifier, opts)
}

View File

@@ -21,8 +21,6 @@ type Investigation struct {
// Spec is the spec of the Investigation
Spec InvestigationSpec `json:"spec" yaml:"spec"`
Status InvestigationStatus `json:"status" yaml:"status"`
}
func (o *Investigation) GetSpec() any {
@@ -39,15 +37,11 @@ func (o *Investigation) SetSpec(spec any) error {
}
func (o *Investigation) GetSubresources() map[string]any {
return map[string]any{
"status": o.Status,
}
return map[string]any{}
}
func (o *Investigation) GetSubresource(name string) (any, bool) {
switch name {
case "status":
return o.Status, true
default:
return nil, false
}
@@ -55,13 +49,6 @@ func (o *Investigation) GetSubresource(name string) (any, bool) {
func (o *Investigation) SetSubresource(name string, value any) error {
switch name {
case "status":
cast, ok := value.(InvestigationStatus)
if !ok {
return fmt.Errorf("cannot set status type %#v, not of type InvestigationStatus", value)
}
o.Status = cast
return nil
default:
return fmt.Errorf("subresource '%s' does not exist", name)
}
@@ -233,7 +220,6 @@ func (o *Investigation) DeepCopyInto(dst *Investigation) {
dst.TypeMeta.Kind = o.TypeMeta.Kind
o.ObjectMeta.DeepCopyInto(&dst.ObjectMeta)
o.Spec.DeepCopyInto(&dst.Spec)
o.Status.DeepCopyInto(&dst.Status)
}
// Interface compliance compile-time check
@@ -305,15 +291,3 @@ func (s *InvestigationSpec) DeepCopy() *InvestigationSpec {
func (s *InvestigationSpec) DeepCopyInto(dst *InvestigationSpec) {
resource.CopyObjectInto(dst, s)
}
// DeepCopy creates a full deep copy of InvestigationStatus
func (s *InvestigationStatus) DeepCopy() *InvestigationStatus {
cpy := &InvestigationStatus{}
s.DeepCopyInto(cpy)
return cpy
}
// DeepCopyInto deep copies InvestigationStatus into another InvestigationStatus object
func (s *InvestigationStatus) DeepCopyInto(dst *InvestigationStatus) {
resource.CopyObjectInto(dst, s)
}

View File

@@ -84,7 +84,6 @@ func NewInvestigationViewMode() *InvestigationViewMode {
return &InvestigationViewMode{}
}
// spec is the schema of our resource
// +k8s:openapi-gen=true
type InvestigationSpec struct {
Title string `json:"title"`

View File

@@ -4,7 +4,6 @@ import (
"context"
"github.com/grafana/grafana-app-sdk/resource"
metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
)
type InvestigationIndexClient struct {
@@ -76,24 +75,6 @@ func (c *InvestigationIndexClient) Patch(ctx context.Context, identifier resourc
return c.client.Patch(ctx, identifier, req, opts)
}
func (c *InvestigationIndexClient) UpdateStatus(ctx context.Context, identifier resource.Identifier, newStatus InvestigationIndexStatus, opts resource.UpdateOptions) (*InvestigationIndex, error) {
return c.client.Update(ctx, &InvestigationIndex{
TypeMeta: metav1.TypeMeta{
Kind: InvestigationIndexKind().Kind(),
APIVersion: GroupVersion.Identifier(),
},
ObjectMeta: metav1.ObjectMeta{
ResourceVersion: opts.ResourceVersion,
Namespace: identifier.Namespace,
Name: identifier.Name,
},
Status: newStatus,
}, resource.UpdateOptions{
Subresource: "status",
ResourceVersion: opts.ResourceVersion,
})
}
func (c *InvestigationIndexClient) Delete(ctx context.Context, identifier resource.Identifier, opts resource.DeleteOptions) error {
return c.client.Delete(ctx, identifier, opts)
}
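The removed `UpdateStatus` method above shows the generated pattern for updating only the status subresource: build an object carrying just TypeMeta, ObjectMeta, and the new status, then issue an update with `Subresource: "status"` and the caller's resource version for optimistic locking. A sketch under stub types (the types below are stand-ins for the grafana-app-sdk ones, and the group/version string is an assumption):

```go
package main

import "fmt"

// Minimal stand-ins for the SDK types used by the generated client.
type Identifier struct{ Namespace, Name string }
type UpdateOptions struct {
	Subresource     string
	ResourceVersion string
}
type Status struct{ State string }
type Object struct {
	Kind, APIVersion string
	Identifier       Identifier
	ResourceVersion  string
	Status           Status
}

// buildStatusUpdate mirrors what the generated UpdateStatus does:
// the payload contains only identity plus the new status, and the
// resource version is threaded through for optimistic concurrency.
func buildStatusUpdate(id Identifier, s Status, rv string) (Object, UpdateOptions) {
	obj := Object{
		Kind:            "InvestigationIndex",
		APIVersion:      "investigations.grafana.app/v0alpha1", // assumed group/version
		Identifier:      id,
		ResourceVersion: rv,
		Status:          s,
	}
	return obj, UpdateOptions{Subresource: "status", ResourceVersion: rv}
}

func main() {
	obj, opts := buildStatusUpdate(Identifier{"default", "idx-1"}, Status{State: "success"}, "42")
	fmt.Println(obj.Kind, opts.Subresource, opts.ResourceVersion)
}
```

Scoping the update to the `status` subresource is what lets the apiserver reject spec changes smuggled in through a status writer; once the kind drops its status subresource, this method has nothing to target, which is why the diff deletes it.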
@@ -21,8 +21,6 @@ type InvestigationIndex struct {
// Spec is the spec of the InvestigationIndex
Spec InvestigationIndexSpec `json:"spec" yaml:"spec"`
Status InvestigationIndexStatus `json:"status" yaml:"status"`
}
func (o *InvestigationIndex) GetSpec() any {
@@ -39,15 +37,11 @@ func (o *InvestigationIndex) SetSpec(spec any) error {
}
func (o *InvestigationIndex) GetSubresources() map[string]any {
return map[string]any{
"status": o.Status,
}
return map[string]any{}
}
func (o *InvestigationIndex) GetSubresource(name string) (any, bool) {
switch name {
case "status":
return o.Status, true
default:
return nil, false
}
@@ -55,13 +49,6 @@ func (o *InvestigationIndex) GetSubresource(name string) (any, bool) {
func (o *InvestigationIndex) SetSubresource(name string, value any) error {
switch name {
case "status":
cast, ok := value.(InvestigationIndexStatus)
if !ok {
return fmt.Errorf("cannot set status type %#v, not of type InvestigationIndexStatus", value)
}
o.Status = cast
return nil
default:
return fmt.Errorf("subresource '%s' does not exist", name)
}
@@ -233,7 +220,6 @@ func (o *InvestigationIndex) DeepCopyInto(dst *InvestigationIndex) {
dst.TypeMeta.Kind = o.TypeMeta.Kind
o.ObjectMeta.DeepCopyInto(&dst.ObjectMeta)
o.Spec.DeepCopyInto(&dst.Spec)
o.Status.DeepCopyInto(&dst.Status)
}
// Interface compliance compile-time check
@@ -305,15 +291,3 @@ func (s *InvestigationIndexSpec) DeepCopy() *InvestigationIndexSpec {
func (s *InvestigationIndexSpec) DeepCopyInto(dst *InvestigationIndexSpec) {
resource.CopyObjectInto(dst, s)
}
// DeepCopy creates a full deep copy of InvestigationIndexStatus
func (s *InvestigationIndexStatus) DeepCopy() *InvestigationIndexStatus {
cpy := &InvestigationIndexStatus{}
s.DeepCopyInto(cpy)
return cpy
}
// DeepCopyInto deep copies InvestigationIndexStatus into another InvestigationIndexStatus object
func (s *InvestigationIndexStatus) DeepCopyInto(dst *InvestigationIndexStatus) {
resource.CopyObjectInto(dst, s)
}
@@ -20,10 +20,10 @@ import (
)
var (
rawSchemaInvestigationv0alpha1 = []byte(`{"Collectable":{"additionalProperties":false,"description":"Collectable represents an item collected during investigation","properties":{"createdAt":{"type":"string"},"datasource":{"$ref":"#/components/schemas/DatasourceRef"},"fieldConfig":{"type":"string"},"id":{"type":"string"},"logoPath":{"type":"string"},"note":{"type":"string"},"noteUpdatedAt":{"type":"string"},"origin":{"type":"string"},"queries":{"description":"+listType=atomic","items":{"type":"string"},"type":"array"},"timeRange":{"$ref":"#/components/schemas/TimeRange"},"title":{"type":"string"},"type":{"type":"string"},"url":{"type":"string"}},"required":["id","createdAt","title","origin","type","queries","timeRange","datasource","url","note","noteUpdatedAt","fieldConfig"],"type":"object"},"DatasourceRef":{"additionalProperties":false,"description":"DatasourceRef is a reference to a datasource","properties":{"uid":{"type":"string"}},"required":["uid"],"type":"object"},"Investigation":{"properties":{"spec":{"$ref":"#/components/schemas/spec"},"status":{"$ref":"#/components/schemas/status"}},"required":["spec"]},"OperatorState":{"additionalProperties":false,"properties":{"descriptiveState":{"description":"descriptiveState is an optional more descriptive state field which has no requirements on format","type":"string"},"details":{"additionalProperties":{"additionalProperties":{},"type":"object"},"description":"details contains any extra information that is operator-specific","type":"object"},"lastEvaluation":{"description":"lastEvaluation is the ResourceVersion last evaluated","type":"string"},"state":{"description":"state describes the state of the lastEvaluation.\nIt is limited to three possible states for machine evaluation.","enum":["success","in_progress","failed"],"type":"string"}},"required":["lastEvaluation","state"],"type":"object"},"Person":{"additionalProperties":false,"description":"Person represents a user profile with basic information","properties":{"gravatarUrl":{"description":"URL to user's Gravatar image","type":"string"},"name":{"description":"Display name of the user","type":"string"},"uid":{"description":"Unique identifier for the user","type":"string"}},"required":["uid","name","gravatarUrl"],"type":"object"},"TimeRange":{"additionalProperties":false,"description":"TimeRange represents a time range with both absolute and relative values","properties":{"from":{"type":"string"},"raw":{"additionalProperties":false,"properties":{"from":{"type":"string"},"to":{"type":"string"}},"required":["from","to"],"type":"object"},"to":{"type":"string"}},"required":["from","to","raw"],"type":"object"},"ViewMode":{"additionalProperties":false,"properties":{"mode":{"enum":["compact","full"],"type":"string"},"showComments":{"type":"boolean"},"showTooltips":{"type":"boolean"}},"required":["mode","showComments","showTooltips"],"type":"object"},"spec":{"additionalProperties":false,"description":"spec is the schema of our resource","properties":{"collectables":{"description":"+listType=atomic","items":{"$ref":"#/components/schemas/Collectable"},"type":"array"},"createdByProfile":{"$ref":"#/components/schemas/Person"},"hasCustomName":{"type":"boolean"},"isFavorite":{"type":"boolean"},"overviewNote":{"type":"string"},"overviewNoteUpdatedAt":{"type":"string"},"title":{"type":"string"},"viewMode":{"$ref":"#/components/schemas/ViewMode"}},"required":["title","createdByProfile","hasCustomName","isFavorite","overviewNote","overviewNoteUpdatedAt","collectables","viewMode"],"type":"object"},"status":{"additionalProperties":false,"properties":{"additionalFields":{"additionalProperties":{"additionalProperties":{},"type":"object"},"description":"additionalFields is reserved for future use","type":"object"},"operatorStates":{"additionalProperties":{"$ref":"#/components/schemas/OperatorState"},"description":"operatorStates is a map of operator ID to operator state evaluations.\nAny operator which consumes this kind SHOULD add its state evaluation information to this field.","type":"object"}},"type":"object"}}`)
rawSchemaInvestigationv0alpha1 = []byte(`{"Collectable":{"additionalProperties":false,"description":"Collectable represents an item collected during investigation","properties":{"createdAt":{"type":"string"},"datasource":{"$ref":"#/components/schemas/DatasourceRef"},"fieldConfig":{"type":"string"},"id":{"type":"string"},"logoPath":{"type":"string"},"note":{"type":"string"},"noteUpdatedAt":{"type":"string"},"origin":{"type":"string"},"queries":{"description":"+listType=atomic","items":{"type":"string"},"type":"array"},"timeRange":{"$ref":"#/components/schemas/TimeRange"},"title":{"type":"string"},"type":{"type":"string"},"url":{"type":"string"}},"required":["id","createdAt","title","origin","type","queries","timeRange","datasource","url","note","noteUpdatedAt","fieldConfig"],"type":"object"},"DatasourceRef":{"additionalProperties":false,"description":"DatasourceRef is a reference to a datasource","properties":{"uid":{"type":"string"}},"required":["uid"],"type":"object"},"Investigation":{"properties":{"spec":{"$ref":"#/components/schemas/spec"}},"required":["spec"]},"Person":{"additionalProperties":false,"description":"Person represents a user profile with basic information","properties":{"gravatarUrl":{"description":"URL to user's Gravatar image","type":"string"},"name":{"description":"Display name of the user","type":"string"},"uid":{"description":"Unique identifier for the user","type":"string"}},"required":["uid","name","gravatarUrl"],"type":"object"},"TimeRange":{"additionalProperties":false,"description":"TimeRange represents a time range with both absolute and relative values","properties":{"from":{"type":"string"},"raw":{"additionalProperties":false,"properties":{"from":{"type":"string"},"to":{"type":"string"}},"required":["from","to"],"type":"object"},"to":{"type":"string"}},"required":["from","to","raw"],"type":"object"},"ViewMode":{"additionalProperties":false,"properties":{"mode":{"enum":["compact","full"],"type":"string"},"showComments":{"type":"boolean"},"showTooltips":{"type":"boolean"}},"required":["mode","showComments","showTooltips"],"type":"object"},"spec":{"additionalProperties":false,"properties":{"collectables":{"description":"+listType=atomic","items":{"$ref":"#/components/schemas/Collectable"},"type":"array"},"createdByProfile":{"$ref":"#/components/schemas/Person"},"hasCustomName":{"type":"boolean"},"isFavorite":{"type":"boolean"},"overviewNote":{"type":"string"},"overviewNoteUpdatedAt":{"type":"string"},"title":{"type":"string"},"viewMode":{"$ref":"#/components/schemas/ViewMode"}},"required":["title","createdByProfile","hasCustomName","isFavorite","overviewNote","overviewNoteUpdatedAt","collectables","viewMode"],"type":"object"}}`)
versionSchemaInvestigationv0alpha1 app.VersionSchema
_ = json.Unmarshal(rawSchemaInvestigationv0alpha1, &versionSchemaInvestigationv0alpha1)
rawSchemaInvestigationIndexv0alpha1 = []byte(`{"CollectableSummary":{"additionalProperties":false,"properties":{"id":{"type":"string"},"logoPath":{"type":"string"},"origin":{"type":"string"},"title":{"type":"string"}},"required":["id","title","logoPath","origin"],"type":"object"},"InvestigationIndex":{"properties":{"spec":{"$ref":"#/components/schemas/spec"},"status":{"$ref":"#/components/schemas/status"}},"required":["spec"]},"InvestigationSummary":{"additionalProperties":false,"description":"Type definition for investigation summaries","properties":{"collectableSummaries":{"description":"+listType=atomic","items":{"$ref":"#/components/schemas/CollectableSummary"},"type":"array"},"createdByProfile":{"$ref":"#/components/schemas/Person"},"hasCustomName":{"type":"boolean"},"isFavorite":{"type":"boolean"},"overviewNote":{"type":"string"},"overviewNoteUpdatedAt":{"type":"string"},"title":{"type":"string"},"viewMode":{"$ref":"#/components/schemas/ViewMode"}},"required":["title","createdByProfile","hasCustomName","isFavorite","overviewNote","overviewNoteUpdatedAt","viewMode","collectableSummaries"],"type":"object"},"OperatorState":{"additionalProperties":false,"properties":{"descriptiveState":{"description":"descriptiveState is an optional more descriptive state field which has no requirements on format","type":"string"},"details":{"additionalProperties":{"additionalProperties":{},"type":"object"},"description":"details contains any extra information that is operator-specific","type":"object"},"lastEvaluation":{"description":"lastEvaluation is the ResourceVersion last evaluated","type":"string"},"state":{"description":"state describes the state of the lastEvaluation.\nIt is limited to three possible states for machine evaluation.","enum":["success","in_progress","failed"],"type":"string"}},"required":["lastEvaluation","state"],"type":"object"},"Person":{"additionalProperties":false,"description":"Person represents a user profile with basic information","properties":{"gravatarUrl":{"description":"URL to user's Gravatar image","type":"string"},"name":{"description":"Display name of the user","type":"string"},"uid":{"description":"Unique identifier for the user","type":"string"}},"required":["uid","name","gravatarUrl"],"type":"object"},"ViewMode":{"additionalProperties":false,"properties":{"mode":{"enum":["compact","full"],"type":"string"},"showComments":{"type":"boolean"},"showTooltips":{"type":"boolean"}},"required":["mode","showComments","showTooltips"],"type":"object"},"spec":{"additionalProperties":false,"properties":{"investigationSummaries":{"description":"Array of investigation summaries\n+listType=atomic","items":{"$ref":"#/components/schemas/InvestigationSummary"},"type":"array"},"owner":{"$ref":"#/components/schemas/Person","description":"The Person who owns this investigation index"},"title":{"description":"Title of the index, e.g. 'Favorites' or 'My Investigations'","type":"string"}},"required":["title","owner","investigationSummaries"],"type":"object"},"status":{"additionalProperties":false,"properties":{"additionalFields":{"additionalProperties":{"additionalProperties":{},"type":"object"},"description":"additionalFields is reserved for future use","type":"object"},"operatorStates":{"additionalProperties":{"$ref":"#/components/schemas/OperatorState"},"description":"operatorStates is a map of operator ID to operator state evaluations.\nAny operator which consumes this kind SHOULD add its state evaluation information to this field.","type":"object"}},"type":"object"}}`)
rawSchemaInvestigationIndexv0alpha1 = []byte(`{"CollectableSummary":{"additionalProperties":false,"properties":{"id":{"type":"string"},"logoPath":{"type":"string"},"origin":{"type":"string"},"title":{"type":"string"}},"required":["id","title","logoPath","origin"],"type":"object"},"InvestigationIndex":{"properties":{"spec":{"$ref":"#/components/schemas/spec"}},"required":["spec"]},"InvestigationSummary":{"additionalProperties":false,"description":"Type definition for investigation summaries","properties":{"collectableSummaries":{"description":"+listType=atomic","items":{"$ref":"#/components/schemas/CollectableSummary"},"type":"array"},"createdByProfile":{"$ref":"#/components/schemas/Person"},"hasCustomName":{"type":"boolean"},"isFavorite":{"type":"boolean"},"overviewNote":{"type":"string"},"overviewNoteUpdatedAt":{"type":"string"},"title":{"type":"string"},"viewMode":{"$ref":"#/components/schemas/ViewMode"}},"required":["title","createdByProfile","hasCustomName","isFavorite","overviewNote","overviewNoteUpdatedAt","viewMode","collectableSummaries"],"type":"object"},"Person":{"additionalProperties":false,"description":"Person represents a user profile with basic information","properties":{"gravatarUrl":{"description":"URL to user's Gravatar image","type":"string"},"name":{"description":"Display name of the user","type":"string"},"uid":{"description":"Unique identifier for the user","type":"string"}},"required":["uid","name","gravatarUrl"],"type":"object"},"ViewMode":{"additionalProperties":false,"properties":{"mode":{"enum":["compact","full"],"type":"string"},"showComments":{"type":"boolean"},"showTooltips":{"type":"boolean"}},"required":["mode","showComments","showTooltips"],"type":"object"},"spec":{"additionalProperties":false,"properties":{"investigationSummaries":{"description":"Array of investigation summaries\n+listType=atomic","items":{"$ref":"#/components/schemas/InvestigationSummary"},"type":"array"},"owner":{"$ref":"#/components/schemas/Person","description":"The Person who owns this investigation index"},"title":{"description":"Title of the index, e.g. 'Favorites' or 'My Investigations'","type":"string"}},"required":["title","owner","investigationSummaries"],"type":"object"}}`)
versionSchemaInvestigationIndexv0alpha1 app.VersionSchema
_ = json.Unmarshal(rawSchemaInvestigationIndexv0alpha1, &versionSchemaInvestigationIndexv0alpha1)
)
@@ -0,0 +1,319 @@
{
"apiVersion": "apps.grafana.com/v1alpha2",
"kind": "AppManifest",
"metadata": {
"name": "logsdrilldown"
},
"spec": {
"appName": "logsdrilldown",
"group": "logsdrilldown.grafana.app",
"versions": [
{
"name": "v1alpha1",
"served": true,
"kinds": [
{
"kind": "LogsDrilldown",
"plural": "LogsDrilldowns",
"scope": "Namespaced",
"schemas": {
"LogsDrilldown": {
"properties": {
"spec": {
"$ref": "#/components/schemas/spec"
},
"status": {
"$ref": "#/components/schemas/status"
}
},
"required": ["spec"]
},
"OperatorState": {
"additionalProperties": false,
"properties": {
"descriptiveState": {
"description": "descriptiveState is an optional more descriptive state field which has no requirements on format",
"type": "string"
},
"details": {
"additionalProperties": {
"additionalProperties": {},
"type": "object"
},
"description": "details contains any extra information that is operator-specific",
"type": "object"
},
"lastEvaluation": {
"description": "lastEvaluation is the ResourceVersion last evaluated",
"type": "string"
},
"state": {
"description": "state describes the state of the lastEvaluation.\nIt is limited to three possible states for machine evaluation.",
"enum": ["success", "in_progress", "failed"],
"type": "string"
}
},
"required": ["lastEvaluation", "state"],
"type": "object"
},
"spec": {
"additionalProperties": false,
"properties": {
"defaultFields": {
"items": {
"type": "string"
},
"type": "array"
},
"interceptDismissed": {
"type": "boolean"
},
"prettifyJSON": {
"type": "boolean"
},
"wrapLogMessage": {
"type": "boolean"
}
},
"required": ["defaultFields", "prettifyJSON", "wrapLogMessage", "interceptDismissed"],
"type": "object"
},
"status": {
"additionalProperties": false,
"properties": {
"additionalFields": {
"additionalProperties": {
"additionalProperties": {},
"type": "object"
},
"description": "additionalFields is reserved for future use",
"type": "object"
},
"operatorStates": {
"additionalProperties": {
"$ref": "#/components/schemas/OperatorState"
},
"description": "operatorStates is a map of operator ID to operator state evaluations.\nAny operator which consumes this kind SHOULD add its state evaluation information to this field.",
"type": "object"
}
},
"type": "object"
}
},
"conversion": false
},
{
"kind": "LogsDrilldownDefaults",
"plural": "LogsDrilldownDefaults",
"scope": "Namespaced",
"schemas": {
"LogsDrilldownDefaults": {
"properties": {
"spec": {
"$ref": "#/components/schemas/spec"
},
"status": {
"$ref": "#/components/schemas/status"
}
},
"required": ["spec"]
},
"OperatorState": {
"additionalProperties": false,
"properties": {
"descriptiveState": {
"description": "descriptiveState is an optional more descriptive state field which has no requirements on format",
"type": "string"
},
"details": {
"additionalProperties": {
"additionalProperties": {},
"type": "object"
},
"description": "details contains any extra information that is operator-specific",
"type": "object"
},
"lastEvaluation": {
"description": "lastEvaluation is the ResourceVersion last evaluated",
"type": "string"
},
"state": {
"description": "state describes the state of the lastEvaluation.\nIt is limited to three possible states for machine evaluation.",
"enum": ["success", "in_progress", "failed"],
"type": "string"
}
},
"required": ["lastEvaluation", "state"],
"type": "object"
},
"spec": {
"additionalProperties": false,
"properties": {
"defaultFields": {
"items": {
"type": "string"
},
"type": "array"
},
"interceptDismissed": {
"type": "boolean"
},
"prettifyJSON": {
"type": "boolean"
},
"wrapLogMessage": {
"type": "boolean"
}
},
"required": ["defaultFields", "prettifyJSON", "wrapLogMessage", "interceptDismissed"],
"type": "object"
},
"status": {
"additionalProperties": false,
"properties": {
"additionalFields": {
"additionalProperties": {
"additionalProperties": {},
"type": "object"
},
"description": "additionalFields is reserved for future use",
"type": "object"
},
"operatorStates": {
"additionalProperties": {
"$ref": "#/components/schemas/OperatorState"
},
"description": "operatorStates is a map of operator ID to operator state evaluations.\nAny operator which consumes this kind SHOULD add its state evaluation information to this field.",
"type": "object"
}
},
"type": "object"
}
},
"conversion": false
},
{
"kind": "LogsDrilldownDefaultColumns",
"plural": "LogsDrilldownDefaultColumns",
"scope": "Namespaced",
"schemas": {
"LogsDefaultColumnsLabel": {
"additionalProperties": false,
"properties": {
"key": {
"type": "string"
},
"value": {
"type": "string"
}
},
"required": ["key", "value"],
"type": "object"
},
"LogsDefaultColumnsLabels": {
"items": {
"$ref": "#/components/schemas/LogsDefaultColumnsLabel"
},
"type": "array"
},
"LogsDefaultColumnsRecord": {
"additionalProperties": false,
"properties": {
"columns": {
"items": {
"type": "string"
},
"type": "array"
},
"labels": {
"$ref": "#/components/schemas/LogsDefaultColumnsLabels"
}
},
"required": ["columns", "labels"],
"type": "object"
},
"LogsDefaultColumnsRecords": {
"items": {
"$ref": "#/components/schemas/LogsDefaultColumnsRecord"
},
"type": "array"
},
"LogsDrilldownDefaultColumns": {
"properties": {
"spec": {
"$ref": "#/components/schemas/spec"
},
"status": {
"$ref": "#/components/schemas/status"
}
},
"required": ["spec"]
},
"OperatorState": {
"additionalProperties": false,
"properties": {
"descriptiveState": {
"description": "descriptiveState is an optional more descriptive state field which has no requirements on format",
"type": "string"
},
"details": {
"additionalProperties": {
"additionalProperties": {},
"type": "object"
},
"description": "details contains any extra information that is operator-specific",
"type": "object"
},
"lastEvaluation": {
"description": "lastEvaluation is the ResourceVersion last evaluated",
"type": "string"
},
"state": {
"description": "state describes the state of the lastEvaluation.\nIt is limited to three possible states for machine evaluation.",
"enum": ["success", "in_progress", "failed"],
"type": "string"
}
},
"required": ["lastEvaluation", "state"],
"type": "object"
},
"spec": {
"additionalProperties": false,
"properties": {
"records": {
"$ref": "#/components/schemas/LogsDefaultColumnsRecords"
}
},
"required": ["records"],
"type": "object"
},
"status": {
"additionalProperties": false,
"properties": {
"additionalFields": {
"additionalProperties": {
"additionalProperties": {},
"type": "object"
},
"description": "additionalFields is reserved for future use",
"type": "object"
},
"operatorStates": {
"additionalProperties": {
"$ref": "#/components/schemas/OperatorState"
},
"description": "operatorStates is a map of operator ID to operator state evaluations.\nAny operator which consumes this kind SHOULD add its state evaluation information to this field.",
"type": "object"
}
},
"type": "object"
}
},
"conversion": false
}
]
}
],
"preferredVersion": "v1alpha1"
}
}
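The AppManifest above declares three kinds under a single served version. A small Go sketch of how a consumer might decode just the fields it needs from such a manifest (the struct names here are ad hoc, not the real manifest Go types):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Ad hoc structs covering only the manifest fields we read.
type manifest struct {
	Spec struct {
		Group    string `json:"group"`
		Versions []struct {
			Name  string `json:"name"`
			Kinds []struct {
				Kind string `json:"kind"`
			} `json:"kinds"`
		} `json:"versions"`
	} `json:"spec"`
}

// kindsIn returns every kind name declared across all versions.
func kindsIn(data []byte) ([]string, error) {
	var m manifest
	if err := json.Unmarshal(data, &m); err != nil {
		return nil, err
	}
	var out []string
	for _, v := range m.Spec.Versions {
		for _, k := range v.Kinds {
			out = append(out, k.Kind)
		}
	}
	return out, nil
}

func main() {
	// Abbreviated copy of the manifest's spec, enough to exercise the decoder.
	data := []byte(`{"spec":{"group":"logsdrilldown.grafana.app","versions":[{"name":"v1alpha1","kinds":[{"kind":"LogsDrilldown"},{"kind":"LogsDrilldownDefaults"},{"kind":"LogsDrilldownDefaultColumns"}]}]}}`)
	kinds, err := kindsIn(data)
	if err != nil {
		panic(err)
	}
	fmt.Println(kinds)
}
```

Decoding into a narrow struct like this ignores the large embedded schemas entirely, which is usually what a registration or routing layer wants from a manifest.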
@@ -0,0 +1,92 @@
{
"kind": "CustomResourceDefinition",
"apiVersion": "apiextensions.k8s.io/v1",
"metadata": {
"name": "logsdrilldowns.logsdrilldown.grafana.app"
},
"spec": {
"group": "logsdrilldown.grafana.app",
"versions": [
{
"name": "v1alpha1",
"served": true,
"storage": true,
"schema": {
"openAPIV3Schema": {
"properties": {
"spec": {
"properties": {
"defaultFields": {
"items": {
"type": "string"
},
"type": "array"
},
"interceptDismissed": {
"type": "boolean"
},
"prettifyJSON": {
"type": "boolean"
},
"wrapLogMessage": {
"type": "boolean"
}
},
"required": ["defaultFields", "prettifyJSON", "wrapLogMessage", "interceptDismissed"],
"type": "object"
},
"status": {
"properties": {
"additionalFields": {
"description": "additionalFields is reserved for future use",
"type": "object",
"x-kubernetes-preserve-unknown-fields": true
},
"operatorStates": {
"additionalProperties": {
"properties": {
"descriptiveState": {
"description": "descriptiveState is an optional more descriptive state field which has no requirements on format",
"type": "string"
},
"details": {
"description": "details contains any extra information that is operator-specific",
"type": "object",
"x-kubernetes-preserve-unknown-fields": true
},
"lastEvaluation": {
"description": "lastEvaluation is the ResourceVersion last evaluated",
"type": "string"
},
"state": {
"description": "state describes the state of the lastEvaluation.\nIt is limited to three possible states for machine evaluation.",
"enum": ["success", "in_progress", "failed"],
"type": "string"
}
},
"required": ["lastEvaluation", "state"],
"type": "object"
},
"description": "operatorStates is a map of operator ID to operator state evaluations.\nAny operator which consumes this kind SHOULD add its state evaluation information to this field.",
"type": "object"
}
},
"type": "object"
}
},
"required": ["spec"],
"type": "object"
}
},
"subresources": {
"status": {}
}
}
],
"names": {
"kind": "LogsDrilldown",
"plural": "logsdrilldowns"
},
"scope": "Namespaced"
}
}
@@ -0,0 +1,107 @@
{
"kind": "CustomResourceDefinition",
"apiVersion": "apiextensions.k8s.io/v1",
"metadata": {
"name": "logsdrilldowndefaultcolumns.logsdrilldown.grafana.app"
},
"spec": {
"group": "logsdrilldown.grafana.app",
"versions": [
{
"name": "v1alpha1",
"served": true,
"storage": true,
"schema": {
"openAPIV3Schema": {
"properties": {
"spec": {
"properties": {
"records": {
"items": {
"properties": {
"columns": {
"items": {
"type": "string"
},
"type": "array"
},
"labels": {
"items": {
"properties": {
"key": {
"type": "string"
},
"value": {
"type": "string"
}
},
"required": ["key", "value"],
"type": "object"
},
"type": "array"
}
},
"required": ["columns", "labels"],
"type": "object"
},
"type": "array"
}
},
"required": ["records"],
"type": "object"
},
"status": {
"properties": {
"additionalFields": {
"description": "additionalFields is reserved for future use",
"type": "object",
"x-kubernetes-preserve-unknown-fields": true
},
"operatorStates": {
"additionalProperties": {
"properties": {
"descriptiveState": {
"description": "descriptiveState is an optional more descriptive state field which has no requirements on format",
"type": "string"
},
"details": {
"description": "details contains any extra information that is operator-specific",
"type": "object",
"x-kubernetes-preserve-unknown-fields": true
},
"lastEvaluation": {
"description": "lastEvaluation is the ResourceVersion last evaluated",
"type": "string"
},
"state": {
"description": "state describes the state of the lastEvaluation.\nIt is limited to three possible states for machine evaluation.",
"enum": ["success", "in_progress", "failed"],
"type": "string"
}
},
"required": ["lastEvaluation", "state"],
"type": "object"
},
"description": "operatorStates is a map of operator ID to operator state evaluations.\nAny operator which consumes this kind SHOULD add its state evaluation information to this field.",
"type": "object"
}
},
"type": "object"
}
},
"required": ["spec"],
"type": "object"
}
},
"subresources": {
"status": {}
}
}
],
"names": {
"kind": "LogsDrilldownDefaultColumns",
"plural": "logsdrilldowndefaultcolumns"
},
"scope": "Namespaced"
}
}
@@ -0,0 +1,92 @@
{
"kind": "CustomResourceDefinition",
"apiVersion": "apiextensions.k8s.io/v1",
"metadata": {
"name": "logsdrilldowndefaults.logsdrilldown.grafana.app"
},
"spec": {
"group": "logsdrilldown.grafana.app",
"versions": [
{
"name": "v1alpha1",
"served": true,
"storage": true,
"schema": {
"openAPIV3Schema": {
"properties": {
"spec": {
"properties": {
"defaultFields": {
"items": {
"type": "string"
},
"type": "array"
},
"interceptDismissed": {
"type": "boolean"
},
"prettifyJSON": {
"type": "boolean"
},
"wrapLogMessage": {
"type": "boolean"
}
},
"required": ["defaultFields", "prettifyJSON", "wrapLogMessage", "interceptDismissed"],
"type": "object"
},
"status": {
"properties": {
"additionalFields": {
"description": "additionalFields is reserved for future use",
"type": "object",
"x-kubernetes-preserve-unknown-fields": true
},
"operatorStates": {
"additionalProperties": {
"properties": {
"descriptiveState": {
"description": "descriptiveState is an optional more descriptive state field which has no requirements on format",
"type": "string"
},
"details": {
"description": "details contains any extra information that is operator-specific",
"type": "object",
"x-kubernetes-preserve-unknown-fields": true
},
"lastEvaluation": {
"description": "lastEvaluation is the ResourceVersion last evaluated",
"type": "string"
},
"state": {
"description": "state describes the state of the lastEvaluation.\nIt is limited to three possible states for machine evaluation.",
"enum": ["success", "in_progress", "failed"],
"type": "string"
}
},
"required": ["lastEvaluation", "state"],
"type": "object"
},
"description": "operatorStates is a map of operator ID to operator state evaluations.\nAny operator which consumes this kind SHOULD add its state evaluation information to this field.",
"type": "object"
}
},
"type": "object"
}
},
"required": ["spec"],
"type": "object"
}
},
"subresources": {
"status": {}
}
}
],
"names": {
"kind": "LogsDrilldownDefaults",
"plural": "logsdrilldowndefaults"
},
"scope": "Namespaced"
}
}
@@ -1,5 +1,9 @@
package kinds
import (
"github.com/grafana/grafana/apps/logsdrilldown/kinds/v0alpha1"
)
LogsDrilldownSpecv1alpha1: {
defaultFields: [...string] | *[]
prettifyJSON: bool
@@ -21,3 +25,12 @@ logsdrilldownDefaultsv1alpha1: {
spec: LogsDrilldownSpecv1alpha1
}
}
// Default columns API
logsdrilldownDefaultColumnsv0alpha1: {
kind: "LogsDrilldownDefaultColumns"
pluralName: "LogsDrilldownDefaultColumns"
schema: {
spec: v0alpha1.LogsDefaultColumns
}
}