Compare commits

...

36 Commits

Author SHA1 Message Date
Todd Treece
07bb48e874 add storage test 2025-12-11 21:22:28 -05:00
Will Browne
306186c4ea Merge branch 'main' into wb/pluginmeta-local 2025-12-11 16:59:33 +00:00
Tobias Skarhed
f63c2cb2dd Scopes: Don't use redirect if you're on an active scope navigation (#115149)
* Don't use redirectUrl if we are on an active scope navigation

* Remove superfluous test
2025-12-11 17:42:47 +01:00
Tobias Skarhed
fe4c615b3d Scopes: Sync nested scopes navigation open folders to URL (#114786)
* Sync nav_scope_path with url

* Let the current active scope remain if it is a child of the selected subscope

* Remove location updates based on nav_scope_path to maintain expanded folders

* Fix folder tests

* Remove console logs

* Better mock for changeScopes

* Update test to support the new calls

* Update test with function inputs

* Fix failing test

* Add tests and add isEqual check for fetching new subscopes
2025-12-11 17:34:21 +01:00
grafana-pr-automation[bot]
02d3fd7b31 I18n: Download translations from Crowdin (#115123)
New Crowdin translations by GitHub Action

Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
2025-12-11 16:31:02 +00:00
Jesse David Peterson
5dcfc19060 Table: Add title attribute to make truncated headings legible (#115155)
* fix(table): add HTML title attribute to make truncated headings legible

* fix(table): avoid redundant display name calculation

Co-authored-by: Paul Marbach <paul.marbach@grafana.com>

---------

Co-authored-by: Paul Marbach <paul.marbach@grafana.com>
2025-12-11 12:22:10 -04:00
Roberto Jiménez Sánchez
5bda17be3f Provisioning: Update provisioning docs to reflect kubernetesDashboards defaults to true (#115159)
Docs: Update provisioning docs to reflect kubernetesDashboards defaults to true

The kubernetesDashboards feature toggle now defaults to true, so users
don't need to explicitly enable it in their configuration. Updated
documentation and UI to reflect this:

- Removed kubernetesDashboards from configuration examples
- Added notes explaining it's enabled by default
- Clarified that users only need to take action if they've explicitly
  disabled it
- Kept validation checks to catch explicit disables

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Sonnet 4.5 <noreply@anthropic.com>
2025-12-11 17:08:57 +01:00
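Since the toggle now defaults to true, the only configuration that still needs attention is an explicit disable. Assuming the standard `[feature_toggles]` section of `grafana.ini` (the section name is Grafana's convention; the surrounding values are illustrative), that case looks like:

```ini
[feature_toggles]
; kubernetesDashboards is enabled by default and no longer needs to be listed.
; Only an explicit disable like the line below requires action:
kubernetesDashboards = false
```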
Will Browne
a28c70bbcc fix linter 2025-12-11 16:06:27 +00:00
Will Browne
1ebcd2319a undo var name change 2025-12-11 15:55:17 +00:00
Will Browne
5dc3767854 more tidying 2025-12-11 15:48:26 +00:00
Usman Ahmad
bc88796e6e Created Troubleshooting guide for MySQL data source plugin (#114737)
* created troubleshooting guide for mysql data source plugin

Signed-off-by: Usman Ahmad <usman.ahmad@grafana.com>

* Apply suggestions from code review

thanks for the code review

Co-authored-by: Christopher Moyer <35463610+chri2547@users.noreply.github.com>

* rename file from _index.md to index.md

Signed-off-by: Usman Ahmad <usman.ahmad@grafana.com>

* Update docs/sources/datasources/mysql/troubleshoot/index.md

---------

Signed-off-by: Usman Ahmad <usman.ahmad@grafana.com>
Co-authored-by: Christopher Moyer <35463610+chri2547@users.noreply.github.com>
2025-12-11 16:42:09 +01:00
Will Browne
040dbfb5e3 tidy 2025-12-11 15:34:55 +00:00
Andres Torres
5d7b9c5050 fix(setting): Replacing dynamic client to reduce memory footprint (#115125) 2025-12-11 10:24:01 -05:00
Alexander Akhmetov
73bcfbcc74 Alerting: Collate alert_rule.namespace_uid column as binary (#115152)
Alerting: Collate namespace_uid column as binary
2025-12-11 16:05:13 +01:00
Will Browne
32d43f5b5d add gen files 2025-12-11 14:51:09 +00:00
Will Browne
fef9c760a0 merge with main 2025-12-11 14:48:58 +00:00
Erik Sundell
4ab198b201 E2E Selectors: Fix package description (#115148)
dummy change
2025-12-11 14:00:54 +00:00
Erik Sundell
0c82f92539 NPM: Attempt to fix e2e-selectors dist-tag after OIDC migration (#115012)
* fetch oidc token from github

* use same approach as electron
2025-12-11 14:35:27 +01:00
Ivana Huckova
73de5f98e1 Assistant: Update origin for analyze-rule-menu-item (#115147)
* Assistant: Update origin for analyze-rule-menu-item

* Update origin, not test id
2025-12-11 13:06:09 +00:00
Oscar Kilhed
b6ba8a0fd4 Dashboards: Make variables selectable in controls menu (#115092)
* Dashboard: Make variables selectable in controls menu and improve spacing

- Add selection support for variables in controls menu (onPointerDown handler and selection classes)
- Add padding to variables and annotations in controls menu (theme.spacing(1))
- Reduce menu container padding from 1.5 to 1
- Remove margins between menu items

* fix: remove unused imports in DashboardControlsMenu
2025-12-11 13:55:03 +01:00
Oscar Kilhed
350c3578c7 Dynamic dashboards: Update variable set state when variable hide property changes (#115094)
fix: update variable set state when variable hide property changes

When changing a variable's positioning to show in the controls menu using the edit side pane, the state of dashboardControls does not immediately update. This makes it appear to the user that nothing changed.

The issue was that when a variable's hide property changes, only the variable's state was updated, but not the parent SceneVariableSet state. Components that subscribe to the variable set state (like useDashboardControls) didn't detect the change because the variables array reference remained the same.

This fix updates the parent SceneVariableSet state when a variable's hide property changes, ensuring components that subscribe to the variable set will re-render immediately.

Co-authored-by: grafakus <marc.mignonsin@grafana.com>
2025-12-11 13:54:30 +01:00
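The re-render miss described above comes down to reference equality: subscribers compare the variables array by reference, so an in-place mutation goes unnoticed until the parent set publishes new state. A minimal Go sketch of that failure mode (the `variable` type and `changed` check are hypothetical, not the Scenes API):

```go
package main

import "fmt"

type variable struct{ hide bool }

// changed mimics a shallow subscriber check: same length and same element
// pointers means "nothing to re-render".
func changed(old, cur []*variable) bool {
	if len(old) != len(cur) {
		return true
	}
	for i := range old {
		if old[i] != cur[i] { // pointer comparison only
			return true
		}
	}
	return false
}

func main() {
	vars := []*variable{{hide: true}}
	snapshot := vars // what a subscriber last saw

	// Mutating a field in place: same pointers, so the check misses it.
	vars[0].hide = false
	fmt.Println(changed(snapshot, vars)) // false

	// Publishing a fresh element (as the fix does by updating the parent
	// SceneVariableSet state) gives the subscriber something to detect.
	vars = append([]*variable(nil), vars...)
	vars[0] = &variable{hide: false}
	fmt.Println(changed(snapshot, vars)) // true
}
```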
Andres Martinez Gotor
e6b5ece559 Plugins Preinstall: Fix URL parsing when includes basic auth (#115143)
Preinstall: Fix URL setting when includes basic auth
2025-12-11 13:38:02 +01:00
Ryan McKinley
eef14d2cee Dependencies: update glob@npm for dependabot (#115146) 2025-12-11 12:33:34 +00:00
Anna Urbiztondo
c71c0b33ee Docs: Configure Git Sync using CLI (#115068)
* WIP

* WIP

* Edits, Claude

* Prettier

* Update docs/sources/as-code/observability-as-code/provision-resources/git-sync-setup.md

Co-authored-by: Roberto Jiménez Sánchez <roberto.jimenez@grafana.com>

* Update docs/sources/as-code/observability-as-code/provision-resources/git-sync-setup.md

Co-authored-by: Roberto Jiménez Sánchez <roberto.jimenez@grafana.com>

* WIP

* Restructuring

* Minor tweaks

* Fix

* Update docs/sources/as-code/observability-as-code/provision-resources/git-sync-setup.md

Co-authored-by: Roberto Jiménez Sánchez <roberto.jimenez@grafana.com>

* Feedback

* Prettier

* Links

---------

Co-authored-by: Roberto Jiménez Sánchez <roberto.jimenez@grafana.com>
2025-12-11 11:27:36 +00:00
Lauren
d568798c64 Alerting: Improve instance count display (#114997)
* Update button text to Show All if filters are enabled

* Show state in text if filters enabled

* resolve PR comments
2025-12-11 11:01:53 +00:00
Ryan McKinley
9bec62a080 Live: simplify dependencies (#115130) 2025-12-11 13:37:45 +03:00
Roberto Jiménez Sánchez
7fe3214f16 Provisioning: Add fieldSelector regression tests for Repository and Jobs (#115135) 2025-12-11 13:36:01 +03:00
Alexander Zobnin
e2d12f4cce Zanzana: Refactor remote client initialization (#114142)
* Zanzana: Refactor remote client

* rename config field URL to Addr

* Instrument grpc queries

* fix duplicated field
2025-12-11 10:55:12 +01:00
Will Browne
1fe9a38a2a add angular + translations 2025-11-27 15:04:20 +00:00
Will Browne
59bf7896f4 Merge branch 'main' into wb/pluginmeta-local 2025-11-27 10:40:45 +00:00
Will Browne
4b4ad544a8 Merge branch 'main' into wb/pluginmeta-local 2025-11-26 16:08:53 +00:00
Will Browne
7e3289f2c9 Merge branch 'main' into wb/pluginmeta-local 2025-11-26 12:24:17 +00:00
Will Browne
0d0b5b757b fix test + lint issues 2025-11-26 12:14:50 +00:00
Will Browne
c49261cce2 fix another test 2025-11-26 11:54:12 +00:00
Will Browne
d5efce72f3 fix tests 2025-11-26 11:48:13 +00:00
Will Browne
881c81f0b3 flesh out local for demo 2025-11-26 11:28:16 +00:00
113 changed files with 3729 additions and 1907 deletions

View File

@@ -377,10 +377,10 @@ github.com/cenkalti/backoff/v4 v4.3.0/go.mod h1:Y3VNntkOUPxTVeUxJ/G5vcM//AlwfmyY
github.com/cenkalti/backoff/v5 v5.0.3 h1:ZN+IMa753KfX5hd8vVaMixjnqRZ3y8CuJKRKj1xcsSM=
github.com/cenkalti/backoff/v5 v5.0.3/go.mod h1:rkhZdG3JZukswDf7f0cwqPNk4K0sa+F97BxZthm/crw=
github.com/census-instrumentation/opencensus-proto v0.2.1/go.mod h1:f6KPmirojxKA12rnyqOA5BBL4O983OfeGPqjHWSTneU=
github.com/centrifugal/centrifuge v0.37.2 h1:rerQNvDfYN2FZEkVtb/hvGV7SIrJfEQrKF3MaE8GDlo=
github.com/centrifugal/centrifuge v0.37.2/go.mod h1:aj4iRJGhzi3SlL8iUtVezxway1Xf8g+hmNQkLLO7sS8=
github.com/centrifugal/protocol v0.16.2 h1:KoIHgDeX1fFxyxQoKW+6E8ZTCf5mwGm8JyGoJ5NBMbQ=
github.com/centrifugal/protocol v0.16.2/go.mod h1:Q7OpS/8HMXDnL7f9DpNx24IhG96MP88WPpVTTCdrokI=
github.com/centrifugal/centrifuge v0.38.0 h1:UJTowwc5lSwnpvd3vbrTseODbU7osSggN67RTrJ8EfQ=
github.com/centrifugal/centrifuge v0.38.0/go.mod h1:rcZLARnO5GXOeE9qG7iIPMvERxESespqkSX4cGLCAzo=
github.com/centrifugal/protocol v0.17.0 h1:hD0WczyiG7zrVJcgkQsd5/nhfFXt0Y04SJHV2Z7B1rg=
github.com/centrifugal/protocol v0.17.0/go.mod h1:9MdiYyjw5Bw1+d5Sp4Y0NK+qiuTNyd88nrHJsUUh8k4=
github.com/cespare/xxhash v1.1.0 h1:a6HrQnmkObjyL+Gs60czilIUGqrzKutQD6XZog3p+ko=
github.com/cespare/xxhash v1.1.0/go.mod h1:XrSqR1VqqWfGrhpAt58auRo0WTKS1nRRg3ghfAqPWnc=
github.com/cespare/xxhash/v2 v2.1.1/go.mod h1:VGX0DQ3Q6kWi7AoAeZDth3/j3BFtOZR5XLFGgcrjCOs=
@@ -1376,11 +1376,13 @@ github.com/puzpuzpuz/xsync/v2 v2.5.1 h1:mVGYAvzDSu52+zaGyNjC+24Xw2bQi3kTr4QJ6N9p
github.com/puzpuzpuz/xsync/v2 v2.5.1/go.mod h1:gD2H2krq/w52MfPLE+Uy64TzJDVY7lP2znR9qmR35kU=
github.com/puzpuzpuz/xsync/v4 v4.2.0 h1:dlxm77dZj2c3rxq0/XNvvUKISAmovoXF4a4qM6Wvkr0=
github.com/puzpuzpuz/xsync/v4 v4.2.0/go.mod h1:VJDmTCJMBt8igNxnkQd86r+8KUeN1quSfNKu5bLYFQo=
github.com/quagmt/udecimal v1.9.0 h1:TLuZiFeg0HhS6X8VDa78Y6XTaitZZfh+z5q4SXMzpDQ=
github.com/quagmt/udecimal v1.9.0/go.mod h1:ScmJ/xTGZcEoYiyMMzgDLn79PEJHcMBiJ4NNRT3FirA=
github.com/rcrowley/go-metrics v0.0.0-20181016184325-3113b8401b8a/go.mod h1:bCqnVzQkZxMG4s8nGwiZ5l3QUCyqpo9Y+/ZMZ9VjZe4=
github.com/redis/go-redis/v9 v9.14.0 h1:u4tNCjXOyzfgeLN+vAZaW1xUooqWDqVEsZN0U01jfAE=
github.com/redis/go-redis/v9 v9.14.0/go.mod h1:huWgSWd8mW6+m0VPhJjSSQ+d6Nh1VICQ6Q5lHuCH/Iw=
github.com/redis/rueidis v1.0.64 h1:XqgbueDuNV3qFdVdQwAHJl1uNt90zUuAJuzqjH4cw6Y=
github.com/redis/rueidis v1.0.64/go.mod h1:Lkhr2QTgcoYBhxARU7kJRO8SyVlgUuEkcJO1Y8MCluA=
github.com/redis/rueidis v1.0.68 h1:gept0E45JGxVigWb3zoWHvxEc4IOC7kc4V/4XvN8eG8=
github.com/redis/rueidis v1.0.68/go.mod h1:Lkhr2QTgcoYBhxARU7kJRO8SyVlgUuEkcJO1Y8MCluA=
github.com/remyoudompheng/bigfft v0.0.0-20230129092748-24d4a6f8daec h1:W09IVJc94icq4NjY3clb7Lk8O1qJ8BdBEF8z0ibU0rE=
github.com/remyoudompheng/bigfft v0.0.0-20230129092748-24d4a6f8daec/go.mod h1:qqbHyh8v60DhA7CoWK5oRCqLrMHRGoxYCSS9EjAz6Eo=
github.com/rivo/uniseg v0.2.0/go.mod h1:J6wj4VEh+S6ZtnVlnTBMWIodfgj8LQOQFoIToxlJtxc=

View File

@@ -4,6 +4,8 @@ import (
"context"
"fmt"
"github.com/prometheus/client_golang/prometheus"
"github.com/grafana/grafana-app-sdk/app"
"github.com/grafana/grafana-app-sdk/logging"
"github.com/grafana/grafana-app-sdk/operator"
@@ -12,7 +14,6 @@ import (
foldersKind "github.com/grafana/grafana/apps/folder/pkg/apis/folder/v1beta1"
"github.com/grafana/grafana/apps/iam/pkg/reconcilers"
"github.com/grafana/grafana/pkg/services/authz"
"github.com/prometheus/client_golang/prometheus"
)
var appManifestData = app.ManifestData{
@@ -78,7 +79,7 @@ func New(cfg app.Config) (app.App, error) {
folderReconciler, err := reconcilers.NewFolderReconciler(reconcilers.ReconcilerConfig{
ZanzanaCfg: appSpecificConfig.ZanzanaClientCfg,
Metrics: metrics,
})
}, appSpecificConfig.MetricsRegisterer)
if err != nil {
return nil, fmt.Errorf("unable to create FolderReconciler: %w", err)
}

View File

@@ -5,6 +5,7 @@ import (
"fmt"
"time"
"github.com/prometheus/client_golang/prometheus"
"go.opentelemetry.io/otel"
"go.opentelemetry.io/otel/attribute"
"go.opentelemetry.io/otel/codes"
@@ -35,9 +36,9 @@ type FolderReconciler struct {
metrics *ReconcilerMetrics
}
func NewFolderReconciler(cfg ReconcilerConfig) (operator.Reconciler, error) {
func NewFolderReconciler(cfg ReconcilerConfig, reg prometheus.Registerer) (operator.Reconciler, error) {
// Create Zanzana client
zanzanaClient, err := authz.NewRemoteZanzanaClient("*", cfg.ZanzanaCfg)
zanzanaClient, err := authz.NewRemoteZanzanaClient(cfg.ZanzanaCfg, reg)
if err != nil {
return nil, fmt.Errorf("unable to create zanzana client: %w", err)

View File

@@ -24,6 +24,7 @@ require (
require (
cel.dev/expr v0.24.0 // indirect
github.com/NYTimes/gziphandler v1.1.1 // indirect
github.com/ProtonMail/go-crypto v1.1.6 // indirect
github.com/antlr4-go/antlr/v4 v4.13.1 // indirect
github.com/apache/arrow-go/v18 v18.4.1 // indirect
github.com/armon/go-metrics v0.4.1 // indirect
@@ -35,16 +36,21 @@ require (
github.com/aws/aws-sdk-go-v2/service/internal/presigned-url v1.13.8 // indirect
github.com/aws/aws-sdk-go-v2/service/sts v1.38.5 // indirect
github.com/aws/smithy-go v1.23.1 // indirect
github.com/barkimedes/go-deepcopy v0.0.0-20220514131651-17c30cfc62df // indirect
github.com/beorn7/perks v1.0.1 // indirect
github.com/blang/semver v3.5.1+incompatible // indirect
github.com/blang/semver/v4 v4.0.0 // indirect
github.com/bluele/gcache v0.0.2 // indirect
github.com/bradfitz/gomemcache v0.0.0-20230905024940-24af94b03874 // indirect
github.com/bwmarrin/snowflake v0.3.0 // indirect
github.com/cenkalti/backoff/v5 v5.0.3 // indirect
github.com/cespare/xxhash/v2 v2.3.0 // indirect
github.com/cheekybits/genny v1.0.0 // indirect
github.com/cloudflare/circl v1.6.1 // indirect
github.com/coreos/go-semver v0.3.1 // indirect
github.com/coreos/go-systemd/v22 v22.5.0 // indirect
github.com/davecgh/go-spew v1.1.2-0.20180830191138-d8f796af33cc // indirect
github.com/diegoholiveira/jsonlogic/v3 v3.7.4 // indirect
github.com/evanphx/json-patch v5.9.11+incompatible // indirect
github.com/fatih/color v1.18.0 // indirect
github.com/felixge/httpsnoop v1.0.4 // indirect
@@ -132,11 +138,15 @@ require (
github.com/mohae/deepcopy v0.0.0-20170929034955-c48cc78d4826 // indirect
github.com/munnerz/goautoneg v0.0.0-20191010083416-a7dc8b61c822 // indirect
github.com/mwitkow/go-conntrack v0.0.0-20190716064945-2f068394615f // indirect
github.com/nikunjy/rules v1.5.0 // indirect
github.com/oasdiff/yaml v0.0.0-20250309154309-f31be36b4037 // indirect
github.com/oasdiff/yaml3 v0.0.0-20250309153720-d2182401db90 // indirect
github.com/oklog/run v1.1.0 // indirect
github.com/oklog/ulid v1.3.1 // indirect
github.com/olekukonko/tablewriter v0.0.5 // indirect
github.com/open-feature/go-sdk v1.16.0 // indirect
github.com/open-feature/go-sdk-contrib/providers/go-feature-flag v0.2.6 // indirect
github.com/open-feature/go-sdk-contrib/providers/ofrep v0.1.6 // indirect
github.com/patrickmn/go-cache v2.1.0+incompatible // indirect
github.com/perimeterx/marshmallow v1.1.5 // indirect
github.com/pierrec/lz4/v4 v4.1.22 // indirect
@@ -155,6 +165,7 @@ require (
github.com/spf13/pflag v1.0.10 // indirect
github.com/stoewer/go-strcase v1.3.1 // indirect
github.com/stretchr/objx v0.5.2 // indirect
github.com/thomaspoignant/go-feature-flag v1.42.0 // indirect
github.com/tjhop/slog-gokit v0.1.5 // indirect
github.com/woodsbury/decimal128 v1.3.0 // indirect
github.com/x448/float16 v0.8.4 // indirect
@@ -176,6 +187,8 @@ require (
go.opentelemetry.io/otel/sdk v1.38.0 // indirect
go.opentelemetry.io/otel/trace v1.38.0 // indirect
go.opentelemetry.io/proto/otlp v1.9.0 // indirect
go.uber.org/atomic v1.11.0 // indirect
go.uber.org/mock v0.6.0 // indirect
go.uber.org/multierr v1.11.0 // indirect
go.uber.org/zap v1.27.1 // indirect
go.yaml.in/yaml/v2 v2.4.3 // indirect

View File

@@ -4,9 +4,13 @@ cloud.google.com/go v0.26.0/go.mod h1:aQUYkXzVsufM+DwF1aE+0xfcU+56JwCaLick0ClmMT
cloud.google.com/go v0.34.0/go.mod h1:aQUYkXzVsufM+DwF1aE+0xfcU+56JwCaLick0ClmMTw=
filippo.io/edwards25519 v1.1.0 h1:FNf4tywRC1HmFuKW5xopWpigGjJKiJSV0Cqo0cJWDaA=
filippo.io/edwards25519 v1.1.0/go.mod h1:BxyFTGdWcka3PhytdK4V28tE5sGfRvvvRV7EaN4VDT4=
github.com/BurntSushi/toml v1.5.0 h1:W5quZX/G/csjUnuI8SUYlsHs9M38FC7znL0lIO+DvMg=
github.com/BurntSushi/toml v1.5.0/go.mod h1:ukJfTF/6rtPPRCnwkur4qwRxa8vTRFBF0uk2lLoLwho=
github.com/DataDog/datadog-go v3.2.0+incompatible/go.mod h1:LButxg5PwREeZtORoXG3tL4fMGNddJ+vMq1mwgfaqoQ=
github.com/NYTimes/gziphandler v1.1.1 h1:ZUDjpQae29j0ryrS0u/B8HZfJBtBQHjqw2rQ2cqUQ3I=
github.com/NYTimes/gziphandler v1.1.1/go.mod h1:n/CVRwUEOgIxrgPvAQhUUr9oeUtvrhMomdKFjzJNB0c=
github.com/ProtonMail/go-crypto v1.1.6 h1:ZcV+Ropw6Qn0AX9brlQLAUXfqLBc7Bl+f/DmNxpLfdw=
github.com/ProtonMail/go-crypto v1.1.6/go.mod h1:rA3QumHc/FZ8pAHreoekgiAbzpNsfQAosU5td4SnOrE=
github.com/alecthomas/template v0.0.0-20160405071501-a0175ee3bccc/go.mod h1:LOuyumcjzFXgccqObfd/Ljyb9UuFJ6TxHnclSeseNhc=
github.com/alecthomas/template v0.0.0-20190718012654-fb15b899a751/go.mod h1:LOuyumcjzFXgccqObfd/Ljyb9UuFJ6TxHnclSeseNhc=
github.com/alecthomas/units v0.0.0-20151022065526-2efee857e7cf/go.mod h1:ybxpYRFXyAe+OPACYpWeL0wqObRcbAqCMya13uyzqw0=
@@ -38,12 +42,18 @@ github.com/aws/aws-sdk-go-v2/service/sts v1.38.5 h1:+LVB0xBqEgjQoqr9bGZbRzvg212B
github.com/aws/aws-sdk-go-v2/service/sts v1.38.5/go.mod h1:xoaxeqnnUaZjPjaICgIy5B+MHCSb/ZSOn4MvkFNOUA0=
github.com/aws/smithy-go v1.23.1 h1:sLvcH6dfAFwGkHLZ7dGiYF7aK6mg4CgKA/iDKjLDt9M=
github.com/aws/smithy-go v1.23.1/go.mod h1:LEj2LM3rBRQJxPZTB4KuzZkaZYnZPnvgIhb4pu07mx0=
github.com/barkimedes/go-deepcopy v0.0.0-20220514131651-17c30cfc62df h1:GSoSVRLoBaFpOOds6QyY1L8AX7uoY+Ln3BHc22W40X0=
github.com/barkimedes/go-deepcopy v0.0.0-20220514131651-17c30cfc62df/go.mod h1:hiVxq5OP2bUGBRNS3Z/bt/reCLFNbdcST6gISi1fiOM=
github.com/beorn7/perks v0.0.0-20180321164747-3a771d992973/go.mod h1:Dwedo/Wpr24TaqPxmxbtue+5NUziq4I4S80YR8gNf3Q=
github.com/beorn7/perks v1.0.0/go.mod h1:KWe93zE9D1o94FZ5RNwFwVgaQK1VOXiVxmqh+CedLV8=
github.com/beorn7/perks v1.0.1 h1:VlbKKnNfV8bJzeqoa4cOKqO6bYr3WgKZxO8Z16+hsOM=
github.com/beorn7/perks v1.0.1/go.mod h1:G2ZrVWU2WbWT9wwq4/hrbKbnv/1ERSJQ0ibhJ6rlkpw=
github.com/blang/semver v3.5.1+incompatible h1:cQNTCjp13qL8KC3Nbxr/y2Bqb63oX6wdnnjpJbkM4JQ=
github.com/blang/semver v3.5.1+incompatible/go.mod h1:kRBLl5iJ+tD4TcOOxsy/0fnwebNt5EWlYSAyrTnjyyk=
github.com/blang/semver/v4 v4.0.0 h1:1PFHFE6yCCTv8C1TeyNNarDzntLi7wMI5i/pzqYIsAM=
github.com/blang/semver/v4 v4.0.0/go.mod h1:IbckMUScFkM3pff0VJDNKRiT6TG/YpiHIM2yvyW5YoQ=
github.com/bluele/gcache v0.0.2 h1:WcbfdXICg7G/DGBh1PFfcirkWOQV+v077yF1pSy3DGw=
github.com/bluele/gcache v0.0.2/go.mod h1:m15KV+ECjptwSPxKhOhQoAFQVtUFjTVkc3H8o0t/fp0=
github.com/bradfitz/gomemcache v0.0.0-20230905024940-24af94b03874 h1:N7oVaKyGp8bttX0bfZGmcGkjz7DLQXhAn3DNd3T0ous=
github.com/bradfitz/gomemcache v0.0.0-20230905024940-24af94b03874/go.mod h1:r5xuitiExdLAJ09PR7vBVENGvp4ZuTBeWTGtxuX3K+c=
github.com/bufbuild/protocompile v0.14.1 h1:iA73zAf/fyljNjQKwYzUHD6AD4R8KMasmwa/FBatYVw=
@@ -60,6 +70,8 @@ github.com/cheekybits/genny v1.0.0/go.mod h1:+tQajlRqAUrPI7DOSpB0XAqZYtQakVtB7wX
github.com/circonus-labs/circonus-gometrics v2.3.1+incompatible/go.mod h1:nmEj6Dob7S7YxXgwXpfOuvO54S+tGdZdw9fuRZt25Ag=
github.com/circonus-labs/circonusllhist v0.1.3/go.mod h1:kMXHVDlOchFAehlya5ePtbp5jckzBHf4XRpQvBOLI+I=
github.com/client9/misspell v0.3.4/go.mod h1:qj6jICC3Q7zFZvVWo7KLAzC3yx5G7kyvSDkc90ppPyw=
github.com/cloudflare/circl v1.6.1 h1:zqIqSPIndyBh1bjLVVDHMPpVKqp8Su/V+6MeDzzQBQ0=
github.com/cloudflare/circl v1.6.1/go.mod h1:uddAzsPgqdMAYatqJ0lsjX1oECcQLIlRpzZh3pJrofs=
github.com/coreos/go-semver v0.3.1 h1:yi21YpKnrx1gt5R+la8n5WgS0kCrsPp33dmEyHReZr4=
github.com/coreos/go-semver v0.3.1/go.mod h1:irMmmIw/7yzSRPWryHsK7EYSg09caPQL03VsM8rvUec=
github.com/coreos/go-systemd/v22 v22.5.0 h1:RrqgGjYQKalulkV8NGVIfkXQf6YYmOyiJKk8iXXhfZs=
@@ -69,6 +81,8 @@ github.com/davecgh/go-spew v1.1.0/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSs
github.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/davecgh/go-spew v1.1.2-0.20180830191138-d8f796af33cc h1:U9qPSI2PIWSS1VwoXQT9A3Wy9MM3WgvqSxFWenqJduM=
github.com/davecgh/go-spew v1.1.2-0.20180830191138-d8f796af33cc/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/diegoholiveira/jsonlogic/v3 v3.7.4 h1:92HSmB9bwM/o0ZvrCpcvTP2EsPXSkKtAniIr2W/dcIM=
github.com/diegoholiveira/jsonlogic/v3 v3.7.4/go.mod h1:OYRb6FSTVmMM+MNQ7ElmMsczyNSepw+OU4Z8emDSi4w=
github.com/dustin/go-humanize v1.0.1 h1:GzkhY7T5VNhEkwH0PVJgjz+fX1rhBrR7pRT3mDkpeCY=
github.com/dustin/go-humanize v1.0.1/go.mod h1:Mu1zIs6XwVuF/gI1OepvI0qD18qycQx+mFykh5fBlto=
github.com/emicklei/go-restful/v3 v3.13.0 h1:C4Bl2xDndpU6nJ4bc1jXd+uTmYPVUwkD6bFY/oTyCes=
@@ -341,6 +355,8 @@ github.com/munnerz/goautoneg v0.0.0-20191010083416-a7dc8b61c822/go.mod h1:+n7T8m
github.com/mwitkow/go-conntrack v0.0.0-20161129095857-cc309e4a2223/go.mod h1:qRWi+5nqEBWmkhHvq77mSJWrCKwh8bxhgT7d/eI7P4U=
github.com/mwitkow/go-conntrack v0.0.0-20190716064945-2f068394615f h1:KUppIJq7/+SVif2QVs3tOP0zanoHgBEVAwHxUSIzRqU=
github.com/mwitkow/go-conntrack v0.0.0-20190716064945-2f068394615f/go.mod h1:qRWi+5nqEBWmkhHvq77mSJWrCKwh8bxhgT7d/eI7P4U=
github.com/nikunjy/rules v1.5.0 h1:KJDSLOsFhwt7kcXUyZqwkgrQg5YoUwj+TVu6ItCQShw=
github.com/nikunjy/rules v1.5.0/go.mod h1:TlZtZdBChrkqi8Lr2AXocme8Z7EsbxtFdDoKeI6neBQ=
github.com/oasdiff/yaml v0.0.0-20250309154309-f31be36b4037 h1:G7ERwszslrBzRxj//JalHPu/3yz+De2J+4aLtSRlHiY=
github.com/oasdiff/yaml v0.0.0-20250309154309-f31be36b4037/go.mod h1:2bpvgLBZEtENV5scfDFEtB/5+1M4hkQhDQrccEJ/qGw=
github.com/oasdiff/yaml3 v0.0.0-20250309153720-d2182401db90 h1:bQx3WeLcUWy+RletIKwUIt4x3t8n2SxavmoclizMb8c=
@@ -355,6 +371,12 @@ github.com/onsi/ginkgo/v2 v2.22.2 h1:/3X8Panh8/WwhU/3Ssa6rCKqPLuAkVY2I0RoyDLySlU
github.com/onsi/ginkgo/v2 v2.22.2/go.mod h1:oeMosUL+8LtarXBHu/c0bx2D/K9zyQ6uX3cTyztHwsk=
github.com/onsi/gomega v1.36.2 h1:koNYke6TVk6ZmnyHrCXba/T/MoLBXFjeC1PtvYgw0A8=
github.com/onsi/gomega v1.36.2/go.mod h1:DdwyADRjrc825LhMEkD76cHR5+pUnjhUN8GlHlRPHzY=
github.com/open-feature/go-sdk v1.16.0 h1:5NCHYv5slvNBIZhYXAzAufo0OI59OACZ5tczVqSE+Tg=
github.com/open-feature/go-sdk v1.16.0/go.mod h1:EIF40QcoYT1VbQkMPy2ZJH4kvZeY+qGUXAorzSWgKSo=
github.com/open-feature/go-sdk-contrib/providers/go-feature-flag v0.2.6 h1:megzzlQGjsRVWDX8oJnLaa5eEcsAHekiL4Uvl3jSAcY=
github.com/open-feature/go-sdk-contrib/providers/go-feature-flag v0.2.6/go.mod h1:K1gDKvt76CGFLSUMHUydd5ba2V5Cv69gQZsdbnXhAm8=
github.com/open-feature/go-sdk-contrib/providers/ofrep v0.1.6 h1:WinefYxeVx5rV0uQmuWbxQf8iACu/JiRubo5w0saToc=
github.com/open-feature/go-sdk-contrib/providers/ofrep v0.1.6/go.mod h1:Dwcaoma6lZVqYwyfVlY7eB6RXbG+Ju3b9cnpTlUN+Hc=
github.com/pascaldekloe/goe v0.1.0 h1:cBOtyMzM9HTpWjXfbbunk26uA6nG3a8n06Wieeh0MwY=
github.com/pascaldekloe/goe v0.1.0/go.mod h1:lzWF7FIEvWOWxwDKqyGYQf6ZUaNfKdP144TG7ZOy1lc=
github.com/patrickmn/go-cache v2.1.0+incompatible h1:HRMgzkcYKYpi3C8ajMPV8OFXaaRUnok+kx1WdO15EQc=
@@ -440,6 +462,10 @@ github.com/stretchr/testify v1.8.1/go.mod h1:w2LPCIKwWwSfY2zedu0+kehJoqGctiVI29o
github.com/stretchr/testify v1.8.4/go.mod h1:sz/lmYIOXD/1dqDmKjjqLyZ2RngseejIcXlSw2iwfAo=
github.com/stretchr/testify v1.11.1 h1:7s2iGBzp5EwR7/aIZr8ao5+dra3wiQyKjjFuvgVKu7U=
github.com/stretchr/testify v1.11.1/go.mod h1:wZwfW3scLgRK+23gO65QZefKpKQRnfz6sD981Nm4B6U=
github.com/thejerf/slogassert v0.3.4 h1:VoTsXixRbXMrRSSxDjYTiEDCM4VWbsYPW5rB/hX24kM=
github.com/thejerf/slogassert v0.3.4/go.mod h1:0zn9ISLVKo1aPMTqcGfG1o6dWwt+Rk574GlUxHD4rs8=
github.com/thomaspoignant/go-feature-flag v1.42.0 h1:C7embmOTzaLyRki+OoU2RvtVjJE9IrvgBA2C1mRN1lc=
github.com/thomaspoignant/go-feature-flag v1.42.0/go.mod h1:y0QiWH7chHWhGATb/+XqwAwErORmPSH2MUsQlCmmWlM=
github.com/tjhop/slog-gokit v0.1.5 h1:ayloIUi5EK2QYB8eY4DOPO95/mRtMW42lUkp3quJohc=
github.com/tjhop/slog-gokit v0.1.5/go.mod h1:yA48zAHvV+Sg4z4VRyeFyFUNNXd3JY5Zg84u3USICq0=
github.com/tmc/grpc-websocket-proxy v0.0.0-20220101234140-673ab2c3ae75 h1:6fotK7otjonDflCTK0BCfls4SPy3NcCVb5dqqmbRknE=
@@ -507,8 +533,12 @@ go.opentelemetry.io/otel/trace v1.38.0 h1:Fxk5bKrDZJUH+AMyyIXGcFAPah0oRcT+LuNtJr
go.opentelemetry.io/otel/trace v1.38.0/go.mod h1:j1P9ivuFsTceSWe1oY+EeW3sc+Pp42sO++GHkg4wwhs=
go.opentelemetry.io/proto/otlp v1.9.0 h1:l706jCMITVouPOqEnii2fIAuO3IVGBRPV5ICjceRb/A=
go.opentelemetry.io/proto/otlp v1.9.0/go.mod h1:xE+Cx5E/eEHw+ISFkwPLwCZefwVjY+pqKg1qcK03+/4=
go.uber.org/atomic v1.11.0 h1:ZvwS0R+56ePWxUNi+Atn9dWONBPp/AUETXlHW0DxSjE=
go.uber.org/atomic v1.11.0/go.mod h1:LUxbIzbOniOlMKjJjyPfpl4v+PKK2cNJn91OQbhoJI0=
go.uber.org/goleak v1.3.0 h1:2K3zAYmnTNqV73imy9J1T3WC+gmCePx2hEGkimedGto=
go.uber.org/goleak v1.3.0/go.mod h1:CoHD4mav9JJNrW/WLlf7HGZPjdw8EucARQHekz1X6bE=
go.uber.org/mock v0.6.0 h1:hyF9dfmbgIX5EfOdasqLsWD6xqpNZlXblLB/Dbnwv3Y=
go.uber.org/mock v0.6.0/go.mod h1:KiVJ4BqZJaMj4svdfmHM0AUx4NJYO8ZNpPnZn1Z+BBU=
go.uber.org/multierr v1.11.0 h1:blXXJkSxSSfBVBlC76pxqeO+LN3aDfLQo+309xJstO0=
go.uber.org/multierr v1.11.0/go.mod h1:20+QtiLqy0Nd6FdQB9TLXag12DsQkrbs3htMFfDN80Y=
go.uber.org/zap v1.27.1 h1:08RqriUEv8+ArZRYSTXy1LeBScaMpVSTBhCeaZYfMYc=

View File

@@ -5,7 +5,24 @@ metaV0Alpha1: {
scope: "Namespaced"
schema: {
spec: {
pluginJSON: #JSONData,
pluginJson: #JSONData
module?: {
path: string
hash?: string
loadingStrategy?: "fetch" | "script"
}
baseURL?: string
signature?: {
status: "internal" | "valid" | "invalid" | "modified" | "unsigned"
type?: "grafana" | "commercial" | "community" | "private" | "private-glob"
org?: string
}
angular?: {
detected: bool
}
translations?: [string]: string
// +listType=atomic
children?: [...string]
}
}
}

View File

@@ -208,13 +208,20 @@ func NewMetaExtensions() *MetaExtensions {
// +k8s:openapi-gen=true
type MetaSpec struct {
PluginJSON MetaJSONData `json:"pluginJSON"`
PluginJson MetaJSONData `json:"pluginJson"`
Module *MetaV0alpha1SpecModule `json:"module,omitempty"`
BaseURL *string `json:"baseURL,omitempty"`
Signature *MetaV0alpha1SpecSignature `json:"signature,omitempty"`
Angular *MetaV0alpha1SpecAngular `json:"angular,omitempty"`
Translations map[string]string `json:"translations,omitempty"`
// +listType=atomic
Children []string `json:"children,omitempty"`
}
// NewMetaSpec creates a new MetaSpec object.
func NewMetaSpec() *MetaSpec {
return &MetaSpec{
PluginJSON: *NewMetaJSONData(),
PluginJson: *NewMetaJSONData(),
}
}
@@ -412,6 +419,40 @@ func NewMetaV0alpha1ExtensionsExtensionPoints() *MetaV0alpha1ExtensionsExtension
return &MetaV0alpha1ExtensionsExtensionPoints{}
}
// +k8s:openapi-gen=true
type MetaV0alpha1SpecModule struct {
Path string `json:"path"`
Hash *string `json:"hash,omitempty"`
LoadingStrategy *MetaV0alpha1SpecModuleLoadingStrategy `json:"loadingStrategy,omitempty"`
}
// NewMetaV0alpha1SpecModule creates a new MetaV0alpha1SpecModule object.
func NewMetaV0alpha1SpecModule() *MetaV0alpha1SpecModule {
return &MetaV0alpha1SpecModule{}
}
// +k8s:openapi-gen=true
type MetaV0alpha1SpecSignature struct {
Status MetaV0alpha1SpecSignatureStatus `json:"status"`
Type *MetaV0alpha1SpecSignatureType `json:"type,omitempty"`
Org *string `json:"org,omitempty"`
}
// NewMetaV0alpha1SpecSignature creates a new MetaV0alpha1SpecSignature object.
func NewMetaV0alpha1SpecSignature() *MetaV0alpha1SpecSignature {
return &MetaV0alpha1SpecSignature{}
}
// +k8s:openapi-gen=true
type MetaV0alpha1SpecAngular struct {
Detected bool `json:"detected"`
}
// NewMetaV0alpha1SpecAngular creates a new MetaV0alpha1SpecAngular object.
func NewMetaV0alpha1SpecAngular() *MetaV0alpha1SpecAngular {
return &MetaV0alpha1SpecAngular{}
}
// +k8s:openapi-gen=true
type MetaJSONDataType string
@@ -472,3 +513,33 @@ const (
MetaV0alpha1DependenciesPluginsTypeDatasource MetaV0alpha1DependenciesPluginsType = "datasource"
MetaV0alpha1DependenciesPluginsTypePanel MetaV0alpha1DependenciesPluginsType = "panel"
)
// +k8s:openapi-gen=true
type MetaV0alpha1SpecModuleLoadingStrategy string
const (
MetaV0alpha1SpecModuleLoadingStrategyFetch MetaV0alpha1SpecModuleLoadingStrategy = "fetch"
MetaV0alpha1SpecModuleLoadingStrategyScript MetaV0alpha1SpecModuleLoadingStrategy = "script"
)
// +k8s:openapi-gen=true
type MetaV0alpha1SpecSignatureStatus string
const (
MetaV0alpha1SpecSignatureStatusInternal MetaV0alpha1SpecSignatureStatus = "internal"
MetaV0alpha1SpecSignatureStatusValid MetaV0alpha1SpecSignatureStatus = "valid"
MetaV0alpha1SpecSignatureStatusInvalid MetaV0alpha1SpecSignatureStatus = "invalid"
MetaV0alpha1SpecSignatureStatusModified MetaV0alpha1SpecSignatureStatus = "modified"
MetaV0alpha1SpecSignatureStatusUnsigned MetaV0alpha1SpecSignatureStatus = "unsigned"
)
// +k8s:openapi-gen=true
type MetaV0alpha1SpecSignatureType string
const (
MetaV0alpha1SpecSignatureTypeGrafana MetaV0alpha1SpecSignatureType = "grafana"
MetaV0alpha1SpecSignatureTypeCommercial MetaV0alpha1SpecSignatureType = "commercial"
MetaV0alpha1SpecSignatureTypeCommunity MetaV0alpha1SpecSignatureType = "community"
MetaV0alpha1SpecSignatureTypePrivate MetaV0alpha1SpecSignatureType = "private"
MetaV0alpha1SpecSignatureTypePrivateGlob MetaV0alpha1SpecSignatureType = "private-glob"
)

File diff suppressed because one or more lines are too long

View File

@@ -10,8 +10,6 @@ import (
"time"
"github.com/grafana/grafana-app-sdk/logging"
pluginsv0alpha1 "github.com/grafana/grafana/apps/plugins/pkg/apis/plugins/v0alpha1"
)
const (
@@ -87,45 +85,9 @@ func (p *CatalogProvider) GetMeta(ctx context.Context, pluginID, version string)
return nil, fmt.Errorf("failed to decode response: %w", err)
}
metaSpec := grafanaComPluginVersionMetaToMetaSpec(gcomMeta)
return &Result{
Meta: gcomMeta.JSON,
Meta: metaSpec,
TTL: p.ttl,
}, nil
}
// grafanaComPluginVersionMeta represents the response from grafana.com API
// GET /api/plugins/{pluginId}/versions/{version}
type grafanaComPluginVersionMeta struct {
PluginID string `json:"pluginSlug"`
Version string `json:"version"`
URL string `json:"url"`
Commit string `json:"commit"`
Description string `json:"description"`
Keywords []string `json:"keywords"`
CreatedAt time.Time `json:"createdAt"`
UpdatedAt time.Time `json:"updatedAt"`
JSON pluginsv0alpha1.MetaJSONData `json:"json"`
Readme string `json:"readme"`
Downloads int `json:"downloads"`
Verified bool `json:"verified"`
Status string `json:"status"`
StatusContext string `json:"statusContext"`
DownloadSlug string `json:"downloadSlug"`
SignatureType string `json:"signatureType"`
SignedByOrg string `json:"signedByOrg"`
SignedByOrgName string `json:"signedByOrgName"`
Packages struct {
Any struct {
Md5 string `json:"md5"`
Sha256 string `json:"sha256"`
PackageName string `json:"packageName"`
DownloadURL string `json:"downloadUrl"`
} `json:"any"`
} `json:"packages"`
Links []struct {
Rel string `json:"rel"`
Href string `json:"href"`
} `json:"links"`
AngularDetected bool `json:"angularDetected"`
Scopes []string `json:"scopes"`
}

View File

@@ -49,7 +49,7 @@ func TestCatalogProvider_GetMeta(t *testing.T) {
require.NoError(t, err)
require.NotNil(t, result)
assert.Equal(t, expectedMeta, result.Meta)
assert.Equal(t, expectedMeta, result.Meta.PluginJson)
assert.Equal(t, defaultCatalogTTL, result.TTL)
})

View File

@@ -0,0 +1,725 @@
package meta
import (
"encoding/json"
"time"
pluginsv0alpha1 "github.com/grafana/grafana/apps/plugins/pkg/apis/plugins/v0alpha1"
"github.com/grafana/grafana/pkg/plugins"
"github.com/grafana/grafana/pkg/services/pluginsintegration/pluginstore"
)
// jsonDataToMetaJSONData converts a plugins.JSONData to a pluginsv0alpha1.MetaJSONData.
// nolint:gocyclo
func jsonDataToMetaJSONData(jsonData plugins.JSONData) pluginsv0alpha1.MetaJSONData {
meta := pluginsv0alpha1.MetaJSONData{
Id: jsonData.ID,
Name: jsonData.Name,
}
// Map plugin type
switch jsonData.Type {
case plugins.TypeApp:
meta.Type = pluginsv0alpha1.MetaJSONDataTypeApp
case plugins.TypeDataSource:
meta.Type = pluginsv0alpha1.MetaJSONDataTypeDatasource
case plugins.TypePanel:
meta.Type = pluginsv0alpha1.MetaJSONDataTypePanel
case plugins.TypeRenderer:
meta.Type = pluginsv0alpha1.MetaJSONDataTypeRenderer
}
// Map Info
meta.Info = pluginsv0alpha1.MetaInfo{
Keywords: jsonData.Info.Keywords,
Logos: pluginsv0alpha1.MetaV0alpha1InfoLogos{
Small: jsonData.Info.Logos.Small,
Large: jsonData.Info.Logos.Large,
},
Updated: jsonData.Info.Updated,
Version: jsonData.Info.Version,
}
if jsonData.Info.Description != "" {
meta.Info.Description = &jsonData.Info.Description
}
if jsonData.Info.Author.Name != "" || jsonData.Info.Author.URL != "" {
author := &pluginsv0alpha1.MetaV0alpha1InfoAuthor{}
if jsonData.Info.Author.Name != "" {
author.Name = &jsonData.Info.Author.Name
}
if jsonData.Info.Author.URL != "" {
author.Url = &jsonData.Info.Author.URL
}
meta.Info.Author = author
}
if len(jsonData.Info.Links) > 0 {
meta.Info.Links = make([]pluginsv0alpha1.MetaV0alpha1InfoLinks, 0, len(jsonData.Info.Links))
for _, link := range jsonData.Info.Links {
v0Link := pluginsv0alpha1.MetaV0alpha1InfoLinks{}
if link.Name != "" {
v0Link.Name = &link.Name
}
if link.URL != "" {
v0Link.Url = &link.URL
}
meta.Info.Links = append(meta.Info.Links, v0Link)
}
}
if len(jsonData.Info.Screenshots) > 0 {
meta.Info.Screenshots = make([]pluginsv0alpha1.MetaV0alpha1InfoScreenshots, 0, len(jsonData.Info.Screenshots))
for _, screenshot := range jsonData.Info.Screenshots {
v0Screenshot := pluginsv0alpha1.MetaV0alpha1InfoScreenshots{}
if screenshot.Name != "" {
v0Screenshot.Name = &screenshot.Name
}
if screenshot.Path != "" {
v0Screenshot.Path = &screenshot.Path
}
meta.Info.Screenshots = append(meta.Info.Screenshots, v0Screenshot)
}
}
// Map Dependencies
meta.Dependencies = pluginsv0alpha1.MetaDependencies{
GrafanaDependency: jsonData.Dependencies.GrafanaDependency,
}
if jsonData.Dependencies.GrafanaVersion != "" {
meta.Dependencies.GrafanaVersion = &jsonData.Dependencies.GrafanaVersion
}
if len(jsonData.Dependencies.Plugins) > 0 {
meta.Dependencies.Plugins = make([]pluginsv0alpha1.MetaV0alpha1DependenciesPlugins, 0, len(jsonData.Dependencies.Plugins))
for _, dep := range jsonData.Dependencies.Plugins {
var depType pluginsv0alpha1.MetaV0alpha1DependenciesPluginsType
switch dep.Type {
case "app":
depType = pluginsv0alpha1.MetaV0alpha1DependenciesPluginsTypeApp
case "datasource":
depType = pluginsv0alpha1.MetaV0alpha1DependenciesPluginsTypeDatasource
case "panel":
depType = pluginsv0alpha1.MetaV0alpha1DependenciesPluginsTypePanel
}
meta.Dependencies.Plugins = append(meta.Dependencies.Plugins, pluginsv0alpha1.MetaV0alpha1DependenciesPlugins{
Id: dep.ID,
Type: depType,
Name: dep.Name,
})
}
}
if len(jsonData.Dependencies.Extensions.ExposedComponents) > 0 {
meta.Dependencies.Extensions = &pluginsv0alpha1.MetaV0alpha1DependenciesExtensions{
ExposedComponents: jsonData.Dependencies.Extensions.ExposedComponents,
}
}
// Map optional boolean fields
if jsonData.Alerting {
meta.Alerting = &jsonData.Alerting
}
if jsonData.Annotations {
meta.Annotations = &jsonData.Annotations
}
if jsonData.AutoEnabled {
meta.AutoEnabled = &jsonData.AutoEnabled
}
if jsonData.Backend {
meta.Backend = &jsonData.Backend
}
if jsonData.BuiltIn {
meta.BuiltIn = &jsonData.BuiltIn
}
if jsonData.HideFromList {
meta.HideFromList = &jsonData.HideFromList
}
if jsonData.Logs {
meta.Logs = &jsonData.Logs
}
if jsonData.Metrics {
meta.Metrics = &jsonData.Metrics
}
if jsonData.MultiValueFilterOperators {
meta.MultiValueFilterOperators = &jsonData.MultiValueFilterOperators
}
if jsonData.Preload {
meta.Preload = &jsonData.Preload
}
if jsonData.SkipDataQuery {
meta.SkipDataQuery = &jsonData.SkipDataQuery
}
if jsonData.Streaming {
meta.Streaming = &jsonData.Streaming
}
if jsonData.Tracing {
meta.Tracing = &jsonData.Tracing
}
// Map category
if jsonData.Category != "" {
var category pluginsv0alpha1.MetaJSONDataCategory
switch jsonData.Category {
case "tsdb":
category = pluginsv0alpha1.MetaJSONDataCategoryTsdb
case "logging":
category = pluginsv0alpha1.MetaJSONDataCategoryLogging
case "cloud":
category = pluginsv0alpha1.MetaJSONDataCategoryCloud
case "tracing":
category = pluginsv0alpha1.MetaJSONDataCategoryTracing
case "profiling":
category = pluginsv0alpha1.MetaJSONDataCategoryProfiling
case "sql":
category = pluginsv0alpha1.MetaJSONDataCategorySql
case "enterprise":
category = pluginsv0alpha1.MetaJSONDataCategoryEnterprise
case "iot":
category = pluginsv0alpha1.MetaJSONDataCategoryIot
case "other":
category = pluginsv0alpha1.MetaJSONDataCategoryOther
default:
category = pluginsv0alpha1.MetaJSONDataCategoryOther
}
meta.Category = &category
}
// Map state
if jsonData.State != "" {
var state pluginsv0alpha1.MetaJSONDataState
switch jsonData.State {
case plugins.ReleaseStateAlpha:
state = pluginsv0alpha1.MetaJSONDataStateAlpha
case plugins.ReleaseStateBeta:
state = pluginsv0alpha1.MetaJSONDataStateBeta
default:
// other release states have no v0alpha1 equivalent; state stays empty
}
if state != "" {
meta.State = &state
}
}
// Map executable
if jsonData.Executable != "" {
meta.Executable = &jsonData.Executable
}
// Map QueryOptions
if len(jsonData.QueryOptions) > 0 {
queryOptions := &pluginsv0alpha1.MetaQueryOptions{}
if val, ok := jsonData.QueryOptions["maxDataPoints"]; ok {
queryOptions.MaxDataPoints = &val
}
if val, ok := jsonData.QueryOptions["minInterval"]; ok {
queryOptions.MinInterval = &val
}
if val, ok := jsonData.QueryOptions["cacheTimeout"]; ok {
queryOptions.CacheTimeout = &val
}
meta.QueryOptions = queryOptions
}
// Map Includes
if len(jsonData.Includes) > 0 {
meta.Includes = make([]pluginsv0alpha1.MetaInclude, 0, len(jsonData.Includes))
for _, include := range jsonData.Includes {
v0Include := pluginsv0alpha1.MetaInclude{}
if include.UID != "" {
v0Include.Uid = &include.UID
}
if include.Type != "" {
var includeType pluginsv0alpha1.MetaIncludeType
switch include.Type {
case "dashboard":
includeType = pluginsv0alpha1.MetaIncludeTypeDashboard
case "page":
includeType = pluginsv0alpha1.MetaIncludeTypePage
case "panel":
includeType = pluginsv0alpha1.MetaIncludeTypePanel
case "datasource":
includeType = pluginsv0alpha1.MetaIncludeTypeDatasource
}
v0Include.Type = &includeType
}
if include.Name != "" {
v0Include.Name = &include.Name
}
if include.Component != "" {
v0Include.Component = &include.Component
}
if include.Role != "" {
var role pluginsv0alpha1.MetaIncludeRole
switch include.Role {
case "Admin":
role = pluginsv0alpha1.MetaIncludeRoleAdmin
case "Editor":
role = pluginsv0alpha1.MetaIncludeRoleEditor
case "Viewer":
role = pluginsv0alpha1.MetaIncludeRoleViewer
}
v0Include.Role = &role
}
if include.Action != "" {
v0Include.Action = &include.Action
}
if include.Path != "" {
v0Include.Path = &include.Path
}
if include.AddToNav {
v0Include.AddToNav = &include.AddToNav
}
if include.DefaultNav {
v0Include.DefaultNav = &include.DefaultNav
}
if include.Icon != "" {
v0Include.Icon = &include.Icon
}
meta.Includes = append(meta.Includes, v0Include)
}
}
// Map Routes
if len(jsonData.Routes) > 0 {
meta.Routes = make([]pluginsv0alpha1.MetaRoute, 0, len(jsonData.Routes))
for _, route := range jsonData.Routes {
v0Route := pluginsv0alpha1.MetaRoute{}
if route.Path != "" {
v0Route.Path = &route.Path
}
if route.Method != "" {
v0Route.Method = &route.Method
}
if route.URL != "" {
v0Route.Url = &route.URL
}
if route.ReqRole != "" {
reqRole := string(route.ReqRole)
v0Route.ReqRole = &reqRole
}
if route.ReqAction != "" {
v0Route.ReqAction = &route.ReqAction
}
if len(route.Headers) > 0 {
headers := make([]string, 0, len(route.Headers))
for _, header := range route.Headers {
headers = append(headers, header.Name+": "+header.Content)
}
v0Route.Headers = headers
}
if len(route.URLParams) > 0 {
v0Route.UrlParams = make([]pluginsv0alpha1.MetaV0alpha1RouteUrlParams, 0, len(route.URLParams))
for _, param := range route.URLParams {
v0Param := pluginsv0alpha1.MetaV0alpha1RouteUrlParams{}
if param.Name != "" {
v0Param.Name = &param.Name
}
if param.Content != "" {
v0Param.Content = &param.Content
}
v0Route.UrlParams = append(v0Route.UrlParams, v0Param)
}
}
if route.TokenAuth != nil {
v0Route.TokenAuth = &pluginsv0alpha1.MetaV0alpha1RouteTokenAuth{}
if route.TokenAuth.Url != "" {
v0Route.TokenAuth.Url = &route.TokenAuth.Url
}
if len(route.TokenAuth.Scopes) > 0 {
v0Route.TokenAuth.Scopes = route.TokenAuth.Scopes
}
if len(route.TokenAuth.Params) > 0 {
v0Route.TokenAuth.Params = make(map[string]interface{})
for k, v := range route.TokenAuth.Params {
v0Route.TokenAuth.Params[k] = v
}
}
}
if route.JwtTokenAuth != nil {
v0Route.JwtTokenAuth = &pluginsv0alpha1.MetaV0alpha1RouteJwtTokenAuth{}
if route.JwtTokenAuth.Url != "" {
v0Route.JwtTokenAuth.Url = &route.JwtTokenAuth.Url
}
if len(route.JwtTokenAuth.Scopes) > 0 {
v0Route.JwtTokenAuth.Scopes = route.JwtTokenAuth.Scopes
}
if len(route.JwtTokenAuth.Params) > 0 {
v0Route.JwtTokenAuth.Params = make(map[string]interface{})
for k, v := range route.JwtTokenAuth.Params {
v0Route.JwtTokenAuth.Params[k] = v
}
}
}
if len(route.Body) > 0 {
var bodyMap map[string]interface{}
if err := json.Unmarshal(route.Body, &bodyMap); err == nil {
v0Route.Body = bodyMap
}
}
meta.Routes = append(meta.Routes, v0Route)
}
}
// Map Extensions
if len(jsonData.Extensions.AddedLinks) > 0 || len(jsonData.Extensions.AddedComponents) > 0 ||
len(jsonData.Extensions.ExposedComponents) > 0 || len(jsonData.Extensions.ExtensionPoints) > 0 {
extensions := &pluginsv0alpha1.MetaExtensions{}
if len(jsonData.Extensions.AddedLinks) > 0 {
extensions.AddedLinks = make([]pluginsv0alpha1.MetaV0alpha1ExtensionsAddedLinks, 0, len(jsonData.Extensions.AddedLinks))
for _, link := range jsonData.Extensions.AddedLinks {
v0Link := pluginsv0alpha1.MetaV0alpha1ExtensionsAddedLinks{
Targets: link.Targets,
Title: link.Title,
}
if link.Description != "" {
v0Link.Description = &link.Description
}
extensions.AddedLinks = append(extensions.AddedLinks, v0Link)
}
}
if len(jsonData.Extensions.AddedComponents) > 0 {
extensions.AddedComponents = make([]pluginsv0alpha1.MetaV0alpha1ExtensionsAddedComponents, 0, len(jsonData.Extensions.AddedComponents))
for _, comp := range jsonData.Extensions.AddedComponents {
v0Comp := pluginsv0alpha1.MetaV0alpha1ExtensionsAddedComponents{
Targets: comp.Targets,
Title: comp.Title,
}
if comp.Description != "" {
v0Comp.Description = &comp.Description
}
extensions.AddedComponents = append(extensions.AddedComponents, v0Comp)
}
}
if len(jsonData.Extensions.ExposedComponents) > 0 {
extensions.ExposedComponents = make([]pluginsv0alpha1.MetaV0alpha1ExtensionsExposedComponents, 0, len(jsonData.Extensions.ExposedComponents))
for _, comp := range jsonData.Extensions.ExposedComponents {
v0Comp := pluginsv0alpha1.MetaV0alpha1ExtensionsExposedComponents{
Id: comp.Id,
}
if comp.Title != "" {
v0Comp.Title = &comp.Title
}
if comp.Description != "" {
v0Comp.Description = &comp.Description
}
extensions.ExposedComponents = append(extensions.ExposedComponents, v0Comp)
}
}
if len(jsonData.Extensions.ExtensionPoints) > 0 {
extensions.ExtensionPoints = make([]pluginsv0alpha1.MetaV0alpha1ExtensionsExtensionPoints, 0, len(jsonData.Extensions.ExtensionPoints))
for _, point := range jsonData.Extensions.ExtensionPoints {
v0Point := pluginsv0alpha1.MetaV0alpha1ExtensionsExtensionPoints{
Id: point.Id,
}
if point.Title != "" {
v0Point.Title = &point.Title
}
if point.Description != "" {
v0Point.Description = &point.Description
}
extensions.ExtensionPoints = append(extensions.ExtensionPoints, v0Point)
}
}
meta.Extensions = extensions
}
// Map Roles
if len(jsonData.Roles) > 0 {
meta.Roles = make([]pluginsv0alpha1.MetaRole, 0, len(jsonData.Roles))
for _, role := range jsonData.Roles {
v0Role := pluginsv0alpha1.MetaRole{
Grants: role.Grants,
}
if role.Role.Name != "" || role.Role.Description != "" || len(role.Role.Permissions) > 0 {
v0RoleRole := &pluginsv0alpha1.MetaV0alpha1RoleRole{}
if role.Role.Name != "" {
v0RoleRole.Name = &role.Role.Name
}
if role.Role.Description != "" {
v0RoleRole.Description = &role.Role.Description
}
if len(role.Role.Permissions) > 0 {
v0RoleRole.Permissions = make([]pluginsv0alpha1.MetaV0alpha1RoleRolePermissions, 0, len(role.Role.Permissions))
for _, perm := range role.Role.Permissions {
v0Perm := pluginsv0alpha1.MetaV0alpha1RoleRolePermissions{}
if perm.Action != "" {
v0Perm.Action = &perm.Action
}
if perm.Scope != "" {
v0Perm.Scope = &perm.Scope
}
v0RoleRole.Permissions = append(v0RoleRole.Permissions, v0Perm)
}
}
v0Role.Role = v0RoleRole
}
meta.Roles = append(meta.Roles, v0Role)
}
}
// Map IAM
if jsonData.IAM != nil && len(jsonData.IAM.Permissions) > 0 {
iam := &pluginsv0alpha1.MetaIAM{
Permissions: make([]pluginsv0alpha1.MetaV0alpha1IAMPermissions, 0, len(jsonData.IAM.Permissions)),
}
for _, perm := range jsonData.IAM.Permissions {
v0Perm := pluginsv0alpha1.MetaV0alpha1IAMPermissions{}
if perm.Action != "" {
v0Perm.Action = &perm.Action
}
if perm.Scope != "" {
v0Perm.Scope = &perm.Scope
}
iam.Permissions = append(iam.Permissions, v0Perm)
}
meta.Iam = iam
}
return meta
}
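Throughout jsonDataToMetaJSONData, optional scalars become pointer fields: the converter takes the address only when the source value is non-zero, so empty values disappear from the serialized form under `omitempty`. A minimal self-contained sketch of that pattern, using hypothetical stand-in types (`jsonDataSketch`, `metaSketch`) rather than the real `plugins.JSONData` and `MetaJSONData`:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// metaSketch stands in for the generated v0alpha1 type: optional fields are
// pointers with omitempty so zero values are dropped from the JSON output.
type metaSketch struct {
	Id       string  `json:"id"`
	Alerting *bool   `json:"alerting,omitempty"`
	Category *string `json:"category,omitempty"`
}

// jsonDataSketch stands in for the source plugin.json model, which uses
// plain (non-pointer) fields.
type jsonDataSketch struct {
	ID       string
	Alerting bool
	Category string
}

func toMetaSketch(src jsonDataSketch) metaSketch {
	m := metaSketch{Id: src.ID}
	// Same pattern as the converter: only take the address of a field
	// when it holds a non-zero value.
	if src.Alerting {
		m.Alerting = &src.Alerting
	}
	if src.Category != "" {
		m.Category = &src.Category
	}
	return m
}

func main() {
	out, _ := json.Marshal(toMetaSketch(jsonDataSketch{ID: "my-plugin", Alerting: true}))
	fmt.Println(string(out)) // prints {"id":"my-plugin","alerting":true}
}
```

Because `src` is a copy passed by value, taking `&src.Alerting` is safe; the pointer refers to the copy, not the caller's struct.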
// pluginStorePluginToMeta converts a pluginstore.Plugin to a pluginsv0alpha1.MetaSpec.
// This is similar to pluginToPluginMetaSpec but works with the plugin store DTO.
// loadingStrategy and moduleHash are optional calculated values that can be provided.
func pluginStorePluginToMeta(plugin pluginstore.Plugin, loadingStrategy plugins.LoadingStrategy, moduleHash string) pluginsv0alpha1.MetaSpec {
metaSpec := pluginsv0alpha1.MetaSpec{
PluginJson: jsonDataToMetaJSONData(plugin.JSONData),
}
if plugin.Module != "" {
module := &pluginsv0alpha1.MetaV0alpha1SpecModule{
Path: plugin.Module,
}
if moduleHash != "" {
module.Hash = &moduleHash
}
if loadingStrategy != "" {
var ls pluginsv0alpha1.MetaV0alpha1SpecModuleLoadingStrategy
switch loadingStrategy {
case plugins.LoadingStrategyFetch:
ls = pluginsv0alpha1.MetaV0alpha1SpecModuleLoadingStrategyFetch
case plugins.LoadingStrategyScript:
ls = pluginsv0alpha1.MetaV0alpha1SpecModuleLoadingStrategyScript
}
module.LoadingStrategy = &ls
}
metaSpec.Module = module
}
if plugin.BaseURL != "" {
metaSpec.BaseURL = &plugin.BaseURL
}
if plugin.Signature != "" {
signature := &pluginsv0alpha1.MetaV0alpha1SpecSignature{
Status: convertSignatureStatus(plugin.Signature),
}
if plugin.SignatureType != "" {
sigType := convertSignatureType(plugin.SignatureType)
signature.Type = &sigType
}
if plugin.SignatureOrg != "" {
signature.Org = &plugin.SignatureOrg
}
metaSpec.Signature = signature
}
if len(plugin.Children) > 0 {
metaSpec.Children = plugin.Children
}
metaSpec.Angular = &pluginsv0alpha1.MetaV0alpha1SpecAngular{
Detected: plugin.Angular.Detected,
}
if len(plugin.Translations) > 0 {
metaSpec.Translations = plugin.Translations
}
return metaSpec
}
// convertSignatureStatus converts plugins.SignatureStatus to pluginsv0alpha1.MetaV0alpha1SpecSignatureStatus.
func convertSignatureStatus(status plugins.SignatureStatus) pluginsv0alpha1.MetaV0alpha1SpecSignatureStatus {
switch status {
case plugins.SignatureStatusInternal:
return pluginsv0alpha1.MetaV0alpha1SpecSignatureStatusInternal
case plugins.SignatureStatusValid:
return pluginsv0alpha1.MetaV0alpha1SpecSignatureStatusValid
case plugins.SignatureStatusInvalid:
return pluginsv0alpha1.MetaV0alpha1SpecSignatureStatusInvalid
case plugins.SignatureStatusModified:
return pluginsv0alpha1.MetaV0alpha1SpecSignatureStatusModified
case plugins.SignatureStatusUnsigned:
return pluginsv0alpha1.MetaV0alpha1SpecSignatureStatusUnsigned
default:
return pluginsv0alpha1.MetaV0alpha1SpecSignatureStatusUnsigned
}
}
// convertSignatureType converts plugins.SignatureType to pluginsv0alpha1.MetaV0alpha1SpecSignatureType.
func convertSignatureType(sigType plugins.SignatureType) pluginsv0alpha1.MetaV0alpha1SpecSignatureType {
switch sigType {
case plugins.SignatureTypeGrafana:
return pluginsv0alpha1.MetaV0alpha1SpecSignatureTypeGrafana
case plugins.SignatureTypeCommercial:
return pluginsv0alpha1.MetaV0alpha1SpecSignatureTypeCommercial
case plugins.SignatureTypeCommunity:
return pluginsv0alpha1.MetaV0alpha1SpecSignatureTypeCommunity
case plugins.SignatureTypePrivate:
return pluginsv0alpha1.MetaV0alpha1SpecSignatureTypePrivate
case plugins.SignatureTypePrivateGlob:
return pluginsv0alpha1.MetaV0alpha1SpecSignatureTypePrivateGlob
default:
return pluginsv0alpha1.MetaV0alpha1SpecSignatureTypeGrafana
}
}
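Both convertSignatureStatus and convertSignatureType map every known value explicitly and collapse anything unrecognized into a safe default, so the result is never the enum's empty zero value. A sketch of that shape with hypothetical string-typed enums standing in for the real Grafana constants:

```go
package main

import "fmt"

// Hypothetical stand-ins for plugins.SignatureStatus and the generated
// v0alpha1 status enum; the real constants live in the Grafana packages.
type sigStatus string
type v0Status string

const (
	statusValid    sigStatus = "valid"
	statusInvalid  sigStatus = "invalid"
	statusUnsigned sigStatus = "unsigned"

	v0Valid    v0Status = "valid"
	v0Invalid  v0Status = "invalid"
	v0Unsigned v0Status = "unsigned"
)

// convertStatus mirrors the converter's shape: each known value is mapped
// explicitly, and the default arm guarantees a usable result even for
// statuses added to the source enum later.
func convertStatus(s sigStatus) v0Status {
	switch s {
	case statusValid:
		return v0Valid
	case statusInvalid:
		return v0Invalid
	case statusUnsigned:
		return v0Unsigned
	default:
		return v0Unsigned
	}
}

func main() {
	fmt.Println(convertStatus(statusValid))           // valid
	fmt.Println(convertStatus(sigStatus("modified"))) // unsigned: unknown values fall back
}
```

The default arm matters because both enums will evolve independently; without it, a new source status would silently produce an empty string in the API object.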
// pluginToMetaSpec converts a fully loaded *plugins.Plugin to a pluginsv0alpha1.MetaSpec.
func pluginToMetaSpec(plugin *plugins.Plugin) pluginsv0alpha1.MetaSpec {
metaSpec := pluginsv0alpha1.MetaSpec{
PluginJson: jsonDataToMetaJSONData(plugin.JSONData),
}
// Set module information
if plugin.Module != "" {
module := &pluginsv0alpha1.MetaV0alpha1SpecModule{
Path: plugin.Module,
}
loadingStrategy := pluginsv0alpha1.MetaV0alpha1SpecModuleLoadingStrategyScript
module.LoadingStrategy = &loadingStrategy
metaSpec.Module = module
}
// Set BaseURL
if plugin.BaseURL != "" {
metaSpec.BaseURL = &plugin.BaseURL
}
// Set signature information
signature := &pluginsv0alpha1.MetaV0alpha1SpecSignature{
Status: convertSignatureStatus(plugin.Signature),
}
if plugin.SignatureType != "" {
sigType := convertSignatureType(plugin.SignatureType)
signature.Type = &sigType
}
if plugin.SignatureOrg != "" {
signature.Org = &plugin.SignatureOrg
}
metaSpec.Signature = signature
if len(plugin.Children) > 0 {
children := make([]string, 0, len(plugin.Children))
for _, child := range plugin.Children {
children = append(children, child.ID)
}
metaSpec.Children = children
}
metaSpec.Angular = &pluginsv0alpha1.MetaV0alpha1SpecAngular{
Detected: plugin.Angular.Detected,
}
if len(plugin.Translations) > 0 {
metaSpec.Translations = plugin.Translations
}
return metaSpec
}
// grafanaComPluginVersionMeta represents the response from grafana.com API
// GET /api/plugins/{pluginId}/versions/{version}
type grafanaComPluginVersionMeta struct {
PluginID string `json:"pluginSlug"`
Version string `json:"version"`
URL string `json:"url"`
Commit string `json:"commit"`
Description string `json:"description"`
Keywords []string `json:"keywords"`
CreatedAt time.Time `json:"createdAt"`
UpdatedAt time.Time `json:"updatedAt"`
JSON pluginsv0alpha1.MetaJSONData `json:"json"`
Readme string `json:"readme"`
Downloads int `json:"downloads"`
Verified bool `json:"verified"`
Status string `json:"status"`
StatusContext string `json:"statusContext"`
DownloadSlug string `json:"downloadSlug"`
SignatureType string `json:"signatureType"`
SignedByOrg string `json:"signedByOrg"`
SignedByOrgName string `json:"signedByOrgName"`
Packages struct {
Any struct {
Md5 string `json:"md5"`
Sha256 string `json:"sha256"`
PackageName string `json:"packageName"`
DownloadURL string `json:"downloadUrl"`
} `json:"any"`
} `json:"packages"`
Links []struct {
Rel string `json:"rel"`
Href string `json:"href"`
} `json:"links"`
AngularDetected bool `json:"angularDetected"`
Scopes []string `json:"scopes"`
}
// grafanaComPluginVersionMetaToMetaSpec converts a grafanaComPluginVersionMeta to a pluginsv0alpha1.MetaSpec.
func grafanaComPluginVersionMetaToMetaSpec(gcomMeta grafanaComPluginVersionMeta) pluginsv0alpha1.MetaSpec {
metaSpec := pluginsv0alpha1.MetaSpec{
PluginJson: gcomMeta.JSON,
}
if gcomMeta.SignatureType != "" {
signature := &pluginsv0alpha1.MetaV0alpha1SpecSignature{
Status: pluginsv0alpha1.MetaV0alpha1SpecSignatureStatusValid,
}
switch gcomMeta.SignatureType {
case "grafana":
sigType := pluginsv0alpha1.MetaV0alpha1SpecSignatureTypeGrafana
signature.Type = &sigType
case "commercial":
sigType := pluginsv0alpha1.MetaV0alpha1SpecSignatureTypeCommercial
signature.Type = &sigType
case "community":
sigType := pluginsv0alpha1.MetaV0alpha1SpecSignatureTypeCommunity
signature.Type = &sigType
case "private":
sigType := pluginsv0alpha1.MetaV0alpha1SpecSignatureTypePrivate
signature.Type = &sigType
case "private-glob":
sigType := pluginsv0alpha1.MetaV0alpha1SpecSignatureTypePrivateGlob
signature.Type = &sigType
}
if gcomMeta.SignedByOrg != "" {
signature.Org = &gcomMeta.SignedByOrg
}
metaSpec.Signature = signature
}
// Set angular info
metaSpec.Angular = &pluginsv0alpha1.MetaV0alpha1SpecAngular{
Detected: gcomMeta.AngularDetected,
}
return metaSpec
}

View File

@@ -2,7 +2,6 @@ package meta
import (
"context"
"encoding/json"
"errors"
"os"
"path/filepath"
@@ -13,7 +12,15 @@ import (
pluginsv0alpha1 "github.com/grafana/grafana/apps/plugins/pkg/apis/plugins/v0alpha1"
"github.com/grafana/grafana/pkg/plugins"
"github.com/grafana/grafana/pkg/plugins/config"
pluginsLoader "github.com/grafana/grafana/pkg/plugins/manager/loader"
"github.com/grafana/grafana/pkg/plugins/manager/pipeline/bootstrap"
"github.com/grafana/grafana/pkg/plugins/manager/pipeline/discovery"
"github.com/grafana/grafana/pkg/plugins/manager/pipeline/initialization"
"github.com/grafana/grafana/pkg/plugins/manager/pipeline/termination"
"github.com/grafana/grafana/pkg/plugins/manager/pipeline/validation"
"github.com/grafana/grafana/pkg/plugins/manager/sources"
"github.com/grafana/grafana/pkg/services/pluginsintegration/pluginerrs"
)
const (
@@ -23,9 +30,10 @@ const (
// CoreProvider retrieves plugin metadata for core plugins.
type CoreProvider struct {
mu sync.RWMutex
- loadedPlugins map[string]pluginsv0alpha1.MetaJSONData
+ loadedPlugins map[string]pluginsv0alpha1.MetaSpec
initialized bool
ttl time.Duration
+ loader pluginsLoader.Service
}
// NewCoreProvider creates a new CoreProvider for core plugins.
@@ -35,9 +43,13 @@ func NewCoreProvider() *CoreProvider {
// NewCoreProviderWithTTL creates a new CoreProvider with a custom TTL.
func NewCoreProviderWithTTL(ttl time.Duration) *CoreProvider {
+ cfg := &config.PluginManagementCfg{
+ Features: config.Features{},
+ }
return &CoreProvider{
- loadedPlugins: make(map[string]pluginsv0alpha1.MetaJSONData),
+ loadedPlugins: make(map[string]pluginsv0alpha1.MetaSpec),
ttl: ttl,
+ loader: createLoader(cfg),
}
}
@@ -76,9 +88,9 @@ func (p *CoreProvider) GetMeta(ctx context.Context, pluginID, _ string) (*Result
p.initialized = true
}
- if meta, found := p.loadedPlugins[pluginID]; found {
+ if spec, found := p.loadedPlugins[pluginID]; found {
return &Result{
- Meta: meta,
+ Meta: spec,
TTL: p.ttl,
}, nil
}
@@ -86,8 +98,8 @@ func (p *CoreProvider) GetMeta(ctx context.Context, pluginID, _ string) (*Result
return nil, ErrMetaNotFound
}
- // loadPlugins discovers and caches all core plugins.
- // Returns an error if the static root path cannot be found or if plugin discovery fails.
+ // loadPlugins discovers and caches all core plugins by fully loading them.
+ // Returns an error if the static root path cannot be found or if plugin loading fails.
// This error will be handled gracefully by GetMeta, which will return ErrMetaNotFound
// to allow other providers to handle the request.
func (p *CoreProvider) loadPlugins(ctx context.Context) error {
@@ -108,496 +120,51 @@ func (p *CoreProvider) loadPlugins(ctx context.Context) error {
panelPath := filepath.Join(staticRootPath, "app", "plugins", "panel")
src := sources.NewLocalSource(plugins.ClassCore, []string{datasourcePath, panelPath})
- ps, err := src.Discover(ctx)
+ loadedPlugins, err := p.loader.Load(ctx, src)
if err != nil {
return err
}
- if len(ps) == 0 {
- logging.DefaultLogger.Warn("CoreProvider: no core plugins found during discovery")
+ if len(loadedPlugins) == 0 {
+ logging.DefaultLogger.Warn("CoreProvider: no core plugins found during loading")
return nil
}
- for _, bundle := range ps {
- meta := jsonDataToMetaJSONData(bundle.Primary.JSONData)
- p.loadedPlugins[bundle.Primary.JSONData.ID] = meta
+ for _, plugin := range loadedPlugins {
+ metaSpec := pluginToMetaSpec(plugin)
+ p.loadedPlugins[plugin.ID] = metaSpec
}
return nil
}
- // jsonDataToMetaJSONData converts a plugins.JSONData to a pluginsv0alpha1.MetaJSONData.
- // nolint:gocyclo
- func jsonDataToMetaJSONData(jsonData plugins.JSONData) pluginsv0alpha1.MetaJSONData {
- meta := pluginsv0alpha1.MetaJSONData{
- Id: jsonData.ID,
- Name: jsonData.Name,
- }
- // Map plugin type
- switch jsonData.Type {
- case plugins.TypeApp:
- meta.Type = pluginsv0alpha1.MetaJSONDataTypeApp
- case plugins.TypeDataSource:
- meta.Type = pluginsv0alpha1.MetaJSONDataTypeDatasource
- case plugins.TypePanel:
- meta.Type = pluginsv0alpha1.MetaJSONDataTypePanel
- case plugins.TypeRenderer:
- meta.Type = pluginsv0alpha1.MetaJSONDataTypeRenderer
- }
- // Map Info
- meta.Info = pluginsv0alpha1.MetaInfo{
- Keywords: jsonData.Info.Keywords,
- Logos: pluginsv0alpha1.MetaV0alpha1InfoLogos{
- Small: jsonData.Info.Logos.Small,
- Large: jsonData.Info.Logos.Large,
+ // createLoader creates a loader service configured for core plugins.
+ func createLoader(cfg *config.PluginManagementCfg) pluginsLoader.Service {
+ d := discovery.New(cfg, discovery.Opts{
+ FilterFuncs: []discovery.FilterFunc{
+ // Allow all plugin types for core plugins
},
- Updated: jsonData.Info.Updated,
- Version: jsonData.Info.Version,
- }
+ })
+ b := bootstrap.New(cfg, bootstrap.Opts{
+ DecorateFuncs: []bootstrap.DecorateFunc{}, // no decoration required for metadata
+ })
+ v := validation.New(cfg, validation.Opts{
+ ValidateFuncs: []validation.ValidateFunc{
+ // Skip validation for core plugins - they're trusted
+ },
+ })
+ i := initialization.New(cfg, initialization.Opts{
+ InitializeFuncs: []initialization.InitializeFunc{
+ // Skip initialization - we only need metadata, not running plugins
+ },
+ })
+ t, _ := termination.New(cfg, termination.Opts{
+ TerminateFuncs: []termination.TerminateFunc{
+ // No termination needed for metadata-only loading
+ },
+ })
- if jsonData.Info.Description != "" {
- meta.Info.Description = &jsonData.Info.Description
- }
+ et := pluginerrs.ProvideErrorTracker()
if jsonData.Info.Author.Name != "" || jsonData.Info.Author.URL != "" {
author := &pluginsv0alpha1.MetaV0alpha1InfoAuthor{}
if jsonData.Info.Author.Name != "" {
author.Name = &jsonData.Info.Author.Name
}
if jsonData.Info.Author.URL != "" {
author.Url = &jsonData.Info.Author.URL
}
meta.Info.Author = author
}
if len(jsonData.Info.Links) > 0 {
meta.Info.Links = make([]pluginsv0alpha1.MetaV0alpha1InfoLinks, 0, len(jsonData.Info.Links))
for _, link := range jsonData.Info.Links {
v0Link := pluginsv0alpha1.MetaV0alpha1InfoLinks{}
if link.Name != "" {
v0Link.Name = &link.Name
}
if link.URL != "" {
v0Link.Url = &link.URL
}
meta.Info.Links = append(meta.Info.Links, v0Link)
}
}
if len(jsonData.Info.Screenshots) > 0 {
meta.Info.Screenshots = make([]pluginsv0alpha1.MetaV0alpha1InfoScreenshots, 0, len(jsonData.Info.Screenshots))
for _, screenshot := range jsonData.Info.Screenshots {
v0Screenshot := pluginsv0alpha1.MetaV0alpha1InfoScreenshots{}
if screenshot.Name != "" {
v0Screenshot.Name = &screenshot.Name
}
if screenshot.Path != "" {
v0Screenshot.Path = &screenshot.Path
}
meta.Info.Screenshots = append(meta.Info.Screenshots, v0Screenshot)
}
}
// Map Dependencies
meta.Dependencies = pluginsv0alpha1.MetaDependencies{
GrafanaDependency: jsonData.Dependencies.GrafanaDependency,
}
if jsonData.Dependencies.GrafanaVersion != "" {
meta.Dependencies.GrafanaVersion = &jsonData.Dependencies.GrafanaVersion
}
if len(jsonData.Dependencies.Plugins) > 0 {
meta.Dependencies.Plugins = make([]pluginsv0alpha1.MetaV0alpha1DependenciesPlugins, 0, len(jsonData.Dependencies.Plugins))
for _, dep := range jsonData.Dependencies.Plugins {
var depType pluginsv0alpha1.MetaV0alpha1DependenciesPluginsType
switch dep.Type {
case "app":
depType = pluginsv0alpha1.MetaV0alpha1DependenciesPluginsTypeApp
case "datasource":
depType = pluginsv0alpha1.MetaV0alpha1DependenciesPluginsTypeDatasource
case "panel":
depType = pluginsv0alpha1.MetaV0alpha1DependenciesPluginsTypePanel
}
meta.Dependencies.Plugins = append(meta.Dependencies.Plugins, pluginsv0alpha1.MetaV0alpha1DependenciesPlugins{
Id: dep.ID,
Type: depType,
Name: dep.Name,
})
}
}
if len(jsonData.Dependencies.Extensions.ExposedComponents) > 0 {
meta.Dependencies.Extensions = &pluginsv0alpha1.MetaV0alpha1DependenciesExtensions{
ExposedComponents: jsonData.Dependencies.Extensions.ExposedComponents,
}
}
// Map optional boolean fields
if jsonData.Alerting {
meta.Alerting = &jsonData.Alerting
}
if jsonData.Annotations {
meta.Annotations = &jsonData.Annotations
}
if jsonData.AutoEnabled {
meta.AutoEnabled = &jsonData.AutoEnabled
}
if jsonData.Backend {
meta.Backend = &jsonData.Backend
}
if jsonData.BuiltIn {
meta.BuiltIn = &jsonData.BuiltIn
}
if jsonData.HideFromList {
meta.HideFromList = &jsonData.HideFromList
}
if jsonData.Logs {
meta.Logs = &jsonData.Logs
}
if jsonData.Metrics {
meta.Metrics = &jsonData.Metrics
}
if jsonData.MultiValueFilterOperators {
meta.MultiValueFilterOperators = &jsonData.MultiValueFilterOperators
}
if jsonData.Preload {
meta.Preload = &jsonData.Preload
}
if jsonData.SkipDataQuery {
meta.SkipDataQuery = &jsonData.SkipDataQuery
}
if jsonData.Streaming {
meta.Streaming = &jsonData.Streaming
}
if jsonData.Tracing {
meta.Tracing = &jsonData.Tracing
}
// Map category
if jsonData.Category != "" {
var category pluginsv0alpha1.MetaJSONDataCategory
switch jsonData.Category {
case "tsdb":
category = pluginsv0alpha1.MetaJSONDataCategoryTsdb
case "logging":
category = pluginsv0alpha1.MetaJSONDataCategoryLogging
case "cloud":
category = pluginsv0alpha1.MetaJSONDataCategoryCloud
case "tracing":
category = pluginsv0alpha1.MetaJSONDataCategoryTracing
case "profiling":
category = pluginsv0alpha1.MetaJSONDataCategoryProfiling
case "sql":
category = pluginsv0alpha1.MetaJSONDataCategorySql
case "enterprise":
category = pluginsv0alpha1.MetaJSONDataCategoryEnterprise
case "iot":
category = pluginsv0alpha1.MetaJSONDataCategoryIot
case "other":
category = pluginsv0alpha1.MetaJSONDataCategoryOther
default:
category = pluginsv0alpha1.MetaJSONDataCategoryOther
}
meta.Category = &category
}
// Map state
if jsonData.State != "" {
var state pluginsv0alpha1.MetaJSONDataState
switch jsonData.State {
case plugins.ReleaseStateAlpha:
state = pluginsv0alpha1.MetaJSONDataStateAlpha
case plugins.ReleaseStateBeta:
state = pluginsv0alpha1.MetaJSONDataStateBeta
default:
}
if state != "" {
meta.State = &state
}
}
// Map executable
if jsonData.Executable != "" {
meta.Executable = &jsonData.Executable
}
// Map QueryOptions
if len(jsonData.QueryOptions) > 0 {
queryOptions := &pluginsv0alpha1.MetaQueryOptions{}
if val, ok := jsonData.QueryOptions["maxDataPoints"]; ok {
queryOptions.MaxDataPoints = &val
}
if val, ok := jsonData.QueryOptions["minInterval"]; ok {
queryOptions.MinInterval = &val
}
if val, ok := jsonData.QueryOptions["cacheTimeout"]; ok {
queryOptions.CacheTimeout = &val
}
meta.QueryOptions = queryOptions
}
// Map Includes
if len(jsonData.Includes) > 0 {
meta.Includes = make([]pluginsv0alpha1.MetaInclude, 0, len(jsonData.Includes))
for _, include := range jsonData.Includes {
v0Include := pluginsv0alpha1.MetaInclude{}
if include.UID != "" {
v0Include.Uid = &include.UID
}
if include.Type != "" {
var includeType pluginsv0alpha1.MetaIncludeType
switch include.Type {
case "dashboard":
includeType = pluginsv0alpha1.MetaIncludeTypeDashboard
case "page":
includeType = pluginsv0alpha1.MetaIncludeTypePage
case "panel":
includeType = pluginsv0alpha1.MetaIncludeTypePanel
case "datasource":
includeType = pluginsv0alpha1.MetaIncludeTypeDatasource
}
v0Include.Type = &includeType
}
if include.Name != "" {
v0Include.Name = &include.Name
}
if include.Component != "" {
v0Include.Component = &include.Component
}
if include.Role != "" {
var role pluginsv0alpha1.MetaIncludeRole
switch include.Role {
case "Admin":
role = pluginsv0alpha1.MetaIncludeRoleAdmin
case "Editor":
role = pluginsv0alpha1.MetaIncludeRoleEditor
case "Viewer":
role = pluginsv0alpha1.MetaIncludeRoleViewer
}
v0Include.Role = &role
}
if include.Action != "" {
v0Include.Action = &include.Action
}
if include.Path != "" {
v0Include.Path = &include.Path
}
if include.AddToNav {
v0Include.AddToNav = &include.AddToNav
}
if include.DefaultNav {
v0Include.DefaultNav = &include.DefaultNav
}
if include.Icon != "" {
v0Include.Icon = &include.Icon
}
meta.Includes = append(meta.Includes, v0Include)
}
}
// Map Routes
if len(jsonData.Routes) > 0 {
meta.Routes = make([]pluginsv0alpha1.MetaRoute, 0, len(jsonData.Routes))
for _, route := range jsonData.Routes {
v0Route := pluginsv0alpha1.MetaRoute{}
if route.Path != "" {
v0Route.Path = &route.Path
}
if route.Method != "" {
v0Route.Method = &route.Method
}
if route.URL != "" {
v0Route.Url = &route.URL
}
if route.ReqRole != "" {
reqRole := string(route.ReqRole)
v0Route.ReqRole = &reqRole
}
if route.ReqAction != "" {
v0Route.ReqAction = &route.ReqAction
}
if len(route.Headers) > 0 {
headers := make([]string, 0, len(route.Headers))
for _, header := range route.Headers {
headers = append(headers, header.Name+": "+header.Content)
}
v0Route.Headers = headers
}
if len(route.URLParams) > 0 {
v0Route.UrlParams = make([]pluginsv0alpha1.MetaV0alpha1RouteUrlParams, 0, len(route.URLParams))
for _, param := range route.URLParams {
v0Param := pluginsv0alpha1.MetaV0alpha1RouteUrlParams{}
if param.Name != "" {
v0Param.Name = &param.Name
}
if param.Content != "" {
v0Param.Content = &param.Content
}
v0Route.UrlParams = append(v0Route.UrlParams, v0Param)
}
}
if route.TokenAuth != nil {
v0Route.TokenAuth = &pluginsv0alpha1.MetaV0alpha1RouteTokenAuth{}
if route.TokenAuth.Url != "" {
v0Route.TokenAuth.Url = &route.TokenAuth.Url
}
if len(route.TokenAuth.Scopes) > 0 {
v0Route.TokenAuth.Scopes = route.TokenAuth.Scopes
}
if len(route.TokenAuth.Params) > 0 {
v0Route.TokenAuth.Params = make(map[string]interface{})
for k, v := range route.TokenAuth.Params {
v0Route.TokenAuth.Params[k] = v
}
}
}
if route.JwtTokenAuth != nil {
v0Route.JwtTokenAuth = &pluginsv0alpha1.MetaV0alpha1RouteJwtTokenAuth{}
if route.JwtTokenAuth.Url != "" {
v0Route.JwtTokenAuth.Url = &route.JwtTokenAuth.Url
}
if len(route.JwtTokenAuth.Scopes) > 0 {
v0Route.JwtTokenAuth.Scopes = route.JwtTokenAuth.Scopes
}
if len(route.JwtTokenAuth.Params) > 0 {
v0Route.JwtTokenAuth.Params = make(map[string]interface{})
for k, v := range route.JwtTokenAuth.Params {
v0Route.JwtTokenAuth.Params[k] = v
}
}
}
if len(route.Body) > 0 {
var bodyMap map[string]interface{}
if err := json.Unmarshal(route.Body, &bodyMap); err == nil {
v0Route.Body = bodyMap
}
}
meta.Routes = append(meta.Routes, v0Route)
}
}
// Map Extensions
if len(jsonData.Extensions.AddedLinks) > 0 || len(jsonData.Extensions.AddedComponents) > 0 ||
len(jsonData.Extensions.ExposedComponents) > 0 || len(jsonData.Extensions.ExtensionPoints) > 0 {
extensions := &pluginsv0alpha1.MetaExtensions{}
if len(jsonData.Extensions.AddedLinks) > 0 {
extensions.AddedLinks = make([]pluginsv0alpha1.MetaV0alpha1ExtensionsAddedLinks, 0, len(jsonData.Extensions.AddedLinks))
for _, link := range jsonData.Extensions.AddedLinks {
v0Link := pluginsv0alpha1.MetaV0alpha1ExtensionsAddedLinks{
Targets: link.Targets,
Title: link.Title,
}
if link.Description != "" {
v0Link.Description = &link.Description
}
extensions.AddedLinks = append(extensions.AddedLinks, v0Link)
}
}
if len(jsonData.Extensions.AddedComponents) > 0 {
extensions.AddedComponents = make([]pluginsv0alpha1.MetaV0alpha1ExtensionsAddedComponents, 0, len(jsonData.Extensions.AddedComponents))
for _, comp := range jsonData.Extensions.AddedComponents {
v0Comp := pluginsv0alpha1.MetaV0alpha1ExtensionsAddedComponents{
Targets: comp.Targets,
Title: comp.Title,
}
if comp.Description != "" {
v0Comp.Description = &comp.Description
}
extensions.AddedComponents = append(extensions.AddedComponents, v0Comp)
}
}
if len(jsonData.Extensions.ExposedComponents) > 0 {
extensions.ExposedComponents = make([]pluginsv0alpha1.MetaV0alpha1ExtensionsExposedComponents, 0, len(jsonData.Extensions.ExposedComponents))
for _, comp := range jsonData.Extensions.ExposedComponents {
v0Comp := pluginsv0alpha1.MetaV0alpha1ExtensionsExposedComponents{
Id: comp.Id,
}
if comp.Title != "" {
v0Comp.Title = &comp.Title
}
if comp.Description != "" {
v0Comp.Description = &comp.Description
}
extensions.ExposedComponents = append(extensions.ExposedComponents, v0Comp)
}
}
if len(jsonData.Extensions.ExtensionPoints) > 0 {
extensions.ExtensionPoints = make([]pluginsv0alpha1.MetaV0alpha1ExtensionsExtensionPoints, 0, len(jsonData.Extensions.ExtensionPoints))
for _, point := range jsonData.Extensions.ExtensionPoints {
v0Point := pluginsv0alpha1.MetaV0alpha1ExtensionsExtensionPoints{
Id: point.Id,
}
if point.Title != "" {
v0Point.Title = &point.Title
}
if point.Description != "" {
v0Point.Description = &point.Description
}
extensions.ExtensionPoints = append(extensions.ExtensionPoints, v0Point)
}
}
meta.Extensions = extensions
}
// Map Roles
if len(jsonData.Roles) > 0 {
meta.Roles = make([]pluginsv0alpha1.MetaRole, 0, len(jsonData.Roles))
for _, role := range jsonData.Roles {
v0Role := pluginsv0alpha1.MetaRole{
Grants: role.Grants,
}
if role.Role.Name != "" || role.Role.Description != "" || len(role.Role.Permissions) > 0 {
v0RoleRole := &pluginsv0alpha1.MetaV0alpha1RoleRole{}
if role.Role.Name != "" {
v0RoleRole.Name = &role.Role.Name
}
if role.Role.Description != "" {
v0RoleRole.Description = &role.Role.Description
}
if len(role.Role.Permissions) > 0 {
v0RoleRole.Permissions = make([]pluginsv0alpha1.MetaV0alpha1RoleRolePermissions, 0, len(role.Role.Permissions))
for _, perm := range role.Role.Permissions {
v0Perm := pluginsv0alpha1.MetaV0alpha1RoleRolePermissions{}
if perm.Action != "" {
v0Perm.Action = &perm.Action
}
if perm.Scope != "" {
v0Perm.Scope = &perm.Scope
}
v0RoleRole.Permissions = append(v0RoleRole.Permissions, v0Perm)
}
}
v0Role.Role = v0RoleRole
}
meta.Roles = append(meta.Roles, v0Role)
}
}
// Map IAM
if jsonData.IAM != nil && len(jsonData.IAM.Permissions) > 0 {
iam := &pluginsv0alpha1.MetaIAM{
Permissions: make([]pluginsv0alpha1.MetaV0alpha1IAMPermissions, 0, len(jsonData.IAM.Permissions)),
}
for _, perm := range jsonData.IAM.Permissions {
v0Perm := pluginsv0alpha1.MetaV0alpha1IAMPermissions{}
if perm.Action != "" {
v0Perm.Action = &perm.Action
}
if perm.Scope != "" {
v0Perm.Scope = &perm.Scope
}
iam.Permissions = append(iam.Permissions, v0Perm)
}
meta.Iam = iam
}
return meta
}

View File

@@ -22,10 +22,12 @@ func TestCoreProvider_GetMeta(t *testing.T) {
t.Run("returns cached plugin when available", func(t *testing.T) {
provider := NewCoreProvider()
-expectedMeta := pluginsv0alpha1.MetaJSONData{
-Id: "test-plugin",
-Name: "Test Plugin",
-Type: pluginsv0alpha1.MetaJSONDataTypeDatasource,
+expectedMeta := pluginsv0alpha1.MetaSpec{
+PluginJson: pluginsv0alpha1.MetaJSONData{
+Id: "test-plugin",
+Name: "Test Plugin",
+Type: pluginsv0alpha1.MetaJSONDataTypeDatasource,
+},
}
provider.mu.Lock()
@@ -58,10 +60,12 @@ func TestCoreProvider_GetMeta(t *testing.T) {
t.Run("ignores version parameter", func(t *testing.T) {
provider := NewCoreProvider()
-expectedMeta := pluginsv0alpha1.MetaJSONData{
-Id: "test-plugin",
-Name: "Test Plugin",
-Type: pluginsv0alpha1.MetaJSONDataTypeDatasource,
+expectedMeta := pluginsv0alpha1.MetaSpec{
+PluginJson: pluginsv0alpha1.MetaJSONData{
+Id: "test-plugin",
+Name: "Test Plugin",
+Type: pluginsv0alpha1.MetaJSONDataTypeDatasource,
+},
}
provider.mu.Lock()
@@ -81,10 +85,12 @@ func TestCoreProvider_GetMeta(t *testing.T) {
customTTL := 2 * time.Hour
provider := NewCoreProviderWithTTL(customTTL)
-expectedMeta := pluginsv0alpha1.MetaJSONData{
-Id: "test-plugin",
-Name: "Test Plugin",
-Type: pluginsv0alpha1.MetaJSONDataTypeDatasource,
+expectedMeta := pluginsv0alpha1.MetaSpec{
+PluginJson: pluginsv0alpha1.MetaJSONData{
+Id: "test-plugin",
+Name: "Test Plugin",
+Type: pluginsv0alpha1.MetaJSONDataTypeDatasource,
+},
}
provider.mu.Lock()
@@ -226,8 +232,8 @@ func TestCoreProvider_loadPlugins(t *testing.T) {
if loaded {
result, err := provider.GetMeta(ctx, "test-datasource", "1.0.0")
require.NoError(t, err)
-assert.Equal(t, "test-datasource", result.Meta.Id)
-assert.Equal(t, "Test Datasource", result.Meta.Name)
+assert.Equal(t, "test-datasource", result.Meta.PluginJson.Id)
+assert.Equal(t, "Test Datasource", result.Meta.PluginJson.Name)
}
})
}

View File

@@ -0,0 +1,53 @@
package meta
import (
"context"
"time"
"github.com/grafana/grafana/pkg/plugins"
"github.com/grafana/grafana/pkg/services/pluginsintegration/pluginstore"
)
const (
defaultLocalTTL = 1 * time.Hour
)
// PluginAssetsCalculator is an interface for calculating plugin asset information.
// LocalProvider requires this to calculate loading strategy and module hash.
type PluginAssetsCalculator interface {
LoadingStrategy(ctx context.Context, p pluginstore.Plugin) plugins.LoadingStrategy
ModuleHash(ctx context.Context, p pluginstore.Plugin) string
}
// LocalProvider retrieves plugin metadata for locally installed plugins.
// It uses the plugin store to access plugins that have already been loaded.
type LocalProvider struct {
store pluginstore.Store
pluginAssets PluginAssetsCalculator
}
// NewLocalProvider creates a new LocalProvider for locally installed plugins.
// pluginAssets is required for calculating loading strategy and module hash.
func NewLocalProvider(pluginStore pluginstore.Store, pluginAssets PluginAssetsCalculator) *LocalProvider {
return &LocalProvider{
store: pluginStore,
pluginAssets: pluginAssets,
}
}
// GetMeta retrieves plugin metadata for locally installed plugins.
func (p *LocalProvider) GetMeta(ctx context.Context, pluginID, version string) (*Result, error) {
plugin, exists := p.store.Plugin(ctx, pluginID)
if !exists {
return nil, ErrMetaNotFound
}
loadingStrategy := p.pluginAssets.LoadingStrategy(ctx, plugin)
moduleHash := p.pluginAssets.ModuleHash(ctx, plugin)
spec := pluginStorePluginToMeta(plugin, loadingStrategy, moduleHash)
return &Result{
Meta: spec,
TTL: defaultLocalTTL,
}, nil
}

View File

@@ -16,7 +16,7 @@ const (
// cachedMeta represents a cached metadata entry with expiration time
type cachedMeta struct {
-meta pluginsv0alpha1.MetaJSONData
+meta pluginsv0alpha1.MetaSpec
ttl time.Duration
expiresAt time.Time
}
@@ -84,7 +84,7 @@ func (pm *ProviderManager) GetMeta(ctx context.Context, pluginID, version string
if err == nil {
// Don't cache results with a zero TTL
if result.TTL == 0 {
-continue
+return result, nil
}
pm.cacheMu.Lock()

View File

@@ -35,10 +35,12 @@ func TestProviderManager_GetMeta(t *testing.T) {
ctx := context.Background()
t.Run("returns cached result when available and not expired", func(t *testing.T) {
-cachedMeta := pluginsv0alpha1.MetaJSONData{
-Id: "test-plugin",
-Name: "Test Plugin",
-Type: pluginsv0alpha1.MetaJSONDataTypeDatasource,
+cachedMeta := pluginsv0alpha1.MetaSpec{
+PluginJson: pluginsv0alpha1.MetaJSONData{
+Id: "test-plugin",
+Name: "Test Plugin",
+Type: pluginsv0alpha1.MetaJSONDataTypeDatasource,
+},
}
provider := &mockProvider{
@@ -60,8 +62,10 @@ func TestProviderManager_GetMeta(t *testing.T) {
provider.getMetaFunc = func(ctx context.Context, pluginID, version string) (*Result, error) {
return &Result{
-Meta: pluginsv0alpha1.MetaJSONData{Id: "different"},
-TTL: time.Hour,
+Meta: pluginsv0alpha1.MetaSpec{
+PluginJson: pluginsv0alpha1.MetaJSONData{Id: "different"},
+},
+TTL: time.Hour,
}, nil
}
@@ -73,10 +77,12 @@ func TestProviderManager_GetMeta(t *testing.T) {
})
t.Run("fetches from provider when not cached", func(t *testing.T) {
-expectedMeta := pluginsv0alpha1.MetaJSONData{
-Id: "test-plugin",
-Name: "Test Plugin",
-Type: pluginsv0alpha1.MetaJSONDataTypeDatasource,
+expectedMeta := pluginsv0alpha1.MetaSpec{
+PluginJson: pluginsv0alpha1.MetaJSONData{
+Id: "test-plugin",
+Name: "Test Plugin",
+Type: pluginsv0alpha1.MetaJSONDataTypeDatasource,
+},
}
expectedTTL := 2 * time.Hour
@@ -107,19 +113,16 @@ func TestProviderManager_GetMeta(t *testing.T) {
assert.Equal(t, expectedTTL, cached.ttl)
})
-t.Run("does not cache result with zero TTL and tries next provider", func(t *testing.T) {
-zeroTTLMeta := pluginsv0alpha1.MetaJSONData{
-Id: "test-plugin",
-Name: "Zero TTL Plugin",
-Type: pluginsv0alpha1.MetaJSONDataTypeDatasource,
-}
-expectedMeta := pluginsv0alpha1.MetaJSONData{
-Id: "test-plugin",
-Name: "Test Plugin",
-Type: pluginsv0alpha1.MetaJSONDataTypeDatasource,
-}
+t.Run("does not cache result with zero TTL", func(t *testing.T) {
+zeroTTLMeta := pluginsv0alpha1.MetaSpec{
+PluginJson: pluginsv0alpha1.MetaJSONData{
+Id: "test-plugin",
+Name: "Zero TTL Plugin",
+Type: pluginsv0alpha1.MetaJSONDataTypeDatasource,
+},
+}
-provider1 := &mockProvider{
+provider := &mockProvider{
getMetaFunc: func(ctx context.Context, pluginID, version string) (*Result, error) {
return &Result{
Meta: zeroTTLMeta,
@@ -127,37 +130,30 @@ func TestProviderManager_GetMeta(t *testing.T) {
}, nil
},
}
-provider2 := &mockProvider{
-getMetaFunc: func(ctx context.Context, pluginID, version string) (*Result, error) {
-return &Result{
-Meta: expectedMeta,
-TTL: time.Hour,
-}, nil
-},
-}
-pm := NewProviderManager(provider1, provider2)
+pm := NewProviderManager(provider)
result, err := pm.GetMeta(ctx, "test-plugin", "1.0.0")
require.NoError(t, err)
require.NotNil(t, result)
-assert.Equal(t, expectedMeta, result.Meta)
+assert.Equal(t, zeroTTLMeta, result.Meta)
assert.Equal(t, time.Duration(0), result.TTL)
pm.cacheMu.RLock()
-cached, exists := pm.cache["test-plugin:1.0.0"]
+_, exists := pm.cache["test-plugin:1.0.0"]
pm.cacheMu.RUnlock()
-assert.True(t, exists)
-assert.Equal(t, expectedMeta, cached.meta)
-assert.Equal(t, time.Hour, cached.ttl)
+assert.False(t, exists, "zero TTL results should not be cached")
})
t.Run("tries next provider when first returns ErrMetaNotFound", func(t *testing.T) {
-expectedMeta := pluginsv0alpha1.MetaJSONData{
-Id: "test-plugin",
-Name: "Test Plugin",
-Type: pluginsv0alpha1.MetaJSONDataTypeDatasource,
+expectedMeta := pluginsv0alpha1.MetaSpec{
+PluginJson: pluginsv0alpha1.MetaJSONData{
+Id: "test-plugin",
+Name: "Test Plugin",
+Type: pluginsv0alpha1.MetaJSONDataTypeDatasource,
+},
}
provider1 := &mockProvider{
@@ -229,15 +225,19 @@ func TestProviderManager_GetMeta(t *testing.T) {
})
t.Run("skips expired cache entries", func(t *testing.T) {
-expiredMeta := pluginsv0alpha1.MetaJSONData{
-Id: "test-plugin",
-Name: "Expired Plugin",
-Type: pluginsv0alpha1.MetaJSONDataTypeDatasource,
+expiredMeta := pluginsv0alpha1.MetaSpec{
+PluginJson: pluginsv0alpha1.MetaJSONData{
+Id: "test-plugin",
+Name: "Expired Plugin",
+Type: pluginsv0alpha1.MetaJSONDataTypeDatasource,
+},
}
-expectedMeta := pluginsv0alpha1.MetaJSONData{
-Id: "test-plugin",
-Name: "Test Plugin",
-Type: pluginsv0alpha1.MetaJSONDataTypeDatasource,
+expectedMeta := pluginsv0alpha1.MetaSpec{
+PluginJson: pluginsv0alpha1.MetaJSONData{
+Id: "test-plugin",
+Name: "Test Plugin",
+Type: pluginsv0alpha1.MetaJSONDataTypeDatasource,
+},
}
callCount := 0
@@ -272,15 +272,19 @@ func TestProviderManager_GetMeta(t *testing.T) {
})
t.Run("uses first successful provider", func(t *testing.T) {
-expectedMeta1 := pluginsv0alpha1.MetaJSONData{
-Id: "test-plugin",
-Name: "Provider 1 Plugin",
-Type: pluginsv0alpha1.MetaJSONDataTypeDatasource,
+expectedMeta1 := pluginsv0alpha1.MetaSpec{
+PluginJson: pluginsv0alpha1.MetaJSONData{
+Id: "test-plugin",
+Name: "Provider 1 Plugin",
+Type: pluginsv0alpha1.MetaJSONDataTypeDatasource,
+},
}
-expectedMeta2 := pluginsv0alpha1.MetaJSONData{
-Id: "test-plugin",
-Name: "Provider 2 Plugin",
-Type: pluginsv0alpha1.MetaJSONDataTypeDatasource,
+expectedMeta2 := pluginsv0alpha1.MetaSpec{
+PluginJson: pluginsv0alpha1.MetaJSONData{
+Id: "test-plugin",
+Name: "Provider 2 Plugin",
+Type: pluginsv0alpha1.MetaJSONDataTypeDatasource,
+},
}
provider1 := &mockProvider{
@@ -331,9 +335,9 @@ func TestProviderManager_Run(t *testing.T) {
func TestProviderManager_cleanupExpired(t *testing.T) {
t.Run("removes expired entries", func(t *testing.T) {
-validMeta := pluginsv0alpha1.MetaJSONData{Id: "valid"}
-expiredMeta1 := pluginsv0alpha1.MetaJSONData{Id: "expired1"}
-expiredMeta2 := pluginsv0alpha1.MetaJSONData{Id: "expired2"}
+validMeta := pluginsv0alpha1.MetaSpec{PluginJson: pluginsv0alpha1.MetaJSONData{Id: "valid"}}
+expiredMeta1 := pluginsv0alpha1.MetaSpec{PluginJson: pluginsv0alpha1.MetaJSONData{Id: "expired1"}}
+expiredMeta2 := pluginsv0alpha1.MetaSpec{PluginJson: pluginsv0alpha1.MetaJSONData{Id: "expired2"}}
provider := &mockProvider{
getMetaFunc: func(ctx context.Context, pluginID, version string) (*Result, error) {

View File

@@ -14,7 +14,7 @@ var (
// Result contains plugin metadata along with its recommended TTL.
type Result struct {
-Meta pluginsv0alpha1.MetaJSONData
+Meta pluginsv0alpha1.MetaSpec
TTL time.Duration
}

View File

@@ -121,8 +121,19 @@ func (s *MetaStorage) List(ctx context.Context, options *internalversion.ListOpt
continue
}
-pluginMeta := createMetaFromMetaJSONData(result.Meta, plugin.Name, plugin.Namespace)
-metaItems = append(metaItems, *pluginMeta)
+pluginMeta := pluginsv0alpha1.Meta{
+ObjectMeta: metav1.ObjectMeta{
+Name: plugin.Name,
+Namespace: plugin.Namespace,
+},
+Spec: result.Meta,
+}
+pluginMeta.SetGroupVersionKind(schema.GroupVersionKind{
+Group: pluginsv0alpha1.APIGroup,
+Version: pluginsv0alpha1.APIVersion,
+Kind: pluginsv0alpha1.MetaKind().Kind(),
+})
+metaItems = append(metaItems, pluginMeta)
}
list := &pluginsv0alpha1.MetaList{
@@ -169,27 +180,18 @@ func (s *MetaStorage) Get(ctx context.Context, name string, options *metav1.GetO
return nil, apierrors.NewInternalError(fmt.Errorf("failed to fetch plugin metadata: %w", err))
}
-return createMetaFromMetaJSONData(result.Meta, name, ns.Value), nil
-}
-// createMetaFromMetaJSONData creates a Meta k8s object from MetaJSONData and plugin metadata.
-func createMetaFromMetaJSONData(pluginJSON pluginsv0alpha1.MetaJSONData, name, namespace string) *pluginsv0alpha1.Meta {
pluginMeta := &pluginsv0alpha1.Meta{
ObjectMeta: metav1.ObjectMeta{
-Name: name,
-Namespace: namespace,
-},
-Spec: pluginsv0alpha1.MetaSpec{
-PluginJSON: pluginJSON,
+Name: plugin.Name,
+Namespace: plugin.Namespace,
},
+Spec: result.Meta,
}
-// Set the GroupVersionKind
pluginMeta.SetGroupVersionKind(schema.GroupVersionKind{
Group: pluginsv0alpha1.APIGroup,
Version: pluginsv0alpha1.APIVersion,
Kind: pluginsv0alpha1.MetaKind().Kind(),
})
-return pluginMeta
+return pluginMeta, nil
}

View File

@@ -0,0 +1,249 @@
package app
import (
"context"
"encoding/json"
"net/http"
"net/http/httptest"
"slices"
"strings"
"testing"
"github.com/grafana/grafana-app-sdk/resource"
"github.com/stretchr/testify/require"
metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
"k8s.io/apiserver/pkg/endpoints/request"
pluginsv0alpha1 "github.com/grafana/grafana/apps/plugins/pkg/apis/plugins/v0alpha1"
"github.com/grafana/grafana/apps/plugins/pkg/app/meta"
"github.com/grafana/grafana/pkg/plugins"
"github.com/grafana/grafana/pkg/services/pluginsintegration/pluginstore"
)
func TestMetaStorageListPreload(t *testing.T) {
ctx := request.WithNamespace(context.Background(), "default")
preloadPlugin := pluginstore.Plugin{
JSONData: plugins.JSONData{
ID: "test-plugin",
Name: "Test Plugin",
Type: plugins.TypeDataSource,
Info: plugins.Info{Version: "1.0.0"},
Preload: true,
},
}
nonPreloadPlugin := pluginstore.Plugin{
JSONData: plugins.JSONData{
ID: "test-plugin-2",
Name: "Test Plugin 2",
Type: plugins.TypeDataSource,
Info: plugins.Info{Version: "1.0.0"},
Preload: false,
},
}
store := &mockPluginStore{plugins: map[string]pluginstore.Plugin{
"test-plugin": preloadPlugin,
}}
store2 := &mockPluginStore{plugins: map[string]pluginstore.Plugin{
"test-plugin-2": nonPreloadPlugin,
}}
catalogServer := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
require.Equal(t, http.MethodGet, r.Method)
require.Equal(t, "application/json", r.Header.Get("Accept"))
require.Equal(t, "grafana-plugins-app", r.Header.Get("User-Agent"))
segments := strings.Split(strings.Trim(r.URL.Path, "/"), "/")
require.Len(t, segments, 5)
require.Equal(t, "api", segments[0])
require.Equal(t, "plugins", segments[1])
require.Equal(t, "versions", segments[3])
preload := true
response := struct {
PluginID string `json:"pluginSlug"`
Version string `json:"version"`
JSON pluginsv0alpha1.MetaJSONData `json:"json"`
}{
PluginID: segments[2],
Version: segments[4],
JSON: pluginsv0alpha1.MetaJSONData{
Id: segments[2],
Name: segments[2],
Type: pluginsv0alpha1.MetaJSONDataTypeDatasource,
Preload: &preload,
},
}
w.Header().Set("Content-Type", "application/json")
w.WriteHeader(http.StatusOK)
require.NoError(t, json.NewEncoder(w).Encode(response))
}))
defer catalogServer.Close()
provider := meta.NewLocalProvider(store, mockPluginAssets{})
provider2 := meta.NewLocalProvider(store2, mockPluginAssets{})
catalogProvider := meta.NewCatalogProvider(catalogServer.URL + "/api/plugins")
metaManager := meta.NewProviderManager(provider2, provider, catalogProvider)
pluginClient := pluginsv0alpha1.NewPluginClient(&mockResourceClient{
listFunc: func(ctx context.Context, namespace string, opts resource.ListOptions) (resource.ListObject, error) {
return newPluginList(), nil
},
})
storage := NewMetaStorage(metaManager, func(ctx context.Context) (*pluginsv0alpha1.PluginClient, error) {
return pluginClient, nil
})
obj, err := storage.List(ctx, nil)
require.NoError(t, err)
metaList, ok := obj.(*pluginsv0alpha1.MetaList)
require.True(t, ok)
require.Len(t, metaList.Items, 3)
require.NotNil(t, metaList.Items[0].Spec.PluginJson.Preload)
require.True(t, *metaList.Items[0].Spec.PluginJson.Preload)
require.NotNil(t, metaList.Items[1].Spec.PluginJson.Preload)
require.True(t, *metaList.Items[1].Spec.PluginJson.Preload)
require.Nil(t, metaList.Items[2].Spec.PluginJson.Preload)
obj, err = storage.List(ctx, nil)
require.NoError(t, err)
metaList, ok = obj.(*pluginsv0alpha1.MetaList)
require.True(t, ok)
require.Len(t, metaList.Items, 3)
require.NotNil(t, metaList.Items[0].Spec.PluginJson.Preload)
require.True(t, *metaList.Items[0].Spec.PluginJson.Preload)
require.NotNil(t, metaList.Items[1].Spec.PluginJson.Preload)
require.True(t, *metaList.Items[1].Spec.PluginJson.Preload)
require.Nil(t, metaList.Items[2].Spec.PluginJson.Preload)
}
type mockPluginAssets struct{}
func (mockPluginAssets) LoadingStrategy(ctx context.Context, p pluginstore.Plugin) plugins.LoadingStrategy {
return plugins.LoadingStrategyFetch
}
func (mockPluginAssets) ModuleHash(ctx context.Context, p pluginstore.Plugin) string {
return "hash"
}
type mockPluginStore struct {
plugins map[string]pluginstore.Plugin
}
func (m *mockPluginStore) Plugin(ctx context.Context, pluginID string) (pluginstore.Plugin, bool) {
if m.plugins[pluginID].ID != pluginID {
return pluginstore.Plugin{}, false
}
return m.plugins[pluginID], true
}
func (m *mockPluginStore) Plugins(ctx context.Context, pluginTypes ...plugins.Type) []pluginstore.Plugin {
result := []pluginstore.Plugin{}
for _, plugin := range m.plugins {
if len(pluginTypes) == 0 || slices.Contains(pluginTypes, plugin.Type) {
result = append(result, plugin)
}
}
return result
}
func newPluginList() *pluginsv0alpha1.PluginList {
return &pluginsv0alpha1.PluginList{
Items: []pluginsv0alpha1.Plugin{
{
ObjectMeta: metav1.ObjectMeta{Name: "grafana-plugins-app", Namespace: "org-1"},
Spec: pluginsv0alpha1.PluginSpec{Id: "grafana-plugins-app", Version: "1.0.0"},
},
{
ObjectMeta: metav1.ObjectMeta{Name: "test-plugin", Namespace: "org-1"},
Spec: pluginsv0alpha1.PluginSpec{Id: "test-plugin", Version: "1.0.0"},
},
{
ObjectMeta: metav1.ObjectMeta{Name: "test-plugin-2", Namespace: "org-1"},
Spec: pluginsv0alpha1.PluginSpec{Id: "test-plugin-2", Version: "1.0.0"},
},
},
}
}
type mockResourceClient struct {
listFunc func(ctx context.Context, namespace string, opts resource.ListOptions) (resource.ListObject, error)
}
func (m *mockResourceClient) List(ctx context.Context, namespace string, opts resource.ListOptions) (resource.ListObject, error) {
if m.listFunc != nil {
return m.listFunc(ctx, namespace, opts)
}
return &pluginsv0alpha1.PluginList{}, nil
}
func (m *mockResourceClient) ListInto(ctx context.Context, namespace string, opts resource.ListOptions, into resource.ListObject) error {
list, err := m.List(ctx, namespace, opts)
if err != nil {
return err
}
if src, ok := list.(*pluginsv0alpha1.PluginList); ok {
if dst, ok := into.(*pluginsv0alpha1.PluginList); ok {
*dst = *src
}
}
return nil
}
func (m *mockResourceClient) Get(ctx context.Context, identifier resource.Identifier) (resource.Object, error) {
return nil, nil
}
func (m *mockResourceClient) GetInto(ctx context.Context, identifier resource.Identifier, into resource.Object) error {
return nil
}
func (m *mockResourceClient) Create(ctx context.Context, identifier resource.Identifier, obj resource.Object, opts resource.CreateOptions) (resource.Object, error) {
return nil, nil
}
func (m *mockResourceClient) CreateInto(ctx context.Context, identifier resource.Identifier, obj resource.Object, opts resource.CreateOptions, into resource.Object) error {
return nil
}
func (m *mockResourceClient) Update(ctx context.Context, identifier resource.Identifier, obj resource.Object, opts resource.UpdateOptions) (resource.Object, error) {
return nil, nil
}
func (m *mockResourceClient) UpdateInto(ctx context.Context, identifier resource.Identifier, obj resource.Object, opts resource.UpdateOptions, into resource.Object) error {
return nil
}
func (m *mockResourceClient) Patch(ctx context.Context, identifier resource.Identifier, patch resource.PatchRequest, opts resource.PatchOptions) (resource.Object, error) {
return nil, nil
}
func (m *mockResourceClient) PatchInto(ctx context.Context, identifier resource.Identifier, patch resource.PatchRequest, opts resource.PatchOptions, into resource.Object) error {
return nil
}
func (m *mockResourceClient) Delete(ctx context.Context, identifier resource.Identifier, opts resource.DeleteOptions) error {
return nil
}
func (m *mockResourceClient) SubresourceRequest(ctx context.Context, identifier resource.Identifier, req resource.CustomRouteRequestOptions) ([]byte, error) {
return nil, nil
}
func (m *mockResourceClient) Watch(ctx context.Context, namespace string, opts resource.WatchOptions) (resource.WatchResponse, error) {
return &mockWatchResponse{}, nil
}
type mockWatchResponse struct{}
func (m *mockWatchResponse) Stop() {}
func (m *mockWatchResponse) WatchEvents() <-chan resource.WatchEvent {
ch := make(chan resource.WatchEvent)
close(ch)
return ch
}

View File

@@ -83,6 +83,12 @@ tree:
nodeType: leaf
linkId: test-case-2
linkType: scope
test-case-redirect:
title: Test case with redirect
nodeType: leaf
linkId: shoe-org
linkType: scope
redirectPath: /d/dcb9f5e9-8066-4397-889e-864b99555dbb #Reliability dashboard
clusters:
title: Clusters
nodeType: container

View File

@@ -67,10 +67,12 @@ type ScopeFilterConfig struct {
type TreeNode struct {
Title string `yaml:"title"`
SubTitle string `yaml:"subTitle,omitempty"`
Description string `yaml:"description,omitempty"`
NodeType string `yaml:"nodeType"`
LinkID string `yaml:"linkId,omitempty"`
LinkType string `yaml:"linkType,omitempty"`
DisableMultiSelect bool `yaml:"disableMultiSelect,omitempty"`
RedirectPath string `yaml:"redirectPath,omitempty"`
Children map[string]TreeNode `yaml:"children,omitempty"`
}
@@ -259,6 +261,7 @@ func (c *Client) createScopeNode(name string, node TreeNode, parentName string)
spec := v0alpha1.ScopeNodeSpec{
Title: node.Title,
SubTitle: node.SubTitle,
Description: node.Description,
NodeType: nodeType,
DisableMultiSelect: node.DisableMultiSelect,
}
@@ -272,6 +275,10 @@ func (c *Client) createScopeNode(name string, node TreeNode, parentName string)
spec.LinkType = linkType
}
if node.RedirectPath != "" {
spec.RedirectPath = node.RedirectPath
}
resource := v0alpha1.ScopeNode{
TypeMeta: metav1.TypeMeta{
APIVersion: apiVersion,

View File

@@ -54,7 +54,7 @@ For production systems, use the `folderFromFilesStructure` capability instead of
## Before you begin
{{< admonition type="note" >}}
-Enable the `provisioning` and `kubernetesDashboards` feature toggles in Grafana to use this feature.
+Enable the `provisioning` feature toggle in Grafana to use this feature.
{{< /admonition >}}
To set up file provisioning, you need:
@@ -67,7 +67,7 @@ To set up file provisioning, you need:
## Enable required feature toggles and configure permitted paths
-To activate local file provisioning in Grafana, you need to enable the `provisioning` and `kubernetesDashboards` feature toggles.
+To activate local file provisioning in Grafana, you need to enable the `provisioning` feature toggle.
For additional information about feature toggles, refer to [Configure feature toggles](https://grafana.com/docs/grafana/<GRAFANA_VERSION>/setup-grafana/configure-grafana/feature-toggles).
The local setting must be a relative path and its relative path must be configured in the `permitted_provisioned_paths` configuration option.
@@ -82,12 +82,11 @@ Any subdirectories are automatically included.
The values that you enter for `permitted_provisioned_paths` become the base paths for the local paths you enter in the **Connect to local storage** wizard.
1. Open your Grafana configuration file, either `grafana.ini` or `custom.ini`. For file location based on operating system, refer to [Configuration file location](https://grafana.com/docs/grafana/<GRAFANA_VERSION>/setup-grafana/configure-grafana/feature-toggles/#experimental-feature-toggles).
-1. Locate or add a `[feature_toggles]` section. Add these values:
+1. Locate or add a `[feature_toggles]` section. Add this value:
```ini
[feature_toggles]
provisioning = true
-kubernetesDashboards = true ; use k8s from browser
```
1. Locate or add a `[paths]` section. To add more than one location, use the pipe character (`|`) to separate the paths. The list should not include empty paths or trailing pipes. Add these values:

View File

@@ -29,76 +29,70 @@ You can sign up to the private preview using the [Git Sync early access form](ht
{{< /admonition >}}
Git Sync lets you manage Grafana dashboards as code by storing dashboard JSON files and folders in a remote GitHub repository.
-To set up Git Sync and synchronize with a GitHub repository follow these steps:
-1. [Enable feature toggles in Grafana](#enable-required-feature-toggles) (first time set up).
-1. [Create a GitHub access token](#create-a-github-access-token).
-1. [Configure a connection to your GitHub repository](#set-up-the-connection-to-github).
-1. [Choose what content to sync with Grafana](#choose-what-to-synchronize).
-Optionally, you can [extend Git Sync](#configure-webhooks-and-image-rendering) by enabling pull request notifications and image previews of dashboard changes.
-| Capability | Benefit | Requires |
-| ----------------------------------------------------- | ------------------------------------------------------------------------------- | -------------------------------------- |
-| Adds a table summarizing changes to your pull request | Provides a convenient way to save changes back to GitHub. | Webhooks configured |
-| Add a dashboard preview image to a PR | View a snapshot of dashboard changes to a pull request without opening Grafana. | Image renderer and webhooks configured |
-{{< admonition type="note" >}}
-Alternatively, you can configure a local file system instead of using GitHub. Refer to [Set up file provisioning](https://grafana.com/docs/grafana/<GRAFANA_VERSION>/observability-as-code/provision-resources/file-path-setup/) for more information.
-{{< /admonition >}}
-## Performance impacts of enabling Git Sync
-Git Sync is an experimental feature and is under continuous development. Reporting any issues you encounter can help us improve Git Sync.
-When Git Sync is enabled, the database load might increase, especially for instances with a lot of folders and nested folders. Evaluate the performance impact, if any, in a non-production environment.
+This guide shows you how to set up Git Sync to synchronize your Grafana dashboards and folders with a GitHub repository. You'll set up Git Sync to enable version-controlled dashboard management either [using the UI](#set-up-git-sync-using-grafana-ui) or [as code](#set-up-git-sync-as-code).
## Before you begin
{{< admonition type="caution" >}}
Before you begin, ensure you have the following:
Refer to [Known limitations](https://grafana.com/docs/grafana/<GRAFANA_VERSION>/observability-as-code/provision-resources/intro-git-sync#known-limitations/) before using Git Sync.
- A Grafana instance (Cloud, OSS, or Enterprise).
- If you're [using webhooks or image rendering](#extend-git-sync-for-real-time-notification-and-image-rendering), a public instance with external access
- Administration rights in your Grafana organization
- A [GitHub private access token](#create-a-github-access-token)
- A GitHub repository to store your dashboards in
- Optional: The [Image Renderer service](https://github.com/grafana/grafana-image-renderer) to save image previews with your PRs
### Known limitations
Refer to [Known limitations](https://grafana.com/docs/grafana/<GRAFANA_VERSION>/as-code/observability-as-code/provision-resources/intro-git-sync#known-limitations) before using Git Sync.
Refer to [Supported resources](https://grafana.com/docs/grafana/<GRAFANA_VERSION>/as-code/observability-as-code/provision-resources/intro-git-sync#supported-resources) for details about which resources you can sync.
### Performance considerations
When Git Sync is enabled, the database load might increase, especially for instances with many folders and nested folders. Evaluate the performance impact, if any, in a non-production environment.
Git Sync is under continuous development. [Report any issues](https://grafana.com/help/) you encounter to help us improve Git Sync.
## Set up Git Sync
To set up Git Sync and synchronize with a GitHub repository, follow these steps:
1. [Enable feature toggles in Grafana](#enable-required-feature-toggles) (first-time setup)
1. [Create a GitHub access token](#create-a-github-access-token)
1. Set up Git Sync [using the UI](#set-up-git-sync-using-grafana-ui) or [as code](#set-up-git-sync-as-code)
After setup, you can [verify your dashboards](#verify-your-dashboards-in-grafana).
Optionally, you can also [extend Git Sync with webhooks and image rendering](#extend-git-sync-for-real-time-notification-and-image-rendering).
{{< admonition type="note" >}}
Alternatively, you can configure a local file system instead of using GitHub. Refer to [Set up file provisioning](https://grafana.com/docs/grafana/<GRAFANA_VERSION>/as-code/observability-as-code/provision-resources/file-path-setup/) for more information.
{{< /admonition >}}
### Requirements
To set up Git Sync, you need:
- Administration rights in your Grafana organization.
- The required feature toggles enabled in your Grafana instance. Refer to [Enable required feature toggles](#enable-required-feature-toggles) for instructions.
- A GitHub repository to store your dashboards in.
- If you want to use a local file path, refer to [the local file path guide](https://grafana.com/docs/grafana/<GRAFANA_VERSION>/observability-as-code/provision-resources/file-path-setup/).
- A GitHub access token. The Grafana UI prompts you for it during setup.
- Optional: A publicly accessible Grafana instance, required for webhooks and image rendering.
- Optional: The [Image Renderer service](https://github.com/grafana/grafana-image-renderer) to save image previews with your PRs.
## Enable required feature toggles
To activate Git Sync in Grafana, you need to enable the `provisioning` feature toggle. For more information about feature toggles, refer to [Configure feature toggles](https://grafana.com/docs/grafana/<GRAFANA_VERSION>/setup-grafana/configure-grafana/feature-toggles/#experimental-feature-toggles).
To enable the required feature toggle:
1. Open your Grafana configuration file, either `grafana.ini` or `custom.ini`. For file location based on operating system, refer to [Configuration file location](https://grafana.com/docs/grafana/<GRAFANA_VERSION>/setup-grafana/configure-grafana/feature-toggles/#experimental-feature-toggles).
1. Locate or add a `[feature_toggles]` section. Add this value:
```ini
[feature_toggles]
provisioning = true
```
1. Save the changes to the file and restart Grafana.
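The edit above can be sketched in shell. The configuration file path varies by install (on many Linux packages it is `/etc/grafana/grafana.ini`, which is an assumption here), so this sketch works on a local copy:

```shell
# Append the feature toggle to a local copy of the config file and confirm it.
# The real file is typically /etc/grafana/grafana.ini on Linux packages (assumption).
CONF=grafana.ini
printf '[feature_toggles]\nprovisioning = true\n' >> "$CONF"
grep -A1 '^\[feature_toggles\]' "$CONF"
```

After editing the real file, restart Grafana to apply the change, for example with `sudo systemctl restart grafana-server` on systemd-based installs.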
## Create a GitHub access token
Whenever you connect to a GitHub repository, you need to create a GitHub access token with specific repository permissions. Add this token to your Git Sync configuration to enable read and write access between Grafana and your GitHub repository.
To create a GitHub access token:
1. Create a new token using [Create new fine-grained personal access token](https://github.com/settings/personal-access-tokens/new). Refer to [Managing your personal access tokens](https://docs.github.com/en/authentication/keeping-your-account-and-data-secure/managing-your-personal-access-tokens) for instructions.
1. Under **Permissions**, expand **Repository permissions**.
1. Verify the options and select **Generate token**.
1. Copy the access token. Leave the browser window available with the token until you've completed configuration.
GitHub Apps aren't currently supported.
## Set up Git Sync using Grafana UI
Use **Provisioning** to guide you through setting up Git Sync to use a GitHub repository.
1. [Configure a connection to your GitHub repository](#set-up-the-connection-to-github)
1. [Choose what content to sync with Grafana](#choose-what-to-synchronize)
1. [Choose additional settings](#choose-additional-settings)
### Set up the connection to GitHub
1. Log in to your Grafana server with an account that has the Grafana Admin flag set.
1. Select **Administration** in the left-side menu and then **Provisioning**.
1. Select **Configure Git Sync**.
### Connect to external storage
To connect your GitHub repository:
1. Paste your GitHub personal access token into **Enter your access token**. Refer to [Create a GitHub access token](#create-a-github-access-token) for instructions.
1. Paste the **Repository URL** for your GitHub repository into the text box.
### Choose what to synchronize
In this step, you can decide which elements to synchronize. The available options depend on the status of your Grafana instance:
- If the instance contains resources in an incompatible data format, you'll have to migrate all the data using instance sync. Folder sync won't be supported.
- If there's already another connection using folder sync, instance sync won't be offered.
#### Set up synchronization
To set up synchronization:
- Choose **Sync all resources with external storage** if you want to sync and manage your entire Grafana instance through external storage. With this option, all of your dashboards are synced to that one repository. You can only have one provisioned connection with this selection, and you won't have the option of setting up additional repositories to connect to.
- Choose **Sync external storage to new Grafana folder** to sync external resources into a new folder without affecting the rest of your instance. You can repeat this process for up to 10 connections.
### Choose additional settings
Next, enter a **Display name** for the repository connection.
Finally, you can set up how often your configured storage is polled for updates.
To configure additional settings:
1. For **Update instance interval (seconds)**, enter how often you want the instance to pull updates from GitHub. The default value is 60 seconds.
1. Optional: Select **Read only** to ensure resources can't be modified in Grafana.
1. Optional: If you have the Grafana Image Renderer plugin configured, you can **Enable dashboards previews in pull requests**. If image rendering isn't available, then you can't select this option. For more information, refer to the [Image Renderer service](https://github.com/grafana/grafana-image-renderer).
1. Select **Finish** to proceed.
### Modify your configuration after setup is complete
To update your repository configuration after you've completed setup:
1. Log in to your Grafana server with an account that has the Grafana Admin flag set.
1. Select **Administration** in the left-side menu and then **Provisioning**.
1. Select **Settings** for the repository you wish to modify.
1. Use the **Configure repository** screen to update any of the settings.
1. Select **Save** to preserve the updates.
## Set up Git Sync as code
Alternatively, you can configure Git Sync using `grafanactl`. Because Git Sync configuration is managed as code using Custom Resource Definitions (CRDs), you can define a Repository CRD in a YAML file and use `grafanactl` to push it to Grafana. This approach enables automated, GitOps-style workflows for managing Git Sync configuration instead of using the Grafana UI.
To set up Git Sync with `grafanactl`, follow these steps:
1. [Create the repository CRD](#create-the-repository-crd)
1. [Push the repository CRD to Grafana](#push-the-repository-crd-to-grafana)
1. [Manage repository resources](#manage-repository-resources)
1. [Verify setup](#verify-setup)
For more information, refer to the following documents:
- [grafanactl Documentation](https://grafana.github.io/grafanactl/)
- [Repository CRD Reference](https://grafana.com/docs/grafana/<GRAFANA_VERSION>/as-code/observability-as-code/provision-resources/git-sync-setup/)
- [Dashboard CRD Format](https://grafana.com/docs/grafana/<GRAFANA_VERSION>/as-code/observability-as-code/provision-resources/export-resources/)
### Create the repository CRD
Create a `repository.yaml` file defining your Git Sync configuration:
```yaml
apiVersion: provisioning.grafana.app/v0alpha1
kind: Repository
metadata:
name: <REPOSITORY_NAME>
spec:
title: <REPOSITORY_TITLE>
type: github
github:
url: <GITHUB_REPO_URL>
branch: <BRANCH>
path: grafana/
generateDashboardPreviews: true
sync:
enabled: true
intervalSeconds: 60
target: folder
workflows:
- write
- branch
secure:
token:
create: <GITHUB_PAT>
```
Replace the placeholders with your values:
- _`<REPOSITORY_NAME>`_: Unique identifier for this repository resource
- _`<REPOSITORY_TITLE>`_: Human-readable name displayed in Grafana UI
- _`<GITHUB_REPO_URL>`_: GitHub repository URL
- _`<BRANCH>`_: Branch to sync
- _`<GITHUB_PAT>`_: GitHub Personal Access Token
{{< admonition type="note" >}}
Only `target: folder` is currently supported for Git Sync.
{{< /admonition >}}
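Because `secure.token.create` holds a plain-text personal access token, you may not want to commit `repository.yaml` as-is. One hedged approach (the file names and placeholder token below are assumptions for illustration, not part of Git Sync itself) is to keep a template under version control and substitute the token at push time:

```shell
# Keep only the template in version control; generate the real CRD at push time.
cat > repository.tpl.yaml <<'EOF'
secure:
  token:
    create: GITHUB_PAT_PLACEHOLDER
EOF

GITHUB_PAT=ghp_example_token   # hypothetical token for illustration
sed "s/GITHUB_PAT_PLACEHOLDER/${GITHUB_PAT}/" repository.tpl.yaml > repository.yaml
grep 'create:' repository.yaml
```

You would then run `grafanactl resources push` against the directory containing the generated file, and keep `repository.yaml` itself out of version control (for example, via `.gitignore`).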
#### Configuration parameters
The following configuration parameters are available:
| Field | Description |
| --------------------------------------- | ----------------------------------------------------------- |
| `metadata.name` | Unique identifier for this repository resource |
| `spec.title` | Human-readable name displayed in Grafana UI |
| `spec.type` | Repository type (`github`) |
| `spec.github.url` | GitHub repository URL |
| `spec.github.branch` | Branch to sync |
| `spec.github.path` | Directory path containing dashboards |
| `spec.github.generateDashboardPreviews` | Generate preview images (true/false) |
| `spec.sync.enabled` | Enable synchronization (true/false) |
| `spec.sync.intervalSeconds` | Sync interval in seconds |
| `spec.sync.target` | Where to place synced dashboards (`folder`) |
| `spec.workflows` | Enabled workflows: `write` (direct commits), `branch` (PRs) |
| `secure.token.create` | GitHub Personal Access Token |
### Push the repository CRD to Grafana
Before pushing any resources, configure `grafanactl` with your Grafana instance details. Refer to the [grafanactl configuration documentation](https://grafana.github.io/grafanactl/) for setup instructions.
Push the repository configuration:
```sh
grafanactl resources push --path <DIRECTORY>
```
The `--path` parameter must point to the directory containing your `repository.yaml` file.
After pushing, Grafana will:
1. Create the repository resource
1. Connect to your GitHub repository
1. Pull dashboards from the specified path
1. Begin syncing at the configured interval
### Manage repository resources
#### List repositories
To list all repositories:
```sh
grafanactl resources get repositories
```
#### Get repository details
To get details for a specific repository:
```sh
grafanactl resources get repository/<REPOSITORY_NAME>
grafanactl resources get repository/<REPOSITORY_NAME> -o json
grafanactl resources get repository/<REPOSITORY_NAME> -o yaml
```
#### Update the repository
To update a repository:
```sh
grafanactl resources edit repository/<REPOSITORY_NAME>
```
#### Delete the repository
To delete a repository:
```sh
grafanactl resources delete repository/<REPOSITORY_NAME>
```
### Verify setup
Check that Git Sync is working:
```sh
# List repositories
grafanactl resources get repositories
# Check Grafana UI
# Navigate to: Administration → Provisioning → Git Sync
```
## Verify your dashboards in Grafana
To verify that your dashboards are available at the location that you specified, click **Dashboards**. The name of the dashboard is listed in the **Name** column.
Now that your dashboards have been synced from a repository, you can customize the name, change the branch, and create a pull request (PR) for it. Refer to [Manage provisioned repositories with Git Sync](https://grafana.com/docs/grafana/<GRAFANA_VERSION>/as-code/observability-as-code/provision-resources/use-git-sync/) for more information.
## Extend Git Sync for real-time notification and image rendering
Optionally, you can extend Git Sync by enabling pull request notifications and image previews of dashboard changes.
| Capability | Benefit | Requires |
| ----------------------------------------------------- | ------------------------------------------------------------------------------ | -------------------------------------- |
| Adds a table summarizing changes to your pull request | Provides a convenient way to save changes back to GitHub | Webhooks configured |
| Add a dashboard preview image to a PR | View a snapshot of dashboard changes to a pull request without opening Grafana | Image renderer and webhooks configured |
### Set up webhooks for real-time notification and pull request integration
When connecting to a GitHub repository, Git Sync uses webhooks to enable real-time notifications and pull request integration.
You can set up webhooks with whichever service or tooling you prefer. You can use Cloudflare Tunnels with a Cloudflare-managed domain, port-forwarding and DNS options, or a tool such as `ngrok`.
To set up webhooks, you need to expose your Grafana instance to the public Internet. You can do this via port forwarding and DNS, a tool such as `ngrok`, or any other method you prefer. The permissions set in your GitHub access token provide the authorization for this communication.
After you have the public URL, you can add it to your Grafana configuration file:
```ini
[server]
root_url = https://<PUBLIC_DOMAIN>
```
Replace _`<PUBLIC_DOMAIN>`_ with your public domain.
To check the configured webhooks, go to **Administration** > **Provisioning** and click the **View** link for your GitHub repository.
#### Expose necessary paths only
If your security setup doesn't permit publicly exposing the Grafana instance, you can either choose to allowlist the GitHub IP addresses, or expose only the necessary paths.
The path that must be exposed, expressed as a regular expression, is:
- `/apis/provisioning\.grafana\.app/v0(alpha1)?/namespaces/[^/]+/repositories/[^/]+/(webhook|render/.*)$`
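As a sketch, a reverse proxy can enforce this. The following nginx fragment (the server name and upstream address are assumptions; adapt them to your setup) forwards only paths matching the expression above and rejects everything else:

```nginx
server {
    listen 443 ssl;
    server_name grafana-hooks.example.com;  # assumed public hostname

    # Forward only the provisioning webhook and render paths to Grafana.
    location ~ ^/apis/provisioning\.grafana\.app/v0(alpha1)?/namespaces/[^/]+/repositories/[^/]+/(webhook|render/.*)$ {
        proxy_pass http://127.0.0.1:3000;  # assumed local Grafana address
        proxy_set_header Host $host;
    }

    # Everything else stays private.
    location / {
        return 404;
    }
}
```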
### Set up image rendering for dashboard previews
Set up image rendering to add visual previews of dashboard updates directly in pull requests.
To enable this capability, install the Grafana Image Renderer in your Grafana instance. For more information and installation instructions, refer to the [Image Renderer service](https://github.com/grafana/grafana-image-renderer).
## Next steps
You've successfully set up Git Sync to manage your Grafana dashboards through version control. Your dashboards are now synchronized with a GitHub repository, enabling collaborative development and change tracking.
To learn more about using Git Sync:
- [Work with provisioned dashboards](https://grafana.com/docs/grafana/<GRAFANA_VERSION>/as-code/observability-as-code/provision-resources/provisioned-dashboards/)
- [Manage provisioned repositories with Git Sync](https://grafana.com/docs/grafana/<GRAFANA_VERSION>/as-code/observability-as-code/provision-resources/use-git-sync/)
- [Export resources](https://grafana.com/docs/grafana/<GRAFANA_VERSION>/as-code/observability-as-code/provision-resources/export-resources/)
- [grafanactl documentation](https://grafana.github.io/grafanactl/)

---
description: Learn how to troubleshoot common problems with the Grafana MySQL data source plugin
keywords:
- grafana
- mysql
- query
labels:
products:
- cloud
- enterprise
- oss
menuTitle: Troubleshoot
title: Troubleshoot common problems with the Grafana MySQL data source plugin
weight: 40
refs:
variables:
- pattern: /docs/grafana/
destination: /docs/grafana/<GRAFANA_VERSION>/dashboards/variables/
- pattern: /docs/grafana-cloud/
destination: /docs/grafana-cloud/visualizations/dashboards/variables/
variable-syntax-advanced-variable-format-options:
- pattern: /docs/grafana/
destination: /docs/grafana/<GRAFANA_VERSION>/dashboards/variables/variable-syntax/#advanced-variable-format-options
- pattern: /docs/grafana-cloud/
destination: /docs/grafana-cloud/visualizations/dashboards/variables/variable-syntax/#advanced-variable-format-options
annotate-visualizations:
- pattern: /docs/grafana/
destination: /docs/grafana/<GRAFANA_VERSION>/dashboards/build-dashboards/annotate-visualizations/
- pattern: /docs/grafana-cloud/
destination: /docs/grafana-cloud/visualizations/dashboards/build-dashboards/annotate-visualizations/
explore:
- pattern: /docs/grafana/
destination: /docs/grafana/<GRAFANA_VERSION>/explore/
- pattern: /docs/grafana-cloud/
destination: /docs/grafana/<GRAFANA_VERSION>/explore/
query-transform-data:
- pattern: /docs/grafana/
destination: /docs/grafana/<GRAFANA_VERSION>/panels-visualizations/query-transform-data/
- pattern: /docs/grafana-cloud/
destination: /docs/grafana-cloud/visualizations/panels-visualizations/query-transform-data/
panel-inspector:
- pattern: /docs/grafana/
destination: /docs/grafana/<GRAFANA_VERSION>/panels-visualizations/panel-inspector/
- pattern: /docs/grafana-cloud/
destination: /docs/grafana-cloud/visualizations/panels-visualizations/panel-inspector/
query-editor:
- pattern: /docs/grafana/
destination: /docs/grafana/<GRAFANA_VERSION>/panels-visualizations/query-transform-data/#query-editors
- pattern: /docs/grafana-cloud/
destination: /docs/grafana-cloud/visualizations/panels-visualizations/query-transform-data/#query-editors
alert-rules:
- pattern: /docs/grafana/
destination: /docs/grafana/<GRAFANA_VERSION>/alerting/fundamentals/alert-rules/
- pattern: /docs/grafana-cloud/
destination: /docs/grafana-cloud/alerting-and-irm/alerting/alerting-rules/
template-annotations-and-labels:
- pattern: /docs/grafana/
destination: /docs/grafana/<GRAFANA_VERSION>/alerting/alerting-rules/templates/
- pattern: /docs/grafana-cloud/
destination: /docs/grafana-cloud/alerting-and-irm/alerting/alerting-rules/templates/
configure-standard-options:
- pattern: /docs/grafana/
destination: /docs/grafana/<GRAFANA_VERSION>/panels-visualizations/configure-standard-options/
---
# Troubleshoot common problems with the Grafana MySQL data source plugin
This page lists common issues you might experience when setting up the Grafana MySQL data source plugin.
### My data source connection fails when using the Grafana MySQL data source plugin
- Check if the MySQL server is up and running.
- Make sure that your firewall allows connections to the MySQL server (the default port is `3306`).
- Ensure that the user has the correct permissions to access both the MySQL server and the database.
- If the error persists, create a new MySQL user with the correct permissions for the Grafana MySQL data source plugin and try connecting with it.
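A quick way to separate firewall problems from credential problems is to test raw TCP reachability first. This bash sketch (the host name is a placeholder) uses bash's `/dev/tcp` redirection, so it checks the port without needing the `mysql` client installed:

```shell
# Return success if a TCP connection to the given host/port can be opened.
check_port() {
  timeout 5 bash -c "cat < /dev/null > /dev/tcp/$1/$2" 2>/dev/null
}

# mysql.example.com is a placeholder; 3306 is the MySQL default port.
if check_port mysql.example.com 3306; then
  echo "port reachable - investigate credentials and permissions instead"
else
  echo "port blocked or server down - check firewall and server status"
fi
```

If the port is reachable but the connection still fails, the problem is almost certainly credentials or database permissions rather than networking.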
### What should I do if I see "An unexpected error happened" or "Could not connect to MySQL" after trying all of the above?
- Check the Grafana logs for more details about the error.
- For Grafana Cloud customers, contact support.

@@ -8,6 +8,7 @@ test.use({
scopeFilters: true,
groupByVariable: true,
reloadDashboardsOnParamsChange: true,
useScopesNavigationEndpoint: true,
},
});
@@ -61,31 +62,6 @@ test.describe('Scope Redirect Functionality', () => {
});
});
test('should fall back to scope navigation when no redirectUrl', async ({ page, gotoDashboardPage }) => {
const scopes = testScopesWithRedirect();
await test.step('Navigate to dashboard and open scopes selector', async () => {
await gotoDashboardPage({ uid: 'cuj-dashboard-1' });
await openScopesSelector(page, scopes);
});
await test.step('Select scope without redirectUrl', async () => {
// Select the scope without redirectUrl directly
await selectScope(page, 'sn-redirect-fallback', scopes[1]);
});
await test.step('Apply scopes and verify fallback behavior', async () => {
await applyScopes(page, [scopes[1]]);
// Should stay on current dashboard since no redirectUrl is provided
// The scope navigation fallback should not redirect (as per existing behavior)
await expect(page).toHaveURL(/\/d\/cuj-dashboard-1/);
// Verify the scope was applied
await expect(page).toHaveURL(/scopes=scope-sn-redirect-fallback/);
});
});
test('should not redirect when reloading page on dashboard not in dashboard list', async ({
page,
gotoDashboardPage,
@@ -171,4 +147,47 @@ test.describe('Scope Redirect Functionality', () => {
await expect(page).not.toHaveURL(/scopes=/);
});
});
test('should not redirect to redirectPath when on active scope navigation', async ({ page, gotoDashboardPage }) => {
const scopes = testScopesWithRedirect();
await test.step('Set up scope navigation to dashboard-1', async () => {
// First, apply a scope that creates scope navigation to dashboard-1 (without redirectPath)
await gotoDashboardPage({ uid: 'cuj-dashboard-1' });
await openScopesSelector(page, scopes);
await selectScope(page, 'sn-redirect-setup', scopes[2]);
await applyScopes(page, [scopes[2]]);
// Verify we're on dashboard-1 with the scope applied
await expect(page).toHaveURL(/\/d\/cuj-dashboard-1/);
await expect(page).toHaveURL(/scopes=scope-sn-redirect-setup/);
});
await test.step('Navigate to dashboard-1 to be on active scope navigation', async () => {
// Navigate to dashboard-1 which is now a scope navigation target
await gotoDashboardPage({
uid: 'cuj-dashboard-1',
queryParams: new URLSearchParams({ scopes: 'scope-sn-redirect-setup' }),
});
// Verify we're on dashboard-1
await expect(page).toHaveURL(/\/d\/cuj-dashboard-1/);
});
await test.step('Apply scope with redirectPath and verify no redirect', async () => {
// Now apply a different scope that has redirectPath
// Since we're on an active scope navigation, it should NOT redirect
await openScopesSelector(page, scopes);
await selectScope(page, 'sn-redirect-with-navigation', scopes[3]);
await applyScopes(page, [scopes[3]]);
// Verify the new scope was applied
await expect(page).toHaveURL(/scopes=scope-sn-redirect-with-navigation/);
// Since we're already on the active scope navigation (dashboard-1),
// we should NOT redirect to redirectPath (dashboard-3)
await expect(page).toHaveURL(/\/d\/cuj-dashboard-1/);
await expect(page).not.toHaveURL(/\/d\/cuj-dashboard-3/);
});
});
});

@@ -156,13 +156,18 @@ export async function applyScopes(page: Page, scopes?: TestScope[]) {
return;
}
const url: string =
const dashboardBindingsUrl: string =
'**/apis/scope.grafana.app/v0alpha1/namespaces/*/find/scope_dashboard_bindings?' +
scopes.map((scope) => `scope=scope-${scope.name}`).join('&');
const scopeNavigationsUrl: string =
'**/apis/scope.grafana.app/v0alpha1/namespaces/*/find/scope_navigations?' +
scopes.map((scope) => `scope=scope-${scope.name}`).join('&');
const groups: string[] = ['Most relevant', 'Dashboards', 'Something else', ''];
await page.route(url, async (route) => {
// Mock scope_dashboard_bindings endpoint
await page.route(dashboardBindingsUrl, async (route) => {
await route.fulfill({
status: 200,
contentType: 'application/json',
@@ -215,7 +220,52 @@ export async function applyScopes(page: Page, scopes?: TestScope[]) {
});
});
const responsePromise = page.waitForResponse((response) => response.url().includes(`/find/scope_dashboard_bindings`));
// Mock scope_navigations endpoint
await page.route(scopeNavigationsUrl, async (route) => {
await route.fulfill({
status: 200,
contentType: 'application/json',
body: JSON.stringify({
apiVersion: 'scope.grafana.app/v0alpha1',
items: scopes.flatMap((scope) => {
const navigations: Array<{
kind: string;
apiVersion: string;
metadata: { name: string; resourceVersion: string; creationTimestamp: string };
spec: { url: string; scope: string };
status: { title: string };
}> = [];
// Create a scope navigation if dashboardUid is provided
if (scope.dashboardUid && scope.addLinks) {
navigations.push({
kind: 'ScopeNavigation',
apiVersion: 'scope.grafana.app/v0alpha1',
metadata: {
name: `scope-${scope.name}-nav`,
resourceVersion: '1',
creationTimestamp: 'stamp',
},
spec: {
url: `/d/${scope.dashboardUid}`,
scope: `scope-${scope.name}`,
},
status: {
title: scope.dashboardTitle ?? scope.title,
},
});
}
return navigations;
}),
}),
});
});
const responsePromise = page.waitForResponse(
(response) =>
response.url().includes(`/find/scope_dashboard_bindings`) || response.url().includes(`/find/scope_navigations`)
);
const scopeRequestPromises: Array<Promise<Response>> = [];
for (const scope of scopes) {

@@ -124,5 +124,23 @@ export const testScopesWithRedirect = (): TestScope[] => {
dashboardTitle: 'CUJ Dashboard 2',
addLinks: true,
},
{
name: 'sn-redirect-setup',
title: 'Setup Navigation',
// No redirectPath - used to set up scope navigation to dashboard-1
filters: [{ key: 'namespace', operator: 'equals', value: 'setup-nav' }],
dashboardUid: 'cuj-dashboard-1', // Creates scope navigation to this dashboard
dashboardTitle: 'CUJ Dashboard 1',
addLinks: true,
},
{
name: 'sn-redirect-with-navigation',
title: 'Redirect With Navigation',
redirectPath: '/d/cuj-dashboard-3', // Redirect target
filters: [{ key: 'namespace', operator: 'equals', value: 'redirect-with-nav' }],
dashboardUid: 'cuj-dashboard-1', // Creates scope navigation to this dashboard
dashboardTitle: 'CUJ Dashboard 1',
addLinks: true,
},
];
};

go.mod
@@ -48,7 +48,7 @@ require (
github.com/blugelabs/bluge_segment_api v0.2.0 // @grafana/grafana-backend-group
github.com/bradfitz/gomemcache v0.0.0-20230905024940-24af94b03874 // @grafana/grafana-backend-group
github.com/bwmarrin/snowflake v0.3.0 // @grafana/grafana-app-platform-squad
github.com/centrifugal/centrifuge v0.37.2 // @grafana/grafana-app-platform-squad
github.com/centrifugal/centrifuge v0.38.0 // @grafana/grafana-app-platform-squad
github.com/crewjam/saml v0.4.14 // @grafana/identity-access-team
github.com/dgraph-io/badger/v4 v4.7.0 // @grafana/grafana-search-and-storage
github.com/dlmiddlecote/sqlstats v1.0.2 // @grafana/grafana-backend-group
@@ -386,7 +386,7 @@ require (
github.com/caio/go-tdigest v3.1.0+incompatible // indirect
github.com/cenkalti/backoff/v4 v4.3.0 // @grafana/alerting-backend
github.com/cenkalti/backoff/v5 v5.0.3 // indirect
github.com/centrifugal/protocol v0.16.2 // indirect
github.com/centrifugal/protocol v0.17.0 // indirect
github.com/cespare/xxhash v1.1.0 // indirect
github.com/cespare/xxhash/v2 v2.3.0 // indirect
github.com/cheekybits/genny v1.0.0 // indirect
@@ -562,7 +562,7 @@ require (
github.com/prometheus/procfs v0.16.1 // indirect
github.com/protocolbuffers/txtpbfmt v0.0.0-20241112170944-20d2c9ebc01d // indirect
github.com/puzpuzpuz/xsync/v2 v2.5.1 // indirect
github.com/redis/rueidis v1.0.64 // indirect
github.com/redis/rueidis v1.0.68 // indirect
github.com/remyoudompheng/bigfft v0.0.0-20230129092748-24d4a6f8daec // indirect
github.com/rivo/uniseg v0.4.7 // indirect
github.com/rogpeppe/go-internal v1.14.1 // indirect
@@ -687,6 +687,7 @@ require (
github.com/moby/term v0.5.0 // indirect
github.com/morikuni/aec v1.0.0 // indirect
github.com/power-devops/perfstat v0.0.0-20240221224432-82ca36839d55 // indirect
github.com/quagmt/udecimal v1.9.0 // indirect
github.com/shirou/gopsutil/v4 v4.25.3 // indirect
github.com/tklauser/go-sysconf v0.3.14 // indirect
github.com/tklauser/numcpus v0.8.0 // indirect

go.sum
@@ -1006,10 +1006,10 @@ github.com/cenkalti/backoff/v5 v5.0.3/go.mod h1:rkhZdG3JZukswDf7f0cwqPNk4K0sa+F9
github.com/census-instrumentation/opencensus-proto v0.2.1/go.mod h1:f6KPmirojxKA12rnyqOA5BBL4O983OfeGPqjHWSTneU=
github.com/census-instrumentation/opencensus-proto v0.3.0/go.mod h1:f6KPmirojxKA12rnyqOA5BBL4O983OfeGPqjHWSTneU=
github.com/census-instrumentation/opencensus-proto v0.4.1/go.mod h1:4T9NM4+4Vw91VeyqjLS6ao50K5bOcLKN6Q42XnYaRYw=
github.com/centrifugal/centrifuge v0.37.2 h1:rerQNvDfYN2FZEkVtb/hvGV7SIrJfEQrKF3MaE8GDlo=
github.com/centrifugal/centrifuge v0.37.2/go.mod h1:aj4iRJGhzi3SlL8iUtVezxway1Xf8g+hmNQkLLO7sS8=
github.com/centrifugal/protocol v0.16.2 h1:KoIHgDeX1fFxyxQoKW+6E8ZTCf5mwGm8JyGoJ5NBMbQ=
github.com/centrifugal/protocol v0.16.2/go.mod h1:Q7OpS/8HMXDnL7f9DpNx24IhG96MP88WPpVTTCdrokI=
github.com/centrifugal/centrifuge v0.38.0 h1:UJTowwc5lSwnpvd3vbrTseODbU7osSggN67RTrJ8EfQ=
github.com/centrifugal/centrifuge v0.38.0/go.mod h1:rcZLARnO5GXOeE9qG7iIPMvERxESespqkSX4cGLCAzo=
github.com/centrifugal/protocol v0.17.0 h1:hD0WczyiG7zrVJcgkQsd5/nhfFXt0Y04SJHV2Z7B1rg=
github.com/centrifugal/protocol v0.17.0/go.mod h1:9MdiYyjw5Bw1+d5Sp4Y0NK+qiuTNyd88nrHJsUUh8k4=
github.com/cespare/xxhash v1.1.0 h1:a6HrQnmkObjyL+Gs60czilIUGqrzKutQD6XZog3p+ko=
github.com/cespare/xxhash v1.1.0/go.mod h1:XrSqR1VqqWfGrhpAt58auRo0WTKS1nRRg3ghfAqPWnc=
github.com/cespare/xxhash/v2 v2.1.1/go.mod h1:VGX0DQ3Q6kWi7AoAeZDth3/j3BFtOZR5XLFGgcrjCOs=
@@ -2334,11 +2334,13 @@ github.com/puzpuzpuz/xsync/v2 v2.5.1 h1:mVGYAvzDSu52+zaGyNjC+24Xw2bQi3kTr4QJ6N9p
github.com/puzpuzpuz/xsync/v2 v2.5.1/go.mod h1:gD2H2krq/w52MfPLE+Uy64TzJDVY7lP2znR9qmR35kU=
github.com/puzpuzpuz/xsync/v4 v4.2.0 h1:dlxm77dZj2c3rxq0/XNvvUKISAmovoXF4a4qM6Wvkr0=
github.com/puzpuzpuz/xsync/v4 v4.2.0/go.mod h1:VJDmTCJMBt8igNxnkQd86r+8KUeN1quSfNKu5bLYFQo=
github.com/quagmt/udecimal v1.9.0 h1:TLuZiFeg0HhS6X8VDa78Y6XTaitZZfh+z5q4SXMzpDQ=
github.com/quagmt/udecimal v1.9.0/go.mod h1:ScmJ/xTGZcEoYiyMMzgDLn79PEJHcMBiJ4NNRT3FirA=
github.com/rcrowley/go-metrics v0.0.0-20181016184325-3113b8401b8a/go.mod h1:bCqnVzQkZxMG4s8nGwiZ5l3QUCyqpo9Y+/ZMZ9VjZe4=
github.com/redis/go-redis/v9 v9.14.0 h1:u4tNCjXOyzfgeLN+vAZaW1xUooqWDqVEsZN0U01jfAE=
github.com/redis/go-redis/v9 v9.14.0/go.mod h1:huWgSWd8mW6+m0VPhJjSSQ+d6Nh1VICQ6Q5lHuCH/Iw=
github.com/redis/rueidis v1.0.64 h1:XqgbueDuNV3qFdVdQwAHJl1uNt90zUuAJuzqjH4cw6Y=
github.com/redis/rueidis v1.0.64/go.mod h1:Lkhr2QTgcoYBhxARU7kJRO8SyVlgUuEkcJO1Y8MCluA=
github.com/redis/rueidis v1.0.68 h1:gept0E45JGxVigWb3zoWHvxEc4IOC7kc4V/4XvN8eG8=
github.com/redis/rueidis v1.0.68/go.mod h1:Lkhr2QTgcoYBhxARU7kJRO8SyVlgUuEkcJO1Y8MCluA=
github.com/remyoudompheng/bigfft v0.0.0-20200410134404-eec4a21b6bb0/go.mod h1:qqbHyh8v60DhA7CoWK5oRCqLrMHRGoxYCSS9EjAz6Eo=
github.com/remyoudompheng/bigfft v0.0.0-20230129092748-24d4a6f8daec h1:W09IVJc94icq4NjY3clb7Lk8O1qJ8BdBEF8z0ibU0rE=
github.com/remyoudompheng/bigfft v0.0.0-20230129092748-24d4a6f8daec/go.mod h1:qqbHyh8v60DhA7CoWK5oRCqLrMHRGoxYCSS9EjAz6Eo=


@@ -708,6 +708,8 @@ github.com/envoyproxy/go-control-plane/envoy v1.32.3/go.mod h1:F6hWupPfh75TBXGKA
github.com/envoyproxy/go-control-plane/envoy v1.32.4/go.mod h1:Gzjc5k8JcJswLjAx1Zm+wSYE20UrLtt7JZMWiWQXQEw=
github.com/envoyproxy/protoc-gen-validate v1.0.4/go.mod h1:qys6tmnRsYrQqIhm2bvKZH4Blx/1gTIZ2UKVY1M+Yew=
github.com/envoyproxy/protoc-gen-validate v1.1.0/go.mod h1:sXRDRVmzEbkM7CVcM06s9shE/m23dg3wzjl0UWqJ2q4=
github.com/ericlagergren/decimal v0.0.0-20240411145413-00de7ca16731 h1:R/ZjJpjQKsZ6L/+Gf9WHbt31GG8NMVcpRqUE+1mMIyo=
github.com/ericlagergren/decimal v0.0.0-20240411145413-00de7ca16731/go.mod h1:M9R1FoZ3y//hwwnJtO51ypFGwm8ZfpxPT/ZLtO1mcgQ=
github.com/evanphx/json-patch v5.6.0+incompatible/go.mod h1:50XU6AFN0ol/bzJsmQLiYLvXMP4fmwYFNcr97nuDLSk=
github.com/evanphx/json-patch/v5 v5.9.11/go.mod h1:3j+LviiESTElxA4p3EMKAB9HXj3/XEtnUf6OZxqIQTM=
github.com/fatih/color v1.15.0/go.mod h1:0h5ZqXfHYED7Bhv2ZJamyIOUej9KtShiJESRwBDUSsw=
@@ -1330,6 +1332,7 @@ github.com/pkg/diff v0.0.0-20210226163009-20ebb0f2a09e h1:aoZm08cpOy4WuID//EZDgc
github.com/pkg/sftp v1.13.1 h1:I2qBYMChEhIjOgazfJmV3/mZM256btk6wkCDRmW7JYs=
github.com/pkg/xattr v0.4.10 h1:Qe0mtiNFHQZ296vRgUjRCoPHPqH7VdTOrZx3g0T+pGA=
github.com/pkg/xattr v0.4.10/go.mod h1:di8WF84zAKk8jzR1UBTEWh9AUlIZZ7M/JNt8e9B6ktU=
github.com/planetscale/vtprotobuf v0.6.0/go.mod h1:t/avpk3KcrXxUnYOhZhMXJlSEyie6gQbtLq5NM3loB8=
github.com/posener/complete v1.2.3 h1:NP0eAhjcjImqslEwo/1hq7gpajME0fTLTezBKDqfXqo=
github.com/power-devops/perfstat v0.0.0-20210106213030-5aafc221ea8c/go.mod h1:OmDBASR4679mdNQnz2pUhc2G8CO2JrUAVFDRBDP/hJE=
github.com/pquerna/cachecontrol v0.1.0 h1:yJMy84ti9h/+OEWa752kBTKv4XC30OtVVHYv/8cTqKc=
@@ -1397,6 +1400,7 @@ github.com/schollz/closestmatch v2.1.0+incompatible/go.mod h1:RtP1ddjLong6gTkbtm
github.com/schollz/progressbar/v3 v3.14.6 h1:GyjwcWBAf+GFDMLziwerKvpuS7ZF+mNTAXIB2aspiZs=
github.com/schollz/progressbar/v3 v3.14.6/go.mod h1:Nrzpuw3Nl0srLY0VlTvC4V6RL50pcEymjy6qyJAaLa0=
github.com/sclevine/spec v1.4.0/go.mod h1:LvpgJaFyvQzRvc1kaDs0bulYwzC70PbiYjC4QnFHkOM=
github.com/segmentio/asm v1.1.4/go.mod h1:Ld3L4ZXGNcSLRg4JBsZ3//1+f/TjYl0Mzen/DQy1EJg=
github.com/segmentio/fasthash v1.0.3 h1:EI9+KE1EwvMLBWwjpRDc+fEM+prwxDYbslddQGtrmhM=
github.com/segmentio/fasthash v1.0.3/go.mod h1:waKX8l2N8yckOgmSsXJi7x1ZfdKZ4x7KRMzBtS3oedY=
github.com/segmentio/parquet-go v0.0.0-20220811205829-7efc157d28af/go.mod h1:PxYdAI6cGd+s1j4hZDQbz3VFgobF5fDA0weLeNWKTE4=
@@ -1935,6 +1939,7 @@ golang.org/x/net v0.0.0-20210428140749-89ef3d95e781/go.mod h1:OJAsFXCWl8Ukc7SiCT
golang.org/x/net v0.0.0-20211123203042-d83791d6bcd9/go.mod h1:9nx3DQGgdP8bBQD5qxJ1jj9UTztislL4KSBs9R2vV5Y=
golang.org/x/net v0.0.0-20211216030914-fe4d6282115f/go.mod h1:9nx3DQGgdP8bBQD5qxJ1jj9UTztislL4KSBs9R2vV5Y=
golang.org/x/net v0.3.0/go.mod h1:MBQ8lrhLObU/6UmLb4fmbmk5OcyYmqtbGd/9yIeKjEE=
golang.org/x/net v0.14.0/go.mod h1:PpSgVXXLK0OxS0F31C1/tv6XNguvCrnXIDrFMspZIUI=
golang.org/x/net v0.16.0/go.mod h1:NxSsAGuq816PNPmqtQdLE42eU2Fs7NoRIZrHJAlaCOE=
golang.org/x/net v0.23.0/go.mod h1:JKghWKKOSdJwpW2GEx0Ja7fmaKnMsbu+MWVZTokSYmg=
golang.org/x/net v0.24.0/go.mod h1:2Q7sJY5mzlzWjKtYUEXSlBWCdyaioyXzRB2RtU8KVE8=
@@ -2001,6 +2006,7 @@ golang.org/x/term v0.32.0/go.mod h1:uZG1FhGx848Sqfsq4/DlJr3xGGsYMu/L5GW4abiaEPQ=
golang.org/x/term v0.33.0/go.mod h1:s18+ql9tYWp1IfpV9DmCtQDDSRBUjKaw9M1eAv5UeF0=
golang.org/x/term v0.34.0/go.mod h1:5jC53AEywhIVebHgPVeg0mj8OD3VO9OzclacVrqpaAw=
golang.org/x/term v0.35.0/go.mod h1:TPGtkTLesOwf2DE8CgVYiZinHAOuy5AYUYT1lENIZnA=
golang.org/x/text v0.12.0/go.mod h1:TvPlkZtksWOMsz7fbANvkp4WM8x/WCo/om8BMLbz+aE=
golang.org/x/text v0.17.0/go.mod h1:BuEKDfySbSR4drPmRPG/7iBdf8hvFMuRexcpahXilzY=
golang.org/x/text v0.21.0/go.mod h1:4IBbMaMmOPCJ8SecivzSH54+73PCFmPWxNTLm+vZkEQ=
golang.org/x/text v0.22.0/go.mod h1:YRoo4H8PVmsu+E3Ou7cqLVH8oXWIHVoX0jqUWALQhfY=
@@ -2077,6 +2083,7 @@ google.golang.org/genproto/googleapis/api v0.0.0-20250825161204-c5933d9347a5/go.
google.golang.org/genproto/googleapis/api v0.0.0-20250929231259-57b25ae835d4/go.mod h1:NnuHhy+bxcg30o7FnVAZbXsPHUDQ9qKWAQKCD7VxFtk=
google.golang.org/genproto/googleapis/bytestream v0.0.0-20250603155806-513f23925822 h1:zWFRixYR5QlotL+Uv3YfsPRENIrQFXiGs+iwqel6fOQ=
google.golang.org/genproto/googleapis/bytestream v0.0.0-20250603155806-513f23925822/go.mod h1:h6yxum/C2qRb4txaZRLDHK8RyS0H/o2oEDeKY4onY/Y=
google.golang.org/genproto/googleapis/rpc v0.0.0-20230822172742-b8732ec3820d/go.mod h1:+Bk1OCOj40wS2hwAMA+aCW9ypzm63QTBBHp6lQ3p+9M=
google.golang.org/genproto/googleapis/rpc v0.0.0-20231002182017-d307bd883b97/go.mod h1:v7nGkzlmW8P3n/bKmWBn2WpBjpOEx8Q6gMueudAmKfY=
google.golang.org/genproto/googleapis/rpc v0.0.0-20231106174013-bbf56f31fb17/go.mod h1:oQ5rr10WTTMvP4A36n8JpR1OrO1BEiV4f78CneXZxkA=
google.golang.org/genproto/googleapis/rpc v0.0.0-20240123012728-ef4313101c80/go.mod h1:PAREbraiVEVGVdTZsVWjSbbTtSyGbAgIIvni8a8CD5s=
@@ -2107,6 +2114,7 @@ google.golang.org/genproto/googleapis/rpc v0.0.0-20251014184007-4626949a642f/go.
google.golang.org/genproto/googleapis/rpc v0.0.0-20251103181224-f26f9409b101/go.mod h1:7i2o+ce6H/6BluujYR+kqX3GKH+dChPTQU19wjRPiGk=
google.golang.org/grpc v1.23.1/go.mod h1:Y5yQAOtifL1yxbo5wqy6BxZv8vAUGQwXBOALyacEbxg=
google.golang.org/grpc v1.24.0/go.mod h1:XDChyiUovWa60DnaeDeZmSW86xtLtjtZbwvSiRnRtcA=
google.golang.org/grpc v1.58.2/go.mod h1:tgX3ZQDlNJGU96V6yHh1T/JeoBQ2TXdr43YbYSsCJk0=
google.golang.org/grpc v1.59.0/go.mod h1:aUPDwccQo6OTjy7Hct4AfBPD1GptF4fyUjIkQ9YtF98=
google.golang.org/grpc v1.61.0/go.mod h1:VUbo7IFqmF1QtCAstipjG0GIoq49KvMe9+h1jFLBNJs=
google.golang.org/grpc v1.62.1/go.mod h1:IWTG0VlJLCh1SkC58F7np9ka9mx/WNkjl4PGJaiq+QE=


@@ -124,7 +124,6 @@
"@types/eslint": "9.6.1",
"@types/eslint-scope": "^8.0.0",
"@types/file-saver": "2.0.7",
"@types/glob": "^9.0.0",
"@types/google.analytics": "^0.0.46",
"@types/gtag.js": "^0.0.20",
"@types/history": "4.7.11",


@@ -1,5 +1,5 @@
/**
* A library containing the different design components of the Grafana ecosystem.
* A library containing e2e selectors for the Grafana ecosystem.
*
* @packageDocumentation
*/


@@ -451,6 +451,19 @@ describe('TableNG', () => {
expect(screen.getByText('A1')).toBeInTheDocument();
expect(screen.getByText('1')).toBeInTheDocument();
});
it('shows full column name in title attribute for truncated headers', () => {
const { container } = render(
<TableNG enableVirtualization={false} data={createBasicDataFrame()} width={800} height={600} />
);
const headers = container.querySelectorAll('[role="columnheader"]');
const firstHeaderSpan = headers[0].querySelector('span');
const secondHeaderSpan = headers[1].querySelector('span');
expect(firstHeaderSpan).toHaveAttribute('title', 'Column A');
expect(secondHeaderSpan).toHaveAttribute('title', 'Column B');
});
});
describe('Footer options', () => {


@@ -55,7 +55,9 @@ const HeaderCell: React.FC<HeaderCellProps> = ({
{showTypeIcons && (
<Icon className={styles.headerCellIcon} name={getFieldTypeIcon(field)} title={field?.type} size="sm" />
)}
<span className={styles.headerCellLabel}>{getDisplayName(field)}</span>
<span className={styles.headerCellLabel} title={displayName}>
{displayName}
</span>
{direction && (
<Icon
className={cx(styles.headerCellIcon, styles.headerSortIcon)}


@@ -111,17 +111,15 @@ func TestGetHomeDashboard(t *testing.T) {
}
}
func newTestLive(t *testing.T, store db.DB) *live.GrafanaLive {
func newTestLive(t *testing.T) *live.GrafanaLive {
features := featuremgmt.WithFeatures()
cfg := setting.NewCfg()
cfg.AppURL = "http://localhost:3000/"
gLive, err := live.ProvideService(nil, cfg,
routing.NewRouteRegister(),
nil, nil, nil, nil,
store,
nil,
&usagestats.UsageStatsMock{T: t},
nil,
features, acimpl.ProvideAccessControl(features),
&dashboards.FakeDashboardService{},
nil, nil)
@@ -751,7 +749,7 @@ func TestIntegrationDashboardAPIEndpoint(t *testing.T) {
hs := HTTPServer{
Cfg: cfg,
ProvisioningService: provisioning.NewProvisioningServiceMock(context.Background()),
Live: newTestLive(t, db.InitTestDB(t)),
Live: newTestLive(t),
QuotaService: quotatest.New(false, nil),
LibraryElementService: &libraryelementsfake.LibraryElementService{},
DashboardService: dashboardService,
@@ -1003,7 +1001,7 @@ func postDashboardScenario(t *testing.T, desc string, url string, routePattern s
hs := HTTPServer{
Cfg: cfg,
ProvisioningService: provisioning.NewProvisioningServiceMock(context.Background()),
Live: newTestLive(t, db.InitTestDB(t)),
Live: newTestLive(t),
QuotaService: quotatest.New(false, nil),
pluginStore: &pluginstore.FakePluginStore{},
LibraryElementService: &libraryelementsfake.LibraryElementService{},
@@ -1043,7 +1041,7 @@ func restoreDashboardVersionScenario(t *testing.T, desc string, url string, rout
hs := HTTPServer{
Cfg: cfg,
ProvisioningService: provisioning.NewProvisioningServiceMock(context.Background()),
Live: newTestLive(t, db.InitTestDB(t)),
Live: newTestLive(t),
QuotaService: quotatest.New(false, nil),
LibraryElementService: &libraryelementsfake.LibraryElementService{},
DashboardService: mock,


@@ -343,7 +343,7 @@ func TestUpdateDataSourceByID_DataSourceNameExists(t *testing.T) {
Cfg: setting.NewCfg(),
AccessControl: acimpl.ProvideAccessControl(featuremgmt.WithFeatures()),
accesscontrolService: actest.FakeService{},
Live: newTestLive(t, nil),
Live: newTestLive(t),
}
sc := setupScenarioContext(t, "/api/datasources/1")
@@ -450,7 +450,7 @@ func TestAPI_datasources_AccessControl(t *testing.T) {
hs.Cfg = setting.NewCfg()
hs.DataSourcesService = &dataSourcesServiceMock{expectedDatasource: &datasources.DataSource{}}
hs.accesscontrolService = actest.FakeService{}
hs.Live = newTestLive(t, hs.SQLStore)
hs.Live = newTestLive(t)
hs.promRegister, hs.dsConfigHandlerRequestsDuration = setupDsConfigHandlerMetrics()
})


@@ -1,11 +0,0 @@
package dtos
import "encoding/json"
type LivePublishCmd struct {
Channel string `json:"channel"`
Data json.RawMessage `json:"data,omitempty"`
}
type LivePublishResponse struct {
}


@@ -11,15 +11,16 @@ import (
"os/signal"
"syscall"
"github.com/prometheus/client_golang/prometheus"
"k8s.io/client-go/rest"
"k8s.io/client-go/transport"
"github.com/grafana/grafana-app-sdk/logging"
"github.com/grafana/grafana-app-sdk/operator"
folder "github.com/grafana/grafana/apps/folder/pkg/apis/folder/v1beta1"
"github.com/grafana/grafana/apps/iam/pkg/app"
"github.com/grafana/grafana/pkg/server"
"github.com/grafana/grafana/pkg/setting"
"github.com/prometheus/client_golang/prometheus"
"k8s.io/client-go/rest"
"k8s.io/client-go/transport"
"github.com/grafana/authlib/authn"
utilnet "k8s.io/apimachinery/pkg/util/net"
@@ -95,7 +96,7 @@ func buildIAMConfigFromSettings(cfg *setting.Cfg, registerer prometheus.Register
if zanzanaURL == "" {
return nil, fmt.Errorf("zanzana_url is required in [operator] section")
}
iamCfg.AppConfig.ZanzanaClientCfg.URL = zanzanaURL
iamCfg.AppConfig.ZanzanaClientCfg.Addr = zanzanaURL
iamCfg.AppConfig.InformerConfig.MaxConcurrentWorkers = operatorSec.Key("max_concurrent_workers").MustUint64(20)


@@ -9,9 +9,13 @@ import (
pluginsapp "github.com/grafana/grafana/apps/plugins/pkg/app"
"github.com/grafana/grafana/apps/plugins/pkg/app/meta"
"github.com/grafana/grafana/pkg/configprovider"
"github.com/grafana/grafana/pkg/services/accesscontrol"
"github.com/grafana/grafana/pkg/services/apiserver"
"github.com/grafana/grafana/pkg/services/apiserver/appinstaller"
grafanaauthorizer "github.com/grafana/grafana/pkg/services/apiserver/auth/authorizer"
"github.com/grafana/grafana/pkg/services/pluginsintegration/pluginassets"
"github.com/grafana/grafana/pkg/services/pluginsintegration/pluginstore"
)
var (
@@ -20,10 +24,20 @@ var (
)
type AppInstaller struct {
metaManager *meta.ProviderManager
cfgProvider configprovider.ConfigProvider
restConfigProvider apiserver.RestConfigProvider
*pluginsapp.PluginAppInstaller
}
func ProvideAppInstaller(accessControlService accesscontrol.Service, accessClient authlib.AccessClient) (*AppInstaller, error) {
func ProvideAppInstaller(
cfgProvider configprovider.ConfigProvider,
restConfigProvider apiserver.RestConfigProvider,
pluginStore pluginstore.Store,
pluginAssetsService *pluginassets.Service,
accessControlService accesscontrol.Service, accessClient authlib.AccessClient,
) (*AppInstaller, error) {
if err := registerAccessControlRoles(accessControlService); err != nil {
return nil, fmt.Errorf("registering access control roles: %w", err)
}
@@ -35,8 +49,8 @@ func ProvideAppInstaller(accessControlService accesscontrol.Service, accessClien
coreProvider := meta.NewCoreProvider()
cloudProvider := meta.NewCatalogProvider(grafanaComAPIURL)
metaProviderManager := meta.NewProviderManager(coreProvider, cloudProvider)
localProvider := meta.NewLocalProvider(pluginStore, pluginAssetsService)
metaProviderManager := meta.NewProviderManager(localProvider, coreProvider, cloudProvider)
authorizer := grafanaauthorizer.NewResourceAuthorizer(accessClient)
i, err := pluginsapp.ProvideAppInstaller(authorizer, metaProviderManager)
if err != nil {
@@ -44,6 +58,9 @@ func ProvideAppInstaller(accessControlService accesscontrol.Service, accessClien
}
return &AppInstaller{
metaManager: metaProviderManager,
cfgProvider: cfgProvider,
restConfigProvider: restConfigProvider,
PluginAppInstaller: i,
}, nil
}


@@ -2,7 +2,6 @@ package appregistry
import (
"github.com/google/wire"
"github.com/grafana/grafana/pkg/registry/apps/quotas"
"github.com/grafana/grafana/pkg/registry/apps/alerting/historian"
"github.com/grafana/grafana/pkg/registry/apps/alerting/notifications"
@@ -14,6 +13,7 @@ import (
"github.com/grafana/grafana/pkg/registry/apps/logsdrilldown"
"github.com/grafana/grafana/pkg/registry/apps/playlist"
"github.com/grafana/grafana/pkg/registry/apps/plugins"
"github.com/grafana/grafana/pkg/registry/apps/quotas"
"github.com/grafana/grafana/pkg/registry/apps/shorturl"
)

pkg/server/wire_gen.go (generated)

@@ -672,10 +672,7 @@ func Initialize(ctx context.Context, cfg *setting.Cfg, opts Options, apiOpts api
starService := starimpl.ProvideService(sqlStore)
searchSearchService := search2.ProvideService(cfg, sqlStore, starService, dashboardService, folderimplService, featureToggles, sortService)
plugincontextProvider := plugincontext.ProvideService(cfg, cacheService, pluginstoreService, cacheServiceImpl, service15, service13, requestConfigProvider)
qsDatasourceClientBuilder := dsquerierclient.NewNullQSDatasourceClientBuilder()
exprService := expr.ProvideService(cfg, middlewareHandler, plugincontextProvider, featureToggles, registerer, tracingService, qsDatasourceClientBuilder)
queryServiceImpl := query.ProvideService(cfg, cacheServiceImpl, exprService, ossDataSourceRequestValidator, middlewareHandler, plugincontextProvider, qsDatasourceClientBuilder)
grafanaLive, err := live.ProvideService(plugincontextProvider, cfg, routeRegisterImpl, pluginstoreService, middlewareHandler, cacheService, cacheServiceImpl, sqlStore, secretsService, usageStats, queryServiceImpl, featureToggles, accessControl, dashboardService, orgService, eventualRestConfigProvider)
grafanaLive, err := live.ProvideService(plugincontextProvider, cfg, routeRegisterImpl, pluginstoreService, middlewareHandler, cacheService, cacheServiceImpl, secretsService, usageStats, featureToggles, accessControl, dashboardService, orgService, eventualRestConfigProvider)
if err != nil {
return nil, err
}
@@ -684,6 +681,8 @@ func Initialize(ctx context.Context, cfg *setting.Cfg, opts Options, apiOpts api
authnAuthenticator := authnimpl.ProvideAuthnServiceAuthenticateOnly(authnimplService)
contexthandlerContextHandler := contexthandler.ProvideService(cfg, authnAuthenticator, featureToggles)
logger := loggermw.Provide(cfg, featureToggles)
qsDatasourceClientBuilder := dsquerierclient.NewNullQSDatasourceClientBuilder()
exprService := expr.ProvideService(cfg, middlewareHandler, plugincontextProvider, featureToggles, registerer, tracingService, qsDatasourceClientBuilder)
ngAlert := metrics2.ProvideService()
repositoryImpl := annotationsimpl.ProvideService(sqlStore, cfg, featureToggles, tagimplService, tracingService, dBstore, dashboardService, registerer)
alertNG, err := ngalert.ProvideService(cfg, featureToggles, cacheServiceImpl, service15, routeRegisterImpl, sqlStore, kvStore, exprService, dataSourceProxyService, quotaService, secretsService, notificationService, ngAlert, folderimplService, accessControl, dashboardService, renderingService, inProcBus, acimplService, repositoryImpl, pluginstoreService, tracingService, dBstore, httpclientProvider, plugincontextProvider, receiverPermissionsService, userService)
@@ -708,6 +707,7 @@ func Initialize(ctx context.Context, cfg *setting.Cfg, opts Options, apiOpts api
}
ossSearchUserFilter := filters.ProvideOSSSearchUserFilter()
ossService := searchusers.ProvideUsersService(cfg, ossSearchUserFilter, userService)
queryServiceImpl := query.ProvideService(cfg, cacheServiceImpl, exprService, ossDataSourceRequestValidator, middlewareHandler, plugincontextProvider, qsDatasourceClientBuilder)
serviceAccountsProxy, err := proxy.ProvideServiceAccountsProxy(cfg, accessControl, acimplService, featureToggles, serviceAccountPermissionsService, serviceAccountsService, routeRegisterImpl)
if err != nil {
return nil, err
@@ -784,7 +784,7 @@ func Initialize(ctx context.Context, cfg *setting.Cfg, opts Options, apiOpts api
if err != nil {
return nil, err
}
appInstaller, err := plugins.ProvideAppInstaller(acimplService, accessClient)
appInstaller, err := plugins.ProvideAppInstaller(configProvider, eventualRestConfigProvider, pluginstoreService, pluginassetsService, acimplService, accessClient)
if err != nil {
return nil, err
}
@@ -1329,10 +1329,7 @@ func InitializeForTest(ctx context.Context, t sqlutil.ITestDB, testingT interfac
starService := starimpl.ProvideService(sqlStore)
searchSearchService := search2.ProvideService(cfg, sqlStore, starService, dashboardService, folderimplService, featureToggles, sortService)
plugincontextProvider := plugincontext.ProvideService(cfg, cacheService, pluginstoreService, cacheServiceImpl, service15, service13, requestConfigProvider)
qsDatasourceClientBuilder := dsquerierclient.NewNullQSDatasourceClientBuilder()
exprService := expr.ProvideService(cfg, middlewareHandler, plugincontextProvider, featureToggles, registerer, tracingService, qsDatasourceClientBuilder)
queryServiceImpl := query.ProvideService(cfg, cacheServiceImpl, exprService, ossDataSourceRequestValidator, middlewareHandler, plugincontextProvider, qsDatasourceClientBuilder)
grafanaLive, err := live.ProvideService(plugincontextProvider, cfg, routeRegisterImpl, pluginstoreService, middlewareHandler, cacheService, cacheServiceImpl, sqlStore, secretsService, usageStats, queryServiceImpl, featureToggles, accessControl, dashboardService, orgService, eventualRestConfigProvider)
grafanaLive, err := live.ProvideService(plugincontextProvider, cfg, routeRegisterImpl, pluginstoreService, middlewareHandler, cacheService, cacheServiceImpl, secretsService, usageStats, featureToggles, accessControl, dashboardService, orgService, eventualRestConfigProvider)
if err != nil {
return nil, err
}
@@ -1341,6 +1338,8 @@ func InitializeForTest(ctx context.Context, t sqlutil.ITestDB, testingT interfac
authnAuthenticator := authnimpl.ProvideAuthnServiceAuthenticateOnly(authnimplService)
contexthandlerContextHandler := contexthandler.ProvideService(cfg, authnAuthenticator, featureToggles)
logger := loggermw.Provide(cfg, featureToggles)
qsDatasourceClientBuilder := dsquerierclient.NewNullQSDatasourceClientBuilder()
exprService := expr.ProvideService(cfg, middlewareHandler, plugincontextProvider, featureToggles, registerer, tracingService, qsDatasourceClientBuilder)
notificationServiceMock := notifications.MockNotificationService()
ngAlert := metrics2.ProvideServiceForTest()
repositoryImpl := annotationsimpl.ProvideService(sqlStore, cfg, featureToggles, tagimplService, tracingService, dBstore, dashboardService, registerer)
@@ -1366,6 +1365,7 @@ func InitializeForTest(ctx context.Context, t sqlutil.ITestDB, testingT interfac
}
ossSearchUserFilter := filters.ProvideOSSSearchUserFilter()
ossService := searchusers.ProvideUsersService(cfg, ossSearchUserFilter, userService)
queryServiceImpl := query.ProvideService(cfg, cacheServiceImpl, exprService, ossDataSourceRequestValidator, middlewareHandler, plugincontextProvider, qsDatasourceClientBuilder)
serviceAccountsProxy, err := proxy.ProvideServiceAccountsProxy(cfg, accessControl, acimplService, featureToggles, serviceAccountPermissionsService, serviceAccountsService, routeRegisterImpl)
if err != nil {
return nil, err
@@ -1442,7 +1442,7 @@ func InitializeForTest(ctx context.Context, t sqlutil.ITestDB, testingT interfac
if err != nil {
return nil, err
}
appInstaller, err := plugins.ProvideAppInstaller(acimplService, accessClient)
appInstaller, err := plugins.ProvideAppInstaller(configProvider, eventualRestConfigProvider, pluginstoreService, pluginassetsService, acimplService, accessClient)
if err != nil {
return nil, err
}


@@ -152,7 +152,7 @@ func ProvideStandaloneAuthZClient(
//nolint:staticcheck // not yet migrated to OpenFeature
zanzanaEnabled := features.IsEnabledGlobally(featuremgmt.FlagZanzana)
zanzanaClient, err := ProvideStandaloneZanzanaClient(cfg, features)
zanzanaClient, err := ProvideStandaloneZanzanaClient(cfg, features, reg)
if err != nil {
return nil, err
}


@@ -4,16 +4,19 @@ import (
"context"
"errors"
"fmt"
"time"
"github.com/fullstorydev/grpchan/inprocgrpc"
authnlib "github.com/grafana/authlib/authn"
authzv1 "github.com/grafana/authlib/authz/proto/v1"
"github.com/grafana/authlib/grpcutils"
"github.com/grafana/authlib/types"
"github.com/grafana/dskit/middleware"
"github.com/grafana/dskit/services"
grpcAuth "github.com/grpc-ecosystem/go-grpc-middleware/v2/interceptors/auth"
openfgav1 "github.com/openfga/api/proto/openfga/v1"
"github.com/prometheus/client_golang/prometheus"
"github.com/prometheus/client_golang/prometheus/promauto"
"google.golang.org/grpc"
"google.golang.org/grpc/credentials"
"google.golang.org/grpc/credentials/insecure"
@@ -43,14 +46,14 @@ func ProvideZanzanaClient(cfg *setting.Cfg, db db.DB, tracer tracing.Tracer, fea
switch cfg.ZanzanaClient.Mode {
case setting.ZanzanaModeClient:
return NewRemoteZanzanaClient(
fmt.Sprintf("stacks-%s", cfg.StackID),
ZanzanaClientConfig{
URL: cfg.ZanzanaClient.Addr,
Token: cfg.ZanzanaClient.Token,
TokenExchangeURL: cfg.ZanzanaClient.TokenExchangeURL,
ServerCertFile: cfg.ZanzanaClient.ServerCertFile,
})
zanzanaConfig := ZanzanaClientConfig{
Addr: cfg.ZanzanaClient.Addr,
Token: cfg.ZanzanaClient.Token,
TokenExchangeURL: cfg.ZanzanaClient.TokenExchangeURL,
TokenNamespace: cfg.ZanzanaClient.TokenNamespace,
ServerCertFile: cfg.ZanzanaClient.ServerCertFile,
}
return NewRemoteZanzanaClient(zanzanaConfig, reg)
case setting.ZanzanaModeEmbedded:
logger := log.New("zanzana.server")
@@ -97,32 +100,33 @@ func ProvideZanzanaClient(cfg *setting.Cfg, db db.DB, tracer tracing.Tracer, fea
// ProvideStandaloneZanzanaClient provides a standalone Zanzana client, without registering the Zanzana service.
// Client connects to a remote Zanzana server specified in the configuration.
func ProvideStandaloneZanzanaClient(cfg *setting.Cfg, features featuremgmt.FeatureToggles) (zanzana.Client, error) {
func ProvideStandaloneZanzanaClient(cfg *setting.Cfg, features featuremgmt.FeatureToggles, reg prometheus.Registerer) (zanzana.Client, error) {
//nolint:staticcheck // not yet migrated to OpenFeature
if !features.IsEnabledGlobally(featuremgmt.FlagZanzana) {
return zClient.NewNoopClient(), nil
}
zanzanaConfig := ZanzanaClientConfig{
URL: cfg.ZanzanaClient.Addr,
Addr: cfg.ZanzanaClient.Addr,
Token: cfg.ZanzanaClient.Token,
TokenExchangeURL: cfg.ZanzanaClient.TokenExchangeURL,
TokenNamespace: cfg.ZanzanaClient.TokenNamespace,
ServerCertFile: cfg.ZanzanaClient.ServerCertFile,
}
return NewRemoteZanzanaClient(cfg.ZanzanaClient.TokenNamespace, zanzanaConfig)
return NewRemoteZanzanaClient(zanzanaConfig, reg)
}
type ZanzanaClientConfig struct {
URL string
Addr string
Token string
TokenExchangeURL string
ServerCertFile string
TokenNamespace string
ServerCertFile string
}
// NewRemoteZanzanaClient creates a new Zanzana client that connects to remote Zanzana server.
func NewRemoteZanzanaClient(namespace string, cfg ZanzanaClientConfig) (zanzana.Client, error) {
func NewRemoteZanzanaClient(cfg ZanzanaClientConfig, reg prometheus.Registerer) (zanzana.Client, error) {
tokenClient, err := authnlib.NewTokenExchangeClient(authnlib.TokenExchangeConfig{
Token: cfg.Token,
TokenExchangeURL: cfg.TokenExchangeURL,
@@ -139,18 +143,25 @@ func NewRemoteZanzanaClient(namespace string, cfg ZanzanaClientConfig) (zanzana.
}
}
authzRequestDuration := promauto.With(reg).NewHistogramVec(prometheus.HistogramOpts{
Name: "authz_zanzana_client_request_duration_seconds",
Help: "Time spent executing requests to zanzana server.",
NativeHistogramBucketFactor: 1.1,
NativeHistogramMaxBucketNumber: 160,
NativeHistogramMinResetDuration: time.Hour,
}, []string{"operation", "status_code"})
unaryInterceptors, streamInterceptors := instrument(authzRequestDuration, middleware.ReportGRPCStatusOption)
dialOptions := []grpc.DialOption{
grpc.WithTransportCredentials(transportCredentials),
grpc.WithPerRPCCredentials(
NewGRPCTokenAuth(
AuthzServiceAudience,
namespace,
tokenClient,
),
NewGRPCTokenAuth(AuthzServiceAudience, cfg.TokenNamespace, tokenClient),
),
grpc.WithChainUnaryInterceptor(unaryInterceptors...),
grpc.WithChainStreamInterceptor(streamInterceptors...),
}
conn, err := grpc.NewClient(cfg.URL, dialOptions...)
conn, err := grpc.NewClient(cfg.Addr, dialOptions...)
if err != nil {
return nil, fmt.Errorf("failed to create zanzana client to remote server: %w", err)
}


@@ -1,48 +0,0 @@
package database
import (
"fmt"
"time"
"github.com/grafana/grafana/pkg/infra/db"
"github.com/grafana/grafana/pkg/infra/localcache"
"github.com/grafana/grafana/pkg/services/live/model"
)
type Storage struct {
store db.DB
cache *localcache.CacheService
}
func NewStorage(store db.DB, cache *localcache.CacheService) *Storage {
return &Storage{store: store, cache: cache}
}
func getLiveMessageCacheKey(orgID int64, channel string) string {
return fmt.Sprintf("live_message_%d_%s", orgID, channel)
}
func (s *Storage) SaveLiveMessage(query *model.SaveLiveMessageQuery) error {
// Come back to saving into database after evaluating database structure.
s.cache.Set(getLiveMessageCacheKey(query.OrgID, query.Channel), model.LiveMessage{
ID: 0, // Not used actually.
OrgID: query.OrgID,
Channel: query.Channel,
Data: query.Data,
Published: time.Now(),
}, 0)
return nil
}
func (s *Storage) GetLiveMessage(query *model.GetLiveMessageQuery) (model.LiveMessage, bool, error) {
// Come back to saving into database after evaluating database structure.
m, ok := s.cache.Get(getLiveMessageCacheKey(query.OrgID, query.Channel))
if !ok {
return model.LiveMessage{}, false, nil
}
msg, ok := m.(model.LiveMessage)
if !ok {
return model.LiveMessage{}, false, fmt.Errorf("unexpected live message type in cache: %T", m)
}
return msg, true, nil
}


@@ -1,18 +0,0 @@
package tests
import (
"testing"
"time"
"github.com/grafana/grafana/pkg/infra/db"
"github.com/grafana/grafana/pkg/infra/localcache"
"github.com/grafana/grafana/pkg/services/live/database"
)
// SetupTestStorage initializes a storage to be used by the integration tests.
// This is required to properly register and execute migrations.
func SetupTestStorage(t *testing.T) *database.Storage {
sqlStore := db.InitTestDB(t)
localCache := localcache.New(time.Hour, time.Hour)
return database.NewStorage(sqlStore, localCache)
}


@@ -1,67 +0,0 @@
package tests
import (
"encoding/json"
"testing"
"github.com/stretchr/testify/require"
"github.com/grafana/grafana/pkg/services/live/model"
"github.com/grafana/grafana/pkg/tests/testsuite"
"github.com/grafana/grafana/pkg/util/testutil"
)
func TestMain(m *testing.M) {
testsuite.Run(m)
}
func TestIntegrationLiveMessage(t *testing.T) {
testutil.SkipIntegrationTestInShortMode(t)
storage := SetupTestStorage(t)
getQuery := &model.GetLiveMessageQuery{
OrgID: 1,
Channel: "test_channel",
}
_, ok, err := storage.GetLiveMessage(getQuery)
require.NoError(t, err)
require.False(t, ok)
saveQuery := &model.SaveLiveMessageQuery{
OrgID: 1,
Channel: "test_channel",
Data: []byte(`{}`),
}
err = storage.SaveLiveMessage(saveQuery)
require.NoError(t, err)
msg, ok, err := storage.GetLiveMessage(getQuery)
require.NoError(t, err)
require.True(t, ok)
require.Equal(t, int64(1), msg.OrgID)
require.Equal(t, "test_channel", msg.Channel)
require.Equal(t, json.RawMessage(`{}`), msg.Data)
require.NotZero(t, msg.Published)
// try saving again, should be replaced.
saveQuery2 := &model.SaveLiveMessageQuery{
OrgID: 1,
Channel: "test_channel",
Data: []byte(`{"input": "hello"}`),
}
err = storage.SaveLiveMessage(saveQuery2)
require.NoError(t, err)
getQuery2 := &model.GetLiveMessageQuery{
OrgID: 1,
Channel: "test_channel",
}
msg2, ok, err := storage.GetLiveMessage(getQuery2)
require.NoError(t, err)
require.True(t, ok)
require.Equal(t, int64(1), msg2.OrgID)
require.Equal(t, "test_channel", msg2.Channel)
require.Equal(t, json.RawMessage(`{"input": "hello"}`), msg2.Data)
require.NotZero(t, msg2.Published)
}


@@ -1,70 +0,0 @@
package features
import (
"context"
"github.com/grafana/grafana-plugin-sdk-go/backend"
"github.com/grafana/grafana/pkg/apimachinery/identity"
"github.com/grafana/grafana/pkg/infra/log"
"github.com/grafana/grafana/pkg/services/live/model"
)
var (
logger = log.New("live.features") // scoped to all features?
)
//go:generate mockgen -destination=broadcast_mock.go -package=features github.com/grafana/grafana/pkg/services/live/features LiveMessageStore
type LiveMessageStore interface {
SaveLiveMessage(query *model.SaveLiveMessageQuery) error
GetLiveMessage(query *model.GetLiveMessageQuery) (model.LiveMessage, bool, error)
}
// BroadcastRunner will simply broadcast all events to `grafana/broadcast/*` channels
// This assumes that data is a JSON object
type BroadcastRunner struct {
liveMessageStore LiveMessageStore
}
func NewBroadcastRunner(liveMessageStore LiveMessageStore) *BroadcastRunner {
return &BroadcastRunner{liveMessageStore: liveMessageStore}
}
// GetHandlerForPath called on init
func (b *BroadcastRunner) GetHandlerForPath(_ string) (model.ChannelHandler, error) {
return b, nil // all dashboards share the same handler
}
// OnSubscribe will let anyone connect to the path
func (b *BroadcastRunner) OnSubscribe(_ context.Context, u identity.Requester, e model.SubscribeEvent) (model.SubscribeReply, backend.SubscribeStreamStatus, error) {
reply := model.SubscribeReply{
Presence: true,
JoinLeave: true,
}
query := &model.GetLiveMessageQuery{
OrgID: u.GetOrgID(),
Channel: e.Channel,
}
msg, ok, err := b.liveMessageStore.GetLiveMessage(query)
if err != nil {
return model.SubscribeReply{}, 0, err
}
if ok {
reply.Data = msg.Data
}
return reply, backend.SubscribeStreamStatusOK, nil
}
// OnPublish is called when a client wants to broadcast on the websocket
func (b *BroadcastRunner) OnPublish(_ context.Context, u identity.Requester, e model.PublishEvent) (model.PublishReply, backend.PublishStreamStatus, error) {
query := &model.SaveLiveMessageQuery{
OrgID: u.GetOrgID(),
Channel: e.Channel,
Data: e.Data,
}
if err := b.liveMessageStore.SaveLiveMessage(query); err != nil {
return model.PublishReply{}, 0, err
}
return model.PublishReply{Data: e.Data}, backend.PublishStreamStatusOK, nil
}

View File

@@ -1,66 +0,0 @@
// Code generated by MockGen. DO NOT EDIT.
// Source: github.com/grafana/grafana/pkg/services/live/features (interfaces: LiveMessageStore)
// Package features is a generated GoMock package.
package features
import (
reflect "reflect"
gomock "github.com/golang/mock/gomock"
model "github.com/grafana/grafana/pkg/services/live/model"
)
// MockLiveMessageStore is a mock of LiveMessageStore interface.
type MockLiveMessageStore struct {
ctrl *gomock.Controller
recorder *MockLiveMessageStoreMockRecorder
}
// MockLiveMessageStoreMockRecorder is the mock recorder for MockLiveMessageStore.
type MockLiveMessageStoreMockRecorder struct {
mock *MockLiveMessageStore
}
// NewMockLiveMessageStore creates a new mock instance.
func NewMockLiveMessageStore(ctrl *gomock.Controller) *MockLiveMessageStore {
mock := &MockLiveMessageStore{ctrl: ctrl}
mock.recorder = &MockLiveMessageStoreMockRecorder{mock}
return mock
}
// EXPECT returns an object that allows the caller to indicate expected use.
func (m *MockLiveMessageStore) EXPECT() *MockLiveMessageStoreMockRecorder {
return m.recorder
}
// GetLiveMessage mocks base method.
func (m *MockLiveMessageStore) GetLiveMessage(arg0 *model.GetLiveMessageQuery) (model.LiveMessage, bool, error) {
m.ctrl.T.Helper()
ret := m.ctrl.Call(m, "GetLiveMessage", arg0)
ret0, _ := ret[0].(model.LiveMessage)
ret1, _ := ret[1].(bool)
ret2, _ := ret[2].(error)
return ret0, ret1, ret2
}
// GetLiveMessage indicates an expected call of GetLiveMessage.
func (mr *MockLiveMessageStoreMockRecorder) GetLiveMessage(arg0 interface{}) *gomock.Call {
mr.mock.ctrl.T.Helper()
return mr.mock.ctrl.RecordCallWithMethodType(mr.mock, "GetLiveMessage", reflect.TypeOf((*MockLiveMessageStore)(nil).GetLiveMessage), arg0)
}
// SaveLiveMessage mocks base method.
func (m *MockLiveMessageStore) SaveLiveMessage(arg0 *model.SaveLiveMessageQuery) error {
m.ctrl.T.Helper()
ret := m.ctrl.Call(m, "SaveLiveMessage", arg0)
ret0, _ := ret[0].(error)
return ret0
}
// SaveLiveMessage indicates an expected call of SaveLiveMessage.
func (mr *MockLiveMessageStoreMockRecorder) SaveLiveMessage(arg0 interface{}) *gomock.Call {
mr.mock.ctrl.T.Helper()
return mr.mock.ctrl.RecordCallWithMethodType(mr.mock, "SaveLiveMessage", reflect.TypeOf((*MockLiveMessageStore)(nil).SaveLiveMessage), arg0)
}

View File

@@ -1,87 +0,0 @@
package features
import (
"context"
"encoding/json"
"testing"
"github.com/golang/mock/gomock"
"github.com/grafana/grafana-plugin-sdk-go/backend"
"github.com/stretchr/testify/require"
"github.com/grafana/grafana/pkg/services/live/model"
"github.com/grafana/grafana/pkg/services/user"
)
func TestNewBroadcastRunner(t *testing.T) {
mockCtrl := gomock.NewController(t)
defer mockCtrl.Finish()
d := NewMockLiveMessageStore(mockCtrl)
br := NewBroadcastRunner(d)
require.NotNil(t, br)
}
func TestBroadcastRunner_OnSubscribe(t *testing.T) {
mockCtrl := gomock.NewController(t)
defer mockCtrl.Finish()
mockDispatcher := NewMockLiveMessageStore(mockCtrl)
channel := "stream/channel/test"
data := json.RawMessage(`{}`)
mockDispatcher.EXPECT().GetLiveMessage(&model.GetLiveMessageQuery{
OrgID: 1,
Channel: channel,
}).DoAndReturn(func(query *model.GetLiveMessageQuery) (model.LiveMessage, bool, error) {
return model.LiveMessage{
Data: data,
}, true, nil
}).Times(1)
br := NewBroadcastRunner(mockDispatcher)
require.NotNil(t, br)
handler, err := br.GetHandlerForPath("test")
require.NoError(t, err)
reply, status, err := handler.OnSubscribe(
context.Background(),
&user.SignedInUser{OrgID: 1, UserID: 2},
model.SubscribeEvent{Channel: channel, Path: "test"},
)
require.NoError(t, err)
require.Equal(t, backend.SubscribeStreamStatusOK, status)
require.Equal(t, data, reply.Data)
require.True(t, reply.Presence)
require.True(t, reply.JoinLeave)
require.False(t, reply.Recover)
}
func TestBroadcastRunner_OnPublish(t *testing.T) {
mockCtrl := gomock.NewController(t)
defer mockCtrl.Finish()
mockDispatcher := NewMockLiveMessageStore(mockCtrl)
channel := "stream/channel/test"
data := json.RawMessage(`{}`)
var orgID int64 = 1
mockDispatcher.EXPECT().SaveLiveMessage(&model.SaveLiveMessageQuery{
OrgID: orgID,
Channel: channel,
Data: data,
}).DoAndReturn(func(query *model.SaveLiveMessageQuery) error {
return nil
}).Times(1)
br := NewBroadcastRunner(mockDispatcher)
require.NotNil(t, br)
handler, err := br.GetHandlerForPath("test")
require.NoError(t, err)
reply, status, err := handler.OnPublish(
context.Background(),
&user.SignedInUser{OrgID: 1, UserID: 2},
model.PublishEvent{Channel: channel, Path: "test", Data: data},
)
require.NoError(t, err)
require.Equal(t, backend.PublishStreamStatusOK, status)
require.Equal(t, data, reply.Data)
}

View File

@@ -7,9 +7,8 @@ import (
"strings"
"github.com/grafana/grafana-plugin-sdk-go/backend"
"github.com/grafana/grafana/pkg/apimachinery/identity"
"github.com/grafana/grafana/pkg/infra/db"
"github.com/grafana/grafana/pkg/cmd/grafana-cli/logger"
"github.com/grafana/grafana/pkg/services/accesscontrol"
"github.com/grafana/grafana/pkg/services/dashboards"
"github.com/grafana/grafana/pkg/services/live/model"
@@ -35,7 +34,6 @@ type dashboardEvent struct {
type DashboardHandler struct {
Publisher model.ChannelPublisher
ClientCount model.ChannelClientCount
Store db.DB
DashboardService dashboards.DashboardService
AccessControl accesscontrol.AccessControl
}

View File

@@ -5,9 +5,11 @@ import (
"errors"
"github.com/centrifugal/centrifuge"
"github.com/grafana/grafana-plugin-sdk-go/backend"
"github.com/grafana/grafana/pkg/apimachinery/identity"
"github.com/grafana/grafana/pkg/cmd/grafana-cli/logger"
"github.com/grafana/grafana/pkg/plugins"
"github.com/grafana/grafana/pkg/services/live/model"
"github.com/grafana/grafana/pkg/services/live/orgchannel"

View File

@@ -15,7 +15,6 @@ import (
"github.com/centrifugal/centrifuge"
"github.com/gobwas/glob"
jsoniter "github.com/json-iterator/go"
"github.com/redis/go-redis/v9"
"go.opentelemetry.io/otel"
"go.opentelemetry.io/otel/attribute"
@@ -25,12 +24,9 @@ import (
"github.com/grafana/grafana-plugin-sdk-go/backend"
"github.com/grafana/grafana-plugin-sdk-go/live"
"github.com/grafana/grafana/pkg/api/dtos"
"github.com/grafana/grafana/pkg/api/response"
"github.com/grafana/grafana/pkg/api/routing"
"github.com/grafana/grafana/pkg/apimachinery/errutil"
"github.com/grafana/grafana/pkg/apimachinery/identity"
"github.com/grafana/grafana/pkg/infra/db"
"github.com/grafana/grafana/pkg/infra/localcache"
"github.com/grafana/grafana/pkg/infra/log"
"github.com/grafana/grafana/pkg/infra/usagestats"
@@ -43,7 +39,6 @@ import (
"github.com/grafana/grafana/pkg/services/dashboards"
"github.com/grafana/grafana/pkg/services/datasources"
"github.com/grafana/grafana/pkg/services/featuremgmt"
"github.com/grafana/grafana/pkg/services/live/database"
"github.com/grafana/grafana/pkg/services/live/features"
"github.com/grafana/grafana/pkg/services/live/livecontext"
"github.com/grafana/grafana/pkg/services/live/liveplugin"
@@ -57,7 +52,6 @@ import (
"github.com/grafana/grafana/pkg/services/org"
"github.com/grafana/grafana/pkg/services/pluginsintegration/plugincontext"
"github.com/grafana/grafana/pkg/services/pluginsintegration/pluginstore"
"github.com/grafana/grafana/pkg/services/query"
"github.com/grafana/grafana/pkg/services/secrets"
"github.com/grafana/grafana/pkg/setting"
"github.com/grafana/grafana/pkg/util"
@@ -80,8 +74,8 @@ type CoreGrafanaScope struct {
func ProvideService(plugCtxProvider *plugincontext.Provider, cfg *setting.Cfg, routeRegister routing.RouteRegister,
pluginStore pluginstore.Store, pluginClient plugins.Client, cacheService *localcache.CacheService,
dataSourceCache datasources.CacheService, sqlStore db.DB, secretsService secrets.Service,
usageStatsService usagestats.Service, queryDataService query.Service, toggles featuremgmt.FeatureToggles,
dataSourceCache datasources.CacheService, secretsService secrets.Service,
usageStatsService usagestats.Service, toggles featuremgmt.FeatureToggles,
accessControl accesscontrol.AccessControl, dashboardService dashboards.DashboardService,
orgService org.Service, configProvider apiserver.RestConfigProvider) (*GrafanaLive, error) {
g := &GrafanaLive{
@@ -93,9 +87,7 @@ func ProvideService(plugCtxProvider *plugincontext.Provider, cfg *setting.Cfg, r
pluginClient: pluginClient,
CacheService: cacheService,
DataSourceCache: dataSourceCache,
SQLStore: sqlStore,
SecretsService: secretsService,
queryDataService: queryDataService,
channels: make(map[string]model.ChannelHandler),
GrafanaScope: CoreGrafanaScope{
Features: make(map[string]model.ChannelHandlerFactory),
@@ -186,14 +178,11 @@ func ProvideService(plugCtxProvider *plugincontext.Provider, cfg *setting.Cfg, r
dash := &features.DashboardHandler{
Publisher: g.Publish,
ClientCount: g.ClientCount,
Store: sqlStore,
DashboardService: dashboardService,
AccessControl: accessControl,
}
g.storage = database.NewStorage(g.SQLStore, g.CacheService)
g.GrafanaScope.Dashboards = dash
g.GrafanaScope.Features["dashboard"] = dash
g.GrafanaScope.Features["broadcast"] = features.NewBroadcastRunner(g.storage)
// Testing watch with just the provisioning support -- this will be removed when it is well validated
//nolint:staticcheck // not yet migrated to OpenFeature
@@ -388,14 +377,14 @@ func ProvideService(plugCtxProvider *plugincontext.Provider, cfg *setting.Cfg, r
UserID: strconv.FormatInt(id, 10),
}
newCtx := centrifuge.SetCredentials(ctx.Req.Context(), cred)
newCtx = livecontext.SetContextSignedUser(newCtx, user)
newCtx = identity.WithRequester(newCtx, user)
r := ctx.Req.WithContext(newCtx)
wsHandler.ServeHTTP(ctx.Resp, r)
}
g.pushWebsocketHandler = func(ctx *contextmodel.ReqContext) {
user := ctx.SignedInUser
newCtx := livecontext.SetContextSignedUser(ctx.Req.Context(), user)
newCtx := identity.WithRequester(ctx.Req.Context(), user)
newCtx = livecontext.SetContextStreamID(newCtx, web.Params(ctx.Req)[":streamId"])
r := ctx.Req.WithContext(newCtx)
pushWSHandler.ServeHTTP(ctx.Resp, r)
@@ -403,7 +392,7 @@ func ProvideService(plugCtxProvider *plugincontext.Provider, cfg *setting.Cfg, r
g.pushPipelineWebsocketHandler = func(ctx *contextmodel.ReqContext) {
user := ctx.SignedInUser
newCtx := livecontext.SetContextSignedUser(ctx.Req.Context(), user)
newCtx := identity.WithRequester(ctx.Req.Context(), user)
newCtx = livecontext.SetContextChannelID(newCtx, web.Params(ctx.Req)["*"])
r := ctx.Req.WithContext(newCtx)
pushPipelineWSHandler.ServeHTTP(ctx.Resp, r)
@@ -475,14 +464,12 @@ type GrafanaLive struct {
RouteRegister routing.RouteRegister
CacheService *localcache.CacheService
DataSourceCache datasources.CacheService
SQLStore db.DB
SecretsService secrets.Service
pluginStore pluginstore.Store
pluginClient plugins.Client
queryDataService query.Service
orgService org.Service
keyPrefix string
keyPrefix string // HA prefix for grafana cloud (since the org is always 1)
node *centrifuge.Node
surveyCaller *survey.Caller
@@ -505,7 +492,6 @@ type GrafanaLive struct {
contextGetter *liveplugin.ContextGetter
runStreamManager *runstream.Manager
storage *database.Storage
usageStatsService usagestats.Service
usageStats usageStats
@@ -673,18 +659,13 @@ func (g *GrafanaLive) HandleDatasourceUpdate(orgID int64, dsUID string) {
}
}
// Use a configuration that's compatible with the standard library
// to minimize the risk of introducing bugs. This will make sure
// that map keys are ordered.
var jsonStd = jsoniter.ConfigCompatibleWithStandardLibrary
func (g *GrafanaLive) handleOnRPC(clientContextWithSpan context.Context, client *centrifuge.Client, e centrifuge.RPCEvent) (centrifuge.RPCReply, error) {
logger.Debug("Client calls RPC", "user", client.UserID(), "client", client.ID(), "method", e.Method)
if e.Method != "grafana.query" {
return centrifuge.RPCReply{}, centrifuge.ErrorMethodNotFound
}
user, ok := livecontext.GetContextSignedUser(clientContextWithSpan)
if !ok {
user, err := identity.GetRequester(clientContextWithSpan)
if err != nil {
logger.Error("No user found in context", "user", client.UserID(), "client", client.ID(), "method", e.Method)
return centrifuge.RPCReply{}, centrifuge.ErrorInternal
}
@@ -694,38 +675,15 @@ func (g *GrafanaLive) handleOnRPC(clientContextWithSpan context.Context, client
return centrifuge.RPCReply{}, centrifuge.ErrorExpired
}
var req dtos.MetricRequest
err := json.Unmarshal(e.Data, &req)
if err != nil {
return centrifuge.RPCReply{}, centrifuge.ErrorBadRequest
}
resp, err := g.queryDataService.QueryData(clientContextWithSpan, user, false, req)
if err != nil {
logger.Error("Error query data", "user", client.UserID(), "client", client.ID(), "method", e.Method, "error", err)
if errors.Is(err, datasources.ErrDataSourceAccessDenied) {
return centrifuge.RPCReply{}, &centrifuge.Error{Code: uint32(http.StatusForbidden), Message: http.StatusText(http.StatusForbidden)}
}
var gfErr errutil.Error
if errors.As(err, &gfErr) && gfErr.Reason.Status() == errutil.StatusBadRequest {
return centrifuge.RPCReply{}, &centrifuge.Error{Code: uint32(http.StatusBadRequest), Message: http.StatusText(http.StatusBadRequest)}
}
return centrifuge.RPCReply{}, centrifuge.ErrorInternal
}
data, err := jsonStd.Marshal(resp)
if err != nil {
logger.Error("Error marshaling query response", "user", client.UserID(), "client", client.ID(), "method", e.Method, "error", err)
return centrifuge.RPCReply{}, centrifuge.ErrorInternal
}
return centrifuge.RPCReply{
Data: data,
}, nil
// RPC events not available
return centrifuge.RPCReply{}, centrifuge.ErrorNotAvailable
}
func (g *GrafanaLive) handleOnSubscribe(clientContextWithSpan context.Context, client *centrifuge.Client, e centrifuge.SubscribeEvent) (centrifuge.SubscribeReply, error) {
logger.Debug("Client wants to subscribe", "user", client.UserID(), "client", client.ID(), "channel", e.Channel)
user, ok := livecontext.GetContextSignedUser(clientContextWithSpan)
if !ok {
user, err := identity.GetRequester(clientContextWithSpan)
if err != nil {
logger.Error("No user found in context", "user", client.UserID(), "client", client.ID(), "channel", e.Channel)
return centrifuge.SubscribeReply{}, centrifuge.ErrorInternal
}
@@ -830,8 +788,8 @@ func (g *GrafanaLive) handleOnSubscribe(clientContextWithSpan context.Context, c
func (g *GrafanaLive) handleOnPublish(clientCtxWithSpan context.Context, client *centrifuge.Client, e centrifuge.PublishEvent) (centrifuge.PublishReply, error) {
logger.Debug("Client wants to publish", "user", client.UserID(), "client", client.ID(), "channel", e.Channel)
user, ok := livecontext.GetContextSignedUser(clientCtxWithSpan)
if !ok {
user, err := identity.GetRequester(clientCtxWithSpan)
if err != nil {
logger.Error("No user found in context", "user", client.UserID(), "client", client.ID(), "channel", e.Channel)
return centrifuge.PublishReply{}, centrifuge.ErrorInternal
}
@@ -1083,7 +1041,7 @@ func (g *GrafanaLive) ClientCount(orgID int64, channel string) (int, error) {
}
func (g *GrafanaLive) HandleHTTPPublish(ctx *contextmodel.ReqContext) response.Response {
cmd := dtos.LivePublishCmd{}
cmd := model.LivePublishCmd{}
if err := web.Bind(ctx.Req, &cmd); err != nil {
return response.Error(http.StatusBadRequest, "bad request data", err)
}
@@ -1122,7 +1080,7 @@ func (g *GrafanaLive) HandleHTTPPublish(ctx *contextmodel.ReqContext) response.R
logger.Error("Error processing input", "user", user, "channel", channel, "error", err)
return response.Error(http.StatusInternalServerError, http.StatusText(http.StatusInternalServerError), nil)
}
return response.JSON(http.StatusOK, dtos.LivePublishResponse{})
return response.JSON(http.StatusOK, model.LivePublishResponse{})
}
}
@@ -1150,7 +1108,7 @@ func (g *GrafanaLive) HandleHTTPPublish(ctx *contextmodel.ReqContext) response.R
}
}
logger.Debug("Publication successful", "identity", ctx.GetID(), "channel", cmd.Channel)
return response.JSON(http.StatusOK, dtos.LivePublishResponse{})
return response.JSON(http.StatusOK, model.LivePublishResponse{})
}
type streamChannelListResponse struct {

View File

@@ -11,20 +11,17 @@ import (
"testing"
"time"
"github.com/centrifugal/centrifuge"
"github.com/go-jose/go-jose/v4"
"github.com/go-jose/go-jose/v4/jwt"
"github.com/stretchr/testify/require"
"github.com/centrifugal/centrifuge"
"github.com/grafana/grafana/pkg/api/routing"
"github.com/grafana/grafana/pkg/apimachinery/identity"
"github.com/grafana/grafana/pkg/infra/db"
"github.com/grafana/grafana/pkg/infra/usagestats"
"github.com/grafana/grafana/pkg/services/accesscontrol/acimpl"
"github.com/grafana/grafana/pkg/services/dashboards"
"github.com/grafana/grafana/pkg/services/featuremgmt"
"github.com/grafana/grafana/pkg/services/live/livecontext"
"github.com/grafana/grafana/pkg/setting"
"github.com/grafana/grafana/pkg/tests/testsuite"
"github.com/grafana/grafana/pkg/util/testutil"
@@ -245,7 +242,7 @@ func Test_handleOnPublish_IDTokenExpiration(t *testing.T) {
t.Run("expired token", func(t *testing.T) {
expiration := time.Now().Add(-time.Hour)
token := createToken(t, &expiration)
ctx := livecontext.SetContextSignedUser(context.Background(), &identity.StaticRequester{IDToken: token})
ctx := identity.WithRequester(context.Background(), &identity.StaticRequester{IDToken: token})
reply, err := g.handleOnPublish(ctx, client, centrifuge.PublishEvent{
Channel: "test",
Data: []byte("test"),
@@ -257,7 +254,7 @@ func Test_handleOnPublish_IDTokenExpiration(t *testing.T) {
t.Run("unexpired token", func(t *testing.T) {
expiration := time.Now().Add(time.Hour)
token := createToken(t, &expiration)
ctx := livecontext.SetContextSignedUser(context.Background(), &identity.StaticRequester{IDToken: token})
ctx := identity.WithRequester(context.Background(), &identity.StaticRequester{IDToken: token})
reply, err := g.handleOnPublish(ctx, client, centrifuge.PublishEvent{
Channel: "test",
Data: []byte("test"),
@@ -280,7 +277,7 @@ func Test_handleOnRPC_IDTokenExpiration(t *testing.T) {
t.Run("expired token", func(t *testing.T) {
expiration := time.Now().Add(-time.Hour)
token := createToken(t, &expiration)
ctx := livecontext.SetContextSignedUser(context.Background(), &identity.StaticRequester{IDToken: token})
ctx := identity.WithRequester(context.Background(), &identity.StaticRequester{IDToken: token})
reply, err := g.handleOnRPC(ctx, client, centrifuge.RPCEvent{
Method: "grafana.query",
Data: []byte("test"),
@@ -292,7 +289,7 @@ func Test_handleOnRPC_IDTokenExpiration(t *testing.T) {
t.Run("unexpired token", func(t *testing.T) {
expiration := time.Now().Add(time.Hour)
token := createToken(t, &expiration)
ctx := livecontext.SetContextSignedUser(context.Background(), &identity.StaticRequester{IDToken: token})
ctx := identity.WithRequester(context.Background(), &identity.StaticRequester{IDToken: token})
reply, err := g.handleOnRPC(ctx, client, centrifuge.RPCEvent{
Method: "grafana.query",
Data: []byte("test"),
@@ -315,7 +312,7 @@ func Test_handleOnSubscribe_IDTokenExpiration(t *testing.T) {
t.Run("expired token", func(t *testing.T) {
expiration := time.Now().Add(-time.Hour)
token := createToken(t, &expiration)
ctx := livecontext.SetContextSignedUser(context.Background(), &identity.StaticRequester{IDToken: token})
ctx := identity.WithRequester(context.Background(), &identity.StaticRequester{IDToken: token})
reply, err := g.handleOnSubscribe(ctx, client, centrifuge.SubscribeEvent{
Channel: "test",
})
@@ -326,7 +323,7 @@ func Test_handleOnSubscribe_IDTokenExpiration(t *testing.T) {
t.Run("unexpired token", func(t *testing.T) {
expiration := time.Now().Add(time.Hour)
token := createToken(t, &expiration)
ctx := livecontext.SetContextSignedUser(context.Background(), &identity.StaticRequester{IDToken: token})
ctx := identity.WithRequester(context.Background(), &identity.StaticRequester{IDToken: token})
reply, err := g.handleOnSubscribe(ctx, client, centrifuge.SubscribeEvent{
Channel: "test",
})
@@ -347,10 +344,8 @@ func setupLiveService(cfg *setting.Cfg, t *testing.T) (*GrafanaLive, error) {
cfg,
routing.NewRouteRegister(),
nil, nil, nil, nil,
db.InitTestDB(t),
nil,
&usagestats.UsageStatsMock{T: t},
nil,
featuremgmt.WithFeatures(),
acimpl.ProvideAccessControl(featuremgmt.WithFeatures()),
&dashboards.FakeDashboardService{},
@@ -361,7 +356,12 @@ type dummyTransport struct {
name string
}
var (
_ centrifuge.Transport = (*dummyTransport)(nil)
)
func (t *dummyTransport) Name() string { return t.name }
func (t *dummyTransport) AcceptProtocol() string { return "" }
func (t *dummyTransport) Protocol() centrifuge.ProtocolType { return centrifuge.ProtocolTypeJSON }
func (t *dummyTransport) ProtocolVersion() centrifuge.ProtocolVersion {
return centrifuge.ProtocolVersion2

View File

@@ -2,27 +2,8 @@ package livecontext
import (
"context"
"github.com/grafana/grafana/pkg/apimachinery/identity"
)
type signedUserContextKeyType int
var signedUserContextKey signedUserContextKeyType
func SetContextSignedUser(ctx context.Context, user identity.Requester) context.Context {
ctx = context.WithValue(ctx, signedUserContextKey, user)
return ctx
}
func GetContextSignedUser(ctx context.Context) (identity.Requester, bool) {
if val := ctx.Value(signedUserContextKey); val != nil {
user, ok := val.(identity.Requester)
return user, ok
}
return nil, false
}
type streamIDContextKey struct{}
func SetContextStreamID(ctx context.Context, streamID string) context.Context {

View File

@@ -67,21 +67,9 @@ type ChannelHandlerFactory interface {
GetHandlerForPath(path string) (ChannelHandler, error)
}
type LiveMessage struct {
ID int64 `xorm:"pk autoincr 'id'"`
OrgID int64 `xorm:"org_id"`
Channel string
Data json.RawMessage
Published time.Time
type LivePublishCmd struct {
Channel string `json:"channel"`
Data json.RawMessage `json:"data,omitempty"`
}
type SaveLiveMessageQuery struct {
OrgID int64 `xorm:"org_id"`
Channel string
Data json.RawMessage
}
type GetLiveMessageQuery struct {
OrgID int64 `xorm:"org_id"`
Channel string
}
type LivePublishResponse struct{}

View File

@@ -6,7 +6,7 @@ import (
"github.com/grafana/grafana-plugin-sdk-go/backend"
"github.com/grafana/grafana/pkg/services/live/livecontext"
"github.com/grafana/grafana/pkg/apimachinery/identity"
"github.com/grafana/grafana/pkg/services/live/model"
)
@@ -25,8 +25,8 @@ func (s *BuiltinDataOutput) Type() string {
}
func (s *BuiltinDataOutput) OutputData(ctx context.Context, vars Vars, data []byte) ([]*ChannelData, error) {
u, ok := livecontext.GetContextSignedUser(ctx)
if !ok {
u, err := identity.GetRequester(ctx)
if err != nil {
return nil, errors.New("user not found in context")
}
handler, _, err := s.channelHandlerGetter.GetChannelHandler(ctx, u, vars.Channel)

View File

@@ -7,7 +7,6 @@ import (
"github.com/grafana/grafana-plugin-sdk-go/live"
"github.com/grafana/grafana/pkg/apimachinery/identity"
"github.com/grafana/grafana/pkg/services/live/livecontext"
"github.com/grafana/grafana/pkg/services/live/model"
)
@@ -30,8 +29,8 @@ func (s *BuiltinSubscriber) Type() string {
}
func (s *BuiltinSubscriber) Subscribe(ctx context.Context, vars Vars, data []byte) (model.SubscribeReply, backend.SubscribeStreamStatus, error) {
u, ok := livecontext.GetContextSignedUser(ctx)
if !ok {
u, err := identity.GetRequester(ctx)
if err != nil {
return model.SubscribeReply{}, backend.SubscribeStreamStatusPermissionDenied, nil
}
handler, _, err := s.channelHandlerGetter.GetChannelHandler(ctx, u, vars.Channel)

View File

@@ -5,7 +5,7 @@ import (
"github.com/grafana/grafana-plugin-sdk-go/backend"
"github.com/grafana/grafana/pkg/services/live/livecontext"
"github.com/grafana/grafana/pkg/apimachinery/identity"
"github.com/grafana/grafana/pkg/services/live/managedstream"
"github.com/grafana/grafana/pkg/services/live/model"
)
@@ -30,8 +30,8 @@ func (s *ManagedStreamSubscriber) Subscribe(ctx context.Context, vars Vars, _ []
logger.Error("Error getting managed stream", "error", err)
return model.SubscribeReply{}, 0, err
}
u, ok := livecontext.GetContextSignedUser(ctx)
if !ok {
u, err := identity.GetRequester(ctx)
if err != nil {
return model.SubscribeReply{}, backend.SubscribeStreamStatusPermissionDenied, nil
}
return stream.OnSubscribe(ctx, u, model.SubscribeEvent{

View File

@@ -4,7 +4,6 @@ import (
"context"
"github.com/grafana/grafana-plugin-sdk-go/backend"
"github.com/grafana/grafana/pkg/services/live/model"
)

View File

@@ -5,6 +5,7 @@ import (
"github.com/gorilla/websocket"
"github.com/grafana/grafana/pkg/apimachinery/identity"
"github.com/grafana/grafana/pkg/services/live/convert"
"github.com/grafana/grafana/pkg/services/live/livecontext"
"github.com/grafana/grafana/pkg/services/live/pipeline"
@@ -44,8 +45,8 @@ func (s *PipelinePushHandler) ServeHTTP(rw http.ResponseWriter, r *http.Request)
return
}
user, ok := livecontext.GetContextSignedUser(r.Context())
if !ok {
user, err := identity.GetRequester(r.Context())
if err != nil {
logger.Error("No user found in context")
rw.WriteHeader(http.StatusInternalServerError)
return

View File

@@ -5,8 +5,9 @@ import (
"time"
"github.com/gorilla/websocket"
liveDto "github.com/grafana/grafana-plugin-sdk-go/live"
"github.com/grafana/grafana/pkg/apimachinery/identity"
"github.com/grafana/grafana/pkg/services/live/convert"
"github.com/grafana/grafana/pkg/services/live/livecontext"
"github.com/grafana/grafana/pkg/services/live/managedstream"
@@ -47,8 +48,8 @@ func (s *Handler) ServeHTTP(rw http.ResponseWriter, r *http.Request) {
return
}
user, ok := livecontext.GetContextSignedUser(r.Context())
if !ok {
user, err := identity.GetRequester(r.Context())
if err != nil {
logger.Error("No user found in context")
rw.WriteHeader(http.StatusInternalServerError)
return

View File

@@ -17,6 +17,7 @@ type Plugin struct {
// App fields
Parent *ParentPlugin
Children []string
IncludedInAppID string
DefaultNavURL string
Pinned bool
@@ -85,6 +86,18 @@ func ToGrafanaDTO(p *plugins.Plugin) Plugin {
dto.Parent = &ParentPlugin{ID: p.Parent.ID}
}
if len(p.Children) > 0 {
children := make([]string, 0, len(p.Children))
for _, child := range p.Children {
if child != nil {
children = append(children, child.ID)
}
}
if len(children) > 0 {
dto.Children = children
}
}
return dto
}
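The child-collection loop added to `ToGrafanaDTO` skips nil children and only assigns the slice when at least one valid ID survives. A standalone sketch of that nil-safe pattern, with a hypothetical `childPlugin` stand-in for the real plugin type:

```go
package main

import "fmt"

// childPlugin is a minimal stand-in; only the field this sketch needs.
type childPlugin struct{ ID string }

// collectChildIDs mirrors the loop above: skip nil entries and return nil
// (leaving the DTO field unset) when no valid IDs were found.
func collectChildIDs(children []*childPlugin) []string {
	if len(children) == 0 {
		return nil
	}
	ids := make([]string, 0, len(children))
	for _, c := range children {
		if c != nil {
			ids = append(ids, c.ID)
		}
	}
	if len(ids) == 0 {
		return nil
	}
	return ids
}

func main() {
	fmt.Println(collectChildIDs([]*childPlugin{{ID: "a"}, nil, {ID: "b"}}))
	fmt.Println(collectChildIDs([]*childPlugin{nil}))
}
```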

View File

@@ -2,27 +2,26 @@ package setting
import (
"context"
"encoding/json"
"fmt"
"io"
"net/http"
"time"
"github.com/grafana/grafana/pkg/infra/tracing"
"github.com/prometheus/client_golang/prometheus"
"go.opentelemetry.io/otel"
"go.opentelemetry.io/otel/trace"
"gopkg.in/ini.v1"
metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
"k8s.io/apimachinery/pkg/apis/meta/v1/unstructured"
"k8s.io/apimachinery/pkg/runtime"
"k8s.io/apimachinery/pkg/runtime/schema"
utilnet "k8s.io/apimachinery/pkg/util/net"
"k8s.io/apimachinery/pkg/runtime/serializer"
"k8s.io/apiserver/pkg/endpoints/request"
"k8s.io/client-go/dynamic"
clientrest "k8s.io/client-go/rest"
"k8s.io/client-go/rest"
"k8s.io/client-go/transport"
authlib "github.com/grafana/authlib/authn"
logging "github.com/grafana/grafana/pkg/infra/log"
"github.com/grafana/grafana/pkg/infra/tracing"
"github.com/grafana/grafana/pkg/semconv"
)
@@ -38,18 +37,11 @@ const (
ApiGroup = "setting.grafana.app"
apiVersion = "v0alpha1"
resource = "settings"
kind = "Setting"
listKind = "SettingList"
)
var settingGroupVersion = schema.GroupVersionResource{
Group: ApiGroup,
Version: apiVersion,
Resource: resource,
}
var settingGroupListKind = map[schema.GroupVersionResource]string{
settingGroupVersion: listKind,
var settingGroupVersion = schema.GroupVersion{
Group: ApiGroup,
Version: apiVersion,
}
type remoteSettingServiceMetrics struct {
@@ -106,10 +98,10 @@ type Service interface {
}
type remoteSettingService struct {
dynamicClient dynamic.Interface
log logging.Logger
pageSize int64
metrics remoteSettingServiceMetrics
restClient *rest.RESTClient
log logging.Logger
pageSize int64
metrics remoteSettingServiceMetrics
}
var _ Service = (*remoteSettingService)(nil)
@@ -126,7 +118,7 @@ type Config struct {
// At least one of WrapTransport or TokenExchangeClient is required.
WrapTransport transport.WrapperFunc
// TLSClientConfig configures TLS for the client connection.
TLSClientConfig clientrest.TLSClientConfig
TLSClientConfig rest.TLSClientConfig
// QPS limits requests per second (defaults to DefaultQPS).
QPS float32
// Burst allows request bursts above QPS (defaults to DefaultBurst).
@@ -145,29 +137,39 @@ type Setting struct {
Value string `json:"value"`
}
// settingResource represents a single Setting resource from the K8s API.
type settingResource struct {
Spec Setting `json:"spec"`
}
// settingListMetadata contains pagination info from the K8s list response.
type settingListMetadata struct {
Continue string `json:"continue,omitempty"`
}
// New creates a Service from the provided configuration.
func New(config Config) (Service, error) {
log := logging.New(LogPrefix)
dynamicClient, err := getDynamicClient(config, log)
restClient, err := getRestClient(config, log)
if err != nil {
return nil, err
return nil, fmt.Errorf("failed to create REST client: %w", err)
}
pageSize := DefaultPageSize
if config.PageSize > 0 {
pageSize = config.PageSize
}
metrics := initMetrics()
return &remoteSettingService{
dynamicClient: dynamicClient,
pageSize: pageSize,
log: log,
metrics: metrics,
restClient: restClient,
log: log,
pageSize: pageSize,
metrics: initMetrics(),
}, nil
}
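The `List` method below pages through the settings API with a continue token and caps the number of pages as a safety net against infinite loops. A minimal sketch of that loop shape, with a stubbed `fetchPage` (hypothetical, standing in for the real K8s-style list call):

```go
package main

import "fmt"

// fetchPage stands in for one K8s-style list call: it returns a page of
// items plus the next token, or -1 when the listing is exhausted (playing
// the role of an empty continue token).
func fetchPage(all []string, pageSize, token int) (items []string, next int) {
	end := token + pageSize
	if end >= len(all) {
		return all[token:], -1
	}
	return all[token:end], end
}

// listAll mirrors the loop in remoteSettingService.List: keep fetching while
// a continue token is present, bounded by an upper page limit.
func listAll(all []string, pageSize, maxPages int) []string {
	var out []string
	token := 0
	for pages := 0; token >= 0 && pages < maxPages; pages++ {
		items, next := fetchPage(all, pageSize, token)
		out = append(out, items...)
		token = next
	}
	return out
}

func main() {
	fmt.Println(listAll([]string{"a", "b", "c", "d", "e"}, 2, 1000))
}
```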
func (m *remoteSettingService) ListAsIni(ctx context.Context, labelSelector metav1.LabelSelector) (*ini.File, error) {
func (s *remoteSettingService) ListAsIni(ctx context.Context, labelSelector metav1.LabelSelector) (*ini.File, error) {
namespace, ok := request.NamespaceFrom(ctx)
ns := semconv.GrafanaNamespaceName(namespace)
ctx, span := tracer.Start(ctx, "remoteSettingService.ListAsIni",
@@ -178,33 +180,34 @@ func (m *remoteSettingService) ListAsIni(ctx context.Context, labelSelector meta
return nil, tracing.Errorf(span, "missing namespace in context")
}
settings, err := m.List(ctx, labelSelector)
settings, err := s.List(ctx, labelSelector)
if err != nil {
return nil, err
}
iniFile, err := m.toIni(settings)
iniFile, err := toIni(settings)
if err != nil {
return nil, tracing.Error(span, err)
}
return iniFile, nil
}
func (m *remoteSettingService) List(ctx context.Context, labelSelector metav1.LabelSelector) ([]*Setting, error) {
func (s *remoteSettingService) List(ctx context.Context, labelSelector metav1.LabelSelector) ([]*Setting, error) {
namespace, ok := request.NamespaceFrom(ctx)
ns := semconv.GrafanaNamespaceName(namespace)
ctx, span := tracer.Start(ctx, "remoteSettingService.List",
trace.WithAttributes(ns))
defer span.End()
if !ok || namespace == "" {
return nil, tracing.Errorf(span, "missing namespace in context")
}
log := m.log.FromContext(ctx).New(ns.Key, ns.Value, "function", "remoteSettingService.List", "traceId", span.SpanContext().TraceID())
log := s.log.FromContext(ctx).New(ns.Key, ns.Value, "function", "remoteSettingService.List", "traceId", span.SpanContext().TraceID())
startTime := time.Now()
var status string
defer func() {
duration := time.Since(startTime).Seconds()
m.metrics.listDuration.WithLabelValues(status).Observe(duration)
s.metrics.listDuration.WithLabelValues(status).Observe(duration)
}()
selector, err := metav1.LabelSelectorAsSelector(&labelSelector)
@@ -216,64 +219,142 @@ func (m *remoteSettingService) List(ctx context.Context, labelSelector metav1.La
log.Debug("empty selector. Fetching all settings")
}
var allSettings []*Setting
// Pre-allocate with estimated capacity
allSettings := make([]*Setting, 0, s.pageSize*8)
var continueToken string
hasNext := true
totalPages := 0
// Cap the number of pages to guard against infinite pagination loops
for hasNext && totalPages < 1000 {
totalPages++
opts := metav1.ListOptions{
Limit: m.pageSize,
Continue: continueToken,
}
if !selector.Empty() {
opts.LabelSelector = selector.String()
}
settingsList, lErr := m.dynamicClient.Resource(settingGroupVersion).Namespace(namespace).List(ctx, opts)
settings, nextToken, lErr := s.fetchPage(ctx, namespace, selector.String(), continueToken)
if lErr != nil {
status = "error"
return nil, tracing.Error(span, lErr)
}
for i := range settingsList.Items {
setting, pErr := parseSettingResource(&settingsList.Items[i])
if pErr != nil {
status = "error"
return nil, tracing.Error(span, pErr)
}
allSettings = append(allSettings, setting)
}
continueToken = settingsList.GetContinue()
allSettings = append(allSettings, settings...)
continueToken = nextToken
if continueToken == "" {
hasNext = false
}
}
status = "success"
m.metrics.listResultSize.WithLabelValues(status).Observe(float64(len(allSettings)))
s.metrics.listResultSize.WithLabelValues(status).Observe(float64(len(allSettings)))
return allSettings, nil
}
func parseSettingResource(setting *unstructured.Unstructured) (*Setting, error) {
spec, found, err := unstructured.NestedMap(setting.Object, "spec")
func (s *remoteSettingService) fetchPage(ctx context.Context, namespace, labelSelector, continueToken string) ([]*Setting, string, error) {
req := s.restClient.Get().
Resource(resource).
Namespace(namespace).
Param("limit", fmt.Sprintf("%d", s.pageSize))
if labelSelector != "" {
req = req.Param("labelSelector", labelSelector)
}
if continueToken != "" {
req = req.Param("continue", continueToken)
}
stream, err := req.Stream(ctx)
if err != nil {
return nil, fmt.Errorf("failed to get spec from setting: %w", err)
}
if !found {
return nil, fmt.Errorf("spec not found in setting %s", setting.GetName())
return nil, "", fmt.Errorf("request failed: %w", err)
}
defer func() { _ = stream.Close() }()
var result Setting
if err := runtime.DefaultUnstructuredConverter.FromUnstructured(spec, &result); err != nil {
return nil, fmt.Errorf("failed to convert spec to Setting: %w", err)
}
return &result, nil
return parseSettingList(stream)
}
func (m *remoteSettingService) toIni(settings []*Setting) (*ini.File, error) {
// parseSettingList parses a SettingList JSON response using token-by-token streaming.
func parseSettingList(r io.Reader) ([]*Setting, string, error) {
decoder := json.NewDecoder(r)
// The first page may contain a large number of items, so pre-allocate generously.
settings := make([]*Setting, 0, 1600)
var continueToken string
// Consume the opening '{' of the object
if _, err := decoder.Token(); err != nil {
return nil, "", fmt.Errorf("expected start of object: %w", err)
}
for decoder.More() {
// Read field name
tok, err := decoder.Token()
if err != nil {
return nil, "", fmt.Errorf("failed to read field name: %w", err)
}
fieldName, ok := tok.(string)
if !ok {
continue
}
switch fieldName {
case "metadata":
var meta settingListMetadata
if err := decoder.Decode(&meta); err != nil {
return nil, "", fmt.Errorf("failed to decode metadata: %w", err)
}
continueToken = meta.Continue
case "items":
// Parse items array token-by-token
itemSettings, err := parseItems(decoder)
if err != nil {
return nil, "", err
}
settings = append(settings, itemSettings...)
default:
// Skip unknown fields
var skip json.RawMessage
if err := decoder.Decode(&skip); err != nil {
return nil, "", fmt.Errorf("failed to skip field %s: %w", fieldName, err)
}
}
}
return settings, continueToken, nil
}
func parseItems(decoder *json.Decoder) ([]*Setting, error) {
// Expect start of array
tok, err := decoder.Token()
if err != nil {
return nil, fmt.Errorf("expected start of items array: %w", err)
}
if tok != json.Delim('[') {
return nil, fmt.Errorf("expected '[', got %v", tok)
}
settings := make([]*Setting, 0, DefaultPageSize)
// Parse each item
for decoder.More() {
var item settingResource
if err := decoder.Decode(&item); err != nil {
return nil, fmt.Errorf("failed to decode setting item: %w", err)
}
settings = append(settings, &Setting{
Section: item.Spec.Section,
Key: item.Spec.Key,
Value: item.Spec.Value,
})
}
// Consume end of array
if _, err := decoder.Token(); err != nil {
return nil, fmt.Errorf("expected end of items array: %w", err)
}
return settings, nil
}
func toIni(settings []*Setting) (*ini.File, error) {
conf := ini.Empty()
for _, setting := range settings {
if !conf.HasSection(setting.Section) {
@@ -287,7 +368,7 @@ func (m *remoteSettingService) toIni(settings []*Setting) (*ini.File, error) {
return conf, nil
}
func getDynamicClient(config Config, log logging.Logger) (dynamic.Interface, error) {
func getRestClient(config Config, log logging.Logger) (*rest.RESTClient, error) {
if config.URL == "" {
return nil, fmt.Errorf("URL cannot be empty")
}
@@ -296,7 +377,7 @@ func getDynamicClient(config Config, log logging.Logger) (dynamic.Interface, err
}
wrapTransport := config.WrapTransport
if config.WrapTransport == nil {
if wrapTransport == nil {
log.Debug("using default wrapTransport with TokenExchangeClient")
wrapTransport = func(rt http.RoundTripper) http.RoundTripper {
return &authRoundTripper{
@@ -316,13 +397,21 @@ func getDynamicClient(config Config, log logging.Logger) (dynamic.Interface, err
burst = config.Burst
}
return dynamic.NewForConfig(&clientrest.Config{
restConfig := &rest.Config{
Host: config.URL,
WrapTransport: wrapTransport,
TLSClientConfig: config.TLSClientConfig,
WrapTransport: wrapTransport,
QPS: qps,
Burst: burst,
})
// Configure for our API group
APIPath: "/apis",
ContentConfig: rest.ContentConfig{
GroupVersion: &settingGroupVersion,
NegotiatedSerializer: serializer.NewCodecFactory(nil).WithoutConversion(),
},
}
return rest.RESTClientFor(restConfig)
}
// authRoundTripper wraps an HTTP transport with token-based authentication.
@@ -341,10 +430,9 @@ func (a *authRoundTripper) RoundTrip(req *http.Request) (*http.Response, error)
if err != nil {
return nil, fmt.Errorf("failed to exchange token: %w", err)
}
req = utilnet.CloneRequest(req)
req.Header.Set("X-Access-Token", fmt.Sprintf("Bearer %s", token.Token))
return a.transport.RoundTrip(req)
reqCopy := req.Clone(req.Context())
reqCopy.Header.Set("X-Access-Token", fmt.Sprintf("Bearer %s", token.Token))
return a.transport.RoundTrip(reqCopy)
}
func initMetrics() remoteSettingServiceMetrics {
@@ -373,12 +461,12 @@ func initMetrics() remoteSettingServiceMetrics {
return metrics
}
func (m *remoteSettingService) Describe(descs chan<- *prometheus.Desc) {
m.metrics.listDuration.Describe(descs)
m.metrics.listResultSize.Describe(descs)
func (s *remoteSettingService) Describe(descs chan<- *prometheus.Desc) {
s.metrics.listDuration.Describe(descs)
s.metrics.listResultSize.Describe(descs)
}
func (m *remoteSettingService) Collect(metrics chan<- prometheus.Metric) {
m.metrics.listDuration.Collect(metrics)
m.metrics.listResultSize.Collect(metrics)
func (s *remoteSettingService) Collect(metrics chan<- prometheus.Metric) {
s.metrics.listDuration.Collect(metrics)
s.metrics.listResultSize.Collect(metrics)
}


@@ -1,69 +1,36 @@
package setting
import (
"bytes"
"context"
"fmt"
"net/http"
"net/http/httptest"
"strings"
"testing"
"github.com/stretchr/testify/assert"
"github.com/stretchr/testify/require"
metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
"k8s.io/apimachinery/pkg/apis/meta/v1/unstructured"
"k8s.io/apimachinery/pkg/runtime"
"k8s.io/apiserver/pkg/endpoints/request"
"k8s.io/client-go/dynamic/fake"
k8testing "k8s.io/client-go/testing"
authlib "github.com/grafana/authlib/authn"
"github.com/grafana/grafana/pkg/infra/log"
)
func TestRemoteSettingService_ListAsIni(t *testing.T) {
t.Run("should filter settings by label selector", func(t *testing.T) {
// Create multiple settings, only some matching the selector
setting1 := newUnstructuredSetting("test-namespace", Setting{Section: "database", Key: "type", Value: "postgres"})
setting2 := newUnstructuredSetting("test-namespace", Setting{Section: "server", Key: "port", Value: "3000"})
setting3 := newUnstructuredSetting("test-namespace", Setting{Section: "database", Key: "host", Value: "localhost"})
client := newTestClient(500, setting1, setting2, setting3)
// Create a selector that should match only database settings
selector := metav1.LabelSelector{
MatchLabels: map[string]string{
"section": "database",
},
}
ctx := request.WithNamespace(context.Background(), "test-namespace")
result, err := client.ListAsIni(ctx, selector)
require.NoError(t, err)
assert.NotNil(t, result)
// Should only have database settings, not server settings
assert.True(t, result.HasSection("database"))
assert.Equal(t, "postgres", result.Section("database").Key("type").String())
assert.Equal(t, "localhost", result.Section("database").Key("host").String())
// Should NOT have server settings
assert.False(t, result.HasSection("server"))
})
t.Run("should return all settings with empty selector", func(t *testing.T) {
// Create multiple settings across different sections
setting1 := newUnstructuredSetting("test-namespace", Setting{Section: "server", Key: "port", Value: "3000"})
setting2 := newUnstructuredSetting("test-namespace", Setting{Section: "database", Key: "type", Value: "mysql"})
client := newTestClient(500, setting1, setting2)
// Empty selector should select everything
selector := metav1.LabelSelector{}
settings := []Setting{
{Section: "server", Key: "port", Value: "3000"},
{Section: "database", Key: "type", Value: "mysql"},
}
server := newTestServer(t, settings, "")
defer server.Close()
client := newTestClient(t, server.URL, 500)
ctx := request.WithNamespace(context.Background(), "test-namespace")
result, err := client.ListAsIni(ctx, selector)
result, err := client.ListAsIni(ctx, metav1.LabelSelector{})
require.NoError(t, err)
assert.NotNil(t, result)
// Should have all settings from all sections
assert.True(t, result.HasSection("server"))
assert.Equal(t, "3000", result.Section("server").Key("port").String())
assert.True(t, result.HasSection("database"))
@@ -73,209 +40,168 @@ func TestRemoteSettingService_ListAsIni(t *testing.T) {
func TestRemoteSettingService_List(t *testing.T) {
t.Run("should handle single page response", func(t *testing.T) {
setting := newUnstructuredSetting("test-namespace", Setting{Section: "server", Key: "port", Value: "3000"})
client := newTestClient(500, setting)
settings := []Setting{
{Section: "server", Key: "port", Value: "3000"},
}
server := newTestServer(t, settings, "")
defer server.Close()
client := newTestClient(t, server.URL, 500)
ctx := request.WithNamespace(context.Background(), "test-namespace")
result, err := client.List(ctx, metav1.LabelSelector{})
require.NoError(t, err)
assert.Len(t, result, 1)
spec := result[0]
assert.Equal(t, "server", spec.Section)
assert.Equal(t, "port", spec.Key)
assert.Equal(t, "3000", spec.Value)
assert.Equal(t, "server", result[0].Section)
assert.Equal(t, "port", result[0].Key)
assert.Equal(t, "3000", result[0].Value)
})
t.Run("should handle multiple pages", func(t *testing.T) {
totalPages := 3
pageSize := 5
pages := make([][]*unstructured.Unstructured, totalPages)
for pageNum := 0; pageNum < totalPages; pageNum++ {
for idx := 0; idx < pageSize; idx++ {
item := newUnstructuredSetting(
"test-namespace",
Setting{
Section: fmt.Sprintf("section-%d", pageNum),
Key: fmt.Sprintf("key-%d", idx),
Value: fmt.Sprintf("val-%d-%d", pageNum, idx),
},
)
pages[pageNum] = append(pages[pageNum], item)
}
}
scheme := runtime.NewScheme()
dynamicClient := fake.NewSimpleDynamicClientWithCustomListKinds(scheme, settingGroupListKind)
listCallCount := 0
dynamicClient.PrependReactor("list", "settings", func(action k8testing.Action) (handled bool, ret runtime.Object, err error) {
listCallCount++
continueToken := fmt.Sprintf("continue-%d", listCallCount)
if listCallCount == totalPages {
continueToken = ""
}
if listCallCount <= totalPages {
list := &unstructured.UnstructuredList{
Object: map[string]interface{}{
"apiVersion": ApiGroup + "/" + apiVersion,
"kind": listKind,
},
}
list.SetContinue(continueToken)
for _, item := range pages[listCallCount-1] {
list.Items = append(list.Items, *item)
}
return true, list, nil
}
return false, nil, nil
})
client := &remoteSettingService{
dynamicClient: dynamicClient,
pageSize: int64(pageSize),
log: log.NewNopLogger(),
metrics: initMetrics(),
t.Run("should handle multiple settings", func(t *testing.T) {
settings := []Setting{
{Section: "server", Key: "port", Value: "3000"},
{Section: "database", Key: "host", Value: "localhost"},
{Section: "database", Key: "port", Value: "5432"},
}
server := newTestServer(t, settings, "")
defer server.Close()
client := newTestClient(t, server.URL, 500)
ctx := request.WithNamespace(context.Background(), "test-namespace")
result, err := client.List(ctx, metav1.LabelSelector{})
require.NoError(t, err)
assert.Len(t, result, totalPages*pageSize)
assert.Equal(t, totalPages, listCallCount)
assert.Len(t, result, 3)
})
t.Run("should pass label selector when provided", func(t *testing.T) {
scheme := runtime.NewScheme()
dynamicClient := fake.NewSimpleDynamicClientWithCustomListKinds(scheme, settingGroupListKind)
dynamicClient.PrependReactor("list", "settings", func(action k8testing.Action) (handled bool, ret runtime.Object, err error) {
listAction := action.(k8testing.ListActionImpl)
assert.Equal(t, "app=grafana", listAction.ListOptions.LabelSelector)
return true, &unstructured.UnstructuredList{}, nil
})
client := &remoteSettingService{
dynamicClient: dynamicClient,
pageSize: 500,
log: log.NewNopLogger(),
metrics: initMetrics(),
t.Run("should handle pagination with continue token", func(t *testing.T) {
// First page
page1Settings := []Setting{
{Section: "section-0", Key: "key-0", Value: "value-0"},
{Section: "section-0", Key: "key-1", Value: "value-1"},
}
// Second page
page2Settings := []Setting{
{Section: "section-1", Key: "key-0", Value: "value-2"},
{Section: "section-1", Key: "key-1", Value: "value-3"},
}
requestCount := 0
server := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
requestCount++
continueToken := r.URL.Query().Get("continue")
var settings []Setting
var nextContinue string
if continueToken == "" {
settings = page1Settings
nextContinue = "page2"
} else {
settings = page2Settings
nextContinue = ""
}
w.Header().Set("Content-Type", "application/json")
_, _ = w.Write([]byte(generateSettingsJSON(settings, nextContinue)))
}))
defer server.Close()
client := newTestClient(t, server.URL, 2)
ctx := request.WithNamespace(context.Background(), "test-namespace")
_, err := client.List(ctx, metav1.LabelSelector{MatchLabels: map[string]string{"app": "grafana"}})
result, err := client.List(ctx, metav1.LabelSelector{})
require.NoError(t, err)
assert.Len(t, result, 4)
assert.Equal(t, 2, requestCount)
})
t.Run("should stop pagination at 1000 pages", func(t *testing.T) {
scheme := runtime.NewScheme()
dynamicClient := fake.NewSimpleDynamicClientWithCustomListKinds(scheme, settingGroupListKind)
listCallCount := 0
dynamicClient.PrependReactor("list", "settings", func(action k8testing.Action) (handled bool, ret runtime.Object, err error) {
listCallCount++
// Always return a continue token to simulate infinite pagination
list := &unstructured.UnstructuredList{}
list.SetContinue("continue-forever")
return true, list, nil
})
t.Run("should return error when namespace is missing", func(t *testing.T) {
server := newTestServer(t, nil, "")
defer server.Close()
client := &remoteSettingService{
dynamicClient: dynamicClient,
pageSize: 10,
log: log.NewNopLogger(),
metrics: initMetrics(),
}
client := newTestClient(t, server.URL, 500)
ctx := context.Background() // No namespace
ctx := request.WithNamespace(context.Background(), "test-namespace")
_, err := client.List(ctx, metav1.LabelSelector{})
require.NoError(t, err)
assert.Equal(t, 1000, listCallCount, "Should stop at 1000 pages to prevent infinite loops")
})
t.Run("should return error when parsing setting fails", func(t *testing.T) {
scheme := runtime.NewScheme()
dynamicClient := fake.NewSimpleDynamicClientWithCustomListKinds(scheme, settingGroupListKind)
dynamicClient.PrependReactor("list", "settings", func(action k8testing.Action) (handled bool, ret runtime.Object, err error) {
// Return a malformed setting without spec
list := &unstructured.UnstructuredList{
Object: map[string]interface{}{
"apiVersion": ApiGroup + "/" + apiVersion,
"kind": listKind,
},
}
malformedSetting := &unstructured.Unstructured{
Object: map[string]interface{}{
"apiVersion": ApiGroup + "/" + apiVersion,
"kind": kind,
"metadata": map[string]interface{}{
"name": "malformed",
"namespace": "test-namespace",
},
// Missing spec
},
}
list.Items = append(list.Items, *malformedSetting)
return true, list, nil
})
client := &remoteSettingService{
dynamicClient: dynamicClient,
pageSize: 500,
log: log.NewNopLogger(),
metrics: initMetrics(),
}
ctx := request.WithNamespace(context.Background(), "test-namespace")
result, err := client.List(ctx, metav1.LabelSelector{})
require.Error(t, err)
assert.Nil(t, result)
assert.Contains(t, err.Error(), "spec not found")
})
}
func TestParseSettingResource(t *testing.T) {
t.Run("should parse valid setting resource", func(t *testing.T) {
setting := newUnstructuredSetting("test-namespace", Setting{Section: "database", Key: "type", Value: "postgres"})
result, err := parseSettingResource(setting)
require.NoError(t, err)
assert.NotNil(t, result)
assert.Equal(t, "database", result.Section)
assert.Equal(t, "type", result.Key)
assert.Equal(t, "postgres", result.Value)
assert.Contains(t, err.Error(), "missing namespace")
})
t.Run("should return error when spec is missing", func(t *testing.T) {
setting := &unstructured.Unstructured{
Object: map[string]interface{}{
"apiVersion": ApiGroup + "/" + apiVersion,
"kind": kind,
"metadata": map[string]interface{}{
"name": "test-setting",
"namespace": "test-namespace",
},
// No spec
},
}
t.Run("should return error on HTTP error", func(t *testing.T) {
server := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
w.WriteHeader(http.StatusInternalServerError)
_, _ = w.Write([]byte("internal server error"))
}))
defer server.Close()
result, err := parseSettingResource(setting)
client := newTestClient(t, server.URL, 500)
ctx := request.WithNamespace(context.Background(), "test-namespace")
result, err := client.List(ctx, metav1.LabelSelector{})
require.Error(t, err)
assert.Nil(t, result)
assert.Contains(t, err.Error(), "spec not found")
})
}
func TestRemoteSettingService_ToIni(t *testing.T) {
func TestParseSettingList(t *testing.T) {
t.Run("should parse valid settings list", func(t *testing.T) {
jsonData := `{
"apiVersion": "setting.grafana.app/v0alpha1",
"kind": "SettingList",
"metadata": {"continue": ""},
"items": [
{"spec": {"section": "database", "key": "type", "value": "postgres"}},
{"spec": {"section": "server", "key": "port", "value": "3000"}}
]
}`
settings, continueToken, err := parseSettingList(strings.NewReader(jsonData))
require.NoError(t, err)
assert.Len(t, settings, 2)
assert.Equal(t, "", continueToken)
assert.Equal(t, "database", settings[0].Section)
assert.Equal(t, "type", settings[0].Key)
assert.Equal(t, "postgres", settings[0].Value)
})
t.Run("should parse continue token", func(t *testing.T) {
jsonData := `{
"apiVersion": "setting.grafana.app/v0alpha1",
"kind": "SettingList",
"metadata": {"continue": "next-page-token"},
"items": []
}`
_, continueToken, err := parseSettingList(strings.NewReader(jsonData))
require.NoError(t, err)
assert.Equal(t, "next-page-token", continueToken)
})
t.Run("should handle empty items", func(t *testing.T) {
jsonData := `{
"apiVersion": "setting.grafana.app/v0alpha1",
"kind": "SettingList",
"metadata": {},
"items": []
}`
settings, _, err := parseSettingList(strings.NewReader(jsonData))
require.NoError(t, err)
assert.Len(t, settings, 0)
})
}
func TestToIni(t *testing.T) {
t.Run("should convert settings to ini format", func(t *testing.T) {
settings := []*Setting{
{Section: "database", Key: "type", Value: "postgres"},
@@ -283,12 +209,7 @@ func TestRemoteSettingService_ToIni(t *testing.T) {
{Section: "server", Key: "http_port", Value: "3000"},
}
client := &remoteSettingService{
pageSize: 500,
log: log.NewNopLogger(),
}
result, err := client.toIni(settings)
result, err := toIni(settings)
require.NoError(t, err)
assert.NotNil(t, result)
@@ -302,12 +223,7 @@ func TestRemoteSettingService_ToIni(t *testing.T) {
t.Run("should handle empty settings list", func(t *testing.T) {
var settings []*Setting
client := &remoteSettingService{
pageSize: 500,
log: log.NewNopLogger(),
}
result, err := client.toIni(settings)
result, err := toIni(settings)
require.NoError(t, err)
assert.NotNil(t, result)
@@ -315,35 +231,13 @@ func TestRemoteSettingService_ToIni(t *testing.T) {
assert.Len(t, sections, 1) // Only default section
})
t.Run("should create section if it does not exist", func(t *testing.T) {
settings := []*Setting{
{Section: "new_section", Key: "new_key", Value: "new_value"},
}
client := &remoteSettingService{
pageSize: 500,
log: log.NewNopLogger(),
}
result, err := client.toIni(settings)
require.NoError(t, err)
assert.True(t, result.HasSection("new_section"))
assert.Equal(t, "new_value", result.Section("new_section").Key("new_key").String())
})
t.Run("should handle multiple keys in same section", func(t *testing.T) {
settings := []*Setting{
{Section: "auth", Key: "disable_login_form", Value: "false"},
{Section: "auth", Key: "disable_signout_menu", Value: "true"},
}
client := &remoteSettingService{
pageSize: 500,
log: log.NewNopLogger(),
}
result, err := client.toIni(settings)
result, err := toIni(settings)
require.NoError(t, err)
assert.True(t, result.HasSection("auth"))
@@ -383,24 +277,23 @@ func TestNew(t *testing.T) {
assert.Equal(t, int64(100), remoteClient.pageSize)
})
t.Run("should use default page size when zero is provided", func(t *testing.T) {
t.Run("should create client with custom QPS and Burst", func(t *testing.T) {
config := Config{
URL: "https://example.com",
WrapTransport: func(rt http.RoundTripper) http.RoundTripper { return rt },
PageSize: 0,
QPS: 50.0,
Burst: 100,
}
client, err := New(config)
require.NoError(t, err)
assert.NotNil(t, client)
remoteClient := client.(*remoteSettingService)
assert.Equal(t, DefaultPageSize, remoteClient.pageSize)
})
t.Run("should return error when config is invalid", func(t *testing.T) {
t.Run("should return error when URL is empty", func(t *testing.T) {
config := Config{
URL: "", // Invalid: empty URL
URL: "",
}
client, err := New(config)
@@ -409,134 +302,126 @@ func TestNew(t *testing.T) {
assert.Nil(t, client)
assert.Contains(t, err.Error(), "URL cannot be empty")
})
}
func TestGetDynamicClient(t *testing.T) {
logger := log.NewNopLogger()
t.Run("should return error when SettingServiceURL is empty", func(t *testing.T) {
config := Config{
URL: "",
WrapTransport: func(rt http.RoundTripper) http.RoundTripper { return rt },
}
client, err := getDynamicClient(config, logger)
require.Error(t, err)
assert.Nil(t, client)
assert.Contains(t, err.Error(), "URL cannot be empty")
})
t.Run("should return error when both TokenExchangeClient and WrapTransport are nil", func(t *testing.T) {
t.Run("should return error when auth is not configured", func(t *testing.T) {
config := Config{
URL: "https://example.com",
TokenExchangeClient: nil,
WrapTransport: nil,
}
client, err := getDynamicClient(config, logger)
client, err := New(config)
require.Error(t, err)
assert.Nil(t, client)
assert.Contains(t, err.Error(), "must set either TokenExchangeClient or WrapTransport")
})
t.Run("should create client with WrapTransport", func(t *testing.T) {
config := Config{
URL: "https://example.com",
WrapTransport: func(rt http.RoundTripper) http.RoundTripper { return rt },
}
client, err := getDynamicClient(config, logger)
require.NoError(t, err)
assert.NotNil(t, client)
})
t.Run("should not fail when QPS and Burst are not provided", func(t *testing.T) {
config := Config{
URL: "https://example.com",
WrapTransport: func(rt http.RoundTripper) http.RoundTripper { return rt },
}
client, err := getDynamicClient(config, logger)
require.NoError(t, err)
assert.NotNil(t, client)
})
t.Run("should not fail when custom QPS and Burst are provided", func(t *testing.T) {
config := Config{
URL: "https://example.com",
WrapTransport: func(rt http.RoundTripper) http.RoundTripper { return rt },
QPS: 10.0,
Burst: 20,
}
client, err := getDynamicClient(config, logger)
require.NoError(t, err)
assert.NotNil(t, client)
})
t.Run("should use WrapTransport when both WrapTransport and TokenExchangeClient are provided", func(t *testing.T) {
t.Run("should use WrapTransport when provided", func(t *testing.T) {
wrapTransportCalled := false
tokenExchangeClient := &authlib.TokenExchangeClient{}
config := Config{
URL: "https://example.com",
TokenExchangeClient: tokenExchangeClient,
URL: "https://example.com",
WrapTransport: func(rt http.RoundTripper) http.RoundTripper {
wrapTransportCalled = true
return rt
},
}
client, err := getDynamicClient(config, logger)
client, err := New(config)
require.NoError(t, err)
assert.NotNil(t, client)
assert.True(t, wrapTransportCalled, "WrapTransport should be called and take precedence over TokenExchangeClient")
assert.True(t, wrapTransportCalled)
})
}
// Helper function to create an unstructured Setting object for tests
func newUnstructuredSetting(namespace string, spec Setting) *unstructured.Unstructured {
// Generate resource name in the format {section}--{key}
name := fmt.Sprintf("%s--%s", spec.Section, spec.Key)
// Helper functions
obj := &unstructured.Unstructured{
Object: map[string]interface{}{
"apiVersion": ApiGroup + "/" + apiVersion,
"kind": kind,
"metadata": map[string]interface{}{
"name": name,
"namespace": namespace,
},
"spec": map[string]interface{}{
"section": spec.Section,
"key": spec.Key,
"value": spec.Value,
},
},
}
// Always set section and key labels
obj.SetLabels(map[string]string{
"section": spec.Section,
"key": spec.Key,
})
return obj
func newTestServer(t *testing.T, settings []Setting, continueToken string) *httptest.Server {
t.Helper()
return httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
w.Header().Set("Content-Type", "application/json")
_, _ = w.Write([]byte(generateSettingsJSON(settings, continueToken)))
}))
}
// Helper function to create a test client with the dynamic fake client
func newTestClient(pageSize int64, objects ...runtime.Object) *remoteSettingService {
scheme := runtime.NewScheme()
dynamicClient := fake.NewSimpleDynamicClientWithCustomListKinds(scheme, settingGroupListKind, objects...)
func newTestClient(t *testing.T, serverURL string, pageSize int64) Service {
t.Helper()
config := Config{
URL: serverURL,
WrapTransport: func(rt http.RoundTripper) http.RoundTripper { return rt },
PageSize: pageSize,
}
client, err := New(config)
require.NoError(t, err)
return client
}
return &remoteSettingService{
dynamicClient: dynamicClient,
pageSize: pageSize,
log: log.NewNopLogger(),
metrics: initMetrics(),
func generateSettingsJSON(settings []Setting, continueToken string) string {
var sb strings.Builder
sb.WriteString(fmt.Sprintf(`{"apiVersion":"setting.grafana.app/v0alpha1","kind":"SettingList","metadata":{"continue":"%s"},"items":[`, continueToken))
for i, s := range settings {
if i > 0 {
sb.WriteString(",")
}
sb.WriteString(fmt.Sprintf(
`{"apiVersion":"setting.grafana.app/v0alpha1","kind":"Setting","metadata":{"name":"%s--%s","namespace":"test-namespace"},"spec":{"section":"%s","key":"%s","value":"%s"}}`,
s.Section, s.Key, s.Section, s.Key, s.Value,
))
}
sb.WriteString(`]}`)
return sb.String()
}
// Benchmark tests for streaming JSON parser
func BenchmarkParseSettingList(b *testing.B) {
jsonData := generateSettingListJSON(4000, 100)
jsonBytes := []byte(jsonData)
b.ResetTimer()
b.ReportAllocs()
for i := 0; i < b.N; i++ {
reader := bytes.NewReader(jsonBytes)
_, _, _ = parseSettingList(reader)
}
}
func BenchmarkParseSettingList_SinglePage(b *testing.B) {
jsonData := generateSettingListJSON(500, 50)
jsonBytes := []byte(jsonData)
b.ResetTimer()
b.ReportAllocs()
for i := 0; i < b.N; i++ {
reader := bytes.NewReader(jsonBytes)
_, _, _ = parseSettingList(reader)
}
}
// generateSettingListJSON generates a K8s-style SettingList JSON response for benchmarks
func generateSettingListJSON(totalSettings, numSections int) string {
var sb strings.Builder
sb.WriteString(`{"apiVersion":"setting.grafana.app/v0alpha1","kind":"SettingList","metadata":{"continue":""},"items":[`)
settingsPerSection := totalSettings / numSections
first := true
for section := 0; section < numSections; section++ {
for key := 0; key < settingsPerSection; key++ {
if !first {
sb.WriteString(",")
}
first = false
sb.WriteString(fmt.Sprintf(
`{"apiVersion":"setting.grafana.app/v0alpha1","kind":"Setting","metadata":{"name":"section-%03d--key-%03d","namespace":"bench-ns"},"spec":{"section":"section-%03d","key":"key-%03d","value":"value-for-section-%d-key-%d"}}`,
section, key, section, key, section, key,
))
}
}
sb.WriteString(`]}`)
return sb.String()
}


@@ -165,5 +165,7 @@ func (oss *OSSMigrations) AddMigration(mg *Migrator) {
ualert.AddStateAnnotationsColumn(mg)
ualert.CollateBinAlertRuleNamespace(mg)
ualert.CollateBinAlertRuleGroup(mg)
}


@@ -0,0 +1,10 @@
package ualert
import "github.com/grafana/grafana/pkg/services/sqlstore/migrator"
// CollateBinAlertRuleNamespace ensures that the namespace_uid column collates the same way Go sorts strings.
func CollateBinAlertRuleNamespace(mg *migrator.Migrator) {
mg.AddMigration("ensure namespace_uid column sorts the same way as golang", migrator.NewRawSQLMigration("").
Mysql("ALTER TABLE alert_rule MODIFY namespace_uid VARCHAR(40) CHARACTER SET utf8mb4 COLLATE utf8mb4_bin NOT NULL;").
Postgres(`ALTER TABLE alert_rule ALTER COLUMN namespace_uid SET DATA TYPE varchar(40) COLLATE "C";`))
}


@@ -102,7 +102,7 @@ func (cfg *Cfg) processPreinstallPlugins(rawInstallPlugins []string, preinstallP
if len(parts) > 1 {
version = parts[1]
if len(parts) > 2 {
url = parts[2]
url = strings.Join(parts[2:], "@")
}
}


@@ -210,6 +210,11 @@ func Test_readPluginSettings(t *testing.T) {
rawInput: "plugin1@@https://example.com/plugin1.tar.gz",
expected: append(defaultPreinstallPluginsList, InstallPlugin{ID: "plugin1", Version: "", URL: "https://example.com/plugin1.tar.gz"}),
},
{
name: "should parse a plugin with credentials in the URL",
rawInput: "plugin1@@https://username:password@example.com/plugin1.tar.gz",
expected: append(defaultPreinstallPluginsList, InstallPlugin{ID: "plugin1", Version: "", URL: "https://username:password@example.com/plugin1.tar.gz"}),
},
{
name: "when preinstall_async is false, should add all plugins to preinstall_sync",
rawInput: "plugin1",


@@ -110,15 +110,24 @@ func (cfg *Cfg) readZanzanaSettings() {
zc.Mode = "embedded"
}
zc.Token = clientSec.Key("token").MustString("")
zc.TokenExchangeURL = clientSec.Key("token_exchange_url").MustString("")
zc.Addr = clientSec.Key("address").MustString("")
zc.ServerCertFile = clientSec.Key("tls_cert").MustString("")
// TODO: read Token and TokenExchangeURL from grpc_client_authentication section
grpcClientAuthSection := cfg.SectionWithEnvOverrides("grpc_client_authentication")
zc.Token = grpcClientAuthSection.Key("token").MustString("")
zc.TokenExchangeURL = grpcClientAuthSection.Key("token_exchange_url").MustString("")
zc.TokenNamespace = grpcClientAuthSection.Key("token_namespace").MustString("stacks-" + cfg.StackID)
// TODO: remove old settings when migrated
token := clientSec.Key("token").MustString("")
tokenExchangeURL := clientSec.Key("token_exchange_url").MustString("")
if token != "" {
zc.Token = token
}
if tokenExchangeURL != "" {
zc.TokenExchangeURL = tokenExchangeURL
}
cfg.ZanzanaClient = zc
zs := ZanzanaServerSettings{}
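The hunk above follows a common config-migration pattern: read the value from the new `grpc_client_authentication` section first, then let a non-empty value from the legacy section override it until the old keys are removed. A minimal TypeScript sketch of that precedence rule (hypothetical helper, not Grafana's API):

```typescript
// Prefer the legacy value while it is still set, otherwise fall back
// to the new section's value. Mirrors the token/token_exchange_url
// handling in readZanzanaSettings above.
function resolveWithLegacyOverride(newValue: string, legacyValue: string): string {
  return legacyValue !== '' ? legacyValue : newValue;
}
```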


@@ -57,11 +57,11 @@ func TestIntegrationPluginMeta(t *testing.T) {
foundIDs := make(map[string]bool)
for _, item := range response.Result.Items {
require.NotNil(t, item.Spec.PluginJSON)
foundIDs[item.Spec.PluginJSON.Id] = true
require.NotEmpty(t, item.Spec.PluginJSON.Id)
require.NotEmpty(t, item.Spec.PluginJSON.Type)
require.NotEmpty(t, item.Spec.PluginJSON.Name)
require.NotNil(t, item.Spec.PluginJson)
foundIDs[item.Spec.PluginJson.Id] = true
require.NotEmpty(t, item.Spec.PluginJson.Id)
require.NotEmpty(t, item.Spec.PluginJson.Type)
require.NotEmpty(t, item.Spec.PluginJson.Name)
}
require.True(t, foundIDs["grafana-piechart-panel"])
require.True(t, foundIDs["grafana-clock-panel"])
@@ -109,10 +109,10 @@ func TestIntegrationPluginMeta(t *testing.T) {
}, &pluginsv0alpha1.Meta{})
require.NotNil(t, response.Result)
require.NotNil(t, response.Result.Spec.PluginJSON)
require.Equal(t, "grafana-piechart-panel", response.Result.Spec.PluginJSON.Id)
require.NotEmpty(t, response.Result.Spec.PluginJSON.Name)
require.NotEmpty(t, response.Result.Spec.PluginJSON.Type)
require.NotNil(t, response.Result.Spec.PluginJson)
require.Equal(t, "grafana-piechart-panel", response.Result.Spec.PluginJson.Id)
require.NotEmpty(t, response.Result.Spec.PluginJson.Name)
require.NotEmpty(t, response.Result.Spec.PluginJson.Type)
})
t.Run("get plugin meta for non-existent plugin", func(t *testing.T) {


@@ -0,0 +1,211 @@
package provisioning
import (
"context"
"testing"
"github.com/stretchr/testify/require"
metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
"k8s.io/apimachinery/pkg/apis/meta/v1/unstructured"
provisioning "github.com/grafana/grafana/apps/provisioning/pkg/apis/provisioning/v0alpha1"
"github.com/grafana/grafana/pkg/util/testutil"
)
// TestIntegrationProvisioning_RepositoryFieldSelector tests that fieldSelector
// works correctly for Repository resources. This prevents regression where
// fieldSelector=metadata.name=<name> was not working properly.
func TestIntegrationProvisioning_RepositoryFieldSelector(t *testing.T) {
testutil.SkipIntegrationTestInShortMode(t)
helper := runGrafana(t)
ctx := context.Background()
// Create multiple repositories for testing
repo1Name := "repo-selector-test-1"
repo2Name := "repo-selector-test-2"
repo3Name := "repo-selector-test-3"
// Create first repository
repo1 := helper.RenderObject(t, "testdata/local-write.json.tmpl", map[string]any{
"Name": repo1Name,
"SyncEnabled": false, // Disable sync to speed up test
})
_, err := helper.Repositories.Resource.Create(ctx, repo1, metav1.CreateOptions{})
require.NoError(t, err, "failed to create first repository")
helper.WaitForHealthyRepository(t, repo1Name)
// Create second repository
repo2 := helper.RenderObject(t, "testdata/local-write.json.tmpl", map[string]any{
"Name": repo2Name,
"SyncEnabled": false,
})
_, err = helper.Repositories.Resource.Create(ctx, repo2, metav1.CreateOptions{})
require.NoError(t, err, "failed to create second repository")
helper.WaitForHealthyRepository(t, repo2Name)
// Create third repository
repo3 := helper.RenderObject(t, "testdata/local-write.json.tmpl", map[string]any{
"Name": repo3Name,
"SyncEnabled": false,
})
_, err = helper.Repositories.Resource.Create(ctx, repo3, metav1.CreateOptions{})
require.NoError(t, err, "failed to create third repository")
helper.WaitForHealthyRepository(t, repo3Name)
// Verify all repositories were created
allRepos, err := helper.Repositories.Resource.List(ctx, metav1.ListOptions{})
require.NoError(t, err, "should be able to list all repositories")
require.GreaterOrEqual(t, len(allRepos.Items), 3, "should have at least 3 repositories")
t.Run("should filter by metadata.name and return single repository", func(t *testing.T) {
list, err := helper.Repositories.Resource.List(ctx, metav1.ListOptions{
FieldSelector: "metadata.name=" + repo2Name,
})
require.NoError(t, err, "fieldSelector query should succeed")
require.Len(t, list.Items, 1, "should return exactly one repository")
require.Equal(t, repo2Name, list.Items[0].GetName(), "should return the correct repository")
})
t.Run("should filter by different metadata.name", func(t *testing.T) {
list, err := helper.Repositories.Resource.List(ctx, metav1.ListOptions{
FieldSelector: "metadata.name=" + repo1Name,
})
require.NoError(t, err, "fieldSelector query should succeed")
require.Len(t, list.Items, 1, "should return exactly one repository")
require.Equal(t, repo1Name, list.Items[0].GetName(), "should return the first repository")
})
t.Run("should return empty when fieldSelector does not match any repository", func(t *testing.T) {
list, err := helper.Repositories.Resource.List(ctx, metav1.ListOptions{
FieldSelector: "metadata.name=non-existent-repository",
})
require.NoError(t, err, "fieldSelector query should succeed even with no matches")
require.Empty(t, list.Items, "should return empty list when no repositories match")
})
t.Run("listing without fieldSelector should return all repositories", func(t *testing.T) {
list, err := helper.Repositories.Resource.List(ctx, metav1.ListOptions{})
require.NoError(t, err, "should be able to list without fieldSelector")
require.GreaterOrEqual(t, len(list.Items), 3, "should return all repositories when no filter is applied")
// Verify our test repositories are in the list
names := make(map[string]bool)
for _, item := range list.Items {
names[item.GetName()] = true
}
require.True(t, names[repo1Name], "should contain repo1")
require.True(t, names[repo2Name], "should contain repo2")
require.True(t, names[repo3Name], "should contain repo3")
})
}
// TestIntegrationProvisioning_JobFieldSelector tests that fieldSelector
// works correctly for Job resources. This prevents regression where
// fieldSelector=metadata.name=<name> was not working properly.
func TestIntegrationProvisioning_JobFieldSelector(t *testing.T) {
testutil.SkipIntegrationTestInShortMode(t)
helper := runGrafana(t)
ctx := context.Background()
// Create a repository to trigger jobs
repoName := "job-selector-test-repo"
repo := helper.RenderObject(t, "testdata/local-write.json.tmpl", map[string]any{
"Name": repoName,
"SyncEnabled": false,
})
_, err := helper.Repositories.Resource.Create(ctx, repo, metav1.CreateOptions{})
require.NoError(t, err, "failed to create repository")
helper.WaitForHealthyRepository(t, repoName)
// Copy some test files to trigger jobs
helper.CopyToProvisioningPath(t, "testdata/all-panels.json", "job-test-dashboard-1.json")
helper.CopyToProvisioningPath(t, "testdata/text-options.json", "job-test-dashboard-2.json")
// Trigger multiple jobs to have multiple job resources
job1Spec := provisioning.JobSpec{
Action: provisioning.JobActionPull,
Pull: &provisioning.SyncJobOptions{},
}
// Trigger first job
body1 := asJSON(job1Spec)
result1 := helper.AdminREST.Post().
Namespace("default").
Resource("repositories").
Name(repoName).
SubResource("jobs").
Body(body1).
SetHeader("Content-Type", "application/json").
Do(ctx)
require.NoError(t, result1.Error(), "should be able to trigger first job")
obj1, err := result1.Get()
require.NoError(t, err, "should get first job object")
job1 := obj1.(*unstructured.Unstructured)
job1Name := job1.GetName()
require.NotEmpty(t, job1Name, "first job should have a name")
// Wait for first job to complete before starting second
helper.AwaitJobs(t, repoName)
// Trigger second job
result2 := helper.AdminREST.Post().
Namespace("default").
Resource("repositories").
Name(repoName).
SubResource("jobs").
Body(body1).
SetHeader("Content-Type", "application/json").
Do(ctx)
require.NoError(t, result2.Error(), "should be able to trigger second job")
obj2, err := result2.Get()
require.NoError(t, err, "should get second job object")
job2 := obj2.(*unstructured.Unstructured)
job2Name := job2.GetName()
require.NotEmpty(t, job2Name, "second job should have a name")
t.Run("should filter by metadata.name and return single job", func(t *testing.T) {
// Note: Jobs are ephemeral and may complete quickly, so we test while they exist
list, err := helper.Jobs.Resource.List(ctx, metav1.ListOptions{
FieldSelector: "metadata.name=" + job2Name,
})
require.NoError(t, err, "fieldSelector query should succeed")
// The job might have completed already, but if it exists, it should be the only one
if len(list.Items) > 0 {
require.Len(t, list.Items, 1, "should return at most one job")
require.Equal(t, job2Name, list.Items[0].GetName(), "should return the correct job")
}
})
t.Run("should filter by different metadata.name", func(t *testing.T) {
list, err := helper.Jobs.Resource.List(ctx, metav1.ListOptions{
FieldSelector: "metadata.name=" + job1Name,
})
require.NoError(t, err, "fieldSelector query should succeed")
// The job might have completed already, but if it exists, it should be the only one
if len(list.Items) > 0 {
require.Len(t, list.Items, 1, "should return at most one job")
require.Equal(t, job1Name, list.Items[0].GetName(), "should return the first job")
}
})
t.Run("should return empty when fieldSelector does not match any job", func(t *testing.T) {
list, err := helper.Jobs.Resource.List(ctx, metav1.ListOptions{
FieldSelector: "metadata.name=non-existent-job",
})
require.NoError(t, err, "fieldSelector query should succeed even with no matches")
require.Empty(t, list.Items, "should return empty list when no jobs match")
})
t.Run("listing without fieldSelector should work", func(t *testing.T) {
list, err := helper.Jobs.Resource.List(ctx, metav1.ListOptions{})
require.NoError(t, err, "should be able to list without fieldSelector")
// Jobs may have completed, so we don't assert on count, just that the query works
t.Logf("Found %d active jobs without filter", len(list.Items))
})
}
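The tests above exercise server-side `fieldSelector=metadata.name=<name>` filtering. As a toy model of the behavior being verified, filtering a list by that selector looks like this (illustrative only; the real filtering happens in the apiserver storage layer):

```typescript
// Toy client-side equivalent of the metadata.name fieldSelector:
// an exact-match filter over a list of objects.
interface K8sObject {
  metadata: { name: string };
}

function applyFieldSelector(items: K8sObject[], selector: string): K8sObject[] {
  const [field, value] = selector.split('=');
  if (field !== 'metadata.name') {
    return items; // only metadata.name is modeled here
  }
  return items.filter((item) => item.metadata.name === value);
}
```

A matching name yields exactly one item; a non-existent name yields an empty list rather than an error, which is what the regression tests assert.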


@@ -59,7 +59,7 @@ function AnalyzeRuleButtonView({
});
openAssistant({
origin: 'alerting',
origin: 'alerting/analyze-rule-menu-item',
mode: 'assistant',
prompt: analyzeRulePrompt,
context: [alertContext],


@@ -40,26 +40,44 @@ interface ShowMoreInstancesProps {
stats: ShowMoreStats;
onClick?: React.ComponentProps<typeof LinkButton>['onClick'];
href?: React.ComponentProps<typeof LinkButton>['href'];
enableFiltering?: boolean;
alertState?: InstanceStateFilter;
}
function ShowMoreInstances({ stats, onClick, href }: ShowMoreInstancesProps) {
function ShowMoreInstances({ stats, onClick, href, enableFiltering, alertState }: ShowMoreInstancesProps) {
const styles = useStyles2(getStyles);
const { visibleItemsCount, totalItemsCount } = stats;
return (
<div className={styles.footerRow}>
<div>
<Trans
i18nKey="alerting.rule-details-matching-instances.showing-count"
values={{ visibleItemsCount, totalItemsCount }}
>
Showing {{ visibleItemsCount }} out of {{ totalItemsCount }} instances
</Trans>
{enableFiltering && alertState ? (
<Trans
i18nKey="alerting.rule-details-matching-instances.showing-count-with-state"
values={{ visibleItemsCount, alertState, totalItemsCount }}
>
Showing {{ visibleItemsCount }} {{ alertState }} out of {{ totalItemsCount }} instances
</Trans>
) : (
<Trans
i18nKey="alerting.rule-details-matching-instances.showing-count"
values={{ visibleItemsCount, totalItemsCount }}
>
Showing {{ visibleItemsCount }} out of {{ totalItemsCount }} instances
</Trans>
)}
</div>
<LinkButton size="sm" variant="secondary" data-testid="show-all" onClick={onClick} href={href}>
<Trans i18nKey="alerting.rule-details-matching-instances.button-show-all" values={{ totalItemsCount }}>
Show all {{ totalItemsCount }} alert instances
</Trans>
{enableFiltering ? (
<Trans i18nKey="alerting.rule-details-matching-instances.button-show-all">Show all</Trans>
) : (
<Trans
i18nKey="alerting.rule-details-matching-instances.button-show-all-instances"
values={{ totalItemsCount }}
>
Show all {{ totalItemsCount }} alert instances
</Trans>
)}
</LinkButton>
</div>
);
@@ -128,6 +146,8 @@ export function RuleDetailsMatchingInstances(props: Props) {
stats={stats}
onClick={enableFiltering ? resetFilter : undefined}
href={!enableFiltering ? ruleViewPageLink : undefined}
enableFiltering={enableFiltering}
alertState={alertState}
/>
) : undefined;


@@ -245,10 +245,29 @@ export const dashboardEditActions = {
description: t('dashboard.variable.description.action', 'Change variable description'),
prop: 'description',
}),
changeVariableHideValue: makeEditAction<SceneVariable, 'hide'>({
description: t('dashboard.variable.hide.action', 'Change variable hide option'),
prop: 'hide',
}),
changeVariableHideValue({ source, oldValue, newValue }: EditActionProps<SceneVariable, 'hide'>) {
const variableSet = source.parent;
const variablesBeforeChange =
variableSet instanceof SceneVariableSet ? [...(variableSet.state.variables ?? [])] : undefined;
dashboardEditActions.edit({
description: t('dashboard.variable.hide.action', 'Change variable hide option'),
source,
perform: () => {
source.setState({ hide: newValue });
// Updating the variables set since components that show/hide variables subscribe to the variable set, not the individual variables.
if (variableSet instanceof SceneVariableSet) {
variableSet.setState({ variables: [...(variableSet.state.variables ?? [])] });
}
},
undo: () => {
source.setState({ hide: oldValue });
if (variableSet instanceof SceneVariableSet && variablesBeforeChange) {
variableSet.setState({ variables: variablesBeforeChange });
}
},
});
},
moveElement(props: MoveElementActionHelperProps) {
const { movedObject, source, perform, undo } = props;
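The `changeVariableHideValue` rewrite above captures the variable list before mutating, then registers symmetric `perform`/`undo` closures. A generic TypeScript sketch of that edit-action pattern (all names here are illustrative, not Grafana's actual API):

```typescript
// Minimal undo stack built from perform/undo closures, mirroring the
// dashboardEditActions.edit shape above.
interface EditAction {
  description: string;
  perform: () => void;
  undo: () => void;
}

class UndoStack {
  private done: EditAction[] = [];

  run(action: EditAction): void {
    action.perform();
    this.done.push(action);
  }

  undo(): void {
    this.done.pop()?.undo();
  }
}
```

The key detail in the original change is that `undo` restores state captured *before* `perform` ran (`variablesBeforeChange`), so replaying the closure pair in either direction is deterministic.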


@@ -64,6 +64,7 @@ const getStyles = (theme: GrafanaTheme2) => ({
display: 'flex',
alignItems: 'center',
gap: theme.spacing(1),
padding: theme.spacing(1),
}),
controlWrapper: css({
height: theme.spacing(2),


@@ -82,19 +82,39 @@ export function VariableValueSelectWrapper({ variable, inMenu }: VariableSelectP
// For switch variables in menu, we want to show the switch on the left and the label on the right
if (inMenu && sceneUtils.isSwitchVariable(variable)) {
return (
<div className={styles.switchMenuContainer} data-testid={selectors.pages.Dashboard.SubMenu.submenuItem}>
<div
className={cx(
styles.switchMenuContainer,
isSelected && 'dashboard-selected-element',
isSelectable && !isSelected && 'dashboard-selectable-element'
)}
onPointerDown={onPointerDown}
data-testid={selectors.pages.Dashboard.SubMenu.submenuItem}
>
<div className={styles.switchControl}>
<variable.Component model={variable} />
</div>
<VariableLabel variable={variable} layout={'vertical'} className={styles.switchLabel} />
<VariableLabel
variable={variable}
layout={'vertical'}
className={cx(isSelectable && styles.labelSelectable, styles.switchLabel)}
/>
</div>
);
}
if (inMenu) {
return (
<div className={styles.verticalContainer} data-testid={selectors.pages.Dashboard.SubMenu.submenuItem}>
<VariableLabel variable={variable} layout={'vertical'} />
<div
className={cx(
styles.verticalContainer,
isSelected && 'dashboard-selected-element',
isSelectable && !isSelected && 'dashboard-selectable-element'
)}
onPointerDown={onPointerDown}
data-testid={selectors.pages.Dashboard.SubMenu.submenuItem}
>
<VariableLabel variable={variable} layout={'vertical'} className={cx(isSelectable && styles.labelSelectable)} />
<variable.Component model={variable} />
</div>
);
@@ -164,11 +184,13 @@ const getStyles = (theme: GrafanaTheme2) => ({
verticalContainer: css({
display: 'flex',
flexDirection: 'column',
padding: theme.spacing(1),
}),
switchMenuContainer: css({
display: 'flex',
alignItems: 'center',
gap: theme.spacing(1),
padding: theme.spacing(1),
}),
switchControl: css({
'& > div': {


@@ -1,4 +1,4 @@
import { css, cx } from '@emotion/css';
import { css } from '@emotion/css';
import { GrafanaTheme2 } from '@grafana/data';
import { SceneDataLayerProvider, SceneVariable } from '@grafana/scenes';
@@ -17,8 +17,6 @@ interface DashboardControlsMenuProps {
}
export function DashboardControlsMenu({ variables, links, annotations, dashboardUID }: DashboardControlsMenuProps) {
const styles = useStyles2(getStyles);
return (
<Box
minWidth={32}
@@ -29,7 +27,7 @@ export function DashboardControlsMenu({ variables, links, annotations, dashboard
direction={'column'}
borderRadius={'default'}
backgroundColor={'primary'}
padding={1.5}
padding={1}
gap={0.5}
onClick={(e) => {
// Normally, clicking the overlay closes the dropdown.
@@ -39,7 +37,7 @@ export function DashboardControlsMenu({ variables, links, annotations, dashboard
>
{/* Variables */}
{variables.map((variable, index) => (
<div className={cx({ [styles.menuItem]: index > 0 })} key={variable.state.key}>
<div key={variable.state.key}>
<VariableValueSelectWrapper variable={variable} inMenu />
</div>
))}
@@ -47,7 +45,7 @@ export function DashboardControlsMenu({ variables, links, annotations, dashboard
{/* Annotation layers */}
{annotations.length > 0 &&
annotations.map((layer, index) => (
<div className={cx({ [styles.menuItem]: variables.length > 0 || index > 0 })} key={layer.state.key}>
<div key={layer.state.key}>
<DataLayerControl layer={layer} inMenu />
</div>
))}
@@ -79,10 +77,7 @@ function MenuDivider() {
const getStyles = (theme: GrafanaTheme2) => ({
divider: css({
marginTop: theme.spacing(2),
marginTop: theme.spacing(1),
padding: theme.spacing(0, 0.5),
}),
menuItem: css({
marginTop: theme.spacing(2),
}),
});


@@ -22,7 +22,6 @@ const featureIni = `# In your custom.ini file
[feature_toggles]
provisioning = true
kubernetesDashboards = true ; use k8s from browser
`;
const ngrokExample = `ngrok http 3000
@@ -103,7 +102,7 @@ const getModalContent = (setupType: SetupType) => {
),
description: t(
'provisioning.getting-started.step-description-enable-feature-toggles',
'Add these settings to your custom.ini file to enable necessary features:'
'Add the provisioning feature toggle to your custom.ini file. Note: kubernetesDashboards is enabled by default, but if you have explicitly disabled it, you will need to enable it in your Grafana settings or remove the override from your configuration.'
),
code: featureIni,
},


@@ -22,8 +22,8 @@ describe('ScopesService', () => {
| undefined;
let dashboardsStateSubscription:
| ((
state: { navigationScope?: string; drawerOpened: boolean },
prevState: { navigationScope?: string; drawerOpened: boolean }
state: { navigationScope?: string; drawerOpened: boolean; navScopePath?: string[] },
prevState: { navigationScope?: string; drawerOpened: boolean; navScopePath?: string[] }
) => void)
| undefined;
@@ -56,7 +56,7 @@ describe('ScopesService', () => {
selectorStateSubscription = callback;
return { unsubscribe: jest.fn() };
}),
changeScopes: jest.fn(),
changeScopes: jest.fn().mockResolvedValue(undefined),
resolvePathToRoot: jest.fn().mockResolvedValue({ path: [], tree: {} }),
} as unknown as jest.Mocked<ScopesSelectorService>;
@@ -71,6 +71,7 @@ describe('ScopesService', () => {
loading: false,
searchQuery: '',
navigationScope: undefined,
navScopePath: undefined,
},
stateObservable: new BehaviorSubject({
drawerOpened: false,
@@ -82,12 +83,14 @@ describe('ScopesService', () => {
loading: false,
searchQuery: '',
navigationScope: undefined,
navScopePath: undefined,
}),
subscribeToState: jest.fn((callback) => {
dashboardsStateSubscription = callback;
return { unsubscribe: jest.fn() };
}),
setNavigationScope: jest.fn(),
setNavScopePath: jest.fn(),
} as unknown as jest.Mocked<ScopesDashboardsService>;
locationService = {
@@ -188,7 +191,7 @@ describe('ScopesService', () => {
service = new ScopesService(selectorService, dashboardsService, locationService);
expect(dashboardsService.setNavigationScope).toHaveBeenCalledWith('navScope1');
expect(dashboardsService.setNavigationScope).toHaveBeenCalledWith('navScope1', undefined, undefined);
});
it('should read navigation_scope along with other scope parameters', () => {
@@ -199,7 +202,7 @@ describe('ScopesService', () => {
service = new ScopesService(selectorService, dashboardsService, locationService);
expect(dashboardsService.setNavigationScope).toHaveBeenCalledWith('navScope1');
expect(dashboardsService.setNavigationScope).toHaveBeenCalledWith('navScope1', undefined, undefined);
expect(selectorService.changeScopes).toHaveBeenCalledWith(['scope1'], undefined, 'node1', false);
});
@@ -213,6 +216,45 @@ describe('ScopesService', () => {
expect(dashboardsService.setNavigationScope).not.toHaveBeenCalled();
});
it('should read nav_scope_path along with navigation_scope from URL on init', () => {
locationService.getLocation = jest.fn().mockReturnValue({
pathname: '/test',
search: '?navigation_scope=navScope1&nav_scope_path=mimir%2Cloki',
});
service = new ScopesService(selectorService, dashboardsService, locationService);
expect(dashboardsService.setNavigationScope).toHaveBeenCalledWith('navScope1', undefined, ['mimir', 'loki']);
});
it('should handle nav_scope_path without navigation_scope by calling setNavScopePath after changeScopes', async () => {
locationService.getLocation = jest.fn().mockReturnValue({
pathname: '/test',
search: '?scopes=scope1&nav_scope_path=mimir',
});
service = new ScopesService(selectorService, dashboardsService, locationService);
// Wait for the changeScopes promise to resolve
await Promise.resolve();
expect(dashboardsService.setNavScopePath).toHaveBeenCalledWith(['mimir']);
});
it('should handle URL-encoded nav_scope_path values', () => {
locationService.getLocation = jest.fn().mockReturnValue({
pathname: '/test',
search: '?navigation_scope=navScope1&nav_scope_path=' + encodeURIComponent('folder one,folder two'),
});
service = new ScopesService(selectorService, dashboardsService, locationService);
expect(dashboardsService.setNavigationScope).toHaveBeenCalledWith('navScope1', undefined, [
'folder one',
'folder two',
]);
});
});
describe('URL synchronization', () => {
@@ -346,16 +388,22 @@ describe('ScopesService', () => {
{
navigationScope: 'navScope1',
drawerOpened: true,
navScopePath: undefined,
},
{
navigationScope: undefined,
drawerOpened: false,
navScopePath: undefined,
}
);
expect(locationService.partial).toHaveBeenCalledWith({
navigation_scope: 'navScope1',
});
expect(locationService.partial).toHaveBeenCalledWith(
{
navigation_scope: 'navScope1',
nav_scope_path: null,
},
true
);
});
it('should update navigation_scope in URL when navigationScope changes', () => {
@@ -367,16 +415,22 @@ describe('ScopesService', () => {
{
navigationScope: 'navScope2',
drawerOpened: true,
navScopePath: undefined,
},
{
navigationScope: 'navScope1',
drawerOpened: true,
navScopePath: undefined,
}
);
expect(locationService.partial).toHaveBeenCalledWith({
navigation_scope: 'navScope2',
});
expect(locationService.partial).toHaveBeenCalledWith(
{
navigation_scope: 'navScope2',
nav_scope_path: null,
},
true
);
});
it('should not update URL when navigationScope has not changed', () => {
@@ -390,10 +444,12 @@ describe('ScopesService', () => {
{
navigationScope: 'navScope1',
drawerOpened: true,
navScopePath: undefined,
},
{
navigationScope: 'navScope1',
drawerOpened: false,
navScopePath: undefined,
}
);
@@ -409,16 +465,126 @@ describe('ScopesService', () => {
{
navigationScope: undefined,
drawerOpened: false,
navScopePath: undefined,
},
{
navigationScope: 'navScope1',
drawerOpened: true,
navScopePath: undefined,
}
);
expect(locationService.partial).toHaveBeenCalledWith({
navigation_scope: undefined,
});
expect(locationService.partial).toHaveBeenCalledWith(
{
navigation_scope: null,
nav_scope_path: null,
},
true
);
});
it('should write nav_scope_path to URL when navScopePath changes', () => {
if (!dashboardsStateSubscription) {
throw new Error('dashboardsStateSubscription not set');
}
dashboardsStateSubscription(
{
navigationScope: 'navScope1',
drawerOpened: true,
navScopePath: ['mimir', 'loki'],
},
{
navigationScope: 'navScope1',
drawerOpened: true,
navScopePath: undefined,
}
);
expect(locationService.partial).toHaveBeenCalledWith(
{
navigation_scope: 'navScope1',
nav_scope_path: encodeURIComponent('mimir,loki'),
},
true
);
});
it('should update nav_scope_path in URL when navScopePath changes', () => {
if (!dashboardsStateSubscription) {
throw new Error('dashboardsStateSubscription not set');
}
dashboardsStateSubscription(
{
navigationScope: 'navScope1',
drawerOpened: true,
navScopePath: ['mimir', 'loki', 'tempo'],
},
{
navigationScope: 'navScope1',
drawerOpened: true,
navScopePath: ['mimir', 'loki'],
}
);
expect(locationService.partial).toHaveBeenCalledWith(
{
navigation_scope: 'navScope1',
nav_scope_path: encodeURIComponent('mimir,loki,tempo'),
},
true
);
});
it('should clear nav_scope_path from URL when navScopePath becomes empty', () => {
if (!dashboardsStateSubscription) {
throw new Error('dashboardsStateSubscription not set');
}
dashboardsStateSubscription(
{
navigationScope: 'navScope1',
drawerOpened: true,
navScopePath: [],
},
{
navigationScope: 'navScope1',
drawerOpened: true,
navScopePath: ['mimir'],
}
);
expect(locationService.partial).toHaveBeenCalledWith(
{
navigation_scope: 'navScope1',
nav_scope_path: null,
},
true
);
});
it('should not update URL when only drawerOpened changes but navigationScope and navScopePath remain the same', () => {
if (!dashboardsStateSubscription) {
throw new Error('dashboardsStateSubscription not set');
}
jest.clearAllMocks();
dashboardsStateSubscription(
{
navigationScope: 'navScope1',
drawerOpened: false,
navScopePath: ['mimir'],
},
{
navigationScope: 'navScope1',
drawerOpened: true,
navScopePath: ['mimir'],
}
);
expect(locationService.partial).not.toHaveBeenCalled();
});
});
@@ -457,4 +623,113 @@ describe('ScopesService', () => {
);
});
});
describe('back/forward navigation handling', () => {
let locationSubject: BehaviorSubject<{ pathname: string; search: string }>;
beforeEach(() => {
locationSubject = new BehaviorSubject({
pathname: '/test',
search: '',
});
locationService.getLocation = jest.fn().mockReturnValue({
pathname: '/test',
search: '',
});
locationService.getLocationObservable = jest.fn().mockReturnValue(locationSubject);
// Set initial state for dashboards service
dashboardsService.state.navigationScope = undefined;
dashboardsService.state.navScopePath = undefined;
service = new ScopesService(selectorService, dashboardsService, locationService);
service.setEnabled(true);
jest.clearAllMocks();
});
it('should update navigation scope when URL changes via back/forward', () => {
// Simulate URL change (e.g., browser back button)
locationSubject.next({
pathname: '/test',
search: '?navigation_scope=navScope1',
});
expect(dashboardsService.setNavigationScope).toHaveBeenCalledWith('navScope1', undefined, undefined);
});
it('should update nav_scope_path when URL changes via back/forward', () => {
// Set current state
dashboardsService.state.navigationScope = 'navScope1';
dashboardsService.state.navScopePath = undefined;
// Simulate URL change with nav_scope_path
locationSubject.next({
pathname: '/test',
search: '?navigation_scope=navScope1&nav_scope_path=' + encodeURIComponent('mimir,loki'),
});
expect(dashboardsService.setNavScopePath).toHaveBeenCalledWith(['mimir', 'loki']);
});
it('should clear navigation scope when removed from URL via back/forward', () => {
// Set current state
dashboardsService.state.navigationScope = 'navScope1';
dashboardsService.state.navScopePath = ['mimir'];
// Simulate URL change (navigation scope removed)
locationSubject.next({
pathname: '/test',
search: '',
});
expect(dashboardsService.setNavigationScope).toHaveBeenCalledWith(undefined);
});
it('should handle navigation scope change along with nav_scope_path', () => {
// Set current state
dashboardsService.state.navigationScope = 'navScope1';
dashboardsService.state.navScopePath = ['mimir'];
// Simulate URL change to different navigation scope with new path
locationSubject.next({
pathname: '/test',
search: '?navigation_scope=navScope2&nav_scope_path=' + encodeURIComponent('loki,tempo'),
});
expect(dashboardsService.setNavigationScope).toHaveBeenCalledWith('navScope2', undefined, ['loki', 'tempo']);
});
it('should handle URL-encoded navigation_scope from back/forward', () => {
// Set current state
dashboardsService.state.navigationScope = undefined;
// Simulate URL change with encoded navigation scope
locationSubject.next({
pathname: '/test',
search: '?navigation_scope=' + encodeURIComponent('scope with spaces'),
});
expect(dashboardsService.setNavigationScope).toHaveBeenCalledWith('scope with spaces', undefined, undefined);
});
it('should handle nav_scope_path change without navigation_scope', async () => {
// Set current state - no navigation scope but has nav scope path
dashboardsService.state.navigationScope = undefined;
dashboardsService.state.navScopePath = undefined;
selectorService.state.appliedScopes = [{ scopeId: 'scope1' }];
// Simulate URL change with only nav_scope_path
locationSubject.next({
pathname: '/test',
search: '?scopes=scope1&nav_scope_path=mimir',
});
// Wait for changeScopes promise
await Promise.resolve();
expect(dashboardsService.setNavScopePath).toHaveBeenCalledWith(['mimir']);
});
});
});


@@ -5,6 +5,7 @@ import { map, distinctUntilChanged } from 'rxjs/operators';
import { LocationService, ScopesContextValue, ScopesContextValueState } from '@grafana/runtime';
import { ScopesDashboardsService } from './dashboards/ScopesDashboardsService';
import { deserializeFolderPath, serializeFolderPath } from './dashboards/scopeNavgiationUtils';
import { ScopesSelectorService } from './selector/ScopesSelectorService';
export interface State {
@@ -72,12 +73,21 @@ export class ScopesService implements ScopesContextValue {
const queryParams = new URLSearchParams(locationService.getLocation().search);
const scopeNodeId = queryParams.get('scope_node');
const navigationScope = queryParams.get('navigation_scope');
const navScopePath = queryParams.get('nav_scope_path');
if (navigationScope) {
this.dashboardsService.setNavigationScope(navigationScope);
this.dashboardsService.setNavigationScope(
navigationScope,
undefined,
navScopePath ? deserializeFolderPath(navScopePath) : undefined
);
}
this.changeScopes(queryParams.getAll('scopes'), undefined, scopeNodeId ?? undefined);
this.changeScopes(queryParams.getAll('scopes'), undefined, scopeNodeId ?? undefined).then(() => {
if (navScopePath && !navigationScope) {
this.dashboardsService.setNavScopePath(deserializeFolderPath(navScopePath));
}
});
// Pre-load scope node (which loads parent too)
const nodeToPreload = scopeNodeId;
@@ -99,6 +109,9 @@ export class ScopesService implements ScopesContextValue {
const scopes = queryParams.getAll('scopes');
const scopeNodeId = queryParams.get('scope_node');
const navigationScope = queryParams.get('navigation_scope');
const navScopePath = queryParams.get('nav_scope_path');
// Check if new scopes are different from the old scopes
const currentScopes = this.selectorService.state.appliedScopes.map((scope) => scope.scopeId);
if (scopes.length && !isEqual(scopes, currentScopes)) {
@@ -107,6 +120,31 @@ export class ScopesService implements ScopesContextValue {
// changes the URL directly, it would trigger a reload so scopes would still be reset.
this.changeScopes(scopes, undefined, scopeNodeId ?? undefined);
}
// Handle navigation_scope and nav_scope_path changes from back/forward navigation
const currentNavigationScope = this.dashboardsService.state.navigationScope;
const currentNavScopePath = this.dashboardsService.state.navScopePath;
const newNavScopePath = navScopePath ? deserializeFolderPath(navScopePath) : undefined;
const decodedNavigationScope = navigationScope ? decodeURIComponent(navigationScope) : undefined;
const navigationScopeChanged = decodedNavigationScope !== currentNavigationScope;
const navScopePathChanged = !isEqual(newNavScopePath, currentNavScopePath);
if (navigationScopeChanged) {
// Navigation scope changed - do full update
if (decodedNavigationScope) {
this.dashboardsService.setNavigationScope(decodedNavigationScope, undefined, newNavScopePath);
} else if (newNavScopePath?.length) {
this.changeScopes(scopes, undefined, scopeNodeId ?? undefined).then(() => {
this.dashboardsService.setNavScopePath(newNavScopePath);
});
} else {
this.dashboardsService.setNavigationScope(undefined);
}
} else if (navScopePathChanged) {
// Navigation scope unchanged but path changed
this.dashboardsService.setNavScopePath(newNavScopePath);
}
})
);
@@ -137,10 +175,17 @@ export class ScopesService implements ScopesContextValue {
// Update the URL based on change in the navigation scope
this.subscriptions.push(
this.dashboardsService.subscribeToState((state, prevState) => {
if (state.navigationScope !== prevState.navigationScope) {
this.locationService.partial({
navigation_scope: state.navigationScope,
});
if (
state.navigationScope !== prevState.navigationScope ||
!isEqual(state.navScopePath, prevState.navScopePath)
) {
this.locationService.partial(
{
navigation_scope: state.navigationScope ? encodeURIComponent(state.navigationScope) : null,
nav_scope_path: state.navScopePath?.length ? serializeFolderPath(state.navScopePath) : null,
},
true
);
}
})
);

View File
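For readers following the back/forward handler in the hunk above: its branching reduces to a pure decision over the URL params and the current dashboards-service state. A minimal standalone sketch, not part of the PR — `decideNavSync`, `NavState`, and the action names are hypothetical, and `deserializeFolderPath` is copied from the diff:

```typescript
type NavState = { navigationScope?: string; navScopePath?: string[] };

// Copied from the diff: decode the whole string first, then split on commas.
function deserializeFolderPath(raw: string): string[] {
  if (!raw) return [];
  try {
    return decodeURIComponent(raw).split(',').map((s) => s.trim());
  } catch {
    return raw.split(',').map((s) => s.trim());
  }
}

// Cheap structural comparison; the real code uses lodash isEqual.
function sameArray(a?: string[], b?: string[]): boolean {
  return JSON.stringify(a) === JSON.stringify(b);
}

// Which update should the location subscriber perform?
function decideNavSync(
  params: URLSearchParams,
  state: NavState
): 'set-scope' | 'set-path-only' | 'clear' | 'noop' {
  const rawScope = params.get('navigation_scope') ?? undefined;
  const rawPath = params.get('nav_scope_path') ?? undefined;
  const scope = rawScope ? decodeURIComponent(rawScope) : undefined;
  const path = rawPath ? deserializeFolderPath(rawPath) : undefined;

  if (scope !== state.navigationScope) {
    if (scope) return 'set-scope'; // full update via setNavigationScope
    if (path?.length) return 'set-path-only'; // re-apply scopes, then set the path
    return 'clear'; // setNavigationScope(undefined)
  }
  if (!sameArray(path, state.navScopePath)) return 'set-path-only';
  return 'noop';
}
```

The sketch makes the priority visible: a navigation-scope change always wins over a path-only change, which matches the `if (navigationScopeChanged) ... else if (navScopePathChanged)` ordering in the diff.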

@@ -67,7 +67,12 @@ export function ScopesDashboards() {
/>
) : filteredFolders[''] ? (
<ScrollContainer>
<ScopesDashboardsTree folders={filteredFolders} folderPath={['']} onFolderUpdate={updateFolder} />
<ScopesDashboardsTree
folders={filteredFolders}
folderPath={['']}
subScopePath={[]}
onFolderUpdate={updateFolder}
/>
</ScrollContainer>
) : (
<p className={styles.noResultsContainer} data-testid="scopes-dashboards-notFoundForFilter">

View File

@@ -839,4 +839,52 @@ describe('ScopesDashboardsService', () => {
expect(service.state.drawerOpened).toBe(true);
});
});
describe('setNavScopePath', () => {
beforeEach(() => {
(locationService.getLocation as jest.Mock).mockReturnValue({ pathname: '/' } as Location);
});
it('should set nav scope path', async () => {
await service.setNavScopePath(['mimir']);
expect(service.state.navScopePath).toEqual(['mimir']);
});
it('should replace existing path with new path', async () => {
await service.setNavScopePath(['mimir']);
expect(service.state.navScopePath).toEqual(['mimir']);
await service.setNavScopePath(['loki']);
expect(service.state.navScopePath).toEqual(['loki']);
});
it('should handle multiple scopes in path', async () => {
await service.setNavScopePath(['mimir', 'loki']);
expect(service.state.navScopePath).toEqual(['mimir', 'loki']);
});
it('should clear path with empty array', async () => {
await service.setNavScopePath(['mimir', 'loki']);
expect(service.state.navScopePath).toEqual(['mimir', 'loki']);
await service.setNavScopePath([]);
expect(service.state.navScopePath).toEqual([]);
});
it('should handle undefined path as empty array', async () => {
await service.setNavScopePath(['mimir']);
expect(service.state.navScopePath).toEqual(['mimir']);
await service.setNavScopePath(undefined);
expect(service.state.navScopePath).toEqual([]);
});
it('should not update state if path is unchanged', async () => {
await service.setNavScopePath(['mimir']);
await service.setNavScopePath(['mimir']);
// Path should remain the same
expect(service.state.navScopePath).toEqual(['mimir']);
});
});
});

View File

@@ -7,8 +7,13 @@ import { config, locationService } from '@grafana/runtime';
import { ScopesApiClient } from '../ScopesApiClient';
import { ScopesServiceBase } from '../ScopesServiceBase';
import { isCurrentPath } from './scopeNavgiationUtils';
import { ScopeNavigation, SuggestedNavigationsFoldersMap, SuggestedNavigationsMap } from './types';
import { buildSubScopePath, isCurrentPath } from './scopeNavgiationUtils';
import {
ScopeNavigation,
SuggestedNavigationsFolder,
SuggestedNavigationsFoldersMap,
SuggestedNavigationsMap,
} from './types';
interface ScopesDashboardsServiceState {
// State of the drawer showing related dashboards
@@ -24,6 +29,8 @@ interface ScopesDashboardsServiceState {
loading: boolean;
searchQuery: string;
navigationScope?: string;
// Path of subScopes which should be expanded
navScopePath?: string[];
}
export class ScopesDashboardsService extends ScopesServiceBase<ScopesDashboardsServiceState> {
@@ -38,6 +45,7 @@ export class ScopesDashboardsService extends ScopesServiceBase<ScopesDashboardsS
forScopeNames: [],
loading: false,
searchQuery: '',
navScopePath: undefined,
});
// Add/remove location subscription based on the drawer opened state
@@ -57,9 +65,40 @@ export class ScopesDashboardsService extends ScopesServiceBase<ScopesDashboardsS
});
}
private openSubScopeFolder = (subScopePath: string[]) => {
const subScope = subScopePath[subScopePath.length - 1];
const path = buildSubScopePath(subScope, this.state.folders);
// buildSubScopePath returns undefined when no folder matches the subScope
if (path && path.length > 0) {
this.updateFolder(path, true);
}
};
public setNavScopePath = async (navScopePath?: string[]) => {
const navScopePathArray = navScopePath ?? [];
if (!isEqual(navScopePathArray, this.state.navScopePath)) {
this.updateState({ navScopePath: navScopePathArray });
for (const subScope of navScopePathArray) {
// Find the actual path to the folder with this subScopeName
const folderPath = buildSubScopePath(subScope, this.state.folders);
if (folderPath && folderPath.length > 0) {
await this.fetchSubScopeItems(folderPath, subScope);
this.openSubScopeFolder([subScope]);
}
}
}
};
// The fallbackScopeNames parameter is used to fetch the ScopeNavigations for the current dashboard when the navigationScope is not set.
// You only need to await this function if you need to wait for the dashboards to be fetched before doing something else.
public setNavigationScope = async (navigationScope?: string, fallbackScopeNames?: string[]) => {
public setNavigationScope = async (
navigationScope?: string,
fallbackScopeNames?: string[],
navScopePath?: string[]
) => {
if (this.state.navigationScope === navigationScope) {
return;
}
@@ -67,6 +106,7 @@ export class ScopesDashboardsService extends ScopesServiceBase<ScopesDashboardsS
const forScopeNames = navigationScope ? [navigationScope] : (fallbackScopeNames ?? []);
this.updateState({ navigationScope, drawerOpened: forScopeNames.length > 0 });
await this.fetchDashboards(forScopeNames);
await this.setNavScopePath(navScopePath);
};
// Expand the group that matches the current path, if it is not already expanded
@@ -148,6 +188,15 @@ export class ScopesDashboardsService extends ScopesServiceBase<ScopesDashboardsS
};
private fetchSubScopeItems = async (path: string[], subScopeName: string) => {
// Check if folder already has content - skip fetching to preserve existing state
const targetFolder = this.getFolder(path);
if (
targetFolder &&
(Object.keys(targetFolder.folders).length > 0 || Object.keys(targetFolder.suggestedNavigations).length > 0)
) {
return;
}
let subScopeFolders: SuggestedNavigationsFoldersMap | undefined;
try {
@@ -208,6 +257,15 @@ export class ScopesDashboardsService extends ScopesServiceBase<ScopesDashboardsS
this.updateState({ folders, filteredFolders });
};
// Helper to get a folder at a given path
private getFolder = (path: string[]): SuggestedNavigationsFolder | undefined => {
let folder: SuggestedNavigationsFoldersMap = this.state.folders;
for (let i = 0; i < path.length - 1; i++) {
folder = folder[path[i]]?.folders ?? {};
}
return folder[path[path.length - 1]];
};
public changeSearchQuery = (searchQuery: string) => {
searchQuery = searchQuery ?? '';

View File
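The `getFolder` helper added above walks every path segment except the last through `.folders`, then indexes the final segment. A self-contained sketch of the same traversal rule — the trimmed `Folder` interface is a stand-in for `SuggestedNavigationsFolder`, not the real type:

```typescript
interface Folder {
  title: string;
  folders: Record<string, Folder>;
}

// Same shape as the diff's getFolder: missing intermediate segments fall
// back to an empty map, so lookups on dead paths return undefined rather
// than throwing. An empty path also returns undefined, which is why callers
// in the diff guard on path.length first.
function getFolder(root: Record<string, Folder>, path: string[]): Folder | undefined {
  let current = root;
  for (let i = 0; i < path.length - 1; i++) {
    current = current[path[i]]?.folders ?? {};
  }
  return current[path[path.length - 1]];
}
```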

@@ -12,10 +12,17 @@ export interface ScopesDashboardsTreeProps {
subScope?: string;
folders: SuggestedNavigationsFoldersMap;
folderPath: string[];
subScopePath?: string[];
onFolderUpdate: OnFolderUpdate;
}
export function ScopesDashboardsTree({ subScope, folders, folderPath, onFolderUpdate }: ScopesDashboardsTreeProps) {
export function ScopesDashboardsTree({
subScopePath,
subScope,
folders,
folderPath,
onFolderUpdate,
}: ScopesDashboardsTreeProps) {
const [queryParams] = useQueryParams();
const styles = useStyles2(getStyles);
@@ -54,6 +61,7 @@ export function ScopesDashboardsTree({ subScope, folders, folderPath, onFolderUp
{regularNavigations.map((navigation) => (
<ScopesNavigationTreeLink
subScope={subScope}
subScopePath={subScopePath}
key={navigation.id + navigation.title}
to={urlUtil.renderUrl(navigation.url, queryParams)}
title={navigation.title}
@@ -68,6 +76,7 @@ export function ScopesDashboardsTree({ subScope, folders, folderPath, onFolderUp
{subScopeFolders.map(([subFolderId, subFolder]) => (
<ScopesDashboardsTreeFolderItem
key={subFolderId}
subScopePath={[...(subScopePath ?? []), subFolder.subScopeName ?? '']}
folder={subFolder}
folders={folder.folders}
folderPath={[...folderPath, subFolderId]}

View File

@@ -16,6 +16,9 @@ const mockScopesSelectorService = {
const mockScopesDashboardsService = {
setNavigationScope: jest.fn(),
state: {
navScopePath: undefined,
},
};
jest.mock('../ScopesContextProvider', () => ({
@@ -133,7 +136,7 @@ describe('ScopesDashboardsTreeFolderItem', () => {
const exchangeButton = screen.getByRole('button', { name: /change root scope/i });
await user.click(exchangeButton);
expect(mockScopesDashboardsService.setNavigationScope).toHaveBeenCalledWith(undefined, ['subScope1']);
expect(mockScopesDashboardsService.setNavigationScope).toHaveBeenCalledWith(undefined, undefined, []);
});
it('calls changeScopes when exchange icon is clicked', async () => {
@@ -152,7 +155,7 @@ describe('ScopesDashboardsTreeFolderItem', () => {
const exchangeButton = screen.getByRole('button', { name: /change root scope/i });
await user.click(exchangeButton);
expect(mockScopesSelectorService.changeScopes).toHaveBeenCalledWith(['subScope1']);
expect(mockScopesSelectorService.changeScopes).toHaveBeenCalledWith(['subScope1'], undefined, undefined, false);
});
it('passes subScope prop to ScopesDashboardsTree when folder is expanded', () => {

View File

@@ -14,9 +14,11 @@ export interface ScopesDashboardsTreeFolderItemProps {
folderPath: string[];
folders: SuggestedNavigationsFoldersMap;
onFolderUpdate: OnFolderUpdate;
subScopePath?: string[];
}
export function ScopesDashboardsTreeFolderItem({
subScopePath,
folder,
folderPath,
folders,
@@ -53,12 +55,27 @@ export function ScopesDashboardsTreeFolderItem({
scope: folder.subScopeName || '',
})}
name="exchange-alt"
onClick={(e) => {
onClick={async (e) => {
e.preventDefault();
e.stopPropagation();
if (folder.subScopeName && scopesSelectorService) {
scopesDashboardsService?.setNavigationScope(undefined, [folder.subScopeName]);
scopesSelectorService.changeScopes([folder.subScopeName]);
const activeSubScopePath = scopesDashboardsService?.state.navScopePath;
// Check if the active scope is a child of the current folder's scope
const activeScope = activeSubScopePath?.[activeSubScopePath.length - 1];
const folderLocationInActivePath = activeSubScopePath?.indexOf(folder.subScopeName) ?? -1;
await scopesDashboardsService?.setNavigationScope(
folderLocationInActivePath >= 0 ? folder.subScopeName : undefined,
undefined,
activeSubScopePath?.slice(folderLocationInActivePath + 1) ?? []
);
// Now changeScopes will skip fetchDashboards because navigationScope is set
scopesSelectorService.changeScopes(
folderLocationInActivePath >= 0 && activeScope ? [activeScope] : [folder.subScopeName],
undefined,
undefined,
false
);
}
}}
/>
@@ -68,6 +85,7 @@ export function ScopesDashboardsTreeFolderItem({
{folder.expanded && (
<div className={styles.children}>
<ScopesDashboardsTree
subScopePath={subScopePath}
subScope={folder.subScopeName}
folders={folders}
folderPath={folderPath}

View File

@@ -209,7 +209,7 @@ describe('ScopesNavigationTreeLink', () => {
const link = screen.getByTestId('scopes-dashboards-test-id');
await userEvent.click(link);
expect(mockScopesDashboardsService.setNavigationScope).toHaveBeenCalledWith('currentScope');
expect(mockScopesDashboardsService.setNavigationScope).toHaveBeenCalledWith('currentScope', undefined, undefined);
});
it('should not set navigation scope when already set', async () => {

View File

@@ -8,16 +8,17 @@ import { Icon, useStyles2 } from '@grafana/ui';
import { useScopesServices } from '../ScopesContextProvider';
import { isCurrentPath, normalizePath } from './scopeNavgiationUtils';
import { isCurrentPath, normalizePath, serializeFolderPath } from './scopeNavgiationUtils';
export interface ScopesNavigationTreeLinkProps {
subScope?: string;
to: string;
title: string;
id: string;
subScopePath?: string[];
}
export function ScopesNavigationTreeLink({ subScope, to, title, id }: ScopesNavigationTreeLinkProps) {
export function ScopesNavigationTreeLink({ subScope, to, title, id, subScopePath }: ScopesNavigationTreeLinkProps) {
const styles = useStyles2(getStyles);
const linkIcon = useMemo(() => getLinkIcon(to), [to]);
const locPathname = useLocation().pathname;
@@ -25,7 +26,7 @@ export function ScopesNavigationTreeLink({ subScope, to, title, id }: ScopesNavi
// Ignore query params
const isCurrent = isCurrentPath(locPathname, to);
const handleClick = (e: React.MouseEvent<HTMLAnchorElement>) => {
const handleClick = async (e: React.MouseEvent<HTMLAnchorElement>) => {
if (subScope) {
e.preventDefault(); // Prevent default Link navigation
@@ -39,11 +40,18 @@ export function ScopesNavigationTreeLink({ subScope, to, title, id }: ScopesNavi
const searchParams = new URLSearchParams(url.search);
if (!currentNavigationScope && currentScope) {
searchParams.set('navigation_scope', currentScope);
services?.scopesDashboardsService?.setNavigationScope(currentScope);
await services?.scopesDashboardsService?.setNavigationScope(
currentScope,
undefined,
subScopePath && subScopePath.length > 0 ? subScopePath : undefined
);
}
// Update query params with the new subScope
searchParams.set('scopes', subScope);
// Set nav_scope_path to the subScopePath
searchParams.set('nav_scope_path', subScopePath ? serializeFolderPath(subScopePath) : '');
// Remove scope_node and scope_parent since we're changing to a subScope
searchParams.delete('scope_node');
searchParams.delete('scope_parent');

View File

@@ -1,4 +1,11 @@
import { getDashboardPathForComparison, isCurrentPath } from './scopeNavgiationUtils';
import {
buildSubScopePath,
deserializeFolderPath,
getDashboardPathForComparison,
isCurrentPath,
serializeFolderPath,
} from './scopeNavgiationUtils';
import { SuggestedNavigationsFoldersMap } from './types';
describe('scopeNavgiationUtils', () => {
it('should return the correct path for a dashboard', () => {
@@ -28,4 +35,194 @@ describe('scopeNavgiationUtils', () => {
expect(isCurrentPath('/d/dashboardId/slug', '/d/dashboardId#hash')).toBe(true);
expect(isCurrentPath('/d/dashboardId', '/d/dashboardId#hash')).toBe(true);
});
describe('deserializeFolderPath', () => {
it('should return empty array for empty string', () => {
expect(deserializeFolderPath('')).toEqual([]);
});
it('should parse a simple comma-separated string', () => {
expect(deserializeFolderPath('mimir,loki')).toEqual(['mimir', 'loki']);
});
it('should handle single value', () => {
expect(deserializeFolderPath('mimir')).toEqual(['mimir']);
});
it('should trim whitespace around values', () => {
expect(deserializeFolderPath(' mimir , loki ')).toEqual(['mimir', 'loki']);
});
it('should handle URL-encoded strings', () => {
expect(deserializeFolderPath(encodeURIComponent('mimir,loki'))).toEqual(['mimir', 'loki']);
});
it('should handle URL-encoded strings with special characters', () => {
expect(deserializeFolderPath(encodeURIComponent('folder one,folder two'))).toEqual(['folder one', 'folder two']);
});
it('should fallback to split without decoding if decodeURIComponent fails', () => {
// Invalid URI sequence that would cause decodeURIComponent to throw
const invalidUri = '%E0%A4%A';
expect(deserializeFolderPath(invalidUri)).toEqual(['%E0%A4%A']);
});
});
describe('serializeFolderPath', () => {
it('should return empty string for empty array', () => {
expect(serializeFolderPath([])).toBe('');
});
it('should serialize a simple array', () => {
expect(serializeFolderPath(['mimir', 'loki'])).toBe(encodeURIComponent('mimir,loki'));
});
it('should handle single value', () => {
expect(serializeFolderPath(['mimir'])).toBe('mimir');
});
it('should handle values with spaces', () => {
expect(serializeFolderPath(['folder one', 'folder two'])).toBe(encodeURIComponent('folder one,folder two'));
});
it('should return empty string for null/undefined input', () => {
// eslint-disable-next-line @typescript-eslint/no-explicit-any
expect(serializeFolderPath(null as any)).toBe('');
// eslint-disable-next-line @typescript-eslint/no-explicit-any
expect(serializeFolderPath(undefined as any)).toBe('');
});
});
describe('serializeFolderPath and deserializeFolderPath round-trip', () => {
it('should round-trip simple paths', () => {
const original = ['mimir', 'loki'];
const serialized = serializeFolderPath(original);
const deserialized = deserializeFolderPath(serialized);
expect(deserialized).toEqual(original);
});
it('should round-trip paths with spaces', () => {
const original = ['folder one', 'folder two'];
const serialized = serializeFolderPath(original);
const deserialized = deserializeFolderPath(serialized);
expect(deserialized).toEqual(original);
});
});
describe('buildSubScopePath', () => {
it('should return undefined when folders is empty', () => {
const folders: SuggestedNavigationsFoldersMap = {};
expect(buildSubScopePath('mimir', folders)).toBeUndefined();
});
it('should find subScope at root level', () => {
const folders: SuggestedNavigationsFoldersMap = {
'Mimir Dashboards': {
title: 'Mimir Dashboards',
expanded: false,
folders: {},
suggestedNavigations: {},
subScopeName: 'mimir',
},
};
expect(buildSubScopePath('mimir', folders)).toEqual(['Mimir Dashboards']);
});
it('should find subScope in nested folders', () => {
const folders: SuggestedNavigationsFoldersMap = {
'': {
title: '',
expanded: true,
folders: {
'Parent Folder': {
title: 'Parent Folder',
expanded: false,
folders: {
'Mimir Dashboards': {
title: 'Mimir Dashboards',
expanded: false,
folders: {},
suggestedNavigations: {},
subScopeName: 'mimir',
},
},
suggestedNavigations: {},
},
},
suggestedNavigations: {},
},
};
expect(buildSubScopePath('mimir', folders)).toEqual(['', 'Parent Folder', 'Mimir Dashboards']);
});
it('should return undefined when subScope is not found', () => {
const folders: SuggestedNavigationsFoldersMap = {
'': {
title: '',
expanded: true,
folders: {
'Loki Dashboards': {
title: 'Loki Dashboards',
expanded: false,
folders: {},
suggestedNavigations: {},
subScopeName: 'loki',
},
},
suggestedNavigations: {},
},
};
expect(buildSubScopePath('mimir', folders)).toBeUndefined();
});
it('should return first match when multiple folders have the same subScope', () => {
const folders: SuggestedNavigationsFoldersMap = {
'Mimir Dashboards': {
title: 'Mimir Dashboards',
expanded: false,
folders: {},
suggestedNavigations: {},
subScopeName: 'mimir',
},
'Mimir Overview': {
title: 'Mimir Overview',
expanded: false,
folders: {},
suggestedNavigations: {},
subScopeName: 'mimir',
},
};
// Should return the first one found (Object.entries iterates string keys in insertion order)
const result = buildSubScopePath('mimir', folders);
expect(result).toBeDefined();
expect(result?.length).toBe(1);
});
it('should find deeply nested subScope', () => {
const folders: SuggestedNavigationsFoldersMap = {
level1: {
title: 'Level 1',
expanded: true,
folders: {
level2: {
title: 'Level 2',
expanded: true,
folders: {
level3: {
title: 'Level 3',
expanded: false,
folders: {},
suggestedNavigations: {},
subScopeName: 'deep-scope',
},
},
suggestedNavigations: {},
},
},
suggestedNavigations: {},
},
};
expect(buildSubScopePath('deep-scope', folders)).toEqual(['level1', 'level2', 'level3']);
});
});
});

View File

@@ -1,3 +1,5 @@
import { SuggestedNavigationsFoldersMap } from './types';
// Helper function to get the base path for a dashboard URL for comparison purposes.
// e.g., /d/dashboardId/slug -> /d/dashboardId
// /d/dashboardId -> /d/dashboardId
@@ -5,12 +7,63 @@ export function getDashboardPathForComparison(pathname: string): string {
return pathname.split('/').slice(0, 3).join('/');
}
/**
* Finds the path to a folder with the given subScopeName by searching recursively.
* @param subScope - The subScope name to find
* @param folders - The root folder structure to search
* @returns Array representing the path to the folder, or undefined if not found
*/
export function buildSubScopePath(subScope: string, folders: SuggestedNavigationsFoldersMap): string[] | undefined {
function findPath(currentFolders: SuggestedNavigationsFoldersMap, currentPath: string[]): string[] | undefined {
for (const [key, folder] of Object.entries(currentFolders)) {
const newPath = [...currentPath, key];
if (folder.subScopeName === subScope) {
return newPath;
}
// Search in nested folders
const nestedPath = findPath(folder.folders, newPath);
if (nestedPath) {
return nestedPath;
}
}
return undefined;
}
return findPath(folders, []);
}
export function normalizePath(path: string): string {
// Remove query + hash + trailing slash (except root)
const noQuery = path.split('?')[0].split('#')[0];
return noQuery !== '/' && noQuery.endsWith('/') ? noQuery.slice(0, -1) : noQuery;
}
/**
* Deserializes a comma-separated folder path string into an array.
* Handles URL-encoded strings.
*/
export function deserializeFolderPath(navScopePath: string): string[] {
if (!navScopePath) {
return [];
}
try {
const decoded = decodeURIComponent(navScopePath);
return decoded.split(',').map((s) => s.trim());
} catch {
return navScopePath.split(',').map((s) => s.trim());
}
}
/**
* Serializes a folder path array into a comma-separated string.
*/
export function serializeFolderPath(path: string[]): string {
if (!path) {
return '';
}
return encodeURIComponent(path.join(','));
}
// Pathname comes from location.pathname
export function isCurrentPath(pathname: string, to: string): boolean {
const isDashboard = to.startsWith('/d/');

View File
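The two serialization helpers above are small enough to restate standalone. One caveat worth noting: because `serializeFolderPath` joins with a literal comma before encoding, a folder name that itself contains a comma would not survive a round trip — the new tests only exercise comma-free names. A hedged copy for illustration:

```typescript
// Join with commas, then percent-encode the whole string (so spaces and
// the separator commas both become escape sequences in the URL).
function serializeFolderPath(path: string[]): string {
  if (!path) return '';
  return encodeURIComponent(path.join(','));
}

// Decode first, then split; if the input is malformed percent-encoding,
// fall back to splitting the raw string, as the diff's fallback test expects.
function deserializeFolderPath(navScopePath: string): string[] {
  if (!navScopePath) return [];
  try {
    return decodeURIComponent(navScopePath).split(',').map((s) => s.trim());
  } catch {
    return navScopePath.split(',').map((s) => s.trim());
  }
}
```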

@@ -444,7 +444,7 @@ describe('ScopesSelectorService', () => {
await service.selectScope('test-scope-node');
await service.apply();
await service.removeAllScopes();
expect(dashboardsService.setNavigationScope).toHaveBeenCalledWith(undefined);
expect(dashboardsService.setNavigationScope).toHaveBeenCalledWith(undefined, undefined, undefined);
});
});

View File

@@ -372,13 +372,7 @@ export class ScopesSelectorService extends ScopesServiceBase<ScopesSelectorServi
// Redirect to the scope node's redirect URL if it exists, otherwise redirect to the first scope navigation.
private redirectAfterApply = (scopeNode: ScopeNode | undefined) => {
// Check if the selected scope has a redirect path
if (scopeNode && scopeNode.spec.redirectPath && typeof scopeNode.spec.redirectPath === 'string') {
locationService.push(scopeNode.spec.redirectPath);
return;
}
// Redirect to first scopeNavigation if current URL isn't a scopeNavigation
// Check if we are currently on an active scope navigation
const currentPath = locationService.getLocation().pathname;
const activeScopeNavigation = this.dashboardsService.state.scopeNavigations.find((s) => {
if (!('url' in s.spec) || typeof s.spec.url !== 'string') {
@@ -387,6 +381,20 @@ export class ScopesSelectorService extends ScopesServiceBase<ScopesSelectorServi
return isCurrentPath(currentPath, s.spec.url);
});
// Only redirect to redirectPath if we are not currently on an active scope navigation
if (
!activeScopeNavigation &&
scopeNode &&
scopeNode.spec.redirectPath &&
typeof scopeNode.spec.redirectPath === 'string' &&
// Don't redirect if we're already on the target path
!isCurrentPath(currentPath, scopeNode.spec.redirectPath)
) {
locationService.push(scopeNode.spec.redirectPath);
return;
}
// Redirect to first scopeNavigation if current URL isn't a scopeNavigation
if (!activeScopeNavigation && this.dashboardsService.state.scopeNavigations.length > 0) {
// Redirect to the first available scopeNavigation
const firstScopeNavigation = this.dashboardsService.state.scopeNavigations[0];
@@ -396,7 +404,9 @@ export class ScopesSelectorService extends ScopesServiceBase<ScopesSelectorServi
'url' in firstScopeNavigation.spec &&
typeof firstScopeNavigation.spec.url === 'string' &&
// Only redirect to dashboards TODO: Remove this once Logs Drilldown has Scopes support
firstScopeNavigation.spec.url.includes('/d/')
firstScopeNavigation.spec.url.includes('/d/') &&
// Don't redirect if we're already on the target path
!isCurrentPath(currentPath, firstScopeNavigation.spec.url)
) {
locationService.push(firstScopeNavigation.spec.url);
}
@@ -405,7 +415,7 @@ export class ScopesSelectorService extends ScopesServiceBase<ScopesSelectorServi
public removeAllScopes = () => {
this.applyScopes([], false);
this.dashboardsService.setNavigationScope(undefined);
this.dashboardsService.setNavigationScope(undefined, undefined, undefined);
};
private addRecentScopes = (scopes: Scope[], parentNode?: ScopeNode, scopeNodeId?: string) => {

View File

@@ -8897,6 +8897,13 @@
"aria-label-default": "Vyberte barvu",
"aria-label-selected-color": "{{colorLabel}} barva"
},
"components": {
"sparkline": {
"warning": {
"too-few-values": ""
}
}
},
"confirm-button": {
"aria-label-delete": "Odstranit",
"cancel": "Zrušit",
@@ -9094,6 +9101,8 @@
"interactive-table": {
"aria-label-collapse-all": "Sbalit všechny řádky",
"aria-label-expand-all": "Rozbalit všechny řádky",
"aria-label-sort-column": "",
"expand-row-header": "",
"expand-row-tooltip": "Řádek přepnutí rozbalen",
"tooltip-collapse-all": "Sbalit všechny řádky",
"tooltip-expand-all": "Rozbalit všechny řádky"
@@ -12529,13 +12538,12 @@
"effects": {
"bar-glow": "",
"center-glow": "",
"gradient": "",
"label": "",
"rounded-bars": "",
"spotlight": "",
"spotlight-tooltip": ""
},
"gradient": "",
"gradient-auto": "",
"gradient-none": "",
"segment-count": "",
"segment-spacing": "",
"shape": "",

View File

@@ -8827,6 +8827,13 @@
"aria-label-default": "Wählen Sie eine Farbe",
"aria-label-selected-color": "{{colorLabel}} Farbe"
},
"components": {
"sparkline": {
"warning": {
"too-few-values": ""
}
}
},
"confirm-button": {
"aria-label-delete": "Löschen",
"cancel": "Abbrechen",
@@ -9024,6 +9031,8 @@
"interactive-table": {
"aria-label-collapse-all": "Alle Zeilen einklappen",
"aria-label-expand-all": "Alle Zeilen erweitern",
"aria-label-sort-column": "",
"expand-row-header": "",
"expand-row-tooltip": "Zeile umschalten erweitert",
"tooltip-collapse-all": "Alle Zeilen einklappen",
"tooltip-expand-all": "Alle Zeilen erweitern"
@@ -12425,13 +12434,12 @@
"effects": {
"bar-glow": "",
"center-glow": "",
"gradient": "",
"label": "",
"rounded-bars": "",
"spotlight": "",
"spotlight-tooltip": ""
},
"gradient": "",
"gradient-auto": "",
"gradient-none": "",
"segment-count": "",
"segment-spacing": "",
"shape": "",

View File

@@ -2334,8 +2334,10 @@
"label-tenant-sources": "Tenant sources"
},
"rule-details-matching-instances": {
"button-show-all": "Show all {{totalItemsCount}} alert instances",
"showing-count": "Showing {{visibleItemsCount}} out of {{totalItemsCount}} instances"
"button-show-all": "Show all",
"button-show-all-instances": "Show all {{totalItemsCount}} alert instances",
"showing-count": "Showing {{visibleItemsCount}} out of {{totalItemsCount}} instances",
"showing-count-with-state": "Showing {{visibleItemsCount}} {{alertState}} out of {{totalItemsCount}} instances"
},
"rule-editor": {
"get-content": {
@@ -11836,7 +11838,7 @@
"modal-title-set-up-public-access": "Set up public access",
"modal-title-set-up-required-features": "Set up required features",
"step-description-copy-url": "From the ngrok output, copy the https:// forwarding URL that looks like this:",
"step-description-enable-feature-toggles": "Add these settings to your custom.ini file to enable necessary features:",
"step-description-enable-feature-toggles": "Add the provisioning feature toggle to your custom.ini file. Note: kubernetesDashboards is enabled by default, but if you have explicitly disabled it, you will need to enable it in your Grafana settings or remove the override from your configuration.",
"step-description-start-ngrok": "Run this command to create a secure tunnel to your local Grafana:",
"step-description-update-grafana-config": "Add this to your custom.ini file, replacing the URL with your actual ngrok URL:",
"step-title-copy-url": "Copy your public URL",

View File

@@ -8827,6 +8827,13 @@
"aria-label-default": "Elegir un color",
"aria-label-selected-color": "{{colorLabel}} color"
},
"components": {
"sparkline": {
"warning": {
"too-few-values": ""
}
}
},
"confirm-button": {
"aria-label-delete": "Eliminar",
"cancel": "Cancelar",
@@ -9024,6 +9031,8 @@
"interactive-table": {
"aria-label-collapse-all": "Contraer todas las filas",
"aria-label-expand-all": "Expandir todas las filas",
"aria-label-sort-column": "",
"expand-row-header": "",
"expand-row-tooltip": "Alternar fila expandida",
"tooltip-collapse-all": "Contraer todas las filas",
"tooltip-expand-all": "Expandir todas las filas"
@@ -12425,13 +12434,12 @@
"effects": {
"bar-glow": "",
"center-glow": "",
"gradient": "",
"label": "",
"rounded-bars": "",
"spotlight": "",
"spotlight-tooltip": ""
},
"gradient": "",
"gradient-auto": "",
"gradient-none": "",
"segment-count": "",
"segment-spacing": "",
"shape": "",

View File

@@ -8827,6 +8827,13 @@
"aria-label-default": "Choisir une couleur",
"aria-label-selected-color": "Couleur {{colorLabel}}"
},
"components": {
"sparkline": {
"warning": {
"too-few-values": ""
}
}
},
"confirm-button": {
"aria-label-delete": "Supprimer",
"cancel": "Annuler",
@@ -9024,6 +9031,8 @@
"interactive-table": {
"aria-label-collapse-all": "Réduire toutes les lignes",
"aria-label-expand-all": "Développer toutes les lignes",
"aria-label-sort-column": "",
"expand-row-header": "",
"expand-row-tooltip": "Basculer la ligne développée",
"tooltip-collapse-all": "Réduire toutes les lignes",
"tooltip-expand-all": "Développer toutes les lignes"
@@ -12425,13 +12434,12 @@
"effects": {
"bar-glow": "",
"center-glow": "",
"gradient": "",
"label": "",
"rounded-bars": "",
"spotlight": "",
"spotlight-tooltip": ""
},
"gradient": "",
"gradient-auto": "",
"gradient-none": "",
"segment-count": "",
"segment-spacing": "",
"shape": "",

View File

@@ -8827,6 +8827,13 @@
"aria-label-default": "Válasszon színt",
"aria-label-selected-color": "{{colorLabel}} szín"
},
"components": {
"sparkline": {
"warning": {
"too-few-values": ""
}
}
},
"confirm-button": {
"aria-label-delete": "Törlés",
"cancel": "Mégse",
@@ -9024,6 +9031,8 @@
"interactive-table": {
"aria-label-collapse-all": "Minden sor összecsukása",
"aria-label-expand-all": "Minden sor kibontása",
"aria-label-sort-column": "",
"expand-row-header": "",
"expand-row-tooltip": "Sorkibontás váltása",
"tooltip-collapse-all": "Minden sor összecsukása",
"tooltip-expand-all": "Minden sor kibontása"
@@ -12425,13 +12434,12 @@
"effects": {
"bar-glow": "",
"center-glow": "",
"gradient": "",
"label": "",
"rounded-bars": "",
"spotlight": "",
"spotlight-tooltip": ""
},
"gradient": "",
"gradient-auto": "",
"gradient-none": "",
"segment-count": "",
"segment-spacing": "",
"shape": "",

View File

@@ -8792,6 +8792,13 @@
"aria-label-default": "Pilih warna",
"aria-label-selected-color": "warna {{colorLabel}}"
},
"components": {
"sparkline": {
"warning": {
"too-few-values": ""
}
}
},
"confirm-button": {
"aria-label-delete": "Hapus",
"cancel": "Batalkan",
@@ -8989,6 +8996,8 @@
"interactive-table": {
"aria-label-collapse-all": "Ciutkan semua baris",
"aria-label-expand-all": "Perluas semua baris",
"aria-label-sort-column": "",
"expand-row-header": "",
"expand-row-tooltip": "Alihkan baris diperluas",
"tooltip-collapse-all": "Ciutkan semua baris",
"tooltip-expand-all": "Perluas semua baris"
@@ -12373,13 +12382,12 @@
"effects": {
"bar-glow": "",
"center-glow": "",
"gradient": "",
"label": "",
"rounded-bars": "",
"spotlight": "",
"spotlight-tooltip": ""
},
"gradient": "",
"gradient-auto": "",
"gradient-none": "",
"segment-count": "",
"segment-spacing": "",
"shape": "",

View File

@@ -8827,6 +8827,13 @@
"aria-label-default": "Scegli un colore",
"aria-label-selected-color": "Colore {{colorLabel}}"
},
"components": {
"sparkline": {
"warning": {
"too-few-values": ""
}
}
},
"confirm-button": {
"aria-label-delete": "Elimina",
"cancel": "Annulla",
@@ -9024,6 +9031,8 @@
"interactive-table": {
"aria-label-collapse-all": "Ridurre tutte le righe",
"aria-label-expand-all": "Espandi tutte le righe",
"aria-label-sort-column": "",
"expand-row-header": "",
"expand-row-tooltip": "Attiva/disattiva riga espansa",
"tooltip-collapse-all": "Ridurre tutte le righe",
"tooltip-expand-all": "Espandi tutte le righe"
@@ -12425,13 +12434,12 @@
"effects": {
"bar-glow": "",
"center-glow": "",
"gradient": "",
"label": "",
"rounded-bars": "",
"spotlight": "",
"spotlight-tooltip": ""
},
"gradient": "",
"gradient-auto": "",
"gradient-none": "",
"segment-count": "",
"segment-spacing": "",
"shape": "",

Some files were not shown because too many files have changed in this diff