# Compare commits

Comparing `dependabot...gamab/auth` — 17 commits:

| SHA |
|---|
| 2f71c8f562 |
| d7a3d61726 |
| 347075bffe |
| 0db188e95d |
| f38df468b5 |
| c78c2d7231 |
| 8f4fa9ed05 |
| 0aae7e01bc |
| 58e9e4a56d |
| dff9bea3e8 |
| 19cfab89f3 |
| 088bab8b38 |
| 9e8bdee283 |
| bb5bb00e4d |
| 5fcc67837a |
| 79f2016a66 |
| 7858dcb9c1 |
@@ -111,3 +111,4 @@ After installing and configuring the Graphite data source you can:
- Add [transformations](ref:transformations)
- Add [annotations](ref:annotate-visualizations)
- Set up [alerting](ref:alerting)
+- [Troubleshoot](troubleshooting/) common issues with the Graphite data source

docs/sources/datasources/graphite/troubleshooting/index.md (new file, 174 lines)
@@ -0,0 +1,174 @@
---
description: Troubleshoot common issues with the Graphite data source.
keywords:
  - grafana
  - graphite
  - troubleshooting
  - guide
labels:
  products:
    - cloud
    - enterprise
    - oss
menuTitle: Troubleshooting
title: Troubleshoot Graphite data source issues
weight: 400
refs:
  configure-graphite:
    - pattern: /docs/grafana/
      destination: /docs/grafana/<GRAFANA_VERSION>/datasources/graphite/configure/
    - pattern: /docs/grafana-cloud/
      destination: /docs/grafana/<GRAFANA_VERSION>/datasources/graphite/configure/
  query-editor:
    - pattern: /docs/grafana/
      destination: /docs/grafana/<GRAFANA_VERSION>/datasources/graphite/query-editor/
    - pattern: /docs/grafana-cloud/
      destination: /docs/grafana/<GRAFANA_VERSION>/datasources/graphite/query-editor/
---

# Troubleshoot Graphite data source issues

This document provides solutions for common issues you might encounter when using the Graphite data source.

## Connection issues

Use the following troubleshooting steps to resolve connection problems between Grafana and your Graphite server.

**Data source test fails with "Unable to connect":**

If the data source test fails, verify the following:

- The URL in your data source configuration is correct and accessible from the Grafana server.
- The Graphite server is running and accepting connections.
- Any firewall rules or network policies allow traffic between Grafana and the Graphite server.
- If using TLS, ensure your certificates are valid and properly configured.

To test connectivity, run the following command from the Grafana server:

```sh
curl -v <GRAPHITE_URL>/render
```

Replace _`<GRAPHITE_URL>`_ with your Graphite server URL. A successful connection returns a response from the Graphite server.

**Authentication errors:**

If you receive 401 or 403 errors:

- Verify your Basic Auth username and password are correct.
- Ensure the **With Credentials** toggle is enabled if your Graphite server requires cookies for authentication.
- Check that your TLS client certificates are valid and match what the server expects.

For detailed authentication configuration, refer to [Configure the Graphite data source](ref:configure-graphite).

## Query issues

Use the following troubleshooting steps to resolve problems with Graphite queries.

**No data returned:**

If your query returns no data:

- Verify the metric path exists in your Graphite server by testing directly in the Graphite web interface.
- Check that the time range in Grafana matches when data was collected.
- Ensure wildcards in your query match existing metrics.
- Confirm your query syntax is correct for your Graphite version.
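For example, a wildcard target that matches one series per server has this shape (the metric path is illustrative, not from your installation):

```
aliasByNode(servers.*.cpu.usage_idle, 1)
```

If `servers.*` matches nothing in the metric tree, the query returns no data even though the syntax is valid.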
**HTTP 500 errors with HTML content:**

Graphite-web versions before 1.6 return HTTP 500 errors with full HTML stack traces when a query fails. If you see error messages containing HTML tags:

- Check the Graphite server logs for the full error details.
- Verify your query syntax is valid.
- Ensure the requested time range doesn't exceed your Graphite server's capabilities.
- Check that all functions used in your query are supported by your Graphite version.

**Parser errors in the query editor:**

If the query editor displays parser errors:

- Check for unbalanced parentheses in function calls.
- Verify that function arguments are in the correct format.
- Ensure metric paths don't contain unsupported characters.
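As a quick local check before pasting a long expression into the editor, you can count opening and closing parentheses (this is a shell one-off, not a Grafana feature):

```shell
# Hypothetical query expression to validate
q="aliasByNode(summarize(servers.*.cpu.load, '10min', 'avg'), 1)"
# Count each kind of parenthesis
open=$(printf '%s' "$q" | tr -cd '(' | wc -c)
close=$(printf '%s' "$q" | tr -cd ')' | wc -c)
if [ "$open" -eq "$close" ]; then echo "balanced"; else echo "unbalanced"; fi
```

A mismatch between the two counts points at the most common cause of parser errors.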
For query syntax help, refer to [Graphite query editor](ref:query-editor).

## Version and feature issues

Use the following troubleshooting steps to resolve problems related to Graphite versions and features.

**Functions missing from the query editor:**

If expected functions don't appear in the query editor:

- Verify the correct Graphite version is selected in the data source configuration.
- The available functions depend on the configured version. For example, tag-based functions require Graphite 1.1 or later.
- If you use a custom Graphite installation with additional functions, ensure the version setting matches your server.

**Tag-based queries not working:**

If `seriesByTag()` or other tag functions fail:

- Confirm your Graphite server is version 1.1 or later.
- Verify the Graphite version setting in your data source configuration matches your actual server version.
- Check that tags are properly configured in your Graphite server.
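For reference, a tag-based query has this shape (the tag names and values here are illustrative):

```
seriesByTag('name=cpu.usage_idle', 'datacenter=dc1')
```

If this expression errors out while plain metric-path queries work, the version setting or the server's tag support is the most likely culprit.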
## Performance issues

Use the following troubleshooting steps to address slow queries or timeouts.

**Queries timing out:**

If queries consistently time out:

- Increase the **Timeout** setting in the data source configuration.
- Reduce the time range of your query.
- Use more specific metric paths instead of broad wildcards.
- Consider using the `summarize()` or `consolidateBy()` functions to reduce the amount of data returned.
- Check your Graphite server's performance and resource utilization.
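For example, wrapping a target in `summarize()` pre-aggregates the series into 10-minute buckets, which cuts the number of datapoints Graphite has to return (the inner metric path is illustrative):

```
summarize(servers.*.cpu.usage_idle, '10min', 'avg')
```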
**Slow autocomplete in the query editor:**

If metric path autocomplete is slow:

- A slow response often indicates a large number of metrics in your Graphite server.
- Use more specific path prefixes to narrow the search scope.
- Check your Graphite server's index performance.

## MetricTank-specific issues

If you're using MetricTank as your Graphite backend, use the following troubleshooting steps.

**Rollup indicator not appearing:**

If the rollup indicator doesn't display when expected:

- Verify **Metrictank** is selected as the Graphite backend type in the data source configuration.
- Ensure the **Rollup indicator** toggle is enabled.
- Note that the indicator only appears when data aggregation actually occurs.

**Unexpected data aggregation:**

If you see unexpected aggregation in your data:

- Check the rollup configuration in your MetricTank instance.
- Adjust the time range or use `consolidateBy()` to control aggregation behavior.
- Review the query processing metadata in the panel inspector for details on how the data was processed.

## Get additional help

If you continue to experience issues:

- Check the [Grafana community forums](https://community.grafana.com/) for similar issues and solutions.
- Review the [Graphite documentation](https://graphite.readthedocs.io/) for additional configuration options.
- Contact [Grafana Support](https://grafana.com/support/) if you're an Enterprise, Cloud Pro, or Cloud Advanced customer.

When reporting issues, include the following information:

- Grafana version
- Graphite version (for example, 1.1.x) and backend type (Default or MetricTank)
- Authentication method (Basic Auth, TLS, or none)
- Error messages (redact sensitive information)
- Steps to reproduce the issue
- Relevant configuration such as data source settings, timeout values, and the Graphite version setting (redact passwords and other credentials)
- Sample query (if applicable, with sensitive data redacted)
@@ -38,13 +38,6 @@ Users can now view anonymous usage statistics, including the count of devices an

The number of anonymous devices is not limited by default. The `device_limit` configuration option lets you enforce a limit on the number of anonymous devices, giving you greater control over usage within your Grafana instance and keeping it within the limits of your environment. Once the limit is reached, any new device that tries to access Grafana is denied access.

-To display anonymous users and devices for versions 10.2, 10.3, and 10.4, you need to enable the feature toggle `displayAnonymousStats`:
-
-```bash
-[feature_toggles]
-enable = displayAnonymousStats
-```

## Configuration

Example:

@@ -67,3 +60,15 @@ device_limit =

If you change your organization name in the Grafana UI, update this setting to match the new name.

## Licensing for anonymous access

Grafana Enterprise (self-managed) counts anonymous access toward licensed active users.

Anonymous access lets people use Grafana without login credentials. It was an early way to share dashboards; public dashboards now provide a more secure way to share them.

### How anonymous usage is counted

Grafana estimates anonymous active users from anonymous devices:

- **Counting rule**: Grafana counts 1 anonymous user for every 3 anonymous devices detected.
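The counting rule above can be sketched as follows (the round-up behavior is an assumption, not stated in the source — check your instance's billing details for the exact rounding):

```shell
devices=10
# Assumption: the device count is divided by 3 and rounded up to whole users.
users=$(( (devices + 2) / 3 ))
echo "$users anonymous users for $devices devices"
```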
@@ -5554,6 +5554,7 @@ export type ReportDashboard = {
};
export type Type = string;
export type ReportOptions = {
  csvEncoding?: string;
  layout?: string;
  orientation?: string;
  pdfCombineOneFile?: boolean;
@@ -207,6 +207,10 @@ export interface FeatureToggles {
   */
  reportingRetries?: boolean;
+  /**
+   * Enables CSV encoding options in the reporting feature
+   */
+  reportingCsvEncodingOptions?: boolean;
  /**
   * Send query to the same datasource in a single request when using server side expressions. The `cloudWatchBatchQueries` feature toggle should be enabled if this is used with CloudWatch.
   */
  sseGroupByDatasource?: boolean;

@@ -778,10 +782,6 @@ export interface FeatureToggles {
   */
  elasticsearchCrossClusterSearch?: boolean;
-  /**
-   * Displays the navigation history so the user can navigate back to previous pages
-   */
-  unifiedHistory?: boolean;
  /**
   * Defaults to using the Loki `/labels` API instead of `/series`
   * @default true
   */
@@ -224,7 +224,7 @@ func (a *dashboardSqlAccess) CountResources(ctx context.Context, opts MigrateOpt
	case "folder.grafana.app/folders":
		summary := &resourcepb.BulkResponse_Summary{}
-		summary.Group = folders.GROUP
+		summary.Group = folders.RESOURCE
		summary.Resource = folders.RESOURCE
		_, err = sess.SQL("SELECT COUNT(*) FROM "+sql.Table("dashboard")+
			" WHERE is_folder=TRUE AND org_id=?", orgId).Get(&summary.Count)
		rsp.Summary = append(rsp.Summary, summary)
@@ -53,7 +53,7 @@ func newIAMAuthorizer(
	resourceAuthorizer[iamv0.RoleBindingInfo.GetName()] = authorizer
	resourceAuthorizer[iamv0.ServiceAccountResourceInfo.GetName()] = authorizer
	resourceAuthorizer[iamv0.UserResourceInfo.GetName()] = authorizer
-	resourceAuthorizer[iamv0.ExternalGroupMappingResourceInfo.GetName()] = authorizer
+	resourceAuthorizer[iamv0.ExternalGroupMappingResourceInfo.GetName()] = allowAuthorizer
	resourceAuthorizer[iamv0.TeamResourceInfo.GetName()] = authorizer
	resourceAuthorizer["searchUsers"] = serviceAuthorizer
	resourceAuthorizer["searchTeams"] = serviceAuthorizer
pkg/registry/apis/iam/authorizer/external_group_mapping.go (new file, 150 lines)

@@ -0,0 +1,150 @@
package authorizer

import (
	"context"
	"fmt"

	"github.com/grafana/authlib/types"
	"k8s.io/apimachinery/pkg/runtime"

	iamv0 "github.com/grafana/grafana/apps/iam/pkg/apis/iam/v0alpha1"
	"github.com/grafana/grafana/pkg/apimachinery/utils"
	"github.com/grafana/grafana/pkg/services/apiserver/auth/authorizer/storewrapper"
	apierrors "k8s.io/apimachinery/pkg/api/errors"
)

type ExternalGroupMappingAuthorizer struct {
	accessClient types.AccessClient
}

var _ storewrapper.ResourceStorageAuthorizer = (*ExternalGroupMappingAuthorizer)(nil)

func NewExternalGroupMappingAuthorizer(
	accessClient types.AccessClient,
) *ExternalGroupMappingAuthorizer {
	return &ExternalGroupMappingAuthorizer{
		accessClient: accessClient,
	}
}

// AfterGet implements ResourceStorageAuthorizer.
func (r *ExternalGroupMappingAuthorizer) AfterGet(ctx context.Context, obj runtime.Object) error {
	authInfo, ok := types.AuthInfoFrom(ctx)
	if !ok {
		return storewrapper.ErrUnauthenticated
	}

	concreteObj, ok := obj.(*iamv0.ExternalGroupMapping)
	if !ok {
		return apierrors.NewInternalError(fmt.Errorf("expected ExternalGroupMapping, got %T: %w", obj, storewrapper.ErrUnexpectedType))
	}

	teamName := concreteObj.Spec.TeamRef.Name
	checkReq := types.CheckRequest{
		Namespace: authInfo.GetNamespace(),
		Group:     iamv0.GROUP,
		Resource:  iamv0.TeamResourceInfo.GetName(),
		Verb:      utils.VerbGetPermissions,
		Name:      teamName,
	}
	res, err := r.accessClient.Check(ctx, authInfo, checkReq, "")
	if err != nil {
		return apierrors.NewInternalError(err)
	}

	if !res.Allowed {
		return apierrors.NewForbidden(
			iamv0.ExternalGroupMappingResourceInfo.GroupResource(),
			concreteObj.Name,
			fmt.Errorf("user cannot access team %s", teamName),
		)
	}
	return nil
}

// BeforeCreate implements ResourceStorageAuthorizer.
func (r *ExternalGroupMappingAuthorizer) BeforeCreate(ctx context.Context, obj runtime.Object) error {
	return r.beforeWrite(ctx, obj)
}

// BeforeDelete implements ResourceStorageAuthorizer.
func (r *ExternalGroupMappingAuthorizer) BeforeDelete(ctx context.Context, obj runtime.Object) error {
	return r.beforeWrite(ctx, obj)
}

// BeforeUpdate implements ResourceStorageAuthorizer.
func (r *ExternalGroupMappingAuthorizer) BeforeUpdate(ctx context.Context, obj runtime.Object) error {
	// Update is not supported for ExternalGroupMapping resources and update attempts are blocked at a lower level,
	// so this is just a safeguard.
	return apierrors.NewMethodNotSupported(iamv0.ExternalGroupMappingResourceInfo.GroupResource(), "PUT/PATCH")
}

func (r *ExternalGroupMappingAuthorizer) beforeWrite(ctx context.Context, obj runtime.Object) error {
	authInfo, ok := types.AuthInfoFrom(ctx)
	if !ok {
		return storewrapper.ErrUnauthenticated
	}

	concreteObj, ok := obj.(*iamv0.ExternalGroupMapping)
	if !ok {
		return apierrors.NewInternalError(fmt.Errorf("expected ExternalGroupMapping, got %T: %w", obj, storewrapper.ErrUnexpectedType))
	}

	teamName := concreteObj.Spec.TeamRef.Name
	checkReq := types.CheckRequest{
		Namespace: authInfo.GetNamespace(),
		Group:     iamv0.GROUP,
		Resource:  iamv0.TeamResourceInfo.GetName(),
		Verb:      utils.VerbSetPermissions,
		Name:      teamName,
	}

	res, err := r.accessClient.Check(ctx, authInfo, checkReq, "")
	if err != nil {
		return apierrors.NewInternalError(err)
	}

	if !res.Allowed {
		return apierrors.NewForbidden(
			iamv0.ExternalGroupMappingResourceInfo.GroupResource(),
			concreteObj.Name,
			fmt.Errorf("user cannot write team %s", teamName),
		)
	}
	return nil
}

// FilterList implements ResourceStorageAuthorizer.
func (r *ExternalGroupMappingAuthorizer) FilterList(ctx context.Context, list runtime.Object) (runtime.Object, error) {
	authInfo, ok := types.AuthInfoFrom(ctx)
	if !ok {
		return nil, storewrapper.ErrUnauthenticated
	}

	l, ok := list.(*iamv0.ExternalGroupMappingList)
	if !ok {
		return nil, apierrors.NewInternalError(fmt.Errorf("expected ExternalGroupMappingList, got %T: %w", list, storewrapper.ErrUnexpectedType))
	}

	var filteredItems []iamv0.ExternalGroupMapping

	listReq := types.ListRequest{
		Namespace: authInfo.GetNamespace(),
		Group:     iamv0.GROUP,
		Resource:  iamv0.TeamResourceInfo.GetName(),
		Verb:      utils.VerbGetPermissions,
	}
	canView, _, err := r.accessClient.Compile(ctx, authInfo, listReq)
	if err != nil {
		return nil, apierrors.NewInternalError(err)
	}

	for _, item := range l.Items {
		if canView(item.Spec.TeamRef.Name, "") {
			filteredItems = append(filteredItems, item)
		}
	}

	l.Items = filteredItems
	return l, nil
}
pkg/registry/apis/iam/authorizer/external_group_mapping_test.go (new file, 229 lines)

@@ -0,0 +1,229 @@
package authorizer

import (
	"context"
	"testing"

	"github.com/stretchr/testify/require"
	apierrors "k8s.io/apimachinery/pkg/api/errors"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"

	"github.com/grafana/authlib/types"
	iamv0 "github.com/grafana/grafana/apps/iam/pkg/apis/iam/v0alpha1"
	"github.com/grafana/grafana/pkg/apimachinery/utils"
)

func newExternalGroupMapping(teamName, name string) *iamv0.ExternalGroupMapping {
	return &iamv0.ExternalGroupMapping{
		ObjectMeta: metav1.ObjectMeta{Namespace: "org-2", Name: name},
		Spec: iamv0.ExternalGroupMappingSpec{
			TeamRef: iamv0.ExternalGroupMappingTeamRef{
				Name: teamName,
			},
		},
	}
}

func TestExternalGroupMapping_AfterGet(t *testing.T) {
	mapping := newExternalGroupMapping("team-1", "mapping-1")

	tests := []struct {
		name        string
		shouldAllow bool
	}{
		{
			name:        "allow access",
			shouldAllow: true,
		},
		{
			name:        "deny access",
			shouldAllow: false,
		},
	}
	for _, tt := range tests {
		t.Run(tt.name, func(t *testing.T) {
			checkFunc := func(id types.AuthInfo, req *types.CheckRequest, folder string) (types.CheckResponse, error) {
				require.NotNil(t, id)
				require.Equal(t, "user:u001", id.GetUID())
				require.Equal(t, "org-2", id.GetNamespace())

				require.Equal(t, "org-2", req.Namespace)
				require.Equal(t, iamv0.GROUP, req.Group)
				require.Equal(t, iamv0.TeamResourceInfo.GetName(), req.Resource)
				require.Equal(t, "team-1", req.Name)
				require.Equal(t, utils.VerbGetPermissions, req.Verb)
				require.Equal(t, "", folder)

				return types.CheckResponse{Allowed: tt.shouldAllow}, nil
			}

			accessClient := &fakeAccessClient{checkFunc: checkFunc}
			authz := NewExternalGroupMappingAuthorizer(accessClient)
			ctx := types.WithAuthInfo(context.Background(), user)

			err := authz.AfterGet(ctx, mapping)
			if tt.shouldAllow {
				require.NoError(t, err)
			} else {
				require.Error(t, err)
			}
			require.True(t, accessClient.checkCalled)
		})
	}
}

func TestExternalGroupMapping_FilterList(t *testing.T) {
	list := &iamv0.ExternalGroupMappingList{
		Items: []iamv0.ExternalGroupMapping{
			*newExternalGroupMapping("team-1", "mapping-1"),
			*newExternalGroupMapping("team-2", "mapping-2"),
		},
		ListMeta: metav1.ListMeta{
			SelfLink: "/apis/iam.grafana.app/v0alpha1/namespaces/org-2/externalgroupmappings",
		},
	}

	compileFunc := func(id types.AuthInfo, req types.ListRequest) (types.ItemChecker, types.Zookie, error) {
		require.NotNil(t, id)
		require.Equal(t, "user:u001", id.GetUID())
		require.Equal(t, "org-2", id.GetNamespace())

		require.Equal(t, "org-2", req.Namespace)
		require.Equal(t, iamv0.GROUP, req.Group)
		require.Equal(t, iamv0.TeamResourceInfo.GetName(), req.Resource)
		require.Equal(t, utils.VerbGetPermissions, req.Verb)

		return func(name, folder string) bool {
			return name == "team-1"
		}, &types.NoopZookie{}, nil
	}

	accessClient := &fakeAccessClient{compileFunc: compileFunc}
	authz := NewExternalGroupMappingAuthorizer(accessClient)
	ctx := types.WithAuthInfo(context.Background(), user)

	obj, err := authz.FilterList(ctx, list)
	require.NoError(t, err)
	require.NotNil(t, list)
	require.True(t, accessClient.compileCalled)

	filtered, ok := obj.(*iamv0.ExternalGroupMappingList)
	require.True(t, ok)
	require.Len(t, filtered.Items, 1)
	require.Equal(t, "mapping-1", filtered.Items[0].Name)
}

func TestExternalGroupMapping_BeforeCreate(t *testing.T) {
	mapping := newExternalGroupMapping("team-1", "mapping-1")

	tests := []struct {
		name        string
		shouldAllow bool
	}{
		{
			name:        "allow create",
			shouldAllow: true,
		},
		{
			name:        "deny create",
			shouldAllow: false,
		},
	}
	for _, tt := range tests {
		t.Run(tt.name, func(t *testing.T) {
			checkFunc := func(id types.AuthInfo, req *types.CheckRequest, folder string) (types.CheckResponse, error) {
				require.NotNil(t, id)
				require.Equal(t, "user:u001", id.GetUID())
				require.Equal(t, "org-2", id.GetNamespace())

				require.Equal(t, "org-2", req.Namespace)
				require.Equal(t, iamv0.GROUP, req.Group)
				require.Equal(t, iamv0.TeamResourceInfo.GetName(), req.Resource)
				require.Equal(t, "team-1", req.Name)
				require.Equal(t, utils.VerbSetPermissions, req.Verb)
				require.Equal(t, "", folder)

				return types.CheckResponse{Allowed: tt.shouldAllow}, nil
			}

			accessClient := &fakeAccessClient{checkFunc: checkFunc}
			authz := NewExternalGroupMappingAuthorizer(accessClient)
			ctx := types.WithAuthInfo(context.Background(), user)

			err := authz.BeforeCreate(ctx, mapping)
			if tt.shouldAllow {
				require.NoError(t, err)
			} else {
				require.Error(t, err)
			}
			require.True(t, accessClient.checkCalled)
		})
	}
}

func TestExternalGroupMapping_BeforeUpdate(t *testing.T) {
	mapping := newExternalGroupMapping("team-1", "mapping-1")

	accessClient := &fakeAccessClient{
		checkFunc: func(id types.AuthInfo, req *types.CheckRequest, folder string) (types.CheckResponse, error) {
			require.Fail(t, "check should not be called")
			return types.CheckResponse{}, nil
		},
	}
	authz := NewExternalGroupMappingAuthorizer(accessClient)
	ctx := types.WithAuthInfo(context.Background(), user)

	err := authz.BeforeUpdate(ctx, mapping)
	require.Error(t, err)
	require.True(t, apierrors.IsMethodNotSupported(err))
	require.Contains(t, err.Error(), "PUT/PATCH")
	require.False(t, accessClient.checkCalled)
}

func TestExternalGroupMapping_BeforeDelete(t *testing.T) {
	mapping := newExternalGroupMapping("team-1", "mapping-1")

	tests := []struct {
		name        string
		shouldAllow bool
	}{
		{
			name:        "allow delete",
			shouldAllow: true,
		},
		{
			name:        "deny delete",
			shouldAllow: false,
		},
	}
	for _, tt := range tests {
		t.Run(tt.name, func(t *testing.T) {
			checkFunc := func(id types.AuthInfo, req *types.CheckRequest, folder string) (types.CheckResponse, error) {
				require.NotNil(t, id)
				require.Equal(t, "user:u001", id.GetUID())
				require.Equal(t, "org-2", id.GetNamespace())

				require.Equal(t, "org-2", req.Namespace)
				require.Equal(t, iamv0.GROUP, req.Group)
				require.Equal(t, iamv0.TeamResourceInfo.GetName(), req.Resource)
				require.Equal(t, "team-1", req.Name)
				require.Equal(t, utils.VerbSetPermissions, req.Verb)
				require.Equal(t, "", folder)

				return types.CheckResponse{Allowed: tt.shouldAllow}, nil
			}

			accessClient := &fakeAccessClient{checkFunc: checkFunc}
			authz := NewExternalGroupMappingAuthorizer(accessClient)
			ctx := types.WithAuthInfo(context.Background(), user)

			err := authz.BeforeDelete(ctx, mapping)
			if tt.shouldAllow {
				require.NoError(t, err)
			} else {
				require.Error(t, err)
			}
			require.True(t, accessClient.checkCalled)
		})
	}
}
@@ -170,42 +170,56 @@ func (r *ResourcePermissionsAuthorizer) FilterList(ctx context.Context, list run
	if !ok {
		return nil, storewrapper.ErrUnauthenticated
	}
+	r.logger.Debug("filtering resource permissions list with auth info",
+		"namespace", authInfo.GetNamespace(),
+		"identity Subject", authInfo.GetSubject(),
+		"identity UID", authInfo.GetUID(),
+		"identity type", authInfo.GetIdentityType(),
+	)

	switch l := list.(type) {
	case *iamv0.ResourcePermissionList:
+		r.logger.Debug("filtering list of length", "length", len(l.Items))
		var (
			filteredItems []iamv0.ResourcePermission
			err           error
			canViewFuncs  = map[schema.GroupResource]types.ItemChecker{}
		)
		for _, item := range l.Items {
-			gr := schema.GroupResource{
-				Group:    item.Spec.Resource.ApiGroup,
-				Resource: item.Spec.Resource.Resource,
-			}
+			target := item.Spec.Resource
+			targetGR := schema.GroupResource{Group: target.ApiGroup, Resource: target.Resource}
+
+			r.logger.Debug("target resource",
+				"group", target.ApiGroup,
+				"resource", target.Resource,
+				"name", target.Name,
+			)

			// Reuse the same canView for items with the same resource
-			canView, found := canViewFuncs[gr]
+			canView, found := canViewFuncs[targetGR]

			if !found {
				listReq := types.ListRequest{
					Namespace: item.Namespace,
-					Group:     item.Spec.Resource.ApiGroup,
-					Resource:  item.Spec.Resource.Resource,
+					Group:     target.ApiGroup,
+					Resource:  target.Resource,
					Verb:      utils.VerbGetPermissions,
				}

+				r.logger.Debug("compiling list request",
+					"namespace", item.Namespace,
+					"group", target.ApiGroup,
+					"resource", target.Resource,
+					"verb", utils.VerbGetPermissions,
+				)
				canView, _, err = r.accessClient.Compile(ctx, authInfo, listReq)
				if err != nil {
					return nil, err
				}

-				canViewFuncs[gr] = canView
+				canViewFuncs[targetGR] = canView
			}

-			target := item.Spec.Resource
-			targetGR := schema.GroupResource{Group: target.ApiGroup, Resource: target.Resource}
-
			parent := ""
			// Fetch the parent of the resource
			// It's not efficient to do for every item in the list, but it's a good starting point.

@@ -223,6 +237,13 @@ func (r *ResourcePermissionsAuthorizer) FilterList(ctx context.Context, list run
				)
				continue
			}
+			r.logger.Debug("fetched parent",
+				"parent", p,
+				"namespace", item.Namespace,
+				"group", target.ApiGroup,
+				"resource", target.Resource,
+				"name", target.Name,
+			)
			parent = p
		}
@@ -4,35 +4,15 @@ import (
	"context"
	"testing"

-	"github.com/go-jose/go-jose/v4/jwt"
	"github.com/stretchr/testify/require"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/apimachinery/pkg/runtime/schema"

-	"github.com/grafana/authlib/authn"
	"github.com/grafana/authlib/types"
	iamv0 "github.com/grafana/grafana/apps/iam/pkg/apis/iam/v0alpha1"
-	"github.com/grafana/grafana/pkg/apimachinery/identity"
	"github.com/grafana/grafana/pkg/apimachinery/utils"
)

-var (
-	user = authn.NewIDTokenAuthInfo(
-		authn.Claims[authn.AccessTokenClaims]{
-			Claims: jwt.Claims{Issuer: "grafana",
-				Subject: types.NewTypeID(types.TypeAccessPolicy, "grafana"), Audience: []string{"iam.grafana.app"}},
-			Rest: authn.AccessTokenClaims{
-				Namespace:            "*",
-				Permissions:          identity.ServiceIdentityClaims.Rest.Permissions,
-				DelegatedPermissions: identity.ServiceIdentityClaims.Rest.DelegatedPermissions,
-			},
-		}, &authn.Claims[authn.IDTokenClaims]{
-			Claims: jwt.Claims{Subject: types.NewTypeID(types.TypeUser, "u001")},
-			Rest:   authn.IDTokenClaims{Namespace: "org-2", Identifier: "u001", Type: types.TypeUser},
-		},
-	)
-)
-
func newResourcePermission(apiGroup, resource, name string) *iamv0.ResourcePermission {
	return &iamv0.ResourcePermission{
		ObjectMeta: metav1.ObjectMeta{Namespace: "org-2"},

@@ -222,26 +202,6 @@ func TestResourcePermissions_beforeWrite(t *testing.T) {
	}
}

-// fakeAccessClient is a mock implementation of claims.AccessClient
-type fakeAccessClient struct {
-	checkCalled bool
-	checkFunc   func(id types.AuthInfo, req *types.CheckRequest, folder string) (types.CheckResponse, error)
-
-	compileCalled bool
-	compileFunc   func(id types.AuthInfo, req types.ListRequest) (types.ItemChecker, types.Zookie, error)
-}
-
-func (m *fakeAccessClient) Check(ctx context.Context, id types.AuthInfo, req types.CheckRequest, folder string) (types.CheckResponse, error) {
-	m.checkCalled = true
-	return m.checkFunc(id, &req, folder)
-}
-
-func (m *fakeAccessClient) Compile(ctx context.Context, id types.AuthInfo, req types.ListRequest) (types.ItemChecker, types.Zookie, error) {
-	m.compileCalled = true
-	return m.compileFunc(id, req)
-}
-
-var _ types.AccessClient = (*fakeAccessClient)(nil)
-
type fakeParentProvider struct {
	hasParent       bool
	getParentCalled bool
pkg/registry/apis/iam/authorizer/testutil.go (new file, 48 lines)

@@ -0,0 +1,48 @@
package authorizer

import (
	"context"

	"github.com/go-jose/go-jose/v4/jwt"
	"github.com/grafana/authlib/authn"
	"github.com/grafana/authlib/types"
	"github.com/grafana/grafana/pkg/apimachinery/identity"
)

var (
	// Shared test user identity
	user = authn.NewIDTokenAuthInfo(
		authn.Claims[authn.AccessTokenClaims]{
			Claims: jwt.Claims{Issuer: "grafana",
				Subject: types.NewTypeID(types.TypeAccessPolicy, "grafana"), Audience: []string{"iam.grafana.app"}},
			Rest: authn.AccessTokenClaims{
				Namespace:            "*",
				Permissions:          identity.ServiceIdentityClaims.Rest.Permissions,
				DelegatedPermissions: identity.ServiceIdentityClaims.Rest.DelegatedPermissions,
			},
		}, &authn.Claims[authn.IDTokenClaims]{
			Claims: jwt.Claims{Subject: types.NewTypeID(types.TypeUser, "u001")},
			Rest:   authn.IDTokenClaims{Namespace: "org-2", Identifier: "u001", Type: types.TypeUser},
		},
	)
)

var _ types.AccessClient = (*fakeAccessClient)(nil)

// fakeAccessClient is a mock implementation of claims.AccessClient
type fakeAccessClient struct {
	checkCalled bool
	checkFunc   func(id types.AuthInfo, req *types.CheckRequest, folder string) (types.CheckResponse, error)

	compileCalled bool
	compileFunc   func(id types.AuthInfo, req types.ListRequest) (types.ItemChecker, types.Zookie, error)
}

func (m *fakeAccessClient) Check(ctx context.Context, id types.AuthInfo, req types.CheckRequest, folder string) (types.CheckResponse, error) {
	m.checkCalled = true
	return m.checkFunc(id, &req, folder)
}

func (m *fakeAccessClient) Compile(ctx context.Context, id types.AuthInfo, req types.ListRequest) (types.ItemChecker, types.Zookie, error) {
	m.compileCalled = true
	return m.compileFunc(id, req)
}
@@ -353,7 +353,8 @@ func (b *IdentityAccessManagementAPIBuilder) UpdateAPIGroupInfo(apiGroupInfo *ge
	if err != nil {
		return err
	}
	storage[extGroupMappingResource.StoragePath()] = extGroupMappingUniStore
	var extGroupMappingStore storewrapper.K8sStorage = extGroupMappingUniStore

	if b.externalGroupMappingStorage != nil {
		extGroupMappingLegacyStore, err := NewLocalStore(extGroupMappingResource, apiGroupInfo.Scheme, opts.OptsGetter, b.reg, b.accessClient, b.externalGroupMappingStorage)
@@ -365,9 +366,17 @@ func (b *IdentityAccessManagementAPIBuilder) UpdateAPIGroupInfo(apiGroupInfo *ge
		if err != nil {
			return err
		}
		storage[extGroupMappingResource.StoragePath()] = dw
		var ok bool
		extGroupMappingStore, ok = dw.(storewrapper.K8sStorage)
		if !ok {
			return fmt.Errorf("expected storewrapper.K8sStorage, got %T", dw)
		}
	}

	authzWrapper := storewrapper.New(extGroupMappingStore, iamauthorizer.NewExternalGroupMappingAuthorizer(b.accessClient))
	storage[extGroupMappingResource.StoragePath()] = authzWrapper

	//nolint:staticcheck // not yet migrated to OpenFeature
	if b.features.IsEnabledGlobally(featuremgmt.FlagKubernetesAuthzApis) {
		// v0alpha1
@@ -182,25 +182,6 @@ func newFolderTranslation() translation {
	return folderTranslation
}

func newExternalGroupMappingTranslation() translation {
	return translation{
		resource:  "teams.permissions",
		attribute: "uid",
		verbMapping: map[string]string{
			utils.VerbGet:            "teams.permissions:read",
			utils.VerbList:           "teams.permissions:read",
			utils.VerbWatch:          "teams.permissions:read",
			utils.VerbCreate:         "teams.permissions:write",
			utils.VerbUpdate:         "teams.permissions:write",
			utils.VerbPatch:          "teams.permissions:write",
			utils.VerbDelete:         "teams.permissions:write",
			utils.VerbGetPermissions: "teams.permissions:write",
			utils.VerbSetPermissions: "teams.permissions:write",
		},
		folderSupport: false,
	}
}

func NewMapperRegistry() MapperRegistry {
	skipScopeOnAllVerbs := map[string]bool{
		utils.VerbCreate: true,
@@ -229,8 +210,6 @@ func NewMapperRegistry() MapperRegistry {
		"serviceaccounts": newResourceTranslation("serviceaccounts", "uid", false, map[string]bool{utils.VerbCreate: true}),
		// Teams is a special case. We translate user permissions from id to uid based.
		"teams": newResourceTranslation("teams", "uid", false, map[string]bool{utils.VerbCreate: true}),
		// ExternalGroupMappings is a special case. We translate team permissions from id to uid based.
		"externalgroupmappings": newExternalGroupMappingTranslation(),
		"coreroles": translation{
			resource:  "roles",
			attribute: "uid",
@@ -90,7 +90,7 @@ func ProvideZanzanaClient(cfg *setting.Cfg, db db.DB, tracer tracing.Tracer, fea
	authzv1.RegisterAuthzServiceServer(channel, srv)
	authzextv1.RegisterAuthzExtentionServiceServer(channel, srv)

	client, err := zClient.New(channel)
	client, err := zClient.New(channel, reg)
	if err != nil {
		return nil, fmt.Errorf("failed to initialize zanzana client: %w", err)
	}
@@ -169,7 +169,7 @@ func NewRemoteZanzanaClient(cfg ZanzanaClientConfig, reg prometheus.Registerer)
		return nil, fmt.Errorf("failed to create zanzana client to remote server: %w", err)
	}

	client, err := zClient.New(conn)
	client, err := zClient.New(conn, reg)
	if err != nil {
		return nil, fmt.Errorf("failed to initialize zanzana client: %w", err)
	}
@@ -9,6 +9,7 @@ import (
	authzlib "github.com/grafana/authlib/authz"
	authzv1 "github.com/grafana/authlib/authz/proto/v1"
	authlib "github.com/grafana/authlib/types"
	"github.com/prometheus/client_golang/prometheus"

	"github.com/grafana/grafana/pkg/infra/log"
	authzextv1 "github.com/grafana/grafana/pkg/services/authz/proto/v1"
@@ -25,15 +26,17 @@ type Client struct {
	authz          authzv1.AuthzServiceClient
	authzext       authzextv1.AuthzExtentionServiceClient
	authzlibclient *authzlib.ClientImpl
	metrics        *clientMetrics
}

func New(cc grpc.ClientConnInterface) (*Client, error) {
func New(cc grpc.ClientConnInterface, reg prometheus.Registerer) (*Client, error) {
	authzlibclient := authzlib.NewClient(cc, authzlib.WithTracerClientOption(tracer))
	c := &Client{
		authzlibclient: authzlibclient,
		authz:          authzv1.NewAuthzServiceClient(cc),
		authzext:       authzextv1.NewAuthzExtentionServiceClient(cc),
		logger:         log.New("zanzana.client"),
		metrics:        newClientMetrics(reg),
	}

	return c, nil
@@ -43,6 +46,9 @@ func (c *Client) Check(ctx context.Context, id authlib.AuthInfo, req authlib.Che
	ctx, span := tracer.Start(ctx, "authlib.zanzana.client.Check")
	defer span.End()

	timer := prometheus.NewTimer(c.metrics.requestDurationSeconds.WithLabelValues("Check", req.Namespace))
	defer timer.ObserveDuration()

	return c.authzlibclient.Check(ctx, id, req, folder)
}

@@ -50,6 +56,9 @@ func (c *Client) Compile(ctx context.Context, id authlib.AuthInfo, req authlib.L
	ctx, span := tracer.Start(ctx, "authlib.zanzana.client.Compile")
	defer span.End()

	timer := prometheus.NewTimer(c.metrics.requestDurationSeconds.WithLabelValues("Compile", req.Namespace))
	defer timer.ObserveDuration()

	return c.authzlibclient.Compile(ctx, id, req)
}

@@ -64,6 +73,9 @@ func (c *Client) Write(ctx context.Context, req *authzextv1.WriteRequest) error
	ctx, span := tracer.Start(ctx, "authlib.zanzana.client.Write")
	defer span.End()

	timer := prometheus.NewTimer(c.metrics.requestDurationSeconds.WithLabelValues("Write", req.Namespace))
	defer timer.ObserveDuration()

	_, err := c.authzext.Write(ctx, req)
	return err
}
@@ -72,6 +84,9 @@ func (c *Client) BatchCheck(ctx context.Context, req *authzextv1.BatchCheckReque
	ctx, span := tracer.Start(ctx, "authlib.zanzana.client.Check")
	defer span.End()

	timer := prometheus.NewTimer(c.metrics.requestDurationSeconds.WithLabelValues("BatchCheck", req.Namespace))
	defer timer.ObserveDuration()

	return c.authzext.BatchCheck(ctx, req)
}

@@ -87,6 +102,9 @@ func (c *Client) Mutate(ctx context.Context, req *authzextv1.MutateRequest) erro
	ctx, span := tracer.Start(ctx, "authlib.zanzana.client.Mutate")
	defer span.End()

	timer := prometheus.NewTimer(c.metrics.requestDurationSeconds.WithLabelValues("Mutate", req.Namespace))
	defer timer.ObserveDuration()

	_, err := c.authzext.Mutate(ctx, req)
	return err
}
@@ -95,5 +113,8 @@ func (c *Client) Query(ctx context.Context, req *authzextv1.QueryRequest) (*auth
	ctx, span := tracer.Start(ctx, "authlib.zanzana.client.Query")
	defer span.End()

	timer := prometheus.NewTimer(c.metrics.requestDurationSeconds.WithLabelValues("Query", req.Namespace))
	defer timer.ObserveDuration()

	return c.authzext.Query(ctx, req)
}
@@ -7,10 +7,10 @@ import (

const (
	metricsNamespace = "iam"
	metricsSubSystem = "authz_zanzana"
	metricsSubSystem = "authz_zanzana_client"
)

type metrics struct {
type shadowClientMetrics struct {
	// evaluationsSeconds is a summary for evaluating access for a specific engine (RBAC and zanzana)
	evaluationsSeconds *prometheus.HistogramVec
	// compileSeconds is a summary for compiling item checker for a specific engine (RBAC and zanzana)
@@ -19,8 +19,13 @@ type metrics struct {
	evaluationStatusTotal *prometheus.CounterVec
}

func newShadowClientMetrics(reg prometheus.Registerer) *metrics {
	return &metrics{
type clientMetrics struct {
	// requestDurationSeconds is a summary for zanzana client request duration
	requestDurationSeconds *prometheus.HistogramVec
}

func newShadowClientMetrics(reg prometheus.Registerer) *shadowClientMetrics {
	return &shadowClientMetrics{
		evaluationsSeconds: promauto.With(reg).NewHistogramVec(
			prometheus.HistogramOpts{
				Name: "engine_evaluations_seconds",
@@ -52,3 +57,18 @@ func newShadowClientMetrics(reg prometheus.Registerer) *metrics {
		),
	}
}

func newClientMetrics(reg prometheus.Registerer) *clientMetrics {
	return &clientMetrics{
		requestDurationSeconds: promauto.With(reg).NewHistogramVec(
			prometheus.HistogramOpts{
				Name:      "request_duration_seconds",
				Help:      "Histogram for zanzana client request duration",
				Namespace: metricsNamespace,
				Subsystem: metricsSubSystem,
				Buckets:   prometheus.ExponentialBuckets(0.00001, 4, 10),
			},
			[]string{"method", "request_namespace"},
		),
	}
}
@@ -20,7 +20,7 @@ type ShadowClient struct {
	logger        log.Logger
	accessClient  authlib.AccessClient
	zanzanaClient authlib.AccessClient
	metrics       *metrics
	metrics       *shadowClientMetrics
}

// WithShadowClient returns a new access client that runs zanzana checks in the background.
@@ -322,6 +322,13 @@ var (
		Owner:           grafanaOperatorExperienceSquad,
		RequiresRestart: true,
	},
	{
		Name:         "reportingCsvEncodingOptions",
		Description:  "Enables CSV encoding options in the reporting feature",
		Stage:        FeatureStageExperimental,
		FrontendOnly: false,
		Owner:        grafanaOperatorExperienceSquad,
	},
	{
		Name:        "sseGroupByDatasource",
		Description: "Send query to the same datasource in a single request when using server side expressions. The `cloudWatchBatchQueries` feature toggle should be enabled if this used with CloudWatch.",
@@ -1283,13 +1290,6 @@ var (
		Owner:      grafanaPartnerPluginsSquad,
		Expression: "false",
	},
	{
		Name:         "unifiedHistory",
		Description:  "Displays the navigation history so the user can navigate back to previous pages",
		Stage:        FeatureStageExperimental,
		Owner:        grafanaFrontendSearchNavOrganise,
		FrontendOnly: true,
	},
	{
		// Remove this flag once Loki v4 is released and the min supported version is v3.0+,
		// since users on v2.9 need it to disable the feature, as it doesn't work for them.

2  pkg/services/featuremgmt/toggles_gen.csv  generated
@@ -43,6 +43,7 @@ configurableSchedulerTick,experimental,@grafana/alerting-squad,false,true,false
dashgpt,GA,@grafana/dashboards-squad,false,false,true
aiGeneratedDashboardChanges,experimental,@grafana/dashboards-squad,false,false,true
reportingRetries,preview,@grafana/grafana-operator-experience-squad,false,true,false
reportingCsvEncodingOptions,experimental,@grafana/grafana-operator-experience-squad,false,false,false
sseGroupByDatasource,experimental,@grafana/grafana-datasources-core-services,false,false,false
lokiRunQueriesInParallel,privatePreview,@grafana/observability-logs,false,false,false
externalServiceAccounts,preview,@grafana/identity-access-team,false,false,false
@@ -177,7 +178,6 @@ alertingAIAnalyzeCentralStateHistory,experimental,@grafana/alerting-squad,false,
alertingNotificationsStepMode,GA,@grafana/alerting-squad,false,false,true
unifiedStorageSearchUI,experimental,@grafana/search-and-storage,false,false,false
elasticsearchCrossClusterSearch,GA,@grafana/partner-datasources,false,false,false
unifiedHistory,experimental,@grafana/grafana-search-navigate-organise,false,false,true
lokiLabelNamesQueryApi,GA,@grafana/observability-logs,false,false,false
k8SFolderCounts,experimental,@grafana/search-and-storage,false,false,false
k8SFolderMove,experimental,@grafana/search-and-storage,false,false,false
4  pkg/services/featuremgmt/toggles_gen.go  generated

@@ -135,6 +135,10 @@ const (
	// Enables rendering retries for the reporting feature
	FlagReportingRetries = "reportingRetries"

	// FlagReportingCsvEncodingOptions
	// Enables CSV encoding options in the reporting feature
	FlagReportingCsvEncodingOptions = "reportingCsvEncodingOptions"

	// FlagSseGroupByDatasource
	// Send query to the same datasource in a single request when using server side expressions. The `cloudWatchBatchQueries` feature toggle should be enabled if this used with CloudWatch.
	FlagSseGroupByDatasource = "sseGroupByDatasource"
20  pkg/services/featuremgmt/toggles_gen.json  generated

@@ -3137,6 +3137,18 @@
			"hideFromDocs": true
		}
	},
	{
		"metadata": {
			"name": "reportingCsvEncodingOptions",
			"resourceVersion": "1766080709938",
			"creationTimestamp": "2025-12-18T17:58:29Z"
		},
		"spec": {
			"description": "Enables CSV encoding options in the reporting feature",
			"stage": "experimental",
			"codeowner": "@grafana/grafana-operator-experience-squad"
		}
	},
	{
		"metadata": {
			"name": "reportingRetries",
@@ -3572,8 +3584,12 @@
	{
		"metadata": {
			"name": "unifiedHistory",
			"resourceVersion": "1764664939750",
			"creationTimestamp": "2024-12-13T10:41:18Z"
			"resourceVersion": "1762958248290",
			"creationTimestamp": "2024-12-13T10:41:18Z",
			"deletionTimestamp": "2025-11-13T16:25:53Z",
			"annotations": {
				"grafana.app/updatedTimestamp": "2025-11-12 14:37:28.29086 +0000 UTC"
			}
		},
		"spec": {
			"description": "Displays the navigation history so the user can navigate back to previous pages",
@@ -440,7 +440,7 @@ func (s *ServiceImpl) buildAlertNavLinks(c *contextmodel.ReqContext) *navtree.Na
	if s.features.IsEnabled(c.Req.Context(), featuremgmt.FlagAlertingTriage) {
		if hasAccess(ac.EvalAny(ac.EvalPermission(ac.ActionAlertingRuleRead), ac.EvalPermission(ac.ActionAlertingRuleExternalRead))) {
			alertChildNavs = append(alertChildNavs, &navtree.NavLink{
				Text: "Alerts", SubTitle: "Visualize active and pending alerts", Id: "alert-alerts", Url: s.cfg.AppSubURL + "/alerting/alerts", Icon: "bell", IsNew: true,
				Text: "Alert activity", SubTitle: "Visualize active and pending alerts", Id: "alert-alerts", Url: s.cfg.AppSubURL + "/alerting/alerts", Icon: "bell", IsNew: true,
			})
		}
	}
@@ -637,6 +637,8 @@ type UnifiedStorageConfig struct {
	// EnableMigration indicates whether migration is enabled for the resource.
	// If not set, will use the default from MigratedUnifiedResources.
	EnableMigration bool
	// AutoMigrationThreshold is the threshold below which a resource is automatically migrated.
	AutoMigrationThreshold int
}

type InstallPlugin struct {
@@ -8,6 +8,10 @@ import (
	"github.com/grafana/grafana/pkg/util/osutil"
)

// DefaultAutoMigrationThreshold is the default threshold for auto migration switching.
// If a resource has entries at or below this count, it will be migrated.
const DefaultAutoMigrationThreshold = 10

const (
	PlaylistResource = "playlists.playlist.grafana.app"
	FolderResource   = "folders.folder.grafana.app"
@@ -21,6 +25,13 @@ var MigratedUnifiedResources = map[string]bool{
	DashboardResource: false,
}

// AutoMigratedUnifiedResources maps resources that support auto-migration
// TODO: remove this before Grafana 13 GA: https://github.com/grafana/search-and-storage-team/issues/613
var AutoMigratedUnifiedResources = map[string]bool{
	FolderResource:    true,
	DashboardResource: true,
}

// read storage configs from ini file. They look like:
// [unified_storage.<group>.<resource>]
// <field> = <value>
@@ -59,6 +70,13 @@ func (cfg *Cfg) setUnifiedStorageConfig() {
			enableMigration = section.Key("enableMigration").MustBool(MigratedUnifiedResources[resourceName])
		}

		// parse autoMigrationThreshold from resource section
		autoMigrationThreshold := 0
		autoMigrate := AutoMigratedUnifiedResources[resourceName]
		if autoMigrate {
			autoMigrationThreshold = section.Key("autoMigrationThreshold").MustInt(DefaultAutoMigrationThreshold)
		}

		storageConfig[resourceName] = UnifiedStorageConfig{
			DualWriterMode:                       rest.DualWriterMode(dualWriterMode),
			DualWriterPeriodicDataSyncJobEnabled: dualWriterPeriodicDataSyncJobEnabled,
@@ -66,6 +84,7 @@ func (cfg *Cfg) setUnifiedStorageConfig() {
			DataSyncerRecordsLimit: dataSyncerRecordsLimit,
			DataSyncerInterval:     dataSyncerInterval,
			EnableMigration:        enableMigration,
			AutoMigrationThreshold: autoMigrationThreshold,
		}
	}
	cfg.UnifiedStorage = storageConfig
@@ -73,13 +92,13 @@ func (cfg *Cfg) setUnifiedStorageConfig() {
	// Set indexer config for unified storage
	section := cfg.Raw.Section("unified_storage")
	cfg.DisableDataMigrations = section.Key("disable_data_migrations").MustBool(false)
	if !cfg.DisableDataMigrations && cfg.getUnifiedStorageType() == "unified" {
	if !cfg.DisableDataMigrations && cfg.UnifiedStorageType() == "unified" {
		// Helper log to find instances running migrations in the future
		cfg.Logger.Info("Unified migration configs enforced")
		cfg.enforceMigrationToUnifiedConfigs()
	} else {
		// Helper log to find instances disabling migration
		cfg.Logger.Info("Unified migration configs enforcement disabled", "storage_type", cfg.getUnifiedStorageType(), "disable_data_migrations", cfg.DisableDataMigrations)
		cfg.Logger.Info("Unified migration configs enforcement disabled", "storage_type", cfg.UnifiedStorageType(), "disable_data_migrations", cfg.DisableDataMigrations)
	}
	cfg.EnableSearch = section.Key("enable_search").MustBool(false)
	cfg.MaxPageSizeBytes = section.Key("max_page_size_bytes").MustInt(0)
@@ -147,14 +166,15 @@ func (cfg *Cfg) enforceMigrationToUnifiedConfigs() {
			DualWriterMode:                      5,
			DualWriterMigrationDataSyncDisabled: true,
			EnableMigration:                     true,
			AutoMigrationThreshold:              resourceCfg.AutoMigrationThreshold,
		}
	}
}

// getUnifiedStorageType returns the configured storage type without creating or mutating keys.
// UnifiedStorageType returns the configured storage type without creating or mutating keys.
// Precedence: env > ini > default ("unified").
// Used to decide unified storage behavior early without side effects.
func (cfg *Cfg) getUnifiedStorageType() string {
func (cfg *Cfg) UnifiedStorageType() string {
	const (
		grafanaAPIServerSectionName = "grafana-apiserver"
		storageTypeKeyName          = "storage_type"
@@ -168,3 +188,23 @@ func (cfg *Cfg) getUnifiedStorageType() string {
	}
	return defaultStorageType
}

// UnifiedStorageConfig returns the UnifiedStorageConfig for a resource.
func (cfg *Cfg) UnifiedStorageConfig(resource string) UnifiedStorageConfig {
	if cfg.UnifiedStorage == nil {
		return UnifiedStorageConfig{}
	}
	return cfg.UnifiedStorage[resource]
}

// EnableMode5 enables migration and sets mode 5 for a resource.
func (cfg *Cfg) EnableMode5(resource string) {
	if cfg.UnifiedStorage == nil {
		cfg.UnifiedStorage = make(map[string]UnifiedStorageConfig)
	}
	config := cfg.UnifiedStorage[resource]
	config.DualWriterMode = rest.Mode5
	config.DualWriterMigrationDataSyncDisabled = true
	config.EnableMigration = true
	cfg.UnifiedStorage[resource] = config
}
@@ -43,10 +43,16 @@ func TestCfg_setUnifiedStorageConfig(t *testing.T) {
		}
		assert.Equal(t, exists, true, migratedResource)

		expectedThreshold := 0
		if AutoMigratedUnifiedResources[migratedResource] {
			expectedThreshold = DefaultAutoMigrationThreshold
		}

		assert.Equal(t, UnifiedStorageConfig{
			DualWriterMode:                      5,
			DualWriterMigrationDataSyncDisabled: true,
			EnableMigration:                     isEnabled,
			AutoMigrationThreshold:              expectedThreshold,
		}, resourceCfg, migratedResource)
	}
}
@@ -71,6 +77,7 @@ func TestCfg_setUnifiedStorageConfig(t *testing.T) {
		DualWriterPeriodicDataSyncJobEnabled: true,
		DataSyncerRecordsLimit:               1001,
		DataSyncerInterval:                   time.Minute * 10,
		AutoMigrationThreshold:               0,
	})

	validateMigratedResources(false)
@@ -214,8 +214,18 @@ func runMigrationTestSuite(t *testing.T, testCases []resourceMigratorTestCase) {

	for _, state := range testStates {
		t.Run(state.tc.name(), func(t *testing.T) {
			// Verify resources now exist in unified storage after migration
			state.tc.verify(t, helper, true)
			shouldExist := true
			for _, gvr := range state.tc.resources() {
				resourceKey := fmt.Sprintf("%s.%s", gvr.Resource, gvr.Group)
				// Resources exist if they're either:
				// 1. In MigratedUnifiedResources (enabled by default), OR
				// 2. In AutoMigratedUnifiedResources (auto-migrated because count is below threshold)
				if !setting.MigratedUnifiedResources[resourceKey] && !setting.AutoMigratedUnifiedResources[resourceKey] {
					shouldExist = false
					break
				}
			}
			state.tc.verify(t, helper, shouldExist)
		})
	}

@@ -270,7 +280,7 @@ const (

var migrationIDsToDefault = map[string]bool{
	playlistsID:            true,
	foldersAndDashboardsID: false,
	foldersAndDashboardsID: true, // Auto-migrated when resource count is below threshold
}

func verifyRegisteredMigrations(t *testing.T, helper *apis.K8sTestHelper, onlyDefault bool, optOut bool) {
@@ -10,9 +10,11 @@ import (
	"github.com/grafana/grafana/pkg/infra/log"
	"github.com/grafana/grafana/pkg/registry/apis/dashboard/legacy"
	"github.com/grafana/grafana/pkg/services/sqlstore/migrator"
	"github.com/grafana/grafana/pkg/setting"
	"github.com/grafana/grafana/pkg/storage/unified/resource"
	"github.com/grafana/grafana/pkg/storage/unified/resourcepb"
	"github.com/grafana/grafana/pkg/util/xorm"
	"github.com/grafana/grafana/pkg/util/xorm/core"
	"k8s.io/apimachinery/pkg/runtime/schema"
)

@@ -31,6 +33,20 @@ type ResourceMigration struct {
	migrationID string
	validators  []Validator // Optional: custom validation logic for this migration
	log         log.Logger
	cfg         *setting.Cfg
	autoMigrate bool // If true, auto-migrate resource if count is below threshold
	hadErrors   bool // Tracks if errors occurred during migration (used with ignoreErrors)
}

// ResourceMigrationOption is a functional option for configuring ResourceMigration.
type ResourceMigrationOption func(*ResourceMigration)

// WithAutoMigrate configures the migration to auto-migrate resource if count is below threshold.
func WithAutoMigrate(cfg *setting.Cfg) ResourceMigrationOption {
	return func(m *ResourceMigration) {
		m.cfg = cfg
		m.autoMigrate = true
	}
}

// NewResourceMigration creates a new migration for the specified resources.
@@ -39,14 +55,24 @@ func NewResourceMigration(
	resources []schema.GroupResource,
	migrationID string,
	validators []Validator,
	opts ...ResourceMigrationOption,
) *ResourceMigration {
	return &ResourceMigration{
	m := &ResourceMigration{
		migrator:    migrator,
		resources:   resources,
		migrationID: migrationID,
		validators:  validators,
		log:         log.New("storage.unified.resource_migration." + migrationID),
	}
	for _, opt := range opts {
		opt(m)
	}
	return m
}

func (m *ResourceMigration) SkipMigrationLog() bool {
	// Skip populating the log table if auto-migrate is enabled and errors occurred
	return m.autoMigrate && m.hadErrors
}

var _ migrator.CodeMigration = (*ResourceMigration)(nil)
@@ -57,7 +83,23 @@ func (m *ResourceMigration) SQL(_ migrator.Dialect) string {
}

// Exec implements migrator.CodeMigration interface. Executes the migration across all organizations.
func (m *ResourceMigration) Exec(sess *xorm.Session, mg *migrator.Migrator) error {
func (m *ResourceMigration) Exec(sess *xorm.Session, mg *migrator.Migrator) (err error) {
	// Track any errors that occur during migration
	defer func() {
		if err != nil {
			if m.autoMigrate {
				m.log.Warn(
					`[WARN] Resource migration failed and is currently skipped.
This migration will be enforced in the next major Grafana release, where failures will block startup or resource loading.

This warning is intended to help you detect and report issues early.
Please investigate the failure and report it to the Grafana team so it can be addressed before the next major release.`,
					"error", err)
			}
			m.hadErrors = true
		}
	}()

	ctx := context.Background()

	orgs, err := m.getAllOrgs(sess)
@@ -75,7 +117,8 @@ func (m *ResourceMigration) Exec(sess *xorm.Session, mg *migrator.Migrator) erro

	if mg.Dialect.DriverName() == migrator.SQLite {
		// reuse transaction in SQLite to avoid "database is locked" errors
		tx, err := sess.Tx()
		var tx *core.Tx
		tx, err = sess.Tx()
		if err != nil {
			m.log.Error("Failed to get transaction from session", "error", err)
			return fmt.Errorf("failed to get transaction: %w", err)
@@ -85,12 +128,22 @@ func (m *ResourceMigration) Exec(sess *xorm.Session, mg *migrator.Migrator) erro
	}

	for _, org := range orgs {
		if err := m.migrateOrg(ctx, sess, org); err != nil {
		if err = m.migrateOrg(ctx, sess, org); err != nil {
			return err
		}
	}

	// Auto-enable mode 5 for resources after successful migration
	// TODO: remove this before Grafana 13 GA: https://github.com/grafana/search-and-storage-team/issues/613
	if m.autoMigrate {
		for _, gr := range m.resources {
			m.log.Info("Auto-enabling mode 5 for resource", "resource", gr.Resource+"."+gr.Group)
			m.cfg.EnableMode5(gr.Resource + "." + gr.Group)
		}
	}

	m.log.Info("Migration completed successfully for all organizations", "org_count", len(orgs))

	return nil
}
@@ -1,11 +1,13 @@
package migrations

import (
	"context"
	"fmt"

	v1beta1 "github.com/grafana/grafana/apps/dashboard/pkg/apis/dashboard/v1beta1"
	folders "github.com/grafana/grafana/apps/folder/pkg/apis/folder/v1beta1"
	playlists "github.com/grafana/grafana/apps/playlist/pkg/apis/playlist/v0alpha1"
	"github.com/grafana/grafana/pkg/infra/db"
	"github.com/grafana/grafana/pkg/registry/apis/dashboard/legacy"
	sqlstoremigrator "github.com/grafana/grafana/pkg/services/sqlstore/migrator"
	"github.com/grafana/grafana/pkg/setting"
@@ -14,69 +16,70 @@ import (
	"k8s.io/apimachinery/pkg/runtime/schema"
)

type ResourceDefinition struct {
	GroupResource schema.GroupResource
	MigratorFunc  string // Name of the method: "MigrateFolders", "MigrateDashboards", etc.
type resourceDefinition struct {
	groupResource schema.GroupResource
	migratorFunc  string // Name of the method: "MigrateFolders", "MigrateDashboards", etc.
}

type migrationDefinition struct {
	name        string
	migrationID string // The ID stored in the migration log table (e.g., "playlists migration")
	resources   []string
	registerFunc func(mg *sqlstoremigrator.Migrator, migrator UnifiedMigrator, client resource.ResourceClient)
	registerFunc func(mg *sqlstoremigrator.Migrator, migrator UnifiedMigrator, client resource.ResourceClient, opts ...ResourceMigrationOption)
}

var resourceRegistry = []ResourceDefinition{
var resourceRegistry = []resourceDefinition{
	{
		GroupResource: schema.GroupResource{Group: folders.GROUP, Resource: folders.RESOURCE},
		MigratorFunc:  "MigrateFolders",
		groupResource: schema.GroupResource{Group: folders.GROUP, Resource: folders.RESOURCE},
		migratorFunc:  "MigrateFolders",
	},
	{
		GroupResource: schema.GroupResource{Group: v1beta1.GROUP, Resource: v1beta1.LIBRARY_PANEL_RESOURCE},
		MigratorFunc:  "MigrateLibraryPanels",
		groupResource: schema.GroupResource{Group: v1beta1.GROUP, Resource: v1beta1.LIBRARY_PANEL_RESOURCE},
		migratorFunc:  "MigrateLibraryPanels",
	},
	{
		GroupResource: schema.GroupResource{Group: v1beta1.GROUP, Resource: v1beta1.DASHBOARD_RESOURCE},
		MigratorFunc:  "MigrateDashboards",
		groupResource: schema.GroupResource{Group: v1beta1.GROUP, Resource: v1beta1.DASHBOARD_RESOURCE},
		migratorFunc:  "MigrateDashboards",
	},
	{
		GroupResource: schema.GroupResource{Group: playlists.APIGroup, Resource: "playlists"},
		MigratorFunc:  "MigratePlaylists",
		groupResource: schema.GroupResource{Group: playlists.APIGroup, Resource: "playlists"},
		migratorFunc:  "MigratePlaylists",
	},
}

var migrationRegistry = []migrationDefinition{
	{
		name:         "playlists",
		migrationID:  "playlists migration",
		resources:    []string{setting.PlaylistResource},
		registerFunc: registerPlaylistMigration,
	},
	{
		name:         "folders and dashboards",
		migrationID:  "folders and dashboards migration",
		resources:    []string{setting.FolderResource, setting.DashboardResource},
		registerFunc: registerDashboardAndFolderMigration,
	},
}

func registerMigrations(cfg *setting.Cfg, mg *sqlstoremigrator.Migrator, migrator UnifiedMigrator, client resource.ResourceClient) error {
func registerMigrations(ctx context.Context,
	cfg *setting.Cfg,
	mg *sqlstoremigrator.Migrator,
	migrator UnifiedMigrator,
	client resource.ResourceClient,
	sqlStore db.DB,
) error {
	for _, migration := range migrationRegistry {
		var (
			hasValue   bool
			allEnabled bool
		)

		for _, res := range migration.resources {
			enabled := cfg.UnifiedStorage[res].EnableMigration
			if !hasValue {
				allEnabled = enabled
				hasValue = true
				continue
			}
			if enabled != allEnabled {
				return fmt.Errorf("cannot migrate resources separately: %v migration must be either all enabled or all disabled", migration.resources)
			}
		if shouldAutoMigrate(ctx, migration, cfg, sqlStore) {
			migration.registerFunc(mg, migrator, client, WithAutoMigrate(cfg))
			continue
		}

		if !allEnabled {
			enabled, err := isMigrationEnabled(migration, cfg)
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
if !enabled {
|
||||
logger.Info("Migration is disabled in config, skipping", "migration", migration.name)
|
||||
continue
|
||||
}
|
||||
@@ -85,10 +88,193 @@ func registerMigrations(cfg *setting.Cfg, mg *sqlstoremigrator.Migrator, migrato
|
||||
return nil
|
||||
}
|
||||
|
||||
func getResourceDefinition(group, resource string) *ResourceDefinition {
|
||||
func registerDashboardAndFolderMigration(mg *sqlstoremigrator.Migrator,
|
||||
migrator UnifiedMigrator,
|
||||
client resource.ResourceClient,
|
||||
opts ...ResourceMigrationOption,
|
||||
) {
|
||||
foldersDef := getResourceDefinition("folder.grafana.app", "folders")
|
||||
dashboardsDef := getResourceDefinition("dashboard.grafana.app", "dashboards")
|
||||
driverName := mg.Dialect.DriverName()
|
||||
|
||||
folderCountValidator := NewCountValidator(
|
||||
client,
|
||||
foldersDef.groupResource,
|
||||
"dashboard",
|
||||
"org_id = ? and is_folder = true",
|
||||
driverName,
|
||||
)
|
||||
|
||||
dashboardCountValidator := NewCountValidator(
|
||||
client,
|
||||
dashboardsDef.groupResource,
|
||||
"dashboard",
|
||||
"org_id = ? and is_folder = false",
|
||||
driverName,
|
||||
)
|
||||
|
||||
folderTreeValidator := NewFolderTreeValidator(client, foldersDef.groupResource, driverName)
|
||||
|
||||
dashboardsAndFolders := NewResourceMigration(
|
||||
migrator,
|
||||
[]schema.GroupResource{foldersDef.groupResource, dashboardsDef.groupResource},
|
||||
"folders-dashboards",
|
||||
[]Validator{folderCountValidator, dashboardCountValidator, folderTreeValidator},
|
||||
opts...,
|
||||
)
|
||||
mg.AddMigration("folders and dashboards migration", dashboardsAndFolders)
|
||||
}
|
||||
|
||||
func registerPlaylistMigration(mg *sqlstoremigrator.Migrator,
|
||||
migrator UnifiedMigrator,
|
||||
client resource.ResourceClient,
|
||||
opts ...ResourceMigrationOption,
|
||||
) {
|
||||
playlistsDef := getResourceDefinition("playlist.grafana.app", "playlists")
|
||||
driverName := mg.Dialect.DriverName()
|
||||
|
||||
playlistCountValidator := NewCountValidator(
|
||||
client,
|
||||
playlistsDef.groupResource,
|
||||
"playlist",
|
||||
"org_id = ?",
|
||||
driverName,
|
||||
)
|
||||
|
||||
playlistsMigration := NewResourceMigration(
|
||||
migrator,
|
||||
[]schema.GroupResource{playlistsDef.groupResource},
|
||||
"playlists",
|
||||
[]Validator{playlistCountValidator},
|
||||
opts...,
|
||||
)
|
||||
mg.AddMigration("playlists migration", playlistsMigration)
|
||||
}
|
||||
|
||||
// TODO: remove this before Grafana 13 GA: https://github.com/grafana/search-and-storage-team/issues/613
|
||||
func shouldAutoMigrate(ctx context.Context, migration migrationDefinition, cfg *setting.Cfg, sqlStore db.DB) bool {
|
||||
autoMigrate := false
|
||||
|
||||
for _, res := range migration.resources {
|
||||
config := cfg.UnifiedStorageConfig(res)
|
||||
|
||||
if config.DualWriterMode == 5 {
|
||||
return false
|
||||
}
|
||||
|
||||
if !setting.AutoMigratedUnifiedResources[res] {
|
||||
continue
|
||||
}
|
||||
|
||||
if checkIfAlreadyMigrated(ctx, migration, sqlStore) {
|
||||
for _, res := range migration.resources {
|
||||
cfg.EnableMode5(res)
|
||||
}
|
||||
logger.Info("Auto-migration already completed, enabling mode 5 for resources", "migration", migration.name)
|
||||
return true
|
||||
}
|
||||
|
||||
autoMigrate = true
|
||||
threshold := int64(setting.DefaultAutoMigrationThreshold)
|
||||
if config.AutoMigrationThreshold > 0 {
|
||||
threshold = int64(config.AutoMigrationThreshold)
|
||||
}
|
||||
|
||||
count, err := countResource(ctx, sqlStore, res)
|
||||
if err != nil {
|
||||
logger.Warn("Failed to count resource for auto migration check", "resource", res, "error", err)
|
||||
return false
|
||||
}
|
||||
|
||||
logger.Info("Resource count for auto migration check", "resource", res, "count", count, "threshold", threshold)
|
||||
|
||||
if count > threshold {
|
||||
return false
|
||||
}
|
||||
}
|
||||
|
||||
if !autoMigrate {
|
||||
return false
|
||||
}
|
||||
|
||||
logger.Info("Auto-migration enabled for migration", "migration", migration.name)
|
||||
return true
|
||||
}
|
||||
|
||||
func checkIfAlreadyMigrated(ctx context.Context, migration migrationDefinition, sqlStore db.DB) bool {
|
||||
if migration.migrationID == "" {
|
||||
return false
|
||||
}
|
||||
|
||||
exists, err := migrationExists(ctx, sqlStore, migration.migrationID)
|
||||
if err != nil {
|
||||
logger.Warn("Failed to check if migration exists", "migration", migration.name, "error", err)
|
||||
return false
|
||||
}
|
||||
|
||||
return exists
|
||||
}
|
||||
|
||||
func isMigrationEnabled(migration migrationDefinition, cfg *setting.Cfg) (bool, error) {
|
||||
var (
|
||||
hasValue bool
|
||||
allEnabled bool
|
||||
)
|
||||
|
||||
for _, res := range migration.resources {
|
||||
enabled := cfg.UnifiedStorage[res].EnableMigration
|
||||
if !hasValue {
|
||||
allEnabled = enabled
|
||||
hasValue = true
|
||||
continue
|
||||
}
|
||||
if enabled != allEnabled {
|
||||
return false, fmt.Errorf("cannot migrate resources separately: %v migration must be either all enabled or all disabled", migration.resources)
|
||||
}
|
||||
}
|
||||
|
||||
return allEnabled, nil
|
||||
}
|
||||
|
||||
// TODO: remove this before Grafana 13 GA: https://github.com/grafana/search-and-storage-team/issues/613
|
||||
func countResource(ctx context.Context, sqlStore db.DB, resourceName string) (int64, error) {
|
||||
var count int64
|
||||
err := sqlStore.WithDbSession(ctx, func(sess *db.Session) error {
|
||||
switch resourceName {
|
||||
case setting.DashboardResource:
|
||||
var err error
|
||||
count, err = sess.Table("dashboard").Where("is_folder = ?", false).Count()
|
||||
return err
|
||||
case setting.FolderResource:
|
||||
var err error
|
||||
count, err = sess.Table("dashboard").Where("is_folder = ?", true).Count()
|
||||
return err
|
||||
default:
|
||||
return fmt.Errorf("unknown resource: %s", resourceName)
|
||||
}
|
||||
})
|
||||
return count, err
|
||||
}
|
||||
|
||||
const migrationLogTableName = "unifiedstorage_migration_log"
|
||||
|
||||
func migrationExists(ctx context.Context, sqlStore db.DB, migrationID string) (bool, error) {
|
||||
var count int64
|
||||
err := sqlStore.WithDbSession(ctx, func(sess *db.Session) error {
|
||||
var err error
|
||||
count, err = sess.Table(migrationLogTableName).Where("migration_id = ?", migrationID).Count()
|
||||
return err
|
||||
})
|
||||
if err != nil {
|
||||
return false, fmt.Errorf("failed to check migration existence: %w", err)
|
||||
}
|
||||
return count > 0, nil
|
||||
}
|
||||
|
||||
func getResourceDefinition(group, resource string) *resourceDefinition {
|
||||
for i := range resourceRegistry {
|
||||
r := &resourceRegistry[i]
|
||||
if r.GroupResource.Group == group && r.GroupResource.Resource == resource {
|
||||
if r.groupResource.Group == group && r.groupResource.Resource == resource {
|
||||
return r
|
||||
}
|
||||
}
|
||||
@@ -102,8 +288,8 @@ func buildResourceKey(group, resource, namespace string) *resourcepb.ResourceKey
|
||||
}
|
||||
return &resourcepb.ResourceKey{
|
||||
Namespace: namespace,
|
||||
Group: def.GroupResource.Group,
|
||||
Resource: def.GroupResource.Resource,
|
||||
Group: def.groupResource.Group,
|
||||
Resource: def.groupResource.Resource,
|
||||
}
|
||||
}
|
||||
|
||||
@@ -113,7 +299,7 @@ func getMigratorFunc(accessor legacy.MigrationDashboardAccessor, group, resource
|
||||
return nil
|
||||
}
|
||||
|
||||
switch def.MigratorFunc {
|
||||
switch def.migratorFunc {
|
||||
case "MigrateFolders":
|
||||
return accessor.MigrateFolders
|
||||
case "MigrateLibraryPanels":
|
||||
@@ -130,7 +316,7 @@ func getMigratorFunc(accessor legacy.MigrationDashboardAccessor, group, resource
|
||||
func validateRegisteredResources() error {
|
||||
registeredMap := make(map[string]bool)
|
||||
for _, gr := range resourceRegistry {
|
||||
key := fmt.Sprintf("%s.%s", gr.GroupResource.Resource, gr.GroupResource.Group)
|
||||
key := fmt.Sprintf("%s.%s", gr.groupResource.Resource, gr.groupResource.Group)
|
||||
registeredMap[key] = true
|
||||
}
|
||||
|
||||
|
||||
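The threshold gate inside `shouldAutoMigrate` reduces to a small rule: a per-resource `AutoMigrationThreshold > 0` overrides the default, and a resource count above the effective threshold blocks auto-migration. A standalone sketch of just that rule (the names `withinThreshold` and `defaultThreshold` are illustrative stand-ins, not identifiers from the diff; `defaultThreshold` stands in for `setting.DefaultAutoMigrationThreshold`, whose real value is not shown here):

```go
package main

import "fmt"

// defaultThreshold is an illustrative stand-in for setting.DefaultAutoMigrationThreshold.
const defaultThreshold = 1000

// withinThreshold mirrors the per-resource check in shouldAutoMigrate:
// a configured threshold > 0 overrides the default, and a count above the
// effective threshold means auto-migration is skipped for this startup.
func withinThreshold(count int64, configured int) bool {
	threshold := int64(defaultThreshold)
	if configured > 0 {
		threshold = int64(configured)
	}
	return count <= threshold
}

func main() {
	fmt.Println(withinThreshold(3, 1)) // false: 3 resources, configured threshold 1
	fmt.Println(withinThreshold(1, 1)) // true: boundary is inclusive
	fmt.Println(withinThreshold(3, 0)) // true: falls back to the default
}
```

This matches the integration test below, where three dashboards and three folders against a configured threshold of 1 cause the migration to be skipped.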
@@ -1,12 +1,15 @@
 package migrations

 import (
+	"context"
+	"strings"
 	"testing"

 	sqlstoremigrator "github.com/grafana/grafana/pkg/services/sqlstore/migrator"
 	"github.com/grafana/grafana/pkg/setting"
 	"github.com/grafana/grafana/pkg/storage/unified/resource"
 	"github.com/stretchr/testify/require"
+	"k8s.io/apimachinery/pkg/runtime/schema"
 )

 // TestRegisterMigrations exercises registerMigrations with various EnableMigration configs using a table-driven test.
@@ -14,20 +17,28 @@ func TestRegisterMigrations(t *testing.T) {
 	origRegistry := migrationRegistry
 	t.Cleanup(func() { migrationRegistry = origRegistry })

+	// Use fake resource names that are NOT in setting.AutoMigratedUnifiedResources
+	// to avoid triggering the auto-migrate code path which requires a non-nil sqlStore.
+	const (
+		fakePlaylistResource  = "fake.playlists.resource"
+		fakeFolderResource    = "fake.folders.resource"
+		fakeDashboardResource = "fake.dashboards.resource"
+	)
+
 	// helper to build a fake registry with custom register funcs that bump counters
 	makeFakeRegistry := func(migrationCalls map[string]int) []migrationDefinition {
 		return []migrationDefinition{
 			{
 				name:         "playlists",
-				resources:    []string{setting.PlaylistResource},
-				registerFunc: func(mg *sqlstoremigrator.Migrator, migrator UnifiedMigrator, client resource.ResourceClient) {
+				resources:    []string{fakePlaylistResource},
+				registerFunc: func(mg *sqlstoremigrator.Migrator, migrator UnifiedMigrator, client resource.ResourceClient, opts ...ResourceMigrationOption) {
 					migrationCalls["playlists"]++
 				},
 			},
 			{
 				name:         "folders and dashboards",
-				resources:    []string{setting.FolderResource, setting.DashboardResource},
-				registerFunc: func(mg *sqlstoremigrator.Migrator, migrator UnifiedMigrator, client resource.ResourceClient) {
+				resources:    []string{fakeFolderResource, fakeDashboardResource},
+				registerFunc: func(mg *sqlstoremigrator.Migrator, migrator UnifiedMigrator, client resource.ResourceClient, opts ...ResourceMigrationOption) {
 					migrationCalls["folders and dashboards"]++
 				},
 			},
@@ -38,7 +49,9 @@ func TestRegisterMigrations(t *testing.T) {
 	makeCfg := func(vals map[string]bool) *setting.Cfg {
 		cfg := &setting.Cfg{UnifiedStorage: make(map[string]setting.UnifiedStorageConfig)}
 		for k, v := range vals {
-			cfg.UnifiedStorage[k] = setting.UnifiedStorageConfig{EnableMigration: v}
+			cfg.UnifiedStorage[k] = setting.UnifiedStorageConfig{
+				EnableMigration: v,
+			}
 		}
 		return cfg
 	}
@@ -71,13 +84,13 @@ func TestRegisterMigrations(t *testing.T) {
 			migrationRegistry = makeFakeRegistry(migrationCalls)

 			cfg := makeCfg(map[string]bool{
-				setting.PlaylistResource:  tt.enablePlaylist,
-				setting.FolderResource:    tt.enableFolder,
-				setting.DashboardResource: tt.enableDashboard,
+				fakePlaylistResource:  tt.enablePlaylist,
+				fakeFolderResource:    tt.enableFolder,
+				fakeDashboardResource: tt.enableDashboard,
 			})

 			// We pass nils for migrator dependencies because our fake registerFuncs don't use them
-			err := registerMigrations(cfg, nil, nil, nil)
+			err := registerMigrations(context.Background(), cfg, nil, nil, nil, nil)

 			if tt.wantErr {
 				require.Error(t, err, "expected error for mismatched enablement")
@@ -90,3 +103,176 @@ func TestRegisterMigrations(t *testing.T) {
 		})
 	}
 }
+
+// TestResourceMigration_AutoMigrateEnablesMode5 verifies the autoMigrate behavior:
+// - When autoMigrate=true AND cfg is set AND storage type is "unified", mode 5 should be enabled
+// - In all other cases, mode 5 should NOT be enabled
+func TestResourceMigration_AutoMigrateEnablesMode5(t *testing.T) {
+	// Helper to create a cfg with unified storage type
+	makeUnifiedCfg := func() *setting.Cfg {
+		cfg := setting.NewCfg()
+		cfg.Raw.Section("grafana-apiserver").Key("storage_type").SetValue("unified")
+		cfg.UnifiedStorage = make(map[string]setting.UnifiedStorageConfig)
+		return cfg
+	}
+
+	// Helper to create a cfg with legacy storage type
+	makeLegacyCfg := func() *setting.Cfg {
+		cfg := setting.NewCfg()
+		cfg.Raw.Section("grafana-apiserver").Key("storage_type").SetValue("legacy")
+		cfg.UnifiedStorage = make(map[string]setting.UnifiedStorageConfig)
+		return cfg
+	}
+
+	tests := []struct {
+		name             string
+		autoMigrate      bool
+		cfg              *setting.Cfg
+		resources        []string
+		wantMode5Enabled bool
+		description      string
+	}{
+		{
+			name:             "autoMigrate enabled with unified storage",
+			autoMigrate:      true,
+			cfg:              makeUnifiedCfg(),
+			resources:        []string{setting.DashboardResource},
+			wantMode5Enabled: true,
+			description:      "Should enable mode 5 when autoMigrate=true and storage type is unified",
+		},
+		{
+			name:             "autoMigrate disabled with unified storage",
+			autoMigrate:      false,
+			cfg:              makeUnifiedCfg(),
+			resources:        []string{setting.DashboardResource},
+			wantMode5Enabled: false,
+			description:      "Should NOT enable mode 5 when autoMigrate=false",
+		},
+		{
+			name:             "autoMigrate enabled with legacy storage",
+			autoMigrate:      true,
+			cfg:              makeLegacyCfg(),
+			resources:        []string{setting.DashboardResource},
+			wantMode5Enabled: false,
+			description:      "Should NOT enable mode 5 when storage type is legacy",
+		},
+		{
+			name:             "autoMigrate enabled with nil cfg",
+			autoMigrate:      true,
+			cfg:              nil,
+			resources:        []string{setting.DashboardResource},
+			wantMode5Enabled: false,
+			description:      "Should NOT enable mode 5 when cfg is nil",
+		},
+		{
+			name:             "autoMigrate enabled with multiple resources",
+			autoMigrate:      true,
+			cfg:              makeUnifiedCfg(),
+			resources:        []string{setting.FolderResource, setting.DashboardResource},
+			wantMode5Enabled: true,
+			description:      "Should enable mode 5 for all resources when autoMigrate=true",
+		},
+	}
+
+	for _, tt := range tests {
+		t.Run(tt.name, func(t *testing.T) {
+			// Build schema.GroupResource from resource strings
+			resources := make([]schema.GroupResource, 0, len(tt.resources))
+			for _, r := range tt.resources {
+				parts := strings.SplitN(r, ".", 2)
+				resources = append(resources, schema.GroupResource{
+					Resource: parts[0],
+					Group:    parts[1],
+				})
+			}
+
+			// Create the migration with options
+			var opts []ResourceMigrationOption
+			if tt.autoMigrate {
+				opts = append(opts, WithAutoMigrate(tt.cfg))
+			}
+
+			m := NewResourceMigration(nil, resources, "test-auto-migrate", nil, opts...)
+
+			// Simulate what happens at the end of a successful migration
+			// This is the logic from Exec() that we're testing
+			if m.autoMigrate && m.cfg != nil && m.cfg.UnifiedStorageType() == "unified" {
+				for _, gr := range m.resources {
+					m.cfg.EnableMode5(gr.Resource + "." + gr.Group)
+				}
+			}
+
+			// Verify mode 5 was enabled (or not) for each resource
+			for _, resourceName := range tt.resources {
+				if tt.cfg == nil {
+					// If cfg is nil, we can't check - just verify we didn't panic
+					continue
+				}
+				config := tt.cfg.UnifiedStorageConfig(resourceName)
+				if tt.wantMode5Enabled {
+					require.Equal(t, 5, int(config.DualWriterMode), "%s: %s", tt.description, resourceName)
+					require.True(t, config.EnableMigration, "%s: EnableMigration should be true for %s", tt.description, resourceName)
+					require.True(t, config.DualWriterMigrationDataSyncDisabled, "%s: DualWriterMigrationDataSyncDisabled should be true for %s", tt.description, resourceName)
+				} else {
+					require.Equal(t, 0, int(config.DualWriterMode), "%s: mode should be 0 for %s", tt.description, resourceName)
+				}
+			}
+		})
+	}
+}
+
+// TestResourceMigration_SkipMigrationLog verifies the SkipMigrationLog behavior:
+// - When ignoreErrors=true AND errors occurred (hadErrors=true), skip writing to migration log.
+//   This allows the migration to be re-run on the next startup.
+// - In all other cases, write to migration log normally.
+//
+// This is important for the folders/dashboards migration which uses WithIgnoreErrors() to handle
+// partial failures gracefully while still allowing retry on next startup.
+func TestResourceMigration_SkipMigrationLog(t *testing.T) {
+	tests := []struct {
+		name        string
+		autoMigrate bool
+		hadErrors   bool
+		want        bool
+		description string
+	}{
+		{
+			name:        "normal migration success",
+			autoMigrate: false,
+			hadErrors:   false,
+			want:        false,
+			description: "Normal successful migration should write to log",
+		},
+		{
+			name:        "ignoreErrors migration success",
+			autoMigrate: true,
+			hadErrors:   false,
+			want:        false,
+			description: "Migration with ignoreErrors that succeeds should still write to log",
+		},
+		{
+			name:        "normal migration with errors",
+			autoMigrate: false,
+			hadErrors:   true,
+			want:        false,
+			description: "Migration that fails without ignoreErrors should write error to log",
+		},
+		{
+			name:        "ignoreErrors migration with errors - skip log",
+			autoMigrate: true,
+			hadErrors:   true,
+			want:        true,
+			description: "Migration with ignoreErrors that has errors should SKIP log to allow retry",
+		},
+	}
+
+	for _, tt := range tests {
+		t.Run(tt.name, func(t *testing.T) {
+			m := &ResourceMigration{
+				autoMigrate: tt.autoMigrate,
+				hadErrors:   tt.hadErrors,
+			}
+			require.Equal(t, tt.want, m.SkipMigrationLog(), tt.description)
+		})
+	}
+}

@@ -14,7 +14,6 @@ import (
 	"github.com/grafana/grafana/pkg/storage/unified/resource"
 	"github.com/prometheus/client_golang/prometheus"
 	"go.opentelemetry.io/otel"
-	"k8s.io/apimachinery/pkg/runtime/schema"
 )

 var tracer = otel.Tracer("github.com/grafana/grafana/pkg/storage/unified/migrations")
@@ -54,6 +53,7 @@ func (p *UnifiedStorageMigrationServiceImpl) Run(ctx context.Context) error {
 		logger.Info("Data migrations are disabled, skipping")
 		return nil
 	}

 	logger.Info("Running migrations for unified storage")
+	metrics.MUnifiedStorageMigrationStatus.Set(3)
 	return RegisterMigrations(ctx, p.migrator, p.cfg, p.sqlStore, p.client)
@@ -79,7 +79,7 @@ func RegisterMigrations(
 		return err
 	}

-	if err := registerMigrations(cfg, mg, migrator, client); err != nil {
+	if err := registerMigrations(ctx, cfg, mg, migrator, client, sqlStore); err != nil {
 		return err
 	}

@@ -92,65 +92,13 @@
 		db.SetMaxOpenConns(3)
 		defer db.SetMaxOpenConns(maxOpenConns)
 	}
-	if err := mg.RunMigrations(ctx,
+	err := mg.RunMigrations(ctx,
 		sec.Key("migration_locking").MustBool(true),
-		sec.Key("locking_attempt_timeout_sec").MustInt()); err != nil {
+		sec.Key("locking_attempt_timeout_sec").MustInt())
+	if err != nil {
 		return fmt.Errorf("unified storage data migration failed: %w", err)
 	}

 	logger.Info("Unified storage migrations completed successfully")
 	return nil
 }
-
-func registerDashboardAndFolderMigration(mg *sqlstoremigrator.Migrator, migrator UnifiedMigrator, client resource.ResourceClient) {
-	foldersDef := getResourceDefinition("folder.grafana.app", "folders")
-	dashboardsDef := getResourceDefinition("dashboard.grafana.app", "dashboards")
-	driverName := mg.Dialect.DriverName()
-
-	folderCountValidator := NewCountValidator(
-		client,
-		foldersDef.GroupResource,
-		"dashboard",
-		"org_id = ? and is_folder = true",
-		driverName,
-	)
-
-	dashboardCountValidator := NewCountValidator(
-		client,
-		dashboardsDef.GroupResource,
-		"dashboard",
-		"org_id = ? and is_folder = false",
-		driverName,
-	)
-
-	folderTreeValidator := NewFolderTreeValidator(client, foldersDef.GroupResource, driverName)
-
-	dashboardsAndFolders := NewResourceMigration(
-		migrator,
-		[]schema.GroupResource{foldersDef.GroupResource, dashboardsDef.GroupResource},
-		"folders-dashboards",
-		[]Validator{folderCountValidator, dashboardCountValidator, folderTreeValidator},
-	)
-	mg.AddMigration("folders and dashboards migration", dashboardsAndFolders)
-}
-
-func registerPlaylistMigration(mg *sqlstoremigrator.Migrator, migrator UnifiedMigrator, client resource.ResourceClient) {
-	playlistsDef := getResourceDefinition("playlist.grafana.app", "playlists")
-	driverName := mg.Dialect.DriverName()
-
-	playlistCountValidator := NewCountValidator(
-		client,
-		playlistsDef.GroupResource,
-		"playlist",
-		"org_id = ?",
-		driverName,
-	)
-
-	playlistsMigration := NewResourceMigration(
-		migrator,
-		[]schema.GroupResource{playlistsDef.GroupResource},
-		"playlists",
-		[]Validator{playlistCountValidator},
-	)
-	mg.AddMigration("playlists migration", playlistsMigration)
-}

@@ -0,0 +1,211 @@
+package threshold
+
+import (
+	"context"
+	"fmt"
+	"net/http"
+	"os"
+	"testing"
+
+	authlib "github.com/grafana/authlib/types"
+	"github.com/grafana/grafana/pkg/infra/db"
+	"github.com/grafana/grafana/pkg/services/folder"
+	"github.com/grafana/grafana/pkg/setting"
+	"github.com/grafana/grafana/pkg/tests/apis"
+	"github.com/grafana/grafana/pkg/tests/testinfra"
+	"github.com/grafana/grafana/pkg/tests/testsuite"
+	"github.com/grafana/grafana/pkg/util/testutil"
+	"github.com/stretchr/testify/require"
+	"k8s.io/apimachinery/pkg/api/meta"
+	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
+	"k8s.io/apimachinery/pkg/runtime/schema"
+)
+
+// TODO: remove this test before Grafana 13 GA
+func TestMain(m *testing.M) {
+	testsuite.Run(m)
+}
+
+// TestIntegrationAutoMigrateThresholdExceeded verifies that auto-migration is skipped when
+// resource count exceeds the configured threshold.
+// TODO: remove this test before Grafana 13 GA
+func TestIntegrationAutoMigrateThresholdExceeded(t *testing.T) {
+	testutil.SkipIntegrationTestInShortMode(t)
+
+	if db.IsTestDbSQLite() {
+		// Share the same SQLite DB file between steps
+		tmpDir := t.TempDir()
+		dbPath := tmpDir + "/shared-threshold-test.db"
+
+		oldVal := os.Getenv("SQLITE_TEST_DB")
+		require.NoError(t, os.Setenv("SQLITE_TEST_DB", dbPath))
+		t.Cleanup(func() {
+			if oldVal == "" {
+				_ = os.Unsetenv("SQLITE_TEST_DB")
+			} else {
+				_ = os.Setenv("SQLITE_TEST_DB", oldVal)
+			}
+		})
+		t.Logf("Using shared database path: %s", dbPath)
+	}
+
+	var org1 *apis.OrgUsers
+	var orgB *apis.OrgUsers
+
+	dashboardGVR := schema.GroupVersionResource{
+		Group:    "dashboard.grafana.app",
+		Version:  "v1beta1",
+		Resource: "dashboards",
+	}
+	folderGVR := schema.GroupVersionResource{
+		Group:    "folder.grafana.app",
+		Version:  "v1beta1",
+		Resource: "folders",
+	}
+
+	dashboardKey := fmt.Sprintf("%s.%s", dashboardGVR.Resource, dashboardGVR.Group)
+	folderKey := fmt.Sprintf("%s.%s", folderGVR.Resource, folderGVR.Group)
+	playlistKey := "playlists.playlist.grafana.app"
+
+	// Step 1: Create resources exceeding the threshold (3 resources, threshold=1)
+	t.Run("Step 1: Create resources exceeding threshold", func(t *testing.T) {
+		unifiedConfig := map[string]setting.UnifiedStorageConfig{}
+		helper := apis.NewK8sTestHelper(t, testinfra.GrafanaOpts{
+			AppModeProduction:     true,
+			DisableAnonymous:      true,
+			DisableDataMigrations: true,
+			DisableDBCleanup:      true,
+			APIServerStorageType:  "unified",
+			UnifiedStorageConfig:  unifiedConfig,
+		})
+		org1 = &helper.Org1
+		orgB = &helper.OrgB
+
+		// Create 3 dashboards
+		for i := 1; i <= 3; i++ {
+			createTestDashboard(t, helper, fmt.Sprintf("Threshold Dashboard %d", i))
+		}
+
+		// Create 3 folders
+		for i := 1; i <= 3; i++ {
+			createTestFolder(t, helper, fmt.Sprintf("folder-%d", i), fmt.Sprintf("Threshold Folder %d", i), "")
+		}
+
+		// Explicitly shutdown helper before Step 1 ends to ensure database is properly closed
+		helper.Shutdown()
+	})
+
+	// Set SKIP_DB_TRUNCATE to prevent truncation in subsequent steps
+	oldSkipTruncate := os.Getenv("SKIP_DB_TRUNCATE")
+	require.NoError(t, os.Setenv("SKIP_DB_TRUNCATE", "true"))
+	t.Cleanup(func() {
+		if oldSkipTruncate == "" {
+			_ = os.Unsetenv("SKIP_DB_TRUNCATE")
+		} else {
+			_ = os.Setenv("SKIP_DB_TRUNCATE", oldSkipTruncate)
+		}
+	})
+
+	// Step 2: Verify auto-migration is skipped due to threshold
+	t.Run("Step 2: Verify auto-migration skipped (threshold exceeded)", func(t *testing.T) {
+		// Set threshold=1, but we have 3 resources of each type, so migration should be skipped
+		// Disable playlists migration since we're only testing dashboard/folder threshold behavior
+		unifiedConfig := map[string]setting.UnifiedStorageConfig{
+			dashboardKey: {AutoMigrationThreshold: 1, EnableMigration: false},
+			folderKey:    {AutoMigrationThreshold: 1, EnableMigration: false},
+			playlistKey:  {EnableMigration: false},
+		}
+		helper := apis.NewK8sTestHelperWithOpts(t, apis.K8sTestHelperOpts{
+			GrafanaOpts: testinfra.GrafanaOpts{
+				AppModeProduction:     true,
+				DisableAnonymous:      true,
+				DisableDataMigrations: false, // Allow migration system to run
+				APIServerStorageType:  "unified",
+				UnifiedStorageConfig:  unifiedConfig,
+			},
+			Org1Users: org1,
+			OrgBUsers: orgB,
+		})
+		t.Cleanup(helper.Shutdown)
+
+		namespace := authlib.OrgNamespaceFormatter(helper.Org1.OrgID)
+
+		dashCli := helper.GetResourceClient(apis.ResourceClientArgs{
+			User:      helper.Org1.Admin,
+			Namespace: namespace,
+			GVR:       dashboardGVR,
+		})
+		verifyResourceCount(t, dashCli, 3)
+
+		folderCli := helper.GetResourceClient(apis.ResourceClientArgs{
+			User:      helper.Org1.Admin,
+			Namespace: namespace,
+			GVR:       folderGVR,
+		})
+		verifyResourceCount(t, folderCli, 3)
+
+		// Verify migration did NOT run by checking the migration log
+		count, err := helper.GetEnv().SQLStore.GetEngine().Table("unifiedstorage_migration_log").
+			Where("migration_id = ?", "folders and dashboards migration").
+			Count()
+		require.NoError(t, err)
+		require.Equal(t, int64(0), count, "Migration should not have run")
+	})
+}
+
+func createTestDashboard(t *testing.T, helper *apis.K8sTestHelper, title string) string {
+	t.Helper()
+
+	payload := fmt.Sprintf(`{"dashboard": {"title": "%s", "panels": []}, "overwrite": false}`, title)
+
+	result := apis.DoRequest(helper, apis.RequestParams{
+		User:   helper.Org1.Admin,
+		Method: "POST",
+		Path:   "/api/dashboards/db",
+		Body:   []byte(payload),
+	}, &map[string]interface{}{})
+
+	require.NotNil(t, result.Response)
+	require.Equal(t, 200, result.Response.StatusCode)
+
+	uid := (*result.Result)["uid"].(string)
+	require.NotEmpty(t, uid)
+	return uid
+}
+
+func createTestFolder(t *testing.T, helper *apis.K8sTestHelper, uid, title, parentUID string) *folder.Folder {
+	t.Helper()
+
+	payload := fmt.Sprintf(`{
+		"title": "%s",
+		"uid": "%s"`, title, uid)
+
+	if parentUID != "" {
+		payload += fmt.Sprintf(`,
+		"parentUid": "%s"`, parentUID)
+	}
+
+	payload += "}"
+
+	folderCreate := apis.DoRequest(helper, apis.RequestParams{
+		User:   helper.Org1.Admin,
+		Method: http.MethodPost,
+		Path:   "/api/folders",
+		Body:   []byte(payload),
+	}, &folder.Folder{})
+
+	require.NotNil(t, folderCreate.Result)
+	return folderCreate.Result
+}
+
+// verifyResourceCount verifies that the expected number of resources exist in K8s storage
+func verifyResourceCount(t *testing.T, client *apis.K8sResourceClient, expectedCount int) {
+	t.Helper()
+
+	l, err := client.Resource.List(context.Background(), metav1.ListOptions{})
+	require.NoError(t, err)
+
+	resources, err := meta.ExtractList(l)
+	require.NoError(t, err)
+	require.Equal(t, expectedCount, len(resources))
+}

@@ -12,7 +12,7 @@ INSERT INTO {{ .Ident .TableName }}
 VALUES (
   {{ .Arg .GUID }},
   {{ .Arg .KeyPath }},
-  COALESCE({{ .Arg .Value }}, ""),
+  {{ .Arg .Value }},
   {{ .Arg .Group }},
   {{ .Arg .Resource }},
   {{ .Arg .Namespace }},

@@ -10,7 +10,7 @@ INSERT INTO {{ .Ident "resource_history" }}
   {{ .Ident "folder" }}
 )
 VALUES (
-  COALESCE({{ .Arg .Value }}, ""),
+  {{ .Arg .Value }},
   {{ .Arg .GUID }},
   {{ .Arg .Group }},
   {{ .Arg .Resource }},

@@ -5,7 +5,7 @@ INSERT INTO {{ .Ident .TableName }}
 )
 VALUES (
   {{ .Arg .KeyPath }},
-  COALESCE({{ .Arg .Value }}, "")
+  {{ .Arg .Value }}
 )
 {{- if eq .DialectName "mysql" }}
 ON DUPLICATE KEY UPDATE {{ .Ident "value" }} = {{ .Arg .Value }}

@@ -349,6 +349,11 @@ func (w *sqlWriteCloser) Close() error {
 	}

 	w.closed = true
+	value := w.buf.Bytes()
+	if value == nil {
+		// to prevent NOT NULL constraint violations
+		value = []byte{}
+	}

 	// do regular kv save: simple key_path + value insert with conflict check.
 	// can only do this on resource_events for now, until we drop the columns in resource_history
@@ -356,7 +361,7 @@ func (w *sqlWriteCloser) Close() error {
 	_, err := dbutil.Exec(w.ctx, w.kv.db, sqlKVSaveEvent, sqlKVSaveRequest{
 		SQLTemplate:     sqltemplate.New(w.kv.dialect),
 		sqlKVSectionKey: w.sectionKey,
-		Value:           w.buf.Bytes(),
+		Value:           value,
 	})

 	if err != nil {
@@ -380,7 +385,7 @@ func (w *sqlWriteCloser) Close() error {
 		SQLTemplate:     sqltemplate.New(w.kv.dialect),
 		sqlKVSectionKey: w.sectionKey,
 		GUID:            uuid.New().String(),
-		Value:           w.buf.Bytes(),
+		Value:           value,
 	})

 	if err != nil {
@@ -397,7 +402,7 @@ func (w *sqlWriteCloser) Close() error {
 	_, err = dbutil.Exec(w.ctx, w.kv.db, sqlKVUpdateData, sqlKVSaveRequest{
 		SQLTemplate:     sqltemplate.New(w.kv.dialect),
 		sqlKVSectionKey: w.sectionKey,
-		Value:           w.buf.Bytes(),
+		Value:           value,
 	})

 	if err != nil {
@@ -433,7 +438,7 @@ func (w *sqlWriteCloser) Close() error {
 	_, err = dbutil.Exec(w.ctx, tx, sqlKVInsertLegacyResourceHistory, sqlKVSaveRequest{
 		SQLTemplate:     sqltemplate.New(w.kv.dialect),
 		sqlKVSectionKey: w.sectionKey,
-		Value:           w.buf.Bytes(),
+		Value:           value,
 		GUID:            dataKey.GUID,
 		Group:           dataKey.Group,
 		Resource:        dataKey.Resource,

@@ -217,5 +217,8 @@ func initResourceTables(mg *migrator.Migrator) string {

 	migrator.ConvertUniqueKeyToPrimaryKey(mg, oldResourceVersionUniqueKey, updatedResourceVersionTable)

+	mg.AddMigration("Change key_path collation of resource_history in postgres", migrator.NewRawSQLMigration("").Postgres(`ALTER TABLE resource_history ALTER COLUMN key_path TYPE VARCHAR(2048) COLLATE "C";`))
+	mg.AddMigration("Change key_path collation of resource_events in postgres", migrator.NewRawSQLMigration("").Postgres(`ALTER TABLE resource_events ALTER COLUMN key_path TYPE VARCHAR(2048) COLLATE "C";`))
+
 	return marker
 }

@@ -87,7 +87,7 @@
         "tags": [
           "ExternalGroupMapping"
         ],
-        "description": "list or watch objects of kind ExternalGroupMapping",
+        "description": "list objects of kind ExternalGroupMapping",
         "operationId": "listExternalGroupMapping",
         "parameters": [
           {
@@ -8690,32 +8690,6 @@
       "description": "Time is a wrapper around time.Time which supports correct marshaling to YAML and JSON. Wrappers are provided for many of the factory methods that the time package offers.",
       "type": "string",
       "format": "date-time"
     },
-    "io.k8s.apimachinery.pkg.apis.meta.v1.WatchEvent": {
-      "description": "Event represents a single event to a watched resource.",
-      "type": "object",
-      "required": [
-        "type",
-        "object"
-      ],
-      "properties": {
-        "object": {
-          "description": "Object is:\n * If Type is Added or Modified: the new state of the object.\n * If Type is Deleted: the state of the object immediately before deletion.\n * If Type is Error: *Status is recommended; other types may make sense\n  depending on context.",
-          "allOf": [
-            {
-              "$ref": "#/components/schemas/io.k8s.apimachinery.pkg.runtime.RawExtension"
-            }
-          ]
-        },
-        "type": {
-          "type": "string",
-          "default": ""
-        }
-      }
-    },
-    "io.k8s.apimachinery.pkg.runtime.RawExtension": {
-      "description": "RawExtension is used to hold extensions in external versions.\n\nTo use this, make a field which has RawExtension as its type in your external, versioned struct, and Object in your internal struct. You also need to register your various plugin types.\n\n// Internal package:\n\n\ttype MyAPIObject struct {\n\t\truntime.TypeMeta `json:\",inline\"`\n\t\tMyPlugin runtime.Object `json:\"myPlugin\"`\n\t}\n\n\ttype PluginA struct {\n\t\tAOption string `json:\"aOption\"`\n\t}\n\n// External package:\n\n\ttype MyAPIObject struct {\n\t\truntime.TypeMeta `json:\",inline\"`\n\t\tMyPlugin runtime.RawExtension `json:\"myPlugin\"`\n\t}\n\n\ttype PluginA struct {\n\t\tAOption string `json:\"aOption\"`\n\t}\n\n// On the wire, the JSON will look something like this:\n\n\t{\n\t\t\"kind\":\"MyAPIObject\",\n\t\t\"apiVersion\":\"v1\",\n\t\t\"myPlugin\": {\n\t\t\t\"kind\":\"PluginA\",\n\t\t\t\"aOption\":\"foo\",\n\t\t},\n\t}\n\nSo what happens? Decode first uses json or yaml to unmarshal the serialized data into your external MyAPIObject. That causes the raw JSON to be stored, but not unpacked. The next step is to copy (using pkg/conversion) into the internal struct. The runtime package's DefaultScheme has conversion functions installed which will unpack the JSON stored in RawExtension, turning it into the correct object type, and storing it in the Object. (TODO: In the case where the object is of an unknown type, a runtime.Unknown object will be created and stored.)",
-      "type": "object"
-    }
   }
 }

@@ -557,6 +557,8 @@ func CreateGrafDir(t *testing.T, opts GrafanaOpts) (string, string) {
 			require.NoError(t, err)
 			_, err = section.NewKey("enableMigration", fmt.Sprintf("%t", v.EnableMigration))
 			require.NoError(t, err)
+			_, err = section.NewKey("autoMigrationThreshold", fmt.Sprintf("%d", v.AutoMigrationThreshold))
+			require.NoError(t, err)
 		}
 	}
 	if opts.UnifiedStorageEnableSearch {

public/api-enterprise-spec.json (generated, 3 changes)

@@ -6990,6 +6990,9 @@
     "ReportOptions": {
       "type": "object",
       "properties": {
+        "csvEncoding": {
+          "type": "string"
+        },
         "layout": {
           "type": "string"
         },

public/api-merged.json (generated, 3 changes)

@@ -20504,6 +20504,9 @@
     "ReportOptions": {
       "type": "object",
       "properties": {
+        "csvEncoding": {
+          "type": "string"
+        },
         "layout": {
           "type": "string"
         },

@@ -10,11 +10,8 @@ import { isShallowEqual } from 'app/core/utils/isShallowEqual';
 import { KioskMode } from 'app/types/dashboard';

 import { RouteDescriptor } from '../../navigation/types';
-import { buildBreadcrumbs } from '../Breadcrumbs/utils';

-import { logDuplicateUnifiedHistoryEntryEvent } from './History/eventsTracking';
 import { ReturnToPreviousProps } from './ReturnToPrevious/ReturnToPrevious';
-import { HistoryEntry } from './types';

 export interface AppChromeState {
   chromeless?: boolean;
@@ -34,7 +31,6 @@ export interface AppChromeState {

 export const DOCKED_LOCAL_STORAGE_KEY = 'grafana.navigation.docked';
 export const DOCKED_MENU_OPEN_LOCAL_STORAGE_KEY = 'grafana.navigation.open';
-export const HISTORY_LOCAL_STORAGE_KEY = 'grafana.navigation.history';

 export class AppChromeService {
   searchBarStorageKey = 'SearchBar_Hidden';
@@ -88,8 +84,6 @@ export class AppChromeService {
     newState.chromeless = newState.kioskMode === KioskMode.Full || this.currentRoute?.chromeless;

     if (!this.ignoreStateUpdate(newState, current)) {
-      config.featureToggles.unifiedHistory &&
-        store.setObject(HISTORY_LOCAL_STORAGE_KEY, this.getUpdatedHistory(newState));
       this.state.next(newState);
     }
   }
@@ -118,40 +112,6 @@ export class AppChromeService {
     window.sessionStorage.removeItem('returnToPrevious');
   };

-  private getUpdatedHistory(newState: AppChromeState): HistoryEntry[] {
-    const breadcrumbs = buildBreadcrumbs(newState.sectionNav.node, newState.pageNav, { text: 'Home', url: '/' });
-    const newPageNav = newState.pageNav || newState.sectionNav.node;
-
-    let entries = store.getObject<HistoryEntry[]>(HISTORY_LOCAL_STORAGE_KEY, []);
-    const clickedHistory = store.getObject<boolean>('CLICKING_HISTORY');
-    if (clickedHistory) {
-      store.setObject('CLICKING_HISTORY', false);
-      return entries;
-    }
-    if (!newPageNav) {
-      return entries;
-    }
-
-    const lastEntry = entries[0];
-    const newEntry = { name: newPageNav.text, views: [], breadcrumbs, time: Date.now(), url: window.location.href };
-    const isSamePath = lastEntry && newEntry.url.split('?')[0] === lastEntry.url.split('?')[0];
-
-    // To avoid adding an entry with the same path twice, we always use the latest one
-    if (isSamePath) {
-      entries[0] = newEntry;
-    } else {
-      if (lastEntry && lastEntry.name === newEntry.name) {
-        logDuplicateUnifiedHistoryEntryEvent({
-          entryName: newEntry.name,
-          lastEntryURL: lastEntry.url,
-          newEntryURL: newEntry.url,
-        });
-      }
-      entries = [newEntry, ...entries];
-    }
-
-    return entries;
-  }
   private ignoreStateUpdate(newState: AppChromeState, current: AppChromeState) {
     if (isShallowEqual(newState, current)) {
       return true;

@@ -1,87 +0,0 @@
-import { css } from '@emotion/css';
-import { useEffect } from 'react';
-import { useToggle } from 'react-use';
-
-import { GrafanaTheme2, store } from '@grafana/data';
-import { t } from '@grafana/i18n';
-import { Drawer, ToolbarButton, useStyles2 } from '@grafana/ui';
-import { appEvents } from 'app/core/app_events';
-import { RecordHistoryEntryEvent } from 'app/types/events';
-
-import { HISTORY_LOCAL_STORAGE_KEY } from '../AppChromeService';
-import { NavToolbarSeparator } from '../NavToolbar/NavToolbarSeparator';
-import { HistoryEntry } from '../types';
-
-import { HistoryWrapper } from './HistoryWrapper';
-import { logUnifiedHistoryDrawerInteractionEvent } from './eventsTracking';
-
-export function HistoryContainer() {
-  const [showHistoryDrawer, onToggleShowHistoryDrawer] = useToggle(false);
-  const styles = useStyles2(getStyles);
-
-  useEffect(() => {
-    const sub = appEvents.subscribe(RecordHistoryEntryEvent, (ev) => {
-      const clickedHistory = store.getObject<boolean>('CLICKING_HISTORY');
-      if (clickedHistory) {
-        store.setObject('CLICKING_HISTORY', false);
-        return;
-      }
-      const history = store.getObject<HistoryEntry[]>(HISTORY_LOCAL_STORAGE_KEY, []);
-      let lastEntry = history[0];
-      const newUrl = ev.payload.url;
-      const lastUrl = lastEntry.views[0]?.url;
-      if (lastUrl !== newUrl) {
-        lastEntry.views = [
-          {
-            name: ev.payload.name,
-            description: ev.payload.description,
-            url: newUrl,
-            time: Date.now(),
-          },
-          ...lastEntry.views,
-        ];
-        store.setObject(HISTORY_LOCAL_STORAGE_KEY, [...history]);
-      }
-      return () => {
-        sub.unsubscribe();
-      };
-    });
-  }, []);
-
-  return (
-    <>
-      <ToolbarButton
-        onClick={() => {
-          onToggleShowHistoryDrawer();
-          logUnifiedHistoryDrawerInteractionEvent({ type: 'open' });
-        }}
-        iconOnly
-        icon="history"
-        aria-label={t('nav.history-container.drawer-tittle', 'History')}
-      />
-      <NavToolbarSeparator className={styles.separator} />
-      {showHistoryDrawer && (
-        <Drawer
-          title={t('nav.history-container.drawer-tittle', 'History')}
-          onClose={() => {
-            onToggleShowHistoryDrawer();
-            logUnifiedHistoryDrawerInteractionEvent({ type: 'close' });
-          }}
-          size="sm"
-        >
-          <HistoryWrapper onClose={() => onToggleShowHistoryDrawer(false)} />
-        </Drawer>
-      )}
-    </>
-  );
-}
-
-const getStyles = (theme: GrafanaTheme2) => {
-  return {
-    separator: css({
-      [theme.breakpoints.down('sm')]: {
-        display: 'none',
-      },
-    }),
-  };
-};

@@ -1,291 +0,0 @@
-import { css, cx } from '@emotion/css';
-import moment from 'moment';
-import { useState } from 'react';
-
-import { FieldType, GrafanaTheme2, store } from '@grafana/data';
-import { t } from '@grafana/i18n';
-import { Box, Button, Card, Icon, IconButton, Space, Sparkline, Stack, Text, useStyles2, useTheme2 } from '@grafana/ui';
-import { formatDate } from 'app/core/internationalization/dates';
-
-import { HISTORY_LOCAL_STORAGE_KEY } from '../AppChromeService';
-import { HistoryEntry } from '../types';
-
-import { logClickUnifiedHistoryEntryEvent, logUnifiedHistoryShowMoreEvent } from './eventsTracking';
-
-export function HistoryWrapper({ onClose }: { onClose: () => void }) {
-  const history = store.getObject<HistoryEntry[]>(HISTORY_LOCAL_STORAGE_KEY, []).filter((entry) => {
-    return moment(entry.time).isAfter(moment().subtract(2, 'day').startOf('day'));
-  });
-  const [numItemsToShow, setNumItemsToShow] = useState(5);
-
-  const selectedTime = history.find((entry) => {
-    return entry.url === window.location.href || entry.views.some((view) => view.url === window.location.href);
-  })?.time;
-
-  const hist = history.slice(0, numItemsToShow).reduce((acc: { [key: string]: HistoryEntry[] }, entry) => {
-    const date = moment(entry.time);
-    let key = '';
-    if (date.isSame(moment(), 'day')) {
-      key = t('nav.history-wrapper.today', 'Today');
-    } else if (date.isSame(moment().subtract(1, 'day'), 'day')) {
-      key = t('nav.history-wrapper.yesterday', 'Yesterday');
-    } else {
-      key = date.format('YYYY-MM-DD');
-    }
-    acc[key] = [...(acc[key] || []), entry];
-    return acc;
-  }, {});
-  const styles = useStyles2(getStyles);
-  return (
-    <Stack direction="column" alignItems="flex-start">
-      <Box width="100%">
-        {Object.keys(hist).map((entries, date) => {
-          return (
-            <Stack key={date} direction="column" gap={1}>
-              <Box paddingLeft={2}>
-                <Text color="secondary">{entries}</Text>
-              </Box>
-              <div className={styles.timeline}>
-                {hist[entries].map((entry, index) => {
-                  return (
-                    <HistoryEntryAppView
-                      key={index}
-                      entry={entry}
-                      isSelected={entry.time === selectedTime}
-                      onClick={() => onClose()}
-                    />
-                  );
-                })}
-              </div>
-            </Stack>
-          );
-        })}
-      </Box>
-      {history.length > numItemsToShow && (
-        <Box paddingLeft={2}>
-          <Button
-            variant="secondary"
-            fill="text"
-            onClick={() => {
-              setNumItemsToShow(numItemsToShow + 5);
-              logUnifiedHistoryShowMoreEvent();
-            }}
-          >
-            {t('nav.history-wrapper.show-more', 'Show more')}
-          </Button>
-        </Box>
-      )}
-    </Stack>
-  );
-}
-interface ItemProps {
-  entry: HistoryEntry;
-  isSelected: boolean;
-  onClick: () => void;
-}
-
-function HistoryEntryAppView({ entry, isSelected, onClick }: ItemProps) {
-  const styles = useStyles2(getStyles);
-  const theme = useTheme2();
-  const [isExpanded, setIsExpanded] = useState(isSelected && entry.views.length > 0);
-
-  const { breadcrumbs, views, time, url, sparklineData } = entry;
-  const expandedLabel = isExpanded
-    ? t('nav.history-wrapper.collapse', 'Collapse')
-    : t('nav.history-wrapper.expand', 'Expand');
-  const entryIconLabel = isExpanded
-    ? t('nav.history-wrapper.icon-selected', 'Selected Entry')
-    : t('nav.history-wrapper.icon-unselected', 'Normal Entry');
-  const selectedViewTime =
-    isSelected &&
-    entry.views.find((entry) => {
-      return entry.url === window.location.href;
-    })?.time;
-
-  return (
-    <Box marginBottom={1}>
-      <Stack direction="column" gap={1}>
-        <Stack alignItems="baseline">
-          {views.length > 0 ? (
-            <IconButton
-              name={isExpanded ? 'angle-down' : 'angle-right'}
-              onClick={() => setIsExpanded(!isExpanded)}
-              aria-label={expandedLabel}
-              className={styles.iconButton}
-            />
-          ) : (
-            <Space h={2} />
-          )}
-          <Icon
-            size="sm"
-            name={isSelected ? 'circle-mono' : 'circle'}
-            aria-label={entryIconLabel}
-            className={isExpanded ? styles.iconButtonDot : styles.iconButtonCircle}
-          />
-          <Card
-            noMargin
-            onClick={() => {
-              store.setObject('CLICKING_HISTORY', true);
-              onClick();
-              logClickUnifiedHistoryEntryEvent({ entryURL: url });
-            }}
-            href={url}
-            isCompact={true}
-            className={isSelected ? styles.card : cx(styles.card, styles.cardSelected)}
-          >
-            <Stack direction="column">
-              <div>
-                {breadcrumbs.map((breadcrumb, index) => (
-                  <Text key={index}>
-                    {breadcrumb.text}{' '}
-                    {index !== breadcrumbs.length - 1
-                      ? // eslint-disable-next-line @grafana/i18n/no-untranslated-strings
-                        '> '
-                      : ''}
-                  </Text>
-                ))}
-              </div>
-              <Text variant="bodySmall" color="secondary">
-                {formatDate(time, { timeStyle: 'short' })}
-              </Text>
-              {sparklineData && (
-                <Sparkline
-                  theme={theme}
-                  width={240}
-                  height={40}
-                  config={{
-                    custom: {
-                      fillColor: 'rgba(130, 181, 216, 0.1)',
-                      lineColor: '#82B5D8',
-                    },
-                  }}
-                  sparkline={{
-                    y: {
-                      type: FieldType.number,
-                      name: 'test',
-                      config: {},
-                      values: sparklineData.values,
-                      state: {
-                        range: {
-                          ...sparklineData.range,
-                        },
-                      },
-                    },
-                  }}
-                />
-              )}
-            </Stack>
-          </Card>
-        </Stack>
-        {isExpanded && (
-          <div className={styles.expanded}>
-            {views.map((view, index) => {
-              return (
-                <Card
-                  key={index}
-                  noMargin
-                  href={view.url}
-                  onClick={() => {
-                    store.setObject('CLICKING_HISTORY', true);
-                    onClick();
-                    logClickUnifiedHistoryEntryEvent({ entryURL: view.url, subEntry: 'timeRange' });
-                  }}
-                  isCompact={true}
-                  className={view.time === selectedViewTime ? undefined : styles.subCard}
-                >
-                  <Stack direction="column" gap={0}>
-                    <Text variant="bodySmall">{view.name}</Text>
-                    {view.description && (
-                      <Text color="secondary" variant="bodySmall">
-                        {view.description}
-                      </Text>
-                    )}
-                  </Stack>
-                </Card>
-              );
-            })}
-          </div>
-        )}
-      </Stack>
-    </Box>
-  );
-}
-const getStyles = (theme: GrafanaTheme2) => {
-  return {
-    card: css({
-      label: 'card',
-      background: 'none',
-      margin: theme.spacing(0.5, 0),
-    }),
-    cardSelected: css({
-      label: 'card-selected',
-      background: 'none',
-    }),
-    subCard: css({
-      label: 'subcard',
-      background: 'none',
-      margin: 0,
-    }),
-    iconButton: css({
-      label: 'expand-button',
-      margin: 0,
-    }),
-    iconButtonCircle: css({
-      label: 'blue-circle-icon',
-      margin: 0,
-      background: theme.colors.background.primary,
-      fill: theme.colors.primary.main,
-      cursor: 'default',
-      '&:hover:before': {
-        background: 'none',
-      },
-      //Need this to place the icon on the line, otherwise the line will appear on top of the icon
-      zIndex: 0,
-    }),
-    iconButtonDot: css({
-      label: 'blue-dot-icon',
-      margin: 0,
-      color: theme.colors.primary.main,
-      border: theme.shape.radius.circle,
-      cursor: 'default',
-      '&:hover:before': {
-        background: 'none',
-      },
-      //Need this to place the icon on the line, otherwise the line will appear on top of the icon
-      zIndex: 0,
-    }),
-    expanded: css({
-      label: 'expanded',
-      display: 'flex',
-      flexDirection: 'column',
-      marginLeft: theme.spacing(6),
-      gap: theme.spacing(1),
-      position: 'relative',
-      '&:before': {
-        content: '""',
-        position: 'absolute',
-        left: 0,
-        top: 0,
-        height: '100%',
-        width: '1px',
-        background: theme.colors.border.weak,
-      },
-    }),
-    timeline: css({
-      label: 'timeline',
-      position: 'relative',
-      height: '100%',
-      width: '100%',
-      paddingLeft: theme.spacing(2),
-      '&:before': {
-        content: '""',
-        position: 'absolute',
-        left: theme.spacing(5.75),
-        top: 0,
-        height: '100%',
-        width: '1px',
-        borderLeft: `1px dashed ${theme.colors.border.strong}`,
-      },
-    }),
-  };
-};

@@ -1,64 +0,0 @@
-import { reportInteraction } from '@grafana/runtime';
-
-const UNIFIED_HISTORY_ENTRY_CLICKED = 'grafana_unified_history_entry_clicked';
-const UNIFIED_HISTORY_ENTRY_DUPLICATED = 'grafana_unified_history_duplicated_entry_rendered';
-const UNIFIED_HISTORY_DRAWER_INTERACTION = 'grafana_unified_history_drawer_interaction';
-const UNIFIED_HISTORY_DRAWER_SHOW_MORE = 'grafana_unified_history_show_more';
-
-//Currently just 'timeRange' is supported
-//in short term, we could add 'templateVariables' for example
-type subEntryTypes = 'timeRange';
-
-//Whether the user opens or closes the `HistoryDrawer`
-type UnifiedHistoryDrawerInteraction = 'open' | 'close';
-
-interface UnifiedHistoryEntryClicked {
-  //We will also work with the current URL but we will get this from Rudderstack data
-  //URL to return to
-  entryURL: string;
-  //In the case we want to go back to a specific query param, currently just a specific time range
-  subEntry?: subEntryTypes;
-}
-
-interface UnifiedHistoryEntryDuplicated {
-  // Common name of the history entries
-  entryName: string;
-  // URL of the last entry
-  lastEntryURL: string;
-  // URL of the new entry
-  newEntryURL: string;
-}
-
-//Event triggered when a user clicks on an entry of the `HistoryDrawer`
-export const logClickUnifiedHistoryEntryEvent = ({ entryURL, subEntry }: UnifiedHistoryEntryClicked) => {
-  reportInteraction(UNIFIED_HISTORY_ENTRY_CLICKED, {
-    entryURL,
-    subEntry,
-  });
-};
-
-//Event triggered when history entry name matches the previous one
-//so we keep track of duplicated entries and be able to analyze them
-export const logDuplicateUnifiedHistoryEntryEvent = ({
-  entryName,
-  lastEntryURL,
-  newEntryURL,
-}: UnifiedHistoryEntryDuplicated) => {
-  reportInteraction(UNIFIED_HISTORY_ENTRY_DUPLICATED, {
-    entryName,
-    lastEntryURL,
-    newEntryURL,
-  });
-};
-
-//We keep track of users open and closing the drawer
-export const logUnifiedHistoryDrawerInteractionEvent = ({ type }: { type: UnifiedHistoryDrawerInteraction }) => {
-  reportInteraction(UNIFIED_HISTORY_DRAWER_INTERACTION, {
-    type,
-  });
-};
-
-//We keep track of users clicking on the `Show more` button
-export const logUnifiedHistoryShowMoreEvent = () => {
-  reportInteraction(UNIFIED_HISTORY_DRAWER_SHOW_MORE);
-};

@@ -6,7 +6,6 @@ import { Components } from '@grafana/e2e-selectors';
 import { t } from '@grafana/i18n';
 import { ScopesContextValue } from '@grafana/runtime';
 import { Icon, Stack, ToolbarButton, useStyles2 } from '@grafana/ui';
-import { config } from 'app/core/config';
 import { MEGA_MENU_TOGGLE_ID } from 'app/core/constants';
 import { useGrafana } from 'app/core/context/GrafanaContext';
 import { useMediaQueryMinWidth } from 'app/core/hooks/useMediaQueryMinWidth';
@@ -19,7 +18,6 @@ import { HomeLink } from '../../Branding/Branding';
 import { Breadcrumbs } from '../../Breadcrumbs/Breadcrumbs';
 import { buildBreadcrumbs } from '../../Breadcrumbs/utils';
 import { ExtensionToolbarItem } from '../ExtensionSidebar/ExtensionToolbarItem';
-import { HistoryContainer } from '../History/HistoryContainer';
 import { NavToolbarSeparator } from '../NavToolbar/NavToolbarSeparator';
 import { QuickAdd } from '../QuickAdd/QuickAdd';
@@ -60,7 +58,6 @@ export const SingleTopBar = memo(function SingleTopBar({
   const profileNode = useSelector((state) => state.navIndex['profile']);
   const homeNav = useSelector((state) => state.navIndex)[HOME_NAV_ID];
   const breadcrumbs = buildBreadcrumbs(sectionNav, pageNav, homeNav);
-  const unifiedHistoryEnabled = config.featureToggles.unifiedHistory;
   const isSmallScreen = !useMediaQueryMinWidth('sm');
   const isLargeScreen = useMediaQueryMinWidth('lg');
   const topLevelScopes = !showToolbarLevel && isLargeScreen && scopes?.state.enabled;
@@ -96,7 +93,6 @@ export const SingleTopBar = memo(function SingleTopBar({
         >
           <TopBarExtensionPoint />
           <TopSearchBarCommandPaletteTrigger />
-          {unifiedHistoryEnabled && !isSmallScreen && <HistoryContainer />}
           {!isSmallScreen && <QuickAdd />}
           <HelpTopBarButton isSmallScreen={isSmallScreen} />
           <NavToolbarSeparator />

@@ -4,28 +4,3 @@ export interface ToolbarUpdateProps {
   pageNav?: NavModelItem;
   actions?: React.ReactNode;
 }
-
-export interface HistoryEntryView {
-  name: string;
-  description: string;
-  url: string;
-  time: number;
-}
-
-export interface HistoryEntrySparkline {
-  values: number[];
-  range: {
-    min: number;
-    max: number;
-    delta: number;
-  };
-}
-
-export interface HistoryEntry {
-  name: string;
-  time: number;
-  breadcrumbs: NavModelItem[];
-  url: string;
-  views: HistoryEntryView[];
-  sparklineData?: HistoryEntrySparkline;
-}

@@ -36,7 +36,11 @@ import {
   isExpressionQueryInAlert,
 } from '../../../rule-editor/formProcessing';
 import { RuleFormType, RuleFormValues } from '../../../types/rule-form';
-import { GRAFANA_RULES_SOURCE_NAME, getDefaultOrFirstCompatibleDataSource } from '../../../utils/datasource';
+import {
+  GRAFANA_RULES_SOURCE_NAME,
+  getDefaultOrFirstCompatibleDataSource,
+  getRulesDataSources,
+} from '../../../utils/datasource';
 import { PromOrLokiQuery, isPromOrLokiQuery } from '../../../utils/rule-form';
 import {
   isCloudAlertingRuleByType,
@@ -417,7 +421,9 @@ export const QueryAndExpressionsStep = ({ editingExistingRule, onDataChange, mod
   ]);

   const { sectionTitle, helpLabel, helpContent, helpLink } = DESCRIPTIONS[type ?? RuleFormType.grafana];

+  // Only show the data source managed option if there are data sources with manageAlerts enabled
+  const hasAlertEnabledDataSources = useMemo(() => getRulesDataSources().length > 0, []);
+  const canSelectDataSourceManaged = onlyOneDSInQueries(queries) && hasAlertEnabledDataSources;
   if (!type) {
     return null;
   }
@@ -437,8 +443,6 @@ export const QueryAndExpressionsStep = ({ editingExistingRule, onDataChange, mod
         }
       : undefined;

-  const canSelectDataSourceManaged = onlyOneDSInQueries(queries);
-
   return (
     <>
       <RuleEditorSection
@@ -506,7 +510,7 @@ export const QueryAndExpressionsStep = ({ editingExistingRule, onDataChange, mod
             }}
           />
         </Field>
-        {mode === 'edit' && (
+        {mode === 'edit' && hasAlertEnabledDataSources && (
           <>
             <Divider />
             <SmartAlertTypeDetector

@@ -194,4 +194,53 @@ describe('RuleEditor grafana managed rules', () => {
       ]),
     });
   });
+
+  it('should not show rule type switch when no data sources have manageAlerts enabled', async () => {
+    // Setup data source with manageAlerts explicitly disabled
+    setupDataSources(
+      mockDataSource(
+        {
+          type: 'prometheus',
+          name: 'Prom-disabled',
+          uid: 'prometheus-disabled',
+          isDefault: true,
+          jsonData: { manageAlerts: false },
+        },
+        { alerting: true, module: 'core:plugin/prometheus' }
+      )
+    );
+
+    renderRuleEditor();
+
+    // Wait for the form to load
+    await screen.findByRole('textbox', { name: 'name' });
+
+    // The rule type switch should NOT be visible
+    expect(screen.queryByText('Rule type')).not.toBeInTheDocument();
+    expect(screen.queryByTestId('rule-type-radio-group')).not.toBeInTheDocument();
+  });
+
+  it('should show rule type switch when data sources have manageAlerts enabled', async () => {
+    // Setup data source with manageAlerts enabled
+    setupDataSources(
+      mockDataSource(
+        {
+          type: 'prometheus',
+          name: 'Prom-enabled',
+          uid: 'prometheus-enabled',
+          isDefault: true,
+          jsonData: { manageAlerts: true },
+        },
+        { alerting: true, module: 'core:plugin/prometheus' }
+      )
+    );
+
+    renderRuleEditor();
+
+    // Wait for the form to load
+    await screen.findByRole('textbox', { name: 'name' });
+
+    // The rule type section should be visible
+    expect(await screen.findByText('Rule type')).toBeInTheDocument();
+  });
 });

@@ -7,10 +7,11 @@ import { setPluginComponentsHook, setPluginLinksHook } from '@grafana/runtime';
 import { AccessControlAction } from 'app/types/accessControl';

 import { setupMswServer } from '../mockApi';
-import { grantUserPermissions, grantUserRole } from '../mocks';
+import { grantUserPermissions, grantUserRole, mockDataSource } from '../mocks';
 import { setGrafanaRuleGroupExportResolver } from '../mocks/server/configure';
 import { alertingFactory } from '../mocks/server/db';
 import { RulesFilter } from '../search/rulesSearchParser';
+import { setupDataSources } from '../testSetup/datasources';

 import RuleListPage, { RuleListActions } from './RuleList.v2';
 import { loadDefaultSavedSearch } from './filter/useSavedSearches';
@@ -365,6 +366,51 @@ describe('RuleListActions', () => {
       expect(ui.exportDrawer.query()).toBeInTheDocument();
     });
   });
+
+  describe('Data source options visibility', () => {
+    it('should not show "New Data source recording rule" option when no data sources have manageAlerts enabled', async () => {
+      // Set up only data sources with manageAlerts explicitly set to false
+      // This replaces the default data sources that have manageAlerts defaulting to true
+      setupDataSources(
+        mockDataSource({
+          name: 'Prometheus-disabled',
+          uid: 'prometheus-disabled',
+          type: 'prometheus',
+          jsonData: { manageAlerts: false },
+        })
+      );
+
+      grantUserPermissions([AccessControlAction.AlertingRuleExternalWrite]);
+
+      const { user } = render(<RuleListActions />);
+
+      await user.click(ui.moreButton.get());
+      const menu = await ui.moreMenu.find();
+
+      expect(ui.menuOptions.newDataSourceRecordingRule.query(menu)).not.toBeInTheDocument();
+    });
+
+    it('should show "New Data source recording rule" option when data sources have manageAlerts enabled', async () => {
+      // Set up data source with manageAlerts enabled
+      setupDataSources(
+        mockDataSource({
+          name: 'Prometheus-enabled',
+          uid: 'prometheus-enabled',
+          type: 'prometheus',
+          jsonData: { manageAlerts: true },
+        })
+      );
+
+      grantUserPermissions([AccessControlAction.AlertingRuleExternalWrite]);
+
+      const { user } = render(<RuleListActions />);
+
+      await user.click(ui.moreButton.get());
+      const menu = await ui.moreMenu.find();
+
+      expect(ui.menuOptions.newDataSourceRecordingRule.query(menu)).toBeInTheDocument();
+    });
+  });
 });

 describe('RuleListPage v2 - View switching', () => {

@@ -13,6 +13,7 @@ import { useListViewMode } from '../components/rules/Filter/RulesViewModeSelecto
import { AIAlertRuleButtonComponent } from '../enterprise-components/AI/AIGenAlertRuleButton/addAIAlertRuleButton';
import { AlertingAction, useAlertingAbility } from '../hooks/useAbilities';
import { useRulesFilter } from '../hooks/useFilteredRules';
import { getRulesDataSources } from '../utils/datasource';

import { FilterView } from './FilterView';
import { GroupedView } from './GroupedView';
@@ -41,8 +42,11 @@ export function RuleListActions() {
const [createCloudRuleSupported, createCloudRuleAllowed] = useAlertingAbility(AlertingAction.CreateExternalAlertRule);
const [exportRulesSupported, exportRulesAllowed] = useAlertingAbility(AlertingAction.ExportGrafanaManagedRules);

// Check if there are any data sources with manageAlerts enabled
const hasAlertEnabledDataSources = useMemo(() => getRulesDataSources().length > 0, []);

const canCreateGrafanaRules = createGrafanaRuleSupported && createGrafanaRuleAllowed;
const canCreateCloudRules = createCloudRuleSupported && createCloudRuleAllowed;
const canCreateCloudRules = createCloudRuleSupported && createCloudRuleAllowed && hasAlertEnabledDataSources;
const canExportRules = exportRulesSupported && exportRulesAllowed;

const canCreateRules = canCreateGrafanaRules || canCreateCloudRules;

@@ -40,7 +40,11 @@ import { PanelEditor } from '../panel-edit/PanelEditor';
import { DashboardScene } from '../scene/DashboardScene';
import { buildNewDashboardSaveModel, buildNewDashboardSaveModelV2 } from '../serialization/buildNewDashboardSaveModel';
import { transformSaveModelSchemaV2ToScene } from '../serialization/transformSaveModelSchemaV2ToScene';
import { transformSaveModelToScene } from '../serialization/transformSaveModelToScene';
import {
createV2RowsLayout,
SceneCreationOptions,
transformSaveModelToScene,
} from '../serialization/transformSaveModelToScene';
import { restoreDashboardStateFromLocalStorage } from '../utils/dashboardSessionState';

import { processQueryParamsForDashboardLoad, updateNavModel } from './utils';
@@ -106,6 +110,34 @@ interface DashboardScenePageStateManagerLike<T> {
useState: () => DashboardScenePageState;
}

/**
* Creates scene creation options with appropriate layout creator
* based on feature flags and dashboard type.
*/
export function getSceneCreationOptions(
loadOptions?: LoadDashboardOptions,
meta?: { isSnapshot?: boolean }
): SceneCreationOptions | undefined {
const isReport = loadOptions?.route === DashboardRoutes.Report;
const isTemplate = loadOptions?.route === DashboardRoutes.Template;
const isSnapshot = meta?.isSnapshot ?? false;

// Don't use v2 layout for reports or snapshots
if (isReport || isSnapshot || isTemplate) {
return undefined;
}

// Use v2 layout creator when v2 API is enabled
if (shouldForceV2API()) {
return {
createLayout: createV2RowsLayout,
targetVersion: 'v2',
};
}

return undefined;
}

abstract class DashboardScenePageStateManagerBase<T>
extends StateManagerBase<DashboardScenePageState>
implements DashboardScenePageStateManagerLike<T>
@@ -155,7 +187,7 @@ abstract class DashboardScenePageStateManagerBase<T>
private async loadHomeDashboard(): Promise<DashboardScene | null> {
const rsp = await this.fetchHomeDashboard();
if (rsp) {
return transformSaveModelToScene(rsp);
return transformSaveModelToScene(rsp, undefined, getSceneCreationOptions());
}

return null;
@@ -441,7 +473,8 @@ export class DashboardScenePageStateManager extends DashboardScenePageStateManag
}

if (rsp?.dashboard) {
const scene = transformSaveModelToScene(rsp, options);
const sceneCreationOptions = getSceneCreationOptions(options, rsp.meta);
const scene = transformSaveModelToScene(rsp, options, sceneCreationOptions);

// Special handling for Template route - set up edit mode and dirty state
if (
@@ -474,7 +507,8 @@ export class DashboardScenePageStateManager extends DashboardScenePageStateManag
throw new DashboardVersionError('v2beta1', 'Using legacy snapshot API to get a V2 dashboard');
}

const scene = transformSaveModelToScene(rsp);
// Snapshots should use default v1 layout
const scene = transformSaveModelToScene(rsp, undefined, getSceneCreationOptions(undefined, { isSnapshot: true }));
return scene;
}

@@ -755,7 +789,8 @@ export class DashboardScenePageStateManager extends DashboardScenePageStateManag
return;
}

const scene = transformSaveModelToScene(rsp);
const sceneCreationOptions = getSceneCreationOptions(undefined, rsp.meta);
const scene = transformSaveModelToScene(rsp, undefined, sceneCreationOptions);

// we need to call and restore dashboard state on every reload that pulls a new dashboard version
if (config.featureToggles.preserveDashboardStateWhenNavigating && Boolean(uid)) {

@@ -27,6 +27,7 @@ import { createPanelSaveModel } from 'app/features/dashboard/state/__fixtures__/
import { SHARED_DASHBOARD_QUERY, DASHBOARD_DATASOURCE_PLUGIN_ID } from 'app/plugins/datasource/dashboard/constants';
import { DashboardDataDTO } from 'app/types/dashboard';

import { getSceneCreationOptions } from '../pages/DashboardScenePageStateManager';
import { DashboardDataLayerSet } from '../scene/DashboardDataLayerSet';
import { LibraryPanelBehavior } from '../scene/LibraryPanelBehavior';
import { DashboardGridItem } from '../scene/layout-default/DashboardGridItem';
@@ -822,10 +823,14 @@ describe('transformSaveModelToScene', () => {
});

it('Should convert legacy rows to new rows', () => {
const scene = transformSaveModelToScene({
dashboard: repeatingRowsAndPanelsDashboardJson as DashboardDataDTO,
meta: {},
});
const scene = transformSaveModelToScene(
{
dashboard: repeatingRowsAndPanelsDashboardJson as DashboardDataDTO,
meta: {},
},
undefined,
getSceneCreationOptions()
);

const layout = scene.state.body as RowsLayoutManager;
const row1 = layout.state.rows[0];
@@ -857,10 +862,14 @@ describe('transformSaveModelToScene', () => {
});

it('Should convert legacy rows to new rows with free panels before first row', () => {
const scene = transformSaveModelToScene({
dashboard: rowsAfterFreePanels as DashboardDataDTO,
meta: {},
});
const scene = transformSaveModelToScene(
{
dashboard: rowsAfterFreePanels as DashboardDataDTO,
meta: {},
},
undefined,
getSceneCreationOptions()
);

const layout = scene.state.body as RowsLayoutManager;
const row1 = layout.state.rows[0];

@@ -29,11 +29,11 @@ import {
} from 'app/features/dashboard/services/DashboardProfiler';
import { DashboardModel } from 'app/features/dashboard/state/DashboardModel';
import { PanelModel } from 'app/features/dashboard/state/PanelModel';
import { DashboardDTO, DashboardDataDTO, DashboardRoutes } from 'app/types/dashboard';
import { DashboardDTO, DashboardDataDTO } from 'app/types/dashboard';

import { addPanelsOnLoadBehavior } from '../addToDashboard/addPanelsOnLoadBehavior';
import { dashboardAnalyticsInitializer } from '../behaviors/DashboardAnalyticsInitializerBehavior';
import { LoadDashboardOptions, shouldForceV2API } from '../pages/DashboardScenePageStateManager';
import { LoadDashboardOptions } from '../pages/DashboardScenePageStateManager';
import { AlertStatesDataLayer } from '../scene/AlertStatesDataLayer';
import { DashboardAnnotationsDataLayer } from '../scene/DashboardAnnotationsDataLayer';
import { DashboardControls } from '../scene/DashboardControls';
@@ -76,11 +76,53 @@ export interface SaveModelToSceneOptions {
isEmbedded?: boolean;
}

export function transformSaveModelToScene(rsp: DashboardDTO, options?: LoadDashboardOptions): DashboardScene {
type LayoutCreator = (panels: PanelModel[], preload?: boolean) => DashboardLayoutManager;

export interface SceneCreationOptions {
/**
* When provided, this function is used to create the dashboard body/layout instead of the default v1 behavior.
* This allows callers to inject v2 layout strategy.
*/
createLayout?: LayoutCreator;
/**
* Determines how the dashboard scene is serialized.
* @default 'v1'
*/
targetVersion?: 'v1' | 'v2';
}

// Rows as SceneGridRow within the grid.
const createDefaultGridLayout: LayoutCreator = (panels, preload) => {
return new DefaultGridLayoutManager({
grid: new SceneGridLayout({
isLazy: getIsLazy(preload),
children: createSceneObjectsForPanels(panels),
}),
});
};

/**
* V2 layout creator - uses RowsLayoutManager when dashboard has rows.
* This creates a layout that can be properly serialized to v2 format.
*/
export const createV2RowsLayout: LayoutCreator = (panels, preload) => {
const hasRows = panels.some((p) => p.type === 'row');
if (hasRows) {
return createRowsFromPanels(panels);
}
// Fall back to default grid layout when no rows
return createDefaultGridLayout(panels, preload);
};

export function transformSaveModelToScene(
rsp: DashboardDTO,
options?: LoadDashboardOptions,
sceneOptions?: SceneCreationOptions
): DashboardScene {
// Just to have migrations run
const oldModel = new DashboardModel(rsp.dashboard, rsp.meta);

const scene = createDashboardSceneFromDashboardModel(oldModel, rsp.dashboard, options);
const scene = createDashboardSceneFromDashboardModel(oldModel, rsp.dashboard, options, sceneOptions);
// TODO: refactor createDashboardSceneFromDashboardModel to work on Dashboard schema model

const apiVersion = config.featureToggles.kubernetesDashboards
@@ -92,7 +134,7 @@ export function transformSaveModelToScene(rsp: DashboardDTO, options?: LoadDashb
return scene;
}

export function createRowsFromPanels(oldPanels: PanelModel[]): RowsLayoutManager {
function createRowsFromPanels(oldPanels: PanelModel[]): RowsLayoutManager {
const rowItems: RowItem[] = [];

let currentLegacyRow: PanelModel | null = null;
@@ -143,7 +185,7 @@ export function createRowsFromPanels(oldPanels: PanelModel[]): RowsLayoutManager
});
}

export function createSceneObjectsForPanels(oldPanels: PanelModel[]): SceneGridItemLike[] {
function createSceneObjectsForPanels(oldPanels: PanelModel[]): SceneGridItemLike[] {
// collects all panels and rows
const panels: SceneGridItemLike[] = [];

@@ -259,14 +301,14 @@ function createRowItemFromLegacyRow(row: PanelModel, panels: DashboardGridItem[]
export function createDashboardSceneFromDashboardModel(
oldModel: DashboardModel,
dto: DashboardDataDTO,
options?: LoadDashboardOptions
options?: LoadDashboardOptions,
sceneOptions?: SceneCreationOptions
) {
let variables: SceneVariableSet | undefined;
let annotationLayers: SceneDataLayerProvider[] = [];
let alertStatesLayer: AlertStatesDataLayer | undefined;
const uid = oldModel.uid;
const isReport = options?.route === DashboardRoutes.Report;
const serializerVersion = shouldForceV2API() && !oldModel.meta.isSnapshot && !isReport ? 'v2' : 'v1';
const targetVersion = sceneOptions?.targetVersion ?? 'v1';

if (oldModel.meta.isSnapshot) {
variables = createVariablesForSnapshot(oldModel);
@@ -354,9 +396,11 @@ export function createDashboardSceneFromDashboardModel(

let body: DashboardLayoutManager;

if (serializerVersion === 'v2' && oldModel.panels.some((p) => p.type === 'row')) {
body = createRowsFromPanels(oldModel.panels);
if (sceneOptions?.createLayout) {
// Use injected layout creator (allows callers to specify v2 or custom layout strategy)
body = sceneOptions.createLayout(oldModel.panels, dto.preload);
} else {
// Default v1 layout: DefaultGridLayoutManager
body = new DefaultGridLayoutManager({
grid: new SceneGridLayout({
isLazy: getIsLazy(dto.preload),
@@ -404,7 +448,7 @@ export function createDashboardSceneFromDashboardModel(
hideTimeControls: oldModel.timepicker.hidden,
}),
},
serializerVersion
targetVersion
);

// Enable panel profiling for this dashboard using the composed SceneRenderProfiler

@@ -1,6 +1,8 @@
import { readdirSync, readFileSync } from 'fs';
import path from 'path';

import { getSceneCreationOptions } from '../pages/DashboardScenePageStateManager';

import { normalizeBackendOutputForFrontendComparison } from './serialization-test-utils';
import { transformSaveModelSchemaV2ToScene } from './transformSaveModelSchemaV2ToScene';
import { transformSaveModelToScene } from './transformSaveModelToScene';
@@ -207,22 +209,26 @@ describe('V1 to V2 Dashboard Transformation Comparison', () => {
delete dashboardSpec.snapshot;

// Wrap in DashboardDTO structure that transformSaveModelToScene expects
const scene = transformSaveModelToScene({
dashboard: dashboardSpec,
meta: {
isNew: false,
isFolder: false,
canSave: true,
canEdit: true,
canDelete: false,
canShare: false,
canStar: false,
canAdmin: false,
isSnapshot: false,
provisioned: false,
version: 1,
const scene = transformSaveModelToScene(
{
dashboard: dashboardSpec,
meta: {
isNew: false,
isFolder: false,
canSave: true,
canEdit: true,
canDelete: false,
canShare: false,
canStar: false,
canAdmin: false,
isSnapshot: false,
provisioned: false,
version: 1,
},
},
});
undefined,
getSceneCreationOptions()
);

const frontendOutput = transformSceneToSaveModelSchemaV2(scene, false);

@@ -279,22 +285,26 @@ describe('V1 to V2 Dashboard Transformation Comparison', () => {
delete dashboardSpec.snapshot;

// Wrap in DashboardDTO structure that transformSaveModelToScene expects
const scene = transformSaveModelToScene({
dashboard: dashboardSpec,
meta: {
isNew: false,
isFolder: false,
canSave: true,
canEdit: true,
canDelete: false,
canShare: false,
canStar: false,
canAdmin: false,
isSnapshot: false,
provisioned: false,
version: 1,
const scene = transformSaveModelToScene(
{
dashboard: dashboardSpec,
meta: {
isNew: false,
isFolder: false,
canSave: true,
canEdit: true,
canDelete: false,
canShare: false,
canStar: false,
canAdmin: false,
isSnapshot: false,
provisioned: false,
version: 1,
},
},
});
undefined,
getSceneCreationOptions()
);

const frontendOutput = transformSceneToSaveModelSchemaV2(scene, false);


@@ -6,6 +6,8 @@ import { Spec as DashboardV2Spec } from '@grafana/schema/dist/esm/schema/dashboa
import { DashboardWithAccessInfo } from 'app/features/dashboard/api/types';
import { DashboardDataDTO } from 'app/types/dashboard';

import { getSceneCreationOptions } from '../pages/DashboardScenePageStateManager';

import { transformSaveModelSchemaV2ToScene } from './transformSaveModelSchemaV2ToScene';
import { transformSaveModelToScene } from './transformSaveModelToScene';
import { transformSceneToSaveModel } from './transformSceneToSaveModel';
@@ -228,22 +230,26 @@ function removeMetadata(spec: Dashboard): Partial<Dashboard> {
* identical processing.
*/
function loadAndSerializeV1SaveModel(dashboard: Dashboard): Dashboard {
const scene = transformSaveModelToScene({
dashboard: dashboard as DashboardDataDTO,
meta: {
isNew: false,
isFolder: false,
canSave: true,
canEdit: true,
canDelete: false,
canShare: false,
canStar: false,
canAdmin: false,
isSnapshot: false,
provisioned: false,
version: 1,
const scene = transformSaveModelToScene(
{
dashboard: dashboard as DashboardDataDTO,
meta: {
isNew: false,
isFolder: false,
canSave: true,
canEdit: true,
canDelete: false,
canShare: false,
canStar: false,
canAdmin: false,
isSnapshot: false,
provisioned: false,
version: 1,
},
},
});
undefined,
getSceneCreationOptions()
);

return transformSceneToSaveModel(scene, false);
}

@@ -23,6 +23,7 @@ import { DashboardDataDTO } from 'app/types/dashboard';

import { DashboardScene } from '../scene/DashboardScene';
import { makeExportableV1, makeExportableV2 } from '../scene/export/exporters';
import { createV2RowsLayout, transformSaveModelToScene } from '../serialization/transformSaveModelToScene';
import { transformSceneToSaveModel } from '../serialization/transformSceneToSaveModel';
import { transformSceneToSaveModelSchemaV2 } from '../serialization/transformSceneToSaveModelSchemaV2';
import { getVariablesCompatibility } from '../utils/getVariablesCompatibility';
@@ -216,7 +217,27 @@ export class ShareExportTab extends SceneObjectBase<ShareExportTabState> impleme
}

if (exportMode === ExportMode.V2Resource) {
const spec = transformSceneToSaveModelSchemaV2(scene);
let sceneForV2Export = scene;

// When exporting v1 dashboard as v2, we need to recreate the scene with v2 layout creator
// to ensure rows are properly serialized. The v1 scene uses DefaultGridLayoutManager which
// doesn't know about RowsLayoutManager structure needed for v2 serialization.
if (initialSaveModelVersion === 'v1' && initialSaveModel && isV1ClassicDashboard(initialSaveModel)) {
// Recreate scene with v2 layout creator to properly handle rows
sceneForV2Export = transformSaveModelToScene(
{
dashboard: { ...initialSaveModel, title: initialSaveModel.title ?? '', uid: initialSaveModel.uid ?? '' },
meta: scene.state.meta,
},
undefined,
{
createLayout: createV2RowsLayout,
targetVersion: 'v2',
}
);
}

const spec = transformSceneToSaveModelSchemaV2(sceneForV2Export);
const specCopy = JSON.parse(JSON.stringify(spec));
const statelessSpec = await makeExportableV2(specCopy, isSharingExternally);
const exportableV2 = isSharingExternally ? statelessSpec : spec;

@@ -2,6 +2,7 @@ import { readdirSync, readFileSync } from 'fs';
import path from 'path';

import { Spec as DashboardV2Spec } from '@grafana/schema/dist/esm/schema/dashboard/v2';
import { getSceneCreationOptions } from 'app/features/dashboard-scene/pages/DashboardScenePageStateManager';
import { normalizeBackendOutputForFrontendComparison } from 'app/features/dashboard-scene/serialization/serialization-test-utils';
import { transformSaveModelSchemaV2ToScene } from 'app/features/dashboard-scene/serialization/transformSaveModelSchemaV2ToScene';
import { transformSaveModelToScene } from 'app/features/dashboard-scene/serialization/transformSaveModelToScene';
@@ -193,22 +194,26 @@ describe('V1 to V2 Dashboard Transformation Comparison (ResponseTransformers)',
delete dashboardSpec.snapshot;

// Wrap in DashboardDTO structure that transformSaveModelToScene expects
const scene = transformSaveModelToScene({
dashboard: dashboardSpec,
meta: {
isNew: false,
isFolder: false,
canSave: true,
canEdit: true,
canDelete: false,
canShare: false,
canStar: false,
canAdmin: false,
isSnapshot: false,
provisioned: false,
version: 1,
const scene = transformSaveModelToScene(
{
dashboard: dashboardSpec,
meta: {
isNew: false,
isFolder: false,
canSave: true,
canEdit: true,
canDelete: false,
canShare: false,
canStar: false,
canAdmin: false,
isSnapshot: false,
provisioned: false,
version: 1,
},
},
});
undefined,
getSceneCreationOptions()
);

const frontendOutput = transformSceneToSaveModelSchemaV2(scene, false);


@@ -200,7 +200,7 @@ describe('createSpanLinkFactory', () => {
datasource: 'loki1_uid',
queries: [
{
expr: '{cluster="cluster1", hostname="hostname1", service_namespace="namespace1"} | label_format log_line_contains_trace_id=`{{ contains "7946b05c2e2e4e5a" __line__ }}` | log_line_contains_trace_id="true" or trace_id="7946b05c2e2e4e5a" | label_format log_line_contains_span_id=`{{ contains "6605c7b08e715d6c" __line__ }}` | log_line_contains_span_id="true" or span_id="6605c7b08e715d6c"',
expr: '{cluster="cluster1", hostname="hostname1", service_namespace="namespace1"} | logfmt | json | drop __error__, __error_details__ | trace_id="7946b05c2e2e4e5a" | span_id="6605c7b08e715d6c"',
refId: '',
datasource: { uid: 'loki1_uid' },
},

@@ -446,12 +446,12 @@ function getQueryForLoki(

let expr = '{${__tags}}';
if (filterByTraceID && span.traceID) {
expr +=
' | label_format log_line_contains_trace_id=`{{ contains "${__span.traceId}" __line__ }}` | log_line_contains_trace_id="true" or trace_id="${__span.traceId}"';
}
if (filterBySpanID && span.spanID) {
expr +=
' | label_format log_line_contains_span_id=`{{ contains "${__span.spanId}" __line__ }}` | log_line_contains_span_id="true" or span_id="${__span.spanId}"';
expr += ' | logfmt | json | drop __error__, __error_details__ | trace_id="${__span.traceId}"';
if (filterBySpanID && span.spanID) {
expr += ' | span_id="${__span.spanId}"';
}
} else if (filterBySpanID && span.spanID) {
expr += ' | logfmt | json | drop __error__, __error_details__ | span_id="${__span.spanId}"';
}

return {

@@ -1,6 +1,5 @@
import { AnnotationQuery, BusEventBase, BusEventWithPayload, eventFactory } from '@grafana/data';
import { IconName, ButtonVariant } from '@grafana/ui';
import { HistoryEntryView } from 'app/core/components/AppChrome/types';

/**
* Event Payloads
@@ -217,7 +216,3 @@ export class PanelEditEnteredEvent extends BusEventWithPayload<number> {
export class PanelEditExitedEvent extends BusEventWithPayload<number> {
static type = 'panel-edit-finished';
}

export class RecordHistoryEntryEvent extends BusEventWithPayload<HistoryEntryView> {
static type = 'record-history-entry';
}

@@ -10720,18 +10720,6 @@
"help/documentation": "Documentation",
"help/keyboard-shortcuts": "Keyboard shortcuts",
"help/support": "Support",
"history-container": {
"drawer-tittle": "History"
},
"history-wrapper": {
"collapse": "Collapse",
"expand": "Expand",
"icon-selected": "Selected Entry",
"icon-unselected": "Normal Entry",
"show-more": "Show more",
"today": "Today",
"yesterday": "Yesterday"
},
"home": {
"title": "Home"
},

3
public/openapi3.json
generated
@@ -10039,6 +10039,9 @@
},
"ReportOptions": {
"properties": {
"csvEncoding": {
"type": "string"
},
"layout": {
"type": "string"
},

@@ -1161,56 +1161,6 @@ div.editor-option label {
content: '\e902';
}

.bootstrap-tagsinput {
display: inline-block;
padding: 0 0 0 6px;
vertical-align: middle;
max-width: 100%;
line-height: 22px;
background-color: $input-bg;
border: 1px solid $input-border-color;

input {
display: inline-block;
border: none;
margin: 0px;
border-radius: 0;
padding: 8px 6px;
height: 100%;
width: 70px;
box-sizing: border-box;

&.gf-form-input--has-help-icon {
padding-right: $space-xl;
}
}

.tag {
margin-right: 2px;
color: $white;

[data-role='remove'] {
margin-left: 8px;
cursor: pointer;

&::after {
content: 'x';
padding: 0px 2px;
}

&:hover {
box-shadow:
inset 0 1px 0 rgba(255, 255, 255, 0.2),
0 1px 2px rgba(0, 0, 0, 0.05);

&:active {
box-shadow: inset 0 3px 5px rgba(0, 0, 0, 0.125);
}
}
}
}
}

.page-header {
margin-top: $space-md;


512
public/vendor/tagsinput/bootstrap-tagsinput.js
vendored
@@ -1,512 +0,0 @@
|
||||
(function ($) {
|
||||
"use strict";
|
||||
|
||||
var defaultOptions = {
|
||||
tagClass: function(item) {
|
||||
return 'label label-info';
|
||||
},
|
||||
itemValue: function(item) {
|
||||
return item ? item.toString() : item;
|
||||
},
|
||||
itemText: function(item) {
|
||||
return this.itemValue(item);
|
||||
},
|
||||
freeInput: true,
|
||||
maxTags: undefined,
|
||||
confirmKeys: [13],
|
||||
onTagExists: function(item, $tag) {
|
||||
$tag.hide().fadeIn();
|
||||
}
|
||||
};
|
||||
|
||||
/**
|
||||
* Constructor function
|
||||
*/
|
||||
function TagsInput(element, options) {
|
||||
this.itemsArray = [];
|
||||
|
||||
this.$element = $(element);
|
||||
this.$element.hide();
|
||||
|
||||
this.widthClass = options.widthClass || 'width-9';
|
||||
this.isSelect = (element.tagName === 'SELECT');
|
||||
this.multiple = (this.isSelect && element.hasAttribute('multiple'));
|
||||
this.objectItems = options && options.itemValue;
|
||||
this.placeholderText = element.hasAttribute('placeholder') ? this.$element.attr('placeholder') : '';
|
||||
|
||||
this.$container = $('<div class="bootstrap-tagsinput"></div>');
|
||||
this.$input = $('<input class="gf-form-input ' + this.widthClass + '" type="text" placeholder="' + this.placeholderText + '"/>').appendTo(this.$container);
|
||||
|
||||
this.$element.after(this.$container);
|
||||
|
||||
this.build(options);
|
||||
}
|
||||
|
||||
TagsInput.prototype = {
|
||||
constructor: TagsInput,
|
||||
|
||||
/**
|
||||
* Adds the given item as a new tag. Pass true to dontPushVal to prevent
|
||||
* updating the elements val()
|
||||
*/
|
||||
add: function(item, dontPushVal) {
|
||||
var self = this;
|
||||
|
||||
if (self.options.maxTags && self.itemsArray.length >= self.options.maxTags)
|
||||
return;
|
||||
|
||||
// Ignore falsey values, except false
|
||||
if (item !== false && !item)
|
||||
return;
|
||||
|
||||
// Throw an error when trying to add an object while the itemValue option was not set
|
||||
if (typeof item === "object" && !self.objectItems)
|
||||
throw("Can't add objects when itemValue option is not set");
|
||||
|
||||
// Ignore strings only containg whitespace
|
||||
if (item.toString().match(/^\s*$/))
|
||||
return;
|
||||
|
||||
      // If SELECT but not multiple, remove current tag
      if (self.isSelect && !self.multiple && self.itemsArray.length > 0)
        self.remove(self.itemsArray[0]);

      if (typeof item === "string" && this.$element[0].tagName === 'INPUT') {
        var items = item.split(',');
        if (items.length > 1) {
          for (var i = 0; i < items.length; i++) {
            this.add(items[i], true);
          }

          if (!dontPushVal)
            self.pushVal();
          return;
        }
      }

      var itemValue = self.options.itemValue(item),
          itemText = self.options.itemText(item),
          tagClass = self.options.tagClass(item);

      // Ignore items already added
      var existing = $.grep(self.itemsArray, function(item) { return self.options.itemValue(item) === itemValue; } )[0];
      if (existing) {
        // Invoke onTagExists
        if (self.options.onTagExists) {
          var $existingTag = $(".tag", self.$container).filter(function() { return $(this).data("item") === existing; });
          self.options.onTagExists(item, $existingTag);
        }
        return;
      }

      // register item in internal array and map
      self.itemsArray.push(item);

      // add a tag element
      var $tag = $('<span class="tag ' + htmlEncode(tagClass) + '">' + htmlEncode(itemText) + '<span data-role="remove"></span></span>');
      $tag.data('item', item);
      self.findInputWrapper().before($tag);
      $tag.after(' ');

      // add <option /> if item represents a value not present in one of the <select />'s options
      if (self.isSelect && !$('option[value="' + escape(itemValue) + '"]', self.$element)[0]) {
        var $option = $('<option selected>' + htmlEncode(itemText) + '</option>');
        $option.data('item', item);
        $option.attr('value', itemValue);
        self.$element.append($option);
      }

      if (!dontPushVal)
        self.pushVal();

      // Add class when reached maxTags
      if (self.options.maxTags === self.itemsArray.length)
        self.$container.addClass('bootstrap-tagsinput-max');

      self.$element.trigger($.Event('itemAdded', { item: item }));
    },
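When `add` receives a comma-separated string on an INPUT element, it splits the string and recurses with `dontPushVal` set, pushing the value only once at the end; duplicates are then dropped by the `existing` check. The split-and-dedupe path can be exercised in isolation (a plain-JS sketch; `tags` and the local `add` are stand-ins for the instance state and `self.add`, not the library's code):

```javascript
// Stand-in for the instance's itemsArray.
var tags = [];

// Sketch of add()'s comma-splitting path: a string containing commas is
// split and each piece is added individually; items already present are
// ignored, mirroring the `existing` check in the real method.
function add(item) {
  if (typeof item === 'string' && item.indexOf(',') !== -1) {
    var items = item.split(',');
    for (var i = 0; i < items.length; i++) {
      add(items[i]);
    }
    return;
  }
  if (tags.indexOf(item) === -1) {
    tags.push(item);
  }
}

add('red,green,blue');
add('green'); // duplicate, ignored
console.log(tags); // ['red', 'green', 'blue']
```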
    /**
     * Removes the given item. Pass true to dontPushVal to prevent updating the
     * elements val()
     */
    remove: function(item, dontPushVal) {
      var self = this;

      if (self.objectItems) {
        if (typeof item === "object")
          item = $.grep(self.itemsArray, function(other) { return self.options.itemValue(other) == self.options.itemValue(item); } )[0];
        else
          item = $.grep(self.itemsArray, function(other) { return self.options.itemValue(other) == item; } )[0];
      }

      if (item) {
        $('.tag', self.$container).filter(function() { return $(this).data('item') === item; }).remove();
        $('option', self.$element).filter(function() { return $(this).data('item') === item; }).remove();
        self.itemsArray.splice($.inArray(item, self.itemsArray), 1);
      }

      if (!dontPushVal)
        self.pushVal();

      // Remove class when reached maxTags
      if (self.options.maxTags > self.itemsArray.length)
        self.$container.removeClass('bootstrap-tagsinput-max');

      self.$element.trigger($.Event('itemRemoved', { item: item }));
    },
    /**
     * Removes all items
     */
    removeAll: function() {
      var self = this;

      $('.tag', self.$container).remove();
      $('option', self.$element).remove();

      while(self.itemsArray.length > 0)
        self.itemsArray.pop();

      self.pushVal();

      if (self.options.maxTags && !this.isEnabled())
        this.enable();
    },
    /**
     * Refreshes the tags so they match the text/value of their corresponding
     * item.
     */
    refresh: function() {
      var self = this;
      $('.tag', self.$container).each(function() {
        var $tag = $(this),
            item = $tag.data('item'),
            itemValue = self.options.itemValue(item),
            itemText = self.options.itemText(item),
            tagClass = self.options.tagClass(item);

        // Update tag's class and inner text
        $tag.attr('class', null);
        $tag.addClass('tag ' + htmlEncode(tagClass));
        $tag.contents().filter(function() {
          return this.nodeType == 3;
        })[0].nodeValue = htmlEncode(itemText);

        if (self.isSelect) {
          var option = $('option', self.$element).filter(function() { return $(this).data('item') === item; });
          option.attr('value', itemValue);
        }
      });
    },
    /**
     * Returns the items added as tags
     */
    items: function() {
      return this.itemsArray;
    },

    /**
     * Assembles the value by retrieving the value of each item, and sets it
     * on the element.
     */
    pushVal: function() {
      var self = this,
          val = $.map(self.items(), function(item) {
            return self.options.itemValue(item).toString();
          });

      self.$element.val(val, true).trigger('change');
    },
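`pushVal` maps every item through the `itemValue` accessor and writes the joined result to the element. The same computation in plain JS (the `options` and `itemsArray` objects here are hypothetical stand-ins for the instance's state, and `Array.prototype.map` replaces `$.map`):

```javascript
// Stand-ins for the instance's options and itemsArray (illustrative only).
var options = { itemValue: function (item) { return item.id; } };
var itemsArray = [{ id: 1 }, { id: 2 }, { id: 3 }];

// What pushVal computes: each item's value as a string, which jQuery's
// val() would then join into a comma-separated value for an INPUT.
var val = itemsArray.map(function (item) {
  return options.itemValue(item).toString();
});

console.log(val.join(',')); // "1,2,3"
```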
    /**
     * Initializes the tags input behaviour on the element
     */
    build: function(options) {
      var self = this;

      self.options = $.extend({}, defaultOptions, options);
      var typeahead = self.options.typeahead || {};

      // When itemValue is set, freeInput should always be false
      if (self.objectItems)
        self.options.freeInput = false;

      makeOptionItemFunction(self.options, 'itemValue');
      makeOptionItemFunction(self.options, 'itemText');
      makeOptionItemFunction(self.options, 'tagClass');

      // for backwards compatibility, self.options.source is deprecated
      if (self.options.source)
        typeahead.source = self.options.source;

      if (typeahead.source && $.fn.typeahead) {
        makeOptionFunction(typeahead, 'source');

        self.$input.typeahead({
          source: function (query, process) {
            function processItems(items) {
              var texts = [];

              for (var i = 0; i < items.length; i++) {
                var text = self.options.itemText(items[i]);
                map[text] = items[i];
                texts.push(text);
              }
              process(texts);
            }

            this.map = {};
            var map = this.map,
                data = typeahead.source(query);

            if ($.isFunction(data.success)) {
              // support for Angular promises
              data.success(processItems);
            } else {
              // support for functions and jquery promises
              $.when(data)
                .then(processItems);
            }
          },
          updater: function (text) {
            self.add(this.map[text]);
          },
          matcher: function (text) {
            return (text.toLowerCase().indexOf(this.query.trim().toLowerCase()) !== -1);
          },
          sorter: function (texts) {
            return texts.sort();
          },
          highlighter: function (text) {
            var regex = new RegExp( '(' + this.query + ')', 'gi' );
            return text.replace( regex, "<strong>$1</strong>" );
          }
        });
      }

      self.$container.on('click', $.proxy(function(event) {
        self.$input.focus();
      }, self));

      self.$container.on('blur', 'input', $.proxy(function(event) {
        var $input = $(event.target);
        self.add($input.val());
        $input.val('');
        event.preventDefault();
      }, self));

      self.$container.on('keydown', 'input', $.proxy(function(event) {
        var $input = $(event.target),
            $inputWrapper = self.findInputWrapper();

        switch (event.which) {
          // BACKSPACE
          case 8:
            if (doGetCaretPosition($input[0]) === 0) {
              var prev = $inputWrapper.prev();
              if (prev) {
                self.remove(prev.data('item'));
              }
            }
            break;

          // DELETE
          case 46:
            if (doGetCaretPosition($input[0]) === 0) {
              var next = $inputWrapper.next();
              if (next) {
                self.remove(next.data('item'));
              }
            }
            break;

          // LEFT ARROW
          case 37:
            // Try to move the input before the previous tag
            var $prevTag = $inputWrapper.prev();
            if ($input.val().length === 0 && $prevTag[0]) {
              $prevTag.before($inputWrapper);
              $input.focus();
            }
            break;
          // RIGHT ARROW
          case 39:
            // Try to move the input after the next tag
            var $nextTag = $inputWrapper.next();
            if ($input.val().length === 0 && $nextTag[0]) {
              $nextTag.after($inputWrapper);
              $input.focus();
            }
            break;
          default:
            // When the key corresponds to one of the confirmKeys, add the
            // current input as a new tag
            if (self.options.freeInput && $.inArray(event.which, self.options.confirmKeys) >= 0) {
              self.add($input.val());
              $input.val('');
              event.preventDefault();
            }
        }

        // Reset internal input's size
        $input.attr('size', Math.max(this.inputSize, $input.val().length));
      }, self));

      // Remove icon clicked
      self.$container.on('click', '[data-role=remove]', $.proxy(function(event) {
        self.remove($(event.target).closest('.tag').data('item'));
        // Grafana mod: if the tags input is used in a popover, the click event
        // would bubble up and hide the popover
        event.stopPropagation();
      }, self));

      // Only add existing value as tags when using strings as tags
      if (self.options.itemValue === defaultOptions.itemValue) {
        if (self.$element[0].tagName === 'INPUT') {
          self.add(self.$element.val());
        } else {
          $('option', self.$element).each(function() {
            self.add($(this).attr('value'), true);
          });
        }
      }
    },
    /**
     * Removes all tagsinput behaviour and unregisters all event handlers
     */
    destroy: function() {
      var self = this;

      // Unbind events
      self.$container.off('keypress', 'input');
      self.$container.off('click', '[role=remove]');

      self.$container.remove();
      self.$element.removeData('tagsinput');
      self.$element.show();
    },

    /**
     * Sets focus on the tagsinput
     */
    focus: function() {
      this.$input.focus();
    },

    /**
     * Returns the internal input element
     */
    input: function() {
      return this.$input;
    },

    /**
     * Returns the element which is wrapped around the internal input. This
     * is normally the $container, but typeahead.js moves the $input element.
     */
    findInputWrapper: function() {
      var elt = this.$input[0],
          container = this.$container[0];
      while(elt && elt.parentNode !== container)
        elt = elt.parentNode;

      return $(elt);
    }
  };
  /**
   * Register jQuery plugin
   */
  $.fn.tagsinput = function(arg1, arg2) {
    var results = [];

    this.each(function() {
      var tagsinput = $(this).data('tagsinput');

      // Initialize a new tags input
      if (!tagsinput) {
        tagsinput = new TagsInput(this, arg1);
        $(this).data('tagsinput', tagsinput);
        results.push(tagsinput);

        if (this.tagName === 'SELECT') {
          $('option', $(this)).attr('selected', 'selected');
        }

        // Init tags from $(this).val()
        $(this).val($(this).val());
      } else {
        // Invoke function on existing tags input
        var retVal = tagsinput[arg1](arg2);
        if (retVal !== undefined)
          results.push(retVal);
      }
    });

    if (typeof arg1 == 'string') {
      // Return the results from the invoked function calls
      return results.length > 1 ? results : results[0];
    } else {
      return results;
    }
  };

  $.fn.tagsinput.Constructor = TagsInput;
  /**
   * Most options support both a string or number as well as a function as
   * option value. This function makes sure that the option with the given
   * key in the given options is wrapped in a function.
   */
  function makeOptionItemFunction(options, key) {
    if (typeof options[key] !== 'function') {
      var propertyName = options[key];
      options[key] = function(item) { return item[propertyName]; };
    }
  }
  function makeOptionFunction(options, key) {
    if (typeof options[key] !== 'function') {
      var value = options[key];
      options[key] = function() { return value; };
    }
  }
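These two helpers can be exercised standalone: `makeOptionItemFunction` turns a string option into a property accessor, and `makeOptionFunction` turns a plain value into a constant-returning function. The `opts` object below is a made-up example, but the helper bodies match the source above:

```javascript
// Same normalization helpers as in the library source above.
function makeOptionItemFunction(options, key) {
  if (typeof options[key] !== 'function') {
    var propertyName = options[key];
    options[key] = function (item) { return item[propertyName]; };
  }
}
function makeOptionFunction(options, key) {
  if (typeof options[key] !== 'function') {
    var value = options[key];
    options[key] = function () { return value; };
  }
}

// A string option becomes a property accessor...
var opts = { itemValue: 'id', source: ['a', 'b'] };
makeOptionItemFunction(opts, 'itemValue');
console.log(opts.itemValue({ id: 42 })); // 42

// ...and a non-function value becomes a constant-returning function.
makeOptionFunction(opts, 'source');
console.log(opts.source()); // ['a', 'b']
```

This is why the rest of the code can always call `self.options.itemValue(item)` without checking whether the user passed a string or a function.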
  /**
   * HtmlEncodes the given value
   */
  var htmlEncodeContainer = $('<div />');
  function htmlEncode(value) {
    if (value) {
      return htmlEncodeContainer.text(value).html();
    } else {
      return '';
    }
  }
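`htmlEncode` leans on a jQuery `text()`/`html()` round-trip, which escapes `&`, `<`, and `>` by serializing a text node. The same escaping can be sketched without a DOM (an equivalent for illustration, not the library's implementation):

```javascript
// DOM-free sketch of htmlEncode: escape the characters that matter when a
// value is interpolated into HTML markup. The & replacement must run first
// so already-produced entities are not double-escaped.
function htmlEncode(value) {
  if (!value) {
    return '';
  }
  return String(value)
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;');
}

console.log(htmlEncode('<b>tag & text</b>')); // &lt;b&gt;tag &amp; text&lt;/b&gt;
```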
  /**
   * Returns the position of the caret in the given input field
   * http://flightschool.acylt.com/devnotes/caret-position-woes/
   */
  function doGetCaretPosition(oField) {
    var iCaretPos = 0;
    if (document.selection) {
      oField.focus();
      var oSel = document.selection.createRange();
      oSel.moveStart('character', -oField.value.length);
      iCaretPos = oSel.text.length;
    } else if (oField.selectionStart || oField.selectionStart == '0') {
      iCaretPos = oField.selectionStart;
    }
    return (iCaretPos);
  }
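The `oField.selectionStart || oField.selectionStart == '0'` guard exists because a caret at position 0 is falsy, so a bare truthiness check would miss it. A minimal illustration of that branch, with a plain object standing in for the input element (no DOM required):

```javascript
// Sketch of the modern branch of doGetCaretPosition. The loose equality
// against '0' deliberately catches selectionStart === 0, which the
// truthiness test alone would skip.
function getCaretPosition(field) {
  if (field.selectionStart || field.selectionStart == '0') {
    return field.selectionStart;
  }
  return 0;
}

console.log(getCaretPosition({ selectionStart: 0 })); // 0
console.log(getCaretPosition({ selectionStart: 5 })); // 5
```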
  /**
   * Initialize tagsinput behaviour on inputs and selects which have
   * data-role=tagsinput
   */
  $(function() {
    $("input[data-role=tagsinput], select[multiple][data-role=tagsinput]").tagsinput();
  });
})(window.jQuery);
@@ -16,6 +16,7 @@ queryLibrary=true
queryService=true
secretsManagementAppPlatform = true
secretsManagementAppPlatformUI = true
reportingCsvEncodingOptions = true

[environment]
stack_id = 12345
22
yarn.lock
@@ -6121,32 +6121,32 @@ __metadata:
   languageName: node
   linkType: hard

-"@openfeature/ofrep-core@npm:^1.0.0":
-  version: 1.1.0
-  resolution: "@openfeature/ofrep-core@npm:1.1.0"
+"@openfeature/ofrep-core@npm:^2.0.0":
+  version: 2.0.0
+  resolution: "@openfeature/ofrep-core@npm:2.0.0"
   peerDependencies:
     "@openfeature/core": ^1.6.0
-  checksum: 10/4198f2f1abf974822bf14530a7f514292d8235552d6e61465d29ffe42d092f675e7f56a9e9da5aa7d45dfbd98cc36316efbdaadaf83e8f510f35865612f5f24f
+  checksum: 10/598656fb35c517fec8abfc4cd8c5a9cb11d4c0f1698e6aa1dd1cf7c42ebb244a9e5be7c8f0fecdc0d2246747b1a51f7e2fe5b84e2f648b23cf40435d2e3dc176
   languageName: node
   linkType: hard

 "@openfeature/ofrep-web-provider@npm:^0.3.3":
-  version: 0.3.3
-  resolution: "@openfeature/ofrep-web-provider@npm:0.3.3"
+  version: 0.3.5
+  resolution: "@openfeature/ofrep-web-provider@npm:0.3.5"
   dependencies:
-    "@openfeature/ofrep-core": "npm:^1.0.0"
+    "@openfeature/ofrep-core": "npm:^2.0.0"
   peerDependencies:
     "@openfeature/web-sdk": ^1.4.0
-  checksum: 10/85f362e3ebaa9d421be91e4d966284e28850649e417d5db81181960b95ac693c9faa4a0bdc84eeb03fa694a8cd1e1f5d9de9bf0fba62f8e2669f9637836f0884
+  checksum: 10/699a9d2591e9d5834f02baa3d68972e52d37efed3866e2c06ad5deb071447c68fdd8afdc2e66e9808f7be162358a338f69cb7912fcb70320f7a7bc6abd00d3d6
   languageName: node
   linkType: hard

 "@openfeature/web-sdk@npm:^1.6.1":
-  version: 1.7.1
-  resolution: "@openfeature/web-sdk@npm:1.7.1"
+  version: 1.7.2
+  resolution: "@openfeature/web-sdk@npm:1.7.2"
   peerDependencies:
     "@openfeature/core": ^1.9.0
-  checksum: 10/358cabfddda8bf67a9bd96ce28f0f84a3218416907f8df4f2a58750ee840d164d86a1c859e8c9aa92d132fa28fcf72a0f17ceece798ff8af297d3e1c1e30cdab
+  checksum: 10/3fc966d0523ca1941b44350d4c8a42c1b74cb035265d3574a2a6b16e87fa9edfb7211ec945b937399badad5de1e771c55ba7ce04d6eb02f265fc5a37e3af9395
   languageName: node
   linkType: hard