Compare commits

..

15 Commits

Author SHA1 Message Date
tdbishop b7010164da Add awareness to Toggletip when inside Drawer given a data attribute 2025-12-29 11:02:13 -06:00
tdbishop 5b3e8690fa Add awareness of a parent when toggletip is rendered to work inside other modals 2025-12-29 10:52:37 -06:00
Ivan Ortega Alba 30ad61e0e9 Dashboards: Fix adhoc filter click when panel has no panel-level datasource (#115576)
* V2: Panel datasource is defined only for mixed ds

* if getDatasourceFromQueryRunner only returns ds.type, resolve to full ds ref through ds service

---------

Co-authored-by: Haris Rozajac <haris.rozajac12@gmail.com>
2025-12-29 10:29:50 +01:00
Oscar Kilhed 0b58cd3900 Dashboard: Remove BOMs from links during conversion (#115689)
* Dashboard: Add test case for BOM characters in link URLs

This test demonstrates the issue where BOM (Byte Order Mark) characters
in dashboard link URLs cause CUE validation errors during v1 to v2
conversion ('illegal byte order mark').

The test input contains BOMs in various URL locations:
- Dashboard links
- Panel data links
- Field config override links
- Options dataLinks
- Field config default links

* Dashboard: Strip BOM characters from URLs during v1 to v2 conversion

BOM (Byte Order Mark) characters in dashboard link URLs cause CUE
validation errors ('illegal byte order mark') when opening v2 dashboards.

This fix strips BOMs from all URL fields during conversion:
- Dashboard links
- Panel data links
- Field config override links
- Options dataLinks
- Field config default links

The stripBOM helper recursively processes nested structures to ensure
all string values have BOMs removed.

* Dashboard: Strip BOM characters in frontend v2 conversion

Add stripBOMs parameter to sortedDeepCloneWithoutNulls utility to remove
Byte Order Mark (U+FEFF) characters from all strings when serializing
dashboards to v2 format.

This prevents CUE validation errors ('illegal byte order mark') that occur
when BOMs are present in any string field. BOMs can be introduced through
copy/paste from certain editors or text sources.

Applied at the final serialization step so it catches BOMs from:
- Existing v1 dashboards being converted
- New data entered during dashboard editing
2025-12-29 09:53:45 +01:00
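The recursive BOM-stripping approach described in the commit above can be sketched as follows. This is a minimal illustration, not the actual Grafana `stripBOM` helper or `sortedDeepCloneWithoutNulls` utility; the function name and the exact recursion over decoded JSON values are assumptions.

```go
package main

import (
	"fmt"
	"strings"
)

// stripBOM recursively removes U+FEFF (Byte Order Mark) characters from every
// string value in a decoded JSON-like structure (maps, slices, strings).
// Illustrative sketch only; the real helper lives in the Grafana codebase.
func stripBOM(v any) any {
	switch t := v.(type) {
	case string:
		return strings.ReplaceAll(t, "\uFEFF", "")
	case map[string]any:
		for k, val := range t {
			t[k] = stripBOM(val)
		}
		return t
	case []any:
		for i, val := range t {
			t[i] = stripBOM(val)
		}
		return t
	default:
		return v
	}
}

func main() {
	link := map[string]any{"url": "\uFEFFhttp://example.com"}
	fmt.Println(stripBOM(link).(map[string]any)["url"]) // prints http://example.com
}
```

Applying this at the final serialization step, as the commit describes, catches BOMs regardless of whether they came from a converted v1 dashboard or from freshly edited fields.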
Matheus Macabu 4ba2fe6cce Auditing: Add Event struct to map audit logs into (#115509) 2025-12-29 09:31:58 +01:00
grafana-pr-automation[bot] a345f78ae0 I18n: Download translations from Crowdin (#115717)
New Crowdin translations by GitHub Action

Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
2025-12-28 00:34:24 +00:00
Yuri Tseretyan fa1e6cce5e Alerting: Rule backtesting with experimental UI (#115525)
* add function to convert StateTransition to LokiEntry
* add QueryResultBuilder
* update backtesting to produce result similar to historian
* make shouldRecord public
* filter out noop transitions
* add experimental front-end
* add new fields
* move conversion of api model to AlertRule to validation
* add extra labels
* calculate tick timestamp using the same logic as in scheduler
* implement correct logic of calculating first evaluation timestamp
* add uid, group and folder uid; they are needed for jitter strategy

* add JitterOffsetInDuration and JitterStrategy.String()

* add config `backtesting_max_evaluations` to [unified_alerting] (not documented for now)

* remove obsolete tests

* elevate permissions for backtesting endpoint
* move backtesting to separate dir
2025-12-26 16:55:57 -05:00
Alexander Akhmetov e38f007d30 Alerting: Fetch alert rule provenances for a page of rules only (#115643)
* Alerting: Fetch alert rule provenances for a page of rules only

* error when failed to fetch provenance
2025-12-24 13:41:46 +01:00
Alexander Akhmetov c38e515dec Alerting: Fix export of imported Prometheus-style recording rules to terraform (#115661)
Alerting: Fix export of imported Prometheus-style recording rules to terraform
2025-12-24 09:49:38 +01:00
Mustafa Sencer Özcan 4f57ebe4ad fix: bump default facet search limit for unified search (#115690)
* fix: bump limit

* feat: add facetLimit query parameter to search API

* fix: set to 500

* fix: update snapshot

* fix: yarn generate-apis
2025-12-24 09:33:24 +01:00
alerting-team[bot] 3f5f0f783b Alerting: Update alerting module to 926c7491019668286c423cad9d2a65f419b14944 (#115704)
[create-pull-request] automated change

Co-authored-by: alexander-akhmetov <1875873+alexander-akhmetov@users.noreply.github.com>
2025-12-24 08:48:06 +01:00
grafana-pr-automation[bot] 5e4e6c1172 I18n: Download translations from Crowdin (#115705)
New Crowdin translations by GitHub Action

Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
2025-12-24 00:42:01 +00:00
Paul Marbach f5218b5eb8 Sparkline: Add point annotations for some common calcs (#115595) 2025-12-23 16:39:30 -05:00
alerting-team[bot] a1389bc173 Alerting: Update alerting module to 77a1e2f35be87bebc41a0bf634f336282f0b9b53 (#115498)
* [create-pull-request] automated change

* Remove IsProtectedField and temp structure

* Fix alerting historian

* make update-workspace

---------

Co-authored-by: yuri-tceretian <25988953+yuri-tceretian@users.noreply.github.com>
Co-authored-by: Yuri Tseretyan <yuriy.tseretyan@grafana.com>
Co-authored-by: Alexander Akhmetov <me@alx.cx>
2025-12-23 14:46:44 -05:00
Sergej-Vlasov 0a0f92e85e InspectJsonTab: Force render the layout after change to reflect new gridPos (#115688)
force render the layout after inspect panel change to account for gridPos change
2025-12-23 10:52:50 -05:00
121 changed files with 3491 additions and 1102 deletions
+1 -1
@@ -157,7 +157,7 @@ require (
github.com/google/go-querystring v1.1.0 // indirect
github.com/google/uuid v1.6.0 // indirect
github.com/google/wire v0.7.0 // indirect
github.com/grafana/alerting v0.0.0-20251212143239-491433b332b7 // indirect
github.com/grafana/alerting v0.0.0-20251223160021-926c74910196 // indirect
github.com/grafana/authlib v0.0.0-20250930082137-a40e2c2b094f // indirect
github.com/grafana/dataplane/sdata v0.0.9 // indirect
github.com/grafana/dskit v0.0.0-20250908063411-6b6da59b5cc4 // indirect
+2 -2
@@ -619,8 +619,8 @@ github.com/gorilla/mux v1.8.1 h1:TuBL49tXwgrFYWhqrNgrUNEY92u81SPhu7sTdzQEiWY=
github.com/gorilla/mux v1.8.1/go.mod h1:AKf9I4AEqPTmMytcMc0KkNouC66V3BtZ4qD5fmWSiMQ=
github.com/gorilla/websocket v1.5.4-0.20250319132907-e064f32e3674 h1:JeSE6pjso5THxAzdVpqr6/geYxZytqFMBCOtn/ujyeo=
github.com/gorilla/websocket v1.5.4-0.20250319132907-e064f32e3674/go.mod h1:r4w70xmWCQKmi1ONH4KIaBptdivuRPyosB9RmPlGEwA=
github.com/grafana/alerting v0.0.0-20251212143239-491433b332b7 h1:ZzG/gCclEit9w0QUfQt9GURcOycAIGcsQAhY1u0AEX0=
github.com/grafana/alerting v0.0.0-20251212143239-491433b332b7/go.mod h1:l7v67cgP7x72ajB9UPZlumdrHqNztpKoqQ52cU8T3LU=
github.com/grafana/alerting v0.0.0-20251223160021-926c74910196 h1:A9UJtyBBUE7PkRsAITKU05iz+HpHO9SaVjfdo2Df3UQ=
github.com/grafana/alerting v0.0.0-20251223160021-926c74910196/go.mod h1:l7v67cgP7x72ajB9UPZlumdrHqNztpKoqQ52cU8T3LU=
github.com/grafana/authlib v0.0.0-20250930082137-a40e2c2b094f h1:Cbm6OKkOcJ+7CSZsGsEJzktC/SIa5bxVeYKQLuYK86o=
github.com/grafana/authlib v0.0.0-20250930082137-a40e2c2b094f/go.mod h1:axY0cdOg3q0TZHwpHnIz5x16xZ8ZBxJHShsSHHXcHQg=
github.com/grafana/authlib/types v0.0.0-20251119142549-be091cf2f4d4 h1:Muoy+FMGrHj3GdFbvsMzUT7eusgii9PKf9L1ZaXDDbY=
+1 -1
@@ -4,7 +4,7 @@ go 1.25.5
require (
github.com/go-kit/log v0.2.1
github.com/grafana/alerting v0.0.0-20251212143239-491433b332b7
github.com/grafana/alerting v0.0.0-20251223160021-926c74910196
github.com/grafana/dskit v0.0.0-20250908063411-6b6da59b5cc4
github.com/grafana/grafana-app-sdk v0.48.7
github.com/grafana/grafana-app-sdk/logging v0.48.7
+2 -2
@@ -243,8 +243,8 @@ github.com/google/uuid v1.6.0 h1:NIvaJDMOsjHA8n1jAhLSgzrAzy1Hgr+hNrb57e+94F0=
github.com/google/uuid v1.6.0/go.mod h1:TIyPZe4MgqvfeYDBFedMoGGpEw/LqOeaOT+nhxU+yHo=
github.com/googleapis/gax-go/v2 v2.0.4/go.mod h1:0Wqv26UfaUD9n4G6kQubkQ+KchISgw+vpHVxEJEs9eg=
github.com/googleapis/gax-go/v2 v2.0.5/go.mod h1:DWXyrwAJ9X0FpwwEdw+IPEYBICEFu5mhpdKc/us6bOk=
github.com/grafana/alerting v0.0.0-20251212143239-491433b332b7 h1:ZzG/gCclEit9w0QUfQt9GURcOycAIGcsQAhY1u0AEX0=
github.com/grafana/alerting v0.0.0-20251212143239-491433b332b7/go.mod h1:l7v67cgP7x72ajB9UPZlumdrHqNztpKoqQ52cU8T3LU=
github.com/grafana/alerting v0.0.0-20251223160021-926c74910196 h1:A9UJtyBBUE7PkRsAITKU05iz+HpHO9SaVjfdo2Df3UQ=
github.com/grafana/alerting v0.0.0-20251223160021-926c74910196/go.mod h1:l7v67cgP7x72ajB9UPZlumdrHqNztpKoqQ52cU8T3LU=
github.com/grafana/dskit v0.0.0-20250908063411-6b6da59b5cc4 h1:jSojuc7njleS3UOz223WDlXOinmuLAIPI0z2vtq8EgI=
github.com/grafana/dskit v0.0.0-20250908063411-6b6da59b5cc4/go.mod h1:VahT+GtfQIM+o8ht2StR6J9g+Ef+C2Vokh5uuSmOD/4=
github.com/grafana/grafana-app-sdk v0.48.7 h1:9mF7nqkqP0QUYYDlznoOt+GIyjzj45wGfUHB32u2ZMo=
@@ -31,6 +31,10 @@ const (
maxLimit = 1000
Namespace = "grafana"
Subsystem = "alerting"
// LogQL field path for alert rule UID after JSON parsing.
// Loki flattens nested JSON fields with underscores: alert.labels.__alert_rule_uid__ -> alert_labels___alert_rule_uid__
lokiAlertRuleUIDField = "alert_labels___alert_rule_uid__"
)
var (
@@ -111,13 +115,13 @@ func buildQuery(query Query) (string, error) {
fmt.Sprintf(`%s=%q`, historian.LabelFrom, historian.LabelFromValue),
}
if query.RuleUID != nil {
selectors = append(selectors,
fmt.Sprintf(`%s=%q`, historian.LabelRuleUID, *query.RuleUID))
}
logql := fmt.Sprintf(`{%s} | json`, strings.Join(selectors, `,`))
// Add ruleUID filter as JSON line filter if specified.
if query.RuleUID != nil && *query.RuleUID != "" {
logql += fmt.Sprintf(` | %s = %q`, lokiAlertRuleUIDField, *query.RuleUID)
}
// Add receiver filter if specified.
if query.Receiver != nil && *query.Receiver != "" {
logql += fmt.Sprintf(` | receiver = %q`, *query.Receiver)
@@ -211,16 +215,13 @@ func parseLokiEntry(s lokiclient.Sample) (Entry, error) {
groupLabels = make(map[string]string)
}
alerts := make([]EntryAlert, len(lokiEntry.Alerts))
for i, a := range lokiEntry.Alerts {
alerts[i] = EntryAlert{
Status: a.Status,
Labels: a.Labels,
Annotations: a.Annotations,
StartsAt: a.StartsAt,
EndsAt: a.EndsAt,
}
}
alerts := []EntryAlert{{
Status: lokiEntry.Alert.Status,
Labels: lokiEntry.Alert.Labels,
Annotations: lokiEntry.Alert.Annotations,
StartsAt: lokiEntry.Alert.StartsAt,
EndsAt: lokiEntry.Alert.EndsAt,
}}
return Entry{
Timestamp: s.T,
@@ -7,6 +7,7 @@ import (
"testing"
"time"
"github.com/grafana/alerting/models"
"github.com/grafana/alerting/notify/historian"
"github.com/grafana/alerting/notify/historian/lokiclient"
"github.com/grafana/grafana-app-sdk/logging"
@@ -133,9 +134,8 @@ func TestBuildQuery(t *testing.T) {
query: Query{
RuleUID: stringPtr("test-rule-uid"),
},
expected: fmt.Sprintf(`{%s=%q,%s=%q} | json`,
historian.LabelFrom, historian.LabelFromValue,
historian.LabelRuleUID, "test-rule-uid"),
expected: fmt.Sprintf(`{%s=%q} | json | alert_labels___alert_rule_uid__ = "test-rule-uid"`,
historian.LabelFrom, historian.LabelFromValue),
},
{
name: "query with receiver filter",
@@ -143,9 +143,8 @@ func TestBuildQuery(t *testing.T) {
RuleUID: stringPtr("test-rule-uid"),
Receiver: stringPtr("email-receiver"),
},
expected: fmt.Sprintf(`{%s=%q,%s=%q} | json | receiver = "email-receiver"`,
historian.LabelFrom, historian.LabelFromValue,
historian.LabelRuleUID, "test-rule-uid"),
expected: fmt.Sprintf(`{%s=%q} | json | alert_labels___alert_rule_uid__ = "test-rule-uid" | receiver = "email-receiver"`,
historian.LabelFrom, historian.LabelFromValue),
},
{
name: "query with status filter",
@@ -153,9 +152,8 @@ func TestBuildQuery(t *testing.T) {
RuleUID: stringPtr("test-rule-uid"),
Status: createStatusPtr(v0alpha1.CreateNotificationqueryRequestNotificationStatusFiring),
},
expected: fmt.Sprintf(`{%s=%q,%s=%q} | json | status = "firing"`,
historian.LabelFrom, historian.LabelFromValue,
historian.LabelRuleUID, "test-rule-uid"),
expected: fmt.Sprintf(`{%s=%q} | json | alert_labels___alert_rule_uid__ = "test-rule-uid" | status = "firing"`,
historian.LabelFrom, historian.LabelFromValue),
},
{
name: "query with success outcome filter",
@@ -163,9 +161,8 @@ func TestBuildQuery(t *testing.T) {
RuleUID: stringPtr("test-rule-uid"),
Outcome: outcomePtr(v0alpha1.CreateNotificationqueryRequestNotificationOutcomeSuccess),
},
expected: fmt.Sprintf(`{%s=%q,%s=%q} | json | error = ""`,
historian.LabelFrom, historian.LabelFromValue,
historian.LabelRuleUID, "test-rule-uid"),
expected: fmt.Sprintf(`{%s=%q} | json | alert_labels___alert_rule_uid__ = "test-rule-uid" | error = ""`,
historian.LabelFrom, historian.LabelFromValue),
},
{
name: "query with error outcome filter",
@@ -173,9 +170,8 @@ func TestBuildQuery(t *testing.T) {
RuleUID: stringPtr("test-rule-uid"),
Outcome: outcomePtr(v0alpha1.CreateNotificationqueryRequestNotificationOutcomeError),
},
expected: fmt.Sprintf(`{%s=%q,%s=%q} | json | error != ""`,
historian.LabelFrom, historian.LabelFromValue,
historian.LabelRuleUID, "test-rule-uid"),
expected: fmt.Sprintf(`{%s=%q} | json | alert_labels___alert_rule_uid__ = "test-rule-uid" | error != ""`,
historian.LabelFrom, historian.LabelFromValue),
},
{
name: "query with many filters",
@@ -185,9 +181,8 @@ func TestBuildQuery(t *testing.T) {
Status: createStatusPtr(v0alpha1.CreateNotificationqueryRequestNotificationStatusResolved),
Outcome: outcomePtr(v0alpha1.CreateNotificationqueryRequestNotificationOutcomeSuccess),
},
expected: fmt.Sprintf(`{%s=%q,%s=%q} | json | receiver = "email-receiver" | status = "resolved" | error = ""`,
historian.LabelFrom, historian.LabelFromValue,
historian.LabelRuleUID, "test-rule-uid"),
expected: fmt.Sprintf(`{%s=%q} | json | alert_labels___alert_rule_uid__ = "test-rule-uid" | receiver = "email-receiver" | status = "resolved" | error = ""`,
historian.LabelFrom, historian.LabelFromValue),
},
{
name: "query with group label matcher",
@@ -277,19 +272,19 @@ func TestParseLokiEntry(t *testing.T) {
GroupLabels: map[string]string{
"alertname": "test-alert",
},
Alerts: []historian.NotificationHistoryLokiEntryAlert{
{
Status: "firing",
Labels: map[string]string{
"severity": "critical",
},
Annotations: map[string]string{
"summary": "Test alert",
},
StartsAt: now,
EndsAt: now.Add(1 * time.Hour),
Alert: historian.NotificationHistoryLokiEntryAlert{
Status: "firing",
Labels: map[string]string{
"severity": "critical",
},
Annotations: map[string]string{
"summary": "Test alert",
},
StartsAt: now,
EndsAt: now.Add(1 * time.Hour),
},
AlertIndex: 0,
AlertCount: 1,
Retry: false,
Duration: 100,
PipelineTime: now,
@@ -335,7 +330,9 @@ func TestParseLokiEntry(t *testing.T) {
Error: "notification failed",
GroupKey: "key:thing",
GroupLabels: map[string]string{},
Alerts: []historian.NotificationHistoryLokiEntryAlert{},
Alert: historian.NotificationHistoryLokiEntryAlert{},
AlertIndex: 0,
AlertCount: 1,
PipelineTime: now,
}),
},
@@ -347,7 +344,7 @@ func TestParseLokiEntry(t *testing.T) {
Outcome: OutcomeError,
GroupKey: "key:thing",
GroupLabels: map[string]string{},
Alerts: []EntryAlert{},
Alerts: []EntryAlert{{}},
Error: stringPtr("notification failed"),
PipelineTime: now,
},
@@ -365,7 +362,7 @@ func TestParseLokiEntry(t *testing.T) {
Status: Status("firing"),
Outcome: OutcomeSuccess,
GroupLabels: map[string]string{},
Alerts: []EntryAlert{},
Alerts: []EntryAlert{{}},
PipelineTime: now,
},
},
@@ -448,7 +445,9 @@ func TestLokiReader_RunQuery(t *testing.T) {
Receiver: "receiver-1",
Status: "firing",
GroupLabels: map[string]string{},
Alerts: []historian.NotificationHistoryLokiEntryAlert{},
Alert: historian.NotificationHistoryLokiEntryAlert{},
AlertIndex: 0,
AlertCount: 1,
PipelineTime: now,
}),
},
@@ -459,7 +458,9 @@ func TestLokiReader_RunQuery(t *testing.T) {
Receiver: "receiver-3",
Status: "firing",
GroupLabels: map[string]string{},
Alerts: []historian.NotificationHistoryLokiEntryAlert{},
Alert: historian.NotificationHistoryLokiEntryAlert{},
AlertIndex: 0,
AlertCount: 1,
PipelineTime: now,
}),
},
@@ -474,7 +475,9 @@ func TestLokiReader_RunQuery(t *testing.T) {
Receiver: "receiver-2",
Status: "firing",
GroupLabels: map[string]string{},
Alerts: []historian.NotificationHistoryLokiEntryAlert{},
Alert: historian.NotificationHistoryLokiEntryAlert{},
AlertIndex: 0,
AlertCount: 1,
PipelineTime: now,
}),
},
@@ -546,19 +549,19 @@ func createMockLokiResponse(timestamp time.Time) lokiclient.QueryRes {
GroupLabels: map[string]string{
"alertname": "test-alert",
},
Alerts: []historian.NotificationHistoryLokiEntryAlert{
{
Status: "firing",
Labels: map[string]string{
"severity": "critical",
},
Annotations: map[string]string{
"summary": "Test alert",
},
StartsAt: timestamp,
EndsAt: timestamp.Add(1 * time.Hour),
Alert: historian.NotificationHistoryLokiEntryAlert{
Status: "firing",
Labels: map[string]string{
"severity": "critical",
},
Annotations: map[string]string{
"summary": "Test alert",
},
StartsAt: timestamp,
EndsAt: timestamp.Add(1 * time.Hour),
},
AlertIndex: 0,
AlertCount: 1,
Retry: false,
Duration: 100,
PipelineTime: timestamp,
@@ -587,10 +590,19 @@ func createLokiEntryJSONWithNilLabels(t *testing.T, timestamp time.Time) string
"status": "firing",
"error": "",
"groupLabels": null,
"alerts": [],
"alert": {},
"alertIndex": 0,
"alertCount": 1,
"retry": false,
"duration": 0,
"pipelineTime": "%s"
}`, timestamp.Format(time.RFC3339Nano))
return jsonStr
}
func TestRuleUIDLabelConstant(t *testing.T) {
// Verify that models.RuleUIDLabel has the expected value.
// If this changes in the alerting module, our LogQL field path constant will be incorrect
// and filtering for a single alert rule by its UID will break.
assert.Equal(t, "__alert_rule_uid__", models.RuleUIDLabel)
}
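The field flattening that `lokiAlertRuleUIDField` and `TestRuleUIDLabelConstant` guard against can be illustrated with a small sketch. The flattening rule here is an assumption inferred from the comment in the diff (Loki's `| json` parser turns `alert.labels.__alert_rule_uid__` into `alert_labels___alert_rule_uid__`, i.e. path separators become underscores); `lokiJSONFieldPath` is a hypothetical helper, not part of the changed code.

```go
package main

import (
	"fmt"
	"strings"
)

// lokiJSONFieldPath mimics how Loki's `| json` parser flattens a nested JSON
// path into a single extracted-label name: each "." separator becomes "_".
// Hypothetical helper for illustration; the real code hard-codes the result
// as the constant lokiAlertRuleUIDField and verifies models.RuleUIDLabel.
func lokiJSONFieldPath(path string) string {
	return strings.ReplaceAll(path, ".", "_")
}

func main() {
	fmt.Println(lokiJSONFieldPath("alert.labels.__alert_rule_uid__"))
	// prints alert_labels___alert_rule_uid__
}
```

Because the leading and trailing underscores of `__alert_rule_uid__` survive the flattening, the resulting label name contains a triple underscore, which is why the constant looks unusual and why the test pins `models.RuleUIDLabel` to `"__alert_rule_uid__"`.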
-2
@@ -6,8 +6,6 @@ require (
github.com/grafana/grafana-app-sdk v0.48.7
github.com/grafana/grafana-app-sdk/logging v0.48.7
k8s.io/apimachinery v0.34.3
k8s.io/apiserver v0.34.2
k8s.io/kube-openapi v0.0.0-20250910181357-589584f1c912
k8s.io/kube-openapi v0.0.0-20251125145642-4e65d59e963e
)
-2
@@ -248,8 +248,6 @@ k8s.io/apimachinery v0.34.3 h1:/TB+SFEiQvN9HPldtlWOTp0hWbJ+fjU+wkxysf/aQnE=
k8s.io/apimachinery v0.34.3/go.mod h1:/GwIlEcWuTX9zKIg2mbw0LRFIsXwrfoVxn+ef0X13lw=
k8s.io/client-go v0.34.3 h1:wtYtpzy/OPNYf7WyNBTj3iUA0XaBHVqhv4Iv3tbrF5A=
k8s.io/client-go v0.34.3/go.mod h1:OxxeYagaP9Kdf78UrKLa3YZixMCfP6bgPwPwNBQBzpM=
k8s.io/apiserver v0.34.2 h1:2/yu8suwkmES7IzwlehAovo8dDE07cFRC7KMDb1+MAE=
k8s.io/apiserver v0.34.2/go.mod h1:gqJQy2yDOB50R3JUReHSFr+cwJnL8G1dzTA0YLEqAPI=
k8s.io/klog/v2 v2.130.1 h1:n9Xl7H1Xvksem4KFG4PYbdQCQxqc/tTUyrgXaOhHSzk=
k8s.io/klog/v2 v2.130.1/go.mod h1:3Jpz1GvMt720eyJH1ckRHK1EDfpxISzJ7I9OYgaDtPE=
k8s.io/kube-openapi v0.0.0-20251125145642-4e65d59e963e h1:iW9ChlU0cU16w8MpVYjXk12dqQ4BPFBEgif+ap7/hqQ=
+8 -13
@@ -1,22 +1,17 @@
package kinds
annotationv0alpha1: {
kind: "Annotation"
kind: "Annotation"
pluralName: "Annotations"
schema: {
spec: {
text: string
schema: {
spec: {
text: string
time: int64
timeEnd?: int64
dashboardUID?: string
panelID?: int64
tags?: [...string]
}
}
selectableFields: [
"spec.time",
"spec.timeEnd",
"spec.dashboardUID",
"spec.panelID",
]
}
}
}
}
@@ -25,13 +25,6 @@ type Annotation struct {
Status AnnotationStatus `json:"status" yaml:"status"`
}
func NewAnnotation() *Annotation {
return &Annotation{
Spec: *NewAnnotationSpec(),
Status: *NewAnnotationStatus(),
}
}
func (o *Annotation) GetSpec() any {
return o.Spec
}
@@ -5,69 +5,13 @@
package v0alpha1
import (
"errors"
"fmt"
"github.com/grafana/grafana-app-sdk/resource"
)
// schema is unexported to prevent accidental overwrites
var (
schemaAnnotation = resource.NewSimpleSchema("annotation.grafana.app", "v0alpha1", NewAnnotation(), &AnnotationList{}, resource.WithKind("Annotation"),
resource.WithPlural("annotations"), resource.WithScope(resource.NamespacedScope), resource.WithSelectableFields([]resource.SelectableField{resource.SelectableField{
FieldSelector: "spec.time",
FieldValueFunc: func(o resource.Object) (string, error) {
cast, ok := o.(*Annotation)
if !ok {
return "", errors.New("provided object must be of type *Annotation")
}
return fmt.Sprintf("%d", cast.Spec.Time), nil
},
},
resource.SelectableField{
FieldSelector: "spec.timeEnd",
FieldValueFunc: func(o resource.Object) (string, error) {
cast, ok := o.(*Annotation)
if !ok {
return "", errors.New("provided object must be of type *Annotation")
}
if cast.Spec.TimeEnd == nil {
return "", nil
}
return fmt.Sprintf("%d", *cast.Spec.TimeEnd), nil
},
},
resource.SelectableField{
FieldSelector: "spec.dashboardUID",
FieldValueFunc: func(o resource.Object) (string, error) {
cast, ok := o.(*Annotation)
if !ok {
return "", errors.New("provided object must be of type *Annotation")
}
if cast.Spec.DashboardUID == nil {
return "", nil
}
return *cast.Spec.DashboardUID, nil
},
},
resource.SelectableField{
FieldSelector: "spec.panelID",
FieldValueFunc: func(o resource.Object) (string, error) {
cast, ok := o.(*Annotation)
if !ok {
return "", errors.New("provided object must be of type *Annotation")
}
if cast.Spec.PanelID == nil {
return "", nil
}
return fmt.Sprintf("%d", *cast.Spec.PanelID), nil
},
},
}))
schemaAnnotation = resource.NewSimpleSchema("annotation.grafana.app", "v0alpha1", &Annotation{}, &AnnotationList{}, resource.WithKind("Annotation"),
resource.WithPlural("annotations"), resource.WithScope(resource.NamespacedScope))
kindAnnotation = resource.Kind{
Schema: schemaAnnotation,
Codecs: map[resource.KindEncoding]resource.Codec{
-28
@@ -40,12 +40,6 @@ var appManifestData = app.ManifestData{
Scope: "Namespaced",
Conversion: false,
Schema: &versionSchemaAnnotationv0alpha1,
SelectableFields: []string{
"spec.time",
"spec.timeEnd",
"spec.dashboardUID",
"spec.panelID",
},
},
},
Routes: app.ManifestVersionRoutes{
@@ -83,28 +77,6 @@ var appManifestData = app.ManifestData{
"tags": {
SchemaProps: spec.SchemaProps{
Type: []string{"array"},
Items: &spec.SchemaOrArray{
Schema: &spec.Schema{
SchemaProps: spec.SchemaProps{
Type: []string{"object"},
Properties: map[string]spec.Schema{
"count": {
SchemaProps: spec.SchemaProps{
Type: []string{"number"},
},
},
"tag": {
SchemaProps: spec.SchemaProps{
Type: []string{"string"},
},
},
},
Required: []string{
"tag",
"count",
},
}},
},
},
},
},
-20
@@ -1,20 +0,0 @@
package app
import (
"context"
"k8s.io/apiserver/pkg/authorization/authorizer"
)
func GetAuthorizer() authorizer.Authorizer {
return authorizer.AuthorizerFunc(func(
ctx context.Context, attr authorizer.Attributes,
) (authorized authorizer.Decision, reason string, err error) {
if !attr.IsResourceRequest() {
return authorizer.DecisionNoOpinion, "", nil
}
// Any authenticated user can access the API
return authorizer.DecisionAllow, "", nil
})
}
@@ -0,0 +1,142 @@
{
"kind": "Dashboard",
"apiVersion": "dashboard.grafana.app/v1beta1",
"metadata": {
"name": "bom-in-links-test",
"namespace": "org-1",
"labels": {
"test": "bom-stripping"
}
},
"spec": {
"title": "BOM Stripping Test Dashboard",
"description": "Testing that BOM characters are stripped from URLs during conversion",
"schemaVersion": 42,
"tags": ["test", "bom"],
"editable": true,
"links": [
{
"title": "Dashboard link with BOM",
"type": "link",
"url": "http://example.com?var=${datasource}&other=value",
"targetBlank": true,
"icon": "external link"
}
],
"panels": [
{
"id": 1,
"type": "table",
"title": "Panel with BOM in field config override links",
"gridPos": {
"h": 8,
"w": 12,
"x": 0,
"y": 0
},
"fieldConfig": {
"defaults": {
"custom": {},
"mappings": [],
"thresholds": {
"mode": "absolute",
"steps": [
{"color": "green"},
{"color": "red", "value": 80}
]
}
},
"overrides": [
{
"matcher": {
"id": "byName",
"options": "server"
},
"properties": [
{
"id": "links",
"value": [
{
"title": "Override link with BOM",
"url": "http://localhost:3000/d/test?var-datacenter=${__data.fields[datacenter]}&var-server=${__value.raw}"
}
]
}
]
}
]
},
"links": [
{
"title": "Panel data link with BOM",
"url": "http://example.com/${__data.fields.cluster}&var=value",
"targetBlank": true
}
],
"targets": [
{
"refId": "A",
"datasource": {
"type": "prometheus",
"uid": "test-ds"
}
}
]
},
{
"id": 2,
"type": "timeseries",
"title": "Panel with BOM in options dataLinks",
"gridPos": {
"h": 8,
"w": 12,
"x": 12,
"y": 0
},
"options": {
"legend": {
"showLegend": true,
"displayMode": "list",
"placement": "bottom"
},
"dataLinks": [
{
"title": "Options data link with BOM",
"url": "http://example.com?series=${__series.name}&time=${__value.time}",
"targetBlank": true
}
]
},
"fieldConfig": {
"defaults": {
"links": [
{
"title": "Field config default link with BOM",
"url": "http://example.com?field=${__field.name}&value=${__value.raw}",
"targetBlank": false
}
]
},
"overrides": []
},
"targets": [
{
"refId": "A",
"datasource": {
"type": "prometheus",
"uid": "test-ds"
}
}
]
}
],
"time": {
"from": "now-6h",
"to": "now"
},
"timepicker": {
"refresh_intervals": ["5s", "10s", "30s", "1m", "5m"]
}
}
}
@@ -120,7 +120,7 @@
"value": [
{
"title": "filter",
"url": "http://localhost:3000/d/-Y-tnEDWk/templating-nested-template-variables?var-datacenter=${__data.fields[datacenter]}\u0026var-server=${__value.raw}"
"url": "http://localhost:3000/d/-Y-tnEDWk/templating-nested-template-variables?var-datacenter=${__data.fields[datacenter]}\u0026var-server=${__value.raw}"
}
]
}
@@ -124,7 +124,7 @@
"value": [
{
"title": "filter",
"url": "http://localhost:3000/d/-Y-tnEDWk/templating-nested-template-variables?var-datacenter=${__data.fields[datacenter]}\u0026var-server=${__value.raw}"
"url": "http://localhost:3000/d/-Y-tnEDWk/templating-nested-template-variables?var-datacenter=${__data.fields[datacenter]}\u0026var-server=${__value.raw}"
}
]
}
@@ -2051,4 +2051,4 @@
"storedVersion": "v0alpha1"
}
}
}
}
@@ -2691,4 +2691,4 @@
"storedVersion": "v0alpha1"
}
}
}
}
@@ -2764,4 +2764,4 @@
"storedVersion": "v0alpha1"
}
}
}
}
@@ -1173,4 +1173,4 @@
"storedVersion": "v0alpha1"
}
}
}
}
@@ -1618,4 +1618,4 @@
"storedVersion": "v0alpha1"
}
}
}
}
@@ -1670,4 +1670,4 @@
"storedVersion": "v0alpha1"
}
}
}
}
@@ -0,0 +1,161 @@
{
"kind": "Dashboard",
"apiVersion": "dashboard.grafana.app/v0alpha1",
"metadata": {
"name": "bom-in-links-test",
"namespace": "org-1",
"labels": {
"test": "bom-stripping"
}
},
"spec": {
"description": "Testing that BOM characters are stripped from URLs during conversion",
"editable": true,
"links": [
{
"icon": "external link",
"targetBlank": true,
"title": "Dashboard link with BOM",
"type": "link",
"url": "http://example.com?var=${datasource}\u0026other=value"
}
],
"panels": [
{
"fieldConfig": {
"defaults": {
"custom": {},
"mappings": [],
"thresholds": {
"mode": "absolute",
"steps": [
{
"color": "green"
},
{
"color": "red",
"value": 80
}
]
}
},
"overrides": [
{
"matcher": {
"id": "byName",
"options": "server"
},
"properties": [
{
"id": "links",
"value": [
{
"title": "Override link with BOM",
"url": "http://localhost:3000/d/test?var-datacenter=${__data.fields[datacenter]}\u0026var-server=${__value.raw}"
}
]
}
]
}
]
},
"gridPos": {
"h": 8,
"w": 12,
"x": 0,
"y": 0
},
"id": 1,
"links": [
{
"targetBlank": true,
"title": "Panel data link with BOM",
"url": "http://example.com/${__data.fields.cluster}\u0026var=value"
}
],
"targets": [
{
"datasource": {
"type": "prometheus",
"uid": "test-ds"
},
"refId": "A"
}
],
"title": "Panel with BOM in field config override links",
"type": "table"
},
{
"fieldConfig": {
"defaults": {
"links": [
{
"targetBlank": false,
"title": "Field config default link with BOM",
"url": "http://example.com?field=${__field.name}\u0026value=${__value.raw}"
}
]
},
"overrides": []
},
"gridPos": {
"h": 8,
"w": 12,
"x": 12,
"y": 0
},
"id": 2,
"options": {
"dataLinks": [
{
"targetBlank": true,
"title": "Options data link with BOM",
"url": "http://example.com?series=${__series.name}\u0026time=${__value.time}"
}
],
"legend": {
"displayMode": "list",
"placement": "bottom",
"showLegend": true
}
},
"targets": [
{
"datasource": {
"type": "prometheus",
"uid": "test-ds"
},
"refId": "A"
}
],
"title": "Panel with BOM in options dataLinks",
"type": "timeseries"
}
],
"schemaVersion": 42,
"tags": [
"test",
"bom"
],
"time": {
"from": "now-6h",
"to": "now"
},
"timepicker": {
"refresh_intervals": [
"5s",
"10s",
"30s",
"1m",
"5m"
]
},
"title": "BOM Stripping Test Dashboard"
},
"status": {
"conversion": {
"failed": false,
"storedVersion": "v1beta1"
}
}
}
@@ -0,0 +1,242 @@
{
"kind": "Dashboard",
"apiVersion": "dashboard.grafana.app/v2alpha1",
"metadata": {
"name": "bom-in-links-test",
"namespace": "org-1",
"labels": {
"test": "bom-stripping"
}
},
"spec": {
"annotations": [],
"cursorSync": "Off",
"description": "Testing that BOM characters are stripped from URLs during conversion",
"editable": true,
"elements": {
"panel-1": {
"kind": "Panel",
"spec": {
"id": 1,
"title": "Panel with BOM in field config override links",
"description": "",
"links": [
{
"title": "Panel data link with BOM",
"url": "http://example.com/${__data.fields.cluster}\u0026var=value",
"targetBlank": true
}
],
"data": {
"kind": "QueryGroup",
"spec": {
"queries": [
{
"kind": "PanelQuery",
"spec": {
"query": {
"kind": "prometheus",
"spec": {}
},
"datasource": {
"type": "prometheus",
"uid": "test-ds"
},
"refId": "A",
"hidden": false
}
}
],
"transformations": [],
"queryOptions": {}
}
},
"vizConfig": {
"kind": "table",
"spec": {
"pluginVersion": "",
"options": {},
"fieldConfig": {
"defaults": {
"thresholds": {
"mode": "absolute",
"steps": [
{
"value": null,
"color": "green"
},
{
"value": 80,
"color": "red"
}
]
}
},
"overrides": [
{
"matcher": {
"id": "byName",
"options": "server"
},
"properties": [
{
"id": "links",
"value": [
{
"title": "Override link with BOM",
"url": "http://localhost:3000/d/test?var-datacenter=${__data.fields[datacenter]}\u0026var-server=${__value.raw}"
}
]
}
]
}
]
}
}
}
}
},
"panel-2": {
"kind": "Panel",
"spec": {
"id": 2,
"title": "Panel with BOM in options dataLinks",
"description": "",
"links": [],
"data": {
"kind": "QueryGroup",
"spec": {
"queries": [
{
"kind": "PanelQuery",
"spec": {
"query": {
"kind": "prometheus",
"spec": {}
},
"datasource": {
"type": "prometheus",
"uid": "test-ds"
},
"refId": "A",
"hidden": false
}
}
],
"transformations": [],
"queryOptions": {}
}
},
"vizConfig": {
"kind": "timeseries",
"spec": {
"pluginVersion": "",
"options": {
"dataLinks": [
{
"targetBlank": true,
"title": "Options data link with BOM",
"url": "http://example.com?series=${__series.name}\u0026time=${__value.time}"
}
],
"legend": {
"displayMode": "list",
"placement": "bottom",
"showLegend": true
}
},
"fieldConfig": {
"defaults": {
"links": [
{
"targetBlank": false,
"title": "Field config default link with BOM",
"url": "http://example.com?field=${__field.name}\u0026value=${__value.raw}"
}
]
},
"overrides": []
}
}
}
}
}
},
"layout": {
"kind": "GridLayout",
"spec": {
"items": [
{
"kind": "GridLayoutItem",
"spec": {
"x": 0,
"y": 0,
"width": 12,
"height": 8,
"element": {
"kind": "ElementReference",
"name": "panel-1"
}
}
},
{
"kind": "GridLayoutItem",
"spec": {
"x": 12,
"y": 0,
"width": 12,
"height": 8,
"element": {
"kind": "ElementReference",
"name": "panel-2"
}
}
}
]
}
},
"links": [
{
"title": "Dashboard link with BOM",
"type": "link",
"icon": "external link",
"tooltip": "",
"url": "http://example.com?var=${datasource}\u0026other=value",
"tags": [],
"asDropdown": false,
"targetBlank": true,
"includeVars": false,
"keepTime": false
}
],
"liveNow": false,
"preload": false,
"tags": [
"test",
"bom"
],
"timeSettings": {
"timezone": "browser",
"from": "now-6h",
"to": "now",
"autoRefresh": "",
"autoRefreshIntervals": [
"5s",
"10s",
"30s",
"1m",
"5m"
],
"hideTimepicker": false,
"fiscalYearStartMonth": 0
},
"title": "BOM Stripping Test Dashboard",
"variables": []
},
"status": {
"conversion": {
"failed": false,
"storedVersion": "v1beta1"
}
}
}
@@ -0,0 +1,246 @@
{
"kind": "Dashboard",
"apiVersion": "dashboard.grafana.app/v2beta1",
"metadata": {
"name": "bom-in-links-test",
"namespace": "org-1",
"labels": {
"test": "bom-stripping"
}
},
"spec": {
"annotations": [],
"cursorSync": "Off",
"description": "Testing that BOM characters are stripped from URLs during conversion",
"editable": true,
"elements": {
"panel-1": {
"kind": "Panel",
"spec": {
"id": 1,
"title": "Panel with BOM in field config override links",
"description": "",
"links": [
{
"title": "Panel data link with BOM",
"url": "http://example.com/${__data.fields.cluster}\u0026var=value",
"targetBlank": true
}
],
"data": {
"kind": "QueryGroup",
"spec": {
"queries": [
{
"kind": "PanelQuery",
"spec": {
"query": {
"kind": "DataQuery",
"group": "prometheus",
"version": "v0",
"datasource": {
"name": "test-ds"
},
"spec": {}
},
"refId": "A",
"hidden": false
}
}
],
"transformations": [],
"queryOptions": {}
}
},
"vizConfig": {
"kind": "VizConfig",
"group": "table",
"version": "",
"spec": {
"options": {},
"fieldConfig": {
"defaults": {
"thresholds": {
"mode": "absolute",
"steps": [
{
"value": null,
"color": "green"
},
{
"value": 80,
"color": "red"
}
]
}
},
"overrides": [
{
"matcher": {
"id": "byName",
"options": "server"
},
"properties": [
{
"id": "links",
"value": [
{
"title": "Override link with BOM",
"url": "http://localhost:3000/d/test?var-datacenter=${__data.fields[datacenter]}\u0026var-server=${__value.raw}"
}
]
}
]
}
]
}
}
}
}
},
"panel-2": {
"kind": "Panel",
"spec": {
"id": 2,
"title": "Panel with BOM in options dataLinks",
"description": "",
"links": [],
"data": {
"kind": "QueryGroup",
"spec": {
"queries": [
{
"kind": "PanelQuery",
"spec": {
"query": {
"kind": "DataQuery",
"group": "prometheus",
"version": "v0",
"datasource": {
"name": "test-ds"
},
"spec": {}
},
"refId": "A",
"hidden": false
}
}
],
"transformations": [],
"queryOptions": {}
}
},
"vizConfig": {
"kind": "VizConfig",
"group": "timeseries",
"version": "",
"spec": {
"options": {
"dataLinks": [
{
"targetBlank": true,
"title": "Options data link with BOM",
"url": "http://example.com?series=${__series.name}\u0026time=${__value.time}"
}
],
"legend": {
"displayMode": "list",
"placement": "bottom",
"showLegend": true
}
},
"fieldConfig": {
"defaults": {
"links": [
{
"targetBlank": false,
"title": "Field config default link with BOM",
"url": "http://example.com?field=${__field.name}\u0026value=${__value.raw}"
}
]
},
"overrides": []
}
}
}
}
}
},
"layout": {
"kind": "GridLayout",
"spec": {
"items": [
{
"kind": "GridLayoutItem",
"spec": {
"x": 0,
"y": 0,
"width": 12,
"height": 8,
"element": {
"kind": "ElementReference",
"name": "panel-1"
}
}
},
{
"kind": "GridLayoutItem",
"spec": {
"x": 12,
"y": 0,
"width": 12,
"height": 8,
"element": {
"kind": "ElementReference",
"name": "panel-2"
}
}
}
]
}
},
"links": [
{
"title": "Dashboard link with BOM",
"type": "link",
"icon": "external link",
"tooltip": "",
"url": "http://example.com?var=${datasource}\u0026other=value",
"tags": [],
"asDropdown": false,
"targetBlank": true,
"includeVars": false,
"keepTime": false
}
],
"liveNow": false,
"preload": false,
"tags": [
"test",
"bom"
],
"timeSettings": {
"timezone": "browser",
"from": "now-6h",
"to": "now",
"autoRefresh": "",
"autoRefreshIntervals": [
"5s",
"10s",
"30s",
"1m",
"5m"
],
"hideTimepicker": false,
"fiscalYearStartMonth": 0
},
"title": "BOM Stripping Test Dashboard",
"variables": []
},
"status": {
"conversion": {
"failed": false,
"storedVersion": "v1beta1"
}
}
}
@@ -229,6 +229,36 @@ func getBoolField(m map[string]interface{}, key string, defaultValue bool) bool
return defaultValue
}
// stripBOM removes Byte Order Mark (BOM) characters from a string.
// BOMs (U+FEFF) can be introduced through copy/paste from certain editors
// and cause CUE validation errors ("illegal byte order mark").
func stripBOM(s string) string {
return strings.ReplaceAll(s, "\ufeff", "")
}
// stripBOMFromInterface recursively strips BOM characters from all strings
// in an interface{} value (map, slice, or string).
func stripBOMFromInterface(v interface{}) interface{} {
switch val := v.(type) {
case string:
return stripBOM(val)
case map[string]interface{}:
result := make(map[string]interface{}, len(val))
for k, v := range val {
result[k] = stripBOMFromInterface(v)
}
return result
case []interface{}:
result := make([]interface{}, len(val))
for i, item := range val {
result[i] = stripBOMFromInterface(item)
}
return result
default:
return v
}
}
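Per the commit message, the frontend v2 conversion performs the same cleanup when serializing dashboards. As a rough standalone TypeScript sketch of the same recursive stripping logic — `stripBOMsDeep` and its shape are illustrative here, not the actual utility added in the diff:

```typescript
// Recursively remove U+FEFF (Byte Order Mark) from every string in a
// nested value, mirroring the Go stripBOMFromInterface helper above.
const BOM = '\ufeff';

function stripBOMsDeep(value: unknown): unknown {
  if (typeof value === 'string') {
    // Remove every BOM occurrence, not just a leading one
    return value.split(BOM).join('');
  }
  if (Array.isArray(value)) {
    return value.map(stripBOMsDeep);
  }
  if (value !== null && typeof value === 'object') {
    const result: Record<string, unknown> = {};
    for (const [k, v] of Object.entries(value)) {
      result[k] = stripBOMsDeep(v);
    }
    return result;
  }
  // Numbers, booleans, null, undefined pass through unchanged
  return value;
}
```

Applied to a link object such as `{ url: '\ufeffhttp://example.com' }`, this yields a clean URL that passes CUE validation.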
func getUnionField[T ~string](m map[string]interface{}, key string) *T {
if val, ok := m[key]; ok {
if str, ok := val.(string); ok && str != "" {
@@ -393,7 +423,8 @@ func transformLinks(dashboard map[string]interface{}) []dashv2alpha1.DashboardDa
// Optional field - only set if present
if url, exists := linkMap["url"]; exists {
if urlStr, ok := url.(string); ok {
dashLink.Url = &urlStr
cleanUrl := stripBOM(urlStr)
dashLink.Url = &cleanUrl
}
}
@@ -2239,7 +2270,7 @@ func transformDataLinks(panelMap map[string]interface{}) []dashv2alpha1.Dashboar
if linkMap, ok := link.(map[string]interface{}); ok {
dataLink := dashv2alpha1.DashboardDataLink{
Title: schemaversion.GetStringValue(linkMap, "title"),
Url: schemaversion.GetStringValue(linkMap, "url"),
Url: stripBOM(schemaversion.GetStringValue(linkMap, "url")),
}
if _, exists := linkMap["targetBlank"]; exists {
targetBlank := getBoolField(linkMap, "targetBlank", false)
@@ -2331,6 +2362,12 @@ func buildVizConfig(panelMap map[string]interface{}) dashv2alpha1.DashboardVizCo
}
}
// Strip BOMs from options (may contain dataLinks with URLs that have BOMs)
cleanedOptions := stripBOMFromInterface(options)
if cleanedMap, ok := cleanedOptions.(map[string]interface{}); ok {
options = cleanedMap
}
// Build field config by mapping each field individually
fieldConfigSource := extractFieldConfigSource(fieldConfig)
@@ -2474,9 +2511,14 @@ func extractFieldConfigDefaults(defaults map[string]interface{}) dashv2alpha1.Da
hasDefaults = true
}
// Extract array field
// Extract array field - strip BOMs from link URLs
if linksArray, ok := extractArrayField(defaults, "links"); ok {
fieldConfigDefaults.Links = linksArray
cleanedLinks := stripBOMFromInterface(linksArray)
if cleanedArray, ok := cleanedLinks.([]interface{}); ok {
fieldConfigDefaults.Links = cleanedArray
} else {
fieldConfigDefaults.Links = linksArray
}
hasDefaults = true
}
@@ -2762,9 +2804,11 @@ func extractFieldConfigOverrides(fieldConfig map[string]interface{}) []dashv2alp
fieldOverride.Properties = make([]dashv2alpha1.DashboardDynamicConfigValue, 0, len(propertiesArray))
for _, property := range propertiesArray {
if propertyMap, ok := property.(map[string]interface{}); ok {
// Strip BOMs from property values (may contain links with URLs)
cleanedValue := stripBOMFromInterface(propertyMap["value"])
fieldOverride.Properties = append(fieldOverride.Properties, dashv2alpha1.DashboardDynamicConfigValue{
Id: schemaversion.GetStringValue(propertyMap, "id"),
Value: propertyMap["value"],
Value: cleanedValue,
})
}
}
@@ -223,7 +223,7 @@ require (
github.com/googleapis/enterprise-certificate-proxy v0.3.6 // indirect
github.com/googleapis/gax-go/v2 v2.15.0 // indirect
github.com/gorilla/mux v1.8.1 // indirect
github.com/grafana/alerting v0.0.0-20251212143239-491433b332b7 // indirect
github.com/grafana/alerting v0.0.0-20251223160021-926c74910196 // indirect
github.com/grafana/authlib v0.0.0-20250930082137-a40e2c2b094f // indirect
github.com/grafana/authlib/types v0.0.0-20251119142549-be091cf2f4d4 // indirect
github.com/grafana/dataplane/sdata v0.0.9 // indirect
@@ -827,8 +827,8 @@ github.com/gorilla/mux v1.8.1 h1:TuBL49tXwgrFYWhqrNgrUNEY92u81SPhu7sTdzQEiWY=
github.com/gorilla/mux v1.8.1/go.mod h1:AKf9I4AEqPTmMytcMc0KkNouC66V3BtZ4qD5fmWSiMQ=
github.com/gorilla/websocket v1.5.4-0.20250319132907-e064f32e3674 h1:JeSE6pjso5THxAzdVpqr6/geYxZytqFMBCOtn/ujyeo=
github.com/gorilla/websocket v1.5.4-0.20250319132907-e064f32e3674/go.mod h1:r4w70xmWCQKmi1ONH4KIaBptdivuRPyosB9RmPlGEwA=
github.com/grafana/alerting v0.0.0-20251212143239-491433b332b7 h1:ZzG/gCclEit9w0QUfQt9GURcOycAIGcsQAhY1u0AEX0=
github.com/grafana/alerting v0.0.0-20251212143239-491433b332b7/go.mod h1:l7v67cgP7x72ajB9UPZlumdrHqNztpKoqQ52cU8T3LU=
github.com/grafana/alerting v0.0.0-20251223160021-926c74910196 h1:A9UJtyBBUE7PkRsAITKU05iz+HpHO9SaVjfdo2Df3UQ=
github.com/grafana/alerting v0.0.0-20251223160021-926c74910196/go.mod h1:l7v67cgP7x72ajB9UPZlumdrHqNztpKoqQ52cU8T3LU=
github.com/grafana/authlib v0.0.0-20250930082137-a40e2c2b094f h1:Cbm6OKkOcJ+7CSZsGsEJzktC/SIa5bxVeYKQLuYK86o=
github.com/grafana/authlib v0.0.0-20250930082137-a40e2c2b094f/go.mod h1:axY0cdOg3q0TZHwpHnIz5x16xZ8ZBxJHShsSHHXcHQg=
github.com/grafana/authlib/types v0.0.0-20251119142549-be091cf2f4d4 h1:Muoy+FMGrHj3GdFbvsMzUT7eusgii9PKf9L1ZaXDDbY=
@@ -90,7 +90,7 @@ require (
github.com/google/gnostic-models v0.7.1 // indirect
github.com/google/go-cmp v0.7.0 // indirect
github.com/google/uuid v1.6.0 // indirect
github.com/grafana/alerting v0.0.0-20251212143239-491433b332b7 // indirect
github.com/grafana/alerting v0.0.0-20251223160021-926c74910196 // indirect
github.com/grafana/authlib v0.0.0-20250930082137-a40e2c2b094f // indirect
github.com/grafana/authlib/types v0.0.0-20251119142549-be091cf2f4d4 // indirect
github.com/grafana/dataplane/sdata v0.0.9 // indirect
@@ -213,8 +213,8 @@ github.com/google/uuid v1.6.0 h1:NIvaJDMOsjHA8n1jAhLSgzrAzy1Hgr+hNrb57e+94F0=
github.com/google/uuid v1.6.0/go.mod h1:TIyPZe4MgqvfeYDBFedMoGGpEw/LqOeaOT+nhxU+yHo=
github.com/gorilla/websocket v1.5.4-0.20250319132907-e064f32e3674 h1:JeSE6pjso5THxAzdVpqr6/geYxZytqFMBCOtn/ujyeo=
github.com/gorilla/websocket v1.5.4-0.20250319132907-e064f32e3674/go.mod h1:r4w70xmWCQKmi1ONH4KIaBptdivuRPyosB9RmPlGEwA=
github.com/grafana/alerting v0.0.0-20251212143239-491433b332b7 h1:ZzG/gCclEit9w0QUfQt9GURcOycAIGcsQAhY1u0AEX0=
github.com/grafana/alerting v0.0.0-20251212143239-491433b332b7/go.mod h1:l7v67cgP7x72ajB9UPZlumdrHqNztpKoqQ52cU8T3LU=
github.com/grafana/alerting v0.0.0-20251223160021-926c74910196 h1:A9UJtyBBUE7PkRsAITKU05iz+HpHO9SaVjfdo2Df3UQ=
github.com/grafana/alerting v0.0.0-20251223160021-926c74910196/go.mod h1:l7v67cgP7x72ajB9UPZlumdrHqNztpKoqQ52cU8T3LU=
github.com/grafana/authlib v0.0.0-20250930082137-a40e2c2b094f h1:Cbm6OKkOcJ+7CSZsGsEJzktC/SIa5bxVeYKQLuYK86o=
github.com/grafana/authlib v0.0.0-20250930082137-a40e2c2b094f/go.mod h1:axY0cdOg3q0TZHwpHnIz5x16xZ8ZBxJHShsSHHXcHQg=
github.com/grafana/authlib/types v0.0.0-20251119142549-be091cf2f4d4 h1:Muoy+FMGrHj3GdFbvsMzUT7eusgii9PKf9L1ZaXDDbY=
@@ -87,7 +87,7 @@ require (
github.com/googleapis/gax-go/v2 v2.15.0 // @grafana/grafana-backend-group
github.com/gorilla/mux v1.8.1 // @grafana/grafana-backend-group
github.com/gorilla/websocket v1.5.4-0.20250319132907-e064f32e3674 // @grafana/grafana-app-platform-squad
github.com/grafana/alerting v0.0.0-20251212143239-491433b332b7 // @grafana/alerting-backend
github.com/grafana/alerting v0.0.0-20251223160021-926c74910196 // @grafana/alerting-backend
github.com/grafana/authlib v0.0.0-20250930082137-a40e2c2b094f // @grafana/identity-access-team
github.com/grafana/authlib/types v0.0.0-20251119142549-be091cf2f4d4 // @grafana/identity-access-team
github.com/grafana/dataplane/examples v0.0.1 // @grafana/observability-metrics
@@ -1622,8 +1622,8 @@ github.com/gorilla/sessions v1.2.1 h1:DHd3rPN5lE3Ts3D8rKkQ8x/0kqfeNmBAaiSi+o7Fsg
github.com/gorilla/sessions v1.2.1/go.mod h1:dk2InVEVJ0sfLlnXv9EAgkf6ecYs/i80K/zI+bUmuGM=
github.com/gorilla/websocket v1.5.4-0.20250319132907-e064f32e3674 h1:JeSE6pjso5THxAzdVpqr6/geYxZytqFMBCOtn/ujyeo=
github.com/gorilla/websocket v1.5.4-0.20250319132907-e064f32e3674/go.mod h1:r4w70xmWCQKmi1ONH4KIaBptdivuRPyosB9RmPlGEwA=
github.com/grafana/alerting v0.0.0-20251212143239-491433b332b7 h1:ZzG/gCclEit9w0QUfQt9GURcOycAIGcsQAhY1u0AEX0=
github.com/grafana/alerting v0.0.0-20251212143239-491433b332b7/go.mod h1:l7v67cgP7x72ajB9UPZlumdrHqNztpKoqQ52cU8T3LU=
github.com/grafana/alerting v0.0.0-20251223160021-926c74910196 h1:A9UJtyBBUE7PkRsAITKU05iz+HpHO9SaVjfdo2Df3UQ=
github.com/grafana/alerting v0.0.0-20251223160021-926c74910196/go.mod h1:l7v67cgP7x72ajB9UPZlumdrHqNztpKoqQ52cU8T3LU=
github.com/grafana/authlib v0.0.0-20250930082137-a40e2c2b094f h1:Cbm6OKkOcJ+7CSZsGsEJzktC/SIa5bxVeYKQLuYK86o=
github.com/grafana/authlib v0.0.0-20250930082137-a40e2c2b094f/go.mod h1:axY0cdOg3q0TZHwpHnIz5x16xZ8ZBxJHShsSHHXcHQg=
github.com/grafana/authlib/types v0.0.0-20251119142549-be091cf2f4d4 h1:Muoy+FMGrHj3GdFbvsMzUT7eusgii9PKf9L1ZaXDDbY=
@@ -793,7 +793,15 @@ github.com/go-openapi/loads v0.22.0/go.mod h1:yLsaTCS92mnSAZX5WWoxszLj0u+Ojl+Zs5
github.com/go-openapi/spec v0.21.0/go.mod h1:78u6VdPw81XU44qEWGhtr982gJ5BWg2c0I5XwVMotYk=
github.com/go-openapi/strfmt v0.23.0/go.mod h1:NrtIpfKtWIygRkKVsxh7XQMDQW5HKQl6S5ik2elW+K4=
github.com/go-openapi/swag v0.22.3/go.mod h1:UzaqsxGiab7freDnrUUra0MwWfN/q7tE4j+VcZ0yl14=
github.com/go-openapi/swag v0.23.0/go.mod h1:esZ8ITTYEsH1V2trKHjAN8Ai7xHb8RV+YSZ577vPjgQ=
github.com/go-openapi/swag v0.23.1/go.mod h1:STZs8TbRvEQQKUA+JZNAm3EWlgaOBGpyFDqQnDHMef0=
github.com/go-openapi/swag/conv v0.25.1/go.mod h1:Z1mFEGPfyIKPu0806khI3zF+/EUXde+fdeksUl2NiDs=
github.com/go-openapi/swag/fileutils v0.25.1/go.mod h1:+NXtt5xNZZqmpIpjqcujqojGFek9/w55b3ecmOdtg8M=
github.com/go-openapi/swag/jsonutils v0.25.1/go.mod h1:JpEkAjxQXpiaHmRO04N1zE4qbUEg3b7Udll7AMGTNOo=
github.com/go-openapi/swag/loading v0.25.1/go.mod h1:xoIe2EG32NOYYbqxvXgPzne989bWvSNoWoyQVWEZicc=
github.com/go-openapi/swag/mangling v0.25.1/go.mod h1:CdiMQ6pnfAgyQGSOIYnZkXvqhnnwOn997uXZMAd/7mQ=
github.com/go-openapi/swag/stringutils v0.25.1/go.mod h1:JLdSAq5169HaiDUbTvArA2yQxmgn4D6h4A+4HqVvAYg=
github.com/go-openapi/swag/typeutils v0.25.1/go.mod h1:9McMC/oCdS4BKwk2shEB7x17P6HmMmA6dQRtAkSnNb8=
github.com/go-openapi/swag/yamlutils v0.25.1/go.mod h1:cm9ywbzncy3y6uPm/97ysW8+wZ09qsks+9RS8fLWKqg=
github.com/go-openapi/validate v0.24.0/go.mod h1:iyeX1sEufmv3nPbBdX3ieNviWnOZaJ1+zquzJEf2BAQ=
github.com/go-pdf/fpdf v0.6.0 h1:MlgtGIfsdMEEQJr2le6b/HNr1ZlQwxyWr77r2aj2U/8=
github.com/go-playground/assert/v2 v2.0.1 h1:MsBgLAaY856+nPRTKrp3/OZK38U/wa0CcBYNjji3q3A=
@@ -982,7 +990,6 @@ github.com/grpc-ecosystem/grpc-gateway/v2 v2.16.0/go.mod h1:YN5jB8ie0yfIUg6VvR9K
github.com/grpc-ecosystem/grpc-gateway/v2 v2.19.1/go.mod h1:5SN9VR2LTsRFsrEC6FHgRbTWrTHu6tqPeKxEQv15giM=
github.com/grpc-ecosystem/grpc-gateway/v2 v2.26.3/go.mod h1:ndYquD05frm2vACXE1nsccT4oJzjhw2arTS2cpUD1PI=
github.com/grpc-ecosystem/grpc-gateway/v2 v2.27.1/go.mod h1:Zanoh4+gvIgluNqcfMVTJueD4wSS5hT7zTt4Mrutd90=
github.com/grpc-ecosystem/grpc-gateway/v2 v2.27.2/go.mod h1:pkJQ2tZHJ0aFOVEEot6oZmaVEZcRme73eIFmhiVuRWs=
github.com/grpc-ecosystem/grpc-opentracing v0.0.0-20180507213350-8e809c8a8645 h1:MJG/KsmcqMwFAkh8mTnAwhyKoB+sTAnY4CACC110tbU=
github.com/grpc-ecosystem/grpc-opentracing v0.0.0-20180507213350-8e809c8a8645/go.mod h1:6iZfnjpejD4L/4DwD7NryNaJyCQdzwWwH2MWhCA90Kw=
github.com/hailocab/go-hostpool v0.0.0-20160125115350-e80d13ce29ed h1:5upAirOpQc1Q53c0bnx2ufif5kANL7bfZWcc6VJWJd8=
@@ -1404,7 +1411,6 @@ github.com/richardartoul/molecule v1.0.0/go.mod h1:uvX/8buq8uVeiZiFht+0lqSLBHF+u
github.com/rivo/uniseg v0.4.4/go.mod h1:FN3SvrM+Zdj16jyLfmOkMNblXMcoc8DfTHruCPUcx88=
github.com/rogpeppe/fastuuid v1.2.0 h1:Ppwyp6VYCF1nvBTXL3trRso7mXMlRrw9ooo375wvi2s=
github.com/rogpeppe/go-internal v1.12.0/go.mod h1:E+RYuTGaKKdloAfM02xzb0FW3Paa99yedzYV+kq4uf4=
github.com/rogpeppe/go-internal v1.13.1/go.mod h1:uMEvuHeurkdAXX61udpOXGD/AzZDWNMNyH2VO9fmH0o=
github.com/rs/xid v1.5.0/go.mod h1:trrq9SKmegXys3aeAKXMUTdJsYXVwGY3RLcfgqegfbg=
github.com/russross/blackfriday v1.6.0 h1:KqfZb0pUVN2lYqZUYRddxF4OR8ZMURnJIG5Y3VRLtww=
github.com/russross/blackfriday v1.6.0/go.mod h1:ti0ldHuxg49ri4ksnFxlkCfN+hvslNlmVHqNRXXJNAY=
@@ -1623,7 +1629,6 @@ go.mongodb.org/mongo-driver v1.11.4/go.mod h1:PTSz5yu21bkT/wXpkS7WR5f0ddqw5queth
go.mongodb.org/mongo-driver v1.14.0/go.mod h1:Vzb0Mk/pa7e6cWw85R4F/endUC3u0U9jGcNU603k65c=
go.mongodb.org/mongo-driver v1.17.3/go.mod h1:Hy04i7O2kC4RS06ZrhPRqj/u4DTYkFDAAccj+rVKqgQ=
go.opencensus.io v0.24.0 h1:y73uSU6J157QMP2kn2r30vwW1A2W2WFwSCGnAVxeaD0=
go.opentelemetry.io/auto/sdk v1.1.0/go.mod h1:3wSPjt5PWp2RhlCcmmOial7AvC4DQqZb7a7wCow3W8A=
go.opentelemetry.io/collector v0.121.0/go.mod h1:M4TlnmkjIgishm2DNCk9K3hMKTmAsY9w8cNFsp9EchM=
go.opentelemetry.io/collector v0.124.0/go.mod h1:QzERYfmHUedawjr8Ph/CBEEkVqWS8IlxRLAZt+KHlCg=
go.opentelemetry.io/collector/client v1.29.0/go.mod h1:LCUoEV2KCTKA1i+/txZaGsSPVWUcqeOV6wCfNsAippE=
@@ -1839,6 +1844,7 @@ go.opentelemetry.io/contrib/instrumentation/net/http/otelhttp v0.58.0/go.mod h1:
go.opentelemetry.io/contrib/instrumentation/net/http/otelhttp v0.60.0/go.mod h1:69uWxva0WgAA/4bu2Yy70SLDBwZXuQ6PbBpbsa5iZrQ=
go.opentelemetry.io/contrib/instrumentation/net/http/otelhttp v0.61.0/go.mod h1:UHB22Z8QsdRDrnAtX4PntOl36ajSxcdUMt1sF7Y6E7Q=
go.opentelemetry.io/contrib/instrumentation/net/http/otelhttp v0.62.0/go.mod h1:NfchwuyNoMcZ5MLHwPrODwUF1HWCXWrL31s8gSAdIKY=
go.opentelemetry.io/contrib/instrumentation/net/http/otelhttp v0.63.0/go.mod h1:h06DGIukJOevXaj/xrNjhi/2098RZzcLTbc0jDAUbsg=
go.opentelemetry.io/contrib/otelconf v0.15.0 h1:BLNiIUsrNcqhSKpsa6CnhE6LdrpY1A8X0szMVsu99eo=
go.opentelemetry.io/contrib/otelconf v0.15.0/go.mod h1:OPH1seO5z9dp1P26gnLtoM9ht7JDvh3Ws6XRHuXqImY=
go.opentelemetry.io/contrib/propagators/aws v1.37.0 h1:cp8AFiM/qjBm10C/ATIRnEDXpD5MBknrA0ANw4T2/ss=
@@ -1910,7 +1916,6 @@ go.opentelemetry.io/proto/otlp v1.0.0/go.mod h1:Sy6pihPLfYHkr3NkUbEhGHFhINUSI/v8
go.opentelemetry.io/proto/otlp v1.5.0/go.mod h1:keN8WnHxOy8PG0rQZjJJ5A2ebUoafqWp0eVQ4yIXvJ4=
go.opentelemetry.io/proto/otlp v1.6.0/go.mod h1:cicgGehlFuNdgZkcALOCh3VE6K/u2tAjzlRhDwmVpZc=
go.opentelemetry.io/proto/otlp v1.7.0/go.mod h1:fSKjH6YJ7HDlwzltzyMj036AJ3ejJLCgCSHGj4efDDo=
go.opentelemetry.io/proto/otlp v1.7.1/go.mod h1:b2rVh6rfI/s2pHWNlB7ILJcRALpcNDzKhACevjI+ZnE=
go.uber.org/atomic v1.10.0/go.mod h1:LUxbIzbOniOlMKjJjyPfpl4v+PKK2cNJn91OQbhoJI0=
go.uber.org/automaxprocs v1.6.0 h1:O3y2/QNTOdbF+e/dpXNNW7Rx2hZ4sTIPyybbxyNqTUs=
go.uber.org/automaxprocs v1.6.0/go.mod h1:ifeIMSnPZuznNm6jmdzmU3/bfk01Fe2fotchwEFJ8r8=
@@ -2118,8 +2123,8 @@ google.golang.org/genproto/googleapis/api v0.0.0-20250728155136-f173205681a0/go.
google.golang.org/genproto/googleapis/api v0.0.0-20250804133106-a7a43d27e69b/go.mod h1:oDOGiMSXHL4sDTJvFvIB9nRQCGdLP1o/iVaqQK8zB+M=
google.golang.org/genproto/googleapis/api v0.0.0-20250818200422-3122310a409c/go.mod h1:ea2MjsO70ssTfCjiwHgI0ZFqcw45Ksuk2ckf9G468GA=
google.golang.org/genproto/googleapis/api v0.0.0-20250825161204-c5933d9347a5/go.mod h1:j3QtIyytwqGr1JUDtYXwtMXWPKsEa5LtzIFN1Wn5WvE=
google.golang.org/genproto/googleapis/api v0.0.0-20250908214217-97024824d090/go.mod h1:U8EXRNSd8sUYyDfs/It7KVWodQr+Hf9xtxyxWudSwEw=
google.golang.org/genproto/googleapis/api v0.0.0-20250929231259-57b25ae835d4/go.mod h1:NnuHhy+bxcg30o7FnVAZbXsPHUDQ9qKWAQKCD7VxFtk=
google.golang.org/genproto/googleapis/api v0.0.0-20251111163417-95abcf5c77ba/go.mod h1:G5IanEx8/PgI9w6CFcYQf7jMtHQhZruvfM1i3qOqk5U=
google.golang.org/genproto/googleapis/api v0.0.0-20251202230838-ff82c1b0f217/go.mod h1:+rXWjjaukWZun3mLfjmVnQi18E1AsFbDN9QdJ5YXLto=
google.golang.org/genproto/googleapis/bytestream v0.0.0-20250603155806-513f23925822 h1:zWFRixYR5QlotL+Uv3YfsPRENIrQFXiGs+iwqel6fOQ=
google.golang.org/genproto/googleapis/bytestream v0.0.0-20250603155806-513f23925822/go.mod h1:h6yxum/C2qRb4txaZRLDHK8RyS0H/o2oEDeKY4onY/Y=
@@ -2150,10 +2155,9 @@ google.golang.org/genproto/googleapis/rpc v0.0.0-20250825161204-c5933d9347a5/go.
google.golang.org/genproto/googleapis/rpc v0.0.0-20250826171959-ef028d996bc1/go.mod h1:GmFNa4BdJZ2a8G+wCe9Bg3wwThLrJun751XstdJt5Og=
google.golang.org/genproto/googleapis/rpc v0.0.0-20250908214217-97024824d090/go.mod h1:GmFNa4BdJZ2a8G+wCe9Bg3wwThLrJun751XstdJt5Og=
google.golang.org/genproto/googleapis/rpc v0.0.0-20250929231259-57b25ae835d4/go.mod h1:HSkG/KdJWusxU1F6CNrwNDjBMgisKxGnc5dAZfT0mjQ=
google.golang.org/genproto/googleapis/rpc v0.0.0-20251002232023-7c0ddcbb5797/go.mod h1:HSkG/KdJWusxU1F6CNrwNDjBMgisKxGnc5dAZfT0mjQ=
google.golang.org/genproto/googleapis/rpc v0.0.0-20251014184007-4626949a642f/go.mod h1:7i2o+ce6H/6BluujYR+kqX3GKH+dChPTQU19wjRPiGk=
google.golang.org/genproto/googleapis/rpc v0.0.0-20251022142026-3a174f9686a8/go.mod h1:7i2o+ce6H/6BluujYR+kqX3GKH+dChPTQU19wjRPiGk=
google.golang.org/genproto/googleapis/rpc v0.0.0-20251103181224-f26f9409b101/go.mod h1:7i2o+ce6H/6BluujYR+kqX3GKH+dChPTQU19wjRPiGk=
google.golang.org/genproto/googleapis/rpc v0.0.0-20251111163417-95abcf5c77ba/go.mod h1:7i2o+ce6H/6BluujYR+kqX3GKH+dChPTQU19wjRPiGk=
google.golang.org/genproto/googleapis/rpc v0.0.0-20251124214823-79d6a2a48846/go.mod h1:7i2o+ce6H/6BluujYR+kqX3GKH+dChPTQU19wjRPiGk=
google.golang.org/genproto/googleapis/rpc v0.0.0-20251202230838-ff82c1b0f217/go.mod h1:7i2o+ce6H/6BluujYR+kqX3GKH+dChPTQU19wjRPiGk=
google.golang.org/grpc v1.23.1/go.mod h1:Y5yQAOtifL1yxbo5wqy6BxZv8vAUGQwXBOALyacEbxg=
@@ -2177,7 +2181,6 @@ google.golang.org/grpc v1.73.0/go.mod h1:50sbHOUqWoCQGI8V2HQLJM0B+LMlIUjNSZmow7E
google.golang.org/grpc v1.74.2/go.mod h1:CtQ+BGjaAIXHs/5YS3i473GqwBBa1zGQNevxdeBEXrM=
google.golang.org/grpc v1.75.0/go.mod h1:JtPAzKiq4v1xcAB2hydNlWI2RnF85XXcV0mhKXr2ecQ=
google.golang.org/grpc v1.75.1/go.mod h1:JtPAzKiq4v1xcAB2hydNlWI2RnF85XXcV0mhKXr2ecQ=
google.golang.org/grpc v1.76.0/go.mod h1:Ju12QI8M6iQJtbcsV+awF5a4hfJMLi4X0JLo94ULZ6c=
google.golang.org/grpc/cmd/protoc-gen-go-grpc v1.1.0 h1:M1YKkFIboKNieVO5DLUEVzQfGwJD30Nv2jfUgzb5UcE=
google.golang.org/grpc/examples v0.0.0-20230224211313-3775f633ce20 h1:MLBCGN1O7GzIx+cBiwfYPwtmZ41U3Mn/cotLJciaArI=
google.golang.org/grpc/examples v0.0.0-20230224211313-3775f633ce20/go.mod h1:Nr5H8+MlGWr5+xX/STzdoEqJrO+YteqFbMyCsrb6mH0=
@@ -2299,7 +2302,6 @@ sigs.k8s.io/structured-merge-diff/v4 v4.2.3/go.mod h1:qjx8mGObPmV2aSZepjQjbmb2ih
sigs.k8s.io/structured-merge-diff/v4 v4.5.0 h1:nbCitCK2hfnhyiKo6uf2HxUPTCodY6Qaf85SbDIaMBk=
sigs.k8s.io/structured-merge-diff/v4 v4.5.0/go.mod h1:N8f93tFZh9U6vpxwRArLiikrE5/2tiu1w1AGfACIGE4=
sigs.k8s.io/structured-merge-diff/v6 v6.2.0/go.mod h1:M3W8sfWvn2HhQDIbGWj3S099YozAsymCo/wrT5ohRUE=
sigs.k8s.io/structured-merge-diff/v6 v6.3.0/go.mod h1:M3W8sfWvn2HhQDIbGWj3S099YozAsymCo/wrT5ohRUE=
sigs.k8s.io/yaml v1.3.0/go.mod h1:GeOyir5tyXNByN85N/dRIT9es5UQNerPYEKK56eTBm8=
sigs.k8s.io/yaml v1.4.0/go.mod h1:Ejl7/uTz7PSA4eKMyQCUTnhZYNmLIl+5c2lQPGR2BPY=
sigs.k8s.io/yaml v1.5.0/go.mod h1:wZs27Rbxoai4C0f8/9urLZtZtF3avA3gKvGyPdDqTO4=
@@ -243,6 +243,7 @@ const injectedRtkApi = api
type: queryArg['type'],
folder: queryArg.folder,
facet: queryArg.facet,
facetLimit: queryArg.facetLimit,
tags: queryArg.tags,
libraryPanel: queryArg.libraryPanel,
permission: queryArg.permission,
@@ -663,6 +664,8 @@ export type SearchDashboardsAndFoldersApiArg = {
folder?: string;
/** count distinct terms for selected fields */
facet?: string[];
/** maximum number of terms to return per facet (default 50, max 1000) */
facetLimit?: number;
/** tag query filter */
tags?: string[];
/** find dashboards that reference a given libraryPanel */
@@ -3,11 +3,18 @@ import { merge } from 'lodash';
import { toDataFrame } from '../dataframe/processDataFrame';
import { createTheme } from '../themes/createTheme';
import { ReducerID } from '../transformations/fieldReducer';
import { FieldType } from '../types/dataFrame';
import { FieldConfigPropertyItem } from '../types/fieldOverrides';
import { MappingType, SpecialValueMatch, ValueMapping } from '../types/valueMapping';
import { getDisplayProcessor } from './displayProcessor';
import { fixCellTemplateExpressions, getFieldDisplayValues, GetFieldDisplayValuesOptions } from './fieldDisplay';
import {
FieldSparkline,
fixCellTemplateExpressions,
getFieldDisplayValues,
GetFieldDisplayValuesOptions,
getSparklineHighlight,
} from './fieldDisplay';
import { standardFieldConfigEditorRegistry } from './standardFieldConfigEditorRegistry';
describe('FieldDisplay', () => {
@@ -556,3 +563,71 @@ describe('fixCellTemplateExpressions', () => {
);
});
});
describe('getSparklineHighlight', () => {
const sparkline: FieldSparkline = {
y: { name: 'A', type: FieldType.number, values: [null, 2, 3, 4, 10, 8, 8, 8, 9, null], config: {} },
};
it.each([
{
calc: ReducerID.last,
expected: {
type: 'point',
xIdx: 9,
},
},
{
calc: ReducerID.max,
expected: {
type: 'point',
xIdx: 4,
},
},
{
calc: ReducerID.min,
expected: {
type: 'point',
xIdx: 1,
},
},
{
calc: ReducerID.first,
expected: {
type: 'point',
xIdx: 0,
},
},
{
calc: ReducerID.firstNotNull,
expected: {
type: 'point',
xIdx: 1,
},
},
{
calc: ReducerID.lastNotNull,
expected: {
type: 'point',
xIdx: 8,
},
},
{
calc: ReducerID.mean,
expected: {
type: 'line',
y: 6.5,
},
},
{
calc: ReducerID.median,
expected: {
type: 'line',
y: 8,
},
},
  ])('calculates the correct highlight for $calc', ({ calc, expected }) => {
const result = getSparklineHighlight(sparkline, calc);
expect(result).toEqual(expected);
});
});
@@ -3,7 +3,7 @@ import { isEmpty } from 'lodash';
import { DataFrameView } from '../dataframe/DataFrameView';
import { getTimeField } from '../dataframe/processDataFrame';
import { GrafanaTheme2 } from '../themes/types';
import { reduceField, ReducerID } from '../transformations/fieldReducer';
import { isReducerID, reduceField, ReducerID } from '../transformations/fieldReducer';
import { getFieldMatcher } from '../transformations/matchers';
import { FieldMatcherID } from '../transformations/matchers/ids';
import { ScopedVars } from '../types/ScopedVars';
@@ -43,6 +43,7 @@ export interface FieldSparkline {
x?: Field; // if this does not exist, use the index
timeRange?: TimeRange; // Optionally force an absolute time
highlightIndex?: number;
highlightLine?: number;
}
export interface FieldDisplay {
@@ -72,6 +73,76 @@ export interface GetFieldDisplayValuesOptions {
export const DEFAULT_FIELD_DISPLAY_VALUES_LIMIT = 25;
interface SparklineHighlightPoint {
type: 'point';
xIdx: number;
}
interface SparklineHighlightLine {
type: 'line';
y: number;
}
export function getSparklineHighlight(
sparkline: FieldSparkline,
calc: ReducerID
): SparklineHighlightPoint | SparklineHighlightLine | undefined {
switch (calc) {
case ReducerID.last:
return { type: 'point', xIdx: sparkline.y.values.length - 1 };
case ReducerID.first:
return { type: 'point', xIdx: 0 };
case ReducerID.lastNotNull: {
for (let k = sparkline.y.values.length - 1; k >= 0; k--) {
const v = sparkline.y.values[k];
if (v !== null && v !== undefined && !Number.isNaN(v)) {
return { type: 'point', xIdx: k };
}
}
return;
}
case ReducerID.firstNotNull: {
for (let k = 0; k < sparkline.y.values.length; k++) {
const v = sparkline.y.values[k];
if (v !== null && v !== undefined && !Number.isNaN(v)) {
return { type: 'point', xIdx: k };
}
}
return;
}
case ReducerID.min: {
let minIdx = -1;
let prevMin = Infinity;
for (let k = 0; k < sparkline.y.values.length; k++) {
const v = sparkline.y.values[k];
if (v !== null && v !== undefined && !Number.isNaN(v) && v < prevMin) {
prevMin = v;
minIdx = k;
}
}
return minIdx >= 0 ? { type: 'point', xIdx: minIdx } : undefined;
}
case ReducerID.max: {
let maxIdx = -1;
let prevMax = -Infinity;
for (let k = 0; k < sparkline.y.values.length; k++) {
const v = sparkline.y.values[k];
if (v !== null && v !== undefined && !Number.isNaN(v) && v > prevMax) {
prevMax = v;
maxIdx = k;
}
}
return maxIdx >= 0 ? { type: 'point', xIdx: maxIdx } : undefined;
}
case ReducerID.mean:
return { type: 'line', y: reduceField({ field: sparkline.y, reducers: [ReducerID.mean] }).mean };
case ReducerID.median:
return { type: 'line', y: reduceField({ field: sparkline.y, reducers: [ReducerID.median] }).median };
default:
return;
}
}
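The new `getSparklineHighlight` helper resolves a reducer to either a highlighted point index or a horizontal reference line. A simplified standalone sketch of that point-vs-line split, using plain arrays instead of Grafana's `FieldSparkline`/`ReducerID` types (all names here are illustrative, not from the diff):

```typescript
// Positional reducers (min/max) highlight a point; aggregate reducers
// (mean) highlight a horizontal line, matching the helper's two shapes.
type Highlight = { type: 'point'; xIdx: number } | { type: 'line'; y: number };

function highlightFor(values: Array<number | null>, calc: 'min' | 'max' | 'mean'): Highlight | undefined {
  // Keep only defined numeric samples, remembering their original index
  const defined = values
    .map((v, i) => ({ v, i }))
    .filter((p): p is { v: number; i: number } => p.v !== null && !Number.isNaN(p.v));
  if (defined.length === 0) {
    return undefined;
  }
  switch (calc) {
    case 'min': {
      const best = defined.reduce((a, b) => (b.v < a.v ? b : a));
      return { type: 'point', xIdx: best.i };
    }
    case 'max': {
      const best = defined.reduce((a, b) => (b.v > a.v ? b : a));
      return { type: 'point', xIdx: best.i };
    }
    case 'mean': {
      const sum = defined.reduce((acc, p) => acc + p.v, 0);
      return { type: 'line', y: sum / defined.length };
    }
  }
}
```

For `[null, 2, 10, 1]`, `min` highlights index 3 while `mean` produces a line — the same split the caller below uses to set `highlightIndex` vs `highlightLine`.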
export const getFieldDisplayValues = (options: GetFieldDisplayValuesOptions): FieldDisplay[] => {
const { replaceVariables, reduceOptions, timeZone, theme } = options;
const calcs = reduceOptions.calcs.length ? reduceOptions.calcs : [ReducerID.last];
@@ -190,62 +261,16 @@ export const getFieldDisplayValues = (options: GetFieldDisplayValuesOptions): Fi
y: dataFrame.fields[i],
x: timeField,
};
let highlightIdx: number | undefined = (() => {
switch (calc) {
case ReducerID.last:
return sparkline.y.values.length - 1;
case ReducerID.first:
return 0;
// TODO: #112977 enable more reducers for highlight index
// case ReducerID.lastNotNull: {
// for (let k = sparkline.y.values.length - 1; k >= 0; k--) {
// const v = sparkline.y.values[k];
// if (v !== null && v !== undefined && !Number.isNaN(v)) {
// return k;
// }
// }
// return;
// }
// case ReducerID.firstNotNull: {
// for (let k = 0; k < sparkline.y.values.length; k++) {
// const v = sparkline.y.values[k];
// if (v !== null && v !== undefined && !Number.isNaN(v)) {
// return k;
// }
// }
// return;
// }
// case ReducerID.min: {
// let minIdx = -1;
// let prevMin = Infinity;
// for (let k = 0; k < sparkline.y.values.length; k++) {
// const v = sparkline.y.values[k];
// if (v !== null && v !== undefined && !Number.isNaN(v) && v < prevMin) {
// prevMin = v;
// minIdx = k;
// }
// }
// return minIdx >= 0 ? minIdx : undefined;
// }
// case ReducerID.max: {
// let maxIdx = -1;
// let prevMax = -Infinity;
// for (let k = 0; k < sparkline.y.values.length; k++) {
// const v = sparkline.y.values[k];
// if (v !== null && v !== undefined && !Number.isNaN(v) && v > prevMax) {
// prevMax = v;
// maxIdx = k;
// }
// }
// return maxIdx >= 0 ? maxIdx : undefined;
// }
default:
return;
if (isReducerID(calc)) {
const sparklineHighlight = getSparklineHighlight(sparkline, calc);
switch (sparklineHighlight?.type) {
case 'point':
sparkline.highlightIndex = sparklineHighlight.xIdx;
break;
case 'line':
sparkline.highlightLine = sparklineHighlight.y;
break;
}
})();
if (typeof highlightIdx === 'number') {
sparkline.highlightIndex = highlightIdx;
}
}
@@ -131,6 +131,7 @@ export function Drawer({
>
<FocusScope restoreFocus contain autoFocus>
<div
data-grafana-portal-container
aria-label={
typeof title === 'string'
? selectors.components.Drawer.General.title(title)
@@ -67,7 +67,7 @@ export const RadialSparkline = memo(
return (
<div style={{ position: 'absolute', top: topPos }}>
<Sparkline height={height} width={width} sparkline={sparkline} theme={theme} config={config} />
<Sparkline height={height} width={width} sparkline={sparkline} theme={theme} config={config} showHighlights />
</div>
);
}
@@ -14,18 +14,18 @@ export interface SparklineProps extends Themeable2 {
height: number;
config?: FieldConfig<GraphFieldConfig>;
sparkline: FieldSparkline;
showHighlights?: boolean;
}
const SparklineFn: React.FC<SparklineProps> = memo((props) => {
const { sparkline, config: fieldConfig, theme, width, height } = props;
const { frame: alignedDataFrame, warning } = prepareSeries(sparkline, fieldConfig);
export const SparklineFn: React.FC<SparklineProps> = memo((props) => {
const { sparkline, config: fieldConfig, theme, width, height, showHighlights } = props;
const { frame: alignedDataFrame, warning } = prepareSeries(sparkline, theme, fieldConfig, showHighlights);
if (warning) {
return null;
}
const data = preparePlotData2(alignedDataFrame, getStackingGroups(alignedDataFrame));
const configBuilder = prepareConfig(sparkline, alignedDataFrame, theme);
const configBuilder = prepareConfig(sparkline, alignedDataFrame, theme, showHighlights);
return <UPlotChart data={data} config={configBuilder} width={width} height={height} />;
});
@@ -1,6 +1,6 @@
import { Field, FieldSparkline, FieldType } from '@grafana/data';
import { createTheme, Field, FieldSparkline, FieldType, toDataFrame } from '@grafana/data';
import { getYRange, preparePlotFrame } from './utils';
import { getYRange, prepareConfig, preparePlotFrame } from './utils';
describe('Prepare Sparkline plot frame', () => {
it('should return sorted array if x-axis numeric', () => {
@@ -201,3 +201,134 @@ describe('Get y range', () => {
expect(actual[0]).toBeLessThan(actual[1]!);
});
});
describe('prepareConfig', () => {
it('should not throw an error if there are multiple values', () => {
const sparkline: FieldSparkline = {
x: {
name: 'x',
values: [1679839200000, 1680444000000, 1681048800000, 1681653600000, 1682258400000],
type: FieldType.time,
config: {},
},
y: {
name: 'y',
values: [1, 2, 3, 4, 5],
type: FieldType.number,
config: {},
},
};
const dataFrame = toDataFrame({
fields: [sparkline.x, sparkline.y],
});
const config = prepareConfig(sparkline, dataFrame, createTheme());
expect(config.series.length).toBe(1);
});
it('should not throw an error if there is a single value', () => {
const sparkline: FieldSparkline = {
x: {
name: 'x',
values: [1679839200000],
type: FieldType.time,
config: {},
},
y: {
name: 'y',
values: [1],
type: FieldType.number,
config: {},
},
};
const dataFrame = toDataFrame({
fields: [sparkline.x, sparkline.y],
});
const config = prepareConfig(sparkline, dataFrame, createTheme());
expect(config.series.length).toBe(1);
});
it('should not throw an error if there are no values', () => {
const sparkline: FieldSparkline = {
x: {
name: 'x',
values: [],
type: FieldType.time,
config: {},
},
y: {
name: 'y',
values: [],
type: FieldType.number,
config: {},
},
};
const dataFrame = toDataFrame({
fields: [sparkline.x, sparkline.y],
});
const config = prepareConfig(sparkline, dataFrame, createTheme());
expect(config.series.length).toBe(1);
});
it('should set up highlight series if showHighlights is true and highlightIdx exists', () => {
const sparkline: FieldSparkline = {
x: {
name: 'x',
values: [1679839200000, 1680444000000, 1681048800000, 1681653600000, 1682258400000],
type: FieldType.time,
config: {},
},
y: {
name: 'y',
values: [1, 2, 3, 4, 5],
type: FieldType.number,
config: {},
},
highlightIndex: 2,
};
const dataFrame = toDataFrame({
fields: [sparkline.x, sparkline.y],
});
const config = prepareConfig(sparkline, dataFrame, createTheme(), true);
expect(config.series.length).toBe(1);
expect(config.series[0].getConfig().points).toEqual(
expect.objectContaining({
show: true,
filter: [2],
})
);
});
it('should not set up highlight series if showHighlights is false even if highlightIdx exists', () => {
const sparkline: FieldSparkline = {
x: {
name: 'x',
values: [1679839200000, 1680444000000, 1681048800000, 1681653600000, 1682258400000],
type: FieldType.time,
config: {},
},
y: {
name: 'y',
values: [1, 2, 3, 4, 5],
type: FieldType.number,
config: {},
},
highlightIndex: 2,
};
const dataFrame = toDataFrame({
fields: [sparkline.x, sparkline.y],
});
const config = prepareConfig(sparkline, dataFrame, createTheme(), false);
expect(config.series.length).toBe(1);
expect(config.series[0].getConfig().points?.show).not.toBe(true);
});
});
@@ -2,6 +2,7 @@ import { Range } from 'uplot';
import {
applyNullInsertThreshold,
// colorManipulator,
DataFrame,
FieldConfig,
FieldSparkline,
@@ -22,6 +23,7 @@ import {
VisibilityMode,
ScaleDirection,
ScaleOrientation,
// FieldColorModeId,
} from '@grafana/schema';
import { UPlotConfigBuilder } from '../uPlot/config/UPlotConfigBuilder';
@@ -112,8 +114,7 @@ export function getYRange(alignedFrame: DataFrame): Range.MinMax {
return [roundedMin, roundedMax];
}
// TODO: #112977 enable highlight index
// const HIGHLIGHT_IDX_POINT_SIZE = 6;
const HIGHLIGHT_IDX_POINT_SIZE = 6;
const defaultConfig: GraphFieldConfig = {
drawStyle: GraphDrawStyle.Line,
@@ -124,7 +125,9 @@ const defaultConfig: GraphFieldConfig = {
export const prepareSeries = (
sparkline: FieldSparkline,
fieldConfig?: FieldConfig<GraphFieldConfig>
_theme: GrafanaTheme2,
fieldConfig?: FieldConfig<GraphFieldConfig>,
_showHighlights?: boolean
): { frame: DataFrame; warning?: string } => {
const frame = nullToValue(preparePlotFrame(sparkline, fieldConfig));
if (frame.fields.some((f) => f.values.length <= 1)) {
@@ -136,16 +139,41 @@ export const prepareSeries = (
frame,
};
}
// TODO: will address this.
// if (showHighlights && typeof sparkline.highlightLine === 'number') {
// const highlightY = sparkline.highlightLine;
// const colorMode = getFieldColorModeForField(sparkline.y);
// const seriesColor = colorMode.getCalculator(sparkline.y, theme)(highlightY, 0);
// frame.fields.push({
// name: 'highlightLine',
// type: FieldType.number,
// values: new Array(frame.length).fill(highlightY),
// config: {
// color: {
// mode: FieldColorModeId.Fixed,
// fixedColor: colorManipulator.lighten(seriesColor, 0.5),
// },
// custom: {
// lineStyle: {
// fill: 'dash',
// dash: [5, 2],
// },
// },
// },
// state: {},
// });
// }
return { frame };
};
export const prepareConfig = (
sparkline: FieldSparkline,
dataFrame: DataFrame,
theme: GrafanaTheme2
theme: GrafanaTheme2,
showHighlights?: boolean
): UPlotConfigBuilder => {
const builder = new UPlotConfigBuilder();
// const rangePad = HIGHLIGHT_IDX_POINT_SIZE / 2;
const rangePad = HIGHLIGHT_IDX_POINT_SIZE / 2;
builder.setCursor({
show: false,
@@ -206,13 +234,14 @@ export const prepareConfig = (
const colorMode = getFieldColorModeForField(field);
const seriesColor = colorMode.getCalculator(field, theme)(0, 0);
// TODO: #112977 enable highlight index and adjust padding accordingly
// const hasHighlightIndex = typeof sparkline.highlightIndex === 'number';
// if (hasHighlightIndex) {
// builder.setPadding([rangePad, rangePad, rangePad, rangePad]);
// }
const hasHighlightIndex = showHighlights && typeof sparkline.highlightIndex === 'number';
if (hasHighlightIndex) {
builder.setPadding([rangePad, rangePad, rangePad, rangePad]);
}
const pointsMode =
customConfig.drawStyle === GraphDrawStyle.Points // || hasHighlightIndex
customConfig.drawStyle === GraphDrawStyle.Points || hasHighlightIndex
? VisibilityMode.Always
: customConfig.showPoints;
@@ -227,9 +256,8 @@ export const prepareConfig = (
lineWidth: customConfig.lineWidth,
lineInterpolation: customConfig.lineInterpolation,
showPoints: pointsMode,
// TODO: #112977 enable highlight index
pointSize: /* hasHighlightIndex ? HIGHLIGHT_IDX_POINT_SIZE : */ customConfig.pointSize,
// pointsFilter: hasHighlightIndex ? [sparkline.highlightIndex!] : undefined,
pointSize: hasHighlightIndex ? HIGHLIGHT_IDX_POINT_SIZE : customConfig.pointSize,
pointsFilter: hasHighlightIndex ? [sparkline.highlightIndex!] : undefined,
fillOpacity: customConfig.fillOpacity,
fillColor: customConfig.fillColor,
lineStyle: customConfig.lineStyle,
@@ -76,4 +76,24 @@ return (
);
```
### Usage inside Drawer
Toggletip automatically detects when it's inside a Drawer (or other focus-trapped container with the `data-grafana-portal-container` attribute) and adjusts its behavior accordingly. No additional configuration is needed:
```tsx
<Drawer title="Settings" onClose={onClose}>
<Toggletip content={<Input placeholder="Type here..." />}>
<Button>Open Toggletip</Button>
</Toggletip>
</Drawer>
```
When auto-detected inside a focus-trapped container:
- The Toggletip content renders inside the Drawer's DOM tree
- Focus management defers to the parent container's focus trap
- Interactive elements like inputs work correctly
If you need to override auto-detection or specify a custom container, use the `portalRoot` prop.
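For instance, a minimal sketch of a `portalRoot` override (the `my-portal-root` element id here is illustrative, not part of the API):

```tsx
// Illustrative only: render the Toggletip into an explicit container
// instead of relying on auto-detection.
const customRoot = document.getElementById('my-portal-root') ?? undefined;

<Toggletip content={<Input placeholder="Type here..." />} portalRoot={customRoot}>
  <Button>Open Toggletip</Button>
</Toggletip>;
```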
<ArgTypes of={Toggletip} />
@@ -1,6 +1,10 @@
import { Meta, StoryFn } from '@storybook/react';
import { useState } from 'react';
import { Button } from '../Button/Button';
import { Drawer } from '../Drawer/Drawer';
import { Field } from '../Forms/Field';
import { Input } from '../Input/Input';
import { ScrollContainer } from '../ScrollContainer/ScrollContainer';
import mdx from '../Toggletip/Toggletip.mdx';
@@ -133,4 +137,46 @@ LongContent.parameters = {
},
};
export const InsideDrawer: StoryFn<typeof Toggletip> = () => {
const [isDrawerOpen, setIsDrawerOpen] = useState(false);
return (
<>
<Button onClick={() => setIsDrawerOpen(true)}>Open Drawer</Button>
{isDrawerOpen && (
<Drawer title="Drawer with Toggletip" onClose={() => setIsDrawerOpen(false)}>
<p style={{ marginBottom: '16px' }}>
Toggletip automatically detects when it&apos;s inside a Drawer and renders its content within the
Drawer&apos;s DOM, allowing focus to work correctly. No manual configuration needed!
</p>
<Toggletip
title="Interactive Form"
content={
<div style={{ display: 'flex', flexDirection: 'column', gap: '8px' }}>
<Field label="Name">
<Input placeholder="Enter your name" />
</Field>
<Button variant="primary" size="sm">
Submit
</Button>
</div>
}
footer="Focus works correctly - auto-detected!"
placement="bottom-start"
>
<Button>Click to show Toggletip</Button>
</Toggletip>
</Drawer>
)}
</>
);
};
InsideDrawer.parameters = {
controls: {
hideNoControlsWarning: true,
exclude: ['title', 'content', 'footer', 'children', 'placement', 'theme', 'closeButton', 'portalRoot'],
},
};
export default meta;
@@ -11,7 +11,7 @@ import {
useInteractions,
} from '@floating-ui/react';
import { Placement } from '@popperjs/core';
import { memo, cloneElement, isValidElement, useRef, useState, type JSX } from 'react';
import { memo, cloneElement, isValidElement, useRef, useState, useMemo, type JSX } from 'react';
import { GrafanaTheme2 } from '@grafana/data';
import { t } from '@grafana/i18n';
@@ -47,6 +47,11 @@ export interface ToggletipProps {
show?: boolean;
/** Callback function to be called when the toggletip is opened */
onOpen?: () => void;
/** Optional root element for the portal. When Toggletip is inside a focus-trapped container like Drawer,
* the portal root is auto-detected via the `data-grafana-portal-container` attribute. Use this prop
* to override auto-detection or specify a custom container. When inside a focus-trapped container,
* the Toggletip disables its own modal focus trap, deferring focus management to the parent. */
portalRoot?: HTMLElement;
}
/**
@@ -67,6 +72,7 @@ export const Toggletip = memo(
fitContent = false,
onOpen,
show,
portalRoot,
}: ToggletipProps) => {
const arrowRef = useRef(null);
const grafanaTheme = useTheme2();
@@ -110,16 +116,30 @@ export const Toggletip = memo(
const { getReferenceProps, getFloatingProps } = useInteractions([dismiss, click]);
// Auto-detect portal container from reference element's ancestors
// This allows Toggletip to work automatically inside Drawer and other focus-trapped containers
const [referenceElement, setReferenceElement] = useState<Element | null>(null);
const autoDetectedPortalRoot = useMemo(() => {
if (portalRoot) {
return portalRoot;
}
const container = referenceElement?.closest('[data-grafana-portal-container]');
return container instanceof HTMLElement ? container : undefined;
}, [portalRoot, referenceElement]);
return (
<>
{cloneElement(children, {
ref: refs.setReference,
ref: (node: Element | null) => {
refs.setReference(node);
setReferenceElement(node);
},
tabIndex: 0,
'aria-expanded': isOpen,
...getReferenceProps(),
})}
{isOpen && (
<Portal>
<Portal root={autoDetectedPortalRoot}>
<FloatingFocusManager context={context} modal={true}>
<div
data-testid="toggletip-content"
@@ -10,7 +10,6 @@ import (
"github.com/grafana/grafana/pkg/api/response"
contextmodel "github.com/grafana/grafana/pkg/services/contexthandler/model"
"github.com/grafana/grafana/pkg/services/ngalert/models"
)
func (hs *HTTPServer) GetAlertNotifiers() func(*contextmodel.ReqContext) response.Response {
@@ -24,13 +23,13 @@ func (hs *HTTPServer) GetAlertNotifiers() func(*contextmodel.ReqContext) respons
}
type NotifierPlugin struct {
Type string `json:"type"`
TypeAlias string `json:"typeAlias,omitempty"`
Name string `json:"name"`
Heading string `json:"heading"`
Description string `json:"description"`
Info string `json:"info"`
Options []Field `json:"options"`
Type string `json:"type"`
TypeAlias string `json:"typeAlias,omitempty"`
Name string `json:"name"`
Heading string `json:"heading"`
Description string `json:"description"`
Info string `json:"info"`
Options []schema.Field `json:"options"`
}
result := make([]*NotifierPlugin, 0, len(v2))
@@ -45,56 +44,9 @@ func (hs *HTTPServer) GetAlertNotifiers() func(*contextmodel.ReqContext) respons
Description: s.Description,
Heading: s.Heading,
Info: s.Info,
Options: schemaFieldsToFields(s.Type, nil, v1.Options),
Options: v1.Options,
})
}
return response.JSON(http.StatusOK, result)
}
}
type Field struct {
Element schema.ElementType `json:"element"`
InputType schema.InputType `json:"inputType"`
Label string `json:"label"`
Description string `json:"description"`
Placeholder string `json:"placeholder"`
PropertyName string `json:"propertyName"`
SelectOptions []schema.SelectOption `json:"selectOptions"`
ShowWhen schema.ShowWhen `json:"showWhen"`
Required bool `json:"required"`
Protected bool `json:"protected,omitempty"`
ValidationRule string `json:"validationRule"`
Secure bool `json:"secure"`
DependsOn string `json:"dependsOn"`
SubformOptions []Field `json:"subformOptions"`
}
func schemaFieldsToFields(iType schema.IntegrationType, parent schema.IntegrationFieldPath, fields []schema.Field) []Field {
if fields == nil {
return nil
}
result := make([]Field, 0, len(fields))
for _, f := range fields {
result = append(result, schemaFieldToField(iType, parent, f))
}
return result
}
func schemaFieldToField(iType schema.IntegrationType, parent schema.IntegrationFieldPath, f schema.Field) Field {
return Field{
Element: f.Element,
InputType: f.InputType,
Label: f.Label,
Description: f.Description,
Placeholder: f.Placeholder,
PropertyName: f.PropertyName,
SelectOptions: f.SelectOptions,
ShowWhen: f.ShowWhen,
Required: f.Required,
ValidationRule: f.ValidationRule,
Secure: f.Secure,
DependsOn: f.DependsOn,
SubformOptions: schemaFieldsToFields(iType, append(parent, f.PropertyName), f.SubformOptions),
Protected: models.IsProtectedField(iType, append(parent, f.PropertyName)),
}
}
@@ -0,0 +1,88 @@
package auditing
import (
"encoding/json"
"time"
)
type Event struct {
// The namespace the action was performed in.
Namespace string `json:"namespace"`
// When it happened.
ObservedAt time.Time `json:"-"` // see MarshalJSON for why this is omitted
// Who/what performed the action.
SubjectName string `json:"subjectName"`
SubjectUID string `json:"subjectUID"`
// What was performed.
Verb string `json:"verb"`
// The object the action was performed on. For verbs like "list" this will be empty.
Object string `json:"object,omitempty"`
// API information.
APIGroup string `json:"apiGroup,omitempty"`
APIVersion string `json:"apiVersion,omitempty"`
Kind string `json:"kind,omitempty"`
// Outcome of the action.
Outcome EventOutcome `json:"outcome"`
// Extra fields to add more context to the event.
Extra map[string]string `json:"extra,omitempty"`
}
func (e Event) Time() time.Time {
return e.ObservedAt
}
func (e Event) MarshalJSON() ([]byte, error) {
type Alias Event
return json.Marshal(&struct {
FormattedTimestamp string `json:"observedAt"`
Alias
}{
FormattedTimestamp: e.ObservedAt.UTC().Format(time.RFC3339Nano),
Alias: (Alias)(e),
})
}
func (e Event) KVPairs() []any {
args := []any{
"audit", true,
"namespace", e.Namespace,
"observedAt", e.ObservedAt.UTC().Format(time.RFC3339Nano),
"subjectName", e.SubjectName,
"subjectUID", e.SubjectUID,
"verb", e.Verb,
"object", e.Object,
"apiGroup", e.APIGroup,
"apiVersion", e.APIVersion,
"kind", e.Kind,
"outcome", e.Outcome,
}
if len(e.Extra) > 0 {
extraArgs := make([]any, 0, len(e.Extra)*2)
for k, v := range e.Extra {
extraArgs = append(extraArgs, "extra_"+k, v)
}
args = append(args, extraArgs...)
}
return args
}
type EventOutcome string
const (
EventOutcomeUnknown EventOutcome = "unknown"
EventOutcomeSuccess EventOutcome = "success"
EventOutcomeFailureUnauthorized EventOutcome = "failure_unauthorized"
EventOutcomeFailureNotFound EventOutcome = "failure_not_found"
EventOutcomeFailureGeneric EventOutcome = "failure_generic"
)
@@ -0,0 +1,64 @@
package auditing_test
import (
"encoding/json"
"strconv"
"strings"
"testing"
"time"
"github.com/grafana/grafana/pkg/apiserver/auditing"
"github.com/stretchr/testify/require"
)
func TestEvent_MarshalJSON(t *testing.T) {
t.Parallel()
t.Run("marshals the event", func(t *testing.T) {
t.Parallel()
now := time.Now()
event := auditing.Event{
ObservedAt: now,
Extra: map[string]string{"k1": "v1", "k2": "v2"},
}
data, err := json.Marshal(event)
require.NoError(t, err)
var result map[string]any
require.NoError(t, json.Unmarshal(data, &result))
require.Equal(t, event.Time().UTC().Format(time.RFC3339Nano), result["observedAt"])
require.NotNil(t, result["extra"])
require.Len(t, result["extra"], 2)
})
}
func TestEvent_KVPairs(t *testing.T) {
t.Parallel()
t.Run("records extra fields", func(t *testing.T) {
t.Parallel()
extraFields := 2
extra := make(map[string]string, 0)
for i := 0; i < extraFields; i++ {
extra[strconv.Itoa(i)] = "value"
}
event := auditing.Event{Extra: extra}
kvPairs := event.KVPairs()
extraCount := 0
for i := 0; i < len(kvPairs); i += 2 {
if strings.HasPrefix(kvPairs[i].(string), "extra_") {
extraCount++
}
}
require.Equal(t, extraCount, extraFields)
})
}
@@ -115,6 +115,15 @@ func (s *SearchHandler) GetAPIRoutes(defs map[string]common.OpenAPIDefinition) *
Schema: spec.ArrayProperty(spec.StringProperty()),
},
},
{
ParameterProps: spec3.ParameterProps{
Name: "facetLimit",
In: "query",
Description: "maximum number of terms to return per facet (default 50, max 1000)",
Required: false,
Schema: spec.Int64Property(),
},
},
{
ParameterProps: spec3.ParameterProps{
Name: "tags",
@@ -340,6 +349,7 @@ func (s *SearchHandler) DoSearch(w http.ResponseWriter, r *http.Request) {
func convertHttpSearchRequestToResourceSearchRequest(queryParams url.Values, user identity.Requester, getDashboardsUIDsSharedWithUser func() ([]string, error)) (*resourcepb.ResourceSearchRequest, error) {
// get limit and offset from query params
limit := 50
facetLimit := 50
offset := 0
page := 1
if queryParams.Has("limit") {
@@ -422,11 +432,19 @@ func convertHttpSearchRequestToResourceSearchRequest(queryParams url.Values, use
// The facet term fields
if facets, ok := queryParams["facet"]; ok {
if queryParams.Has("facetLimit") {
if parsed, err := strconv.Atoi(queryParams.Get("facetLimit")); err == nil && parsed > 0 {
facetLimit = parsed
if facetLimit > 1000 {
facetLimit = 1000
}
}
}
searchRequest.Facet = make(map[string]*resourcepb.ResourceSearchRequest_Facet)
for _, v := range facets {
searchRequest.Facet[v] = &resourcepb.ResourceSearchRequest_Facet{
Field: v,
Limit: 50,
Limit: int64(facetLimit),
}
}
}
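The `facetLimit` handling above (fall back to a default of 50 on invalid or non-positive input, cap at 1000) can be sketched as a standalone helper; the function name here is illustrative, not from the patch:

```go
package main

import (
	"fmt"
	"strconv"
)

// clampFacetLimit mirrors the query-param handling sketched above:
// invalid or non-positive values fall back to the default of 50,
// and anything above 1000 is capped at 1000.
func clampFacetLimit(raw string) int {
	limit := 50
	if parsed, err := strconv.Atoi(raw); err == nil && parsed > 0 {
		limit = parsed
		if limit > 1000 {
			limit = 1000
		}
	}
	return limit
}

func main() {
	for _, v := range []string{"500", "5000", "0", "abc"} {
		fmt.Printf("%q -> %d\n", v, clampFacetLimit(v))
	}
}
```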
@@ -818,6 +818,38 @@ func TestConvertHttpSearchRequestToResourceSearchRequest(t *testing.T) {
Federated: []*resourcepb.ResourceKey{folderKey},
},
},
"facet fields with custom limit": {
queryString: "facet=tags&facetLimit=500",
expected: &resourcepb.ResourceSearchRequest{
Options: &resourcepb.ListOptions{Key: dashboardKey},
Query: "",
Limit: 50,
Offset: 0,
Page: 1,
Explain: false,
Fields: defaultFields,
Facet: map[string]*resourcepb.ResourceSearchRequest_Facet{
"tags": {Field: "tags", Limit: 500},
},
Federated: []*resourcepb.ResourceKey{folderKey},
},
},
"facet fields with limit exceeding max": {
queryString: "facet=tags&facetLimit=5000",
expected: &resourcepb.ResourceSearchRequest{
Options: &resourcepb.ListOptions{Key: dashboardKey},
Query: "",
Limit: 50,
Offset: 0,
Page: 1,
Explain: false,
Fields: defaultFields,
Facet: map[string]*resourcepb.ResourceSearchRequest_Facet{
"tags": {Field: "tags", Limit: 1000},
},
Federated: []*resourcepb.ResourceKey{folderKey},
},
},
"tag filter": {
queryString: "tag=tag1&tag=tag2",
expected: &resourcepb.ResourceSearchRequest{
@@ -2,6 +2,7 @@ package annotation
import (
"context"
"errors"
"fmt"
"strconv"
"strings"
@@ -11,7 +12,6 @@ import (
"k8s.io/apimachinery/pkg/runtime"
"k8s.io/apimachinery/pkg/runtime/schema"
"k8s.io/apimachinery/pkg/selection"
"k8s.io/apiserver/pkg/authorization/authorizer"
"k8s.io/apiserver/pkg/endpoints/request"
"k8s.io/apiserver/pkg/registry/rest"
restclient "k8s.io/client-go/rest"
@@ -82,10 +82,6 @@ func RegisterAppInstaller(
return installer, nil
}
func (a *AnnotationAppInstaller) GetAuthorizer() authorizer.Authorizer {
return annotationapp.GetAuthorizer()
}
func (a *AnnotationAppInstaller) GetLegacyStorage(requested schema.GroupVersionResource) apiserverrest.Storage {
kind := annotationV0.AnnotationKind()
gvr := schema.GroupVersionResource{
@@ -182,25 +178,39 @@ func (s *legacyStorage) List(ctx context.Context, options *internalversion.ListO
return nil, fmt.Errorf("unsupported operator %s for spec.panelID (only = supported)", r.Operator)
}
case "spec.time":
if r.Operator == selection.Equals || r.Operator == selection.DoubleEquals {
switch r.Operator {
case selection.GreaterThan:
from, err := strconv.ParseInt(r.Value, 10, 64)
if err != nil {
return nil, fmt.Errorf("invalid from value %q: %w", r.Value, err)
return nil, fmt.Errorf("invalid time value %q: %w", r.Value, err)
}
opts.From = from
} else {
return nil, fmt.Errorf("unsupported operator %s for spec.from (only = supported)", r.Operator)
case selection.LessThan:
to, err := strconv.ParseInt(r.Value, 10, 64)
if err != nil {
return nil, fmt.Errorf("invalid time value %q: %w", r.Value, err)
}
opts.To = to
default:
return nil, fmt.Errorf("unsupported operator %s for spec.time (only >, < supported for ranges)", r.Operator)
}
case "spec.timeEnd":
if r.Operator == selection.Equals || r.Operator == selection.DoubleEquals {
switch r.Operator {
case selection.GreaterThan:
from, err := strconv.ParseInt(r.Value, 10, 64)
if err != nil {
return nil, fmt.Errorf("invalid timeEnd value %q: %w", r.Value, err)
}
opts.From = from
case selection.LessThan:
to, err := strconv.ParseInt(r.Value, 10, 64)
if err != nil {
return nil, fmt.Errorf("invalid to value %q: %w", r.Value, err)
return nil, fmt.Errorf("invalid timeEnd value %q: %w", r.Value, err)
}
opts.To = to
} else {
return nil, fmt.Errorf("unsupported operator %s for spec.to (only = supported)", r.Operator)
default:
return nil, fmt.Errorf("unsupported operator %s for spec.timeEnd (only >, < supported for ranges)", r.Operator)
}
default:
@@ -247,32 +257,7 @@ func (s *legacyStorage) Update(ctx context.Context,
forceAllowCreate bool,
options *metav1.UpdateOptions,
) (runtime.Object, bool, error) {
namespace := request.NamespaceValue(ctx)
obj, err := objInfo.UpdatedObject(ctx, nil)
if err != nil {
return nil, false, err
}
resource, ok := obj.(*annotationV0.Annotation)
if !ok {
return nil, false, fmt.Errorf("expected annotation")
}
if resource.Name != name {
return nil, false, fmt.Errorf("name in URL does not match name in body")
}
if resource.Namespace != namespace {
return nil, false, fmt.Errorf("namespace in URL does not match namespace in body")
}
updated, err := s.store.Update(ctx, resource)
if err != nil {
return nil, false, err
}
return updated, false, nil
return nil, false, errors.New("not implemented")
}
func (s *legacyStorage) Delete(ctx context.Context, name string, deleteValidation rest.ValidateObjectFunc, options *metav1.DeleteOptions) (runtime.Object, bool, error) {
@@ -161,7 +161,7 @@ func (api *API) RegisterAPIEndpoints(m *metrics.API) {
authz: ruleAuthzService,
evaluator: api.EvaluatorFactory,
cfg: &api.Cfg.UnifiedAlerting,
backtesting: backtesting.NewEngine(api.AppUrl, api.EvaluatorFactory, api.Tracer),
backtesting: backtesting.NewEngine(api.AppUrl, api.EvaluatorFactory, api.Tracer, api.Cfg.UnifiedAlerting, api.FeatureManager),
featureManager: api.FeatureManager,
appUrl: api.AppUrl,
tracer: api.Tracer,
@@ -2369,6 +2369,140 @@ func TestRouteGetRuleStatuses(t *testing.T) {
}
})
t.Run("multi-page pagination loads provenance correctly", func(t *testing.T) {
fakeStore, fakeAIM, api, fakeProvisioning := setupAPIFull(t)
// Create 3 groups with 1 rule each: groups 1 and 3 firing, group 2 normal
for i := 1; i <= 3; i++ {
rule := gen.With(gen.WithOrgID(orgID), func(r *ngmodels.AlertRule) {
r.NamespaceUID = "ns-1"
r.RuleGroup = fmt.Sprintf("group-%d", i)
r.UID = fmt.Sprintf("rule-%d", i)
}, withClassicConditionSingleQuery()).GenerateRef()
alertState := eval.Normal
if i != 2 {
alertState = eval.Alerting
}
fakeAIM.GenerateAlertInstances(orgID, rule.UID, 1, func(s *state.State) *state.State {
s.State = alertState
s.Labels = data.Labels{"test": "label"}
return s
})
fakeStore.PutRule(context.Background(), rule)
}
// Set provenance for all rules
err := fakeProvisioning.SetProvenance(context.Background(),
&ngmodels.AlertRule{UID: "rule-1", OrgID: orgID}, orgID, ngmodels.ProvenanceAPI)
require.NoError(t, err)
err = fakeProvisioning.SetProvenance(context.Background(),
&ngmodels.AlertRule{UID: "rule-3", OrgID: orgID}, orgID, ngmodels.ProvenanceFile)
require.NoError(t, err)
// Request firing groups with group_limit=2 - fetches multiple pages, skipping group 2
req, err := http.NewRequest("GET", "/api/v1/rules?state=firing&group_limit=2", nil)
require.NoError(t, err)
c := &contextmodel.ReqContext{
Context: &web.Context{Req: req},
SignedInUser: &user.SignedInUser{
OrgID: orgID,
Permissions: queryPermissions,
},
}
resp := api.RouteGetRuleStatuses(c)
require.Equal(t, http.StatusOK, resp.Status())
var res apimodels.RuleResponse
require.NoError(t, json.Unmarshal(resp.Body(), &res))
// Should return 2 firing groups
require.Len(t, res.Data.RuleGroups, 2)
require.Equal(t, "group-1", res.Data.RuleGroups[0].Name)
require.Equal(t, apimodels.Provenance(ngmodels.ProvenanceAPI), res.Data.RuleGroups[0].Rules[0].Provenance)
require.Equal(t, "group-3", res.Data.RuleGroups[1].Name)
require.Equal(t, apimodels.Provenance(ngmodels.ProvenanceFile), res.Data.RuleGroups[1].Rules[0].Provenance)
})
t.Run("provenance fetch error returns error response in paginated mode", func(t *testing.T) {
fakeStore, fakeAIM, api, fakeProvisioning := setupAPIFull(t)
rule := gen.With(gen.WithOrgID(orgID), func(r *ngmodels.AlertRule) {
r.NamespaceUID = "ns-1"
r.RuleGroup = "group-1"
r.UID = "rule-1"
}, withClassicConditionSingleQuery()).GenerateRef()
fakeAIM.GenerateAlertInstances(orgID, rule.UID, 1, func(s *state.State) *state.State {
s.State = eval.Alerting
s.Labels = data.Labels{"test": "label"}
return s
})
fakeStore.PutRule(context.Background(), rule)
fakeProvisioning.GetProvenancesByUIDsFunc = func(ctx context.Context, orgID int64, resourceType string, uids []string) (map[string]ngmodels.Provenance, error) {
return nil, errors.New("database connection failed")
}
req, err := http.NewRequest("GET", "/api/v1/rules?group_limit=10", nil)
require.NoError(t, err)
c := &contextmodel.ReqContext{
Context: &web.Context{Req: req},
SignedInUser: &user.SignedInUser{
OrgID: orgID,
Permissions: queryPermissions,
},
}
resp := api.RouteGetRuleStatuses(c)
require.Equal(t, http.StatusInternalServerError, resp.Status())
var res apimodels.RuleResponse
require.NoError(t, json.Unmarshal(resp.Body(), &res))
require.Equal(t, "error", res.Status)
require.Contains(t, res.Error, "failed to load provenance")
})
t.Run("provenance fetch error returns error response in non-paginated mode", func(t *testing.T) {
fakeStore, fakeAIM, api, fakeProvisioning := setupAPIFull(t)
rule := gen.With(gen.WithOrgID(orgID), func(r *ngmodels.AlertRule) {
r.NamespaceUID = "ns-1"
r.RuleGroup = "group-1"
r.UID = "rule-1"
}, withClassicConditionSingleQuery()).GenerateRef()
fakeAIM.GenerateAlertInstances(orgID, rule.UID, 1, func(s *state.State) *state.State {
s.State = eval.Alerting
s.Labels = data.Labels{"test": "label"}
return s
})
fakeStore.PutRule(context.Background(), rule)
fakeProvisioning.GetProvenancesFunc = func(ctx context.Context, orgID int64, resourceType string) (map[string]ngmodels.Provenance, error) {
return nil, errors.New("database connection failed")
}
req, err := http.NewRequest("GET", "/api/v1/rules", nil)
require.NoError(t, err)
c := &contextmodel.ReqContext{
Context: &web.Context{Req: req},
SignedInUser: &user.SignedInUser{
OrgID: orgID,
Permissions: queryPermissions,
},
}
resp := api.RouteGetRuleStatuses(c)
require.Equal(t, http.StatusInternalServerError, resp.Status())
var res apimodels.RuleResponse
require.NoError(t, json.Unmarshal(resp.Body(), &res))
require.Equal(t, "error", res.Status)
require.Contains(t, res.Error, "failed to load provenance")
})
t.Run("state filter continues when first page has no matches", func(t *testing.T) {
fakeStore, fakeAIM, api := setupAPI(t)
@@ -493,6 +493,7 @@ func TestValidateRuleNode_NoUID(t *testing.T) {
r.GrafanaManagedAlert.NoDataState = apimodels.OK
r.GrafanaManagedAlert.ExecErrState = apimodels.AlertingErrState
r.GrafanaManagedAlert.NotificationSettings = &apimodels.AlertRuleNotificationSettings{}
r.GrafanaManagedAlert.MissingSeriesEvalsToResolve = util.Pointer[int64](1)
r.For = func() *model.Duration { five := model.Duration(time.Second * 5); return &five }()
r.KeepFiringFor = func() *model.Duration { five := model.Duration(time.Second * 5); return &five }()
return &r
@@ -502,6 +503,7 @@ func TestValidateRuleNode_NoUID(t *testing.T) {
require.Empty(t, alert.NoDataState)
require.Empty(t, alert.ExecErrState)
require.Nil(t, alert.NotificationSettings)
require.Nil(t, alert.MissingSeriesEvalsToResolve)
require.Zero(t, alert.For)
require.Zero(t, alert.KeepFiringFor)
},
@@ -34,7 +34,6 @@ import (
"github.com/grafana/grafana/pkg/services/ngalert/state"
"github.com/grafana/grafana/pkg/services/ngalert/store"
"github.com/grafana/grafana/pkg/setting"
"github.com/grafana/grafana/pkg/util"
)
type folderService interface {
@@ -230,54 +229,27 @@ func (srv TestingApiSrv) BacktestAlertRule(c *contextmodel.ReqContext, cmd apimo
return ErrResp(http.StatusNotFound, nil, "Backtesting API is not enabled")
}
if cmd.From.After(cmd.To) {
return ErrResp(400, nil, "From cannot be greater than To")
}
noDataState, err := ngmodels.NoDataStateFromString(string(cmd.NoDataState))
rule, err := apivalidation.ValidateBacktestConfig(c.GetOrgID(), cmd, apivalidation.RuleLimitsFromConfig(srv.cfg, srv.featureManager))
if err != nil {
return ErrResp(400, err, "")
}
forInterval := time.Duration(cmd.For)
if forInterval < 0 {
return ErrResp(400, nil, "Bad For interval")
return ErrResp(http.StatusBadRequest, err, "")
}
intervalSeconds, err := apivalidation.ValidateInterval(time.Duration(cmd.Interval), srv.cfg.BaseInterval)
if err != nil {
return ErrResp(400, err, "")
}
queries := AlertQueriesFromApiAlertQueries(cmd.Data)
if err := srv.authz.AuthorizeDatasourceAccessForRule(c.Req.Context(), c.SignedInUser, &ngmodels.AlertRule{Data: queries}); err != nil {
if err := srv.authz.AuthorizeDatasourceAccessForRule(c.Req.Context(), c.SignedInUser, rule); err != nil {
return errorToResponse(err)
}
rule := &ngmodels.AlertRule{
// ID: 0,
// Updated: time.Time{},
// Version: 0,
// NamespaceUID: "",
// DashboardUID: nil,
// PanelID: nil,
// RuleGroup: "",
// RuleGroupIndex: 0,
// ExecErrState: "",
Title: cmd.Title,
// prefix backtesting- is to distinguish between executions of regular rule and backtesting in logs (like expression engine, evaluator, state manager etc)
UID: "backtesting-" + util.GenerateShortUID(),
OrgID: c.GetOrgID(),
Condition: cmd.Condition,
Data: queries,
IntervalSeconds: intervalSeconds,
NoDataState: noDataState,
For: forInterval,
Annotations: cmd.Annotations,
Labels: cmd.Labels,
// Fetch folder path for alert labels, fallback to "Backtesting" if not available
var folderTitle string
if cmd.NamespaceUID != "" {
f, err := srv.folderService.GetNamespaceByUID(c.Req.Context(), cmd.NamespaceUID, c.OrgID, c.SignedInUser)
if err != nil {
srv.log.FromContext(c.Req.Context()).Warn("Failed to fetch folder path for alert labels", "error", err)
} else {
folderTitle = f.Fullpath
}
}
result, err := srv.backtesting.Test(c.Req.Context(), c.SignedInUser, rule, cmd.From, cmd.To)
result, err := srv.backtesting.Test(c.Req.Context(), c.SignedInUser, rule, cmd.From, cmd.To, folderTitle)
if err != nil {
if errors.Is(err, backtesting.ErrInvalidInputData) {
return ErrResp(400, err, "Failed to evaluate")
@@ -285,9 +257,5 @@ func (srv TestingApiSrv) BacktestAlertRule(c *contextmodel.ReqContext, cmd apimo
return ErrResp(500, err, "Failed to evaluate")
}
body, err := data.FrameToJSON(result, data.IncludeAll)
if err != nil {
return ErrResp(500, err, "Failed to convert frame to JSON")
}
return response.JSON(http.StatusOK, body)
return response.JSONStreaming(http.StatusOK, result)
}
@@ -81,9 +81,15 @@ func (api *API) authorize(method, path string) web.Handler {
// additional authorization is done in the request handler
eval = ac.EvalPermission(ac.ActionAlertingRuleRead)
// Grafana Rules Testing Paths
case http.MethodPost + "/api/v1/rule/backtest":
case http.MethodPost + "/api/v1/rule/backtest": // TODO (yuri) this should be protected by dedicated permission
// additional authorization is done in the request handler
eval = ac.EvalPermission(ac.ActionAlertingRuleRead)
eval = ac.EvalAll(
ac.EvalPermission(ac.ActionAlertingRuleRead),
ac.EvalAny(
ac.EvalPermission(ac.ActionAlertingRuleUpdate),
ac.EvalPermission(ac.ActionAlertingRuleCreate),
),
)
case http.MethodPost + "/api/v1/eval":
// additional authorization is done in the request handler
eval = ac.EvalPermission(ac.ActionAlertingRuleRead)
@@ -189,42 +189,11 @@ func AlertRuleExportFromAlertRule(rule models.AlertRule) (definitions.AlertRuleE
data = append(data, query)
}
cPtr := &rule.Condition
if rule.Condition == "" {
cPtr = nil
}
noDataState := definitions.NoDataState(rule.NoDataState)
ndsPtr := &noDataState
if noDataState == "" {
ndsPtr = nil
}
execErrorState := definitions.ExecutionErrorState(rule.ExecErrState)
eesPtr := &execErrorState
if execErrorState == "" {
eesPtr = nil
}
result := definitions.AlertRuleExport{
UID: rule.UID,
Title: rule.Title,
For: model.Duration(rule.For),
KeepFiringFor: model.Duration(rule.KeepFiringFor),
Condition: cPtr,
Data: data,
DashboardUID: rule.DashboardUID,
PanelID: rule.PanelID,
NoDataState: ndsPtr,
ExecErrState: eesPtr,
IsPaused: rule.IsPaused,
NotificationSettings: AlertRuleNotificationSettingsExportFromNotificationSettings(rule.NotificationSettings),
Record: AlertRuleRecordExportFromRecord(rule.Record),
}
if rule.For.Seconds() > 0 {
result.ForString = util.Pointer(model.Duration(rule.For).String())
}
if rule.KeepFiringFor.Seconds() > 0 {
result.KeepFiringForString = util.Pointer(model.Duration(rule.KeepFiringFor).String())
UID: rule.UID,
Title: rule.Title,
Data: data,
IsPaused: rule.IsPaused,
}
if rule.Annotations != nil {
result.Annotations = &rule.Annotations
@@ -232,13 +201,54 @@ func AlertRuleExportFromAlertRule(rule models.AlertRule) (definitions.AlertRuleE
if rule.Labels != nil {
result.Labels = &rule.Labels
}
if rule.MissingSeriesEvalsToResolve != nil && *rule.MissingSeriesEvalsToResolve != -1 {
result.MissingSeriesEvalsToResolve = rule.MissingSeriesEvalsToResolve
if rule.Type() == models.RuleTypeRecording {
populateRecordingRuleExportFields(rule, &result)
} else {
populateAlertingRuleExportFields(rule, &result)
}
return result, nil
}
func populateRecordingRuleExportFields(rule models.AlertRule, result *definitions.AlertRuleExport) {
result.Record = AlertRuleRecordExportFromRecord(rule.Record)
}
func populateAlertingRuleExportFields(rule models.AlertRule, result *definitions.AlertRuleExport) {
result.DashboardUID = rule.DashboardUID
result.PanelID = rule.PanelID
result.NotificationSettings = AlertRuleNotificationSettingsExportFromNotificationSettings(rule.NotificationSettings)
if rule.Condition != "" {
result.Condition = &rule.Condition
}
if rule.NoDataState != "" {
noDataState := definitions.NoDataState(rule.NoDataState)
result.NoDataState = &noDataState
}
if rule.ExecErrState != "" {
execErrorState := definitions.ExecutionErrorState(rule.ExecErrState)
result.ExecErrState = &execErrorState
}
result.For = model.Duration(rule.For)
if rule.For > 0 {
result.ForString = util.Pointer(model.Duration(rule.For).String())
}
result.KeepFiringFor = model.Duration(rule.KeepFiringFor)
if rule.KeepFiringFor > 0 {
result.KeepFiringForString = util.Pointer(model.Duration(rule.KeepFiringFor).String())
}
if rule.MissingSeriesEvalsToResolve != nil && *rule.MissingSeriesEvalsToResolve != -1 {
result.MissingSeriesEvalsToResolve = rule.MissingSeriesEvalsToResolve
}
}
func encodeQueryModel(m map[string]any) (string, error) {
var buf bytes.Buffer
enc := json.NewEncoder(&buf)
@@ -9,6 +9,7 @@ import (
"github.com/grafana/grafana/pkg/services/ngalert/api/tooling/definitions"
"github.com/grafana/grafana/pkg/services/ngalert/models"
"github.com/grafana/grafana/pkg/util"
)
func TestToModel(t *testing.T) {
@@ -115,6 +116,102 @@ func TestToModel(t *testing.T) {
})
}
func TestAlertRuleExportFromAlertRule(t *testing.T) {
alertingRule := models.RuleGen.With(
models.RuleGen.WithNotEmptyLabels(2, "lbl-"),
models.RuleGen.WithAnnotations(map[string]string{"ann-key": "ann-value"}),
models.RuleGen.WithFor(2*time.Minute),
models.RuleGen.WithKeepFiringFor(5*time.Minute),
models.RuleGen.WithNotificationSettingsGen(models.NotificationSettingsGen()),
).Generate()
recordingRule := models.RuleGen.With(
models.RuleGen.WithAllRecordingRules(),
models.RuleGen.WithNotEmptyLabels(2, "lbl-"),
models.RuleGen.WithAnnotations(map[string]string{"ann-key": "ann-value"}),
).Generate()
// Build expected exported recording rule
recordingRuleData, err := AlertQueryExportFromAlertQuery(recordingRule.Data[0])
require.NoError(t, err)
expectedRecordingRuleExport := definitions.AlertRuleExport{
UID: recordingRule.UID,
Title: recordingRule.Title,
Data: []definitions.AlertQueryExport{recordingRuleData},
Annotations: &recordingRule.Annotations,
Labels: &recordingRule.Labels,
Record: &definitions.AlertRuleRecordExport{
Metric: recordingRule.Record.Metric,
From: recordingRule.Record.From,
TargetDatasourceUID: util.Pointer(recordingRule.Record.TargetDatasourceUID),
},
}
// Build expected exported alerting rule
alertingRuleData, err := AlertQueryExportFromAlertQuery(alertingRule.Data[0])
require.NoError(t, err)
noDataState := definitions.NoDataState(alertingRule.NoDataState)
execErrState := definitions.ExecutionErrorState(alertingRule.ExecErrState)
expectedAlertingRuleExport := definitions.AlertRuleExport{
UID: alertingRule.UID,
Title: alertingRule.Title,
Condition: &alertingRule.Condition,
Data: []definitions.AlertQueryExport{alertingRuleData},
DashboardUID: alertingRule.DashboardUID,
PanelID: alertingRule.PanelID,
NoDataState: &noDataState,
ExecErrState: &execErrState,
For: prommodel.Duration(alertingRule.For),
KeepFiringFor: prommodel.Duration(alertingRule.KeepFiringFor),
ForString: util.Pointer(prommodel.Duration(alertingRule.For).String()),
KeepFiringForString: util.Pointer(prommodel.Duration(alertingRule.KeepFiringFor).String()),
Annotations: &alertingRule.Annotations,
Labels: &alertingRule.Labels,
NotificationSettings: AlertRuleNotificationSettingsExportFromNotificationSettings(alertingRule.NotificationSettings),
MissingSeriesEvalsToResolve: alertingRule.MissingSeriesEvalsToResolve,
}
testCases := []struct {
name string
rule models.AlertRule
expected definitions.AlertRuleExport
}{
{
name: "export recording rule",
rule: recordingRule,
expected: expectedRecordingRuleExport,
},
{
name: "export alerting rule",
rule: alertingRule,
expected: expectedAlertingRuleExport,
},
}
for _, tc := range testCases {
t.Run(tc.name, func(t *testing.T) {
exported, err := AlertRuleExportFromAlertRule(tc.rule)
require.NoError(t, err)
require.Equal(t, tc.expected, exported)
})
}
}
func TestAlertQueryExportFromAlertQuery(t *testing.T) {
query := models.RuleGen.GenerateQuery()
exported, err := AlertQueryExportFromAlertQuery(query)
require.NoError(t, err)
require.Equal(t, query.RefID, exported.RefID)
require.Equal(t, query.DatasourceUID, exported.DatasourceUID)
require.Equal(t, int64(time.Duration(query.RelativeTimeRange.From).Seconds()), exported.RelativeTimeRange.FromSeconds)
require.Equal(t, int64(time.Duration(query.RelativeTimeRange.To).Seconds()), exported.RelativeTimeRange.ToSeconds)
require.NotNil(t, exported.QueryType)
require.Equal(t, query.QueryType, *exported.QueryType)
require.NotNil(t, exported.Model)
require.NotEmpty(t, exported.ModelString)
}
func TestAlertRuleMetadataFromModelMetadata(t *testing.T) {
t.Run("should convert model metadata to api metadata", func(t *testing.T) {
modelMetadata := models.AlertRuleMetadata{
@@ -54,6 +54,7 @@ type StatusReader interface {
type ProvenanceStore interface {
GetProvenances(ctx context.Context, org int64, resourceType string) (map[string]ngmodels.Provenance, error)
GetProvenancesByUIDs(ctx context.Context, org int64, resourceType string, uids []string) (map[string]ngmodels.Provenance, error)
}
type PrometheusSrv struct {
@@ -328,14 +329,6 @@ func (srv PrometheusSrv) RouteGetRuleStatuses(c *contextmodel.ReqContext) respon
span.AddEvent("User permissions checked")
span.SetAttributes(attribute.Int("allowedNamespaces", len(allowedNamespaces)))
provenanceRecords, err := srv.provenanceStore.GetProvenances(c.Req.Context(), c.GetOrgID(), (&ngmodels.AlertRule{}).ResourceType())
if err != nil {
ruleResponse.Status = "error"
ruleResponse.Error = fmt.Sprintf("failed to get provenances visible to the user: %s", err.Error())
ruleResponse.ErrorType = apiv1.ErrServer
return response.JSON(ruleResponse.HTTPStatusCode(), ruleResponse)
}
ruleResponse = PrepareRuleGroupStatusesV2(
srv.log,
srv.store,
@@ -347,7 +340,7 @@ func (srv PrometheusSrv) RouteGetRuleStatuses(c *contextmodel.ReqContext) respon
},
RuleStatusMutatorGenerator(srv.status),
RuleAlertStateMutatorGenerator(srv.manager),
provenanceRecords,
srv.provenanceStore,
)
return response.JSON(ruleResponse.HTTPStatusCode(), ruleResponse)
@@ -454,6 +447,7 @@ func RuleAlertStateMutatorGenerator(manager state.AlertInstanceManager) RuleAler
type paginationContext struct {
opts RuleGroupStatusesOptions
provenanceRecords map[string]ngmodels.Provenance
provenanceStore ProvenanceStore
ruleStatusMutator RuleStatusMutator
alertStateMutator RuleAlertStateMutator
@@ -532,6 +526,37 @@ func (ctx *paginationContext) fetchAndFilterPage(log log.Logger, store ListAlert
)
span.AddEvent("Alert rules retrieved from store")
// Load provenance for this page's rules
if ctx.provenanceStore != nil {
maxGroups := getInt64WithDefault(ctx.opts.Query, "group_limit", -1)
maxRules := getInt64WithDefault(ctx.opts.Query, "rule_limit", -1)
if maxGroups > 0 || maxRules > 0 {
// Paginated, fetch and merge provenances for this page
uids := make([]string, 0, len(ruleList))
for _, rule := range ruleList {
uids = append(uids, rule.UID)
}
pageProvenances, err := ctx.provenanceStore.GetProvenancesByUIDs(ctx.opts.Ctx, ctx.opts.OrgID, (&ngmodels.AlertRule{}).ResourceType(), uids)
if err != nil {
return pageResult{}, fmt.Errorf("failed to load provenance: %w", err)
}
if ctx.provenanceRecords == nil {
ctx.provenanceRecords = pageProvenances
} else {
maps.Copy(ctx.provenanceRecords, pageProvenances)
}
} else if ctx.provenanceRecords == nil {
// Not paginated, fetch all once
var err error
ctx.provenanceRecords, err = ctx.provenanceStore.GetProvenances(ctx.opts.Ctx, ctx.opts.OrgID, (&ngmodels.AlertRule{}).ResourceType())
if err != nil {
return pageResult{}, fmt.Errorf("failed to load provenance: %w", err)
}
}
}
span.AddEvent("Provenances retrieved from store")
groupedRules := getGroupedRules(log, ruleList, ctx.ruleNamesSet, ctx.opts.AllowedNamespaces)
result := pageResult{
@@ -643,7 +668,7 @@ func paginateRuleGroups(log log.Logger, store ListAlertRulesStoreV2, ctx *pagina
return allGroups, rulesTotals, continueToken, nil
}
func PrepareRuleGroupStatusesV2(log log.Logger, store ListAlertRulesStoreV2, opts RuleGroupStatusesOptions, ruleStatusMutator RuleStatusMutator, alertStateMutator RuleAlertStateMutator, provenanceRecords map[string]ngmodels.Provenance) apimodels.RuleResponse {
func PrepareRuleGroupStatusesV2(log log.Logger, store ListAlertRulesStoreV2, opts RuleGroupStatusesOptions, ruleStatusMutator RuleStatusMutator, alertStateMutator RuleAlertStateMutator, provenanceStore ProvenanceStore) apimodels.RuleResponse {
ctx, span := tracer.Start(opts.Ctx, "api.prometheus.PrepareRuleGroupStatusesV2")
defer span.End()
opts.Ctx = ctx
@@ -835,7 +860,8 @@ func PrepareRuleGroupStatusesV2(log log.Logger, store ListAlertRulesStoreV2, opt
span.SetAttributes(attribute.Bool("compact", compact))
pagCtx := &paginationContext{
opts: opts,
provenanceRecords: provenanceRecords,
provenanceRecords: nil,
provenanceStore: provenanceStore,
ruleStatusMutator: ruleStatusMutator,
alertStateMutator: alertStateMutator,
namespaceUIDs: namespaceUIDs,
@@ -221,15 +221,21 @@ type BacktestConfig struct {
To time.Time `json:"to"`
Interval model.Duration `json:"interval,omitempty"`
Condition string `json:"condition"`
Data []AlertQuery `json:"data"`
For model.Duration `json:"for,omitempty"`
Condition string `json:"condition"`
Data []AlertQuery `json:"data"`
For *model.Duration `json:"for,omitempty"`
KeepFiringFor *model.Duration `json:"keep_firing_for,omitempty"`
Title string `json:"title"`
Labels map[string]string `json:"labels,omitempty"`
Annotations map[string]string `json:"annotations,omitempty"`
Title string `json:"title"`
Labels map[string]string `json:"labels,omitempty"`
NoDataState NoDataState `json:"no_data_state"`
ExecErrState ExecutionErrorState `json:"exec_err_state"`
MissingSeriesEvalsToResolve *int64 `json:"missing_series_evals_to_resolve,omitempty"`
UID string `json:"uid,omitempty"`
RuleGroup string `json:"rule_group,omitempty"`
NamespaceUID string `json:"namespace_uid,omitempty"`
}
// swagger:model
@@ -193,6 +193,7 @@ func validateRecordingRuleFields(in *apimodels.PostableExtendedRuleNode, newRule
newRule.For = 0
newRule.KeepFiringFor = 0
newRule.NotificationSettings = nil
newRule.MissingSeriesEvalsToResolve = nil
return newRule, nil
}
@@ -248,6 +249,21 @@ func ValidateCondition(condition string, queries []apimodels.AlertQuery, canPatc
return nil
}
func validateGroupInterval(incoming prommodels.Duration, limits RuleLimits) (time.Duration, error) {
interval := time.Duration(incoming)
if interval == 0 {
// if group interval is 0 (undefined) then we automatically fall back to the default interval
interval = limits.DefaultRuleEvaluationInterval
}
if interval < 0 || int64(interval.Seconds())%int64(limits.BaseInterval.Seconds()) != 0 {
return 0, fmt.Errorf("rule evaluation interval (%d seconds) must be a positive multiple of the base interval (%d seconds)", int64(interval.Seconds()), int64(limits.BaseInterval.Seconds()))
}
// TODO should we validate that interval is >= cfg.MinInterval? Currently, we allow saving but fix the specified interval if it is < cfg.MinInterval
return interval, nil
}
func ValidateInterval(interval, baseInterval time.Duration) (int64, error) {
intervalSeconds := int64(interval.Seconds())
@@ -335,18 +351,11 @@ func ValidateRuleGroup(
return nil, fmt.Errorf("rule group name is too long. Max length is %d", store.AlertRuleMaxRuleGroupNameLength)
}
interval := time.Duration(ruleGroupConfig.Interval)
if interval == 0 {
// if group interval is 0 (undefined) then we automatically fall back to the default interval
interval = limits.DefaultRuleEvaluationInterval
interval, err := validateGroupInterval(ruleGroupConfig.Interval, limits)
if err != nil {
return nil, err
}
if interval < 0 || int64(interval.Seconds())%int64(limits.BaseInterval.Seconds()) != 0 {
return nil, fmt.Errorf("rule evaluation interval (%d seconds) must be a positive multiple of the base interval (%d seconds)", int64(interval.Seconds()), int64(limits.BaseInterval.Seconds()))
}
// TODO should we validate that interval is >= cfg.MinInterval? Currently, we allow saving but fix the specified interval if it is < cfg.MinInterval
// If the rule group is reserved for no-group rules, we cannot have multiple rules in it.
if isNoGroupRuleGroup && len(ruleGroupConfig.Rules) > 1 {
return nil, fmt.Errorf("rule group %s is reserved for no-group rules and cannot be used for rule groups with multiple rules", ruleGroupConfig.Name)
@@ -409,3 +418,32 @@ func ValidateNotificationSettings(n *apimodels.AlertRuleNotificationSettings) ([
s,
}, nil
}
func ValidateBacktestConfig(orgId int64, config apimodels.BacktestConfig, limits RuleLimits) (*ngmodels.AlertRule, error) {
if config.From.After(config.To) {
return nil, fmt.Errorf("invalid testing range: from %s must be before to %s", config.From, config.To)
}
interval, err := validateGroupInterval(config.Interval, limits)
if err != nil {
return nil, err
}
return ValidateRuleNode(&apimodels.PostableExtendedRuleNode{
ApiRuleNode: &apimodels.ApiRuleNode{
For: config.For,
KeepFiringFor: config.KeepFiringFor,
Labels: config.Labels,
Annotations: nil,
},
GrafanaManagedAlert: &apimodels.PostableGrafanaRule{
Title: config.Title,
Condition: config.Condition,
Data: config.Data,
UID: config.UID,
NoDataState: config.NoDataState,
ExecErrState: config.ExecErrState,
MissingSeriesEvalsToResolve: config.MissingSeriesEvalsToResolve,
},
}, config.RuleGroup, interval, orgId, config.NamespaceUID, limits)
}
@@ -15,10 +15,16 @@ import (
"github.com/grafana/grafana/pkg/apimachinery/identity"
"github.com/grafana/grafana/pkg/infra/log"
"github.com/grafana/grafana/pkg/infra/tracing"
"github.com/grafana/grafana/pkg/services/featuremgmt"
"github.com/grafana/grafana/pkg/services/ngalert/eval"
"github.com/grafana/grafana/pkg/services/ngalert/models"
"github.com/grafana/grafana/pkg/services/ngalert/schedule"
"github.com/grafana/grafana/pkg/services/ngalert/schedule/ticker"
"github.com/grafana/grafana/pkg/services/ngalert/state"
"github.com/grafana/grafana/pkg/services/ngalert/state/historian"
history_model "github.com/grafana/grafana/pkg/services/ngalert/state/historian/model"
"github.com/grafana/grafana/pkg/setting"
"github.com/grafana/grafana/pkg/util"
)
var (
@@ -28,7 +34,7 @@ var (
backtestingEvaluatorFactory = newBacktestingEvaluator
)
type callbackFunc = func(evaluationIndex int, now time.Time, results eval.Results) error
type callbackFunc = func(evaluationIndex int, now time.Time, results eval.Results) (bool, error)
type backtestingEvaluator interface {
Eval(ctx context.Context, from time.Time, interval time.Duration, evaluations int, callback callbackFunc) error
@@ -40,11 +46,17 @@ type stateManager interface {
}
type Engine struct {
evalFactory eval.EvaluatorFactory
createStateManager func() stateManager
evalFactory eval.EvaluatorFactory
createStateManager func() stateManager
disableGrafanaFolder bool
featureToggles featuremgmt.FeatureToggles
minInterval time.Duration
baseInterval time.Duration
jitterStrategy schedule.JitterStrategy
maxEvaluations int
}
func NewEngine(appUrl *url.URL, evalFactory eval.EvaluatorFactory, tracer tracing.Tracer) *Engine {
func NewEngine(appUrl *url.URL, evalFactory eval.EvaluatorFactory, tracer tracing.Tracer, cfg setting.UnifiedAlertingSettings, toggles featuremgmt.FeatureToggles) *Engine {
return &Engine{
evalFactory: evalFactory,
createStateManager: func() stateManager {
@@ -60,74 +72,139 @@ func NewEngine(appUrl *url.URL, evalFactory eval.EvaluatorFactory, tracer tracin
}
return state.NewManager(cfg, state.NewNoopPersister())
},
disableGrafanaFolder: false,
featureToggles: toggles,
minInterval: cfg.MinInterval,
baseInterval: cfg.BaseInterval,
maxEvaluations: cfg.BacktestingMaxEvaluations,
jitterStrategy: schedule.JitterStrategyFrom(cfg, toggles),
}
}
func (e *Engine) Test(ctx context.Context, user identity.Requester, rule *models.AlertRule, from, to time.Time) (*data.Frame, error) {
ruleCtx := models.WithRuleKey(ctx, rule.GetKey())
logger := logger.FromContext(ctx)
func (e *Engine) Test(ctx context.Context, user identity.Requester, rule *models.AlertRule, from, to time.Time, folderTitle string) (res *data.Frame, err error) {
if rule == nil {
return nil, fmt.Errorf("%w: rule is not defined", ErrInvalidInputData)
}
if !from.Before(to) {
return nil, fmt.Errorf("%w: invalid interval of the backtesting [%d,%d]", ErrInvalidInputData, from.Unix(), to.Unix())
return nil, fmt.Errorf("%w: invalid interval [%d,%d]", ErrInvalidInputData, from.Unix(), to.Unix())
}
if to.Sub(from).Seconds() < float64(rule.IntervalSeconds) {
return nil, fmt.Errorf("%w: interval of the backtesting [%d,%d] is less than evaluation interval [%ds]", ErrInvalidInputData, from.Unix(), to.Unix(), rule.IntervalSeconds)
ruleCtx := models.WithRuleKey(ctx, rule.GetKey())
logger := logger.FromContext(ruleCtx).New("backtesting", util.GenerateShortUID())
var warns []string
if rule.GetInterval() < e.minInterval {
logger.Warn("Interval adjusted to minimal interval", "originalInterval", rule.GetInterval(), "adjustedInterval", e.minInterval)
rule = rule.Copy()
rule.IntervalSeconds = int64(e.minInterval.Seconds())
warns = append(warns, fmt.Sprintf("Interval adjusted to minimal interval %ds", rule.IntervalSeconds))
}
length := int(to.Sub(from).Seconds()) / int(rule.IntervalSeconds)
stateManager := e.createStateManager()
effectiveStrategy := e.jitterStrategy
if e.jitterStrategy == schedule.JitterByGroup && (rule.RuleGroup == "" || rule.NamespaceUID == "") ||
e.jitterStrategy == schedule.JitterByRule && rule.UID == "" {
logger.Warn(fmt.Sprintf("Jitter strategy is set to %s, but the rule group or namespace is not set. Ignoring jitter", e.jitterStrategy))
warns = append(warns, fmt.Sprintf("Jitter strategy is set to %s, but the rule group or namespace is not set. Ignoring jitter; test results will differ from real evaluations", e.jitterStrategy))
effectiveStrategy = schedule.JitterNever
}
jitterOffset := schedule.JitterOffsetInDuration(rule, e.baseInterval, effectiveStrategy)
firstEval, err := getFirstEvaluationTime(from, rule, e.baseInterval, jitterOffset)
if err != nil {
return nil, fmt.Errorf("%w: %s", ErrInvalidInputData, err)
}
evaluator, err := backtestingEvaluatorFactory(ruleCtx, e.evalFactory, user, rule.GetEvalCondition().WithSource("backtesting"), &schedule.AlertingResultsFromRuleState{
Manager: stateManager,
Rule: rule,
})
evaluations := calculateNumberOfEvaluations(firstEval, to, rule.GetInterval())
if e.maxEvaluations > 0 && evaluations > e.maxEvaluations {
logger.Warn("Evaluations adjusted to maximal number", "originalEvaluations", evaluations, "adjustedEvaluations", e.maxEvaluations)
warns = append(warns, fmt.Sprintf("Number of evaluations is adjusted to the limit of %d. Requested: %d", e.maxEvaluations, evaluations))
evaluations = e.maxEvaluations
}
start := time.Now()
defer func() {
if err == nil {
logger.Info("Rule testing finished successfully", "duration", time.Since(start))
} else {
logger.Error("Rule testing finished with error", "duration", time.Since(start), "error", err)
}
}()
stateMgr := e.createStateManager()
evaluator, err := backtestingEvaluatorFactory(ruleCtx,
e.evalFactory,
user,
rule.GetEvalCondition().WithSource("backtesting"),
&schedule.AlertingResultsFromRuleState{
Manager: stateMgr,
Rule: rule,
},
)
if err != nil {
return nil, errors.Join(ErrInvalidInputData, err)
}
logger.Info("Start testing alert rule", "from", from, "to", to, "interval", rule.IntervalSeconds, "evaluations", length)
logger.Info("Start testing alert rule", "from", from, "to", to, "interval", rule.GetInterval(), "firstTick", firstEval, "evaluations", evaluations, "jitterOffset", jitterOffset, "jitterStrategy", effectiveStrategy)
start := time.Now()
var builder *historian.QueryResultBuilder
tsField := data.NewField("Time", nil, make([]time.Time, length))
valueFields := make(map[data.Fingerprint]*data.Field)
err = evaluator.Eval(ruleCtx, from, time.Duration(rule.IntervalSeconds)*time.Second, length, func(idx int, currentTime time.Time, results eval.Results) error {
if idx >= length {
logger.Info("Unexpected evaluation. Skipping", "from", from, "to", to, "interval", rule.IntervalSeconds, "evaluationTime", currentTime, "evaluationIndex", idx, "expectedEvaluations", length)
return nil
}
states := stateManager.ProcessEvalResults(ruleCtx, currentTime, rule, results, nil, nil)
tsField.Set(idx, currentTime)
for _, s := range states {
field, ok := valueFields[s.CacheID]
if !ok {
field = data.NewField("", s.Labels, make([]*string, length))
valueFields[s.CacheID] = field
}
if s.State.State != eval.NoData { // set nil if NoData
value := s.State.State.String()
if s.StateReason != "" {
value += " (" + s.StateReason + ")"
}
field.Set(idx, &value)
continue
}
}
return nil
})
fields := make([]*data.Field, 0, len(valueFields)+1)
fields = append(fields, tsField)
for _, f := range valueFields {
fields = append(fields, f)
ruleMeta := history_model.RuleMeta{
ID: rule.ID,
OrgID: rule.OrgID,
UID: rule.UID,
Title: rule.Title,
Group: rule.RuleGroup,
NamespaceUID: rule.NamespaceUID,
// DashboardUID: "",
// PanelID: 0,
Condition: rule.Condition,
}
result := data.NewFrame("Testing results", fields...)
labels := map[string]string{
historian.OrgIDLabel: fmt.Sprint(ruleMeta.OrgID),
historian.GroupLabel: fmt.Sprint(ruleMeta.Group),
historian.FolderUIDLabel: fmt.Sprint(rule.NamespaceUID),
}
labelsBytes, err := json.Marshal(labels)
if err != nil {
return nil, err
}
logger.Info("Rule testing finished successfully", "duration", time.Since(start))
return result, nil
// Ensure fallback if empty string is passed
if folderTitle == "" {
folderTitle = "Backtesting"
}
extraLabels := state.GetRuleExtraLabels(logger, rule, folderTitle, !e.disableGrafanaFolder, e.featureToggles)
processFn := func(idx int, currentTime time.Time, results eval.Results) (bool, error) {
// initialize the builder lazily, with a best-guess capacity for the result
if builder == nil {
builder = historian.NewQueryResultBuilder(evaluations * len(results))
for _, warn := range warns {
builder.AddWarn(warn)
}
}
states := stateMgr.ProcessEvalResults(ruleCtx, currentTime, rule, results, extraLabels, nil)
for _, s := range states {
if !historian.ShouldRecord(s) {
continue
}
entry := historian.StateTransitionToLokiEntry(ruleMeta, s)
err := builder.AddRow(currentTime, entry, labelsBytes)
if err != nil {
return false, err
}
}
return idx <= evaluations, nil
}
err = evaluator.Eval(ruleCtx, firstEval, rule.GetInterval(), evaluations, processFn)
if err != nil {
return nil, err
}
if builder == nil {
return nil, errors.New("no results were produced")
}
return builder.ToFrame(), nil
}
func newBacktestingEvaluator(ctx context.Context, evalFactory eval.EvaluatorFactory, user identity.Requester, condition models.Condition, reader eval.AlertingResultsReader) (backtestingEvaluator, error) {
@@ -173,3 +250,53 @@ type NoopImageService struct{}
func (s *NoopImageService) NewImage(_ context.Context, _ *models.AlertRule) (*models.Image, error) {
return &models.Image{}, nil
}
func getNextEvaluationTime(currentTime time.Time, rule *models.AlertRule, baseInterval time.Duration, jitterOffset time.Duration) (time.Time, error) {
if rule.IntervalSeconds%int64(baseInterval.Seconds()) != 0 {
return time.Time{}, fmt.Errorf("interval %ds is not divisible by base interval %ds", rule.IntervalSeconds, int64(baseInterval.Seconds()))
}
freq := rule.IntervalSeconds / int64(baseInterval.Seconds())
firstTickNum := currentTime.Unix() / int64(baseInterval.Seconds())
jitterOffsetTicks := int64(jitterOffset / baseInterval)
firstEvalTickNum := firstTickNum + (jitterOffsetTicks-(firstTickNum%freq)+freq)%freq
return time.Unix(firstEvalTickNum*int64(baseInterval.Seconds()), 0), nil
}
func getFirstEvaluationTime(from time.Time, rule *models.AlertRule, baseInterval time.Duration, jitterOffset time.Duration) (time.Time, error) {
// Now calculate the time of the tick the same way as in the scheduler
firstTick := ticker.GetStartTick(from, baseInterval)
// calculate time of the first evaluation that is at or after the first tick
firstEval, err := getNextEvaluationTime(firstTick, rule, baseInterval, jitterOffset)
if err != nil {
return time.Time{}, err
}
// Ensure firstEval is at or after from
// Calculate how many intervals to skip to get past 'from'
if firstEval.Before(from) {
diff := from.Sub(firstEval)
interval := rule.GetInterval()
// Ceiling division: how many intervals needed to cover the difference
intervalsToAdd := (diff + interval - 1) / interval
firstEval = firstEval.Add(interval * intervalsToAdd)
}
return firstEval, nil
}
func calculateNumberOfEvaluations(firstEval, to time.Time, interval time.Duration) int {
var evaluations int
if to.After(firstEval) {
evaluations = int(to.Sub(firstEval).Seconds()) / int(interval.Seconds())
}
if evaluations == 0 {
evaluations = 1
}
return evaluations
}
@@ -4,7 +4,6 @@ import (
"context"
"encoding/json"
"errors"
"fmt"
"math/rand"
"testing"
"time"
@@ -14,9 +13,11 @@ import (
"github.com/grafana/grafana-plugin-sdk-go/data"
"github.com/grafana/grafana/pkg/apimachinery/identity"
"github.com/grafana/grafana/pkg/services/featuremgmt"
"github.com/grafana/grafana/pkg/services/ngalert/eval"
"github.com/grafana/grafana/pkg/services/ngalert/eval/eval_mocks"
"github.com/grafana/grafana/pkg/services/ngalert/models"
"github.com/grafana/grafana/pkg/services/ngalert/schedule"
"github.com/grafana/grafana/pkg/services/ngalert/state"
"github.com/grafana/grafana/pkg/util"
)
@@ -158,16 +159,6 @@ func TestNewBacktestingEvaluator(t *testing.T) {
}
func TestEvaluatorTest(t *testing.T) {
states := []eval.State{eval.Normal, eval.Alerting, eval.Pending}
generateState := func(prefix string) *state.State {
labels := models.GenerateAlertLabels(rand.Intn(5)+1, prefix+"-")
return &state.State{
CacheID: labels.Fingerprint(),
Labels: labels,
State: states[rand.Intn(len(states))],
}
}
randomResultCallback := func(now time.Time) (eval.Results, error) {
return eval.GenerateResults(rand.Intn(5)+1, eval.ResultGen()), nil
}
@@ -189,84 +180,17 @@ func TestEvaluatorTest(t *testing.T) {
createStateManager: func() stateManager {
return manager
},
disableGrafanaFolder: false,
featureToggles: featuremgmt.WithFeatures(),
minInterval: 1 * time.Second,
baseInterval: 1 * time.Second,
jitterStrategy: schedule.JitterNever,
maxEvaluations: 10000,
}
gen := models.RuleGen
rule := gen.With(gen.WithInterval(time.Second)).GenerateRef()
ruleInterval := time.Duration(rule.IntervalSeconds) * time.Second
t.Run("should return data frame in specific format", func(t *testing.T) {
from := time.Unix(0, 0)
to := from.Add(5 * ruleInterval)
allStates := [...]eval.State{eval.Normal, eval.Alerting, eval.Pending, eval.NoData, eval.Error}
var states []state.StateTransition
for _, s := range allStates {
labels := models.GenerateAlertLabels(rand.Intn(5)+1, s.String()+"-")
states = append(states, state.StateTransition{
State: &state.State{
CacheID: labels.Fingerprint(),
Labels: labels,
State: s,
StateReason: util.GenerateShortUID(),
},
})
}
manager.stateCallback = func(now time.Time) []state.StateTransition {
return states
}
frame, err := engine.Test(context.Background(), nil, rule, from, to)
require.NoError(t, err)
require.Len(t, frame.Fields, len(states)+1) // +1 - timestamp
t.Run("should contain field Time", func(t *testing.T) {
timestampField, _ := frame.FieldByName("Time")
require.NotNil(t, timestampField, "frame does not contain field 'Time'")
require.Equal(t, data.FieldTypeTime, timestampField.Type())
})
fieldByState := make(map[data.Fingerprint]*data.Field, len(states))
t.Run("should contain a field per state", func(t *testing.T) {
for _, s := range states {
var f *data.Field
for _, field := range frame.Fields {
if field.Labels.String() == s.Labels.String() {
f = field
break
}
}
require.NotNilf(t, f, "Cannot find a field by state labels")
fieldByState[s.CacheID] = f
}
})
t.Run("should be populated with correct values", func(t *testing.T) {
timestampField, _ := frame.FieldByName("Time")
expectedLength := timestampField.Len()
for _, field := range frame.Fields {
require.Equalf(t, expectedLength, field.Len(), "Field %s should have the size %d", field.Name, expectedLength)
}
for i := 0; i < expectedLength; i++ {
expectedTime := from.Add(time.Duration(int64(i)*rule.IntervalSeconds) * time.Second)
require.Equal(t, expectedTime, timestampField.At(i).(time.Time))
for _, s := range states {
f := fieldByState[s.CacheID]
if s.State.State == eval.NoData {
require.Nil(t, f.At(i))
} else {
v := f.At(i).(*string)
require.NotNilf(t, v, "Field [%s] value at index %d should not be nil", s.CacheID, i)
require.Equal(t, fmt.Sprintf("%s (%s)", s.State.State, s.StateReason), *v)
}
}
}
})
})
t.Run("should not fail if 'to-from' is not times of interval", func(t *testing.T) {
from := time.Unix(0, 0)
to := from.Add(5 * ruleInterval)
@@ -287,84 +211,26 @@ func TestEvaluatorTest(t *testing.T) {
return states
}
frame, err := engine.Test(context.Background(), nil, rule, from, to)
frame, err := engine.Test(context.Background(), nil, rule, from, to, "")
require.NoError(t, err)
expectedLen := frame.Rows()
for i := 0; i < 100; i++ {
jitter := time.Duration(rand.Int63n(ruleInterval.Milliseconds())) * time.Millisecond
frame, err = engine.Test(context.Background(), nil, rule, from, to.Add(jitter))
frame, err = engine.Test(context.Background(), nil, rule, from, to.Add(jitter), "")
require.NoError(t, err)
require.Equalf(t, expectedLen, frame.Rows(), "jitter %v caused result to be different than baseline", jitter)
}
})
t.Run("should backfill field with nulls if a new dimension created in the middle", func(t *testing.T) {
from := time.Unix(0, 0)
state1 := state.StateTransition{
State: generateState("1"),
}
state2 := state.StateTransition{
State: generateState("2"),
}
state3 := state.StateTransition{
State: generateState("3"),
}
stateByTime := map[time.Time][]state.StateTransition{
from: {state1, state2},
from.Add(1 * ruleInterval): {state1, state2},
from.Add(2 * ruleInterval): {state1, state2},
from.Add(3 * ruleInterval): {state1, state2, state3},
from.Add(4 * ruleInterval): {state1, state2, state3},
}
to := from.Add(time.Duration(len(stateByTime)) * ruleInterval)
manager.stateCallback = func(now time.Time) []state.StateTransition {
return stateByTime[now]
}
frame, err := engine.Test(context.Background(), nil, rule, from, to)
require.NoError(t, err)
var field3 *data.Field
for _, field := range frame.Fields {
if field.Labels.String() == state3.Labels.String() {
field3 = field
break
}
}
require.NotNilf(t, field3, "Result for state 3 was not found")
require.Equalf(t, len(stateByTime), field3.Len(), "State3 result has unexpected number of values")
for curTime, states := range stateByTime {
idx := int(curTime.Sub(from) / ruleInterval)
value := field3.At(idx).(*string)
if len(states) == 2 {
require.Nilf(t, value, "The result should be nil if state3 was not available for time %v", curTime)
}
}
})
t.Run("should fail", func(t *testing.T) {
manager.stateCallback = func(now time.Time) []state.StateTransition {
return nil
}
t.Run("when interval is not correct", func(t *testing.T) {
from := time.Now()
t.Run("when from=to", func(t *testing.T) {
to := from
_, err := engine.Test(context.Background(), nil, rule, from, to)
require.ErrorIs(t, err, ErrInvalidInputData)
})
t.Run("when from > to", func(t *testing.T) {
to := from.Add(-ruleInterval)
_, err := engine.Test(context.Background(), nil, rule, from, to)
require.ErrorIs(t, err, ErrInvalidInputData)
})
t.Run("when to-from < interval", func(t *testing.T) {
to := from.Add(ruleInterval).Add(-time.Millisecond)
_, err := engine.Test(context.Background(), nil, rule, from, to)
_, err := engine.Test(context.Background(), nil, rule, from, to, "")
require.ErrorIs(t, err, ErrInvalidInputData)
})
})
@@ -376,7 +242,7 @@ func TestEvaluatorTest(t *testing.T) {
}
from := time.Now()
to := from.Add(ruleInterval)
_, err := engine.Test(context.Background(), nil, rule, from, to)
_, err := engine.Test(context.Background(), nil, rule, from, to, "")
require.ErrorIs(t, err, expectedError)
})
})
@@ -404,10 +270,188 @@ func (f *fakeBacktestingEvaluator) Eval(_ context.Context, from time.Time, inter
if err != nil {
return err
}
err = callback(idx, now, results)
c, err := callback(idx, now, results)
if err != nil {
return err
}
if !c {
break
}
}
return nil
}
func TestGetNextEvaluationTime(t *testing.T) {
baseInterval := 10 * time.Second
testCases := []struct {
name string
ruleInterval int64
currentTimestamp int64
jitterOffset time.Duration
expectError bool
expectedNext int64
}{
{
name: "interval not divisible by base interval",
ruleInterval: 15,
currentTimestamp: 0,
jitterOffset: 0,
expectError: true,
},
{
name: "no jitter - from tick 0",
ruleInterval: 20,
currentTimestamp: 0,
jitterOffset: 0,
expectedNext: 0,
},
{
name: "no jitter - from tick 1",
ruleInterval: 20,
currentTimestamp: 10,
jitterOffset: 0,
expectedNext: 20,
},
{
name: "no jitter - from tick 2",
ruleInterval: 20,
currentTimestamp: 20,
jitterOffset: 0,
expectedNext: 20,
},
{
name: "with 20s jitter - from tick 0",
ruleInterval: 60,
currentTimestamp: 0,
jitterOffset: 20 * time.Second,
expectedNext: 20,
},
{
name: "with 20s jitter - from tick 2",
ruleInterval: 60,
currentTimestamp: 20,
jitterOffset: 20 * time.Second,
expectedNext: 20,
},
{
name: "with 20s jitter - from tick 3",
ruleInterval: 60,
currentTimestamp: 30,
jitterOffset: 20 * time.Second,
expectedNext: 80,
},
}
for _, tc := range testCases {
t.Run(tc.name, func(t *testing.T) {
rule := &models.AlertRule{IntervalSeconds: tc.ruleInterval}
currentTime := time.Unix(tc.currentTimestamp, 0)
result, err := getNextEvaluationTime(currentTime, rule, baseInterval, tc.jitterOffset)
if tc.expectError {
require.Error(t, err)
require.Contains(t, err.Error(), "is not divisible by base interval")
return
}
require.NoError(t, err)
require.Equal(t, tc.expectedNext, result.Unix())
})
}
}
func TestGetFirstEvaluationTime(t *testing.T) {
baseInterval := 10 * time.Second
testCases := []struct {
name string
ruleInterval int64
fromUnix int64
jitterOffset time.Duration
expectError bool
expectedUnix int64
}{
{
name: "interval not divisible by base interval",
ruleInterval: 15,
fromUnix: 0,
jitterOffset: 0,
expectError: true,
},
{
name: "no jitter - from at tick 0",
ruleInterval: 20,
fromUnix: 0,
jitterOffset: 0,
expectedUnix: 0,
},
{
name: "no jitter - from at tick 1",
ruleInterval: 20,
fromUnix: 10,
jitterOffset: 0,
expectedUnix: 20,
},
{
name: "no jitter - from before first tick",
ruleInterval: 20,
fromUnix: 5,
jitterOffset: 0,
expectedUnix: 20,
},
{
name: "no jitter - from after first aligned tick",
ruleInterval: 20,
fromUnix: 25,
jitterOffset: 0,
expectedUnix: 40,
},
{
name: "no jitter - from at tick boundary",
ruleInterval: 10,
fromUnix: 10,
jitterOffset: 0,
expectedUnix: 10,
},
{
name: "with 20s jitter - from epoch",
ruleInterval: 60,
fromUnix: 0,
jitterOffset: 20 * time.Second,
expectedUnix: 20,
},
{
name: "with 20s jitter - from 70s",
ruleInterval: 60,
fromUnix: 70,
jitterOffset: 20 * time.Second,
expectedUnix: 80,
},
{
name: "with 50s jitter - from 25s",
ruleInterval: 60,
fromUnix: 25,
jitterOffset: 50 * time.Second,
expectedUnix: 50,
},
}
for _, tc := range testCases {
t.Run(tc.name, func(t *testing.T) {
rule := &models.AlertRule{IntervalSeconds: tc.ruleInterval}
from := time.Unix(tc.fromUnix, 0)
result, err := getFirstEvaluationTime(from, rule, baseInterval, tc.jitterOffset)
if tc.expectError {
require.Error(t, err)
require.Contains(t, err.Error(), "is not divisible by base interval")
return
}
require.NoError(t, err)
require.Equal(t, tc.expectedUnix, result.Unix())
require.GreaterOrEqual(t, result.Unix(), from.Unix(), "first eval should be at or after from")
})
}
}
@@ -85,10 +85,13 @@ func (d *dataEvaluator) Eval(_ context.Context, from time.Time, interval time.Du
EvaluatedAt: now,
})
}
err := callback(i, now, result)
cont, err := callback(i, now, result)
if err != nil {
return err
}
if !cont {
break
}
}
return nil
}
@@ -100,11 +100,11 @@ func TestDataEvaluator_Eval(t *testing.T) {
resultsCount := int(to.Sub(from).Seconds() / interval.Seconds())
err = evaluator.Eval(context.Background(), from, time.Second, resultsCount, func(idx int, now time.Time, res eval.Results) error {
err = evaluator.Eval(context.Background(), from, time.Second, resultsCount, func(idx int, now time.Time, res eval.Results) (bool, error) {
r = append(r, results{
now, res,
})
return nil
return true, nil
})
require.NoError(t, err)
@@ -164,11 +164,11 @@ func TestDataEvaluator_Eval(t *testing.T) {
size := to.Sub(from).Milliseconds() / interval.Milliseconds()
r := make([]results, 0, size)
err = evaluator.Eval(context.Background(), from, interval, int(size), func(idx int, now time.Time, res eval.Results) error {
err = evaluator.Eval(context.Background(), from, interval, int(size), func(idx int, now time.Time, res eval.Results) (bool, error) {
r = append(r, results{
now, res,
})
return nil
return true, nil
})
currentRowIdx := 0
@@ -195,11 +195,11 @@ func TestDataEvaluator_Eval(t *testing.T) {
size := int(to.Sub(from).Seconds() / interval.Seconds())
r := make([]results, 0, size)
err = evaluator.Eval(context.Background(), from, interval, size, func(idx int, now time.Time, res eval.Results) error {
err = evaluator.Eval(context.Background(), from, interval, size, func(idx int, now time.Time, res eval.Results) (bool, error) {
r = append(r, results{
now, res,
})
return nil
return true, nil
})
currentRowIdx := 0
@@ -230,11 +230,11 @@ func TestDataEvaluator_Eval(t *testing.T) {
t.Run("should be noData until the frame interval", func(t *testing.T) {
newFrom := from.Add(-10 * time.Second)
r := make([]results, 0, int(to.Sub(newFrom).Seconds()))
err = evaluator.Eval(context.Background(), newFrom, time.Second, cap(r), func(idx int, now time.Time, res eval.Results) error {
err = evaluator.Eval(context.Background(), newFrom, time.Second, cap(r), func(idx int, now time.Time, res eval.Results) (bool, error) {
r = append(r, results{
now, res,
})
return nil
return true, nil
})
rowIdx := 0
@@ -258,11 +258,11 @@ func TestDataEvaluator_Eval(t *testing.T) {
t.Run("should be the last value after the frame interval", func(t *testing.T) {
newTo := to.Add(10 * time.Second)
r := make([]results, 0, int(newTo.Sub(from).Seconds()))
err = evaluator.Eval(context.Background(), from, time.Second, cap(r), func(idx int, now time.Time, res eval.Results) error {
err = evaluator.Eval(context.Background(), from, time.Second, cap(r), func(idx int, now time.Time, res eval.Results) (bool, error) {
r = append(r, results{
now, res,
})
return nil
return true, nil
})
rowIdx := 0
@@ -282,12 +282,21 @@ func TestDataEvaluator_Eval(t *testing.T) {
})
t.Run("should stop if callback error", func(t *testing.T) {
expectedError := errors.New("error")
err = evaluator.Eval(context.Background(), from, time.Second, 6, func(idx int, now time.Time, res eval.Results) error {
err = evaluator.Eval(context.Background(), from, time.Second, 6, func(idx int, now time.Time, res eval.Results) (bool, error) {
if idx == 5 {
return expectedError
return false, expectedError
}
return nil
return true, nil
})
require.ErrorIs(t, err, expectedError)
})
t.Run("should stop if callback does not want to continue", func(t *testing.T) {
evaluated := 0
err = evaluator.Eval(context.Background(), from, time.Second, 6, func(idx int, now time.Time, res eval.Results) (bool, error) {
evaluated++
return evaluated < 2, nil
})
require.NoError(t, err)
require.Equal(t, 2, evaluated)
})
}
@@ -18,10 +18,13 @@ func (d *queryEvaluator) Eval(ctx context.Context, from time.Time, interval time
if err != nil {
return err
}
err = callback(idx, now, results)
cont, err := callback(idx, now, results)
if err != nil {
return err
}
if !cont {
break
}
}
return nil
}
@@ -31,9 +31,9 @@ func TestQueryEvaluator_Eval(t *testing.T) {
intervals := make([]time.Time, times)
err := evaluator.Eval(ctx, from, interval, times, func(idx int, now time.Time, results eval.Results) error {
err := evaluator.Eval(ctx, from, interval, times, func(idx int, now time.Time, results eval.Results) (bool, error) {
intervals[idx] = now
return nil
return true, nil
})
require.NoError(t, err)
require.Len(t, intervals, times)
@@ -49,7 +49,7 @@ func TestQueryEvaluator_Eval(t *testing.T) {
}
})
t.Run("should stop evaluation if error", func(t *testing.T) {
t.Run("should stop evaluation", func(t *testing.T) {
t.Run("when evaluation fails", func(t *testing.T) {
m := &eval_mocks.ConditionEvaluatorMock{}
expectedResults := eval.Results{}
@@ -62,9 +62,9 @@ func TestQueryEvaluator_Eval(t *testing.T) {
intervals := make([]time.Time, 0, times)
err := evaluator.Eval(ctx, from, interval, times, func(idx int, now time.Time, results eval.Results) error {
err := evaluator.Eval(ctx, from, interval, times, func(idx int, now time.Time, results eval.Results) (bool, error) {
intervals = append(intervals, now)
return nil
return true, nil
})
require.ErrorIs(t, err, expectedError)
require.Len(t, intervals, 3)
@@ -81,14 +81,31 @@ func TestQueryEvaluator_Eval(t *testing.T) {
intervals := make([]time.Time, 0, times)
err := evaluator.Eval(ctx, from, interval, times, func(idx int, now time.Time, results eval.Results) error {
err := evaluator.Eval(ctx, from, interval, times, func(idx int, now time.Time, results eval.Results) (bool, error) {
if len(intervals) > 3 {
return expectedError
return false, expectedError
}
intervals = append(intervals, now)
return nil
return true, nil
})
require.ErrorIs(t, err, expectedError)
})
t.Run("when callback does not want to continue", func(t *testing.T) {
m := &eval_mocks.ConditionEvaluatorMock{}
expectedResults := eval.Results{}
m.EXPECT().Evaluate(mock.Anything, mock.Anything).Return(expectedResults, nil)
evaluator := queryEvaluator{
eval: m,
}
evaluated := 0
err := evaluator.Eval(ctx, from, interval, times, func(idx int, now time.Time, results eval.Results) (bool, error) {
evaluated++
return evaluated <= 2, nil
})
require.NoError(t, err)
require.Equal(t, 3, evaluated)
})
})
}
@@ -480,6 +480,10 @@ func (alertRule *AlertRule) GetPanelID() int64 {
return -1
}
func (alertRule *AlertRule) GetInterval() time.Duration {
return time.Duration(alertRule.IntervalSeconds) * time.Second
}
type LabelOption func(map[string]string)
func WithoutInternalLabels() LabelOption {
@@ -169,62 +169,9 @@ func HasIntegrationsDifferentProtectedFields(existing, incoming *Integration) []
var result []schema.IntegrationFieldPath
settingsDiff := diff.GetSettingsPaths()
for _, path := range settingsDiff {
if IsProtectedField(incoming.Config.Type(), path) {
if incoming.Config.IsProtectedField(path) {
result = append(result, path)
}
}
return result
}
// IsProtectedField returns true if the field at the given path is an existing protected field.
// This includes:
// 1. URL fields marked as secure in the schema (e.g., webhook URLs with credentials)
// 2. URL fields NOT marked as secure but could contain credentials (e.g., API endpoints)
func IsProtectedField(integrationType schema.IntegrationType, path schema.IntegrationFieldPath) bool {
str := strings.ToLower(string(integrationType))
pathStr := path.String()
switch str {
case "prometheus-alertmanager":
return pathStr == "url"
case "dingding":
return pathStr == "url" // marked as secure
case "discord":
return pathStr == "url" // marked as secure (webhook URL)
case "googlechat":
return pathStr == "url" // marked as secure
case "jira":
return pathStr == "api_url"
case "kafka":
return pathStr == "kafkaRestProxy"
case "line":
return false
case "mqtt":
return pathStr == "brokerUrl"
case "oncall":
return pathStr == "url"
case "opsgenie":
return pathStr == "apiUrl"
case "pagerduty":
return pathStr == "url"
case "sensugo":
return pathStr == "url"
case "slack":
return pathStr == "url" || pathStr == "endpointUrl"
case "teams":
return pathStr == "url"
case "victorops":
return pathStr == "url" // marked as secure
case "webex":
return pathStr == "api_url"
case "webhook":
return pathStr == "url" ||
pathStr == "http_config.oauth2.token_url" ||
pathStr == "http_config.oauth2.proxy_config.proxy_url"
case "wecom":
return pathStr == "url" || // marked as secure
pathStr == "endpointUrl"
default:
return false
}
}
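The removed switch above hard-coded protected URL paths per integration type; the replacement delegates to the integration config itself (`incoming.Config.IsProtectedField(path)`). A toy sketch of the same move from a type switch to a data-driven lookup, with an illustrative table rather than the schema package's real API:

```go
package main

import "fmt"

// protectedPaths maps integration type -> set of protected field paths.
// Owning this as data (here, a map; in Grafana, the integration schema)
// replaces the per-type switch removed in the hunk above.
var protectedPaths = map[string]map[string]bool{
	"slack":   {"url": true, "endpointUrl": true},
	"webhook": {"url": true, "http_config.oauth2.token_url": true},
}

// isProtectedField looks up the path in the table; missing keys fall
// through to the zero value false, like the switch's default branch.
func isProtectedField(integrationType, path string) bool {
	return protectedPaths[integrationType][path]
}

func main() {
	fmt.Println(isProtectedField("slack", "endpointUrl")) // true
	fmt.Println(isProtectedField("line", "url"))          // false
}
```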
@@ -485,6 +485,7 @@ func assignReceiverConfigsUIDs(c []*definitions.PostableApiReceiver) error {
type provisioningStore interface {
GetProvenance(ctx context.Context, o models.Provisionable, org int64) (models.Provenance, error)
GetProvenances(ctx context.Context, org int64, resourceType string) (map[string]models.Provenance, error)
GetProvenancesByUIDs(ctx context.Context, org int64, resourceType string, uids []string) (map[string]models.Provenance, error)
SetProvenance(ctx context.Context, o models.Provisionable, org int64, p models.Provenance) error
DeleteProvenance(ctx context.Context, o models.Provisionable, org int64) error
}
@@ -272,16 +272,16 @@ func (p *Converter) convertRule(orgID int64, namespaceUID string, promGroup Prom
RuleGroup: promGroup.Name,
IsPaused: isPaused,
Record: record,
}
if !isRecordingRule {
result.NotificationSettings = p.cfg.NotificationSettings
// MissingSeriesEvalsToResolve is set to 1 to match the Prometheus behaviour.
// Prometheus resolves alerts as soon as the series disappears.
// By setting this value to 1 we ensure that the alert is resolved on the first evaluation
// that doesn't have the series.
MissingSeriesEvalsToResolve: util.Pointer[int64](1),
}
if !isRecordingRule {
result.NotificationSettings = p.cfg.NotificationSettings
result.MissingSeriesEvalsToResolve = util.Pointer[int64](1)
}
if p.cfg.KeepOriginalRuleDefinition != nil && *p.cfg.KeepOriginalRuleDefinition {
@@ -358,7 +358,12 @@ func TestPrometheusRulesToGrafana(t *testing.T) {
require.Equal(t, models.Duration(evalOffset), grafanaRule.Data[0].RelativeTimeRange.To)
require.Equal(t, models.Duration(10*time.Minute+evalOffset), grafanaRule.Data[0].RelativeTimeRange.From)
require.Equal(t, util.Pointer(int64(1)), grafanaRule.MissingSeriesEvalsToResolve)
if promRule.Record != "" {
require.Nil(t, grafanaRule.MissingSeriesEvalsToResolve)
} else {
require.Equal(t, util.Pointer(int64(1)), grafanaRule.MissingSeriesEvalsToResolve)
}
require.Equal(t, models.OkErrState, grafanaRule.ExecErrState)
require.Equal(t, models.OK, grafanaRule.NoDataState)
@@ -19,6 +19,7 @@ type alertmanagerConfigStore interface {
type ProvisioningStore interface {
GetProvenance(ctx context.Context, o models.Provisionable, org int64) (models.Provenance, error)
GetProvenances(ctx context.Context, org int64, resourceType string) (map[string]models.Provenance, error)
GetProvenancesByUIDs(ctx context.Context, org int64, resourceType string, uids []string) (map[string]models.Provenance, error)
SetProvenance(ctx context.Context, o models.Provisionable, org int64, p models.Provenance) error
DeleteProvenance(ctx context.Context, o models.Provisionable, org int64) error
}
@@ -188,6 +188,67 @@ func (_c *MockProvisioningStore_GetProvenances_Call) RunAndReturn(run func(conte
return _c
}
// GetProvenancesByUIDs provides a mock function with given fields: ctx, org, resourceType, uids
func (_m *MockProvisioningStore) GetProvenancesByUIDs(ctx context.Context, org int64, resourceType string, uids []string) (map[string]models.Provenance, error) {
ret := _m.Called(ctx, org, resourceType, uids)
if len(ret) == 0 {
panic("no return value specified for GetProvenancesByUIDs")
}
var r0 map[string]models.Provenance
var r1 error
if rf, ok := ret.Get(0).(func(context.Context, int64, string, []string) (map[string]models.Provenance, error)); ok {
return rf(ctx, org, resourceType, uids)
}
if rf, ok := ret.Get(0).(func(context.Context, int64, string, []string) map[string]models.Provenance); ok {
r0 = rf(ctx, org, resourceType, uids)
} else {
if ret.Get(0) != nil {
r0 = ret.Get(0).(map[string]models.Provenance)
}
}
if rf, ok := ret.Get(1).(func(context.Context, int64, string, []string) error); ok {
r1 = rf(ctx, org, resourceType, uids)
} else {
r1 = ret.Error(1)
}
return r0, r1
}
// MockProvisioningStore_GetProvenancesByUIDs_Call is a *mock.Call that shadows Run/Return methods with type explicit version for method 'GetProvenancesByUIDs'
type MockProvisioningStore_GetProvenancesByUIDs_Call struct {
*mock.Call
}
// GetProvenancesByUIDs is a helper method to define mock.On call
// - ctx context.Context
// - org int64
// - resourceType string
// - uids []string
func (_e *MockProvisioningStore_Expecter) GetProvenancesByUIDs(ctx interface{}, org interface{}, resourceType interface{}, uids interface{}) *MockProvisioningStore_GetProvenancesByUIDs_Call {
return &MockProvisioningStore_GetProvenancesByUIDs_Call{Call: _e.mock.On("GetProvenancesByUIDs", ctx, org, resourceType, uids)}
}
func (_c *MockProvisioningStore_GetProvenancesByUIDs_Call) Run(run func(ctx context.Context, org int64, resourceType string, uids []string)) *MockProvisioningStore_GetProvenancesByUIDs_Call {
_c.Call.Run(func(args mock.Arguments) {
run(args[0].(context.Context), args[1].(int64), args[2].(string), args[3].([]string))
})
return _c
}
func (_c *MockProvisioningStore_GetProvenancesByUIDs_Call) Return(_a0 map[string]models.Provenance, _a1 error) *MockProvisioningStore_GetProvenancesByUIDs_Call {
_c.Call.Return(_a0, _a1)
return _c
}
func (_c *MockProvisioningStore_GetProvenancesByUIDs_Call) RunAndReturn(run func(context.Context, int64, string, []string) (map[string]models.Provenance, error)) *MockProvisioningStore_GetProvenancesByUIDs_Call {
_c.Call.Return(run)
return _c
}
// SetProvenance provides a mock function with given fields: ctx, o, org, p
func (_m *MockProvisioningStore) SetProvenance(ctx context.Context, o models.Provisionable, org int64, p models.Provenance) error {
ret := _m.Called(ctx, o, org, p)
@@ -5,6 +5,7 @@ import (
"time"
"github.com/grafana/grafana-plugin-sdk-go/data"
"github.com/grafana/grafana/pkg/services/featuremgmt"
ngmodels "github.com/grafana/grafana/pkg/services/ngalert/models"
"github.com/grafana/grafana/pkg/setting"
@@ -13,6 +14,10 @@ import (
// JitterStrategy represents a modifier to alert rule timing that affects how evaluations are distributed.
type JitterStrategy int
func (s JitterStrategy) String() string {
return [...]string{"never", "by group", "by rule"}[s]
}
const (
JitterNever JitterStrategy = iota
JitterByGroup
@@ -57,6 +62,11 @@ func jitterOffsetInTicks(r *ngmodels.AlertRule, baseInterval time.Duration, stra
return res
}
// JitterOffsetInDuration gives the jitter offset for a rule, in terms of a duration relative to its interval and a base interval.
func JitterOffsetInDuration(r *ngmodels.AlertRule, baseInterval time.Duration, strategy JitterStrategy) time.Duration {
return time.Duration(jitterOffsetInTicks(r, baseInterval, strategy)) * baseInterval
}
func jitterHash(r *ngmodels.AlertRule, strategy JitterStrategy) uint64 {
ls := data.Labels{
"name": r.RuleGroup,
@@ -44,7 +44,11 @@ func New(c clock.Clock, interval time.Duration, metric *Metrics, logger log.Logg
}
func getStartTick(clk clock.Clock, interval time.Duration) time.Time {
nano := clk.Now().UnixNano()
return GetStartTick(clk.Now(), interval)
}
func GetStartTick(t time.Time, interval time.Duration) time.Time {
nano := t.UnixNano()
return time.Unix(0, nano-(nano%interval.Nanoseconds()))
}
@@ -17,7 +17,7 @@ import (
const StateHistoryWriteTimeout = time.Minute
func shouldRecord(transition state.StateTransition) bool {
func ShouldRecord(transition state.StateTransition) bool {
if !transition.Changed() {
return false
}
@@ -35,9 +35,9 @@ func shouldRecord(transition state.StateTransition) bool {
}
// ShouldRecordAnnotation returns true if an annotation should be created for a given state transition.
// This is stricter than shouldRecord to avoid cluttering panels with state transitions.
// This is stricter than ShouldRecord to avoid cluttering panels with state transitions.
func ShouldRecordAnnotation(t state.StateTransition) bool {
if !shouldRecord(t) {
if !ShouldRecord(t) {
return false
}
@@ -92,7 +92,7 @@ func TestShouldRecord(t *testing.T) {
}
t.Run(fmt.Sprintf("%s -> %s should be %v", trans.PreviousFormatted(), trans.Formatted(), !ok), func(t *testing.T) {
require.Equal(t, !ok, shouldRecord(trans))
require.Equal(t, !ok, ShouldRecord(trans))
})
}
}
@@ -41,6 +41,69 @@ const (
dfLabels = "labels"
)
// QueryResultBuilder is a builder for a data frame that represents query results from Loki.
// It contains three fields: time (timestamp), line (JSON data), and labels (JSON labels).
type QueryResultBuilder struct {
frame *data.Frame
}
// NewQueryResultBuilder creates a new QueryResultBuilder with the specified capacity.
// The capacity is used to pre-allocate the underlying slices for better performance.
func NewQueryResultBuilder(capacity int) *QueryResultBuilder {
frame := data.NewFrame("states")
lbls := data.Labels(map[string]string{})
// We represent state history as a single merged history, that roughly corresponds to what you get in the Grafana Explore tab when querying Loki directly.
// The format is composed of the following vectors:
// 1. `time` - timestamp - when the transition happened
// 2. `line` - JSON - the full data of the transition
// 3. `labels` - JSON - the labels associated with that state transition
times := make([]time.Time, 0, capacity)
lines := make([]json.RawMessage, 0, capacity)
labels := make([]json.RawMessage, 0, capacity)
frame.Fields = append(frame.Fields, data.NewField(dfTime, lbls, times))
frame.Fields = append(frame.Fields, data.NewField(dfLine, lbls, lines))
frame.Fields = append(frame.Fields, data.NewField(dfLabels, lbls, labels))
return &QueryResultBuilder{frame: frame}
}
func (qr QueryResultBuilder) AddRowRaw(timestamp time.Time, line json.RawMessage, labels json.RawMessage) {
frame := qr.frame
frame.Fields[0].Append(timestamp)
frame.Fields[1].Append(line)
frame.Fields[2].Append(labels)
}
func (qr QueryResultBuilder) AddRow(timestamp time.Time, line LokiEntry, labels json.RawMessage) error {
lineBytes, err := json.Marshal(line)
if err != nil {
return err
}
qr.AddRowRaw(timestamp, lineBytes, labels)
return nil
}
// ToFrame converts the QueryResultBuilder back to a data.Frame.
func (qr QueryResultBuilder) ToFrame() *data.Frame {
return qr.frame
}
func (qr QueryResultBuilder) AddWarn(s string) {
m := qr.frame.Meta
if m == nil {
m = &data.FrameMeta{}
qr.frame.SetMeta(m)
}
m.Notices = append(m.Notices, data.Notice{
Severity: data.NoticeSeverityWarning,
Text: s,
Link: "",
Inspect: 0,
})
}
const (
StateHistoryLabelKey = "from"
StateHistoryLabelValue = "state-history"
@@ -191,20 +254,7 @@ func (h RemoteLokiBackend) merge(res []lokiclient.Stream, folderUIDToFilter []st
totalLen += len(arr.Values)
}
// Create a new slice to store the merged elements.
frame := data.NewFrame("states")
// We merge all series into a single linear history.
lbls := data.Labels(map[string]string{})
// We represent state history as a single merged history, that roughly corresponds to what you get in the Grafana Explore tab when querying Loki directly.
// The format is composed of the following vectors:
// 1. `time` - timestamp - when the transition happened
// 2. `line` - JSON - the full data of the transition
// 3. `labels` - JSON - the labels associated with that state transition
times := make([]time.Time, 0, totalLen)
lines := make([]json.RawMessage, 0, totalLen)
labels := make([]json.RawMessage, 0, totalLen)
queryResult := NewQueryResultBuilder(totalLen)
// Initialize a slice of pointers to the current position in each array.
pointers := make([]int, len(res))
@@ -259,17 +309,10 @@ func (h RemoteLokiBackend) merge(res []lokiclient.Stream, folderUIDToFilter []st
pointers[minElStreamIdx]++
continue
}
times = append(times, time.Unix(0, tsNano))
labels = append(labels, lblsJson)
lines = append(lines, json.RawMessage(entryBytes))
queryResult.AddRowRaw(time.Unix(0, tsNano), entryBytes, lblsJson)
pointers[minElStreamIdx]++
}
frame.Fields = append(frame.Fields, data.NewField(dfTime, lbls, times))
frame.Fields = append(frame.Fields, data.NewField(dfLine, lbls, lines))
frame.Fields = append(frame.Fields, data.NewField(dfLabels, lbls, labels))
return frame, nil
return queryResult.ToFrame(), nil
}
func StatesToStream(rule history_model.RuleMeta, states []state.StateTransition, externalLabels map[string]string, logger log.Logger) lokiclient.Stream {
@@ -282,28 +325,11 @@ func StatesToStream(rule history_model.RuleMeta, states []state.StateTransition,
samples := make([]lokiclient.Sample, 0, len(states))
for _, state := range states {
if !shouldRecord(state) {
if !ShouldRecord(state) {
continue
}
sanitizedLabels := removePrivateLabels(state.Labels)
entry := LokiEntry{
SchemaVersion: 1,
Previous: state.PreviousFormatted(),
Current: state.Formatted(),
Values: valuesAsDataBlob(state.State),
Condition: rule.Condition,
DashboardUID: rule.DashboardUID,
PanelID: rule.PanelID,
Fingerprint: labelFingerprint(sanitizedLabels),
RuleTitle: rule.Title,
RuleID: rule.ID,
RuleUID: rule.UID,
InstanceLabels: sanitizedLabels,
}
if state.State.State == eval.Error {
entry.Error = state.Error.Error()
}
entry := StateTransitionToLokiEntry(rule, state)
jsn, err := json.Marshal(entry)
if err != nil {
@@ -324,6 +350,28 @@ func StatesToStream(rule history_model.RuleMeta, states []state.StateTransition,
}
}
func StateTransitionToLokiEntry(rule history_model.RuleMeta, state state.StateTransition) LokiEntry {
sanitizedLabels := removePrivateLabels(state.Labels)
entry := LokiEntry{
SchemaVersion: 1,
Previous: state.PreviousFormatted(),
Current: state.Formatted(),
Values: valuesAsDataBlob(state.State),
Condition: rule.Condition,
DashboardUID: rule.DashboardUID,
PanelID: rule.PanelID,
Fingerprint: labelFingerprint(sanitizedLabels),
RuleTitle: rule.Title,
RuleID: rule.ID,
RuleUID: rule.UID,
InstanceLabels: sanitizedLabels,
}
if state.State.State == eval.Error && state.Error != nil {
entry.Error = state.Error.Error()
}
return entry
}
func (h *RemoteLokiBackend) recordStreams(ctx context.Context, stream lokiclient.Stream, logger log.Logger) error {
if err := h.client.Push(ctx, []lokiclient.Stream{stream}); err != nil {
return err
@@ -62,6 +62,30 @@ func (st DBstore) GetProvenances(ctx context.Context, org int64, resourceType st
return resultMap, err
}
// GetProvenancesByUIDs gets the provenance status for specific UIDs.
func (st DBstore) GetProvenancesByUIDs(ctx context.Context, org int64, resourceType string, uids []string) (map[string]models.Provenance, error) {
if len(uids) == 0 {
return map[string]models.Provenance{}, nil
}
result := make(map[string]models.Provenance, len(uids))
err := st.SQLStore.WithDbSession(ctx, func(sess *db.Session) error {
rawData, err := sess.Table(provenanceRecord{}).
Where("record_type = ? AND org_id = ?", resourceType, org).
In("record_key", uids).
Cols("record_key", "provenance").
QueryString()
if err != nil {
return fmt.Errorf("failed to query for existing provenance status: %w", err)
}
for _, data := range rawData {
result[data["record_key"]] = models.Provenance(data["provenance"])
}
return nil
})
return result, err
}
// SetProvenance changes the provenance status for a provisionable object.
func (st DBstore) SetProvenance(ctx context.Context, o models.Provisionable, org int64, p models.Provenance) error {
recordType := o.ResourceType()
@@ -133,6 +133,55 @@ func TestIntegrationProvisioningStore(t *testing.T) {
require.Equal(t, models.ProvenanceAPI, p[rule2.UID])
})
t.Run("Store should return provenances by UIDs", func(t *testing.T) {
const orgID = 124
rule1 := models.AlertRule{UID: "uid-1", OrgID: orgID}
rule2 := models.AlertRule{UID: "uid-2", OrgID: orgID}
rule3 := models.AlertRule{UID: "uid-3", OrgID: orgID}
err := store.SetProvenance(context.Background(), &rule1, orgID, models.ProvenanceFile)
require.NoError(t, err)
err = store.SetProvenance(context.Background(), &rule2, orgID, models.ProvenanceAPI)
require.NoError(t, err)
err = store.SetProvenance(context.Background(), &rule3, orgID, models.ProvenanceFile)
require.NoError(t, err)
// Fetch only rule1 and rule2
p, err := store.GetProvenancesByUIDs(context.Background(), orgID, rule1.ResourceType(), []string{rule1.UID, rule2.UID})
require.NoError(t, err)
require.Len(t, p, 2)
require.Equal(t, models.ProvenanceFile, p[rule1.UID])
require.Equal(t, models.ProvenanceAPI, p[rule2.UID])
_, exists := p[rule3.UID]
require.False(t, exists)
})
t.Run("GetProvenancesByUIDs returns empty map for empty UIDs", func(t *testing.T) {
p, err := store.GetProvenancesByUIDs(context.Background(), 1, "alertRule", []string{})
require.NoError(t, err)
require.Empty(t, p)
})
t.Run("GetProvenancesByUIDs respects org ID", func(t *testing.T) {
const orgID1 = 125
const orgID2 = 126
rule := models.AlertRule{UID: "cross-org-uid"}
err := store.SetProvenance(context.Background(), &rule, orgID1, models.ProvenanceFile)
require.NoError(t, err)
// Should not find in different org
p, err := store.GetProvenancesByUIDs(context.Background(), orgID2, rule.ResourceType(), []string{rule.UID})
require.NoError(t, err)
require.Empty(t, p)
// Should find in correct org
p, err = store.GetProvenancesByUIDs(context.Background(), orgID1, rule.ResourceType(), []string{rule.UID})
require.NoError(t, err)
require.Len(t, p, 1)
require.Equal(t, models.ProvenanceFile, p[rule.UID])
})
t.Run("Store should delete provenance correctly", func(t *testing.T) {
const orgID = 1234
ruleOrg := models.AlertRule{
@@ -8,12 +8,13 @@ import (
)
type FakeProvisioningStore struct {
Calls []Call
Records map[int64]map[string]models.Provenance
GetProvenanceFunc func(ctx context.Context, o models.Provisionable, org int64) (models.Provenance, error)
GetProvenancesFunc func(ctx context.Context, orgID int64, resourceType string) (map[string]models.Provenance, error)
SetProvenanceFunc func(ctx context.Context, o models.Provisionable, org int64, p models.Provenance) error
DeleteProvenanceFunc func(ctx context.Context, o models.Provisionable, org int64) error
Calls []Call
Records map[int64]map[string]models.Provenance
GetProvenanceFunc func(ctx context.Context, o models.Provisionable, org int64) (models.Provenance, error)
GetProvenancesFunc func(ctx context.Context, orgID int64, resourceType string) (map[string]models.Provenance, error)
GetProvenancesByUIDsFunc func(ctx context.Context, orgID int64, resourceType string, uids []string) (map[string]models.Provenance, error)
SetProvenanceFunc func(ctx context.Context, o models.Provisionable, org int64, p models.Provenance) error
DeleteProvenanceFunc func(ctx context.Context, o models.Provisionable, org int64) error
}
func NewFakeProvisioningStore() *FakeProvisioningStore {
@@ -51,6 +52,23 @@ func (f *FakeProvisioningStore) GetProvenances(ctx context.Context, orgID int64,
return results, nil
}
func (f *FakeProvisioningStore) GetProvenancesByUIDs(ctx context.Context, orgID int64, resourceType string, uids []string) (map[string]models.Provenance, error) {
f.Calls = append(f.Calls, Call{MethodName: "GetProvenancesByUIDs", Arguments: []any{ctx, orgID, resourceType, uids}})
if f.GetProvenancesByUIDsFunc != nil {
return f.GetProvenancesByUIDsFunc(ctx, orgID, resourceType, uids)
}
results := make(map[string]models.Provenance)
if val, ok := f.Records[orgID]; ok {
for _, uid := range uids {
key := uid + resourceType
if prov, ok := val[key]; ok {
results[uid] = prov
}
}
}
return results, nil
}
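The fake store's lookup above keys each org's records by `uid + resourceType`. As a standalone sketch of that lookup logic (in TypeScript for illustration; the `records` shape and helper name are hypothetical, not from the patch):

```typescript
// Sketch of the fake provisioning store's per-org lookup: records are keyed
// by uid + resourceType within each org, and unknown UIDs are simply omitted
// from the result, mirroring GetProvenancesByUIDs above.
type Provenance = string;

const records: Record<number, Record<string, Provenance>> = {
  1: { 'uid-1alertRule': 'file', 'uid-2alertRule': 'api' },
};

function getProvenancesByUIDs(orgID: number, resourceType: string, uids: string[]): Record<string, Provenance> {
  const results: Record<string, Provenance> = {};
  const orgRecords = records[orgID];
  if (orgRecords) {
    for (const uid of uids) {
      const prov = orgRecords[uid + resourceType];
      if (prov !== undefined) {
        results[uid] = prov;
      }
    }
  }
  return results;
}
```

Note that an unknown org or UID yields an empty/partial map rather than an error, which is what the "respects org ID" test exercises.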
func (f *FakeProvisioningStore) SetProvenance(ctx context.Context, o models.Provisionable, org int64, p models.Provenance) error {
f.Calls = append(f.Calls, Call{MethodName: "SetProvenance", Arguments: []any{ctx, o, org, p}})
if f.SetProvenanceFunc != nil {
@@ -156,6 +156,8 @@ type UnifiedAlertingSettings struct {
// AlertmanagerMaxTemplateOutputSize specifies the maximum allowed size for rendered template output in bytes.
AlertmanagerMaxTemplateOutputSize int64
BacktestingMaxEvaluations int
}
type RecordingRuleSettings struct {
@@ -594,6 +596,11 @@ func (cfg *Cfg) ReadUnifiedAlertingSettings(iniFile *ini.File) error {
return fmt.Errorf("setting 'alertmanager_max_template_output_bytes' is invalid, only 0 or a positive integer are allowed")
}
uaCfg.BacktestingMaxEvaluations = ua.Key("backtesting_max_evaluations").MustInt(100)
if uaCfg.BacktestingMaxEvaluations < 0 {
uaCfg.BacktestingMaxEvaluations = 100
}
cfg.UnifiedAlerting = uaCfg
return nil
}
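The `backtesting_max_evaluations` parsing above clamps only negative values back to the default of 100 (zero is allowed through). A minimal sketch of that rule, with a hypothetical helper name:

```typescript
// Mirror of the backtesting_max_evaluations parsing logic: values below
// zero fall back to the default (100); zero and positive values pass through.
function parseBacktestingMaxEvaluations(raw: number, fallback = 100): number {
  return raw < 0 ? fallback : raw;
}
```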
@@ -68,7 +68,7 @@ func TestBacktesting(t *testing.T) {
require.Truef(t, ok, "The data file does not contain a field `data`")
status, body := apiCli.SubmitRuleForBacktesting(t, request)
require.Equal(t, http.StatusOK, status)
require.Equalf(t, http.StatusOK, status, "Response: %s", body)
var result data.Frame
require.NoErrorf(t, json.Unmarshal([]byte(body), &result), "cannot parse response to data frame")
})
@@ -107,6 +107,7 @@ func TestBacktesting(t *testing.T) {
resourcepermissions.SetResourcePermissionCommand{
Actions: []string{
accesscontrol.ActionAlertingRuleRead,
accesscontrol.ActionAlertingRuleUpdate,
},
Resource: "folders",
ResourceID: "*",
@@ -93,6 +93,7 @@
"is": ""
},
"required": true,
"protected": true,
"validationRule": "",
"secure": true,
"dependsOn": "",
@@ -225,6 +226,7 @@
"is": ""
},
"required": true,
"protected": true,
"validationRule": "",
"secure": true,
"dependsOn": "",
@@ -1300,6 +1302,7 @@
"is": ""
},
"required": true,
"protected": true,
"validationRule": "",
"secure": true,
"dependsOn": "",
@@ -1405,6 +1408,7 @@
"is": ""
},
"required": true,
"protected": true,
"validationRule": "",
"secure": false,
"dependsOn": "",
@@ -2476,6 +2480,7 @@
"is": ""
},
"required": true,
"protected": true,
"validationRule": "",
"secure": false,
"dependsOn": "",
@@ -2645,6 +2650,7 @@
"is": ""
},
"required": true,
"protected": true,
"validationRule": "",
"secure": false,
"dependsOn": "",
@@ -2935,6 +2941,7 @@
"is": ""
},
"required": true,
"protected": true,
"validationRule": "",
"secure": false,
"dependsOn": "",
@@ -3139,6 +3146,7 @@
"is": ""
},
"required": true,
"protected": true,
"validationRule": "",
"secure": false,
"dependsOn": "",
@@ -4405,6 +4413,7 @@
"is": ""
},
"required": false,
"protected": true,
"validationRule": "",
"secure": false,
"dependsOn": "",
@@ -5334,6 +5343,7 @@
"is": ""
},
"required": true,
"protected": true,
"validationRule": "",
"secure": false,
"dependsOn": "",
@@ -6630,6 +6640,7 @@
"is": ""
},
"required": true,
"protected": true,
"validationRule": "",
"secure": false,
"dependsOn": "",
@@ -6928,6 +6939,7 @@
"is": ""
},
"required": true,
"protected": true,
"validationRule": "",
"secure": true,
"dependsOn": "token",
@@ -6946,6 +6958,7 @@
"is": ""
},
"required": false,
"protected": true,
"validationRule": "",
"secure": false,
"dependsOn": "",
@@ -9237,6 +9250,7 @@
"is": ""
},
"required": true,
"protected": true,
"validationRule": "",
"secure": false,
"dependsOn": "",
@@ -11515,6 +11529,7 @@
"is": ""
},
"required": true,
"protected": true,
"validationRule": "",
"secure": true,
"dependsOn": "",
@@ -12308,6 +12323,7 @@
"is": ""
},
"required": false,
"protected": true,
"validationRule": "",
"secure": false,
"dependsOn": "",
@@ -13001,6 +13017,7 @@
"is": ""
},
"required": true,
"protected": true,
"validationRule": "",
"secure": false,
"dependsOn": "",
@@ -13443,6 +13460,7 @@
"is": ""
},
"required": true,
"protected": true,
"validationRule": "",
"secure": false,
"dependsOn": "",
@@ -13641,6 +13659,7 @@
"is": ""
},
"required": false,
"protected": true,
"validationRule": "",
"secure": false,
"dependsOn": "",
@@ -15072,6 +15091,7 @@
"is": ""
},
"required": true,
"protected": true,
"validationRule": "",
"secure": true,
"dependsOn": "secret",
@@ -12,6 +12,9 @@
},
"condition": "A",
"no_data_state": "Alerting",
"title": "test-rule-backtesting-data",
"rule_group": "test-group",
"namespace_uid": "test-namespace",
"data": [
{
"refId": "A",
@@ -193,6 +196,9 @@
},
"condition": "C",
"no_data_state": "Alerting",
"title": "test-rule-backtesting-data",
"rule_group": "test-group",
"namespace_uid": "test-namespace",
"data": [
{
"refId": "A",
@@ -1802,6 +1802,15 @@
}
}
},
{
"name": "facetLimit",
"in": "query",
"description": "maximum number of terms to return per facet (default 50, max 1000)",
"schema": {
"type": "integer",
"format": "int64"
}
},
{
"name": "tags",
"in": "query",
@@ -1,23 +1,29 @@
import { isArray, isPlainObject } from 'lodash';
import { isArray, isPlainObject, isString } from 'lodash';
/**
* @returns A deep clone of the object, but with any null value removed.
* @param value - The object to be cloned and cleaned.
* @param convertInfinity - If true, -Infinity or Infinity is converted to 0.
* This is because Infinity is not a valid JSON value, and sometimes we want to convert it to 0 instead of default null.
* @param stripBOMs - If true, strips Byte Order Mark (BOM) characters from all strings.
* BOMs (U+FEFF) can cause CUE validation errors ("illegal byte order mark").
*/
export function sortedDeepCloneWithoutNulls<T>(value: T, convertInfinity?: boolean): T {
export function sortedDeepCloneWithoutNulls<T>(value: T, convertInfinity?: boolean, stripBOMs?: boolean): T {
if (isArray(value)) {
return value.map((item) => sortedDeepCloneWithoutNulls(item, convertInfinity)) as unknown as T;
return value.map((item) => sortedDeepCloneWithoutNulls(item, convertInfinity, stripBOMs)) as unknown as T;
}
if (isPlainObject(value)) {
return Object.keys(value as { [key: string]: any })
.sort()
.reduce((acc: any, key) => {
const v = (value as any)[key];
let v = (value as any)[key];
// Remove null values
if (v != null) {
acc[key] = sortedDeepCloneWithoutNulls(v, convertInfinity);
// Strip BOMs from strings
if (stripBOMs && isString(v)) {
v = v.replace(/\ufeff/g, '');
}
acc[key] = sortedDeepCloneWithoutNulls(v, convertInfinity, stripBOMs);
}
if (convertInfinity && (v === Infinity || v === -Infinity)) {
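The BOM-stripping behavior added to `sortedDeepCloneWithoutNulls` can be sketched in isolation (the helper name here is illustrative, not from the patch): every string, at any nesting depth, has U+FEFF characters removed.

```typescript
// Recursively strip U+FEFF (Byte Order Mark) characters from every string in
// a value, mirroring the stripBOMs option added to sortedDeepCloneWithoutNulls.
function stripBOMsDeep(value: unknown): unknown {
  if (typeof value === 'string') {
    return value.replace(/\ufeff/g, '');
  }
  if (Array.isArray(value)) {
    return value.map(stripBOMsDeep);
  }
  if (value !== null && typeof value === 'object') {
    const out: Record<string, unknown> = {};
    for (const [key, v] of Object.entries(value)) {
      out[key] = stripBOMsDeep(v);
    }
    return out;
  }
  return value;
}
```

BOMs anywhere in a string (not just at the start) are removed, which matches the global `/\ufeff/g` replace in the patch.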
@@ -0,0 +1,50 @@
import { DataFrameJSON } from '@grafana/data';
import { AlertQuery, GrafanaAlertStateDecision, Labels } from 'app/types/unified-alerting-dto';
import { alertingApi } from './alertingApi';
/**
* Request body for the backtest API matching the BacktestConfig struct in the backend
*/
export interface BacktestRequest {
// Required time range fields
from: string; // ISO 8601 timestamp
to: string; // ISO 8601 timestamp
interval: string; // e.g., "1m", "5m"
// Required alert definition fields
condition: string;
data: AlertQuery[];
title: string;
no_data_state?: GrafanaAlertStateDecision;
exec_err_state?: GrafanaAlertStateDecision;
// Optional duration fields
for?: string;
keep_firing_for?: string;
// Optional metadata fields
labels?: Labels;
missing_series_evals_to_resolve?: number;
// Optional rule identification fields
uid?: string;
rule_group?: string;
namespace_uid?: string;
}
export const BACKTEST_URL = '/api/v1/rule/backtest';
export const backtestApi = alertingApi.injectEndpoints({
endpoints: (build) => ({
runBacktest: build.mutation<DataFrameJSON, BacktestRequest>({
query: (requestBody) => ({
url: BACKTEST_URL,
method: 'POST',
body: requestBody,
}),
}),
}),
});
export const { useRunBacktestMutation } = backtestApi;
@@ -0,0 +1,63 @@
import { useCallback, useState } from 'react';
import { TimeRange, rangeUtil } from '@grafana/data';
import { Trans, t } from '@grafana/i18n';
import { Button, Drawer, Dropdown, Menu, MenuItem } from '@grafana/ui';
import { RuleFormValues } from '../../types/rule-form';
import { BacktestPanel } from './BacktestPanel';
interface BacktestDropdownButtonProps {
ruleDefinition: RuleFormValues;
}
export function BacktestDropdownButton({ ruleDefinition }: BacktestDropdownButtonProps) {
const [isBacktestPanelOpen, setIsBacktestPanelOpen] = useState(false);
const [backtestTimeRange, setBacktestTimeRange] = useState<TimeRange>();
const handleTimeRangeSelect = useCallback((rawFrom: string) => {
const timeRange = rangeUtil.convertRawToRange({ from: rawFrom, to: 'now' });
setBacktestTimeRange(timeRange);
setIsBacktestPanelOpen(true);
}, []);
const handleCustomSelect = useCallback(() => {
setBacktestTimeRange(undefined);
setIsBacktestPanelOpen(true);
}, []);
return (
<>
<Dropdown
overlay={
<Menu>
<MenuItem
label={t('alerting.queryAndExpressionsStep.last15m', 'Last 15 minutes')}
onClick={() => handleTimeRangeSelect('now-15m')}
/>
<MenuItem
label={t('alerting.queryAndExpressionsStep.last1h', 'Last 1 hour')}
onClick={() => handleTimeRangeSelect('now-1h')}
/>
<MenuItem label={t('alerting.queryAndExpressionsStep.custom', 'Custom')} onClick={handleCustomSelect} />
</Menu>
}
>
<Button icon="bug" variant="secondary">
<Trans i18nKey="alerting.queryAndExpressionsStep.testRule">Test Rule</Trans>
</Button>
</Dropdown>
{isBacktestPanelOpen && (
<Drawer
title={t('alerting.backtest.panel-title', 'Rule Retroactive Testing')}
onClose={() => setIsBacktestPanelOpen(false)}
size="md"
>
<BacktestPanel ruleDefinition={ruleDefinition} initialTimeRange={backtestTimeRange} />
</Drawer>
)}
</>
);
}
@@ -0,0 +1,200 @@
import { css } from '@emotion/css';
import { fromPairs, isEmpty, isEqual } from 'lodash';
import { useCallback, useEffect, useRef, useState } from 'react';
import { AlertLabels } from '@grafana/alerting/unstable';
import { DataFrameJSON, GrafanaTheme2, TimeRange, rangeUtil } from '@grafana/data';
import { Trans, t } from '@grafana/i18n';
import {
Alert,
Icon,
LoadingPlaceholder,
RefreshPicker,
Stack,
Text,
TimeRangePicker,
Tooltip,
useStyles2,
} from '@grafana/ui';
import { useRunBacktestMutation } from '../../api/backtestApi';
import { RuleFormValues } from '../../types/rule-form';
import { combineMatcherStrings } from '../../utils/alertmanager';
import { messageFromError } from '../../utils/redux';
import { formValuesToRulerGrafanaRuleDTO } from '../../utils/rule-form';
import { LogRecordViewerByTimestamp } from '../rules/state-history/LogRecordViewer';
import { LogTimelineViewer } from '../rules/state-history/LogTimelineViewer';
import { useFrameSubset } from '../rules/state-history/LokiStateHistory';
import { useRuleHistoryRecords } from '../rules/state-history/useRuleHistoryRecords';
interface BacktestPanelProps {
ruleDefinition: RuleFormValues;
initialTimeRange?: TimeRange;
}
export function BacktestPanel({ ruleDefinition, initialTimeRange }: BacktestPanelProps) {
const styles = useStyles2(getStyles);
const [timeRange, setTimeRange] = useState<TimeRange>(
initialTimeRange || rangeUtil.convertRawToRange({ from: 'now-15m', to: 'now' })
);
const [stateHistory, setStateHistory] = useState<DataFrameJSON>();
const [instancesFilter, setInstancesFilter] = useState('');
const shouldRunInitialBacktest = useRef(!!initialTimeRange);
const [runBacktest, { isLoading, error: mutationError }] = useRunBacktestMutation();
const handleRunBacktest = useCallback(async () => {
// Convert form values to the proper AlertRule format
const alertRule = formValuesToRulerGrafanaRuleDTO(ruleDefinition);
// Build requestBody matching BacktestConfig struct
const requestBody = {
// Required time range fields
from: timeRange.from.toISOString(),
to: timeRange.to.toISOString(),
interval: ruleDefinition.evaluateEvery,
// Required alert definition fields
condition: alertRule.grafana_alert.condition,
data: alertRule.grafana_alert.data,
title: alertRule.grafana_alert.title,
no_data_state: alertRule.grafana_alert.no_data_state,
exec_err_state: alertRule.grafana_alert.exec_err_state,
// Optional duration fields
for: alertRule.for,
keep_firing_for: alertRule.keep_firing_for,
// Optional metadata fields
labels: alertRule.labels,
missing_series_evals_to_resolve: alertRule.grafana_alert.missing_series_evals_to_resolve,
// Optional rule identification fields
uid: alertRule.grafana_alert.uid,
rule_group: ruleDefinition.group,
namespace_uid: ruleDefinition.folder?.uid,
};
try {
const result = await runBacktest(requestBody).unwrap();
setStateHistory(result);
} catch (err) {
// Error is handled by RTK Query and available via mutationError
}
}, [ruleDefinition, timeRange, runBacktest]);
// Update time range when initialTimeRange prop changes
useEffect(() => {
if (initialTimeRange) {
setTimeRange(initialTimeRange);
}
}, [initialTimeRange]);
// Run backtest once after initial mount when timeRange is synchronized with initialTimeRange
useEffect(() => {
if (shouldRunInitialBacktest.current && initialTimeRange && isEqual(timeRange, initialTimeRange)) {
shouldRunInitialBacktest.current = false;
handleRunBacktest();
}
}, [initialTimeRange, timeRange, handleRunBacktest]);
const { dataFrames, historyRecords, commonLabels } = useRuleHistoryRecords(stateHistory, instancesFilter);
const { frameSubset, frameTimeRange } = useFrameSubset(dataFrames);
const onLogRecordLabelClick = useCallback(
(label: string) => {
const matcherString = combineMatcherStrings(instancesFilter, label);
setInstancesFilter(matcherString);
},
[instancesFilter]
);
const hasResults = stateHistory !== undefined;
const notices = stateHistory?.schema?.meta?.notices || [];
const errorMessage = mutationError ? messageFromError(mutationError) : null;
return (
<div>
<Stack direction="row" alignItems="flex-end" justifyContent="flex-end">
<TimeRangePicker
value={timeRange}
onChange={setTimeRange}
onChangeTimeZone={() => {}}
onMoveBackward={() => {}}
onMoveForward={() => {}}
onZoom={() => {}}
/>
<RefreshPicker
onRefresh={handleRunBacktest}
onIntervalChanged={() => {}}
isLoading={isLoading}
noIntervalPicker={true}
/>
</Stack>
<div className={styles.scrollableContent}>
{isLoading && <LoadingPlaceholder text={t('alerting.backtest.loading', 'Running backtest...')} />}
{errorMessage && (
<Alert title={t('alerting.backtest.error-title', 'Failed to run backtest')}>{errorMessage}</Alert>
)}
{!isLoading && !mutationError && hasResults && notices.length > 0 && (
<Stack direction="column" gap={1}>
{notices.map((notice, index) => (
<Alert key={index} severity={notice.severity || 'info'} title="">
{notice.text}
</Alert>
))}
</Stack>
)}
{!isLoading && !mutationError && hasResults && (
<div className={styles.resultsContainer}>
{!isEmpty(commonLabels) && (
<Stack gap={1} alignItems="center" wrap="wrap">
<Stack gap={0.5} alignItems="center" minWidth="fit-content">
<Text variant="bodySmall">
<Trans i18nKey="alerting.loki-state-history.common-labels">Common labels</Trans>
</Text>
<Tooltip
content={t(
'alerting.loki-state-history.tooltip-common-labels',
'Common labels are the ones attached to all of the alert instances'
)}
>
<Icon name="info-circle" size="sm" />
</Tooltip>
</Stack>
<AlertLabels labels={fromPairs(commonLabels)} size="sm" />
</Stack>
)}
<LogTimelineViewer frames={frameSubset} timeRange={frameTimeRange} />
<LogRecordViewerByTimestamp
records={historyRecords}
commonLabels={commonLabels}
onLabelClick={onLogRecordLabelClick}
/>
</div>
)}
</div>
</div>
);
}
const getStyles = (theme: GrafanaTheme2) => ({
scrollableContent: css({
flex: 1,
display: 'flex',
flexDirection: 'column',
paddingTop: theme.spacing(2),
overflow: 'hidden',
}),
resultsContainer: css({
display: 'flex',
flexDirection: 'column',
gap: theme.spacing(2),
flex: 1,
overflow: 'hidden',
}),
});
@@ -60,6 +60,7 @@ import {
formValuesToRulerRuleDTO,
} from '../../../utils/rule-form';
import { fromRulerRule, fromRulerRuleAndRuleGroupIdentifier } from '../../../utils/rule-id';
import { BacktestDropdownButton } from '../../backtesting/BacktestDropdownButton';
import { GrafanaRuleExporter } from '../../export/GrafanaRuleExporter';
import { AlertRuleNameAndMetric } from '../AlertRuleNameInput';
import AnnotationsStep from '../AnnotationsStep';
@@ -290,6 +291,8 @@ export const AlertRuleForm = ({ existing, prefill, isManualRestore }: Props) =>
<Trans i18nKey="alerting.alert-rule-form.action-buttons.edit-yaml">Edit YAML</Trans>
</Button>
)}
{config.featureToggles.alertingBacktesting && <BacktestDropdownButton ruleDefinition={watch()} />}
</Stack>
</Stack>
</div>
@@ -1,11 +1,7 @@
import { AnnotationEvent, DataFrame, toDataFrame } from '@grafana/data';
import { config, getBackendSrv } from '@grafana/runtime';
import { getBackendSrv } from '@grafana/runtime';
import { StateHistoryItem } from 'app/types/unified-alerting';
import { getAPINamespace } from '../../api/utils';
import { ScopedResourceClient } from '../apiserver/client';
import type { Resource, ResourceForCreate, ListOptions } from '../apiserver/types';
import { AnnotationTagsResponse } from './types';
export interface AnnotationServer {
@@ -51,190 +47,11 @@ class LegacyAnnotationServer implements AnnotationServer {
}
}
// K8s-style annotation spec based on the CUE definition
interface K8sAnnotationSpec {
text: string;
time: number;
timeEnd?: number;
dashboardUID?: string;
panelID?: number;
tags?: string[];
}
interface K8sAnnotationTagsResponse {
tags: Array<{ tag: string; count: number }>;
}
const K8S_ANNOTATION_API_CONFIG = {
group: 'annotation.grafana.app',
version: 'v0alpha1',
resource: 'annotations',
};
class K8sAnnotationServer implements AnnotationServer {
private client: ScopedResourceClient<K8sAnnotationSpec>;
constructor() {
this.client = new ScopedResourceClient<K8sAnnotationSpec>(K8S_ANNOTATION_API_CONFIG);
}
async query(params: Record<string, unknown>, requestId: string): Promise<DataFrame> {
const listOpts: ListOptions = {};
if (params.limit) {
listOpts.limit = Number(params.limit);
}
const fieldSelectors: string[] = [];
if (params.dashboardUID) {
fieldSelectors.push(`spec.dashboardUID=${params.dashboardUID}`);
}
if (params.panelId) {
fieldSelectors.push(`spec.panelID=${params.panelId}`);
}
if (params.from) {
fieldSelectors.push(`spec.time=${params.from}`);
}
if (params.to) {
fieldSelectors.push(`spec.timeEnd=${params.to}`);
}
if (fieldSelectors.length > 0) {
listOpts.fieldSelector = fieldSelectors.join(',');
}
const result = await this.client.list(listOpts);
let annotations = result.items.map((item: Resource<K8sAnnotationSpec>) => ({
id: item.metadata.name,
...item.spec,
panelId: item.spec.panelID,
}));
// Client-side tag filtering (tags are not in SelectableFields since they're in an array)
if (params.tags && Array.isArray(params.tags) && params.tags.length > 0) {
const tags = params.tags;
annotations = annotations.filter((anno) => {
if (!anno.tags || anno.tags.length === 0) {
return false;
}
return tags.every((tag) => anno.tags!.includes(tag));
});
}
return toDataFrame(annotations);
}
async forAlert(alertUID: string): Promise<StateHistoryItem[]> {
// For now, we filter client-side since label selector support for alertUID may not be implemented
const result = await this.client.list({
limit: 1000,
});
// Filter by tags that contain the alertUID
// Alert annotations typically have the alert UID in their tags
return result.items
.filter((item: Resource<K8sAnnotationSpec>) => {
// Check if any tag contains the alertUID
return item.spec.tags?.some((tag) => tag.includes(alertUID));
})
.map((item: Resource<K8sAnnotationSpec>) => ({
id: item.metadata.name,
...item.spec,
panelId: item.spec.panelID,
})) as any;
}
async save(annotation: AnnotationEvent): Promise<AnnotationEvent> {
const resource: ResourceForCreate<K8sAnnotationSpec> = {
metadata: {
name: '', // Will be auto-generated by the server
},
spec: {
text: annotation.text || '',
time: annotation.time || Date.now(),
timeEnd: annotation.timeEnd,
dashboardUID: annotation.dashboardUID ?? undefined,
panelID: annotation.panelId,
tags: annotation.tags,
},
};
const result = await this.client.create(resource);
return {
...annotation,
id: result.metadata.name,
...result.spec,
panelId: result.spec.panelID,
};
}
async update(annotation: AnnotationEvent): Promise<unknown> {
if (!annotation.id) {
throw new Error('Annotation ID is required for update');
}
// Get the existing resource to preserve metadata (especially resourceVersion)
const existing = await this.client.get(String(annotation.id));
// Update only the spec fields, preserve all metadata
const updated: Resource<K8sAnnotationSpec> = {
apiVersion: existing.apiVersion,
kind: existing.kind,
metadata: {
...existing.metadata,
// Preserve critical metadata fields for update
},
spec: {
text: annotation.text !== undefined ? annotation.text : existing.spec.text,
time: annotation.time !== undefined ? annotation.time : existing.spec.time,
timeEnd: annotation.timeEnd !== undefined ? annotation.timeEnd : existing.spec.timeEnd,
dashboardUID:
annotation.dashboardUID !== undefined && annotation.dashboardUID !== null
? annotation.dashboardUID
: existing.spec.dashboardUID,
panelID: annotation.panelId !== undefined ? annotation.panelId : existing.spec.panelID,
tags: annotation.tags !== undefined ? annotation.tags : existing.spec.tags,
},
};
return this.client.update(updated);
}
async delete(annotation: AnnotationEvent): Promise<unknown> {
if (!annotation.id) {
throw new Error('Annotation ID is required for delete');
}
return this.client.delete(String(annotation.id), false);
}
async tags(): Promise<Array<{ term: string; count: number }>> {
// Use the custom /tags route defined in the CUE manifest
const namespace = getAPINamespace();
const url = `/apis/${K8S_ANNOTATION_API_CONFIG.group}/${K8S_ANNOTATION_API_CONFIG.version}/namespaces/${namespace}/tags`;
const response = await getBackendSrv().get<K8sAnnotationTagsResponse>(url, { limit: 1000 });
return response.tags.map(({ tag, count }) => ({
term: tag,
count,
}));
}
}
let instance: AnnotationServer | null = null;
export function annotationServer(): AnnotationServer {
if (!instance) {
if (config.featureToggles.kubernetesAnnotations) {
instance = new K8sAnnotationServer();
} else {
instance = new LegacyAnnotationServer();
}
instance = new LegacyAnnotationServer();
}
return instance;
}
@@ -12,7 +12,7 @@ import {
} from '@grafana/data';
import { getPanelPlugin } from '@grafana/data/test';
import { setPluginImportUtils, setRunRequest } from '@grafana/runtime';
import { SceneCanvasText, SceneDataTransformer, SceneQueryRunner, VizPanel } from '@grafana/scenes';
import { SceneCanvasText, SceneDataTransformer, SceneGridLayout, SceneQueryRunner, VizPanel } from '@grafana/scenes';
import * as libpanels from 'app/features/library-panels/state/api';
import { getStandardTransformers } from 'app/features/transformers/standardTransformers';
@@ -183,6 +183,51 @@ describe('InspectJsonTab', () => {
expect(tab.state.onClose).toHaveBeenCalled();
});
it('Can update gridPos and forces layout re-render', async () => {
const { tab, panel, scene } = await buildTestScene();
// Get the layout manager and spy on the grid's forceRender
const layoutManager = scene.state.body as DefaultGridLayoutManager;
const grid = layoutManager.state.grid as SceneGridLayout;
const forceRenderSpy = jest.spyOn(grid, 'forceRender');
const originalGridItem = panel.parent as DashboardGridItem;
expect(originalGridItem.state.x).toBe(0);
expect(originalGridItem.state.y).toBe(0);
expect(originalGridItem.state.width).toBe(8);
expect(originalGridItem.state.height).toBe(10);
tab.onCodeEditorBlur(`{
"id": 12,
"type": "table",
"title": "Panel A",
"gridPos": {
"x": 5,
"y": 10,
"w": 12,
"h": 8
},
"options": {},
"fieldConfig": {},
"transformations": [],
"transparent": false
}`);
tab.onApplyChange();
const panel2 = findVizPanelByKey(scene, panel.state.key)!;
const gridItem = panel2.parent as DashboardGridItem;
// Verify all gridPos properties are updated
expect(gridItem.state.x).toBe(5);
expect(gridItem.state.y).toBe(10);
expect(gridItem.state.width).toBe(12);
expect(gridItem.state.height).toBe(8);
// Verify forceRender was called on the layout to apply position changes
expect(forceRenderSpy).toHaveBeenCalled();
});
it('Can show panel json for V2 dashboard specification', async () => {
const { tab } = await buildTestSceneWithV2Spec();
@@ -9,6 +9,7 @@ import {
SceneDataTransformer,
sceneGraph,
SceneGridItemStateLike,
SceneGridLayout,
SceneObjectBase,
SceneObjectRef,
SceneObjectState,
@@ -168,6 +169,12 @@ export class InspectJsonTab extends SceneObjectBase<InspectJsonTabState> {
panel.parent.setState(newState);
// Force the grid layout to re-render with the new positions
const layout = sceneGraph.getLayout(panel);
if (layout instanceof SceneGridLayout) {
layout.forceRender();
}
//Report relevant updates
reportPanelInspectInteraction(InspectTab.JSON, 'apply', {
panel_type_changed: panel.state.pluginId !== panelModel.type,
@@ -1,10 +1,10 @@
import { AdHocVariableModel, EventBusSrv, GroupByVariableModel, VariableModel } from '@grafana/data';
import { BackendSrv, config, setBackendSrv } from '@grafana/runtime';
import { GroupByVariable, sceneGraph } from '@grafana/scenes';
import { GroupByVariable, sceneGraph, SceneQueryRunner } from '@grafana/scenes';
import { AdHocFilterItem, PanelContext } from '@grafana/ui';
import { transformSaveModelToScene } from '../serialization/transformSaveModelToScene';
import { findVizPanelByKey } from '../utils/utils';
import { findVizPanelByKey, getQueryRunnerFor } from '../utils/utils';
import { getAdHocFilterVariableFor, setDashboardPanelContext } from './setDashboardPanelContext';
@@ -159,6 +159,23 @@ describe('setDashboardPanelContext', () => {
// Verify existing filter value updated
expect(variable.state.filters[1].operator).toBe('!=');
});
it('Should use existing adhoc filter when panel has no panel-level datasource because queries have all the same datasources (v2 behavior)', () => {
const { scene, context } = buildTestScene({ existingFilterVariable: true, panelDatasourceUndefined: true });
const variable = getAdHocFilterVariableFor(scene, { uid: 'my-ds-uid' });
variable.setState({ filters: [] });
context.onAddAdHocFilter!({ key: 'hello', value: 'world', operator: '=' });
// Should use the existing adhoc filter variable, not create a new one
expect(variable.state.filters).toEqual([{ key: 'hello', value: 'world', operator: '=' }]);
// Verify no new adhoc variables were created
const variables = sceneGraph.getVariables(scene);
const adhocVars = variables.state.variables.filter((v) => v.state.type === 'adhoc');
expect(adhocVars.length).toBe(1);
});
});
describe('getFiltersBasedOnGrouping', () => {
@@ -312,6 +329,7 @@ interface SceneOptions {
existingFilterVariable?: boolean;
existingGroupByVariable?: boolean;
groupByDatasourceUid?: string;
panelDatasourceUndefined?: boolean;
}
function buildTestScene(options: SceneOptions) {
@@ -385,6 +403,19 @@ function buildTestScene(options: SceneOptions) {
});
const vizPanel = findVizPanelByKey(scene, 'panel-4')!;
// Simulate v2 dashboard behavior where non-mixed panels don't have panel-level datasource
// but the queries have their own datasources
if (options.panelDatasourceUndefined) {
const queryRunner = getQueryRunnerFor(vizPanel);
if (queryRunner instanceof SceneQueryRunner) {
queryRunner.setState({
datasource: undefined,
queries: [{ refId: 'A', datasource: { uid: 'my-ds-uid', type: 'prometheus' } }],
});
}
}
const context: PanelContext = {
eventBus: new EventBusSrv(),
eventsScope: 'global',
@@ -6,7 +6,12 @@ import { AdHocFilterItem, PanelContext } from '@grafana/ui';
import { annotationServer } from 'app/features/annotations/api';
import { dashboardSceneGraph } from '../utils/dashboardSceneGraph';
import { getDashboardSceneFor, getPanelIdForVizPanel, getQueryRunnerFor } from '../utils/utils';
import {
getDashboardSceneFor,
getDatasourceFromQueryRunner,
getPanelIdForVizPanel,
getQueryRunnerFor,
} from '../utils/utils';
import { DashboardScene } from './DashboardScene';
@@ -121,7 +126,7 @@ export function setDashboardPanelContext(vizPanel: VizPanel, context: PanelConte
context.eventBus.publish(new AnnotationChangeEvent({ id }));
};
context.onAddAdHocFilter = (newFilter: AdHocFilterItem) => {
context.onAddAdHocFilter = async (newFilter: AdHocFilterItem) => {
const dashboard = getDashboardSceneFor(vizPanel);
const queryRunner = getQueryRunnerFor(vizPanel);
@@ -129,7 +134,19 @@ export function setDashboardPanelContext(vizPanel: VizPanel, context: PanelConte
return;
}
const filterVar = getAdHocFilterVariableFor(dashboard, queryRunner.state.datasource);
let datasource = getDatasourceFromQueryRunner(queryRunner);
// If the datasource reference is type-only (V2 schema queries may set only the group/type)
// we need to resolve it to a full datasource
if (datasource && !datasource.uid) {
const datasourceToLoad = await getDataSourceSrv().get(datasource);
datasource = {
uid: datasourceToLoad.uid,
type: datasourceToLoad.type,
};
}
const filterVar = getAdHocFilterVariableFor(dashboard, datasource);
updateAdHocFilterVariable(filterVar, newFilter);
};
@@ -141,7 +158,8 @@ export function setDashboardPanelContext(vizPanel: VizPanel, context: PanelConte
return [];
}
const groupByVar = getGroupByVariableFor(dashboard, queryRunner.state.datasource);
const datasource = getDatasourceFromQueryRunner(queryRunner);
const groupByVar = getGroupByVariableFor(dashboard, datasource);
if (!groupByVar) {
return [];
@@ -158,7 +176,7 @@ export function setDashboardPanelContext(vizPanel: VizPanel, context: PanelConte
.filter((item) => item !== undefined);
};
context.onAddAdHocFilters = (items: AdHocFilterItem[]) => {
context.onAddAdHocFilters = async (items: AdHocFilterItem[]) => {
const dashboard = getDashboardSceneFor(vizPanel);
const queryRunner = getQueryRunnerFor(vizPanel);
@@ -166,7 +184,18 @@ export function setDashboardPanelContext(vizPanel: VizPanel, context: PanelConte
return;
}
const filterVar = getAdHocFilterVariableFor(dashboard, queryRunner.state.datasource);
let datasource = getDatasourceFromQueryRunner(queryRunner);
// If the datasource reference is type-only (V2 schema queries may set only the group/type)
// we need to resolve it to a full datasource
if (datasource && !datasource.uid) {
const datasourceToLoad = await getDataSourceSrv().get(datasource);
datasource = {
uid: datasourceToLoad.uid,
type: datasourceToLoad.type,
};
}
const filterVar = getAdHocFilterVariableFor(dashboard, datasource);
bulkUpdateAdHocFiltersVariable(filterVar, items);
};
@@ -144,7 +144,8 @@ export function transformSceneToSaveModelSchemaV2(scene: DashboardScene, isSnaps
   try {
     // validateDashboardSchemaV2 will throw an error if the dashboard is not valid
     if (validateDashboardSchemaV2(dashboardSchemaV2)) {
-      return sortedDeepCloneWithoutNulls(dashboardSchemaV2, true);
+      // Strip BOMs from all strings to prevent CUE validation errors ("illegal byte order mark")
+      return sortedDeepCloneWithoutNulls(dashboardSchemaV2, true, true);
     }
     // should never reach this point, validation should throw an error
     throw new Error('Error we could transform the dashboard to schema v2: ' + dashboardSchemaV2);
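The behavior implied by the new third argument of `sortedDeepCloneWithoutNulls` can be sketched as a standalone helper. This is an approximation of the commit-message description ("recursively processes nested structures"), not the actual Grafana internals:

```typescript
// U+FEFF: interpreted as a BOM at the start of a file, and as a zero-width
// no-break space elsewhere; CUE rejects it either way.
const BOM = '\uFEFF';

// Minimal sketch (an assumption, not the real sortedDeepCloneWithoutNulls):
// recursively remove every U+FEFF from strings in a nested structure.
function stripBOMs(value: unknown): unknown {
  if (typeof value === 'string') {
    // Remove all BOMs, not just a leading one: they can be pasted mid-URL.
    return value.split(BOM).join('');
  }
  if (Array.isArray(value)) {
    return value.map(stripBOMs);
  }
  if (value !== null && typeof value === 'object') {
    const out: Record<string, unknown> = {};
    for (const [k, v] of Object.entries(value)) {
      out[k] = stripBOMs(v);
    }
    return out;
  }
  return value;
}
```

Splitting on the character rather than trimming a leading one matters here: BOMs copied from external tools often land in the middle of link URLs, where a leading-only strip would miss them.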
@@ -3,6 +3,8 @@ import { getDataSourceSrv } from '@grafana/runtime';
 import { AdHocFiltersVariable, GroupByVariable, sceneGraph, SceneObject, SceneQueryRunner } from '@grafana/scenes';
 import { DataSourceRef } from '@grafana/schema';

+import { getDatasourceFromQueryRunner } from './utils';
+
 export function verifyDrilldownApplicability(
   sourceObject: SceneObject,
   queriesDataSource: DataSourceRef | undefined,
@@ -26,7 +28,7 @@ export async function getDrilldownApplicability(
     return;
   }

-  const datasource = queryRunner.state.datasource;
+  const datasource = getDatasourceFromQueryRunner(queryRunner);
   const queries = queryRunner.state.data?.request?.targets;
   const ds = await getDataSourceSrv().get(datasource?.uid);
@@ -4,7 +4,7 @@ import { sceneGraph, VizPanel } from '@grafana/scenes';
 import { contextSrv } from 'app/core/services/context_srv';
 import { getExploreUrl } from 'app/core/utils/explore';

-import { getQueryRunnerFor } from './utils';
+import { getDatasourceFromQueryRunner, getQueryRunnerFor } from './utils';

 export function getViewPanelUrl(vizPanel: VizPanel) {
   return locationUtil.getUrlForPartial(locationService.getLocation(), {
@@ -27,10 +27,11 @@ export function tryGetExploreUrlForPanel(vizPanel: VizPanel): Promise<string | u
   }

   const timeRange = sceneGraph.getTimeRange(vizPanel);
+  const datasource = getDatasourceFromQueryRunner(queryRunner);

   return getExploreUrl({
     queries: queryRunner.state.queries,
-    dsRef: queryRunner.state.datasource,
+    dsRef: datasource,
     timeRange: timeRange.state.value,
     scopedVars: { __sceneObject: { value: vizPanel } },
     adhocFilters: queryRunner.state.data?.request?.filters,
@@ -1,4 +1,4 @@
-import { getDataSourceRef, IntervalVariableModel } from '@grafana/data';
+import { DataSourceRef, getDataSourceRef, IntervalVariableModel } from '@grafana/data';
 import { t } from '@grafana/i18n';
 import { config, getDataSourceSrv } from '@grafana/runtime';
 import {
@@ -237,6 +237,26 @@ export function getQueryRunnerFor(sceneObject: SceneObject | undefined): SceneQu
   return undefined;
 }

+/**
+ * Gets the datasource from a query runner.
+ * When no panel-level datasource is set, it means all queries use the same datasource,
+ * so we extract the datasource from the first query.
+ */
+export function getDatasourceFromQueryRunner(queryRunner: SceneQueryRunner): DataSourceRef | null | undefined {
+  // Panel-level datasource is set for mixed datasource panels
+  if (queryRunner.state.datasource) {
+    return queryRunner.state.datasource;
+  }
+
+  // No panel-level datasource means all queries share the same datasource
+  const firstQuery = queryRunner.state.queries?.[0];
+  if (firstQuery?.datasource) {
+    return firstQuery.datasource;
+  }
+
+  return undefined;
+}
+
 export function getDashboardSceneFor(sceneObject: SceneObject): DashboardScene {
   const root = sceneObject.getRoot();
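The fallback order implemented by `getDatasourceFromQueryRunner` can be exercised with simplified types. The state shape below is a reduced assumption (the real `SceneQueryRunner` state has many more fields), and `'-- Mixed --'` is used only as an illustrative uid:

```typescript
interface DataSourceRef {
  uid?: string;
  type?: string;
}

// Reduced sketch of SceneQueryRunner state; an assumption for illustration.
interface QueryRunnerState {
  datasource?: DataSourceRef;
  queries: Array<{ refId: string; datasource?: DataSourceRef }>;
}

// Same fallback order as the new utility above: a panel-level datasource
// (present on mixed-datasource panels) wins; otherwise all queries share one
// datasource, so take it from the first query.
function getDatasource(state: QueryRunnerState): DataSourceRef | undefined {
  if (state.datasource) {
    return state.datasource;
  }
  return state.queries?.[0]?.datasource;
}
```

This is the crux of the fix: previously callers read `queryRunner.state.datasource` directly, which is `undefined` for single-datasource panels in the V2 schema, breaking ad hoc filters, group-by, drilldowns, and the Explore link.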

Some files were not shown because too many files have changed in this diff.