Compare commits
11 Commits
| Author | SHA1 | Date |
|---|---|---|
| | 2d1f51d02f | |
| | 9f2f93b401 | |
| | 9e399e0b19 | |
| | 2f520454ae | |
| | 72f7bd3900 | |
| | ba416eab4e | |
| | 189d50d815 | |
| | 450eaba447 | |
| | 87f5d5e741 | |
| | f9eb5b7360 | |
| | 785c578e2f | |
@@ -270,7 +270,17 @@ Click **View in CloudWatch console** to interactively view, search, and analyze

### Query Log groups with OpenSearch SQL

When querying log groups with OpenSearch SQL, you **must** explicitly state the log group identifier or ARN in the `FROM` clause:
When querying log groups with OpenSearch SQL, you can use the `$__logGroups` macro to automatically reference log groups selected in the query editor's log group selector. This is the recommended approach as it allows you to manage log groups through the UI.

```sql
SELECT window.start, COUNT(*) AS exceptionCount
FROM $__logGroups
WHERE `@message` LIKE '%Exception%'
```

The `$__logGroups` macro expands to the proper `logGroups(logGroupIdentifier: [...])` syntax with the log groups you've selected in the UI.

Alternatively, you can manually specify a single log group directly in the `FROM` clause:

```sql
SELECT window.start, COUNT(*) AS exceptionCount
@@ -278,7 +288,7 @@ FROM `log_group`
WHERE `@message` LIKE '%Exception%'
```

or, when querying multiple log groups:
When querying multiple log groups, you **must** use the `logGroups(logGroupIdentifier: [...])` syntax:

```sql
SELECT window.start, COUNT(*) AS exceptionCount
@@ -286,6 +296,8 @@ FROM `logGroups( logGroupIdentifier: ['LogGroup1', 'LogGroup2'])`
WHERE `@message` LIKE '%Exception%'
```

To reference log groups in a monitoring account, use ARNs instead of LogGroup names.

You can also write queries returning time series data by using the [`stats` command](https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/CWL_Insights-Visualizing-Log-Data.html).
When making `stats` queries in [Explore](ref:explore), ensure you are in Metrics Explore mode.
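As a sketch of the monitoring-account case described above, such a query might pass a log group ARN inside `logGroups(...)`. The account ID, region, and log group name below are placeholders, not values from this change:

```sql
SELECT window.start, COUNT(*) AS exceptionCount
FROM `logGroups( logGroupIdentifier: ['arn:aws:logs:us-east-1:111122223333:log-group:LogGroup1'] )`
WHERE `@message` LIKE '%Exception%'
```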
@@ -4,7 +4,8 @@ comments: |

This file is used in the following visualizations: candlestick, heatmap, state timeline, status history, time series.

---

You can zoom the panel time range in and out, which, in turn, changes the dashboard time range.
You can pan the panel time range left and right, and zoom it in and out.
This, in turn, changes the dashboard time range.

**Zoom in** - Click and drag on the panel to zoom in on a particular time range.

@@ -16,4 +17,9 @@ For example, if the original time range is from 9:00 to 9:59, the time range cha

- Next range: 8:30 - 10:29
- Next range: 7:30 - 11:29

For screen recordings showing these interactions, refer to the [Panel overview documentation](https://grafana.com/docs/grafana/<GRAFANA_VERSION>/visualizations/panels-visualizations/panel-overview/#zoom-panel-time-range).
**Pan** - Click and drag the x-axis area of the panel to pan the time range.

The time range shifts by the distance you drag.
For example, if the original time range is from 9:00 to 9:59 and you drag 30 minutes to the right, the time range changes to 9:30 to 10:29.

For screen recordings showing these interactions, refer to the [Panel overview documentation](https://grafana.com/docs/grafana/<GRAFANA_VERSION>/visualizations/panels-visualizations/panel-overview/#pan-and-zoom-panel-time-range).
@@ -317,13 +317,16 @@ Click the **Copy time range to clipboard** icon to copy the current time range t

You can also copy and paste a time range using the keyboard shortcuts `t+c` and `t+v` respectively.

#### Zoom out (Cmd+Z or Ctrl+Z)
#### Zoom out

Click the **Zoom out** icon to view a larger time range in the dashboard or panel visualization.
- Click the **Zoom out** icon to view a larger time range in the dashboard or panel visualizations
- Double click on the panel graph area (time series family visualizations only)
- Type the `t-` keyboard shortcut

#### Zoom in (only applicable to graph visualizations)
#### Zoom in

Click and drag to select the time range in the visualization that you want to view.
- Click and drag horizontally in the panel graph area to select a time range (time series family visualizations only)
- Type the `t+` keyboard shortcut

#### Refresh dashboard
@@ -175,9 +175,10 @@ By hovering over a panel with the mouse you can use some shortcuts that will tar

- `pl`: Hide or show legend
- `pr`: Remove Panel

## Zoom panel time range
## Pan and zoom panel time range

You can zoom the panel time range in and out, which, in turn, changes the dashboard time range.
You can pan the panel time range left and right, and zoom it in and out.
This, in turn, changes the dashboard time range.

This feature is supported for the following visualizations:

@@ -191,7 +192,7 @@ This feature is supported for the following visualizations:

Click and drag on the panel to zoom in on a particular time range.

The following screen recordings show this interaction in the time series and x visualizations:
The following screen recordings show this interaction in the time series and candlestick visualizations:

Time series

@@ -211,7 +212,7 @@ For example, if the original time range is from 9:00 to 9:59, the time range cha

- Next range: 8:30 - 10:29
- Next range: 7:30 - 11:29

The following screen recordings demonstrate the preceding example in the time series and x visualizations:
The following screen recordings demonstrate the preceding example in the time series and heatmap visualizations:

Time series

@@ -221,6 +222,19 @@ Heatmap

{{< video-embed src="/media/docs/grafana/panels-visualizations/recording-heatmap-panel-time-zoom-out-mouse.mp4" >}}

### Pan

Click and drag the x-axis area of the panel to pan the time range.

The time range shifts by the distance you drag.
For example, if the original time range is from 9:00 to 9:59 and you drag 30 minutes to the right, the time range changes to 9:30 to 10:29.

The following screen recordings show this interaction in the time series visualization:

Time series

{{< video-embed src="/media/docs/grafana/panels-visualizations/recording-ts-time-pan-mouse.mp4" >}}

## Add a panel

To add a panel in a new dashboard, click **+ Add visualization** in the middle of the dashboard:
@@ -92,9 +92,9 @@ The data is converted as follows:

{{< figure src="/media/docs/grafana/panels-visualizations/screenshot-candles-volume-v11.6.png" max-width="750px" alt="A candlestick visualization showing the price movements of a specific asset." >}}

## Zoom panel time range
## Pan and zoom panel time range

{{< docs/shared lookup="visualizations/panel-zoom.md" source="grafana" version="<GRAFANA_VERSION>" >}}
{{< docs/shared lookup="visualizations/panel-pan-zoom.md" source="grafana" version="<GRAFANA_VERSION>" >}}

## Configuration options
@@ -79,9 +79,9 @@ The data is converted as follows:

{{< figure src="/static/img/docs/heatmap-panel/heatmap.png" max-width="1025px" alt="A heatmap visualization showing the random walk distribution over time" >}}

## Zoom panel time range
## Pan and zoom panel time range

{{< docs/shared lookup="visualizations/panel-zoom.md" source="grafana" version="<GRAFANA_VERSION>" >}}
{{< docs/shared lookup="visualizations/panel-pan-zoom.md" source="grafana" version="<GRAFANA_VERSION>" >}}

## Configuration options
@@ -93,9 +93,9 @@ You can also create a state timeline visualization using time series data. To do

![A state timeline collapses multiple rows of data into a single row](/media/docs/grafana/panels-visualizations/state-timeline-with-time-series.png)

## Zoom panel time range
## Pan and zoom panel time range

{{< docs/shared lookup="visualizations/panel-zoom.md" source="grafana" version="<GRAFANA_VERSION>" >}}
{{< docs/shared lookup="visualizations/panel-pan-zoom.md" source="grafana" version="<GRAFANA_VERSION>" >}}

## Configuration options
@@ -85,9 +85,9 @@ The data is converted as follows:

{{< figure src="/static/img/docs/status-history-panel/status_history.png" max-width="1025px" alt="A status history panel with two time columns showing the status of two servers" >}}

## Zoom panel time range
## Pan and zoom panel time range

{{< docs/shared lookup="visualizations/panel-zoom.md" source="grafana" version="<GRAFANA_VERSION>" >}}
{{< docs/shared lookup="visualizations/panel-pan-zoom.md" source="grafana" version="<GRAFANA_VERSION>" >}}

## Configuration options
@@ -167,9 +167,9 @@ The following example shows three series: Min, Max, and Value. The Min and Max s

{{< docs/shared lookup="visualizations/multiple-y-axes.md" source="grafana" version="<GRAFANA_VERSION>" leveloffset="+2" >}}

## Zoom panel time range
## Pan and zoom panel time range

{{< docs/shared lookup="visualizations/panel-zoom.md" source="grafana" version="<GRAFANA_VERSION>" >}}
{{< docs/shared lookup="visualizations/panel-pan-zoom.md" source="grafana" version="<GRAFANA_VERSION>" >}}

## Configuration options
@@ -77,14 +77,5 @@ export {
  getCorrelationsService,
  setCorrelationsService,
} from './services/CorrelationsService';
export {
  getDashboardMutationAPI,
  setDashboardMutationAPI,
  type DashboardMutationAPI,
  type MutationResult,
  type MutationChange,
  type MutationRequest,
  type MCPToolDefinition,
} from './services/dashboardMutationAPI';
export { getAppPluginVersion, isAppPluginInstalled } from './services/pluginMeta/apps';
export { useAppPluginInstalled, useAppPluginVersion } from './services/pluginMeta/hooks';
@@ -1,160 +0,0 @@
|
||||
/**
|
||||
* Dashboard Mutation API Service
|
||||
*
|
||||
* Provides a stable interface for programmatic dashboard modifications.
|
||||
*
|
||||
* The API is registered by DashboardScene when a dashboard is loaded and
|
||||
* cleared when the dashboard is deactivated.
|
||||
*/
|
||||
|
||||
/**
|
||||
* MCP Tool Definition - describes a tool that can be invoked
|
||||
* @see https://spec.modelcontextprotocol.io/specification/server/tools/
|
||||
*/
|
||||
export interface MCPToolDefinition {
|
||||
name: string;
|
||||
description: string;
|
||||
inputSchema: {
|
||||
type: 'object';
|
||||
properties: Record<string, unknown>;
|
||||
required?: string[];
|
||||
};
|
||||
annotations?: {
|
||||
title?: string;
|
||||
readOnlyHint?: boolean;
|
||||
destructiveHint?: boolean;
|
||||
idempotentHint?: boolean;
|
||||
confirmationHint?: boolean;
|
||||
};
|
||||
}
|
||||
|
||||
export interface MutationResult {
|
||||
success: boolean;
|
||||
/** ID of the affected panel (for panel operations) */
|
||||
panelId?: string;
|
||||
/** Error message if success is false */
|
||||
error?: string;
|
||||
/** List of changes made by the mutation */
|
||||
changes?: MutationChange[];
|
||||
/** Warnings (non-fatal issues) */
|
||||
warnings?: string[];
|
||||
/** Data returned by read-only operations (e.g., GET_DASHBOARD_INFO) */
|
||||
data?: unknown;
|
||||
}
|
||||
|
||||
export interface MutationChange {
|
||||
/** JSON path to the changed value */
|
||||
path: string;
|
||||
/** Value before the change */
|
||||
previousValue: unknown;
|
||||
/** Value after the change */
|
||||
newValue: unknown;
|
||||
}
|
||||
|
||||
export interface MutationRequest {
|
||||
/** Type of mutation (e.g., 'ADD_PANEL', 'REMOVE_PANEL', 'UPDATE_PANEL') */
|
||||
type: string;
|
||||
/** Payload specific to the mutation type */
|
||||
payload: unknown;
|
||||
}
|
||||
|
||||
/**
|
||||
* Dashboard info returned by getDashboardMutationAPI().getDashboardInfo()
|
||||
*/
|
||||
export interface DashboardMutationInfo {
|
||||
available: boolean;
|
||||
uid?: string;
|
||||
title?: string;
|
||||
canEdit: boolean;
|
||||
isEditing: boolean;
|
||||
availableTools: string[];
|
||||
}
|
||||
|
||||
export interface DashboardMutationAPI {
|
||||
/**
|
||||
* Execute a mutation on the dashboard
|
||||
*/
|
||||
execute(mutation: MutationRequest): Promise<MutationResult>;
|
||||
|
||||
/**
|
||||
* Check if the current user can edit the dashboard
|
||||
*/
|
||||
canEdit(): boolean;
|
||||
|
||||
/**
|
||||
* Get the UID of the currently loaded dashboard
|
||||
*/
|
||||
getDashboardUID(): string | undefined;
|
||||
|
||||
/**
|
||||
* Get the title of the currently loaded dashboard
|
||||
*/
|
||||
getDashboardTitle(): string | undefined;
|
||||
|
||||
/**
|
||||
* Check if the dashboard is in edit mode
|
||||
*/
|
||||
isEditing(): boolean;
|
||||
|
||||
/**
|
||||
* Enter edit mode if not already editing
|
||||
*/
|
||||
enterEditMode(): void;
|
||||
|
||||
/**
|
||||
* Get the available MCP tool definitions for this dashboard
|
||||
*/
|
||||
getTools(): MCPToolDefinition[];
|
||||
|
||||
/**
|
||||
* Get comprehensive dashboard info in a single call
|
||||
*/
|
||||
getDashboardInfo(): DashboardMutationInfo;
|
||||
}
|
||||
|
||||
// Singleton instance
|
||||
let _dashboardMutationAPI: DashboardMutationAPI | null = null;
|
||||
|
||||
// Expose on window for cross-bundle access (plugins use different bundle)
|
||||
declare global {
|
||||
interface Window {
|
||||
__grafanaDashboardMutationAPI?: DashboardMutationAPI | null;
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Set the dashboard mutation API instance.
|
||||
* Called by DashboardScene when a dashboard is activated.
|
||||
*
|
||||
* @param api - The mutation API instance, or null to clear
|
||||
* @internal
|
||||
*/
|
||||
export function setDashboardMutationAPI(api: DashboardMutationAPI | null): void {
|
||||
_dashboardMutationAPI = api;
|
||||
// Also expose on window for plugins that use a different @grafana/runtime bundle
|
||||
if (typeof window !== 'undefined') {
|
||||
window.__grafanaDashboardMutationAPI = api;
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Get the dashboard mutation API for the currently loaded dashboard.
|
||||
*
|
||||
* @returns The mutation API, or null if no dashboard is loaded
|
||||
*
|
||||
* @example
|
||||
* ```typescript
|
||||
* import { getDashboardMutationAPI } from '@grafana/runtime';
|
||||
*
|
||||
* const api = getDashboardMutationAPI();
|
||||
* if (api && api.canEdit()) {
|
||||
* await api.execute({
|
||||
* type: 'ADD_PANEL',
|
||||
* payload: { ... }
|
||||
* });
|
||||
* }
|
||||
* ```
|
||||
*/
|
||||
export function getDashboardMutationAPI(): DashboardMutationAPI | null {
|
||||
return _dashboardMutationAPI;
|
||||
}
|
||||
@@ -42,13 +42,3 @@ export {
export { setCurrentUser } from './user';
export { RuntimeDataSource } from './RuntimeDataSource';
export { ScopesContext, type ScopesContextValueState, type ScopesContextValue, useScopes } from './ScopesContext';
export {
  getDashboardMutationAPI,
  setDashboardMutationAPI,
  type DashboardMutationAPI,
  type DashboardMutationInfo,
  type MutationResult,
  type MutationChange,
  type MutationRequest,
  type MCPToolDefinition,
} from './dashboardMutationAPI';
@@ -117,6 +117,44 @@ export const MyComponent = () => {
};
```

### Custom Header Rendering

Column headers can be customized using strings, React elements, or renderer functions. The `header` property accepts any value that matches React Table's `Renderer` type.

**Important:** When using custom header content, prefer inline elements (like `<span>`) over block elements (like `<div>`) to avoid layout issues. Block-level elements can cause extra spacing and alignment problems in table headers because they disrupt the table's inline flow. Use `display: inline-flex` or `display: inline-block` when you need flexbox or block-like behavior.

```tsx
const columns: Array<Column<TableData>> = [
  // React element header
  {
    id: 'checkbox',
    header: (
      <>
        <label htmlFor="select-all" className="sr-only">
          Select all rows
        </label>
        <Checkbox id="select-all" />
      </>
    ),
    cell: () => <Checkbox aria-label="Select row" />,
  },

  // Function renderer header
  {
    id: 'firstName',
    header: () => (
      <span style={{ display: 'inline-flex', alignItems: 'center', gap: '8px' }}>
        <Icon name="user" size="sm" />
        <span>First Name</span>
      </span>
    ),
  },

  // String header
  { id: 'lastName', header: 'Last name' },
];
```

### Custom Cell Rendering

Individual cells can be rendered with custom content by defining a `cell` property on the column definition.
@@ -3,8 +3,11 @@ import { useCallback, useMemo, useState } from 'react';
import { CellProps } from 'react-table';

import { LinkButton } from '../Button/Button';
import { Checkbox } from '../Forms/Checkbox';
import { Field } from '../Forms/Field';
import { Icon } from '../Icon/Icon';
import { Input } from '../Input/Input';
import { Text } from '../Text/Text';

import { FetchDataArgs, InteractiveTable, InteractiveTableHeaderTooltip } from './InteractiveTable';
import mdx from './InteractiveTable.mdx';
@@ -297,4 +300,40 @@ export const WithControlledSort: StoryFn<typeof InteractiveTable> = (args) => {
  return <InteractiveTable {...args} data={data} pageSize={15} fetchData={fetchData} />;
};

export const WithCustomHeader: TableStoryObj = {
  args: {
    columns: [
      // React element header
      {
        id: 'checkbox',
        header: (
          <>
            <label htmlFor="select-all" className="sr-only">
              Select all rows
            </label>
            <Checkbox id="select-all" />
          </>
        ),
        cell: () => <Checkbox aria-label="Select row" />,
      },
      // Function renderer header
      {
        id: 'firstName',
        header: () => (
          <span style={{ display: 'inline-flex', alignItems: 'center', gap: '8px' }}>
            <Icon name="user" size="sm" />
            <Text element="span">First Name</Text>
          </span>
        ),
        sortType: 'string',
      },
      // String header
      { id: 'lastName', header: 'Last name', sortType: 'string' },
      { id: 'car', header: 'Car', sortType: 'string' },
      { id: 'age', header: 'Age', sortType: 'number' },
    ],
    data: pageableData.slice(0, 10),
    getRowId: (r) => r.id,
  },
};
export default meta;
@@ -2,6 +2,9 @@ import { render, screen, within } from '@testing-library/react';
import userEvent from '@testing-library/user-event';
import * as React from 'react';

import { Checkbox } from '../Forms/Checkbox';
import { Icon } from '../Icon/Icon';

import { InteractiveTable } from './InteractiveTable';
import { Column } from './types';
@@ -247,4 +250,104 @@ describe('InteractiveTable', () => {
      expect(fetchData).toHaveBeenCalledWith({ sortBy: [{ id: 'id', desc: false }] });
    });
  });

  describe('custom header rendering', () => {
    it('should render string headers', () => {
      const columns: Array<Column<TableData>> = [{ id: 'id', header: 'ID' }];
      const data: TableData[] = [{ id: '1', value: '1', country: 'Sweden' }];
      render(<InteractiveTable columns={columns} data={data} getRowId={getRowId} />);

      expect(screen.getByRole('columnheader', { name: 'ID' })).toBeInTheDocument();
    });

    it('should render React element headers', () => {
      const columns: Array<Column<TableData>> = [
        {
          id: 'checkbox',
          header: (
            <>
              <label htmlFor="select-all" className="sr-only">
                Select all rows
              </label>
              <Checkbox id="select-all" data-testid="header-checkbox" />
            </>
          ),
          cell: () => <Checkbox data-testid="cell-checkbox" aria-label="Select row" />,
        },
      ];
      const data: TableData[] = [{ id: '1', value: '1', country: 'Sweden' }];
      render(<InteractiveTable columns={columns} data={data} getRowId={getRowId} />);

      expect(screen.getByTestId('header-checkbox')).toBeInTheDocument();
      expect(screen.getByTestId('cell-checkbox')).toBeInTheDocument();
      expect(screen.getByLabelText('Select all rows')).toBeInTheDocument();
      expect(screen.getByLabelText('Select row')).toBeInTheDocument();
      expect(screen.getByText('Select all rows')).toBeInTheDocument();
    });

    it('should render function renderer headers', () => {
      const columns: Array<Column<TableData>> = [
        {
          id: 'firstName',
          header: () => (
            <span style={{ display: 'inline-flex', alignItems: 'center', gap: '8px' }}>
              <Icon name="user" size="sm" data-testid="header-icon" />
              <span>First Name</span>
            </span>
          ),
          sortType: 'string',
        },
      ];
      const data: TableData[] = [{ id: '1', value: '1', country: 'Sweden' }];
      render(<InteractiveTable columns={columns} data={data} getRowId={getRowId} />);

      expect(screen.getByTestId('header-icon')).toBeInTheDocument();
      expect(screen.getByRole('columnheader', { name: /first name/i })).toBeInTheDocument();
    });

    it('should render all header types together', () => {
      const columns: Array<Column<TableData>> = [
        {
          id: 'checkbox',
          header: (
            <>
              <label htmlFor="select-all" className="sr-only">
                Select all rows
              </label>
              <Checkbox id="select-all" data-testid="header-checkbox" />
            </>
          ),
          cell: () => <Checkbox aria-label="Select row" />,
        },
        {
          id: 'id',
          header: () => (
            <span style={{ display: 'inline-flex', alignItems: 'center', gap: '8px' }}>
              <Icon name="user" size="sm" data-testid="header-icon" />
              <span>ID</span>
            </span>
          ),
          sortType: 'string',
        },
        { id: 'country', header: 'Country', sortType: 'string' },
        { id: 'value', header: 'Value' },
      ];
      const data: TableData[] = [
        { id: '1', value: 'Value 1', country: 'Sweden' },
        { id: '2', value: 'Value 2', country: 'Norway' },
      ];
      render(<InteractiveTable columns={columns} data={data} getRowId={getRowId} />);

      expect(screen.getByTestId('header-checkbox')).toBeInTheDocument();
      expect(screen.getByTestId('header-icon')).toBeInTheDocument();
      expect(screen.getByRole('columnheader', { name: 'Country' })).toBeInTheDocument();
      expect(screen.getByRole('columnheader', { name: 'Value' })).toBeInTheDocument();

      // Verify data is rendered
      expect(screen.getByText('Sweden')).toBeInTheDocument();
      expect(screen.getByText('Norway')).toBeInTheDocument();
      expect(screen.getByText('Value 1')).toBeInTheDocument();
      expect(screen.getByText('Value 2')).toBeInTheDocument();
    });
  });
});
@@ -1,5 +1,5 @@
import { ReactNode } from 'react';
import { CellProps, DefaultSortTypes, IdType, SortByFn } from 'react-table';
import { CellProps, DefaultSortTypes, HeaderProps, IdType, Renderer, SortByFn } from 'react-table';

export interface Column<TableData extends object> {
  /**
@@ -11,9 +11,9 @@ export interface Column<TableData extends object> {
   */
  cell?: (props: CellProps<TableData>) => ReactNode;
  /**
   * Header name. if `undefined` the header will be empty. Useful for action columns.
   * Header name. Can be a string, renderer function, or undefined. If `undefined` the header will be empty. Useful for action columns.
   */
  header?: string;
  header?: Renderer<HeaderProps<TableData>>;
  /**
   * Column sort type. If `undefined` the column will not be sortable.
   * */
@@ -76,21 +76,27 @@ func (hs *HTTPServer) CreateDashboardSnapshot(c *contextmodel.ReqContext) {
		return
	}

	// Do not check permissions when the instance snapshot public mode is enabled
	if !hs.Cfg.SnapshotPublicMode {
		evaluator := ac.EvalAll(ac.EvalPermission(dashboards.ActionSnapshotsCreate), ac.EvalPermission(dashboards.ActionDashboardsRead, dashboards.ScopeDashboardsProvider.GetResourceScopeUID(cmd.Dashboard.GetNestedString("uid"))))
		if canSave, err := hs.AccessControl.Evaluate(c.Req.Context(), c.SignedInUser, evaluator); err != nil || !canSave {
			c.JsonApiErr(http.StatusForbidden, "forbidden", err)
			return
		}
	}

	dashboardsnapshots.CreateDashboardSnapshot(c, snapshot.SnapshotSharingOptions{
	cfg := snapshot.SnapshotSharingOptions{
		SnapshotsEnabled:     hs.Cfg.SnapshotEnabled,
		ExternalEnabled:      hs.Cfg.ExternalEnabled,
		ExternalSnapshotName: hs.Cfg.ExternalSnapshotName,
		ExternalSnapshotURL:  hs.Cfg.ExternalSnapshotUrl,
	}, cmd, hs.dashboardsnapshotsService)
}

	if hs.Cfg.SnapshotPublicMode {
		// Public mode: no user or dashboard validation needed
		dashboardsnapshots.CreateDashboardSnapshotPublic(c, cfg, cmd, hs.dashboardsnapshotsService)
		return
	}

	// Regular mode: check permissions
	evaluator := ac.EvalAll(ac.EvalPermission(dashboards.ActionSnapshotsCreate), ac.EvalPermission(dashboards.ActionDashboardsRead, dashboards.ScopeDashboardsProvider.GetResourceScopeUID(cmd.Dashboard.GetNestedString("uid"))))
	if canSave, err := hs.AccessControl.Evaluate(c.Req.Context(), c.SignedInUser, evaluator); err != nil || !canSave {
		c.JsonApiErr(http.StatusForbidden, "forbidden", err)
		return
	}

	dashboardsnapshots.CreateDashboardSnapshot(c, cfg, cmd, hs.dashboardsnapshotsService)
}

// GET /api/snapshots/:key
@@ -213,13 +219,6 @@ func (hs *HTTPServer) DeleteDashboardSnapshot(c *contextmodel.ReqContext) respon
		return response.Error(http.StatusUnauthorized, "OrgID mismatch", nil)
	}

	if queryResult.External {
		err := dashboardsnapshots.DeleteExternalDashboardSnapshot(queryResult.ExternalDeleteURL)
		if err != nil {
			return response.Error(http.StatusInternalServerError, "Failed to delete external dashboard", err)
		}
	}

	// Dashboard can be empty (creation error or external snapshot). This means that the mustInt here returns a 0,
	// which before RBAC would result in a dashboard which has no ACL. A dashboard without an ACL would fallback
	// to the user’s org role, which for editors and admins would essentially always be allowed here. With RBAC,
@@ -239,6 +238,13 @@ func (hs *HTTPServer) DeleteDashboardSnapshot(c *contextmodel.ReqContext) respon
		}
	}

	if queryResult.External {
		err := dashboardsnapshots.DeleteExternalDashboardSnapshot(queryResult.ExternalDeleteURL)
		if err != nil {
			return response.Error(http.StatusInternalServerError, "Failed to delete external dashboard", err)
		}
	}

	cmd := &dashboardsnapshots.DeleteDashboardSnapshotCommand{DeleteKey: queryResult.DeleteKey}

	if err := hs.dashboardsnapshotsService.DeleteDashboardSnapshot(c.Req.Context(), cmd); err != nil {
@@ -32,6 +32,8 @@ import (
var (
	logger = glog.New("data-proxy-log")
	client = newHTTPClient()

	errPluginProxyRouteAccessDenied = errors.New("plugin proxy route access denied")
)

type DataSourceProxy struct {
@@ -308,12 +310,21 @@ func (proxy *DataSourceProxy) validateRequest() error {
		if err != nil {
			return err
		}
		// issues/116273: When we have an empty input route (or input that becomes relative to "."), we do not want it
		// to be ".". This is because the `CleanRelativePath` function will never return "./" prefixes, and as such,
		// the common prefix we need is an empty string.
		if r1 == "." && proxy.proxyPath != "." {
			r1 = ""
		}
		if r2 == "." && route.Path != "." {
			r2 = ""
		}
		if !strings.HasPrefix(r1, r2) {
			continue
		}

		if !proxy.hasAccessToRoute(route) {
			return errors.New("plugin proxy route access denied")
			return errPluginProxyRouteAccessDenied
		}

		proxy.matchedRoute = route
@@ -673,6 +673,94 @@ func TestIntegrationDataSourceProxy_routeRule(t *testing.T) {
			runDatasourceAuthTest(t, secretsService, secretsStore, cfg, test)
		}
	})

	t.Run("Regression of 116273: Fallback routes should apply fallback route roles", func(t *testing.T) {
		for _, tc := range []struct {
			InputPath         string
			ConfigurationPath string
			ExpectError       bool
		}{
			{
				InputPath:         "api/v2/leak-ur-secrets",
				ConfigurationPath: "",
				ExpectError:       true,
			},
			{
				InputPath:         "",
				ConfigurationPath: "",
				ExpectError:       true,
			},
			{
				InputPath:         ".",
				ConfigurationPath: ".",
				ExpectError:       true,
			},
			{
				InputPath:         "",
				ConfigurationPath: ".",
				ExpectError:       false,
			},
			{
				InputPath:         "api",
				ConfigurationPath: ".",
				ExpectError:       false,
			},
		} {
			orEmptyStr := func(s string) string {
				if s == "" {
					return "<empty>"
				}
				return s
			}
			t.Run(
				fmt.Sprintf("with inputPath=%s, configurationPath=%s, expectError=%v",
					orEmptyStr(tc.InputPath), orEmptyStr(tc.ConfigurationPath), tc.ExpectError),
				func(t *testing.T) {
					ds := &datasources.DataSource{
						UID:      "dsUID",
						JsonData: simplejson.New(),
					}
					routes := []*plugins.Route{
						{
							Path:    tc.ConfigurationPath,
							ReqRole: org.RoleAdmin,
							Method:  "GET",
						},
						{
							Path:    tc.ConfigurationPath,
							ReqRole: org.RoleAdmin,
							Method:  "POST",
						},
						{
							Path:    tc.ConfigurationPath,
							ReqRole: org.RoleAdmin,
							Method:  "PUT",
						},
						{
							Path:    tc.ConfigurationPath,
							ReqRole: org.RoleAdmin,
							Method:  "DELETE",
						},
					}

					req, err := http.NewRequestWithContext(t.Context(), "GET", "http://localhost/"+tc.InputPath, nil)
					require.NoError(t, err, "failed to create HTTP request")
					ctx := &contextmodel.ReqContext{
						Context:      &web.Context{Req: req},
						SignedInUser: &user.SignedInUser{OrgRole: org.RoleViewer},
					}
					proxy, err := setupDSProxyTest(t, ctx, ds, routes, tc.InputPath)
					require.NoError(t, err, "failed to setup proxy test")
					err = proxy.validateRequest()
					if tc.ExpectError {
						require.ErrorIs(t, err, errPluginProxyRouteAccessDenied, "request was not denied due to access denied?")
					} else {
						require.NoError(t, err, "request was unexpectedly denied access")
					}
				},
			)
		}
	})
}

// test DataSourceProxy request handling.
@@ -36,6 +36,9 @@ var client = &http.Client{
	Transport: &http.Transport{Proxy: http.ProxyFromEnvironment},
}

// CreateDashboardSnapshot creates a snapshot when running Grafana in regular mode.
// It validates the user and dashboard exist before creating the snapshot.
// This mode supports both local and external snapshots.
func CreateDashboardSnapshot(c *contextmodel.ReqContext, cfg snapshot.SnapshotSharingOptions, cmd CreateDashboardSnapshotCommand, svc Service) {
	if !cfg.SnapshotsEnabled {
		c.JsonApiErr(http.StatusForbidden, "Dashboard Snapshots are disabled", nil)
@@ -43,6 +46,7 @@ func CreateDashboardSnapshot(c *contextmodel.ReqContext, cfg snapshot.SnapshotSh
	}

	uid := cmd.Dashboard.GetNestedString("uid")

	user, err := identity.GetRequester(c.Req.Context())
	if err != nil {
		c.JsonApiErr(http.StatusBadRequest, "missing user in context", nil)
@@ -59,21 +63,18 @@ func CreateDashboardSnapshot(c *contextmodel.ReqContext, cfg snapshot.SnapshotSh
		return
	}

	cmd.ExternalURL = ""
	cmd.OrgID = user.GetOrgID()
	cmd.UserID, _ = identity.UserIdentifier(user.GetID())

	if cmd.Name == "" {
		cmd.Name = "Unnamed snapshot"
	}

	var snapshotUrl string
	cmd.ExternalURL = ""
	cmd.OrgID = user.GetOrgID()
	cmd.UserID, _ = identity.UserIdentifier(user.GetID())
	originalDashboardURL, err := createOriginalDashboardURL(&cmd)
	if err != nil {
		c.JsonApiErr(http.StatusInternalServerError, "Invalid app URL", err)
		return
	}
	var snapshotURL string

	if cmd.External {
		// Handle external snapshot creation
		if !cfg.ExternalEnabled {
			c.JsonApiErr(http.StatusForbidden, "External dashboard creation is disabled", nil)
			return
@@ -85,40 +86,83 @@ func CreateDashboardSnapshot(c *contextmodel.ReqContext, cfg snapshot.SnapshotSh
			return
		}

		snapshotUrl = resp.Url
		cmd.Key = resp.Key
		cmd.DeleteKey = resp.DeleteKey
		cmd.ExternalURL = resp.Url
		cmd.ExternalDeleteURL = resp.DeleteUrl
		cmd.Dashboard = &common.Unstructured{}
		snapshotURL = resp.Url

		metrics.MApiDashboardSnapshotExternal.Inc()
	} else {
		cmd.Dashboard.SetNestedField(originalDashboardURL, "snapshot", "originalUrl")

		if cmd.Key == "" {
			var err error
			cmd.Key, err = util.GetRandomString(32)
			if err != nil {
				c.JsonApiErr(http.StatusInternalServerError, "Could not generate random string", err)
				return
			}
		// Handle local snapshot creation
		originalDashboardURL, err := createOriginalDashboardURL(&cmd)
		if err != nil {
			c.JsonApiErr(http.StatusInternalServerError, "Invalid app URL", err)
			return
		}

		if cmd.DeleteKey == "" {
			var err error
			cmd.DeleteKey, err = util.GetRandomString(32)
			if err != nil {
				c.JsonApiErr(http.StatusInternalServerError, "Could not generate random string", err)
				return
			}
		snapshotURL, err = prepareLocalSnapshot(&cmd, originalDashboardURL)
		if err != nil {
			c.JsonApiErr(http.StatusInternalServerError, "Could not generate random string", err)
			return
		}

		snapshotUrl = setting.ToAbsUrl("dashboard/snapshot/" + cmd.Key)

		metrics.MApiDashboardSnapshotCreate.Inc()
	}

	saveAndRespond(c, svc, cmd, snapshotURL)
}

// CreateDashboardSnapshotPublic creates a snapshot when running Grafana in public mode.
// In public mode, there is no user or dashboard information to validate.
// Only local snapshots are supported (external snapshots are not available).
func CreateDashboardSnapshotPublic(c *contextmodel.ReqContext, cfg snapshot.SnapshotSharingOptions, cmd CreateDashboardSnapshotCommand, svc Service) {
	if !cfg.SnapshotsEnabled {
		c.JsonApiErr(http.StatusForbidden, "Dashboard Snapshots are disabled", nil)
		return
	}

	if cmd.Name == "" {
		cmd.Name = "Unnamed snapshot"
	}

	snapshotURL, err := prepareLocalSnapshot(&cmd, "")
	if err != nil {
		c.JsonApiErr(http.StatusInternalServerError, "Could not generate random string", err)
		return
	}

	metrics.MApiDashboardSnapshotCreate.Inc()

	saveAndRespond(c, svc, cmd, snapshotURL)
}

// prepareLocalSnapshot prepares the command for a local snapshot and returns the snapshot URL.
func prepareLocalSnapshot(cmd *CreateDashboardSnapshotCommand, originalDashboardURL string) (string, error) {
	cmd.Dashboard.SetNestedField(originalDashboardURL, "snapshot", "originalUrl")

	if cmd.Key == "" {
		key, err := util.GetRandomString(32)
		if err != nil {
			return "", err
		}
		cmd.Key = key
	}

	if cmd.DeleteKey == "" {
		deleteKey, err := util.GetRandomString(32)
		if err != nil {
			return "", err
		}
		cmd.DeleteKey = deleteKey
	}

	return setting.ToAbsUrl("dashboard/snapshot/" + cmd.Key), nil
}

// saveAndRespond saves the snapshot and sends the response.
func saveAndRespond(c *contextmodel.ReqContext, svc Service, cmd CreateDashboardSnapshotCommand, snapshotURL string) {
	result, err := svc.CreateDashboardSnapshot(c.Req.Context(), &cmd)
	if err != nil {
		c.JsonApiErr(http.StatusInternalServerError, "Failed to create snapshot", err)
@@ -128,7 +172,7 @@ func CreateDashboardSnapshot(c *contextmodel.ReqContext, cfg snapshot.SnapshotSh
	c.JSON(http.StatusOK, snapshot.DashboardCreateResponse{
		Key: result.Key,
		DeleteKey: result.DeleteKey,
		URL: snapshotUrl,
		URL: snapshotURL,
		DeleteURL: setting.ToAbsUrl("api/snapshots-delete/" + result.DeleteKey),
	})
}

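The refactor above extracts the shared local-snapshot key handling into one `prepareLocalSnapshot` helper. Its fallback pattern (generate a random key only when the caller did not supply one) can be sketched in isolation; the names below are illustrative stand-ins, not Grafana's actual API:

```go
package main

import (
	"crypto/rand"
	"encoding/hex"
	"fmt"
)

// randomKey returns a hex string of n random bytes, standing in for
// Grafana's util.GetRandomString (hypothetical substitute).
func randomKey(n int) (string, error) {
	b := make([]byte, n)
	if _, err := rand.Read(b); err != nil {
		return "", err
	}
	return hex.EncodeToString(b), nil
}

// ensureKey fills key only when it is empty, mirroring how
// prepareLocalSnapshot treats cmd.Key and cmd.DeleteKey.
func ensureKey(key *string) error {
	if *key != "" {
		return nil // caller-supplied key wins
	}
	k, err := randomKey(16)
	if err != nil {
		return err
	}
	*key = k
	return nil
}

func main() {
	key, deleteKey := "", "preset-delete-key"
	_ = ensureKey(&key)       // empty: gets a fresh random key
	_ = ensureKey(&deleteKey) // preset: left untouched
	fmt.Println(len(key) == 32, deleteKey)
}
```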
@@ -20,40 +20,30 @@ import (
	"github.com/grafana/grafana/pkg/web"
)

func TestCreateDashboardSnapshot_DashboardNotFound(t *testing.T) {
	mockService := &MockService{}
	cfg := snapshot.SnapshotSharingOptions{
		SnapshotsEnabled: true,
		ExternalEnabled: false,
func createTestDashboard(t *testing.T) *common.Unstructured {
	t.Helper()
	dashboard := &common.Unstructured{}
	dashboardData := map[string]any{
		"uid": "test-dashboard-uid",
		"id": 123,
	}
	testUser := &user.SignedInUser{
	dashboardBytes, _ := json.Marshal(dashboardData)
	_ = json.Unmarshal(dashboardBytes, dashboard)
	return dashboard
}

func createTestUser() *user.SignedInUser {
	return &user.SignedInUser{
		UserID: 1,
		OrgID: 1,
		Login: "testuser",
		Name: "Test User",
		Email: "test@example.com",
	}
	dashboard := &common.Unstructured{}
	dashboardData := map[string]interface{}{
		"uid": "test-dashboard-uid",
		"id": 123,
	}
	dashboardBytes, _ := json.Marshal(dashboardData)
	_ = json.Unmarshal(dashboardBytes, dashboard)

	cmd := CreateDashboardSnapshotCommand{
		DashboardCreateCommand: snapshot.DashboardCreateCommand{
			Dashboard: dashboard,
			Name: "Test Snapshot",
		},
	}

	mockService.On("ValidateDashboardExists", mock.Anything, int64(1), "test-dashboard-uid").
		Return(dashboards.ErrDashboardNotFound)

	req, _ := http.NewRequest("POST", "/api/snapshots", nil)
	req = req.WithContext(identity.WithRequester(req.Context(), testUser))
}

func createReqContext(t *testing.T, req *http.Request, testUser *user.SignedInUser) (*contextmodel.ReqContext, *httptest.ResponseRecorder) {
	t.Helper()
	recorder := httptest.NewRecorder()
	ctx := &contextmodel.ReqContext{
		Context: &web.Context{
@@ -63,13 +53,319 @@ func TestCreateDashboardSnapshot_DashboardNotFound(t *testing.T) {
		SignedInUser: testUser,
		Logger: log.NewNopLogger(),
	}

	CreateDashboardSnapshot(ctx, cfg, cmd, mockService)

	mockService.AssertExpectations(t)
	assert.Equal(t, http.StatusBadRequest, recorder.Code)
	var response map[string]interface{}
	err := json.Unmarshal(recorder.Body.Bytes(), &response)
	require.NoError(t, err)
	assert.Equal(t, "Dashboard not found", response["message"])
	return ctx, recorder
}

// TestCreateDashboardSnapshot tests snapshot creation in regular mode (non-public instance).
// These tests cover scenarios when Grafana is running as a regular server with user authentication.
func TestCreateDashboardSnapshot(t *testing.T) {
	t.Run("should return error when dashboard not found", func(t *testing.T) {
		mockService := &MockService{}
		cfg := snapshot.SnapshotSharingOptions{
			SnapshotsEnabled: true,
			ExternalEnabled: false,
		}
		testUser := createTestUser()
		dashboard := createTestDashboard(t)

		cmd := CreateDashboardSnapshotCommand{
			DashboardCreateCommand: snapshot.DashboardCreateCommand{
				Dashboard: dashboard,
				Name: "Test Snapshot",
			},
		}

		mockService.On("ValidateDashboardExists", mock.Anything, int64(1), "test-dashboard-uid").
			Return(dashboards.ErrDashboardNotFound)

		req, _ := http.NewRequest("POST", "/api/snapshots", nil)
		req = req.WithContext(identity.WithRequester(req.Context(), testUser))
		ctx, recorder := createReqContext(t, req, testUser)

		CreateDashboardSnapshot(ctx, cfg, cmd, mockService)

		mockService.AssertExpectations(t)
		assert.Equal(t, http.StatusBadRequest, recorder.Code)
		var response map[string]any
		err := json.Unmarshal(recorder.Body.Bytes(), &response)
		require.NoError(t, err)
		assert.Equal(t, "Dashboard not found", response["message"])
	})

	t.Run("should create external snapshot when external is enabled", func(t *testing.T) {
		externalServer := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
			assert.Equal(t, "/api/snapshots", r.URL.Path)
			assert.Equal(t, "POST", r.Method)

			response := map[string]any{
				"key": "external-key",
				"deleteKey": "external-delete-key",
				"url": "https://external.example.com/dashboard/snapshot/external-key",
				"deleteUrl": "https://external.example.com/api/snapshots-delete/external-delete-key",
			}
			w.Header().Set("Content-Type", "application/json")
			_ = json.NewEncoder(w).Encode(response)
		}))
		defer externalServer.Close()

		mockService := NewMockService(t)
		cfg := snapshot.SnapshotSharingOptions{
			SnapshotsEnabled: true,
			ExternalEnabled: true,
			ExternalSnapshotURL: externalServer.URL,
		}
		testUser := createTestUser()
		dashboard := createTestDashboard(t)

		cmd := CreateDashboardSnapshotCommand{
			DashboardCreateCommand: snapshot.DashboardCreateCommand{
				Dashboard: dashboard,
				Name: "Test External Snapshot",
				External: true,
			},
		}

		mockService.On("ValidateDashboardExists", mock.Anything, int64(1), "test-dashboard-uid").
			Return(nil)
		mockService.On("CreateDashboardSnapshot", mock.Anything, mock.Anything).
			Return(&DashboardSnapshot{
				Key: "external-key",
				DeleteKey: "external-delete-key",
			}, nil)

		req, _ := http.NewRequest("POST", "/api/snapshots", nil)
		req = req.WithContext(identity.WithRequester(req.Context(), testUser))
		ctx, recorder := createReqContext(t, req, testUser)

		CreateDashboardSnapshot(ctx, cfg, cmd, mockService)

		mockService.AssertExpectations(t)
		assert.Equal(t, http.StatusOK, recorder.Code)

		var response map[string]any
		err := json.Unmarshal(recorder.Body.Bytes(), &response)
		require.NoError(t, err)
		assert.Equal(t, "external-key", response["key"])
		assert.Equal(t, "external-delete-key", response["deleteKey"])
		assert.Equal(t, "https://external.example.com/dashboard/snapshot/external-key", response["url"])
	})

	t.Run("should return forbidden when external is disabled", func(t *testing.T) {
		mockService := NewMockService(t)
		cfg := snapshot.SnapshotSharingOptions{
			SnapshotsEnabled: true,
			ExternalEnabled: false,
		}
		testUser := createTestUser()
		dashboard := createTestDashboard(t)

		cmd := CreateDashboardSnapshotCommand{
			DashboardCreateCommand: snapshot.DashboardCreateCommand{
				Dashboard: dashboard,
				Name: "Test External Snapshot",
				External: true,
			},
		}

		mockService.On("ValidateDashboardExists", mock.Anything, int64(1), "test-dashboard-uid").
			Return(nil)

		req, _ := http.NewRequest("POST", "/api/snapshots", nil)
		req = req.WithContext(identity.WithRequester(req.Context(), testUser))
		ctx, recorder := createReqContext(t, req, testUser)

		CreateDashboardSnapshot(ctx, cfg, cmd, mockService)

		mockService.AssertExpectations(t)
		assert.Equal(t, http.StatusForbidden, recorder.Code)

		var response map[string]any
		err := json.Unmarshal(recorder.Body.Bytes(), &response)
		require.NoError(t, err)
		assert.Equal(t, "External dashboard creation is disabled", response["message"])
	})

	t.Run("should create local snapshot", func(t *testing.T) {
		mockService := NewMockService(t)
		cfg := snapshot.SnapshotSharingOptions{
			SnapshotsEnabled: true,
		}
		testUser := createTestUser()
		dashboard := createTestDashboard(t)

		cmd := CreateDashboardSnapshotCommand{
			DashboardCreateCommand: snapshot.DashboardCreateCommand{
				Dashboard: dashboard,
				Name: "Test Local Snapshot",
			},
			Key: "local-key",
			DeleteKey: "local-delete-key",
		}

		mockService.On("ValidateDashboardExists", mock.Anything, int64(1), "test-dashboard-uid").
			Return(nil)
		mockService.On("CreateDashboardSnapshot", mock.Anything, mock.Anything).
			Return(&DashboardSnapshot{
				Key: "local-key",
				DeleteKey: "local-delete-key",
			}, nil)

		req, _ := http.NewRequest("POST", "/api/snapshots", nil)
		req = req.WithContext(identity.WithRequester(req.Context(), testUser))
		ctx, recorder := createReqContext(t, req, testUser)

		CreateDashboardSnapshot(ctx, cfg, cmd, mockService)

		mockService.AssertExpectations(t)
		assert.Equal(t, http.StatusOK, recorder.Code)

		var response map[string]any
		err := json.Unmarshal(recorder.Body.Bytes(), &response)
		require.NoError(t, err)
		assert.Equal(t, "local-key", response["key"])
		assert.Equal(t, "local-delete-key", response["deleteKey"])
		assert.Contains(t, response["url"], "dashboard/snapshot/local-key")
		assert.Contains(t, response["deleteUrl"], "api/snapshots-delete/local-delete-key")
	})
}

// TestCreateDashboardSnapshotPublic tests snapshot creation in public mode.
// These tests cover scenarios when Grafana is running as a public snapshot server
// where no user authentication or dashboard validation is required.
func TestCreateDashboardSnapshotPublic(t *testing.T) {
	t.Run("should create local snapshot without user context", func(t *testing.T) {
		mockService := NewMockService(t)
		cfg := snapshot.SnapshotSharingOptions{
			SnapshotsEnabled: true,
		}
		dashboard := createTestDashboard(t)

		cmd := CreateDashboardSnapshotCommand{
			DashboardCreateCommand: snapshot.DashboardCreateCommand{
				Dashboard: dashboard,
				Name: "Test Snapshot",
			},
			Key: "test-key",
			DeleteKey: "test-delete-key",
		}

		mockService.On("CreateDashboardSnapshot", mock.Anything, mock.Anything).
			Return(&DashboardSnapshot{
				Key: "test-key",
				DeleteKey: "test-delete-key",
			}, nil)

		req, _ := http.NewRequest("POST", "/api/snapshots", nil)
		recorder := httptest.NewRecorder()
		ctx := &contextmodel.ReqContext{
			Context: &web.Context{
				Req: req,
				Resp: web.NewResponseWriter("POST", recorder),
			},
			Logger: log.NewNopLogger(),
		}

		CreateDashboardSnapshotPublic(ctx, cfg, cmd, mockService)

		mockService.AssertExpectations(t)
		assert.Equal(t, http.StatusOK, recorder.Code)

		var response map[string]any
		err := json.Unmarshal(recorder.Body.Bytes(), &response)
		require.NoError(t, err)
		assert.Equal(t, "test-key", response["key"])
		assert.Equal(t, "test-delete-key", response["deleteKey"])
		assert.Contains(t, response["url"], "dashboard/snapshot/test-key")
		assert.Contains(t, response["deleteUrl"], "api/snapshots-delete/test-delete-key")
	})

	t.Run("should return forbidden when snapshots are disabled", func(t *testing.T) {
		mockService := NewMockService(t)
		cfg := snapshot.SnapshotSharingOptions{
			SnapshotsEnabled: false,
		}
		dashboard := createTestDashboard(t)

		cmd := CreateDashboardSnapshotCommand{
			DashboardCreateCommand: snapshot.DashboardCreateCommand{
				Dashboard: dashboard,
				Name: "Test Snapshot",
			},
		}

		req, _ := http.NewRequest("POST", "/api/snapshots", nil)
		recorder := httptest.NewRecorder()
		ctx := &contextmodel.ReqContext{
			Context: &web.Context{
				Req: req,
				Resp: web.NewResponseWriter("POST", recorder),
			},
			Logger: log.NewNopLogger(),
		}

		CreateDashboardSnapshotPublic(ctx, cfg, cmd, mockService)

		assert.Equal(t, http.StatusForbidden, recorder.Code)

		var response map[string]any
		err := json.Unmarshal(recorder.Body.Bytes(), &response)
		require.NoError(t, err)
		assert.Equal(t, "Dashboard Snapshots are disabled", response["message"])
	})
}

// TestDeleteExternalDashboardSnapshot tests deletion of external snapshots.
// This function is called in public mode and doesn't require user context.
func TestDeleteExternalDashboardSnapshot(t *testing.T) {
	t.Run("should return nil on successful deletion", func(t *testing.T) {
		server := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
			assert.Equal(t, "GET", r.Method)
			w.WriteHeader(http.StatusOK)
		}))
		defer server.Close()

		err := DeleteExternalDashboardSnapshot(server.URL)
		assert.NoError(t, err)
	})

	t.Run("should gracefully handle already deleted snapshot", func(t *testing.T) {
		server := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
			w.WriteHeader(http.StatusInternalServerError)
			response := map[string]any{
				"message": "Failed to get dashboard snapshot",
			}
			_ = json.NewEncoder(w).Encode(response)
		}))
		defer server.Close()

		err := DeleteExternalDashboardSnapshot(server.URL)
		assert.NoError(t, err)
	})

	t.Run("should return error on unexpected status code", func(t *testing.T) {
		server := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
			w.WriteHeader(http.StatusNotFound)
		}))
		defer server.Close()

		err := DeleteExternalDashboardSnapshot(server.URL)
		assert.Error(t, err)
		assert.Contains(t, err.Error(), "unexpected response when deleting external snapshot")
		assert.Contains(t, err.Error(), "404")
	})

	t.Run("should return error on 500 with different message", func(t *testing.T) {
		server := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
			w.WriteHeader(http.StatusInternalServerError)
			response := map[string]any{
				"message": "Some other error",
			}
			_ = json.NewEncoder(w).Encode(response)
		}))
		defer server.Close()

		err := DeleteExternalDashboardSnapshot(server.URL)
		assert.Error(t, err)
		assert.Contains(t, err.Error(), "500")
	})
}

@@ -14,6 +14,7 @@ import (
	"github.com/grafana/grafana/pkg/apimachinery/validation"
	"github.com/grafana/grafana/pkg/storage/unified/sql/db"
	"github.com/grafana/grafana/pkg/storage/unified/sql/dbutil"
	"github.com/grafana/grafana/pkg/storage/unified/sql/rvmanager"
	"github.com/grafana/grafana/pkg/storage/unified/sql/sqltemplate"
	gocache "github.com/patrickmn/go-cache"
)
@@ -868,10 +869,18 @@ func (d *dataStore) applyBackwardsCompatibleChanges(ctx context.Context, tx db.T
	if key.Action == DataActionDeleted {
		generation = 0
	}

	// In compatibility mode, the previous RV, when available, is saved as a microsecond
	// timestamp, as is done in the SQL backend.
	previousRV := event.PreviousRV
	if event.PreviousRV > 0 && isSnowflake(event.PreviousRV) {
		previousRV = rvmanager.RVFromSnowflake(event.PreviousRV)
	}

	_, err := dbutil.Exec(ctx, tx, sqlKVUpdateLegacyResourceHistory, sqlKVLegacyUpdateHistoryRequest{
		SQLTemplate: sqltemplate.New(kv.dialect),
		GUID: key.GUID,
		PreviousRV: event.PreviousRV,
		PreviousRV: previousRV,
		Generation: generation,
	})

@@ -900,7 +909,7 @@ func (d *dataStore) applyBackwardsCompatibleChanges(ctx context.Context, tx db.T
		Name: key.Name,
		Action: action,
		Folder: key.Folder,
		PreviousRV: event.PreviousRV,
		PreviousRV: previousRV,
	})

	if err != nil {
@@ -916,7 +925,7 @@ func (d *dataStore) applyBackwardsCompatibleChanges(ctx context.Context, tx db.T
		Name: key.Name,
		Action: action,
		Folder: key.Folder,
		PreviousRV: event.PreviousRV,
		PreviousRV: previousRV,
	})

	if err != nil {
@@ -938,3 +947,15 @@ func (d *dataStore) applyBackwardsCompatibleChanges(ctx context.Context, tx db.T

	return nil
}

// isSnowflake returns whether the argument passed is a snowflake ID (new) or a microsecond timestamp (old).
// We try to interpret the number as a microsecond timestamp first. If it represents a time in the past,
// it is considered a microsecond timestamp. Snowflake IDs are much larger integers and would lead
// to dates in the future if interpreted as a microsecond timestamp.
func isSnowflake(rv int64) bool {
	ts := time.UnixMicro(rv)
	oneHourFromNow := time.Now().Add(time.Hour)
	isMicroSecRV := ts.Before(oneHourFromNow)

	return !isMicroSecRV
}

@@ -19,13 +19,18 @@ const (
	defaultBufferSize = 10000
)

type notifier struct {
type notifier interface {
	Watch(context.Context, watchOptions) <-chan Event
}

type pollingNotifier struct {
	eventStore *eventStore
	log logging.Logger
}

type notifierOptions struct {
	log logging.Logger
	log logging.Logger
	useChannelNotifier bool
}

type watchOptions struct {
@@ -44,15 +49,26 @@ func defaultWatchOptions() watchOptions {
	}
}

func newNotifier(eventStore *eventStore, opts notifierOptions) *notifier {
func newNotifier(eventStore *eventStore, opts notifierOptions) notifier {
	if opts.log == nil {
		opts.log = &logging.NoOpLogger{}
	}
	return &notifier{eventStore: eventStore, log: opts.log}

	if opts.useChannelNotifier {
		return &channelNotifier{}
	}

	return &pollingNotifier{eventStore: eventStore, log: opts.log}
}

type channelNotifier struct{}

func (cn *channelNotifier) Watch(ctx context.Context, opts watchOptions) <-chan Event {
	return nil
}

// Return the last resource version from the event store
func (n *notifier) lastEventResourceVersion(ctx context.Context) (int64, error) {
func (n *pollingNotifier) lastEventResourceVersion(ctx context.Context) (int64, error) {
	e, err := n.eventStore.LastEventKey(ctx)
	if err != nil {
		return 0, err
@@ -60,11 +76,11 @@ func (n *notifier) lastEventResourceVersion(ctx context.Context) (int64, error)
	return e.ResourceVersion, nil
}

func (n *notifier) cacheKey(evt Event) string {
func (n *pollingNotifier) cacheKey(evt Event) string {
	return fmt.Sprintf("%s~%s~%s~%s~%d", evt.Namespace, evt.Group, evt.Resource, evt.Name, evt.ResourceVersion)
}

func (n *notifier) Watch(ctx context.Context, opts watchOptions) <-chan Event {
func (n *pollingNotifier) Watch(ctx context.Context, opts watchOptions) <-chan Event {
	if opts.MinBackoff <= 0 {
		opts.MinBackoff = defaultMinBackoff
	}

@@ -13,7 +13,7 @@ import (
	"github.com/stretchr/testify/require"
)

func setupTestNotifier(t *testing.T) (*notifier, *eventStore) {
func setupTestNotifier(t *testing.T) (*pollingNotifier, *eventStore) {
	db := setupTestBadgerDB(t)
	t.Cleanup(func() {
		err := db.Close()
@@ -22,10 +22,10 @@ func setupTestNotifier(t *testing.T) (*notifier, *eventStore) {
	kv := NewBadgerKV(db)
	eventStore := newEventStore(kv)
	notifier := newNotifier(eventStore, notifierOptions{log: &logging.NoOpLogger{}})
	return notifier, eventStore
	return notifier.(*pollingNotifier), eventStore
}

func setupTestNotifierSqlKv(t *testing.T) (*notifier, *eventStore) {
func setupTestNotifierSqlKv(t *testing.T) (*pollingNotifier, *eventStore) {
	dbstore := db.InitTestDB(t)
	eDB, err := dbimpl.ProvideResourceDB(dbstore, setting.NewCfg(), nil)
	require.NoError(t, err)
@@ -33,7 +33,7 @@ func setupTestNotifierSqlKv(t *testing.T) (*notifier, *eventStore) {
	require.NoError(t, err)
	eventStore := newEventStore(kv)
	notifier := newNotifier(eventStore, notifierOptions{log: &logging.NoOpLogger{}})
	return notifier, eventStore
	return notifier.(*pollingNotifier), eventStore
}

func TestNewNotifier(t *testing.T) {
@@ -49,7 +49,7 @@ func TestDefaultWatchOptions(t *testing.T) {
	assert.Equal(t, defaultBufferSize, opts.BufferSize)
}

func runNotifierTestWith(t *testing.T, storeName string, newStoreFn func(*testing.T) (*notifier, *eventStore), testFn func(*testing.T, context.Context, *notifier, *eventStore)) {
func runNotifierTestWith(t *testing.T, storeName string, newStoreFn func(*testing.T) (*pollingNotifier, *eventStore), testFn func(*testing.T, context.Context, *pollingNotifier, *eventStore)) {
	t.Run(storeName, func(t *testing.T) {
		ctx := context.Background()
		notifier, eventStore := newStoreFn(t)
@@ -62,7 +62,7 @@ func TestNotifier_lastEventResourceVersion(t *testing.T) {
	runNotifierTestWith(t, "sqlkv", setupTestNotifierSqlKv, testNotifierLastEventResourceVersion)
}

func testNotifierLastEventResourceVersion(t *testing.T, ctx context.Context, notifier *notifier, eventStore *eventStore) {
func testNotifierLastEventResourceVersion(t *testing.T, ctx context.Context, notifier *pollingNotifier, eventStore *eventStore) {
	// Test with no events
	rv, err := notifier.lastEventResourceVersion(ctx)
	assert.Error(t, err)
@@ -113,7 +113,7 @@ func TestNotifier_cachekey(t *testing.T) {
	runNotifierTestWith(t, "sqlkv", setupTestNotifierSqlKv, testNotifierCachekey)
}

func testNotifierCachekey(t *testing.T, ctx context.Context, notifier *notifier, eventStore *eventStore) {
func testNotifierCachekey(t *testing.T, ctx context.Context, notifier *pollingNotifier, eventStore *eventStore) {
	tests := []struct {
		name string
		event Event
@@ -167,7 +167,7 @@ func TestNotifier_Watch_NoEvents(t *testing.T) {
	runNotifierTestWith(t, "sqlkv", setupTestNotifierSqlKv, testNotifierWatchNoEvents)
}

func testNotifierWatchNoEvents(t *testing.T, ctx context.Context, notifier *notifier, eventStore *eventStore) {
func testNotifierWatchNoEvents(t *testing.T, ctx context.Context, notifier *pollingNotifier, eventStore *eventStore) {
	ctx, cancel := context.WithTimeout(ctx, 500*time.Millisecond)
	defer cancel()

@@ -208,7 +208,7 @@ func TestNotifier_Watch_WithExistingEvents(t *testing.T) {
	runNotifierTestWith(t, "sqlkv", setupTestNotifierSqlKv, testNotifierWatchWithExistingEvents)
}

func testNotifierWatchWithExistingEvents(t *testing.T, ctx context.Context, notifier *notifier, eventStore *eventStore) {
func testNotifierWatchWithExistingEvents(t *testing.T, ctx context.Context, notifier *pollingNotifier, eventStore *eventStore) {
	ctx, cancel := context.WithTimeout(ctx, 2*time.Second)
	defer cancel()

@@ -282,7 +282,7 @@ func TestNotifier_Watch_EventDeduplication(t *testing.T) {
	runNotifierTestWith(t, "sqlkv", setupTestNotifierSqlKv, testNotifierWatchEventDeduplication)
}

func testNotifierWatchEventDeduplication(t *testing.T, ctx context.Context, notifier *notifier, eventStore *eventStore) {
func testNotifierWatchEventDeduplication(t *testing.T, ctx context.Context, notifier *pollingNotifier, eventStore *eventStore) {
	ctx, cancel := context.WithTimeout(ctx, 2*time.Second)
	defer cancel()

@@ -348,7 +348,7 @@ func TestNotifier_Watch_ContextCancellation(t *testing.T) {
	runNotifierTestWith(t, "sqlkv", setupTestNotifierSqlKv, testNotifierWatchContextCancellation)
}

func testNotifierWatchContextCancellation(t *testing.T, ctx context.Context, notifier *notifier, eventStore *eventStore) {
func testNotifierWatchContextCancellation(t *testing.T, ctx context.Context, notifier *pollingNotifier, eventStore *eventStore) {
	ctx, cancel := context.WithCancel(ctx)

	// Add an initial event so that lastEventResourceVersion doesn't return ErrNotFound
@@ -394,7 +394,7 @@ func TestNotifier_Watch_MultipleEvents(t *testing.T) {
	runNotifierTestWith(t, "sqlkv", setupTestNotifierSqlKv, testNotifierWatchMultipleEvents)
}

func testNotifierWatchMultipleEvents(t *testing.T, ctx context.Context, notifier *notifier, eventStore *eventStore) {
func testNotifierWatchMultipleEvents(t *testing.T, ctx context.Context, notifier *pollingNotifier, eventStore *eventStore) {
	ctx, cancel := context.WithTimeout(ctx, 3*time.Second)
	defer cancel()
	rv := time.Now().UnixNano()
@@ -456,33 +456,27 @@ func testNotifierWatchMultipleEvents(t *testing.T, ctx context.Context, notifier
		},
	}

	errCh := make(chan error)
	go func() {
		for _, event := range testEvents {
			err := eventStore.Save(ctx, event)
			require.NoError(t, err)
			errCh <- eventStore.Save(ctx, event)
		}
	}()

	// Receive events
	receivedEvents := make([]Event, 0, len(testEvents))
	for i := 0; i < len(testEvents); i++ {
	receivedEvents := make([]string, 0, len(testEvents))
	for len(receivedEvents) != len(testEvents) {
		select {
		case event := <-events:
			receivedEvents = append(receivedEvents, event)
			receivedEvents = append(receivedEvents, event.Name)
		case err := <-errCh:
			require.NoError(t, err)
		case <-time.After(1 * time.Second):
			t.Fatalf("Timed out waiting for event %d", i+1)
			t.Fatalf("Timed out waiting for event %d", len(receivedEvents)+1)
		}
	}

	// Verify all events were received
	assert.Len(t, receivedEvents, len(testEvents))

	// Verify the events match and ordered by resource version
	receivedNames := make([]string, len(receivedEvents))
	for i, event := range receivedEvents {
		receivedNames[i] = event.Name
	}

	expectedNames := []string{"test-resource-1", "test-resource-2", "test-resource-3"}
	assert.ElementsMatch(t, expectedNames, receivedNames)
	assert.ElementsMatch(t, expectedNames, receivedEvents)
}

@@ -473,8 +473,6 @@ func (k *sqlKV) Delete(ctx context.Context, section string, key string) error {
 		return ErrNotFound
 	}
 
-	// TODO reflect change to resource table
-
 	return nil
 }
@@ -61,7 +61,7 @@ type kvStorageBackend struct {
 	bulkLock   *BulkLock
 	dataStore  *dataStore
 	eventStore *eventStore
-	notifier   *notifier
+	notifier   notifier
 	builder    DocumentBuilder
 	log        logging.Logger
 	withPruner bool
@@ -91,6 +91,7 @@ type KVBackendOptions struct {
 	Tracer trace.Tracer          // TODO add tracing
 	Reg    prometheus.Registerer // TODO add metrics
 
+	UseChannelNotifier bool
 	// Adding RvManager overrides the RV generated with snowflake in order to keep backwards compatibility with
 	// unified/sql
 	RvManager *rvmanager.ResourceVersionManager
@@ -121,7 +122,7 @@ func NewKVStorageBackend(opts KVBackendOptions) (KVBackend, error) {
 		bulkLock:   NewBulkLock(),
 		dataStore:  newDataStore(kv),
 		eventStore: eventStore,
-		notifier:   newNotifier(eventStore, notifierOptions{}),
+		notifier:   newNotifier(eventStore, notifierOptions{useChannelNotifier: opts.UseChannelNotifier}),
 		snowflake:  s,
 		builder:    StandardDocumentBuilder(), // For now we use the standard document builder.
 		log:        &logging.NoOpLogger{}, // Make this configurable
@@ -346,7 +347,7 @@ func (k *kvStorageBackend) WriteEvent(ctx context.Context, event WriteEvent) (in
 			return 0, fmt.Errorf("failed to write data: %w", err)
 		}
 
-		rv = rvmanager.SnowflakeFromRv(rv)
+		rv = rvmanager.SnowflakeFromRV(rv)
 		dataKey.ResourceVersion = rv
 	} else {
 		err := k.dataStore.Save(ctx, dataKey, bytes.NewReader(event.Value))
@@ -307,7 +307,7 @@ func (m *ResourceVersionManager) execBatch(ctx context.Context, group, resource
 	// Allocate the RVs
 	for i, guid := range guids {
 		guidToRV[guid] = rv
-		guidToSnowflakeRV[guid] = SnowflakeFromRv(rv)
+		guidToSnowflakeRV[guid] = SnowflakeFromRV(rv)
 		rvs[i] = rv
 		rv++
 	}
@@ -364,12 +364,20 @@ func (m *ResourceVersionManager) execBatch(ctx context.Context, group, resource
 	}
 }
 
-// takes a unix microsecond rv and transforms into a snowflake format. The timestamp is converted from microsecond to
+// takes a unix microsecond RV and transforms into a snowflake format. The timestamp is converted from microsecond to
 // millisecond (the integer division) and the remainder is saved in the stepbits section. machine id is always 0
-func SnowflakeFromRv(rv int64) int64 {
+func SnowflakeFromRV(rv int64) int64 {
 	return (((rv / 1000) - snowflake.Epoch) << (snowflake.NodeBits + snowflake.StepBits)) + (rv % 1000)
 }
 
+// It is generally not possible to convert from a snowflakeID to a microsecond RV due to the loss in precision
+// (snowflake ID stores timestamp in milliseconds). However, this implementation stores the microsecond fraction
+// in the step bits (see SnowflakeFromRV), allowing us to compute the microsecond timestamp.
+func RVFromSnowflake(snowflakeID int64) int64 {
+	microSecFraction := snowflakeID & ((1 << snowflake.StepBits) - 1)
+	return ((snowflakeID>>(snowflake.NodeBits+snowflake.StepBits))+snowflake.Epoch)*1000 + microSecFraction
+}
+
 // helper utility to compare two RVs. The first RV must be in snowflake format. Will convert rv2 to snowflake and retry
 // if comparison fails
 func IsRvEqual(rv1, rv2 int64) bool {
@@ -377,7 +385,7 @@ func IsRvEqual(rv1, rv2 int64) bool {
 		return true
 	}
 
-	return rv1 == SnowflakeFromRv(rv2)
+	return rv1 == SnowflakeFromRV(rv2)
 }
 
 // Lock locks the resource version for the given key
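The microsecond-to-snowflake packing in `SnowflakeFromRV`/`RVFromSnowflake` above can be checked numerically. A standalone sketch with the `bwmarrin/snowflake` default constants inlined (the concrete Epoch, NodeBits, and StepBits values here are assumptions; the real code reads them from the snowflake package):

```go
package main

import "fmt"

// Assumed constants matching bwmarrin/snowflake defaults; illustrative only.
const (
	epoch    int64 = 1288834974657 // ms since Unix epoch (Twitter default)
	nodeBits       = 10
	stepBits       = 12
)

// snowflakeFromRV packs a microsecond RV into snowflake layout: the
// millisecond timestamp goes in the high bits (node id is always 0) and
// the sub-millisecond remainder (0-999) is stored in the step bits.
func snowflakeFromRV(rv int64) int64 {
	return (((rv / 1000) - epoch) << (nodeBits + stepBits)) + (rv % 1000)
}

// rvFromSnowflake reverses the packing by reading the microsecond
// remainder back out of the step bits.
func rvFromSnowflake(id int64) int64 {
	frac := id & ((1 << stepBits) - 1)
	return ((id>>(nodeBits+stepBits))+epoch)*1000 + frac
}

func main() {
	rv := int64(1768246438806211) // microseconds, same value as the test below
	id := snowflakeFromRV(rv)
	fmt.Println(rvFromSnowflake(id) == rv) // true
}
```

The round trip only works because the remainder `rv % 1000` (at most 999) fits inside the 12 step bits; a plain millisecond snowflake would lose the microsecond fraction.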
@@ -63,3 +63,13 @@ func TestResourceVersionManager(t *testing.T) {
 		require.Equal(t, rv, int64(200))
 	})
 }
+
+func TestSnowflakeFromRVRoundtrips(t *testing.T) {
+	// 2026-01-12 19:33:58.806211 +0000 UTC
+	offset := int64(1768246438806211) // in microseconds
+
+	for n := range int64(100) {
+		ts := offset + n
+		require.Equal(t, ts, RVFromSnowflake(SnowflakeFromRV(ts)))
+	}
+}
@@ -99,6 +99,9 @@ func NewResourceServer(opts ServerOptions) (resource.ResourceServer, error) {
 		return nil, err
 	}
 
+	isHA := isHighAvailabilityEnabled(opts.Cfg.SectionWithEnvOverrides("database"),
+		opts.Cfg.SectionWithEnvOverrides("resource_api"))
+
 	if opts.Cfg.EnableSQLKVBackend {
 		sqlkv, err := resource.NewSQLKV(eDB)
 		if err != nil {
@@ -106,9 +109,10 @@ func NewResourceServer(opts ServerOptions) (resource.ResourceServer, error) {
 		}
 
 		kvBackendOpts := resource.KVBackendOptions{
-			KvStore: sqlkv,
-			Tracer:  opts.Tracer,
-			Reg:     opts.Reg,
+			KvStore:            sqlkv,
+			Tracer:             opts.Tracer,
+			Reg:                opts.Reg,
+			UseChannelNotifier: !isHA,
 		}
 
 		ctx := context.Background()
@@ -140,9 +144,6 @@ func NewResourceServer(opts ServerOptions) (resource.ResourceServer, error) {
 		serverOptions.Backend = kvBackend
 		serverOptions.Diagnostics = kvBackend
 	} else {
-		isHA := isHighAvailabilityEnabled(opts.Cfg.SectionWithEnvOverrides("database"),
-			opts.Cfg.SectionWithEnvOverrides("resource_api"))
-
 		backend, err := NewBackend(BackendOptions{
 			DBProvider: eDB,
 			Reg:        opts.Reg,
@@ -200,7 +200,7 @@ func verifyKeyPath(t *testing.T, db sqldb.DB, ctx context.Context, key *resource
 	var keyPathRV int64
 	if isSqlBackend {
 		// Convert microsecond RV to snowflake for key_path construction
-		keyPathRV = rvmanager.SnowflakeFromRv(resourceVersion)
+		keyPathRV = rvmanager.SnowflakeFromRV(resourceVersion)
 	} else {
 		// KV backend already provides snowflake RV
 		keyPathRV = resourceVersion
@@ -434,9 +434,6 @@ func verifyResourceHistoryTable(t *testing.T, db sqldb.DB, namespace string, res
 
 	rows, err := db.QueryContext(ctx, query, namespace)
 	require.NoError(t, err)
-	defer func() {
-		_ = rows.Close()
-	}()
 
 	var records []ResourceHistoryRecord
 	for rows.Next() {
@@ -460,33 +457,34 @@ func verifyResourceHistoryTable(t *testing.T, db sqldb.DB, namespace string, res
 	for resourceIdx, res := range resources {
 		// Check create record (action=1, generation=1)
 		createRecord := records[recordIndex]
-		verifyResourceHistoryRecord(t, createRecord, res, resourceIdx, 1, 0, 1, resourceVersions[resourceIdx][0])
+		verifyResourceHistoryRecord(t, createRecord, namespace, res, resourceIdx, 1, 0, 1, resourceVersions[resourceIdx][0])
 		recordIndex++
 	}
 
 	for resourceIdx, res := range resources {
 		// Check update record (action=2, generation=2)
 		updateRecord := records[recordIndex]
-		verifyResourceHistoryRecord(t, updateRecord, res, resourceIdx, 2, resourceVersions[resourceIdx][0], 2, resourceVersions[resourceIdx][1])
+		verifyResourceHistoryRecord(t, updateRecord, namespace, res, resourceIdx, 2, resourceVersions[resourceIdx][0], 2, resourceVersions[resourceIdx][1])
 		recordIndex++
 	}
 
 	for resourceIdx, res := range resources[:2] {
 		// Check delete record (action=3, generation=0) - only first 2 resources were deleted
 		deleteRecord := records[recordIndex]
-		verifyResourceHistoryRecord(t, deleteRecord, res, resourceIdx, 3, resourceVersions[resourceIdx][1], 0, resourceVersions[resourceIdx][2])
+		verifyResourceHistoryRecord(t, deleteRecord, namespace, res, resourceIdx, 3, resourceVersions[resourceIdx][1], 0, resourceVersions[resourceIdx][2])
 		recordIndex++
 	}
 }
 
 // verifyResourceHistoryRecord validates a single resource_history record
-func verifyResourceHistoryRecord(t *testing.T, record ResourceHistoryRecord, expectedRes struct{ name, folder string }, resourceIdx, expectedAction int, expectedPrevRV int64, expectedGeneration int, expectedRV int64) {
+func verifyResourceHistoryRecord(t *testing.T, record ResourceHistoryRecord, namespace string, expectedRes struct{ name, folder string }, resourceIdx, expectedAction int, expectedPrevRV int64, expectedGeneration int, expectedRV int64) {
 	// Validate GUID (should be non-empty)
 	require.NotEmpty(t, record.GUID, "GUID should not be empty")
 
 	// Validate group/resource/namespace/name
 	require.Equal(t, "playlist.grafana.app", record.Group)
 	require.Equal(t, "playlists", record.Resource)
+	require.Equal(t, namespace, record.Namespace)
 	require.Equal(t, expectedRes.name, record.Name)
 
 	// Validate value contains expected JSON - server modifies/formats the JSON differently for different operations
@@ -513,8 +511,12 @@ func verifyResourceHistoryRecord(t *testing.T, record ResourceHistoryRecord, exp
 	// For KV backend operations, expectedPrevRV is now in snowflake format (returned by KV backend)
 	// but resource_history table stores microsecond RV, so we need to use IsRvEqual for comparison
 	if strings.Contains(record.Namespace, "-kv") {
-		require.True(t, rvmanager.IsRvEqual(expectedPrevRV, record.PreviousResourceVersion),
-			"Previous resource version should match (KV backend snowflake format)")
+		if expectedPrevRV == 0 {
+			require.Zero(t, record.PreviousResourceVersion)
+		} else {
+			require.Equal(t, expectedPrevRV, rvmanager.SnowflakeFromRV(record.PreviousResourceVersion),
+				"Previous resource version should match (KV backend snowflake format)")
+		}
 	} else {
 		require.Equal(t, expectedPrevRV, record.PreviousResourceVersion)
 	}
@@ -546,9 +548,6 @@ func verifyResourceTable(t *testing.T, db sqldb.DB, namespace string, resources
 
 	rows, err := db.QueryContext(ctx, query, namespace)
 	require.NoError(t, err)
-	defer func() {
-		_ = rows.Close()
-	}()
 
 	var records []ResourceRecord
 	for rows.Next() {
@@ -612,9 +611,6 @@ func verifyResourceVersionTable(t *testing.T, db sqldb.DB, namespace string, res
 	// Check that we have exactly one entry for playlist.grafana.app/playlists
 	rows, err := db.QueryContext(ctx, query, "playlist.grafana.app", "playlists")
 	require.NoError(t, err)
-	defer func() {
-		_ = rows.Close()
-	}()
 
 	var records []ResourceVersionRecord
 	for rows.Next() {
@@ -649,7 +645,7 @@ func verifyResourceVersionTable(t *testing.T, db sqldb.DB, namespace string, res
 	isKvBackend := strings.Contains(namespace, "-kv")
 	recordResourceVersion := record.ResourceVersion
 	if isKvBackend {
-		recordResourceVersion = rvmanager.SnowflakeFromRv(record.ResourceVersion)
+		recordResourceVersion = rvmanager.SnowflakeFromRV(record.ResourceVersion)
 	}
 
 	require.Less(t, recordResourceVersion, int64(9223372036854775807), "resource_version should be reasonable")
@@ -841,24 +837,20 @@ func runMixedConcurrentOperations(t *testing.T, sqlServer, kvServer resource.Res
 	}
 
 	// SQL backend operations
-	wg.Add(1)
-	go func() {
-		defer wg.Done()
+	wg.Go(func() {
 		<-startBarrier // Wait for signal to start
 		if err := runBackendOperationsWithCounts(ctx, sqlServer, namespace+"-sql", "sql", opCounts); err != nil {
 			errors <- fmt.Errorf("SQL backend operations failed: %w", err)
 		}
-	}()
+	})
 
 	// KV backend operations
-	wg.Add(1)
-	go func() {
-		defer wg.Done()
+	wg.Go(func() {
 		<-startBarrier // Wait for signal to start
 		if err := runBackendOperationsWithCounts(ctx, kvServer, namespace+"-kv", "kv", opCounts); err != nil {
 			errors <- fmt.Errorf("KV backend operations failed: %w", err)
 		}
-	}()
+	})
 
 	// Start both goroutines simultaneously
 	close(startBarrier)
@@ -8,6 +8,7 @@ import (
 	"github.com/stretchr/testify/require"
 
 	"github.com/grafana/grafana/pkg/storage/unified/resource"
+	"github.com/grafana/grafana/pkg/util/testutil"
 )
 
 func TestBadgerKVStorageBackend(t *testing.T) {
@@ -36,7 +37,9 @@ func TestBadgerKVStorageBackend(t *testing.T) {
 	})
 }
 
-func TestSQLKVStorageBackend(t *testing.T) {
+func TestIntegrationSQLKVStorageBackend(t *testing.T) {
+	testutil.SkipIntegrationTestInShortMode(t)
+
 	skipTests := map[string]bool{
 		TestWatchWriteEvents: true,
 		TestList:             true,
@@ -30,6 +30,7 @@ const (
 	defaultLogGroupLimit        = int32(50)
 	logIdentifierInternal       = "__log__grafana_internal__"
 	logStreamIdentifierInternal = "__logstream__grafana_internal__"
+	logGroupsMacro              = "$__logGroups"
 )
 
 type AWSError struct {
@@ -189,6 +190,47 @@ func (ds *DataSource) executeStartQuery(ctx context.Context, logsClient models.C
 		logsQuery.QueryLanguage = &cwli
 	}
 
+	region := logsQuery.Region
+	if region == "" || region == defaultRegion {
+		region = ds.Settings.Region
+	}
+
+	useARN := false
+	if len(logsQuery.LogGroups) > 0 && features.IsEnabled(ctx, features.FlagCloudWatchCrossAccountQuerying) && region != "" {
+		isMonitoringAccount, err := ds.isMonitoringAccount(ctx, region)
+		if err != nil {
+			ds.logger.FromContext(ctx).Debug("failed to determine monitoring account status", "err", err)
+		} else {
+			useARN = isMonitoringAccount
+		}
+	}
+
+	var logGroupIdentifiers []string
+	if len(logsQuery.LogGroups) > 0 {
+		// Log queries should use ARNs when querying a monitoring account because log group names are not unique across accounts.
+		if useARN {
+			for _, lg := range logsQuery.LogGroups {
+				if lg.Arn != "" {
+					// The startQuery api does not support arns with a trailing * so we need to remove it
+					logGroupIdentifiers = append(logGroupIdentifiers, strings.TrimSuffix(lg.Arn, "*"))
+				}
+			}
+		} else {
+			// deduplicate log group names because we only deduplicate log groups by their ARNs instead of their names when the query is created
+			seen := make(map[string]struct{}, len(logsQuery.LogGroups))
+			for _, lg := range logsQuery.LogGroups {
+				if lg.Name == "" {
+					continue
+				}
+				if _, exists := seen[lg.Name]; exists {
+					continue
+				}
+				seen[lg.Name] = struct{}{}
+				logGroupIdentifiers = append(logGroupIdentifiers, lg.Name)
+			}
+		}
+	}
+
 	finalQueryString := logsQuery.QueryString
 	// Only for CWLI queries
 	// The fields @log and @logStream are always included in the results of a user's query
@@ -200,6 +242,21 @@ func (ds *DataSource) executeStartQuery(ctx context.Context, logsClient models.C
 			logStreamIdentifierInternal + "|" + logsQuery.QueryString
 	}
 
+	// Expand $__logGroups macro for SQL queries
+	if *logsQuery.QueryLanguage == dataquery.LogsQueryLanguageSQL {
+		if strings.Contains(finalQueryString, logGroupsMacro) {
+			if len(logGroupIdentifiers) == 0 {
+				return nil, backend.DownstreamError(fmt.Errorf("query contains %s but no log groups are selected", logGroupsMacro))
+			}
+			quoted := make([]string, len(logGroupIdentifiers))
+			for i, id := range logGroupIdentifiers {
+				quoted[i] = fmt.Sprintf("'%s'", id)
+			}
+			replacement := fmt.Sprintf("`logGroups(logGroupIdentifier: [%s])`", strings.Join(quoted, ", "))
+			finalQueryString = strings.Replace(finalQueryString, logGroupsMacro, replacement, 1)
+		}
+	}
+
 	startQueryInput := &cloudwatchlogs.StartQueryInput{
 		StartTime: aws.Int64(startTime.Unix()),
 		// Usually grafana time range allows only second precision, but you can create ranges with milliseconds
@@ -213,47 +270,13 @@ func (ds *DataSource) executeStartQuery(ctx context.Context, logsClient models.C
 
 	// log group identifiers can be left out if the query is an SQL query
 	if *logsQuery.QueryLanguage != dataquery.LogsQueryLanguageSQL {
-		useLogGroupIdentifiers := false
-		logGroupsFromQuery := len(logsQuery.LogGroups) > 0
-		if logGroupsFromQuery && features.IsEnabled(ctx, features.FlagCloudWatchCrossAccountQuerying) {
-			region := logsQuery.Region
-			if region == "" || region == defaultRegion {
-				region = ds.Settings.Region
-			}
-			if region != "" {
-				isMonitoringAccount, err := ds.isMonitoringAccount(ctx, region)
-				if err != nil {
-					ds.logger.FromContext(ctx).Debug("failed to determine monitoring account status", "err", err)
-				} else if isMonitoringAccount {
-					// monitoring accounts require querying by log group identifiers because log group names are not unique across accounts.
-					var logGroupIdentifiers []string
-					for _, lg := range logsQuery.LogGroups {
-						// due to a bug in the startQuery api, we remove * from the arn, otherwise it throws an error
-						arn := strings.TrimSuffix(lg.Arn, "*")
-						logGroupIdentifiers = append(logGroupIdentifiers, arn)
-					}
-					startQueryInput.LogGroupIdentifiers = logGroupIdentifiers
-					useLogGroupIdentifiers = true
-				}
-			}
-		}
-
-		if !useLogGroupIdentifiers {
+		if useARN {
+			startQueryInput.LogGroupIdentifiers = logGroupIdentifiers
+		} else {
 			// even though logsQuery.LogGroupNames is deprecated, we still need to support it for backwards compatibility and alert queries
 			startQueryInput.LogGroupNames = append([]string(nil), logsQuery.LogGroupNames...)
-			if len(startQueryInput.LogGroupNames) == 0 && logGroupsFromQuery {
-				// deduplicate log group names because we only deduplicate log groups by their ARNs instead of their names when the query is created
-				seenLogGroupNames := make(map[string]struct{}, len(logsQuery.LogGroups))
-				for _, lg := range logsQuery.LogGroups {
-					if lg.Name == "" {
-						continue
-					}
-					if _, exists := seenLogGroupNames[lg.Name]; exists {
-						continue
-					}
-					seenLogGroupNames[lg.Name] = struct{}{}
-					startQueryInput.LogGroupNames = append(startQueryInput.LogGroupNames, lg.Name)
-				}
+			if len(startQueryInput.LogGroupNames) == 0 && len(logGroupIdentifiers) > 0 {
+				startQueryInput.LogGroupNames = logGroupIdentifiers
 			}
 		}
 	}
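The macro expansion introduced in this change can be exercised in isolation. A standalone sketch (the function name and signature here are illustrative, not the plugin's actual API; the replacement format matches the `logGroups(logGroupIdentifier: [...])` syntax shown above):

```go
package main

import (
	"fmt"
	"strings"
)

// expandLogGroupsMacro replaces the first occurrence of $__logGroups with a
// backtick-quoted logGroups(logGroupIdentifier: ['a', 'b']) clause, or errors
// when the macro is present but no identifiers were selected.
func expandLogGroupsMacro(query string, identifiers []string) (string, error) {
	const macro = "$__logGroups"
	if !strings.Contains(query, macro) {
		return query, nil
	}
	if len(identifiers) == 0 {
		return "", fmt.Errorf("query contains %s but no log groups are selected", macro)
	}
	quoted := make([]string, len(identifiers))
	for i, id := range identifiers {
		quoted[i] = fmt.Sprintf("'%s'", id)
	}
	replacement := fmt.Sprintf("`logGroups(logGroupIdentifier: [%s])`", strings.Join(quoted, ", "))
	return strings.Replace(query, macro, replacement, 1), nil
}

func main() {
	out, _ := expandLogGroupsMacro("SELECT * FROM $__logGroups", []string{"group1", "group2"})
	fmt.Println(out)
	// SELECT * FROM `logGroups(logGroupIdentifier: ['group1', 'group2'])`
}
```

In the real code the identifiers are either deduplicated log group names or, for monitoring accounts, ARNs with any trailing `*` stripped.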
@@ -873,6 +873,204 @@ func TestQuery_GetQueryResults(t *testing.T) {
 	}, resp)
 }
 
+func Test_expandLogGroupsMacro(t *testing.T) {
+	origNewCWLogsClient := NewCWLogsClient
+	t.Cleanup(func() {
+		NewCWLogsClient = origNewCWLogsClient
+	})
+
+	var cli fakeCWLogsClient
+
+	NewCWLogsClient = func(cfg aws.Config) models.CWLogsClient {
+		return &cli
+	}
+
+	t.Run("expands $__logGroups macro with log group names when not a monitoring account", func(t *testing.T) {
+		cli = fakeCWLogsClient{}
+		ds := newTestDatasource()
+
+		_, err := ds.QueryData(context.Background(), &backend.QueryDataRequest{
+			PluginContext: backend.PluginContext{DataSourceInstanceSettings: &backend.DataSourceInstanceSettings{}},
+			Queries: []backend.DataQuery{
+				{
+					RefID:     "A",
+					TimeRange: backend.TimeRange{From: time.Unix(0, 0), To: time.Unix(1, 0)},
+					JSON: json.RawMessage(`{
+						"type":          "logAction",
+						"subtype":       "StartQuery",
+						"queryLanguage": "SQL",
+						"queryString":"SELECT * FROM $__logGroups",
+						"logGroups":[{"arn": "arn:aws:logs:us-east-1:123456789012:log-group:group1", "name": "group1"}, {"arn": "arn:aws:logs:us-east-1:123456789012:log-group:group2", "name": "group2"}]
+					}`),
+				},
+			},
+		})
+
+		assert.NoError(t, err)
+		require.Len(t, cli.calls.startQuery, 1)
+		assert.Equal(t, "SELECT * FROM `logGroups(logGroupIdentifier: ['group1', 'group2'])`", *cli.calls.startQuery[0].QueryString)
+	})
+
+	t.Run("expands $__logGroups macro with ARNs when monitoring account", func(t *testing.T) {
+		cli = fakeCWLogsClient{}
+		ds := newTestDatasource(func(ds *DataSource) {
+			ds.monitoringAccountCache.Store("us-east-1", true)
+		})
+
+		_, err := ds.QueryData(contextWithFeaturesEnabled(features.FlagCloudWatchCrossAccountQuerying), &backend.QueryDataRequest{
+			PluginContext: backend.PluginContext{DataSourceInstanceSettings: &backend.DataSourceInstanceSettings{}},
+			Queries: []backend.DataQuery{
+				{
+					RefID:     "A",
+					TimeRange: backend.TimeRange{From: time.Unix(0, 0), To: time.Unix(1, 0)},
+					JSON: json.RawMessage(`{
+						"type":          "logAction",
+						"subtype":       "StartQuery",
+						"queryLanguage": "SQL",
+						"queryString":"SELECT * FROM $__logGroups",
+						"logGroups":[{"arn": "arn:aws:logs:us-east-1:123456789012:log-group:group1", "name": "group1"}, {"arn": "arn:aws:logs:us-east-1:123456789012:log-group:group2", "name": "group2"}],
+						"region": "us-east-1"
+					}`),
+				},
+			},
+		})
+
+		assert.NoError(t, err)
+		require.Len(t, cli.calls.startQuery, 1)
+		assert.Equal(t, "SELECT * FROM `logGroups(logGroupIdentifier: ['arn:aws:logs:us-east-1:123456789012:log-group:group1', 'arn:aws:logs:us-east-1:123456789012:log-group:group2'])`", *cli.calls.startQuery[0].QueryString)
+	})
+
+	t.Run("strips trailing * from ARNs when expanding macro", func(t *testing.T) {
+		cli = fakeCWLogsClient{}
+		ds := newTestDatasource(func(ds *DataSource) {
+			ds.monitoringAccountCache.Store("us-east-1", true)
+		})
+
+		_, err := ds.QueryData(contextWithFeaturesEnabled(features.FlagCloudWatchCrossAccountQuerying), &backend.QueryDataRequest{
+			PluginContext: backend.PluginContext{DataSourceInstanceSettings: &backend.DataSourceInstanceSettings{}},
+			Queries: []backend.DataQuery{
+				{
+					RefID:     "A",
+					TimeRange: backend.TimeRange{From: time.Unix(0, 0), To: time.Unix(1, 0)},
+					JSON: json.RawMessage(`{
+						"type":          "logAction",
+						"subtype":       "StartQuery",
+						"queryLanguage": "SQL",
+						"queryString":"SELECT * FROM $__logGroups",
+						"logGroups":[{"arn": "arn:aws:logs:us-east-1:123456789012:log-group:group1*", "name": "group1"}],
+						"region": "us-east-1"
+					}`),
+				},
+			},
+		})
+
+		assert.NoError(t, err)
+		require.Len(t, cli.calls.startQuery, 1)
+		assert.Equal(t, "SELECT * FROM `logGroups(logGroupIdentifier: ['arn:aws:logs:us-east-1:123456789012:log-group:group1'])`", *cli.calls.startQuery[0].QueryString)
+	})
+
+	t.Run("returns error when $__logGroups macro is used but no log groups are selected", func(t *testing.T) {
+		cli = fakeCWLogsClient{}
+		ds := newTestDatasource()
+
+		resp, err := ds.QueryData(context.Background(), &backend.QueryDataRequest{
+			PluginContext: backend.PluginContext{DataSourceInstanceSettings: &backend.DataSourceInstanceSettings{}},
+			Queries: []backend.DataQuery{
+				{
+					RefID:     "A",
+					TimeRange: backend.TimeRange{From: time.Unix(0, 0), To: time.Unix(1, 0)},
+					JSON: json.RawMessage(`{
+						"type":          "logAction",
+						"subtype":       "StartQuery",
+						"queryLanguage": "SQL",
+						"queryString":"SELECT * FROM $__logGroups"
+					}`),
+				},
+			},
+		})
+
+		assert.NoError(t, err)
+		assert.Contains(t, resp.Responses["A"].Error.Error(), "query contains $__logGroups but no log groups are selected")
+	})
+
+	t.Run("does not expand macro when query does not contain $__logGroups", func(t *testing.T) {
+		cli = fakeCWLogsClient{}
+		ds := newTestDatasource()
+
+		_, err := ds.QueryData(context.Background(), &backend.QueryDataRequest{
+			PluginContext: backend.PluginContext{DataSourceInstanceSettings: &backend.DataSourceInstanceSettings{}},
+			Queries: []backend.DataQuery{
+				{
+					RefID:     "A",
+					TimeRange: backend.TimeRange{From: time.Unix(0, 0), To: time.Unix(1, 0)},
+					JSON: json.RawMessage(`{
+						"type":          "logAction",
+						"subtype":       "StartQuery",
+						"queryLanguage": "SQL",
+						"queryString":"SELECT * FROM ` + "`logGroups(logGroupIdentifier: ['my-log-group'])`" + `"
+					}`),
+				},
+			},
+		})
+
+		assert.NoError(t, err)
+		require.Len(t, cli.calls.startQuery, 1)
+		assert.Equal(t, "SELECT * FROM `logGroups(logGroupIdentifier: ['my-log-group'])`", *cli.calls.startQuery[0].QueryString)
+	})
+
+	t.Run("does not expand macro for non-SQL query languages", func(t *testing.T) {
+		cli = fakeCWLogsClient{}
+		ds := newTestDatasource()
+
+		_, err := ds.QueryData(context.Background(), &backend.QueryDataRequest{
+			PluginContext: backend.PluginContext{DataSourceInstanceSettings: &backend.DataSourceInstanceSettings{}},
+			Queries: []backend.DataQuery{
+				{
+					RefID:     "A",
+					TimeRange: backend.TimeRange{From: time.Unix(0, 0), To: time.Unix(1, 0)},
+					JSON: json.RawMessage(`{
+						"type":          "logAction",
+						"subtype":       "StartQuery",
+						"queryLanguage": "CWLI",
+						"queryString":"fields @message | $__logGroups",
+						"logGroups":[{"arn": "arn:aws:logs:us-east-1:123456789012:log-group:group1", "name": "group1"}]
+					}`),
+				},
+			},
+		})
+
+		assert.NoError(t, err)
+		require.Len(t, cli.calls.startQuery, 1)
+		assert.Contains(t, *cli.calls.startQuery[0].QueryString, "$__logGroups")
+	})
+
+	t.Run("expands macro with single log group", func(t *testing.T) {
+		cli = fakeCWLogsClient{}
+		ds := newTestDatasource()
+
+		_, err := ds.QueryData(context.Background(), &backend.QueryDataRequest{
+			PluginContext: backend.PluginContext{DataSourceInstanceSettings: &backend.DataSourceInstanceSettings{}},
+			Queries: []backend.DataQuery{
+				{
+					RefID:     "A",
+					TimeRange: backend.TimeRange{From: time.Unix(0, 0), To: time.Unix(1, 0)},
+					JSON: json.RawMessage(`{
+						"type":          "logAction",
+						"subtype":       "StartQuery",
+						"queryLanguage": "SQL",
+						"queryString":"SELECT * FROM $__logGroups",
+						"logGroups":[{"arn": "arn:aws:logs:us-east-1:123456789012:log-group:single-group", "name": "single-group"}]
+					}`),
+				},
+			},
+		})
+
+		assert.NoError(t, err)
+		require.Len(t, cli.calls.startQuery, 1)
+		assert.Equal(t, "SELECT * FROM `logGroups(logGroupIdentifier: ['single-group'])`", *cli.calls.startQuery[0].QueryString)
+	})
+}
+
 func TestGroupResponseFrame(t *testing.T) {
 	t.Run("Doesn't group results without time field", func(t *testing.T) {
 		frame := data.NewFrameOfFieldTypes("test", 0, data.FieldTypeString, data.FieldTypeInt32)
@@ -1,288 +0,0 @@
|
||||
/**
|
||||
* Mutation Executor
|
||||
*
|
||||
* Executes dashboard mutations with transaction support and event emission.
|
||||
*/
|
||||
|
||||
import { v4 as uuidv4 } from 'uuid';
|
||||
|
||||
import type { DashboardScene } from '../scene/DashboardScene';
|
||||
|
||||
import {
|
||||
handleAddPanel,
|
||||
handleRemovePanel,
|
||||
handleUpdatePanel,
|
||||
handleMovePanel,
|
||||
handleAddVariable,
|
||||
handleRemoveVariable,
|
||||
handleAddRow,
|
||||
handleUpdateTimeSettings,
|
||||
handleUpdateDashboardMeta,
|
||||
handleGetDashboardInfo,
|
||||
type MutationContext,
|
||||
type MutationTransactionInternal,
|
||||
type MutationHandler,
|
||||
} from './handlers';
|
||||
import {
|
||||
type Mutation,
|
||||
type MutationType,
|
||||
type MutationResult,
|
||||
type MutationEvent,
|
||||
type MutationPayloadMap,
|
||||
} from './types';
|
||||
|
||||
// ============================================================================
|
||||
// Event Bus
|
||||
// ============================================================================
|
||||
|
||||
type MutationEventListener = (event: MutationEvent) => void;
|
||||
|
||||
class MutationEventBus {
|
||||
private listeners: Set<MutationEventListener> = new Set();
|
||||
|
||||
subscribe(listener: MutationEventListener): () => void {
|
||||
this.listeners.add(listener);
|
||||
return () => this.listeners.delete(listener);
|
||||
}
|
||||
|
||||
emit(event: MutationEvent): void {
|
||||
this.listeners.forEach((listener) => {
|
||||
try {
|
||||
listener(event);
|
||||
} catch (error) {
|
||||
console.error('Event listener error:', error);
|
||||
}
|
||||
});
|
||||
}
|
||||
}

// ============================================================================
// Mutation Executor
// ============================================================================

export class MutationExecutor {
  private scene!: DashboardScene;
  private handlers: Map<MutationType, MutationHandler> = new Map();
  private eventBus = new MutationEventBus();
  private _currentTransaction: MutationTransactionInternal | null = null;

  constructor() {
    this.registerDefaultHandlers();
  }

  /**
   * Set the dashboard scene to operate on
   */
  setScene(scene: DashboardScene): void {
    this.scene = scene;
  }

  /**
   * Subscribe to mutation events
   */
  onMutation(listener: MutationEventListener): () => void {
    return this.eventBus.subscribe(listener);
  }

  /**
   * Execute a single mutation
   */
  async execute(mutation: Mutation): Promise<MutationResult> {
    const results = await this.executeBatch([mutation]);
    return results[0];
  }

  /**
   * Execute multiple mutations atomically
   */
  async executeBatch(mutations: Mutation[]): Promise<MutationResult[]> {
    if (!this.scene) {
      throw new Error('No scene set. Call setScene() first.');
    }

    // Create transaction
    const transaction: MutationTransactionInternal = {
      id: uuidv4(),
      mutations,
      status: 'pending',
      startedAt: Date.now(),
      changes: [],
    };

    this._currentTransaction = transaction;

    const results: MutationResult[] = [];
    const context: MutationContext = { scene: this.scene, transaction };

    try {
      // Execute each mutation
      for (const mutation of mutations) {
        const handler = this.handlers.get(mutation.type);
        if (!handler) {
          throw new Error(`No handler registered for mutation type: ${mutation.type}`);
        }

        const result = await handler(mutation.payload, context);
        results.push(result);

        if (!result.success) {
          throw new Error(result.error || `Mutation ${mutation.type} failed`);
        }

        // Emit success event
        this.eventBus.emit({
          type: 'mutation_applied',
          mutation,
          result,
          transaction,
          timestamp: Date.now(),
          source: 'assistant',
        });
      }

      // Commit transaction
      transaction.status = 'committed';
      transaction.completedAt = Date.now();

      // Trigger scene refresh
      this.scene.forceRender();

      return results;
    } catch (error) {
      // NOTE: already-applied mutations are not actually reverted here; a real
      // rollback mechanism (e.g. replaying inverseMutations in reverse order)
      // is still needed beyond the POC.
      console.error('Mutation batch failed:', error);

      transaction.status = 'rolled_back';
      transaction.completedAt = Date.now();

      // Emit failure event
      this.eventBus.emit({
        type: 'mutation_rolled_back',
        mutation: mutations[0],
        result: { success: false, error: String(error), changes: [] },
        transaction,
        timestamp: Date.now(),
        source: 'assistant',
      });

      // Return error results for remaining mutations
      const errorMessage = error instanceof Error ? error.message : String(error);
      while (results.length < mutations.length) {
        results.push({
          success: false,
          error: errorMessage,
          changes: [],
        });
      }

      return results;
    } finally {
      this._currentTransaction = null;
    }
  }

  /**
   * Get current transaction (for debugging)
   */
  get currentTransaction(): MutationTransactionInternal | null {
    return this._currentTransaction;
  }

  // ==========================================================================
  // Handler Registration
  // ==========================================================================

  private registerDefaultHandlers(): void {
    // Panel operations
    this.registerHandler('ADD_PANEL', handleAddPanel);
    this.registerHandler('REMOVE_PANEL', handleRemovePanel);
    this.registerHandler('UPDATE_PANEL', handleUpdatePanel);
    this.registerHandler('MOVE_PANEL', handleMovePanel);
    this.registerHandler('DUPLICATE_PANEL', this.notImplemented('DUPLICATE_PANEL'));

    // Variable operations
    this.registerHandler('ADD_VARIABLE', handleAddVariable);
    this.registerHandler('REMOVE_VARIABLE', handleRemoveVariable);
    this.registerHandler('UPDATE_VARIABLE', this.notImplemented('UPDATE_VARIABLE'));

    // Row operations
    this.registerHandler('ADD_ROW', handleAddRow);
    this.registerHandler('REMOVE_ROW', this.notImplemented('REMOVE_ROW'));
    this.registerHandler('COLLAPSE_ROW', this.notImplemented('COLLAPSE_ROW'));

    // Tab operations
    this.registerHandler('ADD_TAB', this.notImplemented('ADD_TAB'));
    this.registerHandler('REMOVE_TAB', this.notImplemented('REMOVE_TAB'));

    // Library panel operations
    this.registerHandler('ADD_LIBRARY_PANEL', this.notImplemented('ADD_LIBRARY_PANEL'));
    this.registerHandler('UNLINK_LIBRARY_PANEL', this.notImplemented('UNLINK_LIBRARY_PANEL'));
    this.registerHandler('SAVE_AS_LIBRARY_PANEL', this.notImplemented('SAVE_AS_LIBRARY_PANEL'));

    // Repeat configuration
    this.registerHandler('CONFIGURE_PANEL_REPEAT', this.notImplemented('CONFIGURE_PANEL_REPEAT'));
    this.registerHandler('CONFIGURE_ROW_REPEAT', this.notImplemented('CONFIGURE_ROW_REPEAT'));

    // Conditional rendering
    this.registerHandler('SET_CONDITIONAL_RENDERING', this.notImplemented('SET_CONDITIONAL_RENDERING'));

    // Layout
    this.registerHandler('CHANGE_LAYOUT_TYPE', this.notImplemented('CHANGE_LAYOUT_TYPE'));

    // Annotation operations
    this.registerHandler('ADD_ANNOTATION', this.notImplemented('ADD_ANNOTATION'));
    this.registerHandler('UPDATE_ANNOTATION', this.notImplemented('UPDATE_ANNOTATION'));
    this.registerHandler('REMOVE_ANNOTATION', this.notImplemented('REMOVE_ANNOTATION'));

    // Link operations
    this.registerHandler('ADD_DASHBOARD_LINK', this.notImplemented('ADD_DASHBOARD_LINK'));
    this.registerHandler('REMOVE_DASHBOARD_LINK', this.notImplemented('REMOVE_DASHBOARD_LINK'));
    this.registerHandler('ADD_PANEL_LINK', this.notImplemented('ADD_PANEL_LINK'));
    this.registerHandler('ADD_DATA_LINK', this.notImplemented('ADD_DATA_LINK'));

    // Field configuration
    this.registerHandler('ADD_FIELD_OVERRIDE', this.notImplemented('ADD_FIELD_OVERRIDE'));
    this.registerHandler('ADD_VALUE_MAPPING', this.notImplemented('ADD_VALUE_MAPPING'));
    this.registerHandler('ADD_TRANSFORMATION', this.notImplemented('ADD_TRANSFORMATION'));

    // Dashboard settings
    this.registerHandler('UPDATE_TIME_SETTINGS', handleUpdateTimeSettings);
    this.registerHandler('UPDATE_DASHBOARD_META', handleUpdateDashboardMeta);

    // Dashboard management (backend operations)
    this.registerHandler('MOVE_TO_FOLDER', this.notImplemented('MOVE_TO_FOLDER'));
    this.registerHandler('TOGGLE_FAVORITE', this.notImplemented('TOGGLE_FAVORITE'));

    // Version management (backend operations)
    this.registerHandler('LIST_VERSIONS', this.notImplemented('LIST_VERSIONS'));
    this.registerHandler('COMPARE_VERSIONS', this.notImplemented('COMPARE_VERSIONS'));
    this.registerHandler('RESTORE_VERSION', this.notImplemented('RESTORE_VERSION'));

    // Read-only operations
    this.registerHandler('GET_DASHBOARD_INFO', handleGetDashboardInfo);
  }

  /**
   * Create a stub handler for not-yet-implemented mutations
   */
  private notImplemented(mutationType: string): MutationHandler {
    return async (): Promise<MutationResult> => {
      return {
        success: false,
        changes: [],
        error: `${mutationType} is not fully implemented in POC`,
      };
    };
  }

  /**
   * Register a mutation handler
   */
  private registerHandler<T extends MutationType>(
    type: T,
    handler: (payload: MutationPayloadMap[T], context: MutationContext) => Promise<MutationResult>
  ): void {
    // eslint-disable-next-line @typescript-eslint/consistent-type-assertions
    this.handlers.set(type, handler as MutationHandler);
  }
}
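A self-contained sketch (plain local types, not the real `Mutation`/`MutationResult` imports) of the batch semantics `executeBatch` implements above: handlers run in order, execution stops at the first failure, and the remaining slots are padded with error results so the result count always matches the mutation count:

```typescript
type SketchResult = { success: boolean; error?: string };

function runBatch(steps: Array<() => SketchResult>): SketchResult[] {
  const results: SketchResult[] = [];
  try {
    for (const step of steps) {
      const result = step();
      results.push(result);
      if (!result.success) {
        throw new Error(result.error || 'step failed');
      }
    }
    return results;
  } catch (error) {
    const message = error instanceof Error ? error.message : String(error);
    // Pad so callers can index results by mutation position.
    while (results.length < steps.length) {
      results.push({ success: false, error: message });
    }
    return results;
  }
}

const batch = runBatch([
  () => ({ success: true }),
  () => ({ success: false, error: 'nope' }),
  () => ({ success: true }), // never executed; padded with the batch error
]);
```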

@@ -1,159 +0,0 @@
/**
 * Dashboard settings mutation handlers
 */

import type { TimeSettingsSpec } from '@grafana/schema/src/schema/dashboard/v2beta1/types.spec.gen';

import { DASHBOARD_MCP_TOOLS } from '../mcpTools';
import type { MutationResult, MutationChange, UpdateDashboardMetaPayload, AddRowPayload } from '../types';

import type { MutationContext } from './types';

/**
 * Add a row (stub - not fully implemented)
 */
export async function handleAddRow(_payload: AddRowPayload, _context: MutationContext): Promise<MutationResult> {
  return {
    success: true,
    changes: [],
    warnings: ['Add row is not fully implemented in POC - requires RowsLayout'],
  };
}

/**
 * Update dashboard time settings
 */
export async function handleUpdateTimeSettings(
  payload: Partial<TimeSettingsSpec>,
  context: MutationContext
): Promise<MutationResult> {
  const { scene, transaction } = context;
  const { from, to, timezone, autoRefresh } = payload;

  try {
    const timeRange = scene.state.$timeRange;
    if (!timeRange) {
      throw new Error('Dashboard has no time range');
    }

    const previousState = { ...timeRange.state };

    // Apply updates based on TimeSettingsSpec fields
    const updates: Record<string, unknown> = {};
    if (from !== undefined) {
      updates.from = from;
    }
    if (to !== undefined) {
      updates.to = to;
    }
    if (timezone !== undefined) {
      updates.timeZone = timezone;
    }
    if (autoRefresh !== undefined) {
      // autoRefresh maps to the dashboard refresh interval
      updates.refreshInterval = autoRefresh;
    }

    timeRange.setState(updates);

    const changes: MutationChange[] = [{ path: '/timeSettings', previousValue: previousState, newValue: updates }];
    transaction.changes.push(...changes);

    return {
      success: true,
      inverseMutation: {
        type: 'UPDATE_TIME_SETTINGS',
        // eslint-disable-next-line @typescript-eslint/consistent-type-assertions
        payload: previousState as Partial<TimeSettingsSpec>,
      },
      changes,
    };
  } catch (error) {
    return {
      success: false,
      error: error instanceof Error ? error.message : String(error),
      changes: [],
    };
  }
}

/**
 * Update dashboard metadata (title, description, tags, etc.)
 */
export async function handleUpdateDashboardMeta(
  payload: UpdateDashboardMetaPayload,
  context: MutationContext
): Promise<MutationResult> {
  const { scene, transaction } = context;
  const { title, description, tags, editable } = payload;

  try {
    const previousState = {
      title: scene.state.title,
      description: scene.state.description,
      tags: scene.state.tags,
      editable: scene.state.editable,
    };

    // Apply updates
    const updates: Partial<UpdateDashboardMetaPayload> = {};
    if (title !== undefined) {
      updates.title = title;
    }
    if (description !== undefined) {
      updates.description = description;
    }
    if (tags !== undefined) {
      updates.tags = tags;
    }
    if (editable !== undefined) {
      updates.editable = editable;
    }

    scene.setState(updates);

    const changes: MutationChange[] = [{ path: '/meta', previousValue: previousState, newValue: updates }];
    transaction.changes.push(...changes);

    return {
      success: true,
      inverseMutation: {
        type: 'UPDATE_DASHBOARD_META',
        payload: previousState,
      },
      changes,
    };
  } catch (error) {
    return {
      success: false,
      error: error instanceof Error ? error.message : String(error),
      changes: [],
    };
  }
}

/**
 * Get dashboard info (read-only operation)
 */
export async function handleGetDashboardInfo(
  _payload: Record<string, never>,
  context: MutationContext
): Promise<MutationResult> {
  const { scene } = context;

  // Return dashboard info in the result's data field
  const info = {
    available: true,
    uid: scene.state.uid,
    title: scene.state.title,
    canEdit: scene.canEditDashboard(),
    isEditing: scene.state.isEditing ?? false,
    availableTools: DASHBOARD_MCP_TOOLS.map((t) => t.name),
  };

  return {
    success: true,
    changes: [],
    data: info,
  };
}
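`handleUpdateTimeSettings` and `handleUpdateDashboardMeta` both return an `inverseMutation` built from the previous state, so applying the inverse payload restores the original values. A self-contained sketch of that round trip (local stand-in types and a hypothetical `applyMeta` helper, not the real handlers):

```typescript
type Meta = { title?: string; description?: string };

// Applies updates and returns the inverse payload: the prior values of exactly
// the touched fields, mirroring the previousState capture in the handlers above.
function applyMeta(state: Meta, updates: Partial<Meta>): { next: Meta; inverse: Partial<Meta> } {
  const inverse: Partial<Meta> = {};
  for (const key of Object.keys(updates) as Array<keyof Meta>) {
    inverse[key] = state[key];
  }
  return { next: { ...state, ...updates }, inverse };
}

const original: Meta = { title: 'Prod overview', description: 'All services' };
const { next, inverse } = applyMeta(original, { title: 'Staging overview' });
const { next: restored } = applyMeta(next, inverse);
// restored.title is back to 'Prod overview'; description was never touched
```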

@@ -1,27 +0,0 @@
/**
 * Mutation Handlers
 *
 * Pure functions that implement dashboard mutations.
 * Each handler receives a payload and context, and returns a MutationResult.
 */

// Types
// eslint-disable-next-line no-barrel-files/no-barrel-files
export type { MutationContext, MutationTransactionInternal, MutationHandler } from './types';

// Panel handlers
// eslint-disable-next-line no-barrel-files/no-barrel-files
export { handleAddPanel, handleRemovePanel, handleUpdatePanel, handleMovePanel } from './panelHandlers';

// Variable handlers
// eslint-disable-next-line no-barrel-files/no-barrel-files
export { handleAddVariable, handleRemoveVariable } from './variableHandlers';

// Dashboard handlers
// eslint-disable-next-line no-barrel-files/no-barrel-files
export {
  handleAddRow,
  handleUpdateTimeSettings,
  handleUpdateDashboardMeta,
  handleGetDashboardInfo,
} from './dashboardHandlers';

@@ -1,224 +0,0 @@
/**
 * Panel mutation handlers
 */

import type { MutationResult, MutationChange, AddPanelPayload, RemovePanelPayload, UpdatePanelPayload } from '../types';

import type { MutationContext } from './types';

/**
 * Add a new panel to the dashboard
 */
export async function handleAddPanel(payload: AddPanelPayload, context: MutationContext): Promise<MutationResult> {
  const { scene, transaction } = context;

  try {
    // Extract values with defaults.
    // Top-level fields take precedence, then spec fields, then defaults.
    const title = payload.title ?? payload.spec?.title ?? 'New Panel';
    // VizConfigKind.group contains the plugin ID
    const vizType = payload.vizType ?? payload.spec?.vizConfig?.group ?? 'timeseries';
    const description = payload.description ?? payload.spec?.description ?? '';

    // Position is for future layout placement (not yet implemented)
    const _position = payload.position;
    void _position; // Suppress unused variable warning until layout positioning is implemented

    // Generate unique element name
    const elementName = `panel-${title.toLowerCase().replace(/[^a-z0-9]/g, '-')}-${Date.now()}`;

    // Use scene's addPanel method (simplified for POC)
    const body = scene.state.body;
    if (!body) {
      throw new Error('Dashboard has no body');
    }

    // For POC: create a basic panel using VizPanel directly.
    // A real implementation would use proper panel building utilities.
    const { VizPanel } = await import('@grafana/scenes');

    const vizPanel = new VizPanel({
      title,
      pluginId: vizType,
      description,
      options: {},
      fieldConfig: { defaults: {}, overrides: [] },
      key: elementName,
    });

    // Add panel to scene
    scene.addPanel(vizPanel);

    const changes: MutationChange[] = [
      { path: `/elements/${elementName}`, previousValue: undefined, newValue: { title, vizType } },
    ];
    transaction.changes.push(...changes);

    return {
      success: true,
      inverseMutation: {
        type: 'REMOVE_PANEL',
        payload: { elementName },
      },
      changes,
    };
  } catch (error) {
    return {
      success: false,
      error: error instanceof Error ? error.message : String(error),
      changes: [],
    };
  }
}

/**
 * Remove a panel from the dashboard
 */
export async function handleRemovePanel(
  payload: RemovePanelPayload,
  context: MutationContext
): Promise<MutationResult> {
  const { scene, transaction } = context;
  const { elementName, panelId } = payload;

  try {
    // Find the panel
    const body = scene.state.body;
    if (!body) {
      throw new Error('Dashboard has no body');
    }

    // Find panel by element name or ID
    const { VizPanel } = await import('@grafana/scenes');
    let panelToRemove: InstanceType<typeof VizPanel> | null = null;
    let panelState: Record<string, unknown> = {};

    // Search through the scene's panels
    const panels = body.getVizPanels?.() || [];
    for (const panel of panels) {
      const state = panel.state;
      if (elementName && state.key === elementName) {
        panelToRemove = panel;
        panelState = { ...state };
        break;
      }
      // panelId is stored internally, use key for matching
      if (panelId !== undefined && state.key && String(state.key).includes(String(panelId))) {
        panelToRemove = panel;
        panelState = { ...state };
        break;
      }
    }

    if (!panelToRemove) {
      throw new Error(`Panel not found: ${elementName || panelId}`);
    }

    // Remove the panel
    scene.removePanel(panelToRemove);

    const changes: MutationChange[] = [
      { path: `/elements/${elementName || panelId}`, previousValue: panelState, newValue: undefined },
    ];
    transaction.changes.push(...changes);

    return {
      success: true,
      // The inverse of a removal is re-adding the panel. Only the basic fields
      // captured above are restored; a full restore would need the complete spec.
      inverseMutation: {
        type: 'ADD_PANEL',
        payload: {
          title: typeof panelState.title === 'string' ? panelState.title : undefined,
          vizType: typeof panelState.pluginId === 'string' ? panelState.pluginId : undefined,
        },
      },
      changes,
    };
  } catch (error) {
    return {
      success: false,
      error: error instanceof Error ? error.message : String(error),
      changes: [],
    };
  }
}

/**
 * Update an existing panel
 */
export async function handleUpdatePanel(
  payload: UpdatePanelPayload,
  context: MutationContext
): Promise<MutationResult> {
  const { scene, transaction } = context;
  const { elementName, panelId, updates } = payload;

  try {
    // Find the panel
    const body = scene.state.body;
    if (!body) {
      throw new Error('Dashboard has no body');
    }

    const { VizPanel } = await import('@grafana/scenes');
    const panels = body.getVizPanels?.() || [];
    let panelToUpdate: InstanceType<typeof VizPanel> | null = null;

    for (const panel of panels) {
      const state = panel.state;
      if (elementName && state.key === elementName) {
        panelToUpdate = panel;
        break;
      }
      // panelId is stored internally, use key for matching
      if (panelId !== undefined && state.key && String(state.key).includes(String(panelId))) {
        panelToUpdate = panel;
        break;
      }
    }

    if (!panelToUpdate) {
      throw new Error(`Panel not found: ${elementName || panelId}`);
    }

    // Store previous state for rollback
    const previousState = { ...panelToUpdate.state };

    // Apply updates from PanelSpec
    if (updates.title !== undefined) {
      panelToUpdate.setState({ title: updates.title });
    }
    if (updates.description !== undefined) {
      panelToUpdate.setState({ description: updates.description });
    }
    // More updates would be handled here based on PanelSpec fields

    const changes: MutationChange[] = [
      { path: `/elements/${elementName || panelId}`, previousValue: previousState, newValue: updates },
    ];
    transaction.changes.push(...changes);

    return {
      success: true,
      inverseMutation: {
        type: 'UPDATE_PANEL',
        // eslint-disable-next-line @typescript-eslint/consistent-type-assertions
        payload: { elementName, panelId, updates: previousState as UpdatePanelPayload['updates'] },
      },
      changes,
    };
  } catch (error) {
    return {
      success: false,
      error: error instanceof Error ? error.message : String(error),
      changes: [],
    };
  }
}

/**
 * Move a panel (stub - not fully implemented)
 */
export async function handleMovePanel(): Promise<MutationResult> {
  return {
    success: true,
    changes: [],
    warnings: ['Move panel is not fully implemented in POC'],
  };
}

@@ -1,29 +0,0 @@
/**
 * Shared types for mutation handlers
 */

import type { DashboardScene } from '../../scene/DashboardScene';
import type { MutationResult, MutationChange, MutationType, MutationPayloadMap, MutationTransaction } from '../types';

/**
 * Context passed to all mutation handlers
 */
export interface MutationContext {
  scene: DashboardScene;
  transaction: MutationTransactionInternal;
}

/**
 * Internal transaction type with mutable changes array
 */
export interface MutationTransactionInternal extends MutationTransaction {
  changes: MutationChange[];
}

/**
 * A mutation handler function
 */
export type MutationHandler<T extends MutationType = MutationType> = (
  payload: MutationPayloadMap[T],
  context: MutationContext
) => Promise<MutationResult>;
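Any async function matching this shape can be registered via `registerHandler`. A minimal, self-contained sketch (local stand-in types rather than the real imports, and a hypothetical `handleRenameSketch` handler) of a conforming handler that records its change on the transaction:

```typescript
type SketchResult = { success: boolean; changes: string[]; error?: string };
type SketchContext = { transaction: { changes: string[] } };
type SketchHandler<P> = (payload: P, context: SketchContext) => Promise<SketchResult>;

// Hypothetical handler: renames something and records the change.
const handleRenameSketch: SketchHandler<{ name: string }> = async (payload, context) => {
  const change = `/meta/title -> ${payload.name}`;
  context.transaction.changes.push(change); // handlers append to the shared transaction log
  return { success: true, changes: [change] };
};

const ctx: SketchContext = { transaction: { changes: [] } };
const pending = handleRenameSketch({ name: 'CPU Usage' }, ctx);
```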

@@ -1,72 +0,0 @@
/**
 * Variable mutation handlers
 */

import type { MutationResult, MutationChange, AddVariablePayload, RemoveVariablePayload } from '../types';

import type { MutationContext } from './types';

/**
 * Add a variable (stub - not fully implemented)
 */
export async function handleAddVariable(
  _payload: AddVariablePayload,
  _context: MutationContext
): Promise<MutationResult> {
  // TODO: Variable creation requires access to internal serialization functions
  // (createSceneVariableFromVariableModel) which are not currently exported.
  // This needs to be addressed by exporting the function or creating a public API.
  return {
    success: true,
    changes: [],
    warnings: ['Add variable is not fully implemented in POC - requires exported variable factory'],
  };
}

/**
 * Remove a variable from the dashboard
 */
export async function handleRemoveVariable(
  payload: RemoveVariablePayload,
  context: MutationContext
): Promise<MutationResult> {
  const { scene, transaction } = context;
  const { name } = payload;

  try {
    const variables = scene.state.$variables;
    if (!variables) {
      throw new Error('Dashboard has no variable set');
    }

    const variable = variables.getByName(name);
    if (!variable) {
      throw new Error(`Variable '${name}' not found`);
    }

    const previousState = variable.state;

    // Remove variable
    variables.setState({
      variables: variables.state.variables.filter((v: { state: { name: string } }) => v.state.name !== name),
    });

    const changes: MutationChange[] = [
      { path: `/variables/${name}`, previousValue: previousState, newValue: undefined },
    ];
    transaction.changes.push(...changes);

    // An inverse mutation would need to reconstruct the VariableKind from the
    // SceneVariable state, so it is omitted in this simplified POC version.
    return {
      success: true,
      changes,
    };
  } catch (error) {
    return {
      success: false,
      error: error instanceof Error ? error.message : String(error),
      changes: [],
    };
  }
}

@@ -1,76 +0,0 @@
/**
 * Dashboard Mutation API
 *
 * This module provides a stable API for programmatic dashboard modifications.
 * It is designed for use by Grafana Assistant and other tools that need to modify dashboards.
 *
 * @example
 * ```typescript
 * import { getDashboardMutationAPI } from '@grafana/runtime';
 *
 * const api = getDashboardMutationAPI();
 * if (api && api.canEdit()) {
 *   // Simple: just title and vizType
 *   const result = await api.execute({
 *     type: 'ADD_PANEL',
 *     payload: { title: 'CPU Usage', vizType: 'timeseries' },
 *   });
 *
 *   // Advanced: with full spec
 *   const result2 = await api.execute({
 *     type: 'ADD_PANEL',
 *     payload: {
 *       title: 'Memory Usage',
 *       spec: {
 *         vizConfig: { kind: 'VizConfig', spec: { pluginId: 'stat' } },
 *         data: { kind: 'QueryGroup', spec: { queries: [] } },
 *       },
 *     },
 *   });
 * }
 * ```
 */

// Types - intentionally re-exported as public API surface
// eslint-disable-next-line no-barrel-files/no-barrel-files
export type {
  // Mutation types
  MutationType,
  Mutation,
  MutationPayloadMap,
  MutationResult,
  MutationChange,
  MutationTransaction,
  MutationEvent,

  // Payload types (use schema types directly where possible)
  AddPanelPayload,
  RemovePanelPayload,
  UpdatePanelPayload,
  MovePanelPayload,
  DuplicatePanelPayload,
  AddVariablePayload,
  RemoveVariablePayload,
  UpdateVariablePayload,
  AddRowPayload,
  RemoveRowPayload,
  CollapseRowPayload,
  UpdateTimeSettingsPayload,
  UpdateDashboardMetaPayload,

  // Supporting types
  LayoutPosition,

  // MCP types
  MCPToolDefinition,
  MCPResourceDefinition,
  MCPPromptDefinition,
} from './types';

// Mutation Executor
// eslint-disable-next-line no-barrel-files/no-barrel-files
export { MutationExecutor } from './MutationExecutor';

// MCP Tool Definitions
// eslint-disable-next-line no-barrel-files/no-barrel-files
export { DASHBOARD_MCP_TOOLS, DASHBOARD_MCP_RESOURCES, DASHBOARD_MCP_PROMPTS } from './mcpTools';

File diff suppressed because it is too large

@@ -1,420 +0,0 @@
/**
 * Dashboard Mutation API - Core Types
 *
 * This module defines the types for the MCP-based dashboard mutation API.
 * It provides a standardized interface for programmatic dashboard modifications.
 */

/**
 * Import v2 schema types - these are the source of truth.
 *
 * The mutation API uses these types directly to ensure compatibility with the dashboard schema.
 * No custom payload types are created - we use schema types with Omit for auto-generated fields.
 */
import type {
  // Panel types
  PanelSpec,
  DataLink,
  // Variable types
  VariableKind,
  // Layout types
  GridLayoutItemSpec,
  RowsLayoutRowSpec,
  TabsLayoutTabSpec,
  AutoGridLayoutSpec,
  RepeatOptions,
  ConditionalRenderingGroupSpec,
  // Annotation types
  AnnotationQuerySpec,
  // Dashboard types
  DashboardLink,
  TimeSettingsSpec,
  // Field config types
  DynamicConfigValue,
  MatcherConfig,
  ValueMapping,
  DataTransformerConfig,
} from '@grafana/schema/src/schema/dashboard/v2beta1/types.spec.gen';

// ============================================================================
// Mutation Types
// ============================================================================

export type MutationType =
  // Panel operations
  | 'ADD_PANEL'
  | 'REMOVE_PANEL'
  | 'UPDATE_PANEL'
  | 'MOVE_PANEL'
  | 'DUPLICATE_PANEL'
  // Variable operations
  | 'ADD_VARIABLE'
  | 'REMOVE_VARIABLE'
  | 'UPDATE_VARIABLE'
  // Row operations
  | 'ADD_ROW'
  | 'REMOVE_ROW'
  | 'COLLAPSE_ROW'
  // Tab operations
  | 'ADD_TAB'
  | 'REMOVE_TAB'
  // Library panel operations
  | 'ADD_LIBRARY_PANEL'
  | 'UNLINK_LIBRARY_PANEL'
  | 'SAVE_AS_LIBRARY_PANEL'
  // Repeat configuration
  | 'CONFIGURE_PANEL_REPEAT'
  | 'CONFIGURE_ROW_REPEAT'
  // Conditional rendering
  | 'SET_CONDITIONAL_RENDERING'
  // Layout
  | 'CHANGE_LAYOUT_TYPE'
  // Annotation operations
  | 'ADD_ANNOTATION'
  | 'UPDATE_ANNOTATION'
  | 'REMOVE_ANNOTATION'
  // Link operations
  | 'ADD_DASHBOARD_LINK'
  | 'REMOVE_DASHBOARD_LINK'
  | 'ADD_PANEL_LINK'
  | 'ADD_DATA_LINK'
  // Field configuration
  | 'ADD_FIELD_OVERRIDE'
  | 'ADD_VALUE_MAPPING'
  | 'ADD_TRANSFORMATION'
  // Dashboard settings
  | 'UPDATE_TIME_SETTINGS'
  | 'UPDATE_DASHBOARD_META'
  // Dashboard management (backend)
  | 'MOVE_TO_FOLDER'
  | 'TOGGLE_FAVORITE'
  // Version management (backend)
  | 'LIST_VERSIONS'
  | 'COMPARE_VERSIONS'
  | 'RESTORE_VERSION'
  // Read-only operations
  | 'GET_DASHBOARD_INFO';

// ============================================================================
// Mutation Payloads
// ============================================================================

/**
 * Payload for adding a panel.
 *
 * Uses Partial<PanelSpec> so callers can provide just the fields they care about.
 * Missing fields are filled with sensible defaults (title defaults to "New Panel", etc.)
 * The `id` field is always auto-generated by the system.
 *
 * Minimal example: { title: "My Panel" }
 * Full example: { title: "My Panel", description: "...", vizConfig: {...}, data: {...} }
 */
export interface AddPanelPayload {
  /** Panel title (required for meaningful panels) */
  title?: string;
  /** Visualization type shorthand (e.g., "timeseries", "stat", "table") */
  vizType?: string;
  /** Panel description */
  description?: string;
  /** Full panel spec - for advanced use cases. Top-level fields take precedence over the corresponding fields here. */
  spec?: Partial<Omit<PanelSpec, 'id'>>;
  /** Position in the layout */
  position?: LayoutPosition;
}
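As implemented in `handleAddPanel`, top-level payload fields take precedence over `spec` fields, which in turn take precedence over defaults. A self-contained sketch of that resolution rule (local stand-in type and a hypothetical `resolveTitle` helper, not the real payload type):

```typescript
type SketchAddPanelPayload = { title?: string; spec?: { title?: string } };

// Mirrors the `payload.title ?? payload.spec?.title ?? 'New Panel'` chain.
function resolveTitle(payload: SketchAddPanelPayload): string {
  return payload.title ?? payload.spec?.title ?? 'New Panel';
}

const fromDefault = resolveTitle({});
const fromSpec = resolveTitle({ spec: { title: 'From spec' } });
const fromTop = resolveTitle({ title: 'Top level', spec: { title: 'From spec' } });
```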

export interface RemovePanelPayload {
  /** Element name in the elements map */
  elementName?: string;
  /** Alternative: Panel ID */
  panelId?: number;
}

export interface UpdatePanelPayload {
  /** Element name or panel ID to update */
  elementName?: string;
  panelId?: number;
  /** Updates to apply - partial PanelSpec (id cannot be changed) */
  updates: Partial<Omit<PanelSpec, 'id'>>;
}

export interface MovePanelPayload {
  /** Element name to move */
  elementName: string;
  /** Target position */
  targetPosition: LayoutPosition;
}

export interface DuplicatePanelPayload {
  /** Element name to duplicate */
  elementName: string;
  /** New title (optional, defaults to "Copy of {original}") */
  newTitle?: string;
}

/**
 * Payload for adding a variable.
 * Uses VariableKind from schema directly - the union of all variable types.
 */
export interface AddVariablePayload {
  /** The complete variable definition from v2 schema */
  variable: VariableKind;
  /** Position in the variables array (optional, appends if not specified) */
  position?: number;
}

export interface RemoveVariablePayload {
  /** Variable name to remove */
  name: string;
}

export interface UpdateVariablePayload {
  /** Variable name to update */
  name: string;
  /** The updated variable definition - replaces the existing one */
  variable: VariableKind;
}

/**
 * Payload for adding a row.
 * Uses RowsLayoutRowSpec from schema, but layout is optional (created empty).
 */
export interface AddRowPayload {
  /** Row spec - uses schema type. Layout is created empty if not provided. */
  spec: Omit<RowsLayoutRowSpec, 'layout'> & {
    layout?: RowsLayoutRowSpec['layout'];
  };
  /** Position index (0 = first) */
  position?: number;
}

export interface RemoveRowPayload {
  /** Row title or index to identify the row */
  rowTitle?: string;
  rowIndex?: number;
  /** What to do with panels in the row */
  panelHandling?: 'delete' | 'moveToRoot';
}

export interface CollapseRowPayload {
  /** Row title or index to identify the row */
  rowTitle?: string;
  rowIndex?: number;
  /** Whether to collapse or expand */
  collapsed: boolean;
}

/**
 * Payload for updating time settings.
 * Uses TimeSettingsSpec from schema.
 */
export type UpdateTimeSettingsPayload = Partial<TimeSettingsSpec>;

/**
 * Payload for updating dashboard metadata.
 * These are top-level DashboardV2Spec fields.
 */
export interface UpdateDashboardMetaPayload {
  title?: string;
  description?: string;
  tags?: string[];
  editable?: boolean;
  preload?: boolean;
  liveNow?: boolean;
}

// ============================================================================
// Supporting Types - derived from schema types
// ============================================================================

/**
 * Layout position for placing elements.
 * Combines GridLayoutItemSpec position fields with container targeting.
 */
export type LayoutPosition = Pick<GridLayoutItemSpec, 'x' | 'y' | 'width' | 'height' | 'repeat'> & {
  /** Target row title (for RowsLayout) */
|
||||
targetRow?: string;
|
||||
/** Target tab title (for TabsLayout) */
|
||||
targetTab?: string;
|
||||
};
|
||||
|
||||
// ============================================================================
|
||||
// Mutation Definition
|
||||
// ============================================================================
|
||||
|
||||
export interface Mutation<T extends MutationType = MutationType> {
|
||||
type: T;
|
||||
payload: MutationPayloadMap[T];
|
||||
}
|
||||
|
||||
export interface MutationPayloadMap {
|
||||
// Panel operations
|
||||
ADD_PANEL: AddPanelPayload;
|
||||
REMOVE_PANEL: RemovePanelPayload;
|
||||
UPDATE_PANEL: UpdatePanelPayload;
|
||||
MOVE_PANEL: MovePanelPayload;
|
||||
DUPLICATE_PANEL: DuplicatePanelPayload;
|
||||
|
||||
// Variable operations
|
||||
ADD_VARIABLE: AddVariablePayload;
|
||||
REMOVE_VARIABLE: RemoveVariablePayload;
|
||||
UPDATE_VARIABLE: UpdateVariablePayload;
|
||||
|
||||
// Row operations
|
||||
ADD_ROW: AddRowPayload;
|
||||
REMOVE_ROW: RemoveRowPayload;
|
||||
COLLAPSE_ROW: CollapseRowPayload;
|
||||
|
||||
// Tab operations - uses TabsLayoutTabSpec from schema
|
||||
ADD_TAB: {
|
||||
spec: Omit<TabsLayoutTabSpec, 'layout'> & { layout?: TabsLayoutTabSpec['layout'] };
|
||||
position?: number;
|
||||
};
|
||||
REMOVE_TAB: { tabTitle?: string; tabIndex?: number; panelHandling?: 'delete' | 'moveToRoot' };
|
||||
|
||||
// Library panel operations
|
||||
ADD_LIBRARY_PANEL: { libraryPanelUid?: string; libraryPanelName?: string; position?: LayoutPosition };
|
||||
UNLINK_LIBRARY_PANEL: { elementName: string };
|
||||
SAVE_AS_LIBRARY_PANEL: { elementName: string; libraryPanelName: string; folderUid?: string };
|
||||
|
||||
// Repeat configuration - uses RepeatOptions from schema
|
||||
CONFIGURE_PANEL_REPEAT: { elementName: string; repeat: RepeatOptions | null };
|
||||
CONFIGURE_ROW_REPEAT: { rowTitle?: string; rowIndex?: number; repeat: RowsLayoutRowSpec['repeat'] | null };
|
||||
|
||||
// Conditional rendering - uses ConditionalRenderingGroupSpec from schema
|
||||
SET_CONDITIONAL_RENDERING: {
|
||||
elementName: string;
|
||||
conditionalRendering: ConditionalRenderingGroupSpec | null;
|
||||
};
|
||||
|
||||
// Layout - uses AutoGridLayoutSpec for options
|
||||
CHANGE_LAYOUT_TYPE: {
|
||||
layoutType: 'GridLayout' | 'RowsLayout' | 'AutoGridLayout' | 'TabsLayout';
|
||||
options?: Partial<AutoGridLayoutSpec>;
|
||||
};
|
||||
|
||||
// Annotation operations - uses AnnotationQuerySpec from schema
|
||||
ADD_ANNOTATION: Omit<AnnotationQuerySpec, 'query'> & { query?: AnnotationQuerySpec['query'] };
|
||||
UPDATE_ANNOTATION: { name: string; updates: Partial<AnnotationQuerySpec> };
|
||||
REMOVE_ANNOTATION: { name: string };
|
||||
|
||||
// Link operations - uses DashboardLink from schema
|
||||
ADD_DASHBOARD_LINK: DashboardLink;
|
||||
REMOVE_DASHBOARD_LINK: { title?: string; index?: number };
|
||||
|
||||
// Panel link operations - uses DataLink from schema
|
||||
ADD_PANEL_LINK: { elementName: string; link: DataLink };
|
||||
ADD_DATA_LINK: { elementName: string; link: DataLink };
|
||||
|
||||
// Field configuration - uses schema types
|
||||
ADD_FIELD_OVERRIDE: {
|
||||
elementName: string;
|
||||
matcher: MatcherConfig;
|
||||
properties: DynamicConfigValue[];
|
||||
};
|
||||
ADD_VALUE_MAPPING: { elementName: string; mapping: ValueMapping };
|
||||
ADD_TRANSFORMATION: { elementName: string; transformation: Omit<DataTransformerConfig, 'id'> & { id: string } };
|
||||
|
||||
// Dashboard settings
|
||||
UPDATE_TIME_SETTINGS: UpdateTimeSettingsPayload;
|
||||
UPDATE_DASHBOARD_META: UpdateDashboardMetaPayload;
|
||||
|
||||
// Dashboard management (backend)
|
||||
MOVE_TO_FOLDER: { folderUid?: string; folderTitle?: string };
|
||||
TOGGLE_FAVORITE: { favorite: boolean };
|
||||
|
||||
// Version management (backend)
|
||||
LIST_VERSIONS: { limit?: number };
|
||||
COMPARE_VERSIONS: { baseVersion: number; newVersion: number };
|
||||
RESTORE_VERSION: { version: number };
|
||||
|
||||
// Read-only operations (no payload required)
|
||||
GET_DASHBOARD_INFO: Record<string, never>;
|
||||
}
|
||||
|
||||
// ============================================================================
|
||||
// Mutation Result
|
||||
// ============================================================================
|
||||
|
||||
export interface MutationResult {
|
||||
success: boolean;
|
||||
/** Mutation to apply to undo this change */
|
||||
inverseMutation?: Mutation;
|
||||
/** Changes that were applied */
|
||||
changes: MutationChange[];
|
||||
/** Error message if failed */
|
||||
error?: string;
|
||||
/** Warnings (non-fatal issues) */
|
||||
warnings?: string[];
|
||||
/** Data returned by read-only operations (e.g., GET_DASHBOARD_INFO) */
|
||||
data?: unknown;
|
||||
}
|
||||
|
||||
export interface MutationChange {
|
||||
path: string;
|
||||
previousValue: unknown;
|
||||
newValue: unknown;
|
||||
}
|
||||
|
||||
// ============================================================================
|
||||
// Transaction
|
||||
// ============================================================================
|
||||
|
||||
export interface MutationTransaction {
|
||||
id: string;
|
||||
mutations: Mutation[];
|
||||
status: 'pending' | 'committed' | 'rolled_back';
|
||||
startedAt: number;
|
||||
completedAt?: number;
|
||||
}
|
||||
|
||||
// ============================================================================
|
||||
// Event Types
|
||||
// ============================================================================
|
||||
|
||||
export interface MutationEvent {
|
||||
type: 'mutation_applied' | 'mutation_failed' | 'mutation_rolled_back';
|
||||
mutation: Mutation;
|
||||
result: MutationResult;
|
||||
transaction?: MutationTransaction;
|
||||
timestamp: number;
|
||||
source: 'assistant' | 'ui' | 'api';
|
||||
}
|
||||
|
||||
// ============================================================================
|
||||
// MCP Tool Types
|
||||
// ============================================================================
|
||||
|
||||
export interface MCPToolDefinition {
|
||||
name: string;
|
||||
description: string;
|
||||
inputSchema: {
|
||||
type: 'object';
|
||||
properties: Record<string, unknown>;
|
||||
required?: string[];
|
||||
};
|
||||
annotations?: {
|
||||
title?: string;
|
||||
readOnlyHint?: boolean;
|
||||
destructiveHint?: boolean;
|
||||
idempotentHint?: boolean;
|
||||
confirmationHint?: boolean;
|
||||
};
|
||||
}
|
||||
|
||||
export interface MCPResourceDefinition {
|
||||
uri: string;
|
||||
uriTemplate?: boolean;
|
||||
name: string;
|
||||
description: string;
|
||||
mimeType: string;
|
||||
}
|
||||
|
||||
export interface MCPPromptDefinition {
|
||||
name: string;
|
||||
description: string;
|
||||
arguments: Array<{
|
||||
name: string;
|
||||
description: string;
|
||||
required: boolean;
|
||||
}>;
|
||||
}
|
||||
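The types above tie each mutation's `type` string to its payload shape through the `MutationPayloadMap` lookup. A minimal, self-contained sketch of that pattern (using a reduced two-entry payload map rather than the real one, so nothing here is the actual Grafana code):

```typescript
// Reduced stand-ins for the payload interfaces defined above.
interface UpdateDashboardMetaPayload {
  title?: string;
  tags?: string[];
}
interface RemoveVariablePayload {
  name: string;
}

// A two-entry payload map standing in for the full MutationPayloadMap.
interface PayloadMap {
  UPDATE_DASHBOARD_META: UpdateDashboardMetaPayload;
  REMOVE_VARIABLE: RemoveVariablePayload;
}
type MutationType = keyof PayloadMap;

// Same shape as the Mutation<T> interface above: the generic parameter
// narrows `payload` to the entry matching `type`.
interface Mutation<T extends MutationType = MutationType> {
  type: T;
  payload: PayloadMap[T];
}

// Correctly typed mutation; pairing this `type` with a RemoveVariablePayload
// would be a compile-time error.
const m: Mutation<'UPDATE_DASHBOARD_META'> = {
  type: 'UPDATE_DASHBOARD_META',
  payload: { title: 'Checkout service', tags: ['prod'] },
};
console.log(m.type);
```

The lookup-type trick means an executor can switch on `mutation.type` and have `mutation.payload` narrowed automatically, without unchecked casts at every call site.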
@@ -2,13 +2,7 @@ import * as H from 'history';

import { CoreApp, DataQueryRequest, locationUtil, NavIndex, NavModelItem } from '@grafana/data';
import { t } from '@grafana/i18n';
import {
  config,
  locationService,
  RefreshEvent,
  setDashboardMutationAPI,
  type DashboardMutationAPI,
} from '@grafana/runtime';
import { config, locationService, RefreshEvent } from '@grafana/runtime';
import {
  sceneGraph,
  SceneObject,
@@ -51,9 +45,6 @@ import {
} from '../../apiserver/types';
import { DashboardEditPane } from '../edit-pane/DashboardEditPane';
import { dashboardEditActions } from '../edit-pane/shared';
import { MutationExecutor } from '../mutation-api/MutationExecutor';
import { DASHBOARD_MCP_TOOLS } from '../mutation-api/mcpTools';
import type { Mutation } from '../mutation-api/types';
import { PanelEditor } from '../panel-edit/PanelEditor';
import { DashboardSceneChangeTracker } from '../saving/DashboardSceneChangeTracker';
import { SaveDashboardDrawer } from '../saving/SaveDashboardDrawer';
@@ -102,11 +93,6 @@ import { clearClipboard } from './layouts-shared/paste';
import { DashboardLayoutManager } from './types/DashboardLayoutManager';
import { LayoutParent } from './types/LayoutParent';

// Type for window with mutation API (for cross-bundle access with plugins)
interface WindowWithMutationAPI extends Window {
  __grafanaDashboardMutationAPI?: DashboardMutationAPI | null;
}

export const PERSISTED_PROPS = ['title', 'description', 'tags', 'editable', 'graphTooltip', 'links', 'meta', 'preload'];
export const PANEL_SEARCH_VAR = 'systemPanelFilterVar';
export const PANELS_PER_ROW_VAR = 'systemDynamicRowSizeVar';
@@ -233,9 +219,6 @@ export class DashboardScene extends SceneObjectBase<DashboardSceneState> impleme

    window.__grafanaSceneContext = this;

    // Register Dashboard Mutation API for Grafana Assistant and other tools
    this._registerMutationAPI();

    this._initializePanelSearch();

    if (this.state.isEditing) {
@@ -264,10 +247,6 @@ export class DashboardScene extends SceneObjectBase<DashboardSceneState> impleme
    // Deactivation logic
    return () => {
      window.__grafanaSceneContext = prevSceneContext;
      // Clear mutation API
      setDashboardMutationAPI(null);
      // eslint-disable-next-line @typescript-eslint/consistent-type-assertions
      (window as WindowWithMutationAPI).__grafanaDashboardMutationAPI = null;
      clearKeyBindings();
      this._changeTracker.terminate();
      oldDashboardWrapper.destroy();
@@ -275,49 +254,6 @@ export class DashboardScene extends SceneObjectBase<DashboardSceneState> impleme
    };
  }

  /**
   * Register the Dashboard Mutation API for use by Grafana Assistant and other tools.
   * This provides a stable interface for programmatic dashboard modifications.
   *
   * The API is exposed on window.__grafanaDashboardMutationAPI for cross-bundle access,
   * since plugins use a different @grafana/runtime bundle.
   */
  private _registerMutationAPI() {
    const dashboard = this;
    const executor = new MutationExecutor();
    executor.setScene(this);

    const api: DashboardMutationAPI = {
      // eslint-disable-next-line @typescript-eslint/consistent-type-assertions
      execute: (mutation) => executor.executeMutation(mutation as Mutation),
      canEdit: () => dashboard.canEditDashboard(),
      getDashboardUID: () => dashboard.state.uid,
      getDashboardTitle: () => dashboard.state.title,
      isEditing: () => dashboard.state.isEditing ?? false,
      enterEditMode: () => {
        if (!dashboard.state.isEditing) {
          dashboard.onEnterEditMode();
        }
      },
      getTools: () => DASHBOARD_MCP_TOOLS,
      getDashboardInfo: () => ({
        available: true,
        uid: dashboard.state.uid,
        title: dashboard.state.title,
        canEdit: dashboard.canEditDashboard(),
        isEditing: dashboard.state.isEditing ?? false,
        availableTools: DASHBOARD_MCP_TOOLS.map((t) => t.name),
      }),
    };

    // Register via @grafana/runtime for same-bundle access
    setDashboardMutationAPI(api);

    // Also expose on window for cross-bundle access (plugins use different bundle)
    // eslint-disable-next-line @typescript-eslint/consistent-type-assertions
    (window as WindowWithMutationAPI).__grafanaDashboardMutationAPI = api;
  }

  private _initializePanelSearch() {
    const systemPanelFilter = sceneGraph.lookupVariable(PANEL_SEARCH_VAR, this)?.getValue();
    if (typeof systemPanelFilter === 'string') {
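The hunks above expose the mutation API on `window.__grafanaDashboardMutationAPI` specifically so code in a different bundle (a plugin) can reach it without sharing the host's `@grafana/runtime` module instance. A hedged, self-contained sketch of what a consumer on the other side of that bundle boundary might look like (the local interface and guard are illustrative, not Grafana's actual plugin code):

```typescript
// Minimal local re-declaration of the parts of the API a plugin would use.
// A plugin cannot import the host bundle's DashboardMutationAPI type, so it
// declares its own structural subset.
interface DashboardMutationAPILike {
  canEdit(): boolean;
  execute(mutation: { type: string; payload: unknown }): unknown;
}

// Read the API off the global, tolerating non-browser and pre-registration states.
function getMutationAPI(): DashboardMutationAPILike | null {
  if (typeof window === 'undefined') {
    return null; // not running in a browser context
  }
  const w = window as Window & {
    __grafanaDashboardMutationAPI?: DashboardMutationAPILike | null;
  };
  return w.__grafanaDashboardMutationAPI ?? null;
}

const api = getMutationAPI();
if (api?.canEdit()) {
  // Hypothetical call: rename the dashboard via the mutation API.
  api.execute({ type: 'UPDATE_DASHBOARD_META', payload: { title: 'Renamed by plugin' } });
}
```

The structural (duck-typed) interface works because TypeScript compares shapes, not declaration identity, which is exactly what the cross-bundle `window` hand-off relies on.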
@@ -25,6 +25,10 @@ export class ExportAsCode extends ShareExportTab {
  public getTabLabel(): string {
    return t('export.json.title', 'Export dashboard');
  }

  public getSubtitle(): string | undefined {
    return t('export.json.info-text', 'Copy or download a file containing the definition of your dashboard');
  }
}

function ExportAsCodeRenderer({ model }: SceneComponentProps<ExportAsCode>) {
@@ -53,12 +57,6 @@ function ExportAsCodeRenderer({ model }: SceneComponentProps<ExportAsCode>) {

  return (
    <div data-testid={selector.container} className={styles.container}>
      <p>
        <Trans i18nKey="export.json.info-text">
          Copy or download a file containing the definition of your dashboard
        </Trans>
      </p>

      {config.featureToggles.kubernetesDashboards ? (
        <ResourceExport
          dashboardJson={dashboardJson}
@@ -0,0 +1,189 @@
import { render, screen, within } from '@testing-library/react';
import userEvent from '@testing-library/user-event';
import { AsyncState } from 'react-use/lib/useAsync';

import { selectors as e2eSelectors } from '@grafana/e2e-selectors';
import { Dashboard } from '@grafana/schema';
import { Spec as DashboardV2Spec } from '@grafana/schema/dist/esm/schema/dashboard/v2';

import { ExportMode, ResourceExport } from './ResourceExport';

type DashboardJsonState = AsyncState<{
  json: Dashboard | DashboardV2Spec | { error: unknown };
  hasLibraryPanels?: boolean;
  initialSaveModelVersion: 'v1' | 'v2';
}>;

const selector = e2eSelectors.pages.ExportDashboardDrawer.ExportAsJson;

const createDefaultProps = (overrides?: Partial<Parameters<typeof ResourceExport>[0]>) => {
  const defaultProps: Parameters<typeof ResourceExport>[0] = {
    dashboardJson: {
      loading: false,
      value: {
        json: { title: 'Test Dashboard' } as Dashboard,
        hasLibraryPanels: false,
        initialSaveModelVersion: 'v1',
      },
    } as DashboardJsonState,
    isSharingExternally: false,
    exportMode: ExportMode.Classic,
    isViewingYAML: false,
    onExportModeChange: jest.fn(),
    onShareExternallyChange: jest.fn(),
    onViewYAML: jest.fn(),
  };

  return { ...defaultProps, ...overrides };
};

const createV2DashboardJson = (hasLibraryPanels = false): DashboardJsonState => ({
  loading: false,
  value: {
    json: {
      title: 'Test V2 Dashboard',
      spec: {
        elements: {},
      },
    } as unknown as DashboardV2Spec,
    hasLibraryPanels,
    initialSaveModelVersion: 'v2',
  },
});

const expandOptions = async () => {
  const button = screen.getByRole('button', { expanded: false });
  await userEvent.click(button);
};

describe('ResourceExport', () => {
  describe('export mode options for v1 dashboard', () => {
    it('should show three export mode options in correct order: Classic, V1 Resource, V2 Resource', async () => {
      render(<ResourceExport {...createDefaultProps()} />);
      await expandOptions();

      const radioGroup = screen.getByRole('radiogroup', { name: /model/i });
      const labels = within(radioGroup)
        .getAllByRole('radio')
        .map((radio) => radio.parentElement?.textContent?.trim());

      expect(labels).toHaveLength(3);
      expect(labels).toEqual(['Classic', 'V1 Resource', 'V2 Resource']);
    });

    it('should have first option selected by default when exportMode is Classic', async () => {
      render(<ResourceExport {...createDefaultProps({ exportMode: ExportMode.Classic })} />);
      await expandOptions();

      const radioGroup = screen.getByRole('radiogroup', { name: /model/i });
      const radios = within(radioGroup).getAllByRole('radio');
      expect(radios[0]).toBeChecked();
    });

    it('should call onExportModeChange when export mode is changed', async () => {
      const onExportModeChange = jest.fn();
      render(<ResourceExport {...createDefaultProps({ onExportModeChange })} />);
      await expandOptions();

      const radioGroup = screen.getByRole('radiogroup', { name: /model/i });
      const radios = within(radioGroup).getAllByRole('radio');
      await userEvent.click(radios[1]); // V1 Resource
      expect(onExportModeChange).toHaveBeenCalledWith(ExportMode.V1Resource);
    });
  });

  describe('export mode options for v2 dashboard', () => {
    it('should not show export mode options', async () => {
      render(<ResourceExport {...createDefaultProps({ dashboardJson: createV2DashboardJson() })} />);
      await expandOptions();

      expect(screen.queryByRole('radiogroup', { name: /model/i })).not.toBeInTheDocument();
    });
  });

  describe('format options', () => {
    it('should not show format options when export mode is Classic', async () => {
      render(<ResourceExport {...createDefaultProps({ exportMode: ExportMode.Classic })} />);
      await expandOptions();

      expect(screen.getByRole('radiogroup', { name: /model/i })).toBeInTheDocument();
      expect(screen.queryByRole('radiogroup', { name: /format/i })).not.toBeInTheDocument();
    });

    it.each([ExportMode.V1Resource, ExportMode.V2Resource])(
      'should show format options when export mode is %s',
      async (exportMode) => {
        render(<ResourceExport {...createDefaultProps({ exportMode })} />);
        await expandOptions();

        expect(screen.getByRole('radiogroup', { name: /model/i })).toBeInTheDocument();
        expect(screen.getByRole('radiogroup', { name: /format/i })).toBeInTheDocument();
      }
    );

    it('should have first format option selected when isViewingYAML is false', async () => {
      render(<ResourceExport {...createDefaultProps({ exportMode: ExportMode.V1Resource, isViewingYAML: false })} />);
      await expandOptions();

      const formatGroup = screen.getByRole('radiogroup', { name: /format/i });
      const formatRadios = within(formatGroup).getAllByRole('radio');
      expect(formatRadios[0]).toBeChecked(); // JSON
    });

    it('should have second format option selected when isViewingYAML is true', async () => {
      render(<ResourceExport {...createDefaultProps({ exportMode: ExportMode.V1Resource, isViewingYAML: true })} />);
      await expandOptions();

      const formatGroup = screen.getByRole('radiogroup', { name: /format/i });
      const formatRadios = within(formatGroup).getAllByRole('radio');
      expect(formatRadios[1]).toBeChecked(); // YAML
    });

    it('should call onViewYAML when format is changed', async () => {
      const onViewYAML = jest.fn();
      render(<ResourceExport {...createDefaultProps({ exportMode: ExportMode.V1Resource, onViewYAML })} />);
      await expandOptions();

      const formatGroup = screen.getByRole('radiogroup', { name: /format/i });
      const formatRadios = within(formatGroup).getAllByRole('radio');
      await userEvent.click(formatRadios[1]); // YAML
      expect(onViewYAML).toHaveBeenCalled();
    });
  });

  describe('share externally switch', () => {
    it('should show share externally switch for Classic mode', () => {
      render(<ResourceExport {...createDefaultProps({ exportMode: ExportMode.Classic })} />);

      expect(screen.getByTestId(selector.exportExternallyToggle)).toBeInTheDocument();
    });

    it('should show share externally switch for V2Resource mode with V2 dashboard', () => {
      render(
        <ResourceExport
          {...createDefaultProps({
            dashboardJson: createV2DashboardJson(),
            exportMode: ExportMode.V2Resource,
          })}
        />
      );

      expect(screen.getByTestId(selector.exportExternallyToggle)).toBeInTheDocument();
    });

    it('should call onShareExternallyChange when switch is toggled', async () => {
      const onShareExternallyChange = jest.fn();
      render(<ResourceExport {...createDefaultProps({ exportMode: ExportMode.Classic, onShareExternallyChange })} />);

      const switchElement = screen.getByTestId(selector.exportExternallyToggle);
      await userEvent.click(switchElement);
      expect(onShareExternallyChange).toHaveBeenCalled();
    });

    it('should reflect isSharingExternally value in switch', () => {
      render(<ResourceExport {...createDefaultProps({ exportMode: ExportMode.Classic, isSharingExternally: true })} />);

      expect(screen.getByTestId(selector.exportExternallyToggle)).toBeChecked();
    });
  });
});
@@ -4,7 +4,8 @@ import { selectors as e2eSelectors } from '@grafana/e2e-selectors';
import { Trans, t } from '@grafana/i18n';
import { Dashboard } from '@grafana/schema';
import { Spec as DashboardV2Spec } from '@grafana/schema/dist/esm/schema/dashboard/v2';
import { Alert, Label, RadioButtonGroup, Stack, Switch } from '@grafana/ui';
import { Alert, Icon, Label, RadioButtonGroup, Stack, Switch, Box, Tooltip } from '@grafana/ui';
import { QueryOperationRow } from 'app/core/components/QueryOperationRow/QueryOperationRow';
import { DashboardJson } from 'app/features/manage-dashboards/types';

import { ExportableResource } from '../ShareExportTab';
@@ -48,80 +49,90 @@ export function ResourceExport({

  const switchExportLabel =
    exportMode === ExportMode.V2Resource
      ? t('export.json.export-remove-ds-refs', 'Remove deployment details')
      : t('share-modal.export.share-externally-label', `Export for sharing externally`);
      ? t('dashboard-scene.resource-export.share-externally', 'Share dashboard with another instance')
      : t('share-modal.export.share-externally-label', 'Export for sharing externally');
  const switchExportTooltip = t(
    'dashboard-scene.resource-export.share-externally-tooltip',
    'Removes all instance-specific metadata and data source references from the resource before export.'
  );
  const switchExportModeLabel = t('export.json.export-mode', 'Model');
  const switchExportFormatLabel = t('export.json.export-format', 'Format');

  const exportResourceOptions = [
    {
      label: t('dashboard-scene.resource-export.label.classic', 'Classic'),
      value: ExportMode.Classic,
    },
    {
      label: t('dashboard-scene.resource-export.label.v1-resource', 'V1 Resource'),
      value: ExportMode.V1Resource,
    },
    {
      label: t('dashboard-scene.resource-export.label.v2-resource', 'V2 Resource'),
      value: ExportMode.V2Resource,
    },
  ];

  return (
    <Stack gap={2} direction="column">
      <Stack gap={1} direction="column">
        {initialSaveModelVersion === 'v1' && (
          <Stack alignItems="center">
            <Label>{switchExportModeLabel}</Label>
            <RadioButtonGroup
              options={[
                { label: t('dashboard-scene.resource-export.label.classic', 'Classic'), value: ExportMode.Classic },
                {
                  label: t('dashboard-scene.resource-export.label.v1-resource', 'V1 Resource'),
                  value: ExportMode.V1Resource,
                },
                {
                  label: t('dashboard-scene.resource-export.label.v2-resource', 'V2 Resource'),
                  value: ExportMode.V2Resource,
                },
              ]}
              value={exportMode}
              onChange={(value) => onExportModeChange(value)}
            />
    <>
      <QueryOperationRow
        id="Advanced options"
        index={0}
        title={t('dashboard-scene.resource-export.label.advanced-options', 'Advanced options')}
        isOpen={false}
      >
        <Box marginTop={2}>
          <Stack gap={1} direction="column">
            {initialSaveModelVersion === 'v1' && (
              <Stack gap={1} alignItems="center">
                <Label>{switchExportModeLabel}</Label>
                <RadioButtonGroup
                  options={exportResourceOptions}
                  value={exportMode}
                  onChange={(value) => onExportModeChange(value)}
                  aria-label={switchExportModeLabel}
                />
              </Stack>
            )}

            {exportMode !== ExportMode.Classic && (
              <Stack gap={1} alignItems="center">
                <Label>{switchExportFormatLabel}</Label>
                <RadioButtonGroup
                  options={[
                    { label: t('dashboard-scene.resource-export.label.json', 'JSON'), value: 'json' },
                    { label: t('dashboard-scene.resource-export.label.yaml', 'YAML'), value: 'yaml' },
                  ]}
                  value={isViewingYAML ? 'yaml' : 'json'}
                  onChange={onViewYAML}
                  aria-label={switchExportFormatLabel}
                />
              </Stack>
            )}
          </Stack>
        )}
        {initialSaveModelVersion === 'v2' && (
          <Stack alignItems="center">
            <Label>{switchExportModeLabel}</Label>
            <RadioButtonGroup
              options={[
                {
                  label: t('dashboard-scene.resource-export.label.v2-resource', 'V2 Resource'),
                  value: ExportMode.V2Resource,
                },
                {
                  label: t('dashboard-scene.resource-export.label.v1-resource', 'V1 Resource'),
                  value: ExportMode.V1Resource,
                },
              ]}
              value={exportMode}
              onChange={(value) => onExportModeChange(value)}
            />
          </Stack>
        )}
        {exportMode !== ExportMode.Classic && (
          <Stack gap={1} alignItems="center">
            <Label>{switchExportFormatLabel}</Label>
            <RadioButtonGroup
              options={[
                { label: t('dashboard-scene.resource-export.label.json', 'JSON'), value: 'json' },
                { label: t('dashboard-scene.resource-export.label.yaml', 'YAML'), value: 'yaml' },
              ]}
              value={isViewingYAML ? 'yaml' : 'json'}
              onChange={onViewYAML}
            />
          </Stack>
        )}
        {(isV2Dashboard ||
          exportMode === ExportMode.Classic ||
          (initialSaveModelVersion === 'v2' && exportMode === ExportMode.V1Resource)) && (
          <Stack gap={1} alignItems="start">
            <Label>{switchExportLabel}</Label>
            <Switch
              label={switchExportLabel}
              value={isSharingExternally}
              onChange={onShareExternallyChange}
              data-testid={selector.exportExternallyToggle}
            />
          </Stack>
        )}
      </Stack>
        </Box>
      </QueryOperationRow>

      {(isV2Dashboard ||
        exportMode === ExportMode.Classic ||
        (initialSaveModelVersion === 'v2' && exportMode === ExportMode.V1Resource)) && (
        <Stack gap={1} alignItems="start">
          <Label>
            <Stack gap={0.5} alignItems="center">
              <Tooltip content={switchExportTooltip} placement="bottom">
                <Icon name="info-circle" size="sm" />
              </Tooltip>
              {switchExportLabel}
            </Stack>
          </Label>
          <Switch
            label={switchExportLabel}
            value={isSharingExternally}
            onChange={onShareExternallyChange}
            data-testid={selector.exportExternallyToggle}
          />
        </Stack>
      )}

      {showV2LibPanelAlert && (
        <Alert
@@ -130,6 +141,7 @@ export function ResourceExport({
            'Library panels will be converted to regular panels'
          )}
          severity="warning"
          topSpacing={2}
        >
          <Trans i18nKey="dashboard-scene.save-dashboard-form.schema-v2-library-panels-export">
            Due to limitations in the new dashboard schema (V2), library panels will be converted to regular panels with
@@ -137,6 +149,6 @@ export function ResourceExport({
          </Trans>
        </Alert>
      )}
    </Stack>
    </>
  );
}
@@ -66,7 +66,12 @@ function ShareDrawerRenderer({ model }: SceneComponentProps<ShareDrawer>) {
  const dashboard = getDashboardSceneFor(model);

  return (
    <Drawer title={activeShare?.getTabLabel()} onClose={model.onDismiss} size="md">
    <Drawer
      title={activeShare?.getTabLabel()}
      subtitle={activeShare?.getSubtitle?.()}
      onClose={model.onDismiss}
      size="md"
    >
      <ShareDrawerContext.Provider value={{ dashboard, onDismiss: model.onDismiss }}>
        {activeShare && <activeShare.Component model={activeShare} />}
      </ShareDrawerContext.Provider>
@@ -66,6 +66,10 @@ export class ShareExportTab extends SceneObjectBase<ShareExportTabState> impleme
    return t('share-modal.tab-title.export', 'Export');
  }

  public getSubtitle(): string | undefined {
    return undefined;
  }

  public onShareExternallyChange = () => {
    this.setState({
      isSharingExternally: !this.state.isSharingExternally,
@@ -15,5 +15,6 @@ export interface SceneShareTab<T extends SceneShareTabState = SceneShareTabState

export interface ShareView extends SceneObject {
  getTabLabel(): string;
  getSubtitle?(): string | undefined;
  onDismiss?: () => void;
}
@@ -36,7 +36,7 @@ export const DEFAULT_ANNOTATIONS_QUERY: Omit<CloudWatchAnnotationQuery, 'refId'>
export const DEFAULT_CWLI_QUERY_STRING = 'fields @timestamp, @message |\nsort @timestamp desc |\nlimit 20';
export const DEFAULT_PPL_QUERY_STRING = 'fields `@timestamp`, `@message`\n| sort - `@timestamp`\n| head 25';
export const DEFAULT_SQL_QUERY_STRING =
  'SELECT `@timestamp`, `@message`\nFROM `log_group`\nORDER BY `@timestamp` DESC\nLIMIT 25;';
  'SELECT `@timestamp`, `@message`\nFROM $__logGroups\nORDER BY `@timestamp` DESC\nLIMIT 25;';

export const getDefaultLogsQuery = (
  defaultLogGroups?: LogGroup[],
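The change above moves the default SQL query from a hard-coded `` `log_group` `` to the `$__logGroups` macro, which is expanded into the `logGroups(logGroupIdentifier: [...])` syntax from the log groups selected in the query editor. A hedged sketch of that interpolation step (the function name, `LogGroup` shape, and quoting are illustrative assumptions, not the data source's actual implementation):

```typescript
// Illustrative stand-in for the data source's LogGroup type.
interface LogGroup {
  arn: string;
  name: string;
}

// Expand every $__logGroups occurrence into the backtick-quoted
// logGroups(logGroupIdentifier: [...]) form using the selected groups.
function interpolateLogGroupsMacro(query: string, selected: LogGroup[]): string {
  const identifiers = selected.map((g) => `'${g.name}'`).join(', ');
  return query.replace(/\$__logGroups/g, `\`logGroups(logGroupIdentifier: [${identifiers}])\``);
}

const sql = 'SELECT `@timestamp`, `@message`\nFROM $__logGroups\nORDER BY `@timestamp` DESC\nLIMIT 25;';
const expanded = interpolateLogGroupsMacro(sql, [
  { arn: 'arn:aws:logs:us-east-1:000000000000:log-group:app', name: 'app' },
]);
console.log(expanded);
```

Keeping the macro in the default query string means the UI's log group selector, not the query text, stays the single source of truth for which groups are queried.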
@@ -97,14 +97,22 @@ describe('LogsSQLCompletionItemProvider', () => {
    const suggestions = await getSuggestions(singleLineFullQuery.query, { lineNumber: 1, column: 103 });
    const suggestionLabels = suggestions.map((s) => s.label);
    expect(suggestionLabels).toEqual(
      expect.arrayContaining([FROM, `${FROM} \`logGroups(logGroupIdentifier: [...])\``, CASE, ...ALL_FUNCTIONS])
      expect.arrayContaining([
        FROM,
        `${FROM} $__logGroups`,
        `${FROM} \`logGroups(logGroupIdentifier: [...])\``,
        CASE,
        ...ALL_FUNCTIONS,
      ])
    );
  });

  it('returns logGroups suggestion after from keyword', async () => {
  it('returns logGroups and $__logGroups suggestion after from keyword', async () => {
    const suggestions = await getSuggestions(singleLineFullQuery.query, { lineNumber: 1, column: 108 });
    const suggestionLabels = suggestions.map((s) => s.label);
    expect(suggestionLabels).toEqual(expect.arrayContaining(['`logGroups(logGroupIdentifier: [...])`']));
    expect(suggestionLabels).toEqual(
      expect.arrayContaining(['$__logGroups', '`logGroups(logGroupIdentifier: [...])`'])
    );
  });

  it('returns where, having, limit, group by, order by, and join suggestions after from arguments', async () => {
@@ -142,6 +142,12 @@ export class LogsSQLCompletionItemProvider extends CompletionItemProvider {
          command: TRIGGER_SUGGEST,
          sortText: CompletionItemPriority.MediumHigh,
        });
        addSuggestion(`${FROM} $__logGroups`, {
          insertText: `${FROM} $__logGroups`,
          kind: monaco.languages.CompletionItemKind.Snippet,
          sortText: CompletionItemPriority.High,
          detail: 'Use selected log groups from the selector',
        });
        addSuggestion(`${FROM} \`logGroups(logGroupIdentifier: [...])\``, {
          insertText: `${FROM} \`logGroups(logGroupIdentifier: [$0])\``,
          insertTextRules: monaco.languages.CompletionItemInsertTextRule.InsertAsSnippet,
@@ -152,6 +158,12 @@ export class LogsSQLCompletionItemProvider extends CompletionItemProvider {
        break;

      case SuggestionKind.AfterFromKeyword:
        addSuggestion('$__logGroups', {
          insertText: '$__logGroups',
          kind: monaco.languages.CompletionItemKind.Variable,
          sortText: CompletionItemPriority.High,
          detail: 'Expands to selected log groups',
        });
        addSuggestion('`logGroups(logGroupIdentifier: [...])`', {
          insertText: '`logGroups(logGroupIdentifier: [$0])`',
          insertTextRules: monaco.languages.CompletionItemInsertTextRule.InsertAsSnippet,
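Both suggestion sites give the macro `CompletionItemPriority.High`, so it sorts ahead of the raw `logGroups(...)` snippet: Monaco orders completions lexicographically by `sortText`. A minimal standalone sketch of that ordering (the priority strings below are assumptions, not Monaco's or Grafana's actual values):

```typescript
// Standalone sketch of Monaco's sortText ordering: items with lexicographically
// smaller sortText appear first. Priority strings are assumed for illustration.
interface Suggestion {
  label: string;
  sortText: string;
}

const suggestions: Suggestion[] = [
  { label: '`logGroups(logGroupIdentifier: [...])`', sortText: 'b' }, // lower priority
  { label: '$__logGroups', sortText: 'a' },                           // High priority
];

const ordered = [...suggestions]
  .sort((x, y) => x.sortText.localeCompare(y.sortText))
  .map((s) => s.label);

console.log(ordered[0]); // the macro is offered first: '$__logGroups'
```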
@@ -488,6 +488,7 @@ export const language: CloudWatchLanguage = {
  root: [
    { include: '@comments' },
    { include: '@whitespace' },
    { include: '@macros' },
    { include: '@customParams' },
    { include: '@numbers' },
    { include: '@binaries' },
@@ -519,6 +520,7 @@
    [/\*\//, { token: 'comment.quote', next: '@pop' }],
    [/./, 'comment'],
  ],
  macros: [[/\$__[a-zA-Z0-9_]+/, 'type']],
  customParams: [
    [/\${[A-Za-z0-9._-]*}/, 'variable'],
    [/\@\@{[A-Za-z0-9._-]*}/, 'variable'],
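The new `@macros` tokenizer rule highlights any `$__`-prefixed identifier as a `type` token. The pattern itself is plain JavaScript regex and can be checked outside Monaco:

```typescript
// Same pattern as the Monarch @macros rule above.
const macroPattern = /\$__[a-zA-Z0-9_]+/;

console.log(macroPattern.test('FROM $__logGroups')); // true: matches $__logGroups
console.log(macroPattern.test('FROM log_group'));    // false: no $__ prefix
```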
@@ -6383,12 +6383,15 @@
      },
      "resource-export": {
        "label": {
          "advanced-options": "Advanced options",
          "classic": "Classic",
          "json": "JSON",
          "v1-resource": "V1 Resource",
          "v2-resource": "V2 Resource",
          "yaml": "YAML"
        }
      },
      "share-externally": "Share dashboard with another instance",
      "share-externally-tooltip": "Removes all instance-specific metadata and data source references from the resource before export."
    },
    "revert-dashboard-modal": {
      "body-restore-version": "Are you sure you want to restore the dashboard to version {{version}}? All unsaved changes will be lost.",
@@ -7842,7 +7845,6 @@
      "export-externally-label": "Export the dashboard to use in another instance",
      "export-format": "Format",
      "export-mode": "Model",
      "export-remove-ds-refs": "Remove deployment details",
      "info-text": "Copy or download a file containing the definition of your dashboard",
      "title": "Export dashboard"
    },