Compare commits
7 commits: ihm/251217 ... wb/pkg-plu

Commits: 2c6e870a3a, 975480f10e, 3445cd3378, df790a4279, f3e28cf440, f8ab6ecc79, 6a955eb6d1
@@ -4,8 +4,7 @@ comments: |
 This file is used in the following visualizations: candlestick, heatmap, state timeline, status history, time series.
 ---
 
-You can pan the panel time range left and right, and zoom it in and out.
-This, in turn, changes the dashboard time range.
+You can zoom the panel time range in and out, which, in turn, changes the dashboard time range.
 
 **Zoom in** - Click and drag on the panel to zoom in on a particular time range.
 
@@ -17,9 +16,4 @@ For example, if the original time range is from 9:00 to 9:59, the time range cha
 - Next range: 8:30 - 10:29
 - Next range: 7:30 - 11:29
 
-**Pan** - Click and drag the x-axis area of the panel to pan the time range.
-
-The time range shifts by the distance you drag.
-For example, if the original time range is from 9:00 to 9:59 and you drag 30 minutes to the right, the time range changes to 9:30 to 10:29.
-
-For screen recordings showing these interactions, refer to the [Panel overview documentation](https://grafana.com/docs/grafana/<GRAFANA_VERSION>/visualizations/panels-visualizations/panel-overview/#pan-and-zoom-panel-time-range).
+For screen recordings showing these interactions, refer to the [Panel overview documentation](https://grafana.com/docs/grafana/<GRAFANA_VERSION>/visualizations/panels-visualizations/panel-overview/#zoom-panel-time-range).
@@ -72,8 +72,6 @@ Each panel needs at least one query to display a visualization.
 
 ## Create a dashboard
 
-{{< docs/list >}}
-{{< shared id="create-dashboard" >}}
 To create a dashboard, follow these steps:
 
 1. Click **Dashboards** in the main menu.
@@ -317,16 +317,13 @@ Click the **Copy time range to clipboard** icon to copy the current time range t
 
 You can also copy and paste a time range using the keyboard shortcuts `t+c` and `t+v` respectively.
 
-#### Zoom out
+#### Zoom out (Cmd+Z or Ctrl+Z)
 
-- Click the **Zoom out** icon to view a larger time range in the dashboard or panel visualizations
-- Double click on the panel graph area (time series family visualizations only)
-- Type the `t-` keyboard shortcut
+Click the **Zoom out** icon to view a larger time range in the dashboard or panel visualization.
 
-#### Zoom in
+#### Zoom in (only applicable to graph visualizations)
 
-- Click and drag horizontally in the panel graph area to select a time range (time series family visualizations only)
-- Type the `t+` keyboard shortcut
+Click and drag to select the time range in the visualization that you want to view.
 
 #### Refresh dashboard
 
@@ -175,10 +175,9 @@ By hovering over a panel with the mouse you can use some shortcuts that will tar
 - `pl`: Hide or show legend
 - `pr`: Remove Panel
 
-## Pan and zoom panel time range
+## Zoom panel time range
 
-You can pan the panel time range left and right, and zoom it in and out.
-This, in turn, changes the dashboard time range.
+You can zoom the panel time range in and out, which, in turn, changes the dashboard time range.
 
 This feature is supported for the following visualizations:
 
@@ -192,7 +191,7 @@ This feature is supported for the following visualizations:
 
 Click and drag on the panel to zoom in on a particular time range.
 
-The following screen recordings show this interaction in the time series and candlestick visualizations:
+The following screen recordings show this interaction in the time series and x visualizations:
 
 Time series
 
@@ -212,7 +211,7 @@ For example, if the original time range is from 9:00 to 9:59, the time range cha
 - Next range: 8:30 - 10:29
 - Next range: 7:30 - 11:29
 
-The following screen recordings demonstrate the preceding example in the time series and heatmap visualizations:
+The following screen recordings demonstrate the preceding example in the time series and x visualizations:
 
 Time series
 
@@ -222,19 +221,6 @@ Heatmap
 
 {{< video-embed src="/media/docs/grafana/panels-visualizations/recording-heatmap-panel-time-zoom-out-mouse.mp4" >}}
 
-### Pan
-
-Click and drag the x-axis area of the panel to pan the time range.
-
-The time range shifts by the distance you drag.
-For example, if the original time range is from 9:00 to 9:59 and you drag 30 minutes to the right, the time range changes to 9:30 to 10:29.
-
-The following screen recordings show this interaction in the time series visualization:
-
-Time series
-
-{{< video-embed src="/media/docs/grafana/panels-visualizations/recording-ts-time-pan-mouse.mp4" >}}
-
 ## Add a panel
 
 To add a panel in a new dashboard click **+ Add visualization** in the middle of the dashboard:
@@ -92,9 +92,9 @@ The data is converted as follows:
 
 {{< figure src="/media/docs/grafana/panels-visualizations/screenshot-candles-volume-v11.6.png" max-width="750px" alt="A candlestick visualization showing the price movements of specific asset." >}}
 
-## Pan and zoom panel time range
+## Zoom panel time range
 
-{{< docs/shared lookup="visualizations/panel-pan-zoom.md" source="grafana" version="<GRAFANA_VERSION>" >}}
+{{< docs/shared lookup="visualizations/panel-zoom.md" source="grafana" version="<GRAFANA_VERSION>" >}}
 
 ## Configuration options
 
@@ -79,9 +79,9 @@ The data is converted as follows:
 
 {{< figure src="/static/img/docs/heatmap-panel/heatmap.png" max-width="1025px" alt="A heatmap visualization showing the random walk distribution over time" >}}
 
-## Pan and zoom panel time range
+## Zoom panel time range
 
-{{< docs/shared lookup="visualizations/panel-pan-zoom.md" source="grafana" version="<GRAFANA_VERSION>" >}}
+{{< docs/shared lookup="visualizations/panel-zoom.md" source="grafana" version="<GRAFANA_VERSION>" >}}
 
 ## Configuration options
 
@@ -93,9 +93,9 @@ You can also create a state timeline visualization using time series data. To do
 
 
 
-## Pan and zoom panel time range
+## Zoom panel time range
 
-{{< docs/shared lookup="visualizations/panel-pan-zoom.md" source="grafana" version="<GRAFANA_VERSION>" >}}
+{{< docs/shared lookup="visualizations/panel-zoom.md" source="grafana" version="<GRAFANA_VERSION>" >}}
 
 ## Configuration options
 
@@ -85,9 +85,9 @@ The data is converted as follows:
 
 {{< figure src="/static/img/docs/status-history-panel/status_history.png" max-width="1025px" alt="A status history panel with two time columns showing the status of two servers" >}}
 
-## Pan and zoom panel time range
+## Zoom panel time range
 
-{{< docs/shared lookup="visualizations/panel-pan-zoom.md" source="grafana" version="<GRAFANA_VERSION>" >}}
+{{< docs/shared lookup="visualizations/panel-zoom.md" source="grafana" version="<GRAFANA_VERSION>" >}}
 
 ## Configuration options
 
@@ -167,9 +167,9 @@ The following example shows three series: Min, Max, and Value. The Min and Max s
 
 {{< docs/shared lookup="visualizations/multiple-y-axes.md" source="grafana" version="<GRAFANA_VERSION>" leveloffset="+2" >}}
 
-## Pan and zoom panel time range
+## Zoom panel time range
 
-{{< docs/shared lookup="visualizations/panel-pan-zoom.md" source="grafana" version="<GRAFANA_VERSION>" >}}
+{{< docs/shared lookup="visualizations/panel-zoom.md" source="grafana" version="<GRAFANA_VERSION>" >}}
 
 ## Configuration options
 
@@ -117,44 +117,6 @@ export const MyComponent = () => {
 };
 ```
 
-### Custom Header Rendering
-
-Column headers can be customized using strings, React elements, or renderer functions. The `header` property accepts any value that matches React Table's `Renderer` type.
-
-**Important:** When using custom header content, prefer inline elements (like `<span>`) over block elements (like `<div>`) to avoid layout issues. Block-level elements can cause extra spacing and alignment problems in table headers because they disrupt the table's inline flow. Use `display: inline-flex` or `display: inline-block` when you need flexbox or block-like behavior.
-
-```tsx
-const columns: Array<Column<TableData>> = [
-  // React element header
-  {
-    id: 'checkbox',
-    header: (
-      <>
-        <label htmlFor="select-all" className="sr-only">
-          Select all rows
-        </label>
-        <Checkbox id="select-all" />
-      </>
-    ),
-    cell: () => <Checkbox aria-label="Select row" />,
-  },
-
-  // Function renderer header
-  {
-    id: 'firstName',
-    header: () => (
-      <span style={{ display: 'inline-flex', alignItems: 'center', gap: '8px' }}>
-        <Icon name="user" size="sm" />
-        <span>First Name</span>
-      </span>
-    ),
-  },
-
-  // String header
-  { id: 'lastName', header: 'Last name' },
-];
-```
-
 ### Custom Cell Rendering
 
 Individual cells can be rendered using custom content by defining a `cell` property on the column definition.
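The surviving context above mentions supplying custom cell content through a `cell` property on the column definition. As a minimal self-contained sketch of that idea — the `Column` and `CellProps` shapes below are simplified stand-ins, not the real react-table types, and the `User` data shape is hypothetical:

```typescript
// Simplified stand-ins for the react-table column/cell types (assumption, not the real API).
interface CellProps<T> {
  row: { original: T };
}

interface Column<T> {
  id: keyof T & string;
  header?: string;
  // Custom content for a cell is supplied via the `cell` renderer.
  cell?: (props: CellProps<T>) => string;
}

interface User {
  lastName: string;
}

// A column whose cells are derived from the row data by the `cell` function.
const lastNameColumn: Column<User> = {
  id: 'lastName',
  header: 'Last name',
  cell: (props) => props.row.original.lastName.toUpperCase(),
};

const rendered = lastNameColumn.cell!({ row: { original: { lastName: 'Larsson' } } });
// rendered === 'LARSSON'
```

In the real component the renderer would return a React node rather than a string; the string keeps this sketch dependency-free.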
@@ -3,11 +3,8 @@ import { useCallback, useMemo, useState } from 'react';
 import { CellProps } from 'react-table';
 
 import { LinkButton } from '../Button/Button';
-import { Checkbox } from '../Forms/Checkbox';
 import { Field } from '../Forms/Field';
-import { Icon } from '../Icon/Icon';
 import { Input } from '../Input/Input';
-import { Text } from '../Text/Text';
 
 import { FetchDataArgs, InteractiveTable, InteractiveTableHeaderTooltip } from './InteractiveTable';
 import mdx from './InteractiveTable.mdx';
@@ -300,40 +297,4 @@ export const WithControlledSort: StoryFn<typeof InteractiveTable> = (args) => {
   return <InteractiveTable {...args} data={data} pageSize={15} fetchData={fetchData} />;
 };
 
-export const WithCustomHeader: TableStoryObj = {
-  args: {
-    columns: [
-      // React element header
-      {
-        id: 'checkbox',
-        header: (
-          <>
-            <label htmlFor="select-all" className="sr-only">
-              Select all rows
-            </label>
-            <Checkbox id="select-all" />
-          </>
-        ),
-        cell: () => <Checkbox aria-label="Select row" />,
-      },
-      // Function renderer header
-      {
-        id: 'firstName',
-        header: () => (
-          <span style={{ display: 'inline-flex', alignItems: 'center', gap: '8px' }}>
-            <Icon name="user" size="sm" />
-            <Text element="span">First Name</Text>
-          </span>
-        ),
-        sortType: 'string',
-      },
-      // String header
-      { id: 'lastName', header: 'Last name', sortType: 'string' },
-      { id: 'car', header: 'Car', sortType: 'string' },
-      { id: 'age', header: 'Age', sortType: 'number' },
-    ],
-    data: pageableData.slice(0, 10),
-    getRowId: (r) => r.id,
-  },
-};
 export default meta;
@@ -2,9 +2,6 @@ import { render, screen, within } from '@testing-library/react';
 import userEvent from '@testing-library/user-event';
 import * as React from 'react';
 
-import { Checkbox } from '../Forms/Checkbox';
-import { Icon } from '../Icon/Icon';
-
 import { InteractiveTable } from './InteractiveTable';
 import { Column } from './types';
 
@@ -250,104 +247,4 @@ describe('InteractiveTable', () => {
      expect(fetchData).toHaveBeenCalledWith({ sortBy: [{ id: 'id', desc: false }] });
    });
  });
-
-  describe('custom header rendering', () => {
-    it('should render string headers', () => {
-      const columns: Array<Column<TableData>> = [{ id: 'id', header: 'ID' }];
-      const data: TableData[] = [{ id: '1', value: '1', country: 'Sweden' }];
-      render(<InteractiveTable columns={columns} data={data} getRowId={getRowId} />);
-
-      expect(screen.getByRole('columnheader', { name: 'ID' })).toBeInTheDocument();
-    });
-
-    it('should render React element headers', () => {
-      const columns: Array<Column<TableData>> = [
-        {
-          id: 'checkbox',
-          header: (
-            <>
-              <label htmlFor="select-all" className="sr-only">
-                Select all rows
-              </label>
-              <Checkbox id="select-all" data-testid="header-checkbox" />
-            </>
-          ),
-          cell: () => <Checkbox data-testid="cell-checkbox" aria-label="Select row" />,
-        },
-      ];
-      const data: TableData[] = [{ id: '1', value: '1', country: 'Sweden' }];
-      render(<InteractiveTable columns={columns} data={data} getRowId={getRowId} />);
-
-      expect(screen.getByTestId('header-checkbox')).toBeInTheDocument();
-      expect(screen.getByTestId('cell-checkbox')).toBeInTheDocument();
-      expect(screen.getByLabelText('Select all rows')).toBeInTheDocument();
-      expect(screen.getByLabelText('Select row')).toBeInTheDocument();
-      expect(screen.getByText('Select all rows')).toBeInTheDocument();
-    });
-
-    it('should render function renderer headers', () => {
-      const columns: Array<Column<TableData>> = [
-        {
-          id: 'firstName',
-          header: () => (
-            <span style={{ display: 'inline-flex', alignItems: 'center', gap: '8px' }}>
-              <Icon name="user" size="sm" data-testid="header-icon" />
-              <span>First Name</span>
-            </span>
-          ),
-          sortType: 'string',
-        },
-      ];
-      const data: TableData[] = [{ id: '1', value: '1', country: 'Sweden' }];
-      render(<InteractiveTable columns={columns} data={data} getRowId={getRowId} />);
-
-      expect(screen.getByTestId('header-icon')).toBeInTheDocument();
-      expect(screen.getByRole('columnheader', { name: /first name/i })).toBeInTheDocument();
-    });
-
-    it('should render all header types together', () => {
-      const columns: Array<Column<TableData>> = [
-        {
-          id: 'checkbox',
-          header: (
-            <>
-              <label htmlFor="select-all" className="sr-only">
-                Select all rows
-              </label>
-              <Checkbox id="select-all" data-testid="header-checkbox" />
-            </>
-          ),
-          cell: () => <Checkbox aria-label="Select row" />,
-        },
-        {
-          id: 'id',
-          header: () => (
-            <span style={{ display: 'inline-flex', alignItems: 'center', gap: '8px' }}>
-              <Icon name="user" size="sm" data-testid="header-icon" />
-              <span>ID</span>
-            </span>
-          ),
-          sortType: 'string',
-        },
-        { id: 'country', header: 'Country', sortType: 'string' },
-        { id: 'value', header: 'Value' },
-      ];
-      const data: TableData[] = [
-        { id: '1', value: 'Value 1', country: 'Sweden' },
-        { id: '2', value: 'Value 2', country: 'Norway' },
-      ];
-      render(<InteractiveTable columns={columns} data={data} getRowId={getRowId} />);
-
-      expect(screen.getByTestId('header-checkbox')).toBeInTheDocument();
-      expect(screen.getByTestId('header-icon')).toBeInTheDocument();
-      expect(screen.getByRole('columnheader', { name: 'Country' })).toBeInTheDocument();
-      expect(screen.getByRole('columnheader', { name: 'Value' })).toBeInTheDocument();
-
-      // Verify data is rendered
-      expect(screen.getByText('Sweden')).toBeInTheDocument();
-      expect(screen.getByText('Norway')).toBeInTheDocument();
-      expect(screen.getByText('Value 1')).toBeInTheDocument();
-      expect(screen.getByText('Value 2')).toBeInTheDocument();
-    });
-  });
 });
@@ -1,5 +1,5 @@
 import { ReactNode } from 'react';
-import { CellProps, DefaultSortTypes, HeaderProps, IdType, Renderer, SortByFn } from 'react-table';
+import { CellProps, DefaultSortTypes, IdType, SortByFn } from 'react-table';
 
 export interface Column<TableData extends object> {
   /**
@@ -11,9 +11,9 @@ export interface Column<TableData extends object> {
    */
   cell?: (props: CellProps<TableData>) => ReactNode;
   /**
-   * Header name. Can be a string, renderer function, or undefined. If `undefined` the header will be empty. Useful for action columns.
+   * Header name. If `undefined` the header will be empty. Useful for action columns.
    */
-  header?: Renderer<HeaderProps<TableData>>;
+  header?: string;
   /**
   * Column sort type. If `undefined` the column will not be sortable.
   * */
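With `header` narrowed to `string` as above, only plain-string headers (or `undefined` for an empty header) type-check; element and function headers no longer do. A minimal sketch using a local mirror of the interface — the mirror and the `Person` data shape are illustrative stand-ins, not the real exported types:

```typescript
// Local mirror of the narrowed Column interface from the diff (simplified, no react-table dependency).
interface Column<TableData extends object> {
  id: keyof TableData & string;
  // After the change, only a plain string (or undefined for an empty header) is accepted.
  header?: string;
  sortType?: 'string' | 'number';
}

interface Person {
  lastName: string;
  age: number;
}

const columns: Array<Column<Person>> = [
  { id: 'lastName', header: 'Last name', sortType: 'string' },
  // An action-style column can omit `header` entirely, yielding an empty header cell.
  { id: 'age', sortType: 'number' },
];

// Collect the labels the table would show, defaulting missing headers to ''.
const headerLabels = columns.map((c) => c.header ?? '');
```

A column definition with `header: () => ...` would now be a compile error, which is exactly the narrowing this hunk introduces.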
@@ -76,27 +76,21 @@ func (hs *HTTPServer) CreateDashboardSnapshot(c *contextmodel.ReqContext) {
         return
     }
 
-    cfg := snapshot.SnapshotSharingOptions{
+    // Do not check permissions when the instance snapshot public mode is enabled
+    if !hs.Cfg.SnapshotPublicMode {
+        evaluator := ac.EvalAll(ac.EvalPermission(dashboards.ActionSnapshotsCreate), ac.EvalPermission(dashboards.ActionDashboardsRead, dashboards.ScopeDashboardsProvider.GetResourceScopeUID(cmd.Dashboard.GetNestedString("uid"))))
+        if canSave, err := hs.AccessControl.Evaluate(c.Req.Context(), c.SignedInUser, evaluator); err != nil || !canSave {
+            c.JsonApiErr(http.StatusForbidden, "forbidden", err)
+            return
+        }
+    }
 
+    dashboardsnapshots.CreateDashboardSnapshot(c, snapshot.SnapshotSharingOptions{
         SnapshotsEnabled:     hs.Cfg.SnapshotEnabled,
         ExternalEnabled:      hs.Cfg.ExternalEnabled,
         ExternalSnapshotName: hs.Cfg.ExternalSnapshotName,
         ExternalSnapshotURL:  hs.Cfg.ExternalSnapshotUrl,
-    }
-
-    if hs.Cfg.SnapshotPublicMode {
-        // Public mode: no user or dashboard validation needed
-        dashboardsnapshots.CreateDashboardSnapshotPublic(c, cfg, cmd, hs.dashboardsnapshotsService)
-        return
-    }
-
-    // Regular mode: check permissions
-    evaluator := ac.EvalAll(ac.EvalPermission(dashboards.ActionSnapshotsCreate), ac.EvalPermission(dashboards.ActionDashboardsRead, dashboards.ScopeDashboardsProvider.GetResourceScopeUID(cmd.Dashboard.GetNestedString("uid"))))
-    if canSave, err := hs.AccessControl.Evaluate(c.Req.Context(), c.SignedInUser, evaluator); err != nil || !canSave {
-        c.JsonApiErr(http.StatusForbidden, "forbidden", err)
-        return
-    }
-
-    dashboardsnapshots.CreateDashboardSnapshot(c, cfg, cmd, hs.dashboardsnapshotsService)
+    }, cmd, hs.dashboardsnapshotsService)
 }
 
 // GET /api/snapshots/:key
@@ -219,6 +213,13 @@ func (hs *HTTPServer) DeleteDashboardSnapshot(c *contextmodel.ReqContext) respon
         return response.Error(http.StatusUnauthorized, "OrgID mismatch", nil)
     }
 
+    if queryResult.External {
+        err := dashboardsnapshots.DeleteExternalDashboardSnapshot(queryResult.ExternalDeleteURL)
+        if err != nil {
+            return response.Error(http.StatusInternalServerError, "Failed to delete external dashboard", err)
+        }
+    }
+
     // Dashboard can be empty (creation error or external snapshot). This means that the mustInt here returns a 0,
     // which before RBAC would result in a dashboard which has no ACL. A dashboard without an ACL would fallback
     // to the user’s org role, which for editors and admins would essentially always be allowed here. With RBAC,
@@ -238,13 +239,6 @@ func (hs *HTTPServer) DeleteDashboardSnapshot(c *contextmodel.ReqContext) respon
         }
     }
 
-    if queryResult.External {
-        err := dashboardsnapshots.DeleteExternalDashboardSnapshot(queryResult.ExternalDeleteURL)
-        if err != nil {
-            return response.Error(http.StatusInternalServerError, "Failed to delete external dashboard", err)
-        }
-    }
-
     cmd := &dashboardsnapshots.DeleteDashboardSnapshotCommand{DeleteKey: queryResult.DeleteKey}
 
     if err := hs.dashboardsnapshotsService.DeleteDashboardSnapshot(c.Req.Context(), cmd); err != nil {
@@ -20,8 +20,10 @@ import (
     "github.com/grafana/grafana/pkg/plugins"
     "github.com/grafana/grafana/pkg/plugins/config"
     "github.com/grafana/grafana/pkg/plugins/manager/pluginfakes"
+    "github.com/grafana/grafana/pkg/plugins/manager/registry"
     "github.com/grafana/grafana/pkg/plugins/manager/signature"
     "github.com/grafana/grafana/pkg/plugins/manager/signature/statickey"
+    "github.com/grafana/grafana/pkg/plugins/pluginassets/modulehash"
     "github.com/grafana/grafana/pkg/plugins/pluginscdn"
     accesscontrolmock "github.com/grafana/grafana/pkg/services/accesscontrol/mock"
     "github.com/grafana/grafana/pkg/services/apiserver/endpoints/request"
@@ -80,7 +82,8 @@ func setupTestEnvironment(t *testing.T, cfg *setting.Cfg, features featuremgmt.F
     var pluginsAssets = passets
     if pluginsAssets == nil {
         sig := signature.ProvideService(pluginsCfg, statickey.New())
-        pluginsAssets = pluginassets.ProvideService(pluginsCfg, pluginsCDN, sig, pluginStore)
+        calc := modulehash.NewCalculator(pluginsCfg, registry.NewInMemory(), pluginsCDN, sig)
+        pluginsAssets = pluginassets.ProvideService(pluginsCfg, pluginsCDN, calc)
     }
 
     hs := &HTTPServer{
@@ -714,6 +717,8 @@ func newPluginAssets() func() *pluginassets.Service {
 
 func newPluginAssetsWithConfig(pCfg *config.PluginManagementCfg) func() *pluginassets.Service {
     return func() *pluginassets.Service {
-        return pluginassets.ProvideService(pCfg, pluginscdn.ProvideService(pCfg), signature.ProvideService(pCfg, statickey.New()), &pluginstore.FakePluginStore{})
+        cdn := pluginscdn.ProvideService(pCfg)
+        calc := modulehash.NewCalculator(pCfg, registry.NewInMemory(), cdn, signature.ProvideService(pCfg, statickey.New()))
+        return pluginassets.ProvideService(pCfg, cdn, calc)
     }
 }
@@ -32,8 +32,6 @@ import (
 var (
     logger = glog.New("data-proxy-log")
     client = newHTTPClient()
-
-    errPluginProxyRouteAccessDenied = errors.New("plugin proxy route access denied")
 )
 
 type DataSourceProxy struct {
@@ -310,21 +308,12 @@ func (proxy *DataSourceProxy) validateRequest() error {
         if err != nil {
             return err
         }
-        // issues/116273: When we have an empty input route (or input that becomes relative to "."), we do not want it
-        // to be ".". This is because the `CleanRelativePath` function will never return "./" prefixes, and as such,
-        // the common prefix we need is an empty string.
-        if r1 == "." && proxy.proxyPath != "." {
-            r1 = ""
-        }
-        if r2 == "." && route.Path != "." {
-            r2 = ""
-        }
         if !strings.HasPrefix(r1, r2) {
             continue
         }
 
         if !proxy.hasAccessToRoute(route) {
-            return errPluginProxyRouteAccessDenied
+            return errors.New("plugin proxy route access denied")
         }
 
         proxy.matchedRoute = route
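The comment block removed above explains why a cleaned empty path of "." breaks the prefix match. A minimal sketch of that pitfall, with `clean` standing in for the `CleanRelativePath` helper (an assumption: like Go's `path.Clean`, it maps the empty string to "."):

```typescript
// Stand-in for the path-cleaning helper: an empty relative path cleans to ".".
function clean(p: string): string {
  return p === '' ? '.' : p;
}

// Naive match: the route applies when the cleaned request path starts with the cleaned route path.
function naiveMatch(proxyPath: string, routePath: string): boolean {
  return clean(proxyPath).startsWith(clean(routePath));
}

// Patched match, mirroring the removed fix: a "." that came from an empty input
// is rewritten to "", so an empty fallback route matches every request path.
function patchedMatch(proxyPath: string, routePath: string): boolean {
  let r1 = clean(proxyPath);
  let r2 = clean(routePath);
  if (r1 === '.' && proxyPath !== '.') r1 = '';
  if (r2 === '.' && routePath !== '.') r2 = '';
  return r1.startsWith(r2);
}

// The naive version skips an empty fallback route for a real sub-path, so its
// required role is never applied; the patched version matches it.
const naive = naiveMatch('api/v2/leak-ur-secrets', '');     // "api/..." does not start with "."
const patched = patchedMatch('api/v2/leak-ur-secrets', ''); // "" is a prefix of everything
```

A route whose configured path is literally "." still only matches a "." request in the patched version, which is the distinction the removed regression test exercised.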
@@ -673,94 +673,6 @@ func TestIntegrationDataSourceProxy_routeRule(t *testing.T) {
             runDatasourceAuthTest(t, secretsService, secretsStore, cfg, test)
         }
     })
 
-    t.Run("Regression of 116273: Fallback routes should apply fallback route roles", func(t *testing.T) {
-        for _, tc := range []struct {
-            InputPath         string
-            ConfigurationPath string
-            ExpectError       bool
-        }{
-            {
-                InputPath:         "api/v2/leak-ur-secrets",
-                ConfigurationPath: "",
-                ExpectError:       true,
-            },
-            {
-                InputPath:         "",
-                ConfigurationPath: "",
-                ExpectError:       true,
-            },
-            {
-                InputPath:         ".",
-                ConfigurationPath: ".",
-                ExpectError:       true,
-            },
-            {
-                InputPath:         "",
-                ConfigurationPath: ".",
-                ExpectError:       false,
-            },
-            {
-                InputPath:         "api",
-                ConfigurationPath: ".",
-                ExpectError:       false,
-            },
-        } {
-            orEmptyStr := func(s string) string {
-                if s == "" {
-                    return "<empty>"
-                }
-                return s
-            }
-            t.Run(
-                fmt.Sprintf("with inputPath=%s, configurationPath=%s, expectError=%v",
-                    orEmptyStr(tc.InputPath), orEmptyStr(tc.ConfigurationPath), tc.ExpectError),
-                func(t *testing.T) {
-                    ds := &datasources.DataSource{
-                        UID:      "dsUID",
-                        JsonData: simplejson.New(),
-                    }
-                    routes := []*plugins.Route{
-                        {
-                            Path:    tc.ConfigurationPath,
-                            ReqRole: org.RoleAdmin,
-                            Method:  "GET",
-                        },
-                        {
-                            Path:    tc.ConfigurationPath,
-                            ReqRole: org.RoleAdmin,
-                            Method:  "POST",
-                        },
-                        {
-                            Path:    tc.ConfigurationPath,
-                            ReqRole: org.RoleAdmin,
-                            Method:  "PUT",
-                        },
-                        {
-                            Path:    tc.ConfigurationPath,
-                            ReqRole: org.RoleAdmin,
-                            Method:  "DELETE",
-                        },
-                    }
-
-                    req, err := http.NewRequestWithContext(t.Context(), "GET", "http://localhost/"+tc.InputPath, nil)
-                    require.NoError(t, err, "failed to create HTTP request")
-                    ctx := &contextmodel.ReqContext{
-                        Context:      &web.Context{Req: req},
-                        SignedInUser: &user.SignedInUser{OrgRole: org.RoleViewer},
-                    }
-                    proxy, err := setupDSProxyTest(t, ctx, ds, routes, tc.InputPath)
-                    require.NoError(t, err, "failed to setup proxy test")
-                    err = proxy.validateRequest()
-                    if tc.ExpectError {
-                        require.ErrorIs(t, err, errPluginProxyRouteAccessDenied, "request was not denied due to access denied?")
-                    } else {
-                        require.NoError(t, err, "request was unexpectedly denied access")
-                    }
-                },
-            )
-        }
-    })
 }
 
 // test DataSourceProxy request handling.
@@ -30,6 +30,7 @@ import (
 	"github.com/grafana/grafana/pkg/plugins/manager/registry"
 	"github.com/grafana/grafana/pkg/plugins/manager/signature"
 	"github.com/grafana/grafana/pkg/plugins/manager/signature/statickey"
+	"github.com/grafana/grafana/pkg/plugins/pluginassets/modulehash"
 	"github.com/grafana/grafana/pkg/plugins/pluginerrs"
 	"github.com/grafana/grafana/pkg/plugins/pluginscdn"
 	ac "github.com/grafana/grafana/pkg/services/accesscontrol"
@@ -849,7 +850,8 @@ func Test_PluginsSettings(t *testing.T) {
 	pCfg := &config.PluginManagementCfg{}
 	pluginCDN := pluginscdn.ProvideService(pCfg)
 	sig := signature.ProvideService(pCfg, statickey.New())
-	hs.pluginAssets = pluginassets.ProvideService(pCfg, pluginCDN, sig, hs.pluginStore)
+	calc := modulehash.NewCalculator(pCfg, registry.NewInMemory(), pluginCDN, sig)
+	hs.pluginAssets = pluginassets.ProvideService(pCfg, pluginCDN, calc)
 	hs.pluginErrorResolver = pluginerrs.ProvideStore(errTracker)
 	hs.pluginsUpdateChecker, err = updatemanager.ProvidePluginsService(
 		hs.Cfg,
pkg/plugins/pluginassets/modulehash/modulehash.go (new file, +144)
@@ -0,0 +1,144 @@
package modulehash

import (
	"context"
	"encoding/base64"
	"encoding/hex"
	"fmt"
	"path"
	"path/filepath"
	"sync"

	"github.com/grafana/grafana/pkg/plugins"
	"github.com/grafana/grafana/pkg/plugins/config"
	"github.com/grafana/grafana/pkg/plugins/log"
	"github.com/grafana/grafana/pkg/plugins/manager/registry"
	"github.com/grafana/grafana/pkg/plugins/manager/signature"
	"github.com/grafana/grafana/pkg/plugins/pluginscdn"
)

type Calculator struct {
	reg       registry.Service
	cfg       *config.PluginManagementCfg
	cdn       *pluginscdn.Service
	signature *signature.Signature
	log       log.Logger

	moduleHashCache sync.Map
}

func NewCalculator(cfg *config.PluginManagementCfg, reg registry.Service, cdn *pluginscdn.Service, signature *signature.Signature) *Calculator {
	return &Calculator{
		cfg:       cfg,
		reg:       reg,
		cdn:       cdn,
		signature: signature,
		log:       log.New("modulehash"),
	}
}

// ModuleHash returns the module.js SHA256 hash for a plugin in the format expected by the browser for SRI checks.
// The module hash is read from the plugin's MANIFEST.txt file.
// The plugin can also be a nested plugin.
// If the plugin is unsigned, an empty string is returned.
// The results are cached to avoid repeated reads from the MANIFEST.txt file.
func (c *Calculator) ModuleHash(ctx context.Context, pluginID, pluginVersion string) string {
	p, ok := c.reg.Plugin(ctx, pluginID, pluginVersion)
	if !ok {
		c.log.Error("Failed to calculate module hash as plugin is not registered", "pluginId", pluginID)
		return ""
	}
	k := c.moduleHashCacheKey(pluginID, pluginVersion)
	cachedValue, ok := c.moduleHashCache.Load(k)
	if ok {
		return cachedValue.(string)
	}
	mh, err := c.moduleHash(ctx, p, "")
	if err != nil {
		c.log.Error("Failed to calculate module hash", "pluginId", p.ID, "error", err)
	}
	c.moduleHashCache.Store(k, mh)
	return mh
}

// moduleHash is the underlying function for ModuleHash. See its documentation for more information.
// If the plugin is not a CDN plugin, the function will return an empty string.
// It will read the module hash from the MANIFEST.txt in the [plugins.FS] of the provided plugin.
// If childFSBase is provided, the function will try to get the hash from MANIFEST.txt for the provided children's
// module.js file, rather than for the provided plugin.
func (c *Calculator) moduleHash(ctx context.Context, p *plugins.Plugin, childFSBase string) (r string, err error) {
	if !c.cfg.Features.SriChecksEnabled {
		return "", nil
	}

	// Ignore unsigned plugins
	if !p.Signature.IsValid() {
		return "", nil
	}

	if p.Parent != nil {
		// The module hash is contained within the parent's MANIFEST.txt file.
		// For example, the parent's MANIFEST.txt will contain an entry similar to this:
		//
		// ```
		// "datasource/module.js": "1234567890abcdef..."
		// ```
		//
		// Recursively call moduleHash with the parent plugin and with the children plugin folder path
		// to get the correct module hash for the nested plugin.
		if childFSBase == "" {
			childFSBase = p.FS.Base()
		}
		return c.moduleHash(ctx, p.Parent, childFSBase)
	}

	// Only CDN plugins are supported for SRI checks.
	// CDN plugins have the version as part of the URL, which acts as a cache-buster.
	// Needed due to: https://github.com/grafana/plugin-tools/pull/1426
	// FS plugins built before this change will have SRI mismatch issues.
	if !c.cdnEnabled(p.ID, p.FS) {
		return "", nil
	}

	manifest, err := c.signature.ReadPluginManifestFromFS(ctx, p.FS)
	if err != nil {
		return "", fmt.Errorf("read plugin manifest: %w", err)
	}
	if !manifest.IsV2() {
		return "", nil
	}

	var childPath string
	if childFSBase != "" {
		// Calculate the relative path of the child plugin folder from the parent plugin folder.
		childPath, err = p.FS.Rel(childFSBase)
		if err != nil {
			return "", fmt.Errorf("rel path: %w", err)
		}
		// MANIFEST.txt uses forward slashes as path separators.
		childPath = filepath.ToSlash(childPath)
	}
	moduleHash, ok := manifest.Files[path.Join(childPath, "module.js")]
	if !ok {
		return "", nil
	}
	return convertHashForSRI(moduleHash)
}

func (c *Calculator) cdnEnabled(pluginID string, fs plugins.FS) bool {
	return c.cdn.PluginSupported(pluginID) || fs.Type().CDN()
}

// convertHashForSRI takes a SHA256 hash string and returns it as expected by the browser for SRI checks.
func convertHashForSRI(h string) (string, error) {
	hb, err := hex.DecodeString(h)
	if err != nil {
		return "", fmt.Errorf("hex decode string: %w", err)
	}
	return "sha256-" + base64.StdEncoding.EncodeToString(hb), nil
}

// moduleHashCacheKey returns a unique key for the module hash cache.
func (c *Calculator) moduleHashCacheKey(pluginId, pluginVersion string) string {
	return pluginId + ":" + pluginVersion
}
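To illustrate what `convertHashForSRI` above produces: browsers verify a Subresource Integrity attribute of the form `sha256-<base64>`, while MANIFEST.txt stores hex-encoded SHA-256 digests, so the helper decodes the hex and re-encodes it as base64. The sketch below is a standalone copy of that helper, and the input/output pair is taken from this PR's own test data in modulehash_test.go:

```go
package main

import (
	"encoding/base64"
	"encoding/hex"
	"fmt"
)

// convertHashForSRI mirrors the helper in modulehash.go: it decodes a
// hex-encoded SHA-256 digest and re-encodes it as base64 with the
// "sha256-" prefix, the form browsers expect for SRI checks.
func convertHashForSRI(h string) (string, error) {
	hb, err := hex.DecodeString(h)
	if err != nil {
		return "", fmt.Errorf("hex decode string: %w", err)
	}
	return "sha256-" + base64.StdEncoding.EncodeToString(hb), nil
}

func main() {
	// Hex digest and expected SRI value taken from TestConvertHashFromSRI.
	sri, err := convertHashForSRI("ddfcb449445064e6c39f0c20b15be3cb6a55837cf4781df23d02de005f436811")
	if err != nil {
		panic(err)
	}
	fmt.Println(sri)
	// prints "sha256-3fy0SURQZObDnwwgsVvjy2pVg3z0eB3yPQLeAF9DaBE="
}
```

The resulting string can be placed directly in a `<script integrity="...">` attribute for module.js, which is why the calculator returns it in this format rather than as raw hex.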
pkg/plugins/pluginassets/modulehash/modulehash_test.go (new file, +459)
@@ -0,0 +1,459 @@
package modulehash

import (
	"context"
	"fmt"
	"path/filepath"
	"testing"

	"github.com/stretchr/testify/require"

	"github.com/grafana/grafana/pkg/plugins"
	"github.com/grafana/grafana/pkg/plugins/config"
	"github.com/grafana/grafana/pkg/plugins/manager/registry"
	"github.com/grafana/grafana/pkg/plugins/manager/signature"
	"github.com/grafana/grafana/pkg/plugins/manager/signature/statickey"
	"github.com/grafana/grafana/pkg/plugins/pluginscdn"
)

func Test_ModuleHash(t *testing.T) {
	const (
		pluginID       = "grafana-test-datasource"
		parentPluginID = "grafana-test-app"
	)
	for _, tc := range []struct {
		name     string
		features *config.Features
		registry []*plugins.Plugin

		// Can be used to configure plugin's fs
		// fs cdn type = loaded from CDN with no files on disk
		// fs local type = files on disk but served from CDN only if cdn=true
		plugin string

		// When true, set cdn=true in config
		cdn           bool
		expModuleHash string
	}{
		{
			name:          "unsigned should not return module hash",
			plugin:        pluginID,
			registry:      []*plugins.Plugin{newPlugin(pluginID, withSignatureStatus(plugins.SignatureStatusUnsigned))},
			cdn:           false,
			features:      &config.Features{SriChecksEnabled: false},
			expModuleHash: "",
		},
		{
			plugin: pluginID,
			registry: []*plugins.Plugin{newPlugin(
				pluginID,
				withSignatureStatus(plugins.SignatureStatusValid),
				withFS(plugins.NewLocalFS(filepath.Join("../testdata", "module-hash-valid"))),
				withClass(plugins.ClassExternal),
			)},
			cdn:           true,
			features:      &config.Features{SriChecksEnabled: true},
			expModuleHash: newSRIHash(t, "5891b5b522d5df086d0ff0b110fbd9d21bb4fc7163af34d08286a2e846f6be03"),
		},
		{
			plugin: pluginID,
			registry: []*plugins.Plugin{newPlugin(
				pluginID,
				withSignatureStatus(plugins.SignatureStatusValid),
				withFS(plugins.NewLocalFS(filepath.Join("../testdata", "module-hash-valid"))),
				withClass(plugins.ClassExternal),
			)},
			cdn:           true,
			features:      &config.Features{SriChecksEnabled: true},
			expModuleHash: newSRIHash(t, "5891b5b522d5df086d0ff0b110fbd9d21bb4fc7163af34d08286a2e846f6be03"),
		},
		{
			plugin: pluginID,
			registry: []*plugins.Plugin{newPlugin(
				pluginID,
				withSignatureStatus(plugins.SignatureStatusValid),
				withFS(plugins.NewLocalFS(filepath.Join("../testdata", "module-hash-valid"))),
			)},
			cdn:           false,
			features:      &config.Features{SriChecksEnabled: true},
			expModuleHash: "",
		},
		{
			plugin: pluginID,
			registry: []*plugins.Plugin{newPlugin(
				pluginID,
				withSignatureStatus(plugins.SignatureStatusValid),
				withFS(plugins.NewLocalFS(filepath.Join("../testdata", "module-hash-valid"))),
			)},
			cdn:           true,
			features:      &config.Features{SriChecksEnabled: false},
			expModuleHash: "",
		},
		{
			plugin: pluginID,
			registry: []*plugins.Plugin{newPlugin(
				pluginID,
				withSignatureStatus(plugins.SignatureStatusValid),
				withFS(plugins.NewLocalFS(filepath.Join("../testdata", "module-hash-valid"))),
			)},
			cdn:           false,
			features:      &config.Features{SriChecksEnabled: false},
			expModuleHash: "",
		},
		{
			// parentPluginID (/)
			// └── pluginID (/datasource)
			name:   "nested plugin should return module hash from parent MANIFEST.txt",
			plugin: pluginID,
			registry: []*plugins.Plugin{
				newPlugin(
					pluginID,
					withSignatureStatus(plugins.SignatureStatusValid),
					withFS(plugins.NewLocalFS(filepath.Join("../testdata", "module-hash-valid-nested", "datasource"))),
					withParent(newPlugin(
						parentPluginID,
						withSignatureStatus(plugins.SignatureStatusValid),
						withFS(plugins.NewLocalFS(filepath.Join("../testdata", "module-hash-valid-nested"))),
					)),
				),
			},
			cdn:           true,
			features:      &config.Features{SriChecksEnabled: true},
			expModuleHash: newSRIHash(t, "04d70db091d96c4775fb32ba5a8f84cc22893eb43afdb649726661d4425c6711"),
		},
		{
			// parentPluginID (/)
			// └── pluginID (/panels/one)
			name:   "nested plugin deeper than one subfolder should return module hash from parent MANIFEST.txt",
			plugin: pluginID,
			registry: []*plugins.Plugin{
				newPlugin(
					pluginID,
					withSignatureStatus(plugins.SignatureStatusValid),
					withFS(plugins.NewLocalFS(filepath.Join("../testdata", "module-hash-valid-nested", "panels", "one"))),
					withParent(newPlugin(
						parentPluginID,
						withSignatureStatus(plugins.SignatureStatusValid),
						withFS(plugins.NewLocalFS(filepath.Join("../testdata", "module-hash-valid-nested"))),
					)),
				),
			},
			cdn:           true,
			features:      &config.Features{SriChecksEnabled: true},
			expModuleHash: newSRIHash(t, "cbd1ac2284645a0e1e9a8722a729f5bcdd2b831222728709c6360beecdd6143f"),
		},
		{
			// grand-parent-app (/)
			// ├── parent-datasource (/datasource)
			// │   └── child-panel (/datasource/panels/one)
			name: "nested plugin of a nested plugin should return module hash from parent MANIFEST.txt",
			registry: []*plugins.Plugin{
				newPlugin(
					"child-panel",
					withSignatureStatus(plugins.SignatureStatusValid),
					withFS(plugins.NewLocalFS(filepath.Join("../testdata", "module-hash-valid-deeply-nested", "datasource", "panels", "one"))),
					withParent(newPlugin(
						"parent-datasource",
						withSignatureStatus(plugins.SignatureStatusValid),
						withFS(plugins.NewLocalFS(filepath.Join("../testdata", "module-hash-valid-deeply-nested", "datasource"))),
						withParent(newPlugin(
							"grand-parent-app",
							withSignatureStatus(plugins.SignatureStatusValid),
							withFS(plugins.NewLocalFS(filepath.Join("../testdata", "module-hash-valid-deeply-nested"))),
						)),
					)),
				),
			},
			plugin:        "child-panel",
			cdn:           true,
			features:      &config.Features{SriChecksEnabled: true},
			expModuleHash: newSRIHash(t, "cbd1ac2284645a0e1e9a8722a729f5bcdd2b831222728709c6360beecdd6143f"),
		},
		{
			name:   "nested plugin should not return module hash from parent if it's not registered in the registry",
			plugin: pluginID,
			registry: []*plugins.Plugin{newPlugin(
				pluginID,
				withSignatureStatus(plugins.SignatureStatusValid),
				withFS(plugins.NewLocalFS(filepath.Join("../testdata", "module-hash-valid-nested", "panels", "one"))),
				withParent(newPlugin(
					parentPluginID,
					withSignatureStatus(plugins.SignatureStatusValid),
					withFS(plugins.NewLocalFS(filepath.Join("../testdata", "module-hash-valid-nested"))),
				)),
			)},
			cdn:           false,
			features:      &config.Features{SriChecksEnabled: true},
			expModuleHash: "",
		},
		{
			name:   "missing module.js entry from MANIFEST.txt should not return module hash",
			plugin: pluginID,
			registry: []*plugins.Plugin{newPlugin(
				pluginID,
				withSignatureStatus(plugins.SignatureStatusValid),
				withFS(plugins.NewLocalFS(filepath.Join("../testdata", "module-hash-no-module-js"))),
			)},
			cdn:           false,
			features:      &config.Features{SriChecksEnabled: true},
			expModuleHash: "",
		},
		{
			name:   "signed status but missing MANIFEST.txt should not return module hash",
			plugin: pluginID,
			registry: []*plugins.Plugin{newPlugin(
				pluginID,
				withSignatureStatus(plugins.SignatureStatusValid),
				withFS(plugins.NewLocalFS(filepath.Join("../testdata", "module-hash-no-manifest-txt"))),
			)},
			cdn:           false,
			features:      &config.Features{SriChecksEnabled: true},
			expModuleHash: "",
		},
	} {
		if tc.name == "" {
			var expS string
			if tc.expModuleHash == "" {
				expS = "should not return module hash"
			} else {
				expS = "should return module hash"
			}
			tc.name = fmt.Sprintf("feature=%v, cdn_config=%v %s", tc.features.SriChecksEnabled, tc.cdn, expS)
		}

		t.Run(tc.name, func(t *testing.T) {
			var pluginSettings config.PluginSettings
			if tc.cdn {
				pluginSettings = config.PluginSettings{
					pluginID: {
						"cdn": "true",
					},
					parentPluginID: map[string]string{
						"cdn": "true",
					},
					"grand-parent-app": map[string]string{
						"cdn": "true",
					},
				}
			}
			features := tc.features
			if features == nil {
				features = &config.Features{}
			}
			pCfg := &config.PluginManagementCfg{
				PluginsCDNURLTemplate: "http://cdn.example.com",
				PluginSettings:        pluginSettings,
				Features:              *features,
			}

			svc := NewCalculator(
				pCfg,
				newPluginRegistry(t, tc.registry...),
				pluginscdn.ProvideService(pCfg),
				signature.ProvideService(pCfg, statickey.New()),
			)
			mh := svc.ModuleHash(context.Background(), tc.plugin, "")
			require.Equal(t, tc.expModuleHash, mh)
		})
	}
}

func Test_ModuleHash_Cache(t *testing.T) {
	pCfg := &config.PluginManagementCfg{
		PluginSettings: config.PluginSettings{},
		Features:       config.Features{SriChecksEnabled: true},
	}
	svc := NewCalculator(
		pCfg,
		newPluginRegistry(t),
		pluginscdn.ProvideService(pCfg),
		signature.ProvideService(pCfg, statickey.New()),
	)
	const pluginID = "grafana-test-datasource"

	t.Run("cache key", func(t *testing.T) {
		t.Run("with version", func(t *testing.T) {
			const pluginVersion = "1.0.0"
			p := newPlugin(pluginID, withInfo(plugins.Info{Version: pluginVersion}))
			k := svc.moduleHashCacheKey(p.ID, p.Info.Version)
			require.Equal(t, pluginID+":"+pluginVersion, k, "cache key should be correct")
		})

		t.Run("without version", func(t *testing.T) {
			p := newPlugin(pluginID)
			k := svc.moduleHashCacheKey(p.ID, p.Info.Version)
			require.Equal(t, pluginID+":", k, "cache key should be correct")
		})
	})

	t.Run("ModuleHash usage", func(t *testing.T) {
		pV1 := newPlugin(
			pluginID,
			withInfo(plugins.Info{Version: "1.0.0"}),
			withSignatureStatus(plugins.SignatureStatusValid),
			withFS(plugins.NewLocalFS(filepath.Join("../testdata", "module-hash-valid"))),
		)

		pCfg = &config.PluginManagementCfg{
			PluginsCDNURLTemplate: "https://cdn.grafana.com",
			PluginSettings: config.PluginSettings{
				pluginID: {
					"cdn": "true",
				},
			},
			Features: config.Features{SriChecksEnabled: true},
		}
		reg := newPluginRegistry(t, pV1)
		svc = NewCalculator(
			pCfg,
			reg,
			pluginscdn.ProvideService(pCfg),
			signature.ProvideService(pCfg, statickey.New()),
		)

		k := svc.moduleHashCacheKey(pV1.ID, pV1.Info.Version)

		_, ok := svc.moduleHashCache.Load(k)
		require.False(t, ok, "cache should initially be empty")

		mhV1 := svc.ModuleHash(context.Background(), pV1.ID, pV1.Info.Version)
		pV1Exp := newSRIHash(t, "5891b5b522d5df086d0ff0b110fbd9d21bb4fc7163af34d08286a2e846f6be03")
		require.Equal(t, pV1Exp, mhV1, "returned value should be correct")

		cachedMh, ok := svc.moduleHashCache.Load(k)
		require.True(t, ok)
		require.Equal(t, pV1Exp, cachedMh, "cache should contain the returned value")

		t.Run("different version uses different cache key", func(t *testing.T) {
			pV2 := newPlugin(
				pluginID,
				withInfo(plugins.Info{Version: "2.0.0"}),
				withSignatureStatus(plugins.SignatureStatusValid),
				// different fs for different hash
				withFS(plugins.NewLocalFS(filepath.Join("../testdata", "module-hash-valid-nested"))),
			)
			err := reg.Add(context.Background(), pV2)
			require.NoError(t, err)

			mhV2 := svc.ModuleHash(context.Background(), pV2.ID, pV2.Info.Version)
			require.NotEqual(t, mhV2, mhV1, "different version should have different hash")
			require.Equal(t, newSRIHash(t, "266c19bc148b22ddef2a288fc5f8f40855bda22ccf60be53340b4931e469ae2a"), mhV2)
		})

		t.Run("cache should be used", func(t *testing.T) {
			// edit cache directly
			svc.moduleHashCache.Store(k, "hax")
			require.Equal(t, "hax", svc.ModuleHash(context.Background(), pV1.ID, pV1.Info.Version))
		})
	})
}

func TestConvertHashFromSRI(t *testing.T) {
	for _, tc := range []struct {
		hash    string
		expHash string
		expErr  bool
	}{
		{
			hash:    "ddfcb449445064e6c39f0c20b15be3cb6a55837cf4781df23d02de005f436811",
			expHash: "sha256-3fy0SURQZObDnwwgsVvjy2pVg3z0eB3yPQLeAF9DaBE=",
		},
		{
			hash:   "not-a-valid-hash",
			expErr: true,
		},
	} {
		t.Run(tc.hash, func(t *testing.T) {
			r, err := convertHashForSRI(tc.hash)
			if tc.expErr {
				require.Error(t, err)
			} else {
				require.NoError(t, err)
				require.Equal(t, tc.expHash, r)
			}
		})
	}
}

func newPlugin(pluginID string, cbs ...func(p *plugins.Plugin) *plugins.Plugin) *plugins.Plugin {
	p := &plugins.Plugin{
		JSONData: plugins.JSONData{
			ID: pluginID,
		},
	}
	for _, cb := range cbs {
		p = cb(p)
	}
	return p
}

func withInfo(info plugins.Info) func(p *plugins.Plugin) *plugins.Plugin {
	return func(p *plugins.Plugin) *plugins.Plugin {
		p.Info = info
		return p
	}
}

func withFS(fs plugins.FS) func(p *plugins.Plugin) *plugins.Plugin {
	return func(p *plugins.Plugin) *plugins.Plugin {
		p.FS = fs
		return p
	}
}

func withSignatureStatus(status plugins.SignatureStatus) func(p *plugins.Plugin) *plugins.Plugin {
	return func(p *plugins.Plugin) *plugins.Plugin {
		p.Signature = status
		return p
	}
}

func withParent(parent *plugins.Plugin) func(p *plugins.Plugin) *plugins.Plugin {
	return func(p *plugins.Plugin) *plugins.Plugin {
		p.Parent = parent
		return p
	}
}

func withClass(class plugins.Class) func(p *plugins.Plugin) *plugins.Plugin {
	return func(p *plugins.Plugin) *plugins.Plugin {
		p.Class = class
		return p
	}
}

func newSRIHash(t *testing.T, s string) string {
	r, err := convertHashForSRI(s)
	require.NoError(t, err)
	return r
}

type pluginRegistry struct {
	registry.Service

	reg map[string]*plugins.Plugin
}

func newPluginRegistry(t *testing.T, ps ...*plugins.Plugin) *pluginRegistry {
	reg := &pluginRegistry{
		reg: make(map[string]*plugins.Plugin),
	}
	for _, p := range ps {
		err := reg.Add(context.Background(), p)
		require.NoError(t, err)
	}
	return reg
}

func (f *pluginRegistry) Plugin(_ context.Context, id, version string) (*plugins.Plugin, bool) {
	key := fmt.Sprintf("%s-%s", id, version)
	p, exists := f.reg[key]
	return p, exists
}

func (f *pluginRegistry) Add(_ context.Context, p *plugins.Plugin) error {
	key := fmt.Sprintf("%s-%s", p.ID, p.Info.Version)
	f.reg[key] = p
	return nil
}
pkg/server/wire_gen.go (generated, 6 lines changed)
@@ -715,7 +715,8 @@ func Initialize(ctx context.Context, cfg *setting.Cfg, opts Options, apiOpts api
 		return nil, err
 	}
 	pluginscdnService := pluginscdn.ProvideService(pluginManagementCfg)
-	pluginassetsService := pluginassets2.ProvideService(pluginManagementCfg, pluginscdnService, signatureSignature, pluginstoreService)
+	calculator := pluginassets2.ProvideModuleHashCalculator(pluginManagementCfg, pluginscdnService, signatureSignature, inMemory)
+	pluginassetsService := pluginassets2.ProvideService(pluginManagementCfg, pluginscdnService, calculator)
 	avatarCacheServer := avatar.ProvideAvatarCacheServer(cfg)
 	prefService := prefimpl.ProvideService(sqlStore, cfg)
 	dashboardPermissionsService, err := ossaccesscontrol.ProvideDashboardPermissions(cfg, featureToggles, routeRegisterImpl, sqlStore, accessControl, ossLicensingService, dashboardService, folderimplService, acimplService, teamService, userService, actionSetService, dashboardServiceImpl, eventualRestConfigProvider)
@@ -1383,7 +1384,8 @@ func InitializeForTest(ctx context.Context, t sqlutil.ITestDB, testingT interfac
 		return nil, err
 	}
 	pluginscdnService := pluginscdn.ProvideService(pluginManagementCfg)
-	pluginassetsService := pluginassets2.ProvideService(pluginManagementCfg, pluginscdnService, signatureSignature, pluginstoreService)
+	calculator := pluginassets2.ProvideModuleHashCalculator(pluginManagementCfg, pluginscdnService, signatureSignature, inMemory)
+	pluginassetsService := pluginassets2.ProvideService(pluginManagementCfg, pluginscdnService, calculator)
 	avatarCacheServer := avatar.ProvideAvatarCacheServer(cfg)
 	prefService := prefimpl.ProvideService(sqlStore, cfg)
 	dashboardPermissionsService, err := ossaccesscontrol.ProvideDashboardPermissions(cfg, featureToggles, routeRegisterImpl, sqlStore, accessControl, ossLicensingService, dashboardService, folderimplService, acimplService, teamService, userService, actionSetService, dashboardServiceImpl, eventualRestConfigProvider)
@@ -36,9 +36,6 @@ var client = &http.Client{
|
|||||||
Transport: &http.Transport{Proxy: http.ProxyFromEnvironment},
|
Transport: &http.Transport{Proxy: http.ProxyFromEnvironment},
|
||||||
}
|
}
|
||||||
|
|
||||||
// CreateDashboardSnapshot creates a snapshot when running Grafana in regular mode.
|
|
||||||
// It validates the user and dashboard exist before creating the snapshot.
|
|
||||||
// This mode supports both local and external snapshots.
|
|
||||||
func CreateDashboardSnapshot(c *contextmodel.ReqContext, cfg snapshot.SnapshotSharingOptions, cmd CreateDashboardSnapshotCommand, svc Service) {
|
func CreateDashboardSnapshot(c *contextmodel.ReqContext, cfg snapshot.SnapshotSharingOptions, cmd CreateDashboardSnapshotCommand, svc Service) {
|
||||||
if !cfg.SnapshotsEnabled {
|
if !cfg.SnapshotsEnabled {
|
||||||
c.JsonApiErr(http.StatusForbidden, "Dashboard Snapshots are disabled", nil)
|
c.JsonApiErr(http.StatusForbidden, "Dashboard Snapshots are disabled", nil)
|
||||||
@@ -46,7 +43,6 @@ func CreateDashboardSnapshot(c *contextmodel.ReqContext, cfg snapshot.SnapshotSh
 	}
 
 	uid := cmd.Dashboard.GetNestedString("uid")
-
 	user, err := identity.GetRequester(c.Req.Context())
 	if err != nil {
 		c.JsonApiErr(http.StatusBadRequest, "missing user in context", nil)
@@ -63,18 +59,21 @@ func CreateDashboardSnapshot(c *contextmodel.ReqContext, cfg snapshot.SnapshotSh
 		return
 	}
 
-	cmd.ExternalURL = ""
-	cmd.OrgID = user.GetOrgID()
-	cmd.UserID, _ = identity.UserIdentifier(user.GetID())
-
 	if cmd.Name == "" {
 		cmd.Name = "Unnamed snapshot"
 	}
 
-	var snapshotURL string
+	var snapshotUrl string
+	cmd.ExternalURL = ""
+	cmd.OrgID = user.GetOrgID()
+	cmd.UserID, _ = identity.UserIdentifier(user.GetID())
+	originalDashboardURL, err := createOriginalDashboardURL(&cmd)
+	if err != nil {
+		c.JsonApiErr(http.StatusInternalServerError, "Invalid app URL", err)
+		return
+	}
 
 	if cmd.External {
-		// Handle external snapshot creation
 		if !cfg.ExternalEnabled {
 			c.JsonApiErr(http.StatusForbidden, "External dashboard creation is disabled", nil)
 			return
@@ -86,83 +85,40 @@ func CreateDashboardSnapshot(c *contextmodel.ReqContext, cfg snapshot.SnapshotSh
 			return
 		}
 
+		snapshotUrl = resp.Url
 		cmd.Key = resp.Key
 		cmd.DeleteKey = resp.DeleteKey
 		cmd.ExternalURL = resp.Url
 		cmd.ExternalDeleteURL = resp.DeleteUrl
 		cmd.Dashboard = &common.Unstructured{}
-		snapshotURL = resp.Url
 
 		metrics.MApiDashboardSnapshotExternal.Inc()
 	} else {
-		// Handle local snapshot creation
-		originalDashboardURL, err := createOriginalDashboardURL(&cmd)
-		if err != nil {
-			c.JsonApiErr(http.StatusInternalServerError, "Invalid app URL", err)
-			return
+		cmd.Dashboard.SetNestedField(originalDashboardURL, "snapshot", "originalUrl")
+		if cmd.Key == "" {
+			var err error
+			cmd.Key, err = util.GetRandomString(32)
+			if err != nil {
+				c.JsonApiErr(http.StatusInternalServerError, "Could not generate random string", err)
+				return
+			}
 		}
 
-		snapshotURL, err = prepareLocalSnapshot(&cmd, originalDashboardURL)
-		if err != nil {
-			c.JsonApiErr(http.StatusInternalServerError, "Could not generate random string", err)
-			return
+		if cmd.DeleteKey == "" {
+			var err error
+			cmd.DeleteKey, err = util.GetRandomString(32)
+			if err != nil {
+				c.JsonApiErr(http.StatusInternalServerError, "Could not generate random string", err)
+				return
+			}
 		}
 
+		snapshotUrl = setting.ToAbsUrl("dashboard/snapshot/" + cmd.Key)
+
 		metrics.MApiDashboardSnapshotCreate.Inc()
 	}
 
-	saveAndRespond(c, svc, cmd, snapshotURL)
-}
-
-// CreateDashboardSnapshotPublic creates a snapshot when running Grafana in public mode.
-// In public mode, there is no user or dashboard information to validate.
-// Only local snapshots are supported (external snapshots are not available).
-func CreateDashboardSnapshotPublic(c *contextmodel.ReqContext, cfg snapshot.SnapshotSharingOptions, cmd CreateDashboardSnapshotCommand, svc Service) {
-	if !cfg.SnapshotsEnabled {
-		c.JsonApiErr(http.StatusForbidden, "Dashboard Snapshots are disabled", nil)
-		return
-	}
-
-	if cmd.Name == "" {
-		cmd.Name = "Unnamed snapshot"
-	}
-
-	snapshotURL, err := prepareLocalSnapshot(&cmd, "")
-	if err != nil {
-		c.JsonApiErr(http.StatusInternalServerError, "Could not generate random string", err)
-		return
-	}
-
-	metrics.MApiDashboardSnapshotCreate.Inc()
-
-	saveAndRespond(c, svc, cmd, snapshotURL)
-}
-
-// prepareLocalSnapshot prepares the command for a local snapshot and returns the snapshot URL.
-func prepareLocalSnapshot(cmd *CreateDashboardSnapshotCommand, originalDashboardURL string) (string, error) {
-	cmd.Dashboard.SetNestedField(originalDashboardURL, "snapshot", "originalUrl")
-
-	if cmd.Key == "" {
-		key, err := util.GetRandomString(32)
-		if err != nil {
-			return "", err
-		}
-		cmd.Key = key
-	}
-
-	if cmd.DeleteKey == "" {
-		deleteKey, err := util.GetRandomString(32)
-		if err != nil {
-			return "", err
-		}
-		cmd.DeleteKey = deleteKey
-	}
-
-	return setting.ToAbsUrl("dashboard/snapshot/" + cmd.Key), nil
-}
-
-// saveAndRespond saves the snapshot and sends the response.
-func saveAndRespond(c *contextmodel.ReqContext, svc Service, cmd CreateDashboardSnapshotCommand, snapshotURL string) {
 	result, err := svc.CreateDashboardSnapshot(c.Req.Context(), &cmd)
 	if err != nil {
 		c.JsonApiErr(http.StatusInternalServerError, "Failed to create snapshot", err)
@@ -172,7 +128,7 @@ func saveAndRespond(c *contextmodel.ReqContext, svc Service, cmd CreateDashboard
 	c.JSON(http.StatusOK, snapshot.DashboardCreateResponse{
 		Key:       result.Key,
 		DeleteKey: result.DeleteKey,
-		URL:       snapshotURL,
+		URL:       snapshotUrl,
 		DeleteURL: setting.ToAbsUrl("api/snapshots-delete/" + result.DeleteKey),
 	})
 }
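For context on the inlined key generation in the hunks above: both branches fall back to a 32-character random string whenever `cmd.Key` or `cmd.DeleteKey` is empty. A self-contained sketch of that pattern, using `crypto/rand` as a stand-in for Grafana's `util.GetRandomString` helper (the alphanumeric alphabet below is an assumption for illustration, not that helper's actual implementation):

```go
package main

import (
	"crypto/rand"
	"fmt"
	"math/big"
)

const alphanum = "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789"

// getRandomString returns n random alphanumeric characters, drawn from a
// cryptographically secure source. It mirrors how the snapshot code fills
// in Key and DeleteKey when the caller did not supply them.
func getRandomString(n int) (string, error) {
	b := make([]byte, n)
	for i := range b {
		idx, err := rand.Int(rand.Reader, big.NewInt(int64(len(alphanum))))
		if err != nil {
			return "", err
		}
		b[i] = alphanum[idx.Int64()]
	}
	return string(b), nil
}

func main() {
	key, _ := getRandomString(32)
	deleteKey, _ := getRandomString(32)
	fmt.Println(len(key), len(deleteKey), key != deleteKey)
}
```

Because the key later becomes part of the public snapshot URL (`dashboard/snapshot/<key>`), using a secure random source rather than `math/rand` matters here.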
@@ -20,30 +20,40 @@ import (
 	"github.com/grafana/grafana/pkg/web"
 )
 
-func createTestDashboard(t *testing.T) *common.Unstructured {
-	t.Helper()
-	dashboard := &common.Unstructured{}
-	dashboardData := map[string]any{
-		"uid": "test-dashboard-uid",
-		"id":  123,
+func TestCreateDashboardSnapshot_DashboardNotFound(t *testing.T) {
+	mockService := &MockService{}
+	cfg := snapshot.SnapshotSharingOptions{
+		SnapshotsEnabled: true,
+		ExternalEnabled:  false,
 	}
-	dashboardBytes, _ := json.Marshal(dashboardData)
-	_ = json.Unmarshal(dashboardBytes, dashboard)
-	return dashboard
-}
-
-func createTestUser() *user.SignedInUser {
-	return &user.SignedInUser{
+	testUser := &user.SignedInUser{
 		UserID: 1,
 		OrgID:  1,
 		Login:  "testuser",
 		Name:   "Test User",
 		Email:  "test@example.com",
 	}
-}
+	dashboard := &common.Unstructured{}
+	dashboardData := map[string]interface{}{
+		"uid": "test-dashboard-uid",
+		"id":  123,
+	}
+	dashboardBytes, _ := json.Marshal(dashboardData)
+	_ = json.Unmarshal(dashboardBytes, dashboard)
+
+	cmd := CreateDashboardSnapshotCommand{
+		DashboardCreateCommand: snapshot.DashboardCreateCommand{
+			Dashboard: dashboard,
+			Name:      "Test Snapshot",
+		},
+	}
+
+	mockService.On("ValidateDashboardExists", mock.Anything, int64(1), "test-dashboard-uid").
+		Return(dashboards.ErrDashboardNotFound)
+
+	req, _ := http.NewRequest("POST", "/api/snapshots", nil)
+	req = req.WithContext(identity.WithRequester(req.Context(), testUser))
 
-func createReqContext(t *testing.T, req *http.Request, testUser *user.SignedInUser) (*contextmodel.ReqContext, *httptest.ResponseRecorder) {
-	t.Helper()
 	recorder := httptest.NewRecorder()
 	ctx := &contextmodel.ReqContext{
 		Context: &web.Context{
@@ -53,319 +63,13 @@ func createReqContext(t *testing.T, req *http.Request, testUser *user.SignedInUs
 		SignedInUser: testUser,
 		Logger:       log.NewNopLogger(),
 	}
-	return ctx, recorder
-}
+	CreateDashboardSnapshot(ctx, cfg, cmd, mockService)
 
-// TestCreateDashboardSnapshot tests snapshot creation in regular mode (non-public instance).
-// These tests cover scenarios when Grafana is running as a regular server with user authentication.
-func TestCreateDashboardSnapshot(t *testing.T) {
-	t.Run("should return error when dashboard not found", func(t *testing.T) {
-		mockService := &MockService{}
-		cfg := snapshot.SnapshotSharingOptions{
-			SnapshotsEnabled: true,
-			ExternalEnabled:  false,
-		}
-		testUser := createTestUser()
-		dashboard := createTestDashboard(t)
-
-		cmd := CreateDashboardSnapshotCommand{
-			DashboardCreateCommand: snapshot.DashboardCreateCommand{
-				Dashboard: dashboard,
-				Name:      "Test Snapshot",
-			},
-		}
-
-		mockService.On("ValidateDashboardExists", mock.Anything, int64(1), "test-dashboard-uid").
-			Return(dashboards.ErrDashboardNotFound)
-
-		req, _ := http.NewRequest("POST", "/api/snapshots", nil)
-		req = req.WithContext(identity.WithRequester(req.Context(), testUser))
-		ctx, recorder := createReqContext(t, req, testUser)
-
-		CreateDashboardSnapshot(ctx, cfg, cmd, mockService)
-
-		mockService.AssertExpectations(t)
-		assert.Equal(t, http.StatusBadRequest, recorder.Code)
-		var response map[string]any
-		err := json.Unmarshal(recorder.Body.Bytes(), &response)
-		require.NoError(t, err)
-		assert.Equal(t, "Dashboard not found", response["message"])
-	})
-
-	t.Run("should create external snapshot when external is enabled", func(t *testing.T) {
-		externalServer := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
-			assert.Equal(t, "/api/snapshots", r.URL.Path)
-			assert.Equal(t, "POST", r.Method)
-
-			response := map[string]any{
-				"key":       "external-key",
-				"deleteKey": "external-delete-key",
-				"url":       "https://external.example.com/dashboard/snapshot/external-key",
-				"deleteUrl": "https://external.example.com/api/snapshots-delete/external-delete-key",
-			}
-			w.Header().Set("Content-Type", "application/json")
-			_ = json.NewEncoder(w).Encode(response)
-		}))
-		defer externalServer.Close()
-
-		mockService := NewMockService(t)
-		cfg := snapshot.SnapshotSharingOptions{
-			SnapshotsEnabled:    true,
-			ExternalEnabled:     true,
-			ExternalSnapshotURL: externalServer.URL,
-		}
-		testUser := createTestUser()
-		dashboard := createTestDashboard(t)
-
-		cmd := CreateDashboardSnapshotCommand{
-			DashboardCreateCommand: snapshot.DashboardCreateCommand{
-				Dashboard: dashboard,
-				Name:      "Test External Snapshot",
-				External:  true,
-			},
-		}
-
-		mockService.On("ValidateDashboardExists", mock.Anything, int64(1), "test-dashboard-uid").
-			Return(nil)
-		mockService.On("CreateDashboardSnapshot", mock.Anything, mock.Anything).
-			Return(&DashboardSnapshot{
-				Key:       "external-key",
-				DeleteKey: "external-delete-key",
-			}, nil)
-
-		req, _ := http.NewRequest("POST", "/api/snapshots", nil)
-		req = req.WithContext(identity.WithRequester(req.Context(), testUser))
-		ctx, recorder := createReqContext(t, req, testUser)
-
-		CreateDashboardSnapshot(ctx, cfg, cmd, mockService)
-
-		mockService.AssertExpectations(t)
-		assert.Equal(t, http.StatusOK, recorder.Code)
-
-		var response map[string]any
-		err := json.Unmarshal(recorder.Body.Bytes(), &response)
-		require.NoError(t, err)
-		assert.Equal(t, "external-key", response["key"])
-		assert.Equal(t, "external-delete-key", response["deleteKey"])
-		assert.Equal(t, "https://external.example.com/dashboard/snapshot/external-key", response["url"])
-	})
-
-	t.Run("should return forbidden when external is disabled", func(t *testing.T) {
-		mockService := NewMockService(t)
-		cfg := snapshot.SnapshotSharingOptions{
-			SnapshotsEnabled: true,
-			ExternalEnabled:  false,
-		}
-		testUser := createTestUser()
-		dashboard := createTestDashboard(t)
-
-		cmd := CreateDashboardSnapshotCommand{
-			DashboardCreateCommand: snapshot.DashboardCreateCommand{
-				Dashboard: dashboard,
-				Name:      "Test External Snapshot",
-				External:  true,
-			},
-		}
-
-		mockService.On("ValidateDashboardExists", mock.Anything, int64(1), "test-dashboard-uid").
-			Return(nil)
-
-		req, _ := http.NewRequest("POST", "/api/snapshots", nil)
-		req = req.WithContext(identity.WithRequester(req.Context(), testUser))
-		ctx, recorder := createReqContext(t, req, testUser)
-
-		CreateDashboardSnapshot(ctx, cfg, cmd, mockService)
-
-		mockService.AssertExpectations(t)
-		assert.Equal(t, http.StatusForbidden, recorder.Code)
-
-		var response map[string]any
-		err := json.Unmarshal(recorder.Body.Bytes(), &response)
-		require.NoError(t, err)
-		assert.Equal(t, "External dashboard creation is disabled", response["message"])
-	})
-
-	t.Run("should create local snapshot", func(t *testing.T) {
-		mockService := NewMockService(t)
-		cfg := snapshot.SnapshotSharingOptions{
-			SnapshotsEnabled: true,
-		}
-		testUser := createTestUser()
-		dashboard := createTestDashboard(t)
-
-		cmd := CreateDashboardSnapshotCommand{
-			DashboardCreateCommand: snapshot.DashboardCreateCommand{
-				Dashboard: dashboard,
-				Name:      "Test Local Snapshot",
-			},
-			Key:       "local-key",
-			DeleteKey: "local-delete-key",
-		}
-
-		mockService.On("ValidateDashboardExists", mock.Anything, int64(1), "test-dashboard-uid").
-			Return(nil)
-		mockService.On("CreateDashboardSnapshot", mock.Anything, mock.Anything).
-			Return(&DashboardSnapshot{
-				Key:       "local-key",
-				DeleteKey: "local-delete-key",
-			}, nil)
-
-		req, _ := http.NewRequest("POST", "/api/snapshots", nil)
-		req = req.WithContext(identity.WithRequester(req.Context(), testUser))
-		ctx, recorder := createReqContext(t, req, testUser)
-
-		CreateDashboardSnapshot(ctx, cfg, cmd, mockService)
-
-		mockService.AssertExpectations(t)
-		assert.Equal(t, http.StatusOK, recorder.Code)
-
-		var response map[string]any
-		err := json.Unmarshal(recorder.Body.Bytes(), &response)
-		require.NoError(t, err)
-		assert.Equal(t, "local-key", response["key"])
-		assert.Equal(t, "local-delete-key", response["deleteKey"])
-		assert.Contains(t, response["url"], "dashboard/snapshot/local-key")
-		assert.Contains(t, response["deleteUrl"], "api/snapshots-delete/local-delete-key")
-	})
-}
-
-// TestCreateDashboardSnapshotPublic tests snapshot creation in public mode.
-// These tests cover scenarios when Grafana is running as a public snapshot server
-// where no user authentication or dashboard validation is required.
-func TestCreateDashboardSnapshotPublic(t *testing.T) {
-	t.Run("should create local snapshot without user context", func(t *testing.T) {
-		mockService := NewMockService(t)
-		cfg := snapshot.SnapshotSharingOptions{
-			SnapshotsEnabled: true,
-		}
-		dashboard := createTestDashboard(t)
-
-		cmd := CreateDashboardSnapshotCommand{
-			DashboardCreateCommand: snapshot.DashboardCreateCommand{
-				Dashboard: dashboard,
-				Name:      "Test Snapshot",
-			},
-			Key:       "test-key",
-			DeleteKey: "test-delete-key",
-		}
-
-		mockService.On("CreateDashboardSnapshot", mock.Anything, mock.Anything).
-			Return(&DashboardSnapshot{
-				Key:       "test-key",
-				DeleteKey: "test-delete-key",
-			}, nil)
-
-		req, _ := http.NewRequest("POST", "/api/snapshots", nil)
-		recorder := httptest.NewRecorder()
-		ctx := &contextmodel.ReqContext{
-			Context: &web.Context{
-				Req:  req,
-				Resp: web.NewResponseWriter("POST", recorder),
-			},
-			Logger: log.NewNopLogger(),
-		}
-
-		CreateDashboardSnapshotPublic(ctx, cfg, cmd, mockService)
-
-		mockService.AssertExpectations(t)
-		assert.Equal(t, http.StatusOK, recorder.Code)
-
-		var response map[string]any
-		err := json.Unmarshal(recorder.Body.Bytes(), &response)
-		require.NoError(t, err)
-		assert.Equal(t, "test-key", response["key"])
-		assert.Equal(t, "test-delete-key", response["deleteKey"])
-		assert.Contains(t, response["url"], "dashboard/snapshot/test-key")
-		assert.Contains(t, response["deleteUrl"], "api/snapshots-delete/test-delete-key")
-	})
-
-	t.Run("should return forbidden when snapshots are disabled", func(t *testing.T) {
-		mockService := NewMockService(t)
-		cfg := snapshot.SnapshotSharingOptions{
-			SnapshotsEnabled: false,
-		}
-		dashboard := createTestDashboard(t)
-
-		cmd := CreateDashboardSnapshotCommand{
-			DashboardCreateCommand: snapshot.DashboardCreateCommand{
-				Dashboard: dashboard,
-				Name:      "Test Snapshot",
-			},
-		}
-
-		req, _ := http.NewRequest("POST", "/api/snapshots", nil)
-		recorder := httptest.NewRecorder()
-		ctx := &contextmodel.ReqContext{
-			Context: &web.Context{
-				Req:  req,
-				Resp: web.NewResponseWriter("POST", recorder),
-			},
-			Logger: log.NewNopLogger(),
-		}
-
-		CreateDashboardSnapshotPublic(ctx, cfg, cmd, mockService)
-
-		assert.Equal(t, http.StatusForbidden, recorder.Code)
-
-		var response map[string]any
-		err := json.Unmarshal(recorder.Body.Bytes(), &response)
-		require.NoError(t, err)
-		assert.Equal(t, "Dashboard Snapshots are disabled", response["message"])
-	})
-}
-
-// TestDeleteExternalDashboardSnapshot tests deletion of external snapshots.
-// This function is called in public mode and doesn't require user context.
-func TestDeleteExternalDashboardSnapshot(t *testing.T) {
-	t.Run("should return nil on successful deletion", func(t *testing.T) {
-		server := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
-			assert.Equal(t, "GET", r.Method)
-			w.WriteHeader(http.StatusOK)
-		}))
-		defer server.Close()
-
-		err := DeleteExternalDashboardSnapshot(server.URL)
-		assert.NoError(t, err)
-	})
-
-	t.Run("should gracefully handle already deleted snapshot", func(t *testing.T) {
-		server := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
-			w.WriteHeader(http.StatusInternalServerError)
-			response := map[string]any{
-				"message": "Failed to get dashboard snapshot",
-			}
-			_ = json.NewEncoder(w).Encode(response)
-		}))
-		defer server.Close()
-
-		err := DeleteExternalDashboardSnapshot(server.URL)
-		assert.NoError(t, err)
-	})
-
-	t.Run("should return error on unexpected status code", func(t *testing.T) {
-		server := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
-			w.WriteHeader(http.StatusNotFound)
-		}))
-		defer server.Close()
-
-		err := DeleteExternalDashboardSnapshot(server.URL)
-		assert.Error(t, err)
-		assert.Contains(t, err.Error(), "unexpected response when deleting external snapshot")
-		assert.Contains(t, err.Error(), "404")
-	})
-
-	t.Run("should return error on 500 with different message", func(t *testing.T) {
-		server := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
-			w.WriteHeader(http.StatusInternalServerError)
-			response := map[string]any{
-				"message": "Some other error",
-			}
-			_ = json.NewEncoder(w).Encode(response)
-		}))
-		defer server.Close()
-
-		err := DeleteExternalDashboardSnapshot(server.URL)
-		assert.Error(t, err)
-		assert.Contains(t, err.Error(), "500")
-	})
+	mockService.AssertExpectations(t)
+	assert.Equal(t, http.StatusBadRequest, recorder.Code)
+	var response map[string]interface{}
+	err := json.Unmarshal(recorder.Body.Bytes(), &response)
+	require.NoError(t, err)
+	assert.Equal(t, "Dashboard not found", response["message"])
 }
@@ -2,19 +2,15 @@ package pluginassets
 
 import (
 	"context"
-	"encoding/base64"
-	"encoding/hex"
-	"fmt"
-	"path"
-	"path/filepath"
-	"sync"
 
 	"github.com/Masterminds/semver/v3"
 
 	"github.com/grafana/grafana/pkg/infra/log"
 	"github.com/grafana/grafana/pkg/plugins"
 	"github.com/grafana/grafana/pkg/plugins/config"
+	"github.com/grafana/grafana/pkg/plugins/manager/registry"
 	"github.com/grafana/grafana/pkg/plugins/manager/signature"
+	"github.com/grafana/grafana/pkg/plugins/pluginassets/modulehash"
 	"github.com/grafana/grafana/pkg/plugins/pluginscdn"
 	"github.com/grafana/grafana/pkg/services/pluginsintegration/pluginstore"
 )
@@ -28,24 +24,27 @@ var (
 	scriptLoadingMinSupportedVersion = semver.MustParse(CreatePluginVersionScriptSupportEnabled)
 )
 
-func ProvideService(cfg *config.PluginManagementCfg, cdn *pluginscdn.Service, sig *signature.Signature, store pluginstore.Store) *Service {
+func ProvideService(cfg *config.PluginManagementCfg, cdn *pluginscdn.Service,
+	calc *modulehash.Calculator) *Service {
 	return &Service{
 		cfg: cfg,
 		cdn: cdn,
-		signature: sig,
-		store:     store,
-		log:       log.New("pluginassets"),
+		log:  log.New("pluginassets"),
+		calc: calc,
 	}
 }
 
-type Service struct {
-	cfg *config.PluginManagementCfg
-	cdn *pluginscdn.Service
-	signature *signature.Signature
-	store pluginstore.Store
-	log log.Logger
-
-	moduleHashCache sync.Map
+func ProvideModuleHashCalculator(cfg *config.PluginManagementCfg, cdn *pluginscdn.Service,
+	signature *signature.Signature, reg registry.Service) *modulehash.Calculator {
+	return modulehash.NewCalculator(cfg, reg, cdn, signature)
+}
+
+type Service struct {
+	cfg  *config.PluginManagementCfg
+	cdn  *pluginscdn.Service
+	calc *modulehash.Calculator
+
+	log log.Logger
 }
 
 // LoadingStrategy calculates the loading strategy for a plugin.
@@ -83,92 +82,8 @@ func (s *Service) LoadingStrategy(_ context.Context, p pluginstore.Plugin) plugi
 }
 
 // ModuleHash returns the module.js SHA256 hash for a plugin in the format expected by the browser for SRI checks.
-// The module hash is read from the plugin's MANIFEST.txt file.
-// The plugin can also be a nested plugin.
-// If the plugin is unsigned, an empty string is returned.
-// The results are cached to avoid repeated reads from the MANIFEST.txt file.
 func (s *Service) ModuleHash(ctx context.Context, p pluginstore.Plugin) string {
-	k := s.moduleHashCacheKey(p)
-	cachedValue, ok := s.moduleHashCache.Load(k)
-	if ok {
-		return cachedValue.(string)
-	}
-	mh, err := s.moduleHash(ctx, p, "")
-	if err != nil {
-		s.log.Error("Failed to calculate module hash", "plugin", p.ID, "error", err)
-	}
-	s.moduleHashCache.Store(k, mh)
-	return mh
-}
-
-// moduleHash is the underlying function for ModuleHash. See its documentation for more information.
-// If the plugin is not a CDN plugin, the function will return an empty string.
-// It will read the module hash from the MANIFEST.txt in the [[plugins.FS]] of the provided plugin.
-// If childFSBase is provided, the function will try to get the hash from MANIFEST.txt for the provided children's
-// module.js file, rather than for the provided plugin.
-func (s *Service) moduleHash(ctx context.Context, p pluginstore.Plugin, childFSBase string) (r string, err error) {
-	if !s.cfg.Features.SriChecksEnabled {
-		return "", nil
-	}
-
-	// Ignore unsigned plugins
-	if !p.Signature.IsValid() {
-		return "", nil
-	}
-
-	if p.Parent != nil {
-		// Nested plugin
-		parent, ok := s.store.Plugin(ctx, p.Parent.ID)
-		if !ok {
-			return "", fmt.Errorf("parent plugin plugin %q for child plugin %q not found", p.Parent.ID, p.ID)
-		}
-
-		// The module hash is contained within the parent's MANIFEST.txt file.
-		// For example, the parent's MANIFEST.txt will contain an entry similar to this:
-		//
-		// ```
-		// "datasource/module.js": "1234567890abcdef..."
-		// ```
-		//
-		// Recursively call moduleHash with the parent plugin and with the children plugin folder path
-		// to get the correct module hash for the nested plugin.
-		if childFSBase == "" {
-			childFSBase = p.Base()
-		}
-		return s.moduleHash(ctx, parent, childFSBase)
-	}
-
-	// Only CDN plugins are supported for SRI checks.
-	// CDN plugins have the version as part of the URL, which acts as a cache-buster.
-	// Needed due to: https://github.com/grafana/plugin-tools/pull/1426
-	// FS plugins build before this change will have SRI mismatch issues.
-	if !s.cdnEnabled(p.ID, p.FS) {
-		return "", nil
-	}
-
-	manifest, err := s.signature.ReadPluginManifestFromFS(ctx, p.FS)
-	if err != nil {
-		return "", fmt.Errorf("read plugin manifest: %w", err)
-	}
-	if !manifest.IsV2() {
-		return "", nil
-	}
-
-	var childPath string
-	if childFSBase != "" {
-		// Calculate the relative path of the child plugin folder from the parent plugin folder.
-		childPath, err = p.FS.Rel(childFSBase)
-		if err != nil {
-			return "", fmt.Errorf("rel path: %w", err)
-		}
-		// MANIFETS.txt uses forward slashes as path separators.
-		childPath = filepath.ToSlash(childPath)
-	}
-	moduleHash, ok := manifest.Files[path.Join(childPath, "module.js")]
-	if !ok {
-		return "", nil
-	}
-	return convertHashForSRI(moduleHash)
+	return s.calc.ModuleHash(ctx, p.ID, p.Info.Version)
 }
 
 func (s *Service) compatibleCreatePluginVersion(ps map[string]string) bool {
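The `ModuleHash` wrapper removed in the hunk above memoized results in a `sync.Map` keyed by `<plugin ID>:<version>`, so the MANIFEST.txt lookup ran at most once per plugin version. The caching pattern in isolation (a sketch; `compute` stands in for the real manifest read):

```go
package main

import (
	"fmt"
	"sync"
)

// moduleHashCache memoizes a per-plugin computation, mirroring the
// sync.Map pattern in the deleted code: the key is "<id>:<version>",
// and a stored value short-circuits later lookups.
type moduleHashCache struct {
	cache sync.Map
}

func (c *moduleHashCache) moduleHash(id, version string, compute func() string) string {
	k := id + ":" + version
	if v, ok := c.cache.Load(k); ok {
		return v.(string)
	}
	mh := compute()
	c.cache.Store(k, mh)
	return mh
}

func main() {
	c := &moduleHashCache{}
	calls := 0
	compute := func() string { // stands in for reading MANIFEST.txt
		calls++
		return "sha256-abc"
	}
	c.moduleHash("grafana-test-datasource", "1.0.0", compute)
	c.moduleHash("grafana-test-datasource", "1.0.0", compute) // cache hit
	fmt.Println(calls)
}
```

Note that `sync.Map.Load`/`Store` does not deduplicate concurrent first calls; like the original, this trades an occasional duplicate computation for lock-free reads on the hot path.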
@@ -188,17 +103,3 @@ func (s *Service) compatibleCreatePluginVersion(ps map[string]string) bool {
 func (s *Service) cdnEnabled(pluginID string, fs plugins.FS) bool {
 	return s.cdn.PluginSupported(pluginID) || fs.Type().CDN()
 }
-
-// convertHashForSRI takes a SHA256 hash string and returns it as expected by the browser for SRI checks.
-func convertHashForSRI(h string) (string, error) {
-	hb, err := hex.DecodeString(h)
-	if err != nil {
-		return "", fmt.Errorf("hex decode string: %w", err)
-	}
-	return "sha256-" + base64.StdEncoding.EncodeToString(hb), nil
-}
-
-// moduleHashCacheKey returns a unique key for the module hash cache.
-func (s *Service) moduleHashCacheKey(p pluginstore.Plugin) string {
-	return p.ID + ":" + p.Info.Version
-}
@@ -2,19 +2,14 @@ package pluginassets
 
 import (
 	"context"
-	"fmt"
-	"path/filepath"
 	"testing"
 
 	"github.com/stretchr/testify/assert"
-	"github.com/stretchr/testify/require"
 
 	"github.com/grafana/grafana/pkg/infra/log"
 	"github.com/grafana/grafana/pkg/plugins"
 	"github.com/grafana/grafana/pkg/plugins/config"
 	"github.com/grafana/grafana/pkg/plugins/manager/pluginfakes"
-	"github.com/grafana/grafana/pkg/plugins/manager/signature"
-	"github.com/grafana/grafana/pkg/plugins/manager/signature/statickey"
 	"github.com/grafana/grafana/pkg/plugins/pluginscdn"
 	"github.com/grafana/grafana/pkg/services/pluginsintegration/pluginstore"
 )
@@ -179,349 +174,6 @@ func TestService_Calculate(t *testing.T) {
 		}
 	}
 }
 
-func TestService_ModuleHash(t *testing.T) {
-	const (
-		pluginID       = "grafana-test-datasource"
-		parentPluginID = "grafana-test-app"
-	)
-	for _, tc := range []struct {
-		name     string
-		features *config.Features
-		store    []pluginstore.Plugin
-
-		// Can be used to configure plugin's fs
-		// fs cdn type = loaded from CDN with no files on disk
-		// fs local type = files on disk but served from CDN only if cdn=true
-		plugin pluginstore.Plugin
-
-		// When true, set cdn=true in config
-		cdn           bool
-		expModuleHash string
-	}{
-		{
-			name:          "unsigned should not return module hash",
-			plugin:        newPlugin(pluginID, withSignatureStatus(plugins.SignatureStatusUnsigned)),
-			cdn:           false,
-			features:      &config.Features{SriChecksEnabled: false},
-			expModuleHash: "",
-		},
-		{
-			plugin: newPlugin(
-				pluginID,
-				withSignatureStatus(plugins.SignatureStatusValid),
-				withFS(plugins.NewLocalFS(filepath.Join("testdata", "module-hash-valid"))),
-				withClass(plugins.ClassExternal),
-			),
-			cdn:           true,
-			features:      &config.Features{SriChecksEnabled: true},
-			expModuleHash: newSRIHash(t, "5891b5b522d5df086d0ff0b110fbd9d21bb4fc7163af34d08286a2e846f6be03"),
-		},
-		{
-			plugin: newPlugin(
-				pluginID,
-				withSignatureStatus(plugins.SignatureStatusValid),
-				withFS(plugins.NewLocalFS(filepath.Join("testdata", "module-hash-valid"))),
-				withClass(plugins.ClassExternal),
-			),
-			cdn:           true,
-			features:      &config.Features{SriChecksEnabled: true},
-			expModuleHash: newSRIHash(t, "5891b5b522d5df086d0ff0b110fbd9d21bb4fc7163af34d08286a2e846f6be03"),
-		},
-		{
-			plugin: newPlugin(
-				pluginID,
-				withSignatureStatus(plugins.SignatureStatusValid),
-				withFS(plugins.NewLocalFS(filepath.Join("testdata", "module-hash-valid"))),
-			),
-			cdn:           false,
-			features:      &config.Features{SriChecksEnabled: true},
-			expModuleHash: "",
-		},
-		{
-			plugin: newPlugin(
-				pluginID,
-				withSignatureStatus(plugins.SignatureStatusValid),
-				withFS(plugins.NewLocalFS(filepath.Join("testdata", "module-hash-valid"))),
-			),
-			cdn:           true,
-			features:      &config.Features{SriChecksEnabled: false},
-			expModuleHash: "",
-		},
-		{
-			plugin: newPlugin(
-				pluginID,
-				withSignatureStatus(plugins.SignatureStatusValid),
-				withFS(plugins.NewLocalFS(filepath.Join("testdata", "module-hash-valid"))),
-			),
-			cdn:           false,
-			features:      &config.Features{SriChecksEnabled: false},
-			expModuleHash: "",
-		},
-		{
-			// parentPluginID (/)
-			// └── pluginID (/datasource)
-			name: "nested plugin should return module hash from parent MANIFEST.txt",
-			store: []pluginstore.Plugin{
-				newPlugin(
-					parentPluginID,
-					withSignatureStatus(plugins.SignatureStatusValid),
-					withFS(plugins.NewLocalFS(filepath.Join("testdata", "module-hash-valid-nested"))),
-				),
-			},
-			plugin: newPlugin(
-				pluginID,
-				withSignatureStatus(plugins.SignatureStatusValid),
-				withFS(plugins.NewLocalFS(filepath.Join("testdata", "module-hash-valid-nested", "datasource"))),
-				withParent(parentPluginID),
-			),
-			cdn:           true,
-			features:      &config.Features{SriChecksEnabled: true},
-			expModuleHash: newSRIHash(t, "04d70db091d96c4775fb32ba5a8f84cc22893eb43afdb649726661d4425c6711"),
-		},
-		{
-			// parentPluginID (/)
-			// └── pluginID (/panels/one)
-			name: "nested plugin deeper than one subfolder should return module hash from parent MANIFEST.txt",
-			store: []pluginstore.Plugin{
-				newPlugin(
-					parentPluginID,
-					withSignatureStatus(plugins.SignatureStatusValid),
-					withFS(plugins.NewLocalFS(filepath.Join("testdata", "module-hash-valid-nested"))),
-				),
-			},
-			plugin: newPlugin(
-				pluginID,
-				withSignatureStatus(plugins.SignatureStatusValid),
-				withFS(plugins.NewLocalFS(filepath.Join("testdata", "module-hash-valid-nested", "panels", "one"))),
-				withParent(parentPluginID),
-			),
-			cdn:           true,
-			features:      &config.Features{SriChecksEnabled: true},
-			expModuleHash: newSRIHash(t, "cbd1ac2284645a0e1e9a8722a729f5bcdd2b831222728709c6360beecdd6143f"),
-		},
-		{
-			// grand-parent-app (/)
-			// ├── parent-datasource (/datasource)
-			// │   └── child-panel (/datasource/panels/one)
-			name: "nested plugin of a nested plugin should return module hash from parent MANIFEST.txt",
-			store: []pluginstore.Plugin{
-				newPlugin(
-					"grand-parent-app",
-					withSignatureStatus(plugins.SignatureStatusValid),
-					withFS(plugins.NewLocalFS(filepath.Join("testdata", "module-hash-valid-deeply-nested"))),
-				),
-				newPlugin(
-					"parent-datasource",
-					withSignatureStatus(plugins.SignatureStatusValid),
-					withFS(plugins.NewLocalFS(filepath.Join("testdata", "module-hash-valid-deeply-nested", "datasource"))),
-					withParent("grand-parent-app"),
-				),
-			},
-			plugin: newPlugin(
-				"child-panel",
-				withSignatureStatus(plugins.SignatureStatusValid),
-				withFS(plugins.NewLocalFS(filepath.Join("testdata", "module-hash-valid-deeply-nested", "datasource", "panels", "one"))),
-				withParent("parent-datasource"),
-			),
-			cdn:           true,
-			features:      &config.Features{SriChecksEnabled: true},
-			expModuleHash: newSRIHash(t, "cbd1ac2284645a0e1e9a8722a729f5bcdd2b831222728709c6360beecdd6143f"),
-		},
-		{
-			name:  "nested plugin should not return module hash from parent if it's not registered in the store",
-			store: []pluginstore.Plugin{},
-			plugin: newPlugin(
-				pluginID,
-				withSignatureStatus(plugins.SignatureStatusValid),
-				withFS(plugins.NewLocalFS(filepath.Join("testdata", "module-hash-valid-nested", "panels", "one"))),
-				withParent(parentPluginID),
-			),
-			cdn:           false,
-			features:      &config.Features{SriChecksEnabled: true},
-			expModuleHash: "",
-		},
-		{
-			name: "missing module.js entry from MANIFEST.txt should not return module hash",
-			plugin: newPlugin(
-				pluginID,
-				withSignatureStatus(plugins.SignatureStatusValid),
-				withFS(plugins.NewLocalFS(filepath.Join("testdata", "module-hash-no-module-js"))),
-			),
-			cdn:           false,
-			features:      &config.Features{SriChecksEnabled: true},
-			expModuleHash: "",
-		},
-		{
-			name: "signed status but missing MANIFEST.txt should not return module hash",
-			plugin: newPlugin(
-				pluginID,
-				withSignatureStatus(plugins.SignatureStatusValid),
-				withFS(plugins.NewLocalFS(filepath.Join("testdata", "module-hash-no-manifest-txt"))),
-			),
-			cdn:           false,
-			features:      &config.Features{SriChecksEnabled: true},
-			expModuleHash: "",
-		},
-	} {
-		if tc.name == "" {
-			var expS string
-			if tc.expModuleHash == "" {
-				expS = "should not return module hash"
-			} else {
-				expS = "should return module hash"
-			}
-			tc.name = fmt.Sprintf("feature=%v, cdn_config=%v, class=%v %s", tc.features.SriChecksEnabled, tc.cdn, tc.plugin.Class, expS)
-		}
-
-		t.Run(tc.name, func(t *testing.T) {
-			var pluginSettings config.PluginSettings
-			if tc.cdn {
-				pluginSettings = config.PluginSettings{
-					pluginID: {
-						"cdn": "true",
-					},
-					parentPluginID: map[string]string{
-						"cdn": "true",
-					},
-					"grand-parent-app": map[string]string{
-						"cdn": "true",
-					},
-				}
-			}
-			features := tc.features
-			if features == nil {
-				features = &config.Features{}
-			}
-			pCfg := &config.PluginManagementCfg{
-				PluginsCDNURLTemplate: "http://cdn.example.com",
-				PluginSettings:        pluginSettings,
-				Features:              *features,
-			}
-			svc := ProvideService(
-				pCfg,
-				pluginscdn.ProvideService(pCfg),
-				signature.ProvideService(pCfg, statickey.New()),
-				pluginstore.NewFakePluginStore(tc.store...),
-			)
-			mh := svc.ModuleHash(context.Background(), tc.plugin)
-			require.Equal(t, tc.expModuleHash, mh)
-		})
-	}
-}
-
-func TestService_ModuleHash_Cache(t *testing.T) {
-	pCfg := &config.PluginManagementCfg{
-		PluginSettings: config.PluginSettings{},
-		Features:       config.Features{SriChecksEnabled: true},
-	}
-	svc := ProvideService(
-		pCfg,
-		pluginscdn.ProvideService(pCfg),
-		signature.ProvideService(pCfg, statickey.New()),
-		pluginstore.NewFakePluginStore(),
-	)
-	const pluginID = "grafana-test-datasource"
-
-	t.Run("cache key", func(t *testing.T) {
-		t.Run("with version", func(t *testing.T) {
-			const pluginVersion = "1.0.0"
-			p := newPlugin(pluginID, withInfo(plugins.Info{Version: pluginVersion}))
-			k := svc.moduleHashCacheKey(p)
-			require.Equal(t, pluginID+":"+pluginVersion, k, "cache key should be correct")
-		})
-
-		t.Run("without version", func(t *testing.T) {
-			p := newPlugin(pluginID)
-			k := svc.moduleHashCacheKey(p)
-			require.Equal(t, pluginID+":", k, "cache key should be correct")
-		})
-	})
-
-	t.Run("ModuleHash usage", func(t *testing.T) {
-		pV1 := newPlugin(
-			pluginID,
-			withInfo(plugins.Info{Version: "1.0.0"}),
-			withSignatureStatus(plugins.SignatureStatusValid),
-			withFS(plugins.NewLocalFS(filepath.Join("testdata", "module-hash-valid"))),
-		)
-
-		pCfg = &config.PluginManagementCfg{
-			PluginsCDNURLTemplate: "https://cdn.grafana.com",
-			PluginSettings: config.PluginSettings{
-				pluginID: {
-					"cdn": "true",
-				},
-			},
-			Features: config.Features{SriChecksEnabled: true},
-		}
-		svc = ProvideService(
-			pCfg,
-			pluginscdn.ProvideService(pCfg),
-			signature.ProvideService(pCfg, statickey.New()),
-			pluginstore.NewFakePluginStore(),
-		)
-
-		k := svc.moduleHashCacheKey(pV1)
-
-		_, ok := svc.moduleHashCache.Load(k)
-		require.False(t, ok, "cache should initially be empty")
-
-		mhV1 := svc.ModuleHash(context.Background(), pV1)
-		pV1Exp := newSRIHash(t, "5891b5b522d5df086d0ff0b110fbd9d21bb4fc7163af34d08286a2e846f6be03")
-		require.Equal(t, pV1Exp, mhV1, "returned value should be correct")
-
-		cachedMh, ok := svc.moduleHashCache.Load(k)
-		require.True(t, ok)
-		require.Equal(t, pV1Exp, cachedMh, "cache should contain the returned value")
-
-		t.Run("different version uses different cache key", func(t *testing.T) {
-			pV2 := newPlugin(
-				pluginID,
-				withInfo(plugins.Info{Version: "2.0.0"}),
-				withSignatureStatus(plugins.SignatureStatusValid),
-				// different fs for different hash
-				withFS(plugins.NewLocalFS(filepath.Join("testdata", "module-hash-valid-nested"))),
-			)
-			mhV2 := svc.ModuleHash(context.Background(), pV2)
-			require.NotEqual(t, mhV2, mhV1, "different version should have different hash")
-			require.Equal(t, newSRIHash(t, "266c19bc148b22ddef2a288fc5f8f40855bda22ccf60be53340b4931e469ae2a"), mhV2)
-		})
-
-		t.Run("cache should be used", func(t *testing.T) {
-			// edit cache directly
-			svc.moduleHashCache.Store(k, "hax")
-			require.Equal(t, "hax", svc.ModuleHash(context.Background(), pV1))
-		})
-	})
-}
-
-func TestConvertHashFromSRI(t *testing.T) {
-	for _, tc := range []struct {
-		hash    string
-		expHash string
-		expErr  bool
-	}{
-		{
-			hash:    "ddfcb449445064e6c39f0c20b15be3cb6a55837cf4781df23d02de005f436811",
-			expHash: "sha256-3fy0SURQZObDnwwgsVvjy2pVg3z0eB3yPQLeAF9DaBE=",
-		},
-		{
-			hash:   "not-a-valid-hash",
-			expErr: true,
-		},
-	} {
-		t.Run(tc.hash, func(t *testing.T) {
-			r, err := convertHashForSRI(tc.hash)
-			if tc.expErr {
-				require.Error(t, err)
-			} else {
-				require.NoError(t, err)
-				require.Equal(t, tc.expHash, r)
-			}
-		})
-	}
-}
-
 func newPlugin(pluginID string, cbs ...func(p pluginstore.Plugin) pluginstore.Plugin) pluginstore.Plugin {
 	p := pluginstore.Plugin{
 		JSONData: plugins.JSONData{
@@ -534,13 +186,6 @@ func newPlugin(pluginID string, cbs ...func(p pluginstore.Plugin) pluginstore.Pl
 	return p
 }
 
-func withInfo(info plugins.Info) func(p pluginstore.Plugin) pluginstore.Plugin {
-	return func(p pluginstore.Plugin) pluginstore.Plugin {
-		p.Info = info
-		return p
-	}
-}
-
 func withFS(fs plugins.FS) func(p pluginstore.Plugin) pluginstore.Plugin {
 	return func(p pluginstore.Plugin) pluginstore.Plugin {
 		p.FS = fs
@@ -548,13 +193,6 @@ func withFS(fs plugins.FS) func(p pluginstore.Plugin) pluginstore.Plugin {
 	}
 }
 
-func withSignatureStatus(status plugins.SignatureStatus) func(p pluginstore.Plugin) pluginstore.Plugin {
-	return func(p pluginstore.Plugin) pluginstore.Plugin {
-		p.Signature = status
-		return p
-	}
-}
-
 func withAngular(angular bool) func(p pluginstore.Plugin) pluginstore.Plugin {
 	return func(p pluginstore.Plugin) pluginstore.Plugin {
 		p.Angular = plugins.AngularMeta{Detected: angular}
@@ -562,13 +200,6 @@ func withAngular(angular bool) func(p pluginstore.Plugin) pluginstore.Plugin {
 	}
 }
 
-func withParent(parentID string) func(p pluginstore.Plugin) pluginstore.Plugin {
-	return func(p pluginstore.Plugin) pluginstore.Plugin {
-		p.Parent = &pluginstore.ParentPlugin{ID: parentID}
-		return p
-	}
-}
-
 func withClass(class plugins.Class) func(p pluginstore.Plugin) pluginstore.Plugin {
 	return func(p pluginstore.Plugin) pluginstore.Plugin {
 		p.Class = class
@@ -587,9 +218,3 @@ func newPluginSettings(pluginID string, kv map[string]string) config.PluginSetti
 		pluginID: kv,
 	}
 }
 
-func newSRIHash(t *testing.T, s string) string {
-	r, err := convertHashForSRI(s)
-	require.NoError(t, err)
-	return r
-}
@@ -131,6 +131,7 @@ var WireSet = wire.NewSet(
 	plugincontext.ProvideBaseService,
 	wire.Bind(new(plugincontext.BasePluginContextProvider), new(*plugincontext.BaseProvider)),
 	plugininstaller.ProvideService,
+	pluginassets.ProvideModuleHashCalculator,
 	pluginassets.ProvideService,
 	pluginchecker.ProvidePreinstall,
 	wire.Bind(new(pluginchecker.Preinstall), new(*pluginchecker.PreinstallImpl)),
@@ -14,7 +14,6 @@ import (
 	"github.com/grafana/grafana/pkg/apimachinery/validation"
 	"github.com/grafana/grafana/pkg/storage/unified/sql/db"
 	"github.com/grafana/grafana/pkg/storage/unified/sql/dbutil"
-	"github.com/grafana/grafana/pkg/storage/unified/sql/rvmanager"
 	"github.com/grafana/grafana/pkg/storage/unified/sql/sqltemplate"
 	gocache "github.com/patrickmn/go-cache"
 )
@@ -869,18 +868,10 @@ func (d *dataStore) applyBackwardsCompatibleChanges(ctx context.Context, tx db.T
 	if key.Action == DataActionDeleted {
 		generation = 0
 	}
 
-	// In compatibility mode, the previous RV, when available, is saved as a microsecond
-	// timestamp, as is done in the SQL backend.
-	previousRV := event.PreviousRV
-	if event.PreviousRV > 0 && isSnowflake(event.PreviousRV) {
-		previousRV = rvmanager.RVFromSnowflake(event.PreviousRV)
-	}
-
 	_, err := dbutil.Exec(ctx, tx, sqlKVUpdateLegacyResourceHistory, sqlKVLegacyUpdateHistoryRequest{
 		SQLTemplate: sqltemplate.New(kv.dialect),
 		GUID:        key.GUID,
-		PreviousRV:  previousRV,
+		PreviousRV:  event.PreviousRV,
 		Generation:  generation,
 	})
 
@@ -909,7 +900,7 @@ func (d *dataStore) applyBackwardsCompatibleChanges(ctx context.Context, tx db.T
 		Name:       key.Name,
 		Action:     action,
 		Folder:     key.Folder,
-		PreviousRV: previousRV,
+		PreviousRV: event.PreviousRV,
 	})
 
 	if err != nil {
@@ -925,7 +916,7 @@ func (d *dataStore) applyBackwardsCompatibleChanges(ctx context.Context, tx db.T
 		Name:       key.Name,
 		Action:     action,
 		Folder:     key.Folder,
-		PreviousRV: previousRV,
+		PreviousRV: event.PreviousRV,
 	})
 
 	if err != nil {
@@ -947,15 +938,3 @@ func (d *dataStore) applyBackwardsCompatibleChanges(ctx context.Context, tx db.T
 
 	return nil
 }
 
-// isSnowflake returns whether the argument passed is a snowflake ID (new) or a microsecond timestamp (old).
-// We try to interpret the number as a microsecond timestamp first. If it represents a time in the past,
-// it is considered a microsecond timestamp. Snowflake IDs are much larger integers and would lead
-// to dates in the future if interpreted as a microsecond timestamp.
-func isSnowflake(rv int64) bool {
-	ts := time.UnixMicro(rv)
-	oneHourFromNow := time.Now().Add(time.Hour)
-	isMicroSecRV := ts.Before(oneHourFromNow)
-
-	return !isMicroSecRV
-}
@@ -19,18 +19,13 @@ const (
 	defaultBufferSize = 10000
 )
 
-type notifier interface {
-	Watch(context.Context, watchOptions) <-chan Event
-}
-
-type pollingNotifier struct {
+type notifier struct {
 	eventStore *eventStore
 	log        logging.Logger
 }
 
 type notifierOptions struct {
 	log logging.Logger
-	useChannelNotifier bool
 }
 
 type watchOptions struct {
@@ -49,26 +44,15 @@ func defaultWatchOptions() watchOptions {
 	}
 }
 
-func newNotifier(eventStore *eventStore, opts notifierOptions) notifier {
+func newNotifier(eventStore *eventStore, opts notifierOptions) *notifier {
 	if opts.log == nil {
 		opts.log = &logging.NoOpLogger{}
 	}
-	if opts.useChannelNotifier {
-		return &channelNotifier{}
-	}
-
-	return &pollingNotifier{eventStore: eventStore, log: opts.log}
-}
-
-type channelNotifier struct{}
-
-func (cn *channelNotifier) Watch(ctx context.Context, opts watchOptions) <-chan Event {
-	return nil
+	return &notifier{eventStore: eventStore, log: opts.log}
 }
 
 // Return the last resource version from the event store
-func (n *pollingNotifier) lastEventResourceVersion(ctx context.Context) (int64, error) {
+func (n *notifier) lastEventResourceVersion(ctx context.Context) (int64, error) {
 	e, err := n.eventStore.LastEventKey(ctx)
 	if err != nil {
 		return 0, err
@@ -76,11 +60,11 @@ func (n *pollingNotifier) lastEventResourceVersion(ctx context.Context) (int64,
 	return e.ResourceVersion, nil
 }
 
-func (n *pollingNotifier) cacheKey(evt Event) string {
+func (n *notifier) cacheKey(evt Event) string {
 	return fmt.Sprintf("%s~%s~%s~%s~%d", evt.Namespace, evt.Group, evt.Resource, evt.Name, evt.ResourceVersion)
 }
 
-func (n *pollingNotifier) Watch(ctx context.Context, opts watchOptions) <-chan Event {
+func (n *notifier) Watch(ctx context.Context, opts watchOptions) <-chan Event {
 	if opts.MinBackoff <= 0 {
 		opts.MinBackoff = defaultMinBackoff
 	}
@@ -13,7 +13,7 @@ import (
|
|||||||
"github.com/stretchr/testify/require"
|
"github.com/stretchr/testify/require"
|
||||||
)
|
)
|
||||||
|
|
||||||
func setupTestNotifier(t *testing.T) (*pollingNotifier, *eventStore) {
|
func setupTestNotifier(t *testing.T) (*notifier, *eventStore) {
|
||||||
db := setupTestBadgerDB(t)
|
db := setupTestBadgerDB(t)
|
||||||
t.Cleanup(func() {
|
t.Cleanup(func() {
|
||||||
err := db.Close()
|
err := db.Close()
|
||||||
@@ -22,10 +22,10 @@ func setupTestNotifier(t *testing.T) (*pollingNotifier, *eventStore) {
|
|||||||
kv := NewBadgerKV(db)
|
kv := NewBadgerKV(db)
|
||||||
eventStore := newEventStore(kv)
|
eventStore := newEventStore(kv)
|
||||||
notifier := newNotifier(eventStore, notifierOptions{log: &logging.NoOpLogger{}})
|
notifier := newNotifier(eventStore, notifierOptions{log: &logging.NoOpLogger{}})
|
||||||
return notifier.(*pollingNotifier), eventStore
|
return notifier, eventStore
|
||||||
}
|
}
|
||||||
|
|
||||||
func setupTestNotifierSqlKv(t *testing.T) (*pollingNotifier, *eventStore) {
|
func setupTestNotifierSqlKv(t *testing.T) (*notifier, *eventStore) {
|
||||||
dbstore := db.InitTestDB(t)
|
dbstore := db.InitTestDB(t)
|
||||||
eDB, err := dbimpl.ProvideResourceDB(dbstore, setting.NewCfg(), nil)
|
eDB, err := dbimpl.ProvideResourceDB(dbstore, setting.NewCfg(), nil)
|
||||||
require.NoError(t, err)
|
require.NoError(t, err)
|
||||||
@@ -33,7 +33,7 @@ func setupTestNotifierSqlKv(t *testing.T) (*pollingNotifier, *eventStore) {
|
|||||||
require.NoError(t, err)
|
require.NoError(t, err)
|
||||||
eventStore := newEventStore(kv)
|
eventStore := newEventStore(kv)
|
||||||
notifier := newNotifier(eventStore, notifierOptions{log: &logging.NoOpLogger{}})
|
notifier := newNotifier(eventStore, notifierOptions{log: &logging.NoOpLogger{}})
|
||||||
return notifier.(*pollingNotifier), eventStore
|
return notifier, eventStore
|
||||||
}
|
}
|
||||||
|
|
||||||
func TestNewNotifier(t *testing.T) {
|
func TestNewNotifier(t *testing.T) {
|
||||||
@@ -49,7 +49,7 @@ func TestDefaultWatchOptions(t *testing.T) {
|
|||||||
assert.Equal(t, defaultBufferSize, opts.BufferSize)
|
assert.Equal(t, defaultBufferSize, opts.BufferSize)
|
||||||
}
|
}
|
||||||
|
|
||||||
func runNotifierTestWith(t *testing.T, storeName string, newStoreFn func(*testing.T) (*pollingNotifier, *eventStore), testFn func(*testing.T, context.Context, *pollingNotifier, *eventStore)) {
|
func runNotifierTestWith(t *testing.T, storeName string, newStoreFn func(*testing.T) (*notifier, *eventStore), testFn func(*testing.T, context.Context, *notifier, *eventStore)) {
|
||||||
t.Run(storeName, func(t *testing.T) {
|
t.Run(storeName, func(t *testing.T) {
|
||||||
ctx := context.Background()
|
ctx := context.Background()
|
||||||
notifier, eventStore := newStoreFn(t)
|
notifier, eventStore := newStoreFn(t)
|
||||||
@@ -62,7 +62,7 @@ func TestNotifier_lastEventResourceVersion(t *testing.T) {
|
|||||||
runNotifierTestWith(t, "sqlkv", setupTestNotifierSqlKv, testNotifierLastEventResourceVersion)
|
runNotifierTestWith(t, "sqlkv", setupTestNotifierSqlKv, testNotifierLastEventResourceVersion)
|
||||||
}
|
}
|
||||||
|
|
||||||
func testNotifierLastEventResourceVersion(t *testing.T, ctx context.Context, notifier *pollingNotifier, eventStore *eventStore) {
|
func testNotifierLastEventResourceVersion(t *testing.T, ctx context.Context, notifier *notifier, eventStore *eventStore) {
|
||||||
// Test with no events
|
// Test with no events
|
||||||
rv, err := notifier.lastEventResourceVersion(ctx)
|
rv, err := notifier.lastEventResourceVersion(ctx)
|
||||||
assert.Error(t, err)
|
assert.Error(t, err)
|
||||||
@@ -113,7 +113,7 @@ func TestNotifier_cachekey(t *testing.T) {
|
|||||||
runNotifierTestWith(t, "sqlkv", setupTestNotifierSqlKv, testNotifierCachekey)
|
runNotifierTestWith(t, "sqlkv", setupTestNotifierSqlKv, testNotifierCachekey)
|
||||||
}
|
}
|
||||||
|
|
||||||
func testNotifierCachekey(t *testing.T, ctx context.Context, notifier *pollingNotifier, eventStore *eventStore) {
|
func testNotifierCachekey(t *testing.T, ctx context.Context, notifier *notifier, eventStore *eventStore) {
|
||||||
tests := []struct {
|
tests := []struct {
|
||||||
name string
|
name string
|
||||||
event Event
|
event Event
|
||||||
@@ -167,7 +167,7 @@ func TestNotifier_Watch_NoEvents(t *testing.T) {
|
|||||||
runNotifierTestWith(t, "sqlkv", setupTestNotifierSqlKv, testNotifierWatchNoEvents)
|
runNotifierTestWith(t, "sqlkv", setupTestNotifierSqlKv, testNotifierWatchNoEvents)
|
||||||
}
|
}
|
||||||
|
|
||||||
func testNotifierWatchNoEvents(t *testing.T, ctx context.Context, notifier *pollingNotifier, eventStore *eventStore) {
|
func testNotifierWatchNoEvents(t *testing.T, ctx context.Context, notifier *notifier, eventStore *eventStore) {
|
||||||
ctx, cancel := context.WithTimeout(ctx, 500*time.Millisecond)
|
ctx, cancel := context.WithTimeout(ctx, 500*time.Millisecond)
|
||||||
defer cancel()
|
defer cancel()
|
||||||
|
|
||||||
@@ -208,7 +208,7 @@ func TestNotifier_Watch_WithExistingEvents(t *testing.T) {
|
|||||||
runNotifierTestWith(t, "sqlkv", setupTestNotifierSqlKv, testNotifierWatchWithExistingEvents)
|
runNotifierTestWith(t, "sqlkv", setupTestNotifierSqlKv, testNotifierWatchWithExistingEvents)
|
||||||
}
|
}
|
||||||
|
|
||||||
func testNotifierWatchWithExistingEvents(t *testing.T, ctx context.Context, notifier *pollingNotifier, eventStore *eventStore) {
|
func testNotifierWatchWithExistingEvents(t *testing.T, ctx context.Context, notifier *notifier, eventStore *eventStore) {
|
||||||
ctx, cancel := context.WithTimeout(ctx, 2*time.Second)
|
ctx, cancel := context.WithTimeout(ctx, 2*time.Second)
|
||||||
defer cancel()
|
defer cancel()
|
||||||
|
|
||||||
@@ -282,7 +282,7 @@ func TestNotifier_Watch_EventDeduplication(t *testing.T) {
|
|||||||
runNotifierTestWith(t, "sqlkv", setupTestNotifierSqlKv, testNotifierWatchEventDeduplication)
|
runNotifierTestWith(t, "sqlkv", setupTestNotifierSqlKv, testNotifierWatchEventDeduplication)
|
||||||
}
|
}
|
||||||
|
|
||||||
func testNotifierWatchEventDeduplication(t *testing.T, ctx context.Context, notifier *pollingNotifier, eventStore *eventStore) {
|
func testNotifierWatchEventDeduplication(t *testing.T, ctx context.Context, notifier *notifier, eventStore *eventStore) {
|
||||||
ctx, cancel := context.WithTimeout(ctx, 2*time.Second)
|
ctx, cancel := context.WithTimeout(ctx, 2*time.Second)
|
||||||
defer cancel()
|
defer cancel()
|
||||||
|
|
||||||
@@ -348,7 +348,7 @@ func TestNotifier_Watch_ContextCancellation(t *testing.T) {
 	runNotifierTestWith(t, "sqlkv", setupTestNotifierSqlKv, testNotifierWatchContextCancellation)
 }
 
-func testNotifierWatchContextCancellation(t *testing.T, ctx context.Context, notifier *pollingNotifier, eventStore *eventStore) {
+func testNotifierWatchContextCancellation(t *testing.T, ctx context.Context, notifier *notifier, eventStore *eventStore) {
 	ctx, cancel := context.WithCancel(ctx)
 
 	// Add an initial event so that lastEventResourceVersion doesn't return ErrNotFound
@@ -394,7 +394,7 @@ func TestNotifier_Watch_MultipleEvents(t *testing.T) {
 	runNotifierTestWith(t, "sqlkv", setupTestNotifierSqlKv, testNotifierWatchMultipleEvents)
 }
 
-func testNotifierWatchMultipleEvents(t *testing.T, ctx context.Context, notifier *pollingNotifier, eventStore *eventStore) {
+func testNotifierWatchMultipleEvents(t *testing.T, ctx context.Context, notifier *notifier, eventStore *eventStore) {
 	ctx, cancel := context.WithTimeout(ctx, 3*time.Second)
 	defer cancel()
 	rv := time.Now().UnixNano()
@@ -456,27 +456,33 @@ func testNotifierWatchMultipleEvents(t *testing.T, ctx context.Context, notifier
 		},
 	}
 
-	errCh := make(chan error)
 	go func() {
 		for _, event := range testEvents {
-			errCh <- eventStore.Save(ctx, event)
+			err := eventStore.Save(ctx, event)
+			require.NoError(t, err)
 		}
 	}()
 
 	// Receive events
-	receivedEvents := make([]string, 0, len(testEvents))
-	for len(receivedEvents) != len(testEvents) {
+	receivedEvents := make([]Event, 0, len(testEvents))
+	for i := 0; i < len(testEvents); i++ {
 		select {
 		case event := <-events:
-			receivedEvents = append(receivedEvents, event.Name)
+			receivedEvents = append(receivedEvents, event)
-		case err := <-errCh:
-			require.NoError(t, err)
 		case <-time.After(1 * time.Second):
-			t.Fatalf("Timed out waiting for event %d", len(receivedEvents)+1)
+			t.Fatalf("Timed out waiting for event %d", i+1)
 		}
 	}
+
+	// Verify all events were received
+	assert.Len(t, receivedEvents, len(testEvents))
 
 	// Verify the events match and ordered by resource version
+	receivedNames := make([]string, len(receivedEvents))
+	for i, event := range receivedEvents {
+		receivedNames[i] = event.Name
+	}
+
 	expectedNames := []string{"test-resource-1", "test-resource-2", "test-resource-3"}
-	assert.ElementsMatch(t, expectedNames, receivedEvents)
+	assert.ElementsMatch(t, expectedNames, receivedNames)
 }
@@ -473,6 +473,8 @@ func (k *sqlKV) Delete(ctx context.Context, section string, key string) error {
 		return ErrNotFound
 	}
 
+	// TODO reflect change to resource table
+
 	return nil
 }
 
@@ -61,7 +61,7 @@ type kvStorageBackend struct {
 	bulkLock   *BulkLock
 	dataStore  *dataStore
 	eventStore *eventStore
-	notifier   notifier
+	notifier   *notifier
 	builder    DocumentBuilder
 	log        logging.Logger
 	withPruner bool
@@ -91,7 +91,6 @@ type KVBackendOptions struct {
 	Tracer trace.Tracer          // TODO add tracing
 	Reg    prometheus.Registerer // TODO add metrics
 
-	UseChannelNotifier bool
 	// Adding RvManager overrides the RV generated with snowflake in order to keep backwards compatibility with
 	// unified/sql
 	RvManager *rvmanager.ResourceVersionManager
@@ -122,7 +121,7 @@ func NewKVStorageBackend(opts KVBackendOptions) (KVBackend, error) {
 		bulkLock:   NewBulkLock(),
 		dataStore:  newDataStore(kv),
 		eventStore: eventStore,
-		notifier:   newNotifier(eventStore, notifierOptions{useChannelNotifier: opts.UseChannelNotifier}),
+		notifier:   newNotifier(eventStore, notifierOptions{}),
 		snowflake:  s,
 		builder:    StandardDocumentBuilder(), // For now we use the standard document builder.
 		log:        &logging.NoOpLogger{},     // Make this configurable
@@ -347,7 +346,7 @@ func (k *kvStorageBackend) WriteEvent(ctx context.Context, event WriteEvent) (in
 			return 0, fmt.Errorf("failed to write data: %w", err)
 		}
 
-		rv = rvmanager.SnowflakeFromRV(rv)
+		rv = rvmanager.SnowflakeFromRv(rv)
 		dataKey.ResourceVersion = rv
 	} else {
 		err := k.dataStore.Save(ctx, dataKey, bytes.NewReader(event.Value))
@@ -307,7 +307,7 @@ func (m *ResourceVersionManager) execBatch(ctx context.Context, group, resource
 	// Allocate the RVs
 	for i, guid := range guids {
 		guidToRV[guid] = rv
-		guidToSnowflakeRV[guid] = SnowflakeFromRV(rv)
+		guidToSnowflakeRV[guid] = SnowflakeFromRv(rv)
 		rvs[i] = rv
 		rv++
 	}
@@ -364,20 +364,12 @@ func (m *ResourceVersionManager) execBatch(ctx context.Context, group, resource
 	}
 }
 
-// takes a unix microsecond RV and transforms into a snowflake format. The timestamp is converted from microsecond to
+// takes a unix microsecond rv and transforms into a snowflake format. The timestamp is converted from microsecond to
 // millisecond (the integer division) and the remainder is saved in the stepbits section. machine id is always 0
-func SnowflakeFromRV(rv int64) int64 {
+func SnowflakeFromRv(rv int64) int64 {
 	return (((rv / 1000) - snowflake.Epoch) << (snowflake.NodeBits + snowflake.StepBits)) + (rv % 1000)
 }
 
-// It is generally not possible to convert from a snowflakeID to a microsecond RV due to the loss in precision
-// (snowflake ID stores timestamp in milliseconds). However, this implementation stores the microsecond fraction
-// in the step bits (see SnowflakeFromRV), allowing us to compute the microsecond timestamp.
-func RVFromSnowflake(snowflakeID int64) int64 {
-	microSecFraction := snowflakeID & ((1 << snowflake.StepBits) - 1)
-	return ((snowflakeID>>(snowflake.NodeBits+snowflake.StepBits))+snowflake.Epoch)*1000 + microSecFraction
-}
-
 // helper utility to compare two RVs. The first RV must be in snowflake format. Will convert rv2 to snowflake and retry
 // if comparison fails
 func IsRvEqual(rv1, rv2 int64) bool {
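The packing arithmetic kept (and renamed) above divides a microsecond RV into a millisecond timestamp field plus a sub-millisecond remainder in the step bits. A self-contained sketch of that round trip, using assumed stand-in constants (the real code reads `snowflake.Epoch`, `snowflake.NodeBits`, and `snowflake.StepBits` from the snowflake package, whose values may differ):

```go
package main

import "fmt"

// Assumed stand-ins for the snowflake package constants.
const (
	epoch    int64 = 1288834974657 // epoch in milliseconds (illustrative value)
	nodeBits       = 10
	stepBits       = 12
)

// snowflakeFromRv mirrors the formula in the diff: the millisecond part of a
// microsecond RV becomes the timestamp field, the sub-millisecond remainder
// (rv % 1000, always < 1000 < 2^stepBits) lands in the step bits, machine id 0.
func snowflakeFromRv(rv int64) int64 {
	return (((rv / 1000) - epoch) << (nodeBits + stepBits)) + (rv % 1000)
}

// rvFromSnowflake inverts the packing, as the RVFromSnowflake helper removed
// by this diff did: recover the remainder from the step bits, then rebuild
// the microsecond timestamp.
func rvFromSnowflake(id int64) int64 {
	microFraction := id & ((1 << stepBits) - 1)
	return ((id>>(nodeBits+stepBits))+epoch)*1000 + microFraction
}

func main() {
	rv := int64(1768246438806211) // a microsecond unix timestamp
	fmt.Println(rvFromSnowflake(snowflakeFromRv(rv)) == rv) // true
}
```

The round trip is exact because the microsecond remainder is strictly less than 1000 and therefore never overflows the step-bits field.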
@@ -385,7 +377,7 @@ func IsRvEqual(rv1, rv2 int64) bool {
 		return true
 	}
 
-	return rv1 == SnowflakeFromRV(rv2)
+	return rv1 == SnowflakeFromRv(rv2)
 }
 
 // Lock locks the resource version for the given key
@@ -63,13 +63,3 @@ func TestResourceVersionManager(t *testing.T) {
 		require.Equal(t, rv, int64(200))
 	})
 }
-
-func TestSnowflakeFromRVRoundtrips(t *testing.T) {
-	// 2026-01-12 19:33:58.806211 +0000 UTC
-	offset := int64(1768246438806211) // in microseconds
-
-	for n := range int64(100) {
-		ts := offset + n
-		require.Equal(t, ts, RVFromSnowflake(SnowflakeFromRV(ts)))
-	}
-}
@@ -99,9 +99,6 @@ func NewResourceServer(opts ServerOptions) (resource.ResourceServer, error) {
 		return nil, err
 	}
 
-	isHA := isHighAvailabilityEnabled(opts.Cfg.SectionWithEnvOverrides("database"),
-		opts.Cfg.SectionWithEnvOverrides("resource_api"))
-
 	if opts.Cfg.EnableSQLKVBackend {
 		sqlkv, err := resource.NewSQLKV(eDB)
 		if err != nil {
@@ -109,10 +106,9 @@ func NewResourceServer(opts ServerOptions) (resource.ResourceServer, error) {
 		}
 
 		kvBackendOpts := resource.KVBackendOptions{
 			KvStore: sqlkv,
 			Tracer:  opts.Tracer,
 			Reg:     opts.Reg,
-			UseChannelNotifier: !isHA,
 		}
 
 		ctx := context.Background()
@@ -144,6 +140,9 @@ func NewResourceServer(opts ServerOptions) (resource.ResourceServer, error) {
 		serverOptions.Backend = kvBackend
 		serverOptions.Diagnostics = kvBackend
 	} else {
+		isHA := isHighAvailabilityEnabled(opts.Cfg.SectionWithEnvOverrides("database"),
+			opts.Cfg.SectionWithEnvOverrides("resource_api"))
+
 		backend, err := NewBackend(BackendOptions{
 			DBProvider: eDB,
 			Reg:        opts.Reg,
@@ -200,7 +200,7 @@ func verifyKeyPath(t *testing.T, db sqldb.DB, ctx context.Context, key *resource
 	var keyPathRV int64
 	if isSqlBackend {
 		// Convert microsecond RV to snowflake for key_path construction
-		keyPathRV = rvmanager.SnowflakeFromRV(resourceVersion)
+		keyPathRV = rvmanager.SnowflakeFromRv(resourceVersion)
 	} else {
 		// KV backend already provides snowflake RV
 		keyPathRV = resourceVersion
@@ -434,6 +434,9 @@ func verifyResourceHistoryTable(t *testing.T, db sqldb.DB, namespace string, res
 
 	rows, err := db.QueryContext(ctx, query, namespace)
 	require.NoError(t, err)
+	defer func() {
+		_ = rows.Close()
+	}()
 
 	var records []ResourceHistoryRecord
 	for rows.Next() {
@@ -457,34 +460,33 @@ func verifyResourceHistoryTable(t *testing.T, db sqldb.DB, namespace string, res
 	for resourceIdx, res := range resources {
 		// Check create record (action=1, generation=1)
 		createRecord := records[recordIndex]
-		verifyResourceHistoryRecord(t, createRecord, namespace, res, resourceIdx, 1, 0, 1, resourceVersions[resourceIdx][0])
+		verifyResourceHistoryRecord(t, createRecord, res, resourceIdx, 1, 0, 1, resourceVersions[resourceIdx][0])
 		recordIndex++
 	}
 
 	for resourceIdx, res := range resources {
 		// Check update record (action=2, generation=2)
 		updateRecord := records[recordIndex]
-		verifyResourceHistoryRecord(t, updateRecord, namespace, res, resourceIdx, 2, resourceVersions[resourceIdx][0], 2, resourceVersions[resourceIdx][1])
+		verifyResourceHistoryRecord(t, updateRecord, res, resourceIdx, 2, resourceVersions[resourceIdx][0], 2, resourceVersions[resourceIdx][1])
 		recordIndex++
 	}
 
 	for resourceIdx, res := range resources[:2] {
 		// Check delete record (action=3, generation=0) - only first 2 resources were deleted
 		deleteRecord := records[recordIndex]
-		verifyResourceHistoryRecord(t, deleteRecord, namespace, res, resourceIdx, 3, resourceVersions[resourceIdx][1], 0, resourceVersions[resourceIdx][2])
+		verifyResourceHistoryRecord(t, deleteRecord, res, resourceIdx, 3, resourceVersions[resourceIdx][1], 0, resourceVersions[resourceIdx][2])
 		recordIndex++
 	}
 }
 
 // verifyResourceHistoryRecord validates a single resource_history record
-func verifyResourceHistoryRecord(t *testing.T, record ResourceHistoryRecord, namespace string, expectedRes struct{ name, folder string }, resourceIdx, expectedAction int, expectedPrevRV int64, expectedGeneration int, expectedRV int64) {
+func verifyResourceHistoryRecord(t *testing.T, record ResourceHistoryRecord, expectedRes struct{ name, folder string }, resourceIdx, expectedAction int, expectedPrevRV int64, expectedGeneration int, expectedRV int64) {
 	// Validate GUID (should be non-empty)
 	require.NotEmpty(t, record.GUID, "GUID should not be empty")
 
 	// Validate group/resource/namespace/name
 	require.Equal(t, "playlist.grafana.app", record.Group)
 	require.Equal(t, "playlists", record.Resource)
-	require.Equal(t, namespace, record.Namespace)
 	require.Equal(t, expectedRes.name, record.Name)
 
 	// Validate value contains expected JSON - server modifies/formats the JSON differently for different operations
@@ -511,12 +513,8 @@ func verifyResourceHistoryRecord(t *testing.T, record ResourceHistoryRecord, nam
 	// For KV backend operations, expectedPrevRV is now in snowflake format (returned by KV backend)
 	// but resource_history table stores microsecond RV, so we need to use IsRvEqual for comparison
 	if strings.Contains(record.Namespace, "-kv") {
-		if expectedPrevRV == 0 {
-			require.Zero(t, record.PreviousResourceVersion)
-		} else {
-			require.Equal(t, expectedPrevRV, rvmanager.SnowflakeFromRV(record.PreviousResourceVersion),
-				"Previous resource version should match (KV backend snowflake format)")
-		}
+		require.True(t, rvmanager.IsRvEqual(expectedPrevRV, record.PreviousResourceVersion),
+			"Previous resource version should match (KV backend snowflake format)")
 	} else {
 		require.Equal(t, expectedPrevRV, record.PreviousResourceVersion)
 	}
@@ -548,6 +546,9 @@ func verifyResourceTable(t *testing.T, db sqldb.DB, namespace string, resources
 
 	rows, err := db.QueryContext(ctx, query, namespace)
 	require.NoError(t, err)
+	defer func() {
+		_ = rows.Close()
+	}()
 
 	var records []ResourceRecord
 	for rows.Next() {
@@ -611,6 +612,9 @@ func verifyResourceVersionTable(t *testing.T, db sqldb.DB, namespace string, res
 	// Check that we have exactly one entry for playlist.grafana.app/playlists
 	rows, err := db.QueryContext(ctx, query, "playlist.grafana.app", "playlists")
 	require.NoError(t, err)
+	defer func() {
+		_ = rows.Close()
+	}()
 
 	var records []ResourceVersionRecord
 	for rows.Next() {
@@ -645,7 +649,7 @@ func verifyResourceVersionTable(t *testing.T, db sqldb.DB, namespace string, res
 	isKvBackend := strings.Contains(namespace, "-kv")
 	recordResourceVersion := record.ResourceVersion
 	if isKvBackend {
-		recordResourceVersion = rvmanager.SnowflakeFromRV(record.ResourceVersion)
+		recordResourceVersion = rvmanager.SnowflakeFromRv(record.ResourceVersion)
 	}
 
 	require.Less(t, recordResourceVersion, int64(9223372036854775807), "resource_version should be reasonable")
@@ -837,20 +841,24 @@ func runMixedConcurrentOperations(t *testing.T, sqlServer, kvServer resource.Res
 	}
 
 	// SQL backend operations
-	wg.Go(func() {
+	wg.Add(1)
+	go func() {
+		defer wg.Done()
 		<-startBarrier // Wait for signal to start
 		if err := runBackendOperationsWithCounts(ctx, sqlServer, namespace+"-sql", "sql", opCounts); err != nil {
 			errors <- fmt.Errorf("SQL backend operations failed: %w", err)
 		}
-	})
+	}()
 
 	// KV backend operations
-	wg.Go(func() {
+	wg.Add(1)
+	go func() {
+		defer wg.Done()
 		<-startBarrier // Wait for signal to start
 		if err := runBackendOperationsWithCounts(ctx, kvServer, namespace+"-kv", "kv", opCounts); err != nil {
 			errors <- fmt.Errorf("KV backend operations failed: %w", err)
 		}
-	})
+	}()
 
 	// Start both goroutines simultaneously
 	close(startBarrier)
@@ -8,7 +8,6 @@ import (
 	"github.com/stretchr/testify/require"
 
 	"github.com/grafana/grafana/pkg/storage/unified/resource"
-	"github.com/grafana/grafana/pkg/util/testutil"
 )
 
 func TestBadgerKVStorageBackend(t *testing.T) {
@@ -37,9 +36,7 @@ func TestBadgerKVStorageBackend(t *testing.T) {
 	})
 }
 
-func TestIntegrationSQLKVStorageBackend(t *testing.T) {
-	testutil.SkipIntegrationTestInShortMode(t)
-
+func TestSQLKVStorageBackend(t *testing.T) {
 	skipTests := map[string]bool{
 		TestWatchWriteEvents: true,
 		TestList:             true,
@@ -25,10 +25,6 @@ export class ExportAsCode extends ShareExportTab {
   public getTabLabel(): string {
     return t('export.json.title', 'Export dashboard');
   }
-
-  public getSubtitle(): string | undefined {
-    return t('export.json.info-text', 'Copy or download a file containing the definition of your dashboard');
-  }
 }
 
 function ExportAsCodeRenderer({ model }: SceneComponentProps<ExportAsCode>) {
@@ -57,6 +53,12 @@ function ExportAsCodeRenderer({ model }: SceneComponentProps<ExportAsCode>) {
 
   return (
     <div data-testid={selector.container} className={styles.container}>
+      <p>
+        <Trans i18nKey="export.json.info-text">
+          Copy or download a file containing the definition of your dashboard
+        </Trans>
+      </p>
+
       {config.featureToggles.kubernetesDashboards ? (
         <ResourceExport
           dashboardJson={dashboardJson}
@@ -1,189 +0,0 @@
-import { render, screen, within } from '@testing-library/react';
-import userEvent from '@testing-library/user-event';
-import { AsyncState } from 'react-use/lib/useAsync';
-
-import { selectors as e2eSelectors } from '@grafana/e2e-selectors';
-import { Dashboard } from '@grafana/schema';
-import { Spec as DashboardV2Spec } from '@grafana/schema/dist/esm/schema/dashboard/v2';
-
-import { ExportMode, ResourceExport } from './ResourceExport';
-
-type DashboardJsonState = AsyncState<{
-  json: Dashboard | DashboardV2Spec | { error: unknown };
-  hasLibraryPanels?: boolean;
-  initialSaveModelVersion: 'v1' | 'v2';
-}>;
-
-const selector = e2eSelectors.pages.ExportDashboardDrawer.ExportAsJson;
-
-const createDefaultProps = (overrides?: Partial<Parameters<typeof ResourceExport>[0]>) => {
-  const defaultProps: Parameters<typeof ResourceExport>[0] = {
-    dashboardJson: {
-      loading: false,
-      value: {
-        json: { title: 'Test Dashboard' } as Dashboard,
-        hasLibraryPanels: false,
-        initialSaveModelVersion: 'v1',
-      },
-    } as DashboardJsonState,
-    isSharingExternally: false,
-    exportMode: ExportMode.Classic,
-    isViewingYAML: false,
-    onExportModeChange: jest.fn(),
-    onShareExternallyChange: jest.fn(),
-    onViewYAML: jest.fn(),
-  };
-
-  return { ...defaultProps, ...overrides };
-};
-
-const createV2DashboardJson = (hasLibraryPanels = false): DashboardJsonState => ({
-  loading: false,
-  value: {
-    json: {
-      title: 'Test V2 Dashboard',
-      spec: {
-        elements: {},
-      },
-    } as unknown as DashboardV2Spec,
-    hasLibraryPanels,
-    initialSaveModelVersion: 'v2',
-  },
-});
-
-const expandOptions = async () => {
-  const button = screen.getByRole('button', { expanded: false });
-  await userEvent.click(button);
-};
-
-describe('ResourceExport', () => {
-  describe('export mode options for v1 dashboard', () => {
-    it('should show three export mode options in correct order: Classic, V1 Resource, V2 Resource', async () => {
-      render(<ResourceExport {...createDefaultProps()} />);
-      await expandOptions();
-
-      const radioGroup = screen.getByRole('radiogroup', { name: /model/i });
-      const labels = within(radioGroup)
-        .getAllByRole('radio')
-        .map((radio) => radio.parentElement?.textContent?.trim());
-
-      expect(labels).toHaveLength(3);
-      expect(labels).toEqual(['Classic', 'V1 Resource', 'V2 Resource']);
-    });
-
-    it('should have first option selected by default when exportMode is Classic', async () => {
-      render(<ResourceExport {...createDefaultProps({ exportMode: ExportMode.Classic })} />);
-      await expandOptions();
-
-      const radioGroup = screen.getByRole('radiogroup', { name: /model/i });
-      const radios = within(radioGroup).getAllByRole('radio');
-      expect(radios[0]).toBeChecked();
-    });
-
-    it('should call onExportModeChange when export mode is changed', async () => {
-      const onExportModeChange = jest.fn();
-      render(<ResourceExport {...createDefaultProps({ onExportModeChange })} />);
-      await expandOptions();
-
-      const radioGroup = screen.getByRole('radiogroup', { name: /model/i });
-      const radios = within(radioGroup).getAllByRole('radio');
-      await userEvent.click(radios[1]); // V1 Resource
-      expect(onExportModeChange).toHaveBeenCalledWith(ExportMode.V1Resource);
-    });
-  });
-
-  describe('export mode options for v2 dashboard', () => {
-    it('should not show export mode options', async () => {
-      render(<ResourceExport {...createDefaultProps({ dashboardJson: createV2DashboardJson() })} />);
-      await expandOptions();
-
-      expect(screen.queryByRole('radiogroup', { name: /model/i })).not.toBeInTheDocument();
-    });
-  });
-
-  describe('format options', () => {
-    it('should not show format options when export mode is Classic', async () => {
-      render(<ResourceExport {...createDefaultProps({ exportMode: ExportMode.Classic })} />);
-      await expandOptions();
-
-      expect(screen.getByRole('radiogroup', { name: /model/i })).toBeInTheDocument();
-      expect(screen.queryByRole('radiogroup', { name: /format/i })).not.toBeInTheDocument();
-    });
-
-    it.each([ExportMode.V1Resource, ExportMode.V2Resource])(
-      'should show format options when export mode is %s',
-      async (exportMode) => {
-        render(<ResourceExport {...createDefaultProps({ exportMode })} />);
-        await expandOptions();
-
-        expect(screen.getByRole('radiogroup', { name: /model/i })).toBeInTheDocument();
-        expect(screen.getByRole('radiogroup', { name: /format/i })).toBeInTheDocument();
-      }
-    );
-
-    it('should have first format option selected when isViewingYAML is false', async () => {
-      render(<ResourceExport {...createDefaultProps({ exportMode: ExportMode.V1Resource, isViewingYAML: false })} />);
-      await expandOptions();
-
-      const formatGroup = screen.getByRole('radiogroup', { name: /format/i });
-      const formatRadios = within(formatGroup).getAllByRole('radio');
-      expect(formatRadios[0]).toBeChecked(); // JSON
-    });
-
-    it('should have second format option selected when isViewingYAML is true', async () => {
-      render(<ResourceExport {...createDefaultProps({ exportMode: ExportMode.V1Resource, isViewingYAML: true })} />);
-      await expandOptions();
-
-      const formatGroup = screen.getByRole('radiogroup', { name: /format/i });
-      const formatRadios = within(formatGroup).getAllByRole('radio');
-      expect(formatRadios[1]).toBeChecked(); // YAML
-    });
-
-    it('should call onViewYAML when format is changed', async () => {
-      const onViewYAML = jest.fn();
-      render(<ResourceExport {...createDefaultProps({ exportMode: ExportMode.V1Resource, onViewYAML })} />);
-      await expandOptions();
-
-      const formatGroup = screen.getByRole('radiogroup', { name: /format/i });
-      const formatRadios = within(formatGroup).getAllByRole('radio');
-      await userEvent.click(formatRadios[1]); // YAML
-      expect(onViewYAML).toHaveBeenCalled();
-    });
-  });
-
-  describe('share externally switch', () => {
-    it('should show share externally switch for Classic mode', () => {
-      render(<ResourceExport {...createDefaultProps({ exportMode: ExportMode.Classic })} />);
-
-      expect(screen.getByTestId(selector.exportExternallyToggle)).toBeInTheDocument();
-    });
-
-    it('should show share externally switch for V2Resource mode with V2 dashboard', () => {
-      render(
-        <ResourceExport
-          {...createDefaultProps({
-            dashboardJson: createV2DashboardJson(),
-            exportMode: ExportMode.V2Resource,
-          })}
-        />
-      );
-
-      expect(screen.getByTestId(selector.exportExternallyToggle)).toBeInTheDocument();
-    });
-
-    it('should call onShareExternallyChange when switch is toggled', async () => {
-      const onShareExternallyChange = jest.fn();
-      render(<ResourceExport {...createDefaultProps({ exportMode: ExportMode.Classic, onShareExternallyChange })} />);
-
-      const switchElement = screen.getByTestId(selector.exportExternallyToggle);
-      await userEvent.click(switchElement);
-      expect(onShareExternallyChange).toHaveBeenCalled();
-    });
-
-    it('should reflect isSharingExternally value in switch', () => {
-      render(<ResourceExport {...createDefaultProps({ exportMode: ExportMode.Classic, isSharingExternally: true })} />);
-
-      expect(screen.getByTestId(selector.exportExternallyToggle)).toBeChecked();
-    });
-  });
-});
@@ -4,8 +4,7 @@ import { selectors as e2eSelectors } from '@grafana/e2e-selectors';
 import { Trans, t } from '@grafana/i18n';
 import { Dashboard } from '@grafana/schema';
 import { Spec as DashboardV2Spec } from '@grafana/schema/dist/esm/schema/dashboard/v2';
-import { Alert, Icon, Label, RadioButtonGroup, Stack, Switch, Box, Tooltip } from '@grafana/ui';
-import { QueryOperationRow } from 'app/core/components/QueryOperationRow/QueryOperationRow';
+import { Alert, Label, RadioButtonGroup, Stack, Switch } from '@grafana/ui';
 import { DashboardJson } from 'app/features/manage-dashboards/types';
 
 import { ExportableResource } from '../ShareExportTab';
@@ -49,90 +48,80 @@ export function ResourceExport({
 
   const switchExportLabel =
     exportMode === ExportMode.V2Resource
-      ? t('dashboard-scene.resource-export.share-externally', 'Share dashboard with another instance')
-      : t('share-modal.export.share-externally-label', 'Export for sharing externally');
-  const switchExportTooltip = t(
-    'dashboard-scene.resource-export.share-externally-tooltip',
-    'Removes all instance-specific metadata and data source references from the resource before export.'
-  );
+      ? t('export.json.export-remove-ds-refs', 'Remove deployment details')
+      : t('share-modal.export.share-externally-label', `Export for sharing externally`);
   const switchExportModeLabel = t('export.json.export-mode', 'Model');
   const switchExportFormatLabel = t('export.json.export-format', 'Format');
 
-  const exportResourceOptions = [
-    {
-      label: t('dashboard-scene.resource-export.label.classic', 'Classic'),
-      value: ExportMode.Classic,
-    },
-    {
-      label: t('dashboard-scene.resource-export.label.v1-resource', 'V1 Resource'),
-      value: ExportMode.V1Resource,
-    },
-    {
-      label: t('dashboard-scene.resource-export.label.v2-resource', 'V2 Resource'),
-      value: ExportMode.V2Resource,
-    },
-  ];
-
   return (
-    <>
-      <QueryOperationRow
-        id="Advanced options"
-        index={0}
-        title={t('dashboard-scene.resource-export.label.advanced-options', 'Advanced options')}
-        isOpen={false}
-      >
-        <Box marginTop={2}>
-          <Stack gap={1} direction="column">
-            {initialSaveModelVersion === 'v1' && (
-              <Stack gap={1} alignItems="center">
-                <Label>{switchExportModeLabel}</Label>
-                <RadioButtonGroup
-                  options={exportResourceOptions}
-                  value={exportMode}
-                  onChange={(value) => onExportModeChange(value)}
-                  aria-label={switchExportModeLabel}
-                />
-              </Stack>
-            )}
-
-            {exportMode !== ExportMode.Classic && (
-              <Stack gap={1} alignItems="center">
-                <Label>{switchExportFormatLabel}</Label>
-                <RadioButtonGroup
-                  options={[
-                    { label: t('dashboard-scene.resource-export.label.json', 'JSON'), value: 'json' },
-                    { label: t('dashboard-scene.resource-export.label.yaml', 'YAML'), value: 'yaml' },
-                  ]}
-                  value={isViewingYAML ? 'yaml' : 'json'}
-                  onChange={onViewYAML}
-                  aria-label={switchExportFormatLabel}
-                />
-              </Stack>
-            )}
-          </Stack>
-        </Box>
-      </QueryOperationRow>
-
-      {(isV2Dashboard ||
-        exportMode === ExportMode.Classic ||
-        (initialSaveModelVersion === 'v2' && exportMode === ExportMode.V1Resource)) && (
-        <Stack gap={1} alignItems="start">
-          <Label>
-            <Stack gap={0.5} alignItems="center">
-              <Tooltip content={switchExportTooltip} placement="bottom">
-                <Icon name="info-circle" size="sm" />
-              </Tooltip>
-              {switchExportLabel}
-            </Stack>
-          </Label>
-          <Switch
-            label={switchExportLabel}
-            value={isSharingExternally}
-            onChange={onShareExternallyChange}
-            data-testid={selector.exportExternallyToggle}
-          />
-        </Stack>
-      )}
+    <Stack gap={2} direction="column">
+      <Stack gap={1} direction="column">
+        {initialSaveModelVersion === 'v1' && (
+          <Stack alignItems="center">
+            <Label>{switchExportModeLabel}</Label>
+            <RadioButtonGroup
+              options={[
+                { label: t('dashboard-scene.resource-export.label.classic', 'Classic'), value: ExportMode.Classic },
+                {
+                  label: t('dashboard-scene.resource-export.label.v1-resource', 'V1 Resource'),
+                  value: ExportMode.V1Resource,
+                },
+                {
+                  label: t('dashboard-scene.resource-export.label.v2-resource', 'V2 Resource'),
+                  value: ExportMode.V2Resource,
+                },
+              ]}
+              value={exportMode}
+              onChange={(value) => onExportModeChange(value)}
+            />
+          </Stack>
+        )}
+        {initialSaveModelVersion === 'v2' && (
+          <Stack alignItems="center">
+            <Label>{switchExportModeLabel}</Label>
+            <RadioButtonGroup
+              options={[
+                {
+                  label: t('dashboard-scene.resource-export.label.v2-resource', 'V2 Resource'),
+                  value: ExportMode.V2Resource,
+                },
+                {
+                  label: t('dashboard-scene.resource-export.label.v1-resource', 'V1 Resource'),
+                  value: ExportMode.V1Resource,
+                },
+              ]}
+              value={exportMode}
+              onChange={(value) => onExportModeChange(value)}
+            />
+          </Stack>
+        )}
+        {exportMode !== ExportMode.Classic && (
+          <Stack gap={1} alignItems="center">
+            <Label>{switchExportFormatLabel}</Label>
+            <RadioButtonGroup
+              options={[
+                { label: t('dashboard-scene.resource-export.label.json', 'JSON'), value: 'json' },
+                { label: t('dashboard-scene.resource-export.label.yaml', 'YAML'), value: 'yaml' },
+              ]}
+              value={isViewingYAML ? 'yaml' : 'json'}
+              onChange={onViewYAML}
+            />
+          </Stack>
+        )}
+        {(isV2Dashboard ||
+          exportMode === ExportMode.Classic ||
+          (initialSaveModelVersion === 'v2' && exportMode === ExportMode.V1Resource)) && (
+          <Stack gap={1} alignItems="start">
+            <Label>{switchExportLabel}</Label>
+            <Switch
+              label={switchExportLabel}
+              value={isSharingExternally}
+              onChange={onShareExternallyChange}
+              data-testid={selector.exportExternallyToggle}
+            />
+          </Stack>
+        )}
+      </Stack>
 
       {showV2LibPanelAlert && (
         <Alert
@@ -141,7 +130,6 @@ export function ResourceExport({
             'Library panels will be converted to regular panels'
           )}
           severity="warning"
-          topSpacing={2}
         >
           <Trans i18nKey="dashboard-scene.save-dashboard-form.schema-v2-library-panels-export">
             Due to limitations in the new dashboard schema (V2), library panels will be converted to regular panels with
@@ -149,6 +137,6 @@ export function ResourceExport({
           </Trans>
         </Alert>
       )}
-    </>
+    </Stack>
   );
 }
@@ -66,12 +66,7 @@ function ShareDrawerRenderer({ model }: SceneComponentProps<ShareDrawer>) {
   const dashboard = getDashboardSceneFor(model);
 
   return (
-    <Drawer
-      title={activeShare?.getTabLabel()}
-      subtitle={activeShare?.getSubtitle?.()}
-      onClose={model.onDismiss}
-      size="md"
-    >
+    <Drawer title={activeShare?.getTabLabel()} onClose={model.onDismiss} size="md">
       <ShareDrawerContext.Provider value={{ dashboard, onDismiss: model.onDismiss }}>
         {activeShare && <activeShare.Component model={activeShare} />}
       </ShareDrawerContext.Provider>
@@ -66,10 +66,6 @@ export class ShareExportTab extends SceneObjectBase<ShareExportTabState> impleme
     return t('share-modal.tab-title.export', 'Export');
   }
 
-  public getSubtitle(): string | undefined {
-    return undefined;
-  }
-
   public onShareExternallyChange = () => {
     this.setState({
       isSharingExternally: !this.state.isSharingExternally,
@@ -15,6 +15,5 @@ export interface SceneShareTab<T extends SceneShareTabState = SceneShareTabState
 
 export interface ShareView extends SceneObject {
   getTabLabel(): string;
-  getSubtitle?(): string | undefined;
   onDismiss?: () => void;
 }
@@ -2,9 +2,8 @@ import { render, screen } from '@testing-library/react';
 import { defaultsDeep } from 'lodash';
 import { Provider } from 'react-redux';
 
-import { CoreApp, EventBusSrv, FieldType, getDefaultTimeRange, LoadingState } from '@grafana/data';
-import { config, PanelDataErrorViewProps } from '@grafana/runtime';
-import { usePanelContext } from '@grafana/ui';
+import { FieldType, getDefaultTimeRange, LoadingState } from '@grafana/data';
+import { PanelDataErrorViewProps } from '@grafana/runtime';
 import { configureStore } from 'app/store/configureStore';
 
 import { PanelDataErrorView } from './PanelDataErrorView';
@@ -17,24 +16,7 @@ jest.mock('app/features/dashboard/services/DashboardSrv', () => ({
   },
 }));
 
-jest.mock('@grafana/ui', () => ({
-  ...jest.requireActual('@grafana/ui'),
-  usePanelContext: jest.fn(),
-}));
-
-const mockUsePanelContext = jest.mocked(usePanelContext);
-const RUN_QUERY_MESSAGE = 'Run a query to visualize it here or go to all visualizations to add other panel types';
-const panelContextRoot = {
-  app: CoreApp.Dashboard,
-  eventsScope: 'global',
-  eventBus: new EventBusSrv(),
-};
-
 describe('PanelDataErrorView', () => {
-  beforeEach(() => {
-    mockUsePanelContext.mockReturnValue(panelContextRoot);
-  });
-
   it('show No data when there is no data', () => {
     renderWithProps();
 
@@ -88,45 +70,6 @@ describe('PanelDataErrorView', () => {
 
     expect(screen.getByText('Query returned nothing')).toBeInTheDocument();
   });
-
-  it('should show "Run a query..." message when no query is configured and feature toggle is enabled', () => {
-    mockUsePanelContext.mockReturnValue(panelContextRoot);
-
-    const originalFeatureToggle = config.featureToggles.newVizSuggestions;
-    config.featureToggles.newVizSuggestions = true;
-
-    renderWithProps({
-      data: {
-        state: LoadingState.Done,
-        series: [],
-        timeRange: getDefaultTimeRange(),
-      },
-    });
-
-    expect(screen.getByText(RUN_QUERY_MESSAGE)).toBeInTheDocument();
-
-    config.featureToggles.newVizSuggestions = originalFeatureToggle;
-  });
-
-  it('should show "No data" message when feature toggle is disabled even without queries', () => {
-    mockUsePanelContext.mockReturnValue(panelContextRoot);
-
-    const originalFeatureToggle = config.featureToggles.newVizSuggestions;
-    config.featureToggles.newVizSuggestions = false;
-
-    renderWithProps({
-      data: {
-        state: LoadingState.Done,
-        series: [],
-        timeRange: getDefaultTimeRange(),
-      },
-    });
-
-    expect(screen.getByText('No data')).toBeInTheDocument();
-    expect(screen.queryByText(RUN_QUERY_MESSAGE)).not.toBeInTheDocument();
-
-    config.featureToggles.newVizSuggestions = originalFeatureToggle;
-  });
 });
 
 function renderWithProps(overrides?: Partial<PanelDataErrorViewProps>) {
@@ -5,15 +5,14 @@
   FieldType,
   getPanelDataSummary,
   GrafanaTheme2,
-  PanelData,
   PanelDataSummary,
   PanelPluginVisualizationSuggestion,
 } from '@grafana/data';
 import { selectors } from '@grafana/e2e-selectors';
 import { t, Trans } from '@grafana/i18n';
-import { PanelDataErrorViewProps, locationService, config } from '@grafana/runtime';
+import { PanelDataErrorViewProps, locationService } from '@grafana/runtime';
 import { VizPanel } from '@grafana/scenes';
-import { Icon, usePanelContext, useStyles2 } from '@grafana/ui';
+import { usePanelContext, useStyles2 } from '@grafana/ui';
 import { CardButton } from 'app/core/components/CardButton';
 import { LS_VISUALIZATION_SELECT_TAB_KEY } from 'app/core/constants';
 import store from 'app/core/store';
@@ -25,11 +24,6 @@ import { findVizPanelByKey, getVizPanelKeyForPanelId } from 'app/features/dashbo
 import { useDispatch } from 'app/types/store';
 
 import { changePanelPlugin } from '../state/actions';
-import { hasData } from '../suggestions/utils';
-
-function hasNoQueryConfigured(data: PanelData): boolean {
-  return !data.request?.targets || data.request.targets.length === 0;
-}
 
 export function PanelDataErrorView(props: PanelDataErrorViewProps) {
   const styles = useStyles2(getStyles);
@@ -99,14 +93,8 @@ export function PanelDataErrorView(props: PanelDataErrorViewProps) {
     }
   };
 
-  const noData = !hasData(props.data);
-  const noQueryConfigured = hasNoQueryConfigured(props.data);
-  const showEmptyState =
-    config.featureToggles.newVizSuggestions && context.app === CoreApp.PanelEditor && noQueryConfigured && noData;
-
   return (
     <div className={styles.wrapper}>
-      {showEmptyState && <Icon name="chart-line" size="xxxl" className={styles.emptyStateIcon} />}
       <div className={styles.message} data-testid={selectors.components.Panels.Panel.PanelDataErrorMessage}>
         {message}
       </div>
@@ -143,17 +131,7 @@ function getMessageFor(
     return message;
   }
 
-  const noData = !hasData(data);
-  const noQueryConfigured = hasNoQueryConfigured(data);
-
-  if (config.featureToggles.newVizSuggestions && noQueryConfigured && noData) {
-    return t(
-      'dashboard.new-panel.empty-state-message',
-      'Run a query to visualize it here or go to all visualizations to add other panel types'
-    );
-  }
-
-  if (noData) {
+  if (!data.series || data.series.length === 0 || data.series.every((frame) => frame.length === 0)) {
     return fieldConfig?.defaults.noValue ?? t('panel.panel-data-error-view.no-value.default', 'No data');
   }
 
@@ -198,9 +176,5 @@ const getStyles = (theme: GrafanaTheme2) => {
       width: '100%',
       maxWidth: '600px',
     }),
-    emptyStateIcon: css({
-      color: theme.colors.text.secondary,
-      marginBottom: theme.spacing(2),
-    }),
   };
 };
@@ -1,26 +1,29 @@
 import { SelectableValue } from '@grafana/data';
 import { RadioButtonGroup } from '@grafana/ui';
 
+import { useDispatch } from '../../hooks/useStatelessReducer';
 import { EditorType } from '../../types';
 
+import { useQuery } from './ElasticsearchQueryContext';
+import { changeEditorTypeAndResetQuery } from './state';
+
 const BASE_OPTIONS: Array<SelectableValue<EditorType>> = [
   { value: 'builder', label: 'Builder' },
   { value: 'code', label: 'Code' },
 ];
 
-interface Props {
-  value: EditorType;
-  onChange: (editorType: EditorType) => void;
-}
-
-export const EditorTypeSelector = ({ value, onChange }: Props) => {
+export const EditorTypeSelector = () => {
+  const query = useQuery();
+  const dispatch = useDispatch();
+
+  // Default to 'builder' if editorType is empty
+  const editorType: EditorType = query.editorType === 'code' ? 'code' : 'builder';
+
+  const onChange = (newEditorType: EditorType) => {
+    dispatch(changeEditorTypeAndResetQuery(newEditorType));
+  };
+
   return (
-    <RadioButtonGroup<EditorType>
-      data-testid="elasticsearch-editor-type-toggle"
-      size="sm"
-      options={BASE_OPTIONS}
-      value={value}
-      onChange={onChange}
-    />
+    <RadioButtonGroup<EditorType> fullWidth={false} options={BASE_OPTIONS} value={editorType} onChange={onChange} />
   );
 };
@@ -10,13 +10,9 @@ interface Props {
   onRunQuery: () => void;
 }
 
-// This offset was chosen by testing to match Prometheus behavior
-const EDITOR_HEIGHT_OFFSET = 2;
-
 export function RawQueryEditor({ value, onChange, onRunQuery }: Props) {
   const styles = useStyles2(getStyles);
   const editorRef = useRef<monacoTypes.editor.IStandaloneCodeEditor | null>(null);
-  const containerRef = useRef<HTMLDivElement | null>(null);
 
   const handleEditorDidMount = useCallback(
     (editor: monacoTypes.editor.IStandaloneCodeEditor, monaco: Monaco) => {
@@ -26,22 +22,6 @@ export function RawQueryEditor({ value, onChange, onRunQuery }: Props) {
       editor.addCommand(monaco.KeyMod.CtrlCmd | monaco.KeyCode.Enter, () => {
         onRunQuery();
       });
-
-      // Make the editor resize itself so that the content fits (grows taller when necessary)
-      // this code comes from the Prometheus query editor.
-      // We may wish to consider abstracting it into the grafana/ui repo in the future
-      const updateElementHeight = () => {
-        const containerDiv = containerRef.current;
-        if (containerDiv !== null) {
-          const pixelHeight = editor.getContentHeight();
-          containerDiv.style.height = `${pixelHeight + EDITOR_HEIGHT_OFFSET}px`;
-          const pixelWidth = containerDiv.clientWidth;
-          editor.layout({ width: pixelWidth, height: pixelHeight });
-        }
-      };
-
-      editor.onDidContentSizeChange(updateElementHeight);
-      updateElementHeight();
     },
     [onRunQuery]
   );
@@ -85,17 +65,7 @@ export function RawQueryEditor({ value, onChange, onRunQuery }: Props) {
 
   return (
     <Box>
-      <div ref={containerRef} className={styles.editorContainer}>
-        <CodeEditor
-          value={value ?? ''}
-          language="json"
-          width="100%"
-          onBlur={handleQueryChange}
-          monacoOptions={monacoOptions}
-          onEditorDidMount={handleEditorDidMount}
-        />
-      </div>
-      <div className={styles.footer}>
+      <div className={styles.header}>
         <Stack gap={1}>
           <Button
             size="sm"
@@ -106,8 +76,20 @@ export function RawQueryEditor({ value, onChange, onRunQuery }: Props) {
           >
             Format
           </Button>
+          <Button size="sm" variant="primary" icon="play" onClick={onRunQuery} tooltip="Run query (Ctrl/Cmd+Enter)">
+            Run
+          </Button>
         </Stack>
       </div>
+      <CodeEditor
+        value={value ?? ''}
+        language="json"
+        height={200}
+        width="100%"
+        onBlur={handleQueryChange}
+        monacoOptions={monacoOptions}
+        onEditorDidMount={handleEditorDidMount}
+      />
     </Box>
   );
 }
@@ -118,11 +100,7 @@ const getStyles = (theme: GrafanaTheme2) => ({
     flexDirection: 'column',
     gap: theme.spacing(1),
   }),
-  editorContainer: css({
-    width: '100%',
-    overflow: 'hidden',
-  }),
-  footer: css({
+  header: css({
     display: 'flex',
     justifyContent: 'flex-end',
    padding: theme.spacing(0.5, 0),
@@ -1,16 +1,16 @@
 import { css } from '@emotion/css';
-import { useCallback, useEffect, useId, useState } from 'react';
+import { useEffect, useId, useState } from 'react';
 import { SemVer } from 'semver';
 
 import { getDefaultTimeRange, GrafanaTheme2, QueryEditorProps } from '@grafana/data';
 import { config } from '@grafana/runtime';
-import { Alert, ConfirmModal, InlineField, InlineLabel, Input, QueryField, useStyles2 } from '@grafana/ui';
+import { Alert, InlineField, InlineLabel, Input, QueryField, useStyles2 } from '@grafana/ui';
 
 import { ElasticsearchDataQuery } from '../../dataquery.gen';
 import { ElasticDatasource } from '../../datasource';
 import { useNextId } from '../../hooks/useNextId';
 import { useDispatch } from '../../hooks/useStatelessReducer';
-import { EditorType, ElasticsearchOptions } from '../../types';
+import { ElasticsearchOptions } from '../../types';
 import { isSupportedVersion, isTimeSeriesQuery, unsupportedVersionMessage } from '../../utils';
 
 import { BucketAggregationsEditor } from './BucketAggregationsEditor';
@@ -20,7 +20,7 @@ import { MetricAggregationsEditor } from './MetricAggregationsEditor';
 import { metricAggregationConfig } from './MetricAggregationsEditor/utils';
 import { QueryTypeSelector } from './QueryTypeSelector';
 import { RawQueryEditor } from './RawQueryEditor';
-import { changeAliasPattern, changeEditorTypeAndResetQuery, changeQuery, changeRawDSLQuery } from './state';
+import { changeAliasPattern, changeQuery, changeRawDSLQuery } from './state';
 
 export type ElasticQueryEditorProps = QueryEditorProps<ElasticDatasource, ElasticsearchDataQuery, ElasticsearchOptions>;
 
@@ -97,61 +97,31 @@ const QueryEditorForm = ({ value, onRunQuery }: Props & { onRunQuery: () => void
   const inputId = useId();
   const styles = useStyles2(getStyles);
 
-  const [switchModalOpen, setSwitchModalOpen] = useState(false);
-  const [pendingEditorType, setPendingEditorType] = useState<EditorType | null>(null);
-
   const isTimeSeries = isTimeSeriesQuery(value);
 
   const isCodeEditor = value.editorType === 'code';
   const rawDSLFeatureEnabled = config.featureToggles.elasticsearchRawDSLQuery;
 
-  // Default to 'builder' if editorType is empty
-  const currentEditorType: EditorType = value.editorType === 'code' ? 'code' : 'builder';
-
   const showBucketAggregationsEditor = value.metrics?.every(
     (metric) => metricAggregationConfig[metric.type].impliedQueryType === 'metrics'
   );
 
-  const onEditorTypeChange = useCallback((newEditorType: EditorType) => {
-    // Show warning modal when switching modes
-    setPendingEditorType(newEditorType);
-    setSwitchModalOpen(true);
-  }, []);
-
-  const confirmEditorTypeChange = useCallback(() => {
-    if (pendingEditorType) {
-      dispatch(changeEditorTypeAndResetQuery(pendingEditorType));
-    }
-    setSwitchModalOpen(false);
-    setPendingEditorType(null);
-  }, [dispatch, pendingEditorType]);
-
-  const cancelEditorTypeChange = useCallback(() => {
-    setSwitchModalOpen(false);
-    setPendingEditorType(null);
-  }, []);
-
   return (
     <>
-      <ConfirmModal
-        isOpen={switchModalOpen}
-        title="Switch editor"
-        body="Switching between editors will reset your query. Are you sure you want to continue?"
-        confirmText="Continue"
-        onConfirm={confirmEditorTypeChange}
-        onDismiss={cancelEditorTypeChange}
-      />
       <div className={styles.root}>
         <InlineLabel width={17}>Query type</InlineLabel>
         <div className={styles.queryItem}>
           <QueryTypeSelector />
         </div>
-        {rawDSLFeatureEnabled && (
-          <div style={{ marginLeft: 'auto' }}>
-            <EditorTypeSelector value={currentEditorType} onChange={onEditorTypeChange} />
-          </div>
-        )}
       </div>
+      {rawDSLFeatureEnabled && (
+        <div className={styles.root}>
+          <InlineLabel width={17}>Editor type</InlineLabel>
+          <div className={styles.queryItem}>
+            <EditorTypeSelector />
+          </div>
+        </div>
+      )}
 
       {isCodeEditor && rawDSLFeatureEnabled && (
         <RawQueryEditor
@@ -6383,15 +6383,12 @@
     },
     "resource-export": {
       "label": {
-        "advanced-options": "Advanced options",
        "classic": "Classic",
        "json": "JSON",
        "v1-resource": "V1 Resource",
        "v2-resource": "V2 Resource",
        "yaml": "YAML"
-      },
-      "share-externally": "Share dashboard with another instance",
-      "share-externally-tooltip": "Removes all instance-specific metadata and data source references from the resource before export."
+      }
     },
     "revert-dashboard-modal": {
       "body-restore-version": "Are you sure you want to restore the dashboard to version {{version}}? All unsaved changes will be lost.",
@@ -7845,6 +7842,7 @@
       "export-externally-label": "Export the dashboard to use in another instance",
       "export-format": "Format",
       "export-mode": "Model",
+      "export-remove-ds-refs": "Remove deployment details",
       "info-text": "Copy or download a file containing the definition of your dashboard",
       "title": "Export dashboard"
     },