Compare commits

13 Commits:

- 2d1f51d02f
- 9f2f93b401
- 9e399e0b19
- 2f520454ae
- 72f7bd3900
- ba416eab4e
- 189d50d815
- 450eaba447
- 87f5d5e741
- 5e68b07cac
- 99acd3766d
- f9eb5b7360
- 785c578e2f
@@ -270,7 +270,17 @@ Click **View in CloudWatch console** to interactively view, search, and analyze

### Query Log groups with OpenSearch SQL

When querying log groups with OpenSearch SQL, you can use the `$__logGroups` macro to automatically reference the log groups selected in the query editor's log group selector. This is the recommended approach, as it allows you to manage log groups through the UI.

```sql
SELECT window.start, COUNT(*) AS exceptionCount
FROM $__logGroups
WHERE `@message` LIKE '%Exception%'
```

The `$__logGroups` macro expands to the proper `logGroups(logGroupIdentifier: [...])` syntax with the log groups you've selected in the UI.

Alternatively, you can manually specify a single log group directly in the `FROM` clause:

```sql
SELECT window.start, COUNT(*) AS exceptionCount
FROM `log_group`
WHERE `@message` LIKE '%Exception%'
```

When querying multiple log groups, you **must** use the `logGroups(logGroupIdentifier: [...])` syntax:

```sql
SELECT window.start, COUNT(*) AS exceptionCount
FROM `logGroups(logGroupIdentifier: ['LogGroup1', 'LogGroup2'])`
WHERE `@message` LIKE '%Exception%'
```

To reference log groups in a monitoring account, use ARNs instead of LogGroup names.

You can also write queries returning time series data by using the [`stats` command](https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/CWL_Insights-Visualizing-Log-Data.html). When making `stats` queries in [Explore](ref:explore), ensure you are in Metrics Explore mode.
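To make the macro's behavior concrete, the expansion can be sketched as a plain string substitution. This is an illustrative sketch only — the function name and signature are hypothetical, not the plugin's actual implementation:

```typescript
// Hypothetical sketch of a $__logGroups-style macro expansion. The real
// plugin code differs; this only illustrates the documented rewrite.
function expandLogGroupsMacro(sql: string, logGroupIdentifiers: string[]): string {
  // Quote each selected log group identifier and join them into a list.
  const list = logGroupIdentifiers.map((id) => `'${id}'`).join(', ');
  // Wrap the list in the logGroups(...) table function, backtick-quoted.
  const expansion = `\`logGroups(logGroupIdentifier: [${list}])\``;
  // Substitute every occurrence of the macro in the query text.
  return sql.split('$__logGroups').join(expansion);
}
```

With `['LogGroup1', 'LogGroup2']` selected in the UI, `FROM $__logGroups` would become ``FROM `logGroups(logGroupIdentifier: ['LogGroup1', 'LogGroup2'])` ``.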
+8 −2

@@ -4,7 +4,8 @@ comments: |

This file is used in the following visualizations: candlestick, heatmap, state timeline, status history, time series.
---

You can pan the panel time range left and right, and zoom it in and out. This, in turn, changes the dashboard time range.

**Zoom in** - Click and drag on the panel to zoom in on a particular time range.

@@ -16,4 +17,9 @@ For example, if the original time range is from 9:00 to 9:59, the time range cha

- Next range: 8:30 - 10:29
- Next range: 7:30 - 11:29

**Pan** - Click and drag the x-axis area of the panel to pan the time range. The time range shifts by the distance you drag. For example, if the original time range is from 9:00 to 9:59 and you drag 30 minutes to the right, the time range changes to 9:30 to 10:29.

For screen recordings showing these interactions, refer to the [Panel overview documentation](https://grafana.com/docs/grafana/<GRAFANA_VERSION>/visualizations/panels-visualizations/panel-overview/#pan-and-zoom-panel-time-range).
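The zoom-out and pan ranges above follow simple arithmetic: each zoom-out doubles the visible window around its center, and a pan shifts both endpoints by the drag distance. A minimal sketch, with times expressed as minutes since midnight (helper names are illustrative, not Grafana's code):

```typescript
// Each zoom-out doubles the window while keeping it centered:
// half the current span is added on each side.
function zoomOut(from: number, to: number): [number, number] {
  const half = Math.round((to - from) / 2);
  return [from - half, to + half];
}

// A pan shifts both endpoints by the same amount.
function pan(from: number, to: number, shiftMinutes: number): [number, number] {
  return [from + shiftMinutes, to + shiftMinutes];
}

// 9:00–9:59 is [540, 599]; one zoom-out yields [510, 629], i.e. 8:30–10:29.
```

Applying `zoomOut` again to `[510, 629]` gives `[450, 689]` (7:30–11:29), matching the second "Next range" above, and `pan(540, 599, 30)` gives `[570, 629]` (9:30–10:29), matching the pan example.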
@@ -317,13 +317,16 @@ Click the **Copy time range to clipboard** icon to copy the current time range t

You can also copy and paste a time range using the keyboard shortcuts `t+c` and `t+v`, respectively.

#### Zoom out

- Click the **Zoom out** icon to view a larger time range in the dashboard or panel visualizations
- Double-click on the panel graph area (time series family visualizations only)
- Type the `t-` keyboard shortcut

#### Zoom in

- Click and drag horizontally in the panel graph area to select a time range (time series family visualizations only)
- Type the `t+` keyboard shortcut

#### Refresh dashboard
@@ -175,9 +175,10 @@ By hovering over a panel with the mouse you can use some shortcuts that will tar

- `pl`: Hide or show legend
- `pr`: Remove Panel

## Pan and zoom panel time range

You can pan the panel time range left and right, and zoom it in and out. This, in turn, changes the dashboard time range.

This feature is supported for the following visualizations:

@@ -191,7 +192,7 @@ This feature is supported for the following visualizations:

Click and drag on the panel to zoom in on a particular time range.

The following screen recordings show this interaction in the time series and candlestick visualizations:

Time series

@@ -211,7 +212,7 @@ For example, if the original time range is from 9:00 to 9:59, the time range cha

- Next range: 8:30 - 10:29
- Next range: 7:30 - 11:29

The following screen recordings demonstrate the preceding example in the time series and heatmap visualizations:

Time series

@@ -221,6 +222,19 @@ Heatmap

{{< video-embed src="/media/docs/grafana/panels-visualizations/recording-heatmap-panel-time-zoom-out-mouse.mp4" >}}

### Pan

Click and drag the x-axis area of the panel to pan the time range. The time range shifts by the distance you drag. For example, if the original time range is from 9:00 to 9:59 and you drag 30 minutes to the right, the time range changes to 9:30 to 10:29.

The following screen recording shows this interaction in the time series visualization:

Time series

{{< video-embed src="/media/docs/grafana/panels-visualizations/recording-ts-time-pan-mouse.mp4" >}}

## Add a panel

To add a panel in a new dashboard, click **+ Add visualization** in the middle of the dashboard:
+2 −2

@@ -92,9 +92,9 @@ The data is converted as follows:

{{< figure src="/media/docs/grafana/panels-visualizations/screenshot-candles-volume-v11.6.png" max-width="750px" alt="A candlestick visualization showing the price movements of a specific asset." >}}

## Pan and zoom panel time range

{{< docs/shared lookup="visualizations/panel-pan-zoom.md" source="grafana" version="<GRAFANA_VERSION>" >}}

## Configuration options
+2 −2

@@ -79,9 +79,9 @@ The data is converted as follows:

{{< figure src="/static/img/docs/heatmap-panel/heatmap.png" max-width="1025px" alt="A heatmap visualization showing the random walk distribution over time" >}}

## Pan and zoom panel time range

{{< docs/shared lookup="visualizations/panel-pan-zoom.md" source="grafana" version="<GRAFANA_VERSION>" >}}

## Configuration options
+2 −2

@@ -93,9 +93,9 @@ You can also create a state timeline visualization using time series data. To do

![State timeline with time series data](/static/img/docs/v8/state_timeline_with_timeseries.png)

## Pan and zoom panel time range

{{< docs/shared lookup="visualizations/panel-pan-zoom.md" source="grafana" version="<GRAFANA_VERSION>" >}}

## Configuration options
+2 −2

@@ -85,9 +85,9 @@ The data is converted as follows:

{{< figure src="/static/img/docs/status-history-panel/status_history.png" max-width="1025px" alt="A status history panel with two time columns showing the status of two servers" >}}

## Pan and zoom panel time range

{{< docs/shared lookup="visualizations/panel-pan-zoom.md" source="grafana" version="<GRAFANA_VERSION>" >}}

## Configuration options
+2 −2

@@ -167,9 +167,9 @@ The following example shows three series: Min, Max, and Value. The Min and Max s

{{< docs/shared lookup="visualizations/multiple-y-axes.md" source="grafana" version="<GRAFANA_VERSION>" leveloffset="+2" >}}

## Pan and zoom panel time range

{{< docs/shared lookup="visualizations/panel-pan-zoom.md" source="grafana" version="<GRAFANA_VERSION>" >}}

## Configuration options
@@ -117,6 +117,44 @@ export const MyComponent = () => {

### Custom Header Rendering

Column headers can be customized using strings, React elements, or renderer functions. The `header` property accepts any value that matches React Table's `Renderer` type.

**Important:** When using custom header content, prefer inline elements (like `<span>`) over block elements (like `<div>`) to avoid layout issues. Block-level elements can cause extra spacing and alignment problems in table headers because they disrupt the table's inline flow. Use `display: inline-flex` or `display: inline-block` when you need flexbox or block-like behavior.

```tsx
const columns: Array<Column<TableData>> = [
  // React element header
  {
    id: 'checkbox',
    header: (
      <>
        <label htmlFor="select-all" className="sr-only">
          Select all rows
        </label>
        <Checkbox id="select-all" />
      </>
    ),
    cell: () => <Checkbox aria-label="Select row" />,
  },

  // Function renderer header
  {
    id: 'firstName',
    header: () => (
      <span style={{ display: 'inline-flex', alignItems: 'center', gap: '8px' }}>
        <Icon name="user" size="sm" />
        <span>First Name</span>
      </span>
    ),
  },

  // String header
  { id: 'lastName', header: 'Last name' },
];
```

### Custom Cell Rendering

Individual cells can be rendered using custom content by defining a `cell` property on the column definition.
@@ -3,8 +3,11 @@ import { useCallback, useMemo, useState } from 'react';

import { CellProps } from 'react-table';

import { LinkButton } from '../Button/Button';
import { Checkbox } from '../Forms/Checkbox';
import { Field } from '../Forms/Field';
import { Icon } from '../Icon/Icon';
import { Input } from '../Input/Input';
import { Text } from '../Text/Text';

import { FetchDataArgs, InteractiveTable, InteractiveTableHeaderTooltip } from './InteractiveTable';
import mdx from './InteractiveTable.mdx';

@@ -297,4 +300,40 @@ export const WithControlledSort: StoryFn<typeof InteractiveTable> = (args) => {

  return <InteractiveTable {...args} data={data} pageSize={15} fetchData={fetchData} />;
};

export const WithCustomHeader: TableStoryObj = {
  args: {
    columns: [
      // React element header
      {
        id: 'checkbox',
        header: (
          <>
            <label htmlFor="select-all" className="sr-only">
              Select all rows
            </label>
            <Checkbox id="select-all" />
          </>
        ),
        cell: () => <Checkbox aria-label="Select row" />,
      },
      // Function renderer header
      {
        id: 'firstName',
        header: () => (
          <span style={{ display: 'inline-flex', alignItems: 'center', gap: '8px' }}>
            <Icon name="user" size="sm" />
            <Text element="span">First Name</Text>
          </span>
        ),
        sortType: 'string',
      },
      // String header
      { id: 'lastName', header: 'Last name', sortType: 'string' },
      { id: 'car', header: 'Car', sortType: 'string' },
      { id: 'age', header: 'Age', sortType: 'number' },
    ],
    data: pageableData.slice(0, 10),
    getRowId: (r) => r.id,
  },
};

export default meta;
@@ -2,6 +2,9 @@ import { render, screen, within } from '@testing-library/react';

import userEvent from '@testing-library/user-event';
import * as React from 'react';

import { Checkbox } from '../Forms/Checkbox';
import { Icon } from '../Icon/Icon';

import { InteractiveTable } from './InteractiveTable';
import { Column } from './types';

@@ -247,4 +250,104 @@ describe('InteractiveTable', () => {

    expect(fetchData).toHaveBeenCalledWith({ sortBy: [{ id: 'id', desc: false }] });
  });

  describe('custom header rendering', () => {
    it('should render string headers', () => {
      const columns: Array<Column<TableData>> = [{ id: 'id', header: 'ID' }];
      const data: TableData[] = [{ id: '1', value: '1', country: 'Sweden' }];
      render(<InteractiveTable columns={columns} data={data} getRowId={getRowId} />);

      expect(screen.getByRole('columnheader', { name: 'ID' })).toBeInTheDocument();
    });

    it('should render React element headers', () => {
      const columns: Array<Column<TableData>> = [
        {
          id: 'checkbox',
          header: (
            <>
              <label htmlFor="select-all" className="sr-only">
                Select all rows
              </label>
              <Checkbox id="select-all" data-testid="header-checkbox" />
            </>
          ),
          cell: () => <Checkbox data-testid="cell-checkbox" aria-label="Select row" />,
        },
      ];
      const data: TableData[] = [{ id: '1', value: '1', country: 'Sweden' }];
      render(<InteractiveTable columns={columns} data={data} getRowId={getRowId} />);

      expect(screen.getByTestId('header-checkbox')).toBeInTheDocument();
      expect(screen.getByTestId('cell-checkbox')).toBeInTheDocument();
      expect(screen.getByLabelText('Select all rows')).toBeInTheDocument();
      expect(screen.getByLabelText('Select row')).toBeInTheDocument();
      expect(screen.getByText('Select all rows')).toBeInTheDocument();
    });

    it('should render function renderer headers', () => {
      const columns: Array<Column<TableData>> = [
        {
          id: 'firstName',
          header: () => (
            <span style={{ display: 'inline-flex', alignItems: 'center', gap: '8px' }}>
              <Icon name="user" size="sm" data-testid="header-icon" />
              <span>First Name</span>
            </span>
          ),
          sortType: 'string',
        },
      ];
      const data: TableData[] = [{ id: '1', value: '1', country: 'Sweden' }];
      render(<InteractiveTable columns={columns} data={data} getRowId={getRowId} />);

      expect(screen.getByTestId('header-icon')).toBeInTheDocument();
      expect(screen.getByRole('columnheader', { name: /first name/i })).toBeInTheDocument();
    });

    it('should render all header types together', () => {
      const columns: Array<Column<TableData>> = [
        {
          id: 'checkbox',
          header: (
            <>
              <label htmlFor="select-all" className="sr-only">
                Select all rows
              </label>
              <Checkbox id="select-all" data-testid="header-checkbox" />
            </>
          ),
          cell: () => <Checkbox aria-label="Select row" />,
        },
        {
          id: 'id',
          header: () => (
            <span style={{ display: 'inline-flex', alignItems: 'center', gap: '8px' }}>
              <Icon name="user" size="sm" data-testid="header-icon" />
              <span>ID</span>
            </span>
          ),
          sortType: 'string',
        },
        { id: 'country', header: 'Country', sortType: 'string' },
        { id: 'value', header: 'Value' },
      ];
      const data: TableData[] = [
        { id: '1', value: 'Value 1', country: 'Sweden' },
        { id: '2', value: 'Value 2', country: 'Norway' },
      ];
      render(<InteractiveTable columns={columns} data={data} getRowId={getRowId} />);

      expect(screen.getByTestId('header-checkbox')).toBeInTheDocument();
      expect(screen.getByTestId('header-icon')).toBeInTheDocument();
      expect(screen.getByRole('columnheader', { name: 'Country' })).toBeInTheDocument();
      expect(screen.getByRole('columnheader', { name: 'Value' })).toBeInTheDocument();

      // Verify data is rendered
      expect(screen.getByText('Sweden')).toBeInTheDocument();
      expect(screen.getByText('Norway')).toBeInTheDocument();
      expect(screen.getByText('Value 1')).toBeInTheDocument();
      expect(screen.getByText('Value 2')).toBeInTheDocument();
    });
  });
});
@@ -1,5 +1,5 @@

import { ReactNode } from 'react';
import { CellProps, DefaultSortTypes, HeaderProps, IdType, Renderer, SortByFn } from 'react-table';

export interface Column<TableData extends object> {
  /**

@@ -11,9 +11,9 @@ export interface Column<TableData extends object> {

   */
  cell?: (props: CellProps<TableData>) => ReactNode;
  /**
   * Header name. Can be a string, renderer function, or undefined. If `undefined` the header will be empty. Useful for action columns.
   */
  header?: Renderer<HeaderProps<TableData>>;
  /**
   * Column sort type. If `undefined` the column will not be sortable.
   */
@@ -76,21 +76,27 @@ func (hs *HTTPServer) CreateDashboardSnapshot(c *contextmodel.ReqContext) {

        return
    }

    cfg := snapshot.SnapshotSharingOptions{
        SnapshotsEnabled:     hs.Cfg.SnapshotEnabled,
        ExternalEnabled:      hs.Cfg.ExternalEnabled,
        ExternalSnapshotName: hs.Cfg.ExternalSnapshotName,
        ExternalSnapshotURL:  hs.Cfg.ExternalSnapshotUrl,
    }

    if hs.Cfg.SnapshotPublicMode {
        // Public mode: no user or dashboard validation needed
        dashboardsnapshots.CreateDashboardSnapshotPublic(c, cfg, cmd, hs.dashboardsnapshotsService)
        return
    }

    // Regular mode: check permissions
    evaluator := ac.EvalAll(ac.EvalPermission(dashboards.ActionSnapshotsCreate), ac.EvalPermission(dashboards.ActionDashboardsRead, dashboards.ScopeDashboardsProvider.GetResourceScopeUID(cmd.Dashboard.GetNestedString("uid"))))
    if canSave, err := hs.AccessControl.Evaluate(c.Req.Context(), c.SignedInUser, evaluator); err != nil || !canSave {
        c.JsonApiErr(http.StatusForbidden, "forbidden", err)
        return
    }

    dashboardsnapshots.CreateDashboardSnapshot(c, cfg, cmd, hs.dashboardsnapshotsService)
}

// GET /api/snapshots/:key
@@ -213,13 +219,6 @@ func (hs *HTTPServer) DeleteDashboardSnapshot(c *contextmodel.ReqContext) respon

        return response.Error(http.StatusUnauthorized, "OrgID mismatch", nil)
    }

    // Dashboard can be empty (creation error or external snapshot). This means that the mustInt here returns a 0,
    // which before RBAC would result in a dashboard which has no ACL. A dashboard without an ACL would fallback
    // to the user’s org role, which for editors and admins would essentially always be allowed here. With RBAC,

@@ -239,6 +238,13 @@ func (hs *HTTPServer) DeleteDashboardSnapshot(c *contextmodel.ReqContext) respon

        }
    }

    if queryResult.External {
        err := dashboardsnapshots.DeleteExternalDashboardSnapshot(queryResult.ExternalDeleteURL)
        if err != nil {
            return response.Error(http.StatusInternalServerError, "Failed to delete external dashboard", err)
        }
    }

    cmd := &dashboardsnapshots.DeleteDashboardSnapshotCommand{DeleteKey: queryResult.DeleteKey}

    if err := hs.dashboardsnapshotsService.DeleteDashboardSnapshot(c.Req.Context(), cmd); err != nil {
@@ -32,6 +32,8 @@ import (

var (
    logger = glog.New("data-proxy-log")
    client = newHTTPClient()

    errPluginProxyRouteAccessDenied = errors.New("plugin proxy route access denied")
)

type DataSourceProxy struct {

@@ -308,12 +310,21 @@ func (proxy *DataSourceProxy) validateRequest() error {

        if err != nil {
            return err
        }
        // issues/116273: When we have an empty input route (or input that becomes relative to "."), we do not want it
        // to be ".". This is because the `CleanRelativePath` function will never return "./" prefixes, and as such,
        // the common prefix we need is an empty string.
        if r1 == "." && proxy.proxyPath != "." {
            r1 = ""
        }
        if r2 == "." && route.Path != "." {
            r2 = ""
        }
        if !strings.HasPrefix(r1, r2) {
            continue
        }

        if !proxy.hasAccessToRoute(route) {
            return errPluginProxyRouteAccessDenied
        }

        proxy.matchedRoute = route
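The edge case this guard addresses is a general property of path-cleaning functions: cleaning an empty path yields `"."`, so a prefix check against a cleaned empty route would match almost nothing instead of everything. Node's `path.posix.normalize` exhibits the same behavior as Go's `path.Clean` (which `CleanRelativePath` presumably builds on) and makes for a quick illustration:

```typescript
import { posix } from 'node:path';

// Normalizing an empty relative path yields ".", so a naive
// "does the request path start with the cleaned route path?" check breaks:
// "api/v2/secrets".startsWith(".") is false, even though an empty configured
// route is meant to act as a catch-all fallback.
console.log(posix.normalize('')); // "."
console.log('api/v2/secrets'.startsWith('.')); // false
// Mapping "." back to "" restores the intended "empty prefix matches all":
console.log('api/v2/secrets'.startsWith('')); // true
```

This is why the fix rewrites `r1` and `r2` from `"."` to `""` whenever the original, uncleaned path was not itself `"."`.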
@@ -673,6 +673,94 @@ func TestIntegrationDataSourceProxy_routeRule(t *testing.T) {

            runDatasourceAuthTest(t, secretsService, secretsStore, cfg, test)
        }
    })

    t.Run("Regression of 116273: Fallback routes should apply fallback route roles", func(t *testing.T) {
        for _, tc := range []struct {
            InputPath         string
            ConfigurationPath string
            ExpectError       bool
        }{
            {
                InputPath:         "api/v2/leak-ur-secrets",
                ConfigurationPath: "",
                ExpectError:       true,
            },
            {
                InputPath:         "",
                ConfigurationPath: "",
                ExpectError:       true,
            },
            {
                InputPath:         ".",
                ConfigurationPath: ".",
                ExpectError:       true,
            },
            {
                InputPath:         "",
                ConfigurationPath: ".",
                ExpectError:       false,
            },
            {
                InputPath:         "api",
                ConfigurationPath: ".",
                ExpectError:       false,
            },
        } {
            orEmptyStr := func(s string) string {
                if s == "" {
                    return "<empty>"
                }
                return s
            }
            t.Run(
                fmt.Sprintf("with inputPath=%s, configurationPath=%s, expectError=%v",
                    orEmptyStr(tc.InputPath), orEmptyStr(tc.ConfigurationPath), tc.ExpectError),
                func(t *testing.T) {
                    ds := &datasources.DataSource{
                        UID:      "dsUID",
                        JsonData: simplejson.New(),
                    }
                    routes := []*plugins.Route{
                        {
                            Path:    tc.ConfigurationPath,
                            ReqRole: org.RoleAdmin,
                            Method:  "GET",
                        },
                        {
                            Path:    tc.ConfigurationPath,
                            ReqRole: org.RoleAdmin,
                            Method:  "POST",
                        },
                        {
                            Path:    tc.ConfigurationPath,
                            ReqRole: org.RoleAdmin,
                            Method:  "PUT",
                        },
                        {
                            Path:    tc.ConfigurationPath,
                            ReqRole: org.RoleAdmin,
                            Method:  "DELETE",
                        },
                    }

                    req, err := http.NewRequestWithContext(t.Context(), "GET", "http://localhost/"+tc.InputPath, nil)
                    require.NoError(t, err, "failed to create HTTP request")
                    ctx := &contextmodel.ReqContext{
                        Context:      &web.Context{Req: req},
                        SignedInUser: &user.SignedInUser{OrgRole: org.RoleViewer},
                    }
                    proxy, err := setupDSProxyTest(t, ctx, ds, routes, tc.InputPath)
                    require.NoError(t, err, "failed to setup proxy test")
                    err = proxy.validateRequest()
                    if tc.ExpectError {
                        require.ErrorIs(t, err, errPluginProxyRouteAccessDenied, "request was not denied due to access denied?")
                    } else {
                        require.NoError(t, err, "request was unexpectedly denied access")
                    }
                },
            )
        }
    })
}

// test DataSourceProxy request handling.
|||||||
@@ -36,6 +36,9 @@ var client = &http.Client{
 	Transport: &http.Transport{Proxy: http.ProxyFromEnvironment},
 }
 
+// CreateDashboardSnapshot creates a snapshot when running Grafana in regular mode.
+// It validates the user and dashboard exist before creating the snapshot.
+// This mode supports both local and external snapshots.
 func CreateDashboardSnapshot(c *contextmodel.ReqContext, cfg snapshot.SnapshotSharingOptions, cmd CreateDashboardSnapshotCommand, svc Service) {
 	if !cfg.SnapshotsEnabled {
 		c.JsonApiErr(http.StatusForbidden, "Dashboard Snapshots are disabled", nil)
@@ -43,6 +46,7 @@ func CreateDashboardSnapshot(c *contextmodel.ReqContext, cfg snapshot.SnapshotSh
 	}
 
 	uid := cmd.Dashboard.GetNestedString("uid")
+
 	user, err := identity.GetRequester(c.Req.Context())
 	if err != nil {
 		c.JsonApiErr(http.StatusBadRequest, "missing user in context", nil)
@@ -59,21 +63,18 @@ func CreateDashboardSnapshot(c *contextmodel.ReqContext, cfg snapshot.SnapshotSh
 		return
 	}
 
+	cmd.ExternalURL = ""
+	cmd.OrgID = user.GetOrgID()
+	cmd.UserID, _ = identity.UserIdentifier(user.GetID())
+
 	if cmd.Name == "" {
 		cmd.Name = "Unnamed snapshot"
 	}
 
-	var snapshotUrl string
-	cmd.ExternalURL = ""
-	cmd.OrgID = user.GetOrgID()
-	cmd.UserID, _ = identity.UserIdentifier(user.GetID())
-	originalDashboardURL, err := createOriginalDashboardURL(&cmd)
-	if err != nil {
-		c.JsonApiErr(http.StatusInternalServerError, "Invalid app URL", err)
-		return
-	}
+	var snapshotURL string
 
 	if cmd.External {
+		// Handle external snapshot creation
 		if !cfg.ExternalEnabled {
 			c.JsonApiErr(http.StatusForbidden, "External dashboard creation is disabled", nil)
 			return
@@ -85,40 +86,83 @@ func CreateDashboardSnapshot(c *contextmodel.ReqContext, cfg snapshot.SnapshotSh
 			return
 		}
 
-		snapshotUrl = resp.Url
 		cmd.Key = resp.Key
 		cmd.DeleteKey = resp.DeleteKey
 		cmd.ExternalURL = resp.Url
 		cmd.ExternalDeleteURL = resp.DeleteUrl
 		cmd.Dashboard = &common.Unstructured{}
+		snapshotURL = resp.Url
 
 		metrics.MApiDashboardSnapshotExternal.Inc()
 	} else {
-		cmd.Dashboard.SetNestedField(originalDashboardURL, "snapshot", "originalUrl")
-
-		if cmd.Key == "" {
-			var err error
-			cmd.Key, err = util.GetRandomString(32)
-			if err != nil {
-				c.JsonApiErr(http.StatusInternalServerError, "Could not generate random string", err)
-				return
-			}
-		}
-
-		if cmd.DeleteKey == "" {
-			var err error
-			cmd.DeleteKey, err = util.GetRandomString(32)
-			if err != nil {
-				c.JsonApiErr(http.StatusInternalServerError, "Could not generate random string", err)
-				return
-			}
-		}
-
-		snapshotUrl = setting.ToAbsUrl("dashboard/snapshot/" + cmd.Key)
+		// Handle local snapshot creation
+		originalDashboardURL, err := createOriginalDashboardURL(&cmd)
+		if err != nil {
+			c.JsonApiErr(http.StatusInternalServerError, "Invalid app URL", err)
+			return
+		}
+
+		snapshotURL, err = prepareLocalSnapshot(&cmd, originalDashboardURL)
+		if err != nil {
+			c.JsonApiErr(http.StatusInternalServerError, "Could not generate random string", err)
+			return
+		}
 
 		metrics.MApiDashboardSnapshotCreate.Inc()
 	}
 
+	saveAndRespond(c, svc, cmd, snapshotURL)
+}
+
+// CreateDashboardSnapshotPublic creates a snapshot when running Grafana in public mode.
+// In public mode, there is no user or dashboard information to validate.
+// Only local snapshots are supported (external snapshots are not available).
+func CreateDashboardSnapshotPublic(c *contextmodel.ReqContext, cfg snapshot.SnapshotSharingOptions, cmd CreateDashboardSnapshotCommand, svc Service) {
+	if !cfg.SnapshotsEnabled {
+		c.JsonApiErr(http.StatusForbidden, "Dashboard Snapshots are disabled", nil)
+		return
+	}
+
+	if cmd.Name == "" {
+		cmd.Name = "Unnamed snapshot"
+	}
+
+	snapshotURL, err := prepareLocalSnapshot(&cmd, "")
+	if err != nil {
+		c.JsonApiErr(http.StatusInternalServerError, "Could not generate random string", err)
+		return
+	}
+
+	metrics.MApiDashboardSnapshotCreate.Inc()
+
+	saveAndRespond(c, svc, cmd, snapshotURL)
+}
+
+// prepareLocalSnapshot prepares the command for a local snapshot and returns the snapshot URL.
+func prepareLocalSnapshot(cmd *CreateDashboardSnapshotCommand, originalDashboardURL string) (string, error) {
+	cmd.Dashboard.SetNestedField(originalDashboardURL, "snapshot", "originalUrl")
+
+	if cmd.Key == "" {
+		key, err := util.GetRandomString(32)
+		if err != nil {
+			return "", err
+		}
+		cmd.Key = key
+	}
+
+	if cmd.DeleteKey == "" {
+		deleteKey, err := util.GetRandomString(32)
+		if err != nil {
+			return "", err
+		}
+		cmd.DeleteKey = deleteKey
+	}
+
+	return setting.ToAbsUrl("dashboard/snapshot/" + cmd.Key), nil
+}
+
+// saveAndRespond saves the snapshot and sends the response.
+func saveAndRespond(c *contextmodel.ReqContext, svc Service, cmd CreateDashboardSnapshotCommand, snapshotURL string) {
 	result, err := svc.CreateDashboardSnapshot(c.Req.Context(), &cmd)
 	if err != nil {
 		c.JsonApiErr(http.StatusInternalServerError, "Failed to create snapshot", err)
@@ -128,7 +172,7 @@ func CreateDashboardSnapshot(c *contextmodel.ReqContext, cfg snapshot.SnapshotSh
 	c.JSON(http.StatusOK, snapshot.DashboardCreateResponse{
 		Key:       result.Key,
 		DeleteKey: result.DeleteKey,
-		URL:       snapshotUrl,
+		URL:       snapshotURL,
 		DeleteURL: setting.ToAbsUrl("api/snapshots-delete/" + result.DeleteKey),
 	})
 }
@@ -20,40 +20,30 @@ import (
 	"github.com/grafana/grafana/pkg/web"
 )
 
-func TestCreateDashboardSnapshot_DashboardNotFound(t *testing.T) {
-	mockService := &MockService{}
-	cfg := snapshot.SnapshotSharingOptions{
-		SnapshotsEnabled: true,
-		ExternalEnabled:  false,
+func createTestDashboard(t *testing.T) *common.Unstructured {
+	t.Helper()
+	dashboard := &common.Unstructured{}
+	dashboardData := map[string]any{
+		"uid": "test-dashboard-uid",
+		"id":  123,
 	}
-	testUser := &user.SignedInUser{
+	dashboardBytes, _ := json.Marshal(dashboardData)
+	_ = json.Unmarshal(dashboardBytes, dashboard)
+	return dashboard
+}
+
+func createTestUser() *user.SignedInUser {
+	return &user.SignedInUser{
 		UserID: 1,
 		OrgID:  1,
 		Login:  "testuser",
 		Name:   "Test User",
 		Email:  "test@example.com",
 	}
-	dashboard := &common.Unstructured{}
-	dashboardData := map[string]interface{}{
-		"uid": "test-dashboard-uid",
-		"id":  123,
-	}
-	dashboardBytes, _ := json.Marshal(dashboardData)
-	_ = json.Unmarshal(dashboardBytes, dashboard)
-
-	cmd := CreateDashboardSnapshotCommand{
-		DashboardCreateCommand: snapshot.DashboardCreateCommand{
-			Dashboard: dashboard,
-			Name:      "Test Snapshot",
-		},
-	}
-
-	mockService.On("ValidateDashboardExists", mock.Anything, int64(1), "test-dashboard-uid").
-		Return(dashboards.ErrDashboardNotFound)
-
-	req, _ := http.NewRequest("POST", "/api/snapshots", nil)
-	req = req.WithContext(identity.WithRequester(req.Context(), testUser))
-
+}
+
+func createReqContext(t *testing.T, req *http.Request, testUser *user.SignedInUser) (*contextmodel.ReqContext, *httptest.ResponseRecorder) {
+	t.Helper()
 	recorder := httptest.NewRecorder()
 	ctx := &contextmodel.ReqContext{
 		Context: &web.Context{
@@ -63,13 +53,319 @@ func TestCreateDashboardSnapshot_DashboardNotFound(t *testing.T) {
 		SignedInUser: testUser,
 		Logger:       log.NewNopLogger(),
 	}
-
-	CreateDashboardSnapshot(ctx, cfg, cmd, mockService)
-
-	mockService.AssertExpectations(t)
-	assert.Equal(t, http.StatusBadRequest, recorder.Code)
-	var response map[string]interface{}
-	err := json.Unmarshal(recorder.Body.Bytes(), &response)
-	require.NoError(t, err)
-	assert.Equal(t, "Dashboard not found", response["message"])
+	return ctx, recorder
+}
+
+// TestCreateDashboardSnapshot tests snapshot creation in regular mode (non-public instance).
+// These tests cover scenarios when Grafana is running as a regular server with user authentication.
+func TestCreateDashboardSnapshot(t *testing.T) {
+	t.Run("should return error when dashboard not found", func(t *testing.T) {
+		mockService := &MockService{}
+		cfg := snapshot.SnapshotSharingOptions{
+			SnapshotsEnabled: true,
+			ExternalEnabled:  false,
+		}
+		testUser := createTestUser()
+		dashboard := createTestDashboard(t)
+
+		cmd := CreateDashboardSnapshotCommand{
+			DashboardCreateCommand: snapshot.DashboardCreateCommand{
+				Dashboard: dashboard,
+				Name:      "Test Snapshot",
+			},
+		}
+
+		mockService.On("ValidateDashboardExists", mock.Anything, int64(1), "test-dashboard-uid").
+			Return(dashboards.ErrDashboardNotFound)
+
+		req, _ := http.NewRequest("POST", "/api/snapshots", nil)
+		req = req.WithContext(identity.WithRequester(req.Context(), testUser))
+		ctx, recorder := createReqContext(t, req, testUser)
+
+		CreateDashboardSnapshot(ctx, cfg, cmd, mockService)
+
+		mockService.AssertExpectations(t)
+		assert.Equal(t, http.StatusBadRequest, recorder.Code)
+		var response map[string]any
+		err := json.Unmarshal(recorder.Body.Bytes(), &response)
+		require.NoError(t, err)
+		assert.Equal(t, "Dashboard not found", response["message"])
+	})
+
+	t.Run("should create external snapshot when external is enabled", func(t *testing.T) {
+		externalServer := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
+			assert.Equal(t, "/api/snapshots", r.URL.Path)
+			assert.Equal(t, "POST", r.Method)
+
+			response := map[string]any{
+				"key":       "external-key",
+				"deleteKey": "external-delete-key",
+				"url":       "https://external.example.com/dashboard/snapshot/external-key",
+				"deleteUrl": "https://external.example.com/api/snapshots-delete/external-delete-key",
+			}
+			w.Header().Set("Content-Type", "application/json")
+			_ = json.NewEncoder(w).Encode(response)
+		}))
+		defer externalServer.Close()
+
+		mockService := NewMockService(t)
+		cfg := snapshot.SnapshotSharingOptions{
+			SnapshotsEnabled:    true,
+			ExternalEnabled:     true,
+			ExternalSnapshotURL: externalServer.URL,
+		}
+		testUser := createTestUser()
+		dashboard := createTestDashboard(t)
+
+		cmd := CreateDashboardSnapshotCommand{
+			DashboardCreateCommand: snapshot.DashboardCreateCommand{
+				Dashboard: dashboard,
+				Name:      "Test External Snapshot",
+				External:  true,
+			},
+		}
+
+		mockService.On("ValidateDashboardExists", mock.Anything, int64(1), "test-dashboard-uid").
+			Return(nil)
+		mockService.On("CreateDashboardSnapshot", mock.Anything, mock.Anything).
+			Return(&DashboardSnapshot{
+				Key:       "external-key",
+				DeleteKey: "external-delete-key",
+			}, nil)
+
+		req, _ := http.NewRequest("POST", "/api/snapshots", nil)
+		req = req.WithContext(identity.WithRequester(req.Context(), testUser))
+		ctx, recorder := createReqContext(t, req, testUser)
+
+		CreateDashboardSnapshot(ctx, cfg, cmd, mockService)
+
+		mockService.AssertExpectations(t)
+		assert.Equal(t, http.StatusOK, recorder.Code)
+
+		var response map[string]any
+		err := json.Unmarshal(recorder.Body.Bytes(), &response)
+		require.NoError(t, err)
+		assert.Equal(t, "external-key", response["key"])
+		assert.Equal(t, "external-delete-key", response["deleteKey"])
+		assert.Equal(t, "https://external.example.com/dashboard/snapshot/external-key", response["url"])
+	})
+
+	t.Run("should return forbidden when external is disabled", func(t *testing.T) {
+		mockService := NewMockService(t)
+		cfg := snapshot.SnapshotSharingOptions{
+			SnapshotsEnabled: true,
+			ExternalEnabled:  false,
+		}
+		testUser := createTestUser()
+		dashboard := createTestDashboard(t)
+
+		cmd := CreateDashboardSnapshotCommand{
+			DashboardCreateCommand: snapshot.DashboardCreateCommand{
+				Dashboard: dashboard,
+				Name:      "Test External Snapshot",
+				External:  true,
+			},
+		}
+
+		mockService.On("ValidateDashboardExists", mock.Anything, int64(1), "test-dashboard-uid").
+			Return(nil)
+
+		req, _ := http.NewRequest("POST", "/api/snapshots", nil)
+		req = req.WithContext(identity.WithRequester(req.Context(), testUser))
+		ctx, recorder := createReqContext(t, req, testUser)
+
+		CreateDashboardSnapshot(ctx, cfg, cmd, mockService)
+
+		mockService.AssertExpectations(t)
+		assert.Equal(t, http.StatusForbidden, recorder.Code)
+
+		var response map[string]any
+		err := json.Unmarshal(recorder.Body.Bytes(), &response)
+		require.NoError(t, err)
+		assert.Equal(t, "External dashboard creation is disabled", response["message"])
+	})
+
+	t.Run("should create local snapshot", func(t *testing.T) {
+		mockService := NewMockService(t)
+		cfg := snapshot.SnapshotSharingOptions{
+			SnapshotsEnabled: true,
+		}
+		testUser := createTestUser()
+		dashboard := createTestDashboard(t)
+
+		cmd := CreateDashboardSnapshotCommand{
+			DashboardCreateCommand: snapshot.DashboardCreateCommand{
+				Dashboard: dashboard,
+				Name:      "Test Local Snapshot",
+			},
+			Key:       "local-key",
+			DeleteKey: "local-delete-key",
+		}
+
+		mockService.On("ValidateDashboardExists", mock.Anything, int64(1), "test-dashboard-uid").
+			Return(nil)
+		mockService.On("CreateDashboardSnapshot", mock.Anything, mock.Anything).
+			Return(&DashboardSnapshot{
+				Key:       "local-key",
+				DeleteKey: "local-delete-key",
+			}, nil)
+
+		req, _ := http.NewRequest("POST", "/api/snapshots", nil)
+		req = req.WithContext(identity.WithRequester(req.Context(), testUser))
+		ctx, recorder := createReqContext(t, req, testUser)
+
+		CreateDashboardSnapshot(ctx, cfg, cmd, mockService)
+
+		mockService.AssertExpectations(t)
+		assert.Equal(t, http.StatusOK, recorder.Code)
+
+		var response map[string]any
+		err := json.Unmarshal(recorder.Body.Bytes(), &response)
+		require.NoError(t, err)
+		assert.Equal(t, "local-key", response["key"])
+		assert.Equal(t, "local-delete-key", response["deleteKey"])
+		assert.Contains(t, response["url"], "dashboard/snapshot/local-key")
+		assert.Contains(t, response["deleteUrl"], "api/snapshots-delete/local-delete-key")
+	})
+}
+
+// TestCreateDashboardSnapshotPublic tests snapshot creation in public mode.
+// These tests cover scenarios when Grafana is running as a public snapshot server
+// where no user authentication or dashboard validation is required.
+func TestCreateDashboardSnapshotPublic(t *testing.T) {
+	t.Run("should create local snapshot without user context", func(t *testing.T) {
+		mockService := NewMockService(t)
+		cfg := snapshot.SnapshotSharingOptions{
+			SnapshotsEnabled: true,
+		}
+		dashboard := createTestDashboard(t)
+
+		cmd := CreateDashboardSnapshotCommand{
+			DashboardCreateCommand: snapshot.DashboardCreateCommand{
+				Dashboard: dashboard,
+				Name:      "Test Snapshot",
+			},
+			Key:       "test-key",
+			DeleteKey: "test-delete-key",
+		}
+
+		mockService.On("CreateDashboardSnapshot", mock.Anything, mock.Anything).
+			Return(&DashboardSnapshot{
+				Key:       "test-key",
+				DeleteKey: "test-delete-key",
+			}, nil)
+
+		req, _ := http.NewRequest("POST", "/api/snapshots", nil)
+		recorder := httptest.NewRecorder()
+		ctx := &contextmodel.ReqContext{
+			Context: &web.Context{
+				Req:  req,
+				Resp: web.NewResponseWriter("POST", recorder),
+			},
+			Logger: log.NewNopLogger(),
+		}
+
+		CreateDashboardSnapshotPublic(ctx, cfg, cmd, mockService)
+
+		mockService.AssertExpectations(t)
+		assert.Equal(t, http.StatusOK, recorder.Code)
+
+		var response map[string]any
+		err := json.Unmarshal(recorder.Body.Bytes(), &response)
+		require.NoError(t, err)
+		assert.Equal(t, "test-key", response["key"])
+		assert.Equal(t, "test-delete-key", response["deleteKey"])
+		assert.Contains(t, response["url"], "dashboard/snapshot/test-key")
+		assert.Contains(t, response["deleteUrl"], "api/snapshots-delete/test-delete-key")
+	})
+
+	t.Run("should return forbidden when snapshots are disabled", func(t *testing.T) {
+		mockService := NewMockService(t)
+		cfg := snapshot.SnapshotSharingOptions{
+			SnapshotsEnabled: false,
+		}
+		dashboard := createTestDashboard(t)
+
+		cmd := CreateDashboardSnapshotCommand{
+			DashboardCreateCommand: snapshot.DashboardCreateCommand{
+				Dashboard: dashboard,
+				Name:      "Test Snapshot",
+			},
+		}
+
+		req, _ := http.NewRequest("POST", "/api/snapshots", nil)
+		recorder := httptest.NewRecorder()
+		ctx := &contextmodel.ReqContext{
+			Context: &web.Context{
+				Req:  req,
+				Resp: web.NewResponseWriter("POST", recorder),
+			},
+			Logger: log.NewNopLogger(),
+		}
+
+		CreateDashboardSnapshotPublic(ctx, cfg, cmd, mockService)
+
+		assert.Equal(t, http.StatusForbidden, recorder.Code)
+
+		var response map[string]any
+		err := json.Unmarshal(recorder.Body.Bytes(), &response)
+		require.NoError(t, err)
+		assert.Equal(t, "Dashboard Snapshots are disabled", response["message"])
+	})
+}
+
+// TestDeleteExternalDashboardSnapshot tests deletion of external snapshots.
+// This function is called in public mode and doesn't require user context.
+func TestDeleteExternalDashboardSnapshot(t *testing.T) {
+	t.Run("should return nil on successful deletion", func(t *testing.T) {
+		server := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
+			assert.Equal(t, "GET", r.Method)
+			w.WriteHeader(http.StatusOK)
+		}))
+		defer server.Close()
+
+		err := DeleteExternalDashboardSnapshot(server.URL)
+		assert.NoError(t, err)
+	})
+
+	t.Run("should gracefully handle already deleted snapshot", func(t *testing.T) {
+		server := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
+			w.WriteHeader(http.StatusInternalServerError)
+			response := map[string]any{
+				"message": "Failed to get dashboard snapshot",
+			}
+			_ = json.NewEncoder(w).Encode(response)
+		}))
+		defer server.Close()
+
+		err := DeleteExternalDashboardSnapshot(server.URL)
+		assert.NoError(t, err)
+	})
+
+	t.Run("should return error on unexpected status code", func(t *testing.T) {
+		server := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
+			w.WriteHeader(http.StatusNotFound)
+		}))
+		defer server.Close()
+
+		err := DeleteExternalDashboardSnapshot(server.URL)
+		assert.Error(t, err)
+		assert.Contains(t, err.Error(), "unexpected response when deleting external snapshot")
+		assert.Contains(t, err.Error(), "404")
+	})
+
+	t.Run("should return error on 500 with different message", func(t *testing.T) {
+		server := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
+			w.WriteHeader(http.StatusInternalServerError)
+			response := map[string]any{
+				"message": "Some other error",
+			}
+			_ = json.NewEncoder(w).Encode(response)
+		}))
+		defer server.Close()
+
+		err := DeleteExternalDashboardSnapshot(server.URL)
+		assert.Error(t, err)
+		assert.Contains(t, err.Error(), "500")
+	})
 }
@@ -14,6 +14,7 @@ import (
 	"github.com/grafana/grafana/pkg/apimachinery/validation"
 	"github.com/grafana/grafana/pkg/storage/unified/sql/db"
 	"github.com/grafana/grafana/pkg/storage/unified/sql/dbutil"
+	"github.com/grafana/grafana/pkg/storage/unified/sql/rvmanager"
 	"github.com/grafana/grafana/pkg/storage/unified/sql/sqltemplate"
 	gocache "github.com/patrickmn/go-cache"
 )
@@ -868,10 +869,18 @@ func (d *dataStore) applyBackwardsCompatibleChanges(ctx context.Context, tx db.T
 	if key.Action == DataActionDeleted {
 		generation = 0
 	}
+
+	// In compatibility mode, the previous RV, when available, is saved as a microsecond
+	// timestamp, as is done in the SQL backend.
+	previousRV := event.PreviousRV
+	if event.PreviousRV > 0 && isSnowflake(event.PreviousRV) {
+		previousRV = rvmanager.RVFromSnowflake(event.PreviousRV)
+	}
+
 	_, err := dbutil.Exec(ctx, tx, sqlKVUpdateLegacyResourceHistory, sqlKVLegacyUpdateHistoryRequest{
 		SQLTemplate: sqltemplate.New(kv.dialect),
 		GUID:        key.GUID,
-		PreviousRV:  event.PreviousRV,
+		PreviousRV:  previousRV,
 		Generation:  generation,
 	})
 
@@ -900,7 +909,7 @@ func (d *dataStore) applyBackwardsCompatibleChanges(ctx context.Context, tx db.T
 		Name:       key.Name,
 		Action:     action,
 		Folder:     key.Folder,
-		PreviousRV: event.PreviousRV,
+		PreviousRV: previousRV,
 	})
 
 	if err != nil {
@@ -916,7 +925,7 @@ func (d *dataStore) applyBackwardsCompatibleChanges(ctx context.Context, tx db.T
 		Name:       key.Name,
 		Action:     action,
 		Folder:     key.Folder,
-		PreviousRV: event.PreviousRV,
+		PreviousRV: previousRV,
 	})
 
 	if err != nil {
@@ -938,3 +947,15 @@ func (d *dataStore) applyBackwardsCompatibleChanges(ctx context.Context, tx db.T
 
 	return nil
 }
+
+// isSnowflake returns whether the argument passed is a snowflake ID (new) or a microsecond timestamp (old).
+// We try to interpret the number as a microsecond timestamp first. If it represents a time in the past,
+// it is considered a microsecond timestamp. Snowflake IDs are much larger integers and would lead
+// to dates in the future if interpreted as a microsecond timestamp.
+func isSnowflake(rv int64) bool {
+	ts := time.UnixMicro(rv)
+	oneHourFromNow := time.Now().Add(time.Hour)
+	isMicroSecRV := ts.Before(oneHourFromNow)
+
+	return !isMicroSecRV
+}
@@ -19,13 +19,18 @@ const (
 	defaultBufferSize = 10000
 )
 
-type notifier struct {
+type notifier interface {
+	Watch(context.Context, watchOptions) <-chan Event
+}
+
+type pollingNotifier struct {
 	eventStore *eventStore
 	log        logging.Logger
 }
 
 type notifierOptions struct {
 	log logging.Logger
+	useChannelNotifier bool
 }
 
 type watchOptions struct {
@@ -44,15 +49,26 @@ func defaultWatchOptions() watchOptions {
 	}
 }
 
-func newNotifier(eventStore *eventStore, opts notifierOptions) *notifier {
+func newNotifier(eventStore *eventStore, opts notifierOptions) notifier {
 	if opts.log == nil {
 		opts.log = &logging.NoOpLogger{}
 	}
-	return &notifier{eventStore: eventStore, log: opts.log}
+
+	if opts.useChannelNotifier {
+		return &channelNotifier{}
+	}
+
+	return &pollingNotifier{eventStore: eventStore, log: opts.log}
+}
+
+type channelNotifier struct{}
+
+func (cn *channelNotifier) Watch(ctx context.Context, opts watchOptions) <-chan Event {
+	return nil
 }
 
 // Return the last resource version from the event store
-func (n *notifier) lastEventResourceVersion(ctx context.Context) (int64, error) {
+func (n *pollingNotifier) lastEventResourceVersion(ctx context.Context) (int64, error) {
 	e, err := n.eventStore.LastEventKey(ctx)
 	if err != nil {
 		return 0, err
@@ -60,11 +76,11 @@ func (n *notifier) lastEventResourceVersion(ctx context.Context) (int64, error)
 	return e.ResourceVersion, nil
 }
 
-func (n *notifier) cacheKey(evt Event) string {
+func (n *pollingNotifier) cacheKey(evt Event) string {
 	return fmt.Sprintf("%s~%s~%s~%s~%d", evt.Namespace, evt.Group, evt.Resource, evt.Name, evt.ResourceVersion)
 }
 
-func (n *notifier) Watch(ctx context.Context, opts watchOptions) <-chan Event {
+func (n *pollingNotifier) Watch(ctx context.Context, opts watchOptions) <-chan Event {
 	if opts.MinBackoff <= 0 {
 		opts.MinBackoff = defaultMinBackoff
 	}
```diff
@@ -13,7 +13,7 @@ import (
 	"github.com/stretchr/testify/require"
 )
 
-func setupTestNotifier(t *testing.T) (*notifier, *eventStore) {
+func setupTestNotifier(t *testing.T) (*pollingNotifier, *eventStore) {
 	db := setupTestBadgerDB(t)
 	t.Cleanup(func() {
 		err := db.Close()
@@ -22,10 +22,10 @@ func setupTestNotifier(t *testing.T) (*notifier, *eventStore) {
 	kv := NewBadgerKV(db)
 	eventStore := newEventStore(kv)
 	notifier := newNotifier(eventStore, notifierOptions{log: &logging.NoOpLogger{}})
-	return notifier, eventStore
+	return notifier.(*pollingNotifier), eventStore
 }
 
-func setupTestNotifierSqlKv(t *testing.T) (*notifier, *eventStore) {
+func setupTestNotifierSqlKv(t *testing.T) (*pollingNotifier, *eventStore) {
 	dbstore := db.InitTestDB(t)
 	eDB, err := dbimpl.ProvideResourceDB(dbstore, setting.NewCfg(), nil)
 	require.NoError(t, err)
@@ -33,7 +33,7 @@ func setupTestNotifierSqlKv(t *testing.T) (*notifier, *eventStore) {
 	require.NoError(t, err)
 	eventStore := newEventStore(kv)
 	notifier := newNotifier(eventStore, notifierOptions{log: &logging.NoOpLogger{}})
-	return notifier, eventStore
+	return notifier.(*pollingNotifier), eventStore
 }
 
 func TestNewNotifier(t *testing.T) {
```
```diff
@@ -49,7 +49,7 @@ func TestDefaultWatchOptions(t *testing.T) {
 	assert.Equal(t, defaultBufferSize, opts.BufferSize)
 }
 
-func runNotifierTestWith(t *testing.T, storeName string, newStoreFn func(*testing.T) (*notifier, *eventStore), testFn func(*testing.T, context.Context, *notifier, *eventStore)) {
+func runNotifierTestWith(t *testing.T, storeName string, newStoreFn func(*testing.T) (*pollingNotifier, *eventStore), testFn func(*testing.T, context.Context, *pollingNotifier, *eventStore)) {
 	t.Run(storeName, func(t *testing.T) {
 		ctx := context.Background()
 		notifier, eventStore := newStoreFn(t)
@@ -62,7 +62,7 @@ func TestNotifier_lastEventResourceVersion(t *testing.T) {
 	runNotifierTestWith(t, "sqlkv", setupTestNotifierSqlKv, testNotifierLastEventResourceVersion)
 }
 
-func testNotifierLastEventResourceVersion(t *testing.T, ctx context.Context, notifier *notifier, eventStore *eventStore) {
+func testNotifierLastEventResourceVersion(t *testing.T, ctx context.Context, notifier *pollingNotifier, eventStore *eventStore) {
 	// Test with no events
 	rv, err := notifier.lastEventResourceVersion(ctx)
 	assert.Error(t, err)
@@ -113,7 +113,7 @@ func TestNotifier_cachekey(t *testing.T) {
 	runNotifierTestWith(t, "sqlkv", setupTestNotifierSqlKv, testNotifierCachekey)
 }
 
-func testNotifierCachekey(t *testing.T, ctx context.Context, notifier *notifier, eventStore *eventStore) {
+func testNotifierCachekey(t *testing.T, ctx context.Context, notifier *pollingNotifier, eventStore *eventStore) {
 	tests := []struct {
 		name  string
 		event Event
@@ -167,7 +167,7 @@ func TestNotifier_Watch_NoEvents(t *testing.T) {
 	runNotifierTestWith(t, "sqlkv", setupTestNotifierSqlKv, testNotifierWatchNoEvents)
 }
 
-func testNotifierWatchNoEvents(t *testing.T, ctx context.Context, notifier *notifier, eventStore *eventStore) {
+func testNotifierWatchNoEvents(t *testing.T, ctx context.Context, notifier *pollingNotifier, eventStore *eventStore) {
 	ctx, cancel := context.WithTimeout(ctx, 500*time.Millisecond)
 	defer cancel()
 
@@ -208,7 +208,7 @@ func TestNotifier_Watch_WithExistingEvents(t *testing.T) {
 	runNotifierTestWith(t, "sqlkv", setupTestNotifierSqlKv, testNotifierWatchWithExistingEvents)
 }
 
-func testNotifierWatchWithExistingEvents(t *testing.T, ctx context.Context, notifier *notifier, eventStore *eventStore) {
+func testNotifierWatchWithExistingEvents(t *testing.T, ctx context.Context, notifier *pollingNotifier, eventStore *eventStore) {
 	ctx, cancel := context.WithTimeout(ctx, 2*time.Second)
 	defer cancel()
 
@@ -282,7 +282,7 @@ func TestNotifier_Watch_EventDeduplication(t *testing.T) {
 	runNotifierTestWith(t, "sqlkv", setupTestNotifierSqlKv, testNotifierWatchEventDeduplication)
 }
 
-func testNotifierWatchEventDeduplication(t *testing.T, ctx context.Context, notifier *notifier, eventStore *eventStore) {
+func testNotifierWatchEventDeduplication(t *testing.T, ctx context.Context, notifier *pollingNotifier, eventStore *eventStore) {
 	ctx, cancel := context.WithTimeout(ctx, 2*time.Second)
 	defer cancel()
 
@@ -348,7 +348,7 @@ func TestNotifier_Watch_ContextCancellation(t *testing.T) {
 	runNotifierTestWith(t, "sqlkv", setupTestNotifierSqlKv, testNotifierWatchContextCancellation)
 }
 
-func testNotifierWatchContextCancellation(t *testing.T, ctx context.Context, notifier *notifier, eventStore *eventStore) {
+func testNotifierWatchContextCancellation(t *testing.T, ctx context.Context, notifier *pollingNotifier, eventStore *eventStore) {
 	ctx, cancel := context.WithCancel(ctx)
 
 	// Add an initial event so that lastEventResourceVersion doesn't return ErrNotFound
@@ -394,7 +394,7 @@ func TestNotifier_Watch_MultipleEvents(t *testing.T) {
 	runNotifierTestWith(t, "sqlkv", setupTestNotifierSqlKv, testNotifierWatchMultipleEvents)
 }
 
-func testNotifierWatchMultipleEvents(t *testing.T, ctx context.Context, notifier *notifier, eventStore *eventStore) {
+func testNotifierWatchMultipleEvents(t *testing.T, ctx context.Context, notifier *pollingNotifier, eventStore *eventStore) {
 	ctx, cancel := context.WithTimeout(ctx, 3*time.Second)
 	defer cancel()
 	rv := time.Now().UnixNano()
@@ -456,33 +456,27 @@ func testNotifierWatchMultipleEvents(t *testing.T, ctx context.Context, notifier
 		},
 	}
 
+	errCh := make(chan error)
 	go func() {
 		for _, event := range testEvents {
-			err := eventStore.Save(ctx, event)
-			require.NoError(t, err)
+			errCh <- eventStore.Save(ctx, event)
 		}
 	}()
 
 	// Receive events
-	receivedEvents := make([]Event, 0, len(testEvents))
-	for i := 0; i < len(testEvents); i++ {
+	receivedEvents := make([]string, 0, len(testEvents))
+	for len(receivedEvents) != len(testEvents) {
 		select {
 		case event := <-events:
-			receivedEvents = append(receivedEvents, event)
+			receivedEvents = append(receivedEvents, event.Name)
+		case err := <-errCh:
+			require.NoError(t, err)
 		case <-time.After(1 * time.Second):
-			t.Fatalf("Timed out waiting for event %d", i+1)
+			t.Fatalf("Timed out waiting for event %d", len(receivedEvents)+1)
 		}
 	}
 
-	// Verify all events were received
-	assert.Len(t, receivedEvents, len(testEvents))
-
 	// Verify the events match and ordered by resource version
-	receivedNames := make([]string, len(receivedEvents))
-	for i, event := range receivedEvents {
-		receivedNames[i] = event.Name
-	}
-
 	expectedNames := []string{"test-resource-1", "test-resource-2", "test-resource-3"}
-	assert.ElementsMatch(t, expectedNames, receivedNames)
+	assert.ElementsMatch(t, expectedNames, receivedEvents)
 }
```
```diff
@@ -473,8 +473,6 @@ func (k *sqlKV) Delete(ctx context.Context, section string, key string) error {
 		return ErrNotFound
 	}
 
-	// TODO reflect change to resource table
-
 	return nil
 }
```
```diff
@@ -61,7 +61,7 @@ type kvStorageBackend struct {
 	bulkLock   *BulkLock
 	dataStore  *dataStore
 	eventStore *eventStore
-	notifier   *notifier
+	notifier   notifier
 	builder    DocumentBuilder
 	log        logging.Logger
 	withPruner bool
@@ -91,6 +91,7 @@ type KVBackendOptions struct {
 	Tracer trace.Tracer          // TODO add tracing
 	Reg    prometheus.Registerer // TODO add metrics
 
+	UseChannelNotifier bool
 	// Adding RvManager overrides the RV generated with snowflake in order to keep backwards compatibility with
 	// unified/sql
 	RvManager *rvmanager.ResourceVersionManager
@@ -121,7 +122,7 @@ func NewKVStorageBackend(opts KVBackendOptions) (KVBackend, error) {
 		bulkLock:   NewBulkLock(),
 		dataStore:  newDataStore(kv),
 		eventStore: eventStore,
-		notifier:   newNotifier(eventStore, notifierOptions{}),
+		notifier:   newNotifier(eventStore, notifierOptions{useChannelNotifier: opts.UseChannelNotifier}),
 		snowflake:  s,
 		builder:    StandardDocumentBuilder(), // For now we use the standard document builder.
 		log:        &logging.NoOpLogger{},     // Make this configurable
@@ -346,7 +347,7 @@ func (k *kvStorageBackend) WriteEvent(ctx context.Context, event WriteEvent) (in
 			return 0, fmt.Errorf("failed to write data: %w", err)
 		}
 
-		rv = rvmanager.SnowflakeFromRv(rv)
+		rv = rvmanager.SnowflakeFromRV(rv)
 		dataKey.ResourceVersion = rv
 	} else {
 		err := k.dataStore.Save(ctx, dataKey, bytes.NewReader(event.Value))
```
```diff
@@ -307,7 +307,7 @@ func (m *ResourceVersionManager) execBatch(ctx context.Context, group, resource
 	// Allocate the RVs
 	for i, guid := range guids {
 		guidToRV[guid] = rv
-		guidToSnowflakeRV[guid] = SnowflakeFromRv(rv)
+		guidToSnowflakeRV[guid] = SnowflakeFromRV(rv)
 		rvs[i] = rv
 		rv++
 	}
@@ -364,12 +364,20 @@ func (m *ResourceVersionManager) execBatch(ctx context.Context, group, resource
 	}
 }
 
-// takes a unix microsecond rv and transforms into a snowflake format. The timestamp is converted from microsecond to
+// takes a unix microsecond RV and transforms into a snowflake format. The timestamp is converted from microsecond to
 // millisecond (the integer division) and the remainder is saved in the stepbits section. machine id is always 0
-func SnowflakeFromRv(rv int64) int64 {
+func SnowflakeFromRV(rv int64) int64 {
 	return (((rv / 1000) - snowflake.Epoch) << (snowflake.NodeBits + snowflake.StepBits)) + (rv % 1000)
 }
 
+// It is generally not possible to convert from a snowflakeID to a microsecond RV due to the loss in precision
+// (snowflake ID stores timestamp in milliseconds). However, this implementation stores the microsecond fraction
+// in the step bits (see SnowflakeFromRV), allowing us to compute the microsecond timestamp.
+func RVFromSnowflake(snowflakeID int64) int64 {
+	microSecFraction := snowflakeID & ((1 << snowflake.StepBits) - 1)
+	return ((snowflakeID>>(snowflake.NodeBits+snowflake.StepBits))+snowflake.Epoch)*1000 + microSecFraction
+}
+
 // helper utility to compare two RVs. The first RV must be in snowflake format. Will convert rv2 to snowflake and retry
 // if comparison fails
 func IsRvEqual(rv1, rv2 int64) bool {
@@ -377,7 +385,7 @@ func IsRvEqual(rv1, rv2 int64) bool {
 		return true
 	}
 
-	return rv1 == SnowflakeFromRv(rv2)
+	return rv1 == SnowflakeFromRV(rv2)
 }
 
 // Lock locks the resource version for the given key
```
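The `SnowflakeFromRV`/`RVFromSnowflake` pair above round-trips because the sub-millisecond remainder (`rv % 1000`, always below 1000) fits in the 12 step bits. A self-contained sketch of the packing, using constants assumed to mirror the `bwmarrin/snowflake` defaults (millisecond epoch, 10 node bits, 12 step bits — in the real code these come from that package):

```go
package main

import "fmt"

// Assumed to mirror the bwmarrin/snowflake defaults; the real values come
// from that package.
const (
	epoch    int64 = 1288834974657
	nodeBits       = 10
	stepBits       = 12
)

// snowflakeFromRV stores the millisecond timestamp in the timestamp field and
// the sub-millisecond remainder (0-999, fits in 12 step bits) in the step
// field; node id is always 0.
func snowflakeFromRV(rv int64) int64 {
	return (((rv / 1000) - epoch) << (nodeBits + stepBits)) + (rv % 1000)
}

// rvFromSnowflake reverses the packing: the step bits hold the microsecond
// fraction, so no precision is lost.
func rvFromSnowflake(id int64) int64 {
	microSecFraction := id & ((1 << stepBits) - 1)
	return ((id>>(nodeBits+stepBits))+epoch)*1000 + microSecFraction
}

func main() {
	rv := int64(1768246438806211) // microseconds, same value as the new test
	fmt.Println(rvFromSnowflake(snowflakeFromRV(rv)) == rv) // true
}
```

This is also why `IsRvEqual` only needs the one-way conversion: any microsecond RV can be lifted into snowflake form for comparison without ambiguity.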
```diff
@@ -63,3 +63,13 @@ func TestResourceVersionManager(t *testing.T) {
 		require.Equal(t, rv, int64(200))
 	})
 }
+
+func TestSnowflakeFromRVRoundtrips(t *testing.T) {
+	// 2026-01-12 19:33:58.806211 +0000 UTC
+	offset := int64(1768246438806211) // in microseconds
+
+	for n := range int64(100) {
+		ts := offset + n
+		require.Equal(t, ts, RVFromSnowflake(SnowflakeFromRV(ts)))
+	}
+}
```
```diff
@@ -99,6 +99,9 @@ func NewResourceServer(opts ServerOptions) (resource.ResourceServer, error) {
 		return nil, err
 	}
 
+	isHA := isHighAvailabilityEnabled(opts.Cfg.SectionWithEnvOverrides("database"),
+		opts.Cfg.SectionWithEnvOverrides("resource_api"))
+
 	if opts.Cfg.EnableSQLKVBackend {
 		sqlkv, err := resource.NewSQLKV(eDB)
 		if err != nil {
@@ -106,9 +109,10 @@ func NewResourceServer(opts ServerOptions) (resource.ResourceServer, error) {
 		}
 
 		kvBackendOpts := resource.KVBackendOptions{
 			KvStore: sqlkv,
 			Tracer:  opts.Tracer,
 			Reg:     opts.Reg,
+			UseChannelNotifier: !isHA,
 		}
 
 		ctx := context.Background()
@@ -140,9 +144,6 @@ func NewResourceServer(opts ServerOptions) (resource.ResourceServer, error) {
 		serverOptions.Backend = kvBackend
 		serverOptions.Diagnostics = kvBackend
 	} else {
-		isHA := isHighAvailabilityEnabled(opts.Cfg.SectionWithEnvOverrides("database"),
-			opts.Cfg.SectionWithEnvOverrides("resource_api"))
-
 		backend, err := NewBackend(BackendOptions{
 			DBProvider: eDB,
 			Reg:        opts.Reg,
```
```diff
@@ -200,7 +200,7 @@ func verifyKeyPath(t *testing.T, db sqldb.DB, ctx context.Context, key *resource
 	var keyPathRV int64
 	if isSqlBackend {
 		// Convert microsecond RV to snowflake for key_path construction
-		keyPathRV = rvmanager.SnowflakeFromRv(resourceVersion)
+		keyPathRV = rvmanager.SnowflakeFromRV(resourceVersion)
 	} else {
 		// KV backend already provides snowflake RV
 		keyPathRV = resourceVersion
@@ -434,9 +434,6 @@ func verifyResourceHistoryTable(t *testing.T, db sqldb.DB, namespace string, res
 
 	rows, err := db.QueryContext(ctx, query, namespace)
 	require.NoError(t, err)
-	defer func() {
-		_ = rows.Close()
-	}()
 
 	var records []ResourceHistoryRecord
 	for rows.Next() {
@@ -460,33 +457,34 @@ func verifyResourceHistoryTable(t *testing.T, db sqldb.DB, namespace string, res
 	for resourceIdx, res := range resources {
 		// Check create record (action=1, generation=1)
 		createRecord := records[recordIndex]
-		verifyResourceHistoryRecord(t, createRecord, res, resourceIdx, 1, 0, 1, resourceVersions[resourceIdx][0])
+		verifyResourceHistoryRecord(t, createRecord, namespace, res, resourceIdx, 1, 0, 1, resourceVersions[resourceIdx][0])
 		recordIndex++
 	}
 
 	for resourceIdx, res := range resources {
 		// Check update record (action=2, generation=2)
 		updateRecord := records[recordIndex]
-		verifyResourceHistoryRecord(t, updateRecord, res, resourceIdx, 2, resourceVersions[resourceIdx][0], 2, resourceVersions[resourceIdx][1])
+		verifyResourceHistoryRecord(t, updateRecord, namespace, res, resourceIdx, 2, resourceVersions[resourceIdx][0], 2, resourceVersions[resourceIdx][1])
 		recordIndex++
 	}
 
 	for resourceIdx, res := range resources[:2] {
 		// Check delete record (action=3, generation=0) - only first 2 resources were deleted
 		deleteRecord := records[recordIndex]
-		verifyResourceHistoryRecord(t, deleteRecord, res, resourceIdx, 3, resourceVersions[resourceIdx][1], 0, resourceVersions[resourceIdx][2])
+		verifyResourceHistoryRecord(t, deleteRecord, namespace, res, resourceIdx, 3, resourceVersions[resourceIdx][1], 0, resourceVersions[resourceIdx][2])
 		recordIndex++
 	}
 }
 
 // verifyResourceHistoryRecord validates a single resource_history record
-func verifyResourceHistoryRecord(t *testing.T, record ResourceHistoryRecord, expectedRes struct{ name, folder string }, resourceIdx, expectedAction int, expectedPrevRV int64, expectedGeneration int, expectedRV int64) {
+func verifyResourceHistoryRecord(t *testing.T, record ResourceHistoryRecord, namespace string, expectedRes struct{ name, folder string }, resourceIdx, expectedAction int, expectedPrevRV int64, expectedGeneration int, expectedRV int64) {
 	// Validate GUID (should be non-empty)
 	require.NotEmpty(t, record.GUID, "GUID should not be empty")
 
 	// Validate group/resource/namespace/name
 	require.Equal(t, "playlist.grafana.app", record.Group)
 	require.Equal(t, "playlists", record.Resource)
+	require.Equal(t, namespace, record.Namespace)
 	require.Equal(t, expectedRes.name, record.Name)
 
 	// Validate value contains expected JSON - server modifies/formats the JSON differently for different operations
@@ -513,8 +511,12 @@ func verifyResourceHistoryRecord(t *testing.T, record ResourceHistoryRecord, exp
 	// For KV backend operations, expectedPrevRV is now in snowflake format (returned by KV backend)
 	// but resource_history table stores microsecond RV, so we need to use IsRvEqual for comparison
 	if strings.Contains(record.Namespace, "-kv") {
-		require.True(t, rvmanager.IsRvEqual(expectedPrevRV, record.PreviousResourceVersion),
-			"Previous resource version should match (KV backend snowflake format)")
+		if expectedPrevRV == 0 {
+			require.Zero(t, record.PreviousResourceVersion)
+		} else {
+			require.Equal(t, expectedPrevRV, rvmanager.SnowflakeFromRV(record.PreviousResourceVersion),
+				"Previous resource version should match (KV backend snowflake format)")
+		}
 	} else {
 		require.Equal(t, expectedPrevRV, record.PreviousResourceVersion)
 	}
@@ -546,9 +548,6 @@ func verifyResourceTable(t *testing.T, db sqldb.DB, namespace string, resources
 
 	rows, err := db.QueryContext(ctx, query, namespace)
 	require.NoError(t, err)
-	defer func() {
-		_ = rows.Close()
-	}()
 
 	var records []ResourceRecord
 	for rows.Next() {
@@ -612,9 +611,6 @@ func verifyResourceVersionTable(t *testing.T, db sqldb.DB, namespace string, res
 	// Check that we have exactly one entry for playlist.grafana.app/playlists
 	rows, err := db.QueryContext(ctx, query, "playlist.grafana.app", "playlists")
 	require.NoError(t, err)
-	defer func() {
-		_ = rows.Close()
-	}()
 
 	var records []ResourceVersionRecord
 	for rows.Next() {
@@ -649,7 +645,7 @@ func verifyResourceVersionTable(t *testing.T, db sqldb.DB, namespace string, res
 	isKvBackend := strings.Contains(namespace, "-kv")
 	recordResourceVersion := record.ResourceVersion
 	if isKvBackend {
-		recordResourceVersion = rvmanager.SnowflakeFromRv(record.ResourceVersion)
+		recordResourceVersion = rvmanager.SnowflakeFromRV(record.ResourceVersion)
 	}
 
 	require.Less(t, recordResourceVersion, int64(9223372036854775807), "resource_version should be reasonable")
@@ -841,24 +837,20 @@ func runMixedConcurrentOperations(t *testing.T, sqlServer, kvServer resource.Res
 	}
 
 	// SQL backend operations
-	wg.Add(1)
-	go func() {
-		defer wg.Done()
+	wg.Go(func() {
 		<-startBarrier // Wait for signal to start
 		if err := runBackendOperationsWithCounts(ctx, sqlServer, namespace+"-sql", "sql", opCounts); err != nil {
 			errors <- fmt.Errorf("SQL backend operations failed: %w", err)
 		}
-	}()
+	})
 
 	// KV backend operations
-	wg.Add(1)
-	go func() {
-		defer wg.Done()
+	wg.Go(func() {
 		<-startBarrier // Wait for signal to start
 		if err := runBackendOperationsWithCounts(ctx, kvServer, namespace+"-kv", "kv", opCounts); err != nil {
 			errors <- fmt.Errorf("KV backend operations failed: %w", err)
 		}
-	}()
+	})
 
 	// Start both goroutines simultaneously
 	close(startBarrier)
```
```diff
@@ -8,6 +8,7 @@ import (
 	"github.com/stretchr/testify/require"
 
 	"github.com/grafana/grafana/pkg/storage/unified/resource"
+	"github.com/grafana/grafana/pkg/util/testutil"
 )
 
 func TestBadgerKVStorageBackend(t *testing.T) {
@@ -36,7 +37,9 @@ func TestBadgerKVStorageBackend(t *testing.T) {
 	})
 }
 
-func TestSQLKVStorageBackend(t *testing.T) {
+func TestIntegrationSQLKVStorageBackend(t *testing.T) {
+	testutil.SkipIntegrationTestInShortMode(t)
+
 	skipTests := map[string]bool{
 		TestWatchWriteEvents: true,
 		TestList:             true,
```
```diff
@@ -30,6 +30,7 @@ const (
 	defaultLogGroupLimit        = int32(50)
 	logIdentifierInternal       = "__log__grafana_internal__"
 	logStreamIdentifierInternal = "__logstream__grafana_internal__"
+	logGroupsMacro              = "$__logGroups"
 )
 
 type AWSError struct {
@@ -189,6 +190,47 @@ func (ds *DataSource) executeStartQuery(ctx context.Context, logsClient models.C
 		logsQuery.QueryLanguage = &cwli
 	}
 
+	region := logsQuery.Region
+	if region == "" || region == defaultRegion {
+		region = ds.Settings.Region
+	}
+
+	useARN := false
+	if len(logsQuery.LogGroups) > 0 && features.IsEnabled(ctx, features.FlagCloudWatchCrossAccountQuerying) && region != "" {
+		isMonitoringAccount, err := ds.isMonitoringAccount(ctx, region)
+		if err != nil {
+			ds.logger.FromContext(ctx).Debug("failed to determine monitoring account status", "err", err)
+		} else {
+			useARN = isMonitoringAccount
+		}
+	}
+
+	var logGroupIdentifiers []string
+	if len(logsQuery.LogGroups) > 0 {
+		// Log queries should use ARNs when querying a monitoring account because log group names are not unique across accounts.
+		if useARN {
+			for _, lg := range logsQuery.LogGroups {
+				if lg.Arn != "" {
+					// The startQuery api does not support arns with a trailing * so we need to remove it
+					logGroupIdentifiers = append(logGroupIdentifiers, strings.TrimSuffix(lg.Arn, "*"))
+				}
+			}
+		} else {
+			// deduplicate log group names because we only deduplicate log groups by their ARNs instead of their names when the query is created
+			seen := make(map[string]struct{}, len(logsQuery.LogGroups))
+			for _, lg := range logsQuery.LogGroups {
+				if lg.Name == "" {
+					continue
+				}
+				if _, exists := seen[lg.Name]; exists {
+					continue
+				}
+				seen[lg.Name] = struct{}{}
+				logGroupIdentifiers = append(logGroupIdentifiers, lg.Name)
+			}
+		}
+	}
+
 	finalQueryString := logsQuery.QueryString
 	// Only for CWLI queries
 	// The fields @log and @logStream are always included in the results of a user's query
@@ -200,6 +242,21 @@ func (ds *DataSource) executeStartQuery(ctx context.Context, logsClient models.C
 			logStreamIdentifierInternal + "|" + logsQuery.QueryString
 	}
 
+	// Expand $__logGroups macro for SQL queries
+	if *logsQuery.QueryLanguage == dataquery.LogsQueryLanguageSQL {
+		if strings.Contains(finalQueryString, logGroupsMacro) {
+			if len(logGroupIdentifiers) == 0 {
+				return nil, backend.DownstreamError(fmt.Errorf("query contains %s but no log groups are selected", logGroupsMacro))
+			}
+			quoted := make([]string, len(logGroupIdentifiers))
+			for i, id := range logGroupIdentifiers {
+				quoted[i] = fmt.Sprintf("'%s'", id)
+			}
+			replacement := fmt.Sprintf("`logGroups(logGroupIdentifier: [%s])`", strings.Join(quoted, ", "))
+			finalQueryString = strings.Replace(finalQueryString, logGroupsMacro, replacement, 1)
+		}
+	}
+
 	startQueryInput := &cloudwatchlogs.StartQueryInput{
 		StartTime: aws.Int64(startTime.Unix()),
 		// Usually grafana time range allows only second precision, but you can create ranges with milliseconds
```
@@ -213,47 +270,13 @@ func (ds *DataSource) executeStartQuery(ctx context.Context, logsClient models.C
|
|||||||
|
|
||||||
// log group identifiers can be left out if the query is an SQL query
|
// log group identifiers can be left out if the query is an SQL query
|
||||||
if *logsQuery.QueryLanguage != dataquery.LogsQueryLanguageSQL {
|
if *logsQuery.QueryLanguage != dataquery.LogsQueryLanguageSQL {
|
||||||
useLogGroupIdentifiers := false
|
if useARN {
|
||||||
logGroupsFromQuery := len(logsQuery.LogGroups) > 0
|
startQueryInput.LogGroupIdentifiers = logGroupIdentifiers
|
||||||
if logGroupsFromQuery && features.IsEnabled(ctx, features.FlagCloudWatchCrossAccountQuerying) {
|
} else {
|
||||||
region := logsQuery.Region
|
|
||||||
if region == "" || region == defaultRegion {
|
|
||||||
region = ds.Settings.Region
|
|
||||||
}
|
|
||||||
if region != "" {
|
|
||||||
isMonitoringAccount, err := ds.isMonitoringAccount(ctx, region)
|
|
||||||
if err != nil {
|
|
||||||
ds.logger.FromContext(ctx).Debug("failed to determine monitoring account status", "err", err)
|
|
||||||
} else if isMonitoringAccount {
|
|
||||||
// monitoring accounts require querying by log group identifiers because log group names are not unique across accounts.
|
|
||||||
var logGroupIdentifiers []string
|
|
||||||
for _, lg := range logsQuery.LogGroups {
|
|
||||||
// due to a bug in the startQuery api, we remove * from the arn, otherwise it throws an error
|
|
||||||
arn := strings.TrimSuffix(lg.Arn, "*")
|
|
||||||
logGroupIdentifiers = append(logGroupIdentifiers, arn)
|
|
||||||
}
|
|
||||||
startQueryInput.LogGroupIdentifiers = logGroupIdentifiers
|
|
||||||
useLogGroupIdentifiers = true
|
|
||||||
}
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
if !useLogGroupIdentifiers {
|
|
||||||
// even though logsQuery.LogGroupNames is deprecated, we still need to support it for backwards compatibility and alert queries
|
// even though logsQuery.LogGroupNames is deprecated, we still need to support it for backwards compatibility and alert queries
|
||||||
startQueryInput.LogGroupNames = append([]string(nil), logsQuery.LogGroupNames...)
|
startQueryInput.LogGroupNames = append([]string(nil), logsQuery.LogGroupNames...)
|
||||||
if len(startQueryInput.LogGroupNames) == 0 && logGroupsFromQuery {
|
if len(startQueryInput.LogGroupNames) == 0 && len(logGroupIdentifiers) > 0 {
|
||||||
// deduplicate log group names because we only deduplicate log groups by their ARNs instead of their names when the query is created
|
startQueryInput.LogGroupNames = logGroupIdentifiers
|
||||||
seenLogGroupNames := make(map[string]struct{}, len(logsQuery.LogGroups))
|
|
||||||
for _, lg := range logsQuery.LogGroups {
|
|
||||||
if lg.Name == "" {
|
|
||||||
continue
|
|
||||||
}
|
|
||||||
if _, exists := seenLogGroupNames[lg.Name]; exists {
|
|
||||||
continue
|
|
||||||
}
|
|
||||||
seenLogGroupNames[lg.Name] = struct{}{}
|
|
||||||
startQueryInput.LogGroupNames = append(startQueryInput.LogGroupNames, lg.Name)
|
|
||||||
}
|
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|||||||
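Pulled out of the surrounding handler, the macro expansion introduced in the hunk above can be sketched as a standalone function. The function name and `main` package layout here are illustrative, not from the PR; only the replacement logic mirrors the diff:

```go
package main

import (
	"fmt"
	"strings"
)

const logGroupsMacro = "$__logGroups"

// expandLogGroupsMacro replaces the first occurrence of $__logGroups with the
// quoted logGroups(logGroupIdentifier: [...]) table function, erroring when the
// macro is present but no log groups were selected in the query editor.
func expandLogGroupsMacro(query string, ids []string) (string, error) {
	if !strings.Contains(query, logGroupsMacro) {
		return query, nil
	}
	if len(ids) == 0 {
		return "", fmt.Errorf("query contains %s but no log groups are selected", logGroupsMacro)
	}
	quoted := make([]string, len(ids))
	for i, id := range ids {
		quoted[i] = fmt.Sprintf("'%s'", id)
	}
	replacement := fmt.Sprintf("`logGroups(logGroupIdentifier: [%s])`", strings.Join(quoted, ", "))
	return strings.Replace(query, logGroupsMacro, replacement, 1), nil
}

func main() {
	out, _ := expandLogGroupsMacro("SELECT * FROM $__logGroups", []string{"group1", "group2"})
	fmt.Println(out) // SELECT * FROM `logGroups(logGroupIdentifier: ['group1', 'group2'])`
}
```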
@@ -873,6 +873,204 @@ func TestQuery_GetQueryResults(t *testing.T) {
 	}, resp)
 }
 
+func Test_expandLogGroupsMacro(t *testing.T) {
+	origNewCWLogsClient := NewCWLogsClient
+	t.Cleanup(func() {
+		NewCWLogsClient = origNewCWLogsClient
+	})
+
+	var cli fakeCWLogsClient
+
+	NewCWLogsClient = func(cfg aws.Config) models.CWLogsClient {
+		return &cli
+	}
+
+	t.Run("expands $__logGroups macro with log group names when not a monitoring account", func(t *testing.T) {
+		cli = fakeCWLogsClient{}
+		ds := newTestDatasource()
+
+		_, err := ds.QueryData(context.Background(), &backend.QueryDataRequest{
+			PluginContext: backend.PluginContext{DataSourceInstanceSettings: &backend.DataSourceInstanceSettings{}},
+			Queries: []backend.DataQuery{
+				{
+					RefID:     "A",
+					TimeRange: backend.TimeRange{From: time.Unix(0, 0), To: time.Unix(1, 0)},
+					JSON: json.RawMessage(`{
+						"type": "logAction",
+						"subtype": "StartQuery",
+						"queryLanguage": "SQL",
+						"queryString":"SELECT * FROM $__logGroups",
+						"logGroups":[{"arn": "arn:aws:logs:us-east-1:123456789012:log-group:group1", "name": "group1"}, {"arn": "arn:aws:logs:us-east-1:123456789012:log-group:group2", "name": "group2"}]
+					}`),
+				},
+			},
+		})
+
+		assert.NoError(t, err)
+		require.Len(t, cli.calls.startQuery, 1)
+		assert.Equal(t, "SELECT * FROM `logGroups(logGroupIdentifier: ['group1', 'group2'])`", *cli.calls.startQuery[0].QueryString)
+	})
+
+	t.Run("expands $__logGroups macro with ARNs when monitoring account", func(t *testing.T) {
+		cli = fakeCWLogsClient{}
+		ds := newTestDatasource(func(ds *DataSource) {
+			ds.monitoringAccountCache.Store("us-east-1", true)
+		})
+
+		_, err := ds.QueryData(contextWithFeaturesEnabled(features.FlagCloudWatchCrossAccountQuerying), &backend.QueryDataRequest{
+			PluginContext: backend.PluginContext{DataSourceInstanceSettings: &backend.DataSourceInstanceSettings{}},
+			Queries: []backend.DataQuery{
+				{
+					RefID:     "A",
+					TimeRange: backend.TimeRange{From: time.Unix(0, 0), To: time.Unix(1, 0)},
+					JSON: json.RawMessage(`{
+						"type": "logAction",
+						"subtype": "StartQuery",
+						"queryLanguage": "SQL",
+						"queryString":"SELECT * FROM $__logGroups",
+						"logGroups":[{"arn": "arn:aws:logs:us-east-1:123456789012:log-group:group1", "name": "group1"}, {"arn": "arn:aws:logs:us-east-1:123456789012:log-group:group2", "name": "group2"}],
+						"region": "us-east-1"
+					}`),
+				},
+			},
+		})
+
+		assert.NoError(t, err)
+		require.Len(t, cli.calls.startQuery, 1)
+		assert.Equal(t, "SELECT * FROM `logGroups(logGroupIdentifier: ['arn:aws:logs:us-east-1:123456789012:log-group:group1', 'arn:aws:logs:us-east-1:123456789012:log-group:group2'])`", *cli.calls.startQuery[0].QueryString)
+	})
+
+	t.Run("strips trailing * from ARNs when expanding macro", func(t *testing.T) {
+		cli = fakeCWLogsClient{}
+		ds := newTestDatasource(func(ds *DataSource) {
+			ds.monitoringAccountCache.Store("us-east-1", true)
+		})
+
+		_, err := ds.QueryData(contextWithFeaturesEnabled(features.FlagCloudWatchCrossAccountQuerying), &backend.QueryDataRequest{
+			PluginContext: backend.PluginContext{DataSourceInstanceSettings: &backend.DataSourceInstanceSettings{}},
+			Queries: []backend.DataQuery{
+				{
+					RefID:     "A",
+					TimeRange: backend.TimeRange{From: time.Unix(0, 0), To: time.Unix(1, 0)},
+					JSON: json.RawMessage(`{
+						"type": "logAction",
+						"subtype": "StartQuery",
+						"queryLanguage": "SQL",
+						"queryString":"SELECT * FROM $__logGroups",
+						"logGroups":[{"arn": "arn:aws:logs:us-east-1:123456789012:log-group:group1*", "name": "group1"}],
+						"region": "us-east-1"
+					}`),
+				},
+			},
+		})
+
+		assert.NoError(t, err)
+		require.Len(t, cli.calls.startQuery, 1)
+		assert.Equal(t, "SELECT * FROM `logGroups(logGroupIdentifier: ['arn:aws:logs:us-east-1:123456789012:log-group:group1'])`", *cli.calls.startQuery[0].QueryString)
+	})
+
+	t.Run("returns error when $__logGroups macro is used but no log groups are selected", func(t *testing.T) {
+		cli = fakeCWLogsClient{}
+		ds := newTestDatasource()
+
+		resp, err := ds.QueryData(context.Background(), &backend.QueryDataRequest{
+			PluginContext: backend.PluginContext{DataSourceInstanceSettings: &backend.DataSourceInstanceSettings{}},
+			Queries: []backend.DataQuery{
+				{
+					RefID:     "A",
+					TimeRange: backend.TimeRange{From: time.Unix(0, 0), To: time.Unix(1, 0)},
+					JSON: json.RawMessage(`{
+						"type": "logAction",
+						"subtype": "StartQuery",
+						"queryLanguage": "SQL",
+						"queryString":"SELECT * FROM $__logGroups"
+					}`),
+				},
+			},
+		})
+
+		assert.NoError(t, err)
+		assert.Contains(t, resp.Responses["A"].Error.Error(), "query contains $__logGroups but no log groups are selected")
+	})
+
+	t.Run("does not expand macro when query does not contain $__logGroups", func(t *testing.T) {
+		cli = fakeCWLogsClient{}
+		ds := newTestDatasource()
+
+		_, err := ds.QueryData(context.Background(), &backend.QueryDataRequest{
+			PluginContext: backend.PluginContext{DataSourceInstanceSettings: &backend.DataSourceInstanceSettings{}},
+			Queries: []backend.DataQuery{
+				{
+					RefID:     "A",
+					TimeRange: backend.TimeRange{From: time.Unix(0, 0), To: time.Unix(1, 0)},
+					JSON: json.RawMessage(`{
+						"type": "logAction",
+						"subtype": "StartQuery",
+						"queryLanguage": "SQL",
+						"queryString":"SELECT * FROM ` + "`logGroups(logGroupIdentifier: ['my-log-group'])`" + `"
+					}`),
+				},
+			},
+		})
+
+		assert.NoError(t, err)
+		require.Len(t, cli.calls.startQuery, 1)
+		assert.Equal(t, "SELECT * FROM `logGroups(logGroupIdentifier: ['my-log-group'])`", *cli.calls.startQuery[0].QueryString)
+	})
+
+	t.Run("does not expand macro for non-SQL query languages", func(t *testing.T) {
+		cli = fakeCWLogsClient{}
+		ds := newTestDatasource()
+
+		_, err := ds.QueryData(context.Background(), &backend.QueryDataRequest{
+			PluginContext: backend.PluginContext{DataSourceInstanceSettings: &backend.DataSourceInstanceSettings{}},
+			Queries: []backend.DataQuery{
+				{
+					RefID:     "A",
+					TimeRange: backend.TimeRange{From: time.Unix(0, 0), To: time.Unix(1, 0)},
+					JSON: json.RawMessage(`{
+						"type": "logAction",
+						"subtype": "StartQuery",
+						"queryLanguage": "CWLI",
+						"queryString":"fields @message | $__logGroups",
+						"logGroups":[{"arn": "arn:aws:logs:us-east-1:123456789012:log-group:group1", "name": "group1"}]
+					}`),
+				},
+			},
+		})
+
+		assert.NoError(t, err)
+		require.Len(t, cli.calls.startQuery, 1)
+		assert.Contains(t, *cli.calls.startQuery[0].QueryString, "$__logGroups")
+	})
+
+	t.Run("expands macro with single log group", func(t *testing.T) {
+		cli = fakeCWLogsClient{}
+		ds := newTestDatasource()
+
+		_, err := ds.QueryData(context.Background(), &backend.QueryDataRequest{
+			PluginContext: backend.PluginContext{DataSourceInstanceSettings: &backend.DataSourceInstanceSettings{}},
+			Queries: []backend.DataQuery{
+				{
+					RefID:     "A",
+					TimeRange: backend.TimeRange{From: time.Unix(0, 0), To: time.Unix(1, 0)},
+					JSON: json.RawMessage(`{
+						"type": "logAction",
+						"subtype": "StartQuery",
+						"queryLanguage": "SQL",
+						"queryString":"SELECT * FROM $__logGroups",
+						"logGroups":[{"arn": "arn:aws:logs:us-east-1:123456789012:log-group:single-group", "name": "single-group"}]
+					}`),
+				},
+			},
+		})
+
+		assert.NoError(t, err)
+		require.Len(t, cli.calls.startQuery, 1)
+		assert.Equal(t, "SELECT * FROM `logGroups(logGroupIdentifier: ['single-group'])`", *cli.calls.startQuery[0].QueryString)
+	})
+}
+
 func TestGroupResponseFrame(t *testing.T) {
 	t.Run("Doesn't group results without time field", func(t *testing.T) {
 		frame := data.NewFrameOfFieldTypes("test", 0, data.FieldTypeString, data.FieldTypeInt32)
@@ -25,6 +25,10 @@ export class ExportAsCode extends ShareExportTab {
   public getTabLabel(): string {
     return t('export.json.title', 'Export dashboard');
   }
+
+  public getSubtitle(): string | undefined {
+    return t('export.json.info-text', 'Copy or download a file containing the definition of your dashboard');
+  }
 }
 
 function ExportAsCodeRenderer({ model }: SceneComponentProps<ExportAsCode>) {
@@ -53,12 +57,6 @@ function ExportAsCodeRenderer({ model }: SceneComponentProps<ExportAsCode>) {
 
   return (
     <div data-testid={selector.container} className={styles.container}>
-      <p>
-        <Trans i18nKey="export.json.info-text">
-          Copy or download a file containing the definition of your dashboard
-        </Trans>
-      </p>
-
       {config.featureToggles.kubernetesDashboards ? (
         <ResourceExport
           dashboardJson={dashboardJson}
@@ -0,0 +1,189 @@
+import { render, screen, within } from '@testing-library/react';
+import userEvent from '@testing-library/user-event';
+import { AsyncState } from 'react-use/lib/useAsync';
+
+import { selectors as e2eSelectors } from '@grafana/e2e-selectors';
+import { Dashboard } from '@grafana/schema';
+import { Spec as DashboardV2Spec } from '@grafana/schema/dist/esm/schema/dashboard/v2';
+
+import { ExportMode, ResourceExport } from './ResourceExport';
+
+type DashboardJsonState = AsyncState<{
+  json: Dashboard | DashboardV2Spec | { error: unknown };
+  hasLibraryPanels?: boolean;
+  initialSaveModelVersion: 'v1' | 'v2';
+}>;
+
+const selector = e2eSelectors.pages.ExportDashboardDrawer.ExportAsJson;
+
+const createDefaultProps = (overrides?: Partial<Parameters<typeof ResourceExport>[0]>) => {
+  const defaultProps: Parameters<typeof ResourceExport>[0] = {
+    dashboardJson: {
+      loading: false,
+      value: {
+        json: { title: 'Test Dashboard' } as Dashboard,
+        hasLibraryPanels: false,
+        initialSaveModelVersion: 'v1',
+      },
+    } as DashboardJsonState,
+    isSharingExternally: false,
+    exportMode: ExportMode.Classic,
+    isViewingYAML: false,
+    onExportModeChange: jest.fn(),
+    onShareExternallyChange: jest.fn(),
+    onViewYAML: jest.fn(),
+  };
+
+  return { ...defaultProps, ...overrides };
+};
+
+const createV2DashboardJson = (hasLibraryPanels = false): DashboardJsonState => ({
+  loading: false,
+  value: {
+    json: {
+      title: 'Test V2 Dashboard',
+      spec: {
+        elements: {},
+      },
+    } as unknown as DashboardV2Spec,
+    hasLibraryPanels,
+    initialSaveModelVersion: 'v2',
+  },
+});
+
+const expandOptions = async () => {
+  const button = screen.getByRole('button', { expanded: false });
+  await userEvent.click(button);
+};
+
+describe('ResourceExport', () => {
+  describe('export mode options for v1 dashboard', () => {
+    it('should show three export mode options in correct order: Classic, V1 Resource, V2 Resource', async () => {
+      render(<ResourceExport {...createDefaultProps()} />);
+      await expandOptions();
+
+      const radioGroup = screen.getByRole('radiogroup', { name: /model/i });
+      const labels = within(radioGroup)
+        .getAllByRole('radio')
+        .map((radio) => radio.parentElement?.textContent?.trim());
+
+      expect(labels).toHaveLength(3);
+      expect(labels).toEqual(['Classic', 'V1 Resource', 'V2 Resource']);
+    });
+
+    it('should have first option selected by default when exportMode is Classic', async () => {
+      render(<ResourceExport {...createDefaultProps({ exportMode: ExportMode.Classic })} />);
+      await expandOptions();
+
+      const radioGroup = screen.getByRole('radiogroup', { name: /model/i });
+      const radios = within(radioGroup).getAllByRole('radio');
+      expect(radios[0]).toBeChecked();
+    });
+
+    it('should call onExportModeChange when export mode is changed', async () => {
+      const onExportModeChange = jest.fn();
+      render(<ResourceExport {...createDefaultProps({ onExportModeChange })} />);
+      await expandOptions();
+
+      const radioGroup = screen.getByRole('radiogroup', { name: /model/i });
+      const radios = within(radioGroup).getAllByRole('radio');
+      await userEvent.click(radios[1]); // V1 Resource
+      expect(onExportModeChange).toHaveBeenCalledWith(ExportMode.V1Resource);
+    });
+  });
+
+  describe('export mode options for v2 dashboard', () => {
+    it('should not show export mode options', async () => {
+      render(<ResourceExport {...createDefaultProps({ dashboardJson: createV2DashboardJson() })} />);
+      await expandOptions();
+
+      expect(screen.queryByRole('radiogroup', { name: /model/i })).not.toBeInTheDocument();
+    });
+  });
+
+  describe('format options', () => {
+    it('should not show format options when export mode is Classic', async () => {
+      render(<ResourceExport {...createDefaultProps({ exportMode: ExportMode.Classic })} />);
+      await expandOptions();
+
+      expect(screen.getByRole('radiogroup', { name: /model/i })).toBeInTheDocument();
+      expect(screen.queryByRole('radiogroup', { name: /format/i })).not.toBeInTheDocument();
+    });
+
+    it.each([ExportMode.V1Resource, ExportMode.V2Resource])(
+      'should show format options when export mode is %s',
+      async (exportMode) => {
+        render(<ResourceExport {...createDefaultProps({ exportMode })} />);
+        await expandOptions();
+
+        expect(screen.getByRole('radiogroup', { name: /model/i })).toBeInTheDocument();
+        expect(screen.getByRole('radiogroup', { name: /format/i })).toBeInTheDocument();
+      }
+    );
+
+    it('should have first format option selected when isViewingYAML is false', async () => {
+      render(<ResourceExport {...createDefaultProps({ exportMode: ExportMode.V1Resource, isViewingYAML: false })} />);
+      await expandOptions();
+
+      const formatGroup = screen.getByRole('radiogroup', { name: /format/i });
+      const formatRadios = within(formatGroup).getAllByRole('radio');
+      expect(formatRadios[0]).toBeChecked(); // JSON
+    });
+
+    it('should have second format option selected when isViewingYAML is true', async () => {
+      render(<ResourceExport {...createDefaultProps({ exportMode: ExportMode.V1Resource, isViewingYAML: true })} />);
+      await expandOptions();
+
+      const formatGroup = screen.getByRole('radiogroup', { name: /format/i });
+      const formatRadios = within(formatGroup).getAllByRole('radio');
+      expect(formatRadios[1]).toBeChecked(); // YAML
+    });
+
+    it('should call onViewYAML when format is changed', async () => {
+      const onViewYAML = jest.fn();
+      render(<ResourceExport {...createDefaultProps({ exportMode: ExportMode.V1Resource, onViewYAML })} />);
+      await expandOptions();
+
+      const formatGroup = screen.getByRole('radiogroup', { name: /format/i });
+      const formatRadios = within(formatGroup).getAllByRole('radio');
+      await userEvent.click(formatRadios[1]); // YAML
+      expect(onViewYAML).toHaveBeenCalled();
+    });
+  });
+
+  describe('share externally switch', () => {
+    it('should show share externally switch for Classic mode', () => {
+      render(<ResourceExport {...createDefaultProps({ exportMode: ExportMode.Classic })} />);
+
+      expect(screen.getByTestId(selector.exportExternallyToggle)).toBeInTheDocument();
+    });
+
+    it('should show share externally switch for V2Resource mode with V2 dashboard', () => {
+      render(
+        <ResourceExport
+          {...createDefaultProps({
+            dashboardJson: createV2DashboardJson(),
+            exportMode: ExportMode.V2Resource,
+          })}
+        />
+      );
+
+      expect(screen.getByTestId(selector.exportExternallyToggle)).toBeInTheDocument();
+    });
+
+    it('should call onShareExternallyChange when switch is toggled', async () => {
+      const onShareExternallyChange = jest.fn();
+      render(<ResourceExport {...createDefaultProps({ exportMode: ExportMode.Classic, onShareExternallyChange })} />);
+
+      const switchElement = screen.getByTestId(selector.exportExternallyToggle);
+      await userEvent.click(switchElement);
+      expect(onShareExternallyChange).toHaveBeenCalled();
+    });
+
+    it('should reflect isSharingExternally value in switch', () => {
+      render(<ResourceExport {...createDefaultProps({ exportMode: ExportMode.Classic, isSharingExternally: true })} />);
+
+      expect(screen.getByTestId(selector.exportExternallyToggle)).toBeChecked();
+    });
+  });
+});
@@ -4,7 +4,8 @@ import { selectors as e2eSelectors } from '@grafana/e2e-selectors';
 import { Trans, t } from '@grafana/i18n';
 import { Dashboard } from '@grafana/schema';
 import { Spec as DashboardV2Spec } from '@grafana/schema/dist/esm/schema/dashboard/v2';
-import { Alert, Label, RadioButtonGroup, Stack, Switch } from '@grafana/ui';
+import { Alert, Icon, Label, RadioButtonGroup, Stack, Switch, Box, Tooltip } from '@grafana/ui';
+import { QueryOperationRow } from 'app/core/components/QueryOperationRow/QueryOperationRow';
 import { DashboardJson } from 'app/features/manage-dashboards/types';
 
 import { ExportableResource } from '../ShareExportTab';
@@ -48,80 +49,90 @@ export function ResourceExport({
 
   const switchExportLabel =
     exportMode === ExportMode.V2Resource
-      ? t('export.json.export-remove-ds-refs', 'Remove deployment details')
-      : t('share-modal.export.share-externally-label', `Export for sharing externally`);
+      ? t('dashboard-scene.resource-export.share-externally', 'Share dashboard with another instance')
+      : t('share-modal.export.share-externally-label', 'Export for sharing externally');
+  const switchExportTooltip = t(
+    'dashboard-scene.resource-export.share-externally-tooltip',
+    'Removes all instance-specific metadata and data source references from the resource before export.'
+  );
   const switchExportModeLabel = t('export.json.export-mode', 'Model');
   const switchExportFormatLabel = t('export.json.export-format', 'Format');
 
+  const exportResourceOptions = [
+    {
+      label: t('dashboard-scene.resource-export.label.classic', 'Classic'),
+      value: ExportMode.Classic,
+    },
+    {
+      label: t('dashboard-scene.resource-export.label.v1-resource', 'V1 Resource'),
+      value: ExportMode.V1Resource,
+    },
+    {
+      label: t('dashboard-scene.resource-export.label.v2-resource', 'V2 Resource'),
+      value: ExportMode.V2Resource,
+    },
+  ];
+
   return (
-    <Stack gap={2} direction="column">
-      <Stack gap={1} direction="column">
-        {initialSaveModelVersion === 'v1' && (
-          <Stack alignItems="center">
-            <Label>{switchExportModeLabel}</Label>
-            <RadioButtonGroup
-              options={[
-                { label: t('dashboard-scene.resource-export.label.classic', 'Classic'), value: ExportMode.Classic },
-                {
-                  label: t('dashboard-scene.resource-export.label.v1-resource', 'V1 Resource'),
-                  value: ExportMode.V1Resource,
-                },
-                {
-                  label: t('dashboard-scene.resource-export.label.v2-resource', 'V2 Resource'),
-                  value: ExportMode.V2Resource,
-                },
-              ]}
-              value={exportMode}
-              onChange={(value) => onExportModeChange(value)}
-            />
-          </Stack>
-        )}
-        {initialSaveModelVersion === 'v2' && (
-          <Stack alignItems="center">
-            <Label>{switchExportModeLabel}</Label>
-            <RadioButtonGroup
-              options={[
-                {
-                  label: t('dashboard-scene.resource-export.label.v2-resource', 'V2 Resource'),
-                  value: ExportMode.V2Resource,
-                },
-                {
-                  label: t('dashboard-scene.resource-export.label.v1-resource', 'V1 Resource'),
-                  value: ExportMode.V1Resource,
-                },
-              ]}
-              value={exportMode}
-              onChange={(value) => onExportModeChange(value)}
-            />
-          </Stack>
-        )}
-        {exportMode !== ExportMode.Classic && (
-          <Stack gap={1} alignItems="center">
-            <Label>{switchExportFormatLabel}</Label>
-            <RadioButtonGroup
-              options={[
-                { label: t('dashboard-scene.resource-export.label.json', 'JSON'), value: 'json' },
-                { label: t('dashboard-scene.resource-export.label.yaml', 'YAML'), value: 'yaml' },
-              ]}
-              value={isViewingYAML ? 'yaml' : 'json'}
-              onChange={onViewYAML}
-            />
-          </Stack>
-        )}
-        {(isV2Dashboard ||
-          exportMode === ExportMode.Classic ||
-          (initialSaveModelVersion === 'v2' && exportMode === ExportMode.V1Resource)) && (
-          <Stack gap={1} alignItems="start">
-            <Label>{switchExportLabel}</Label>
-            <Switch
-              label={switchExportLabel}
-              value={isSharingExternally}
-              onChange={onShareExternallyChange}
-              data-testid={selector.exportExternallyToggle}
-            />
-          </Stack>
-        )}
-      </Stack>
+    <>
+      <QueryOperationRow
+        id="Advanced options"
+        index={0}
+        title={t('dashboard-scene.resource-export.label.advanced-options', 'Advanced options')}
+        isOpen={false}
+      >
+        <Box marginTop={2}>
+          <Stack gap={1} direction="column">
+            {initialSaveModelVersion === 'v1' && (
+              <Stack gap={1} alignItems="center">
+                <Label>{switchExportModeLabel}</Label>
+                <RadioButtonGroup
+                  options={exportResourceOptions}
+                  value={exportMode}
+                  onChange={(value) => onExportModeChange(value)}
+                  aria-label={switchExportModeLabel}
+                />
+              </Stack>
+            )}
+
+            {exportMode !== ExportMode.Classic && (
+              <Stack gap={1} alignItems="center">
+                <Label>{switchExportFormatLabel}</Label>
+                <RadioButtonGroup
+                  options={[
+                    { label: t('dashboard-scene.resource-export.label.json', 'JSON'), value: 'json' },
+                    { label: t('dashboard-scene.resource-export.label.yaml', 'YAML'), value: 'yaml' },
+                  ]}
+                  value={isViewingYAML ? 'yaml' : 'json'}
+                  onChange={onViewYAML}
+                  aria-label={switchExportFormatLabel}
+                />
+              </Stack>
+            )}
+          </Stack>
+        </Box>
+      </QueryOperationRow>
+
+      {(isV2Dashboard ||
+        exportMode === ExportMode.Classic ||
+        (initialSaveModelVersion === 'v2' && exportMode === ExportMode.V1Resource)) && (
+        <Stack gap={1} alignItems="start">
+          <Label>
+            <Stack gap={0.5} alignItems="center">
+              <Tooltip content={switchExportTooltip} placement="bottom">
+                <Icon name="info-circle" size="sm" />
+              </Tooltip>
+              {switchExportLabel}
+            </Stack>
+          </Label>
+          <Switch
+            label={switchExportLabel}
+            value={isSharingExternally}
+            onChange={onShareExternallyChange}
+            data-testid={selector.exportExternallyToggle}
+          />
+        </Stack>
+      )}
 
       {showV2LibPanelAlert && (
         <Alert
@@ -130,6 +141,7 @@ export function ResourceExport({
           'Library panels will be converted to regular panels'
         )}
         severity="warning"
+        topSpacing={2}
       >
         <Trans i18nKey="dashboard-scene.save-dashboard-form.schema-v2-library-panels-export">
           Due to limitations in the new dashboard schema (V2), library panels will be converted to regular panels with
@@ -137,6 +149,6 @@ export function ResourceExport({
         </Trans>
       </Alert>
     )}
-  </Stack>
+  </>
 );
}
@@ -66,7 +66,12 @@ function ShareDrawerRenderer({ model }: SceneComponentProps<ShareDrawer>) {
   const dashboard = getDashboardSceneFor(model);
 
   return (
-    <Drawer title={activeShare?.getTabLabel()} onClose={model.onDismiss} size="md">
+    <Drawer
+      title={activeShare?.getTabLabel()}
+      subtitle={activeShare?.getSubtitle?.()}
+      onClose={model.onDismiss}
+      size="md"
+    >
       <ShareDrawerContext.Provider value={{ dashboard, onDismiss: model.onDismiss }}>
         {activeShare && <activeShare.Component model={activeShare} />}
       </ShareDrawerContext.Provider>

@@ -66,6 +66,10 @@ export class ShareExportTab extends SceneObjectBase<ShareExportTabState> impleme
     return t('share-modal.tab-title.export', 'Export');
   }
 
+  public getSubtitle(): string | undefined {
+    return undefined;
+  }
+
   public onShareExternallyChange = () => {
     this.setState({
       isSharingExternally: !this.state.isSharingExternally,

@@ -15,5 +15,6 @@ export interface SceneShareTab<T extends SceneShareTabState = SceneShareTabState
 
 export interface ShareView extends SceneObject {
   getTabLabel(): string;
+  getSubtitle?(): string | undefined;
   onDismiss?: () => void;
 }
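The optional `getSubtitle` hook added to `ShareView` is consumed with optional chaining, so existing tabs need no changes. A minimal standalone sketch (the `ShareViewLike` interface here is an illustrative stand-in for the real interface, not taken from the repo):

```typescript
// Minimal stand-in for the ShareView interface; getSubtitle is optional,
// so callers use optional chaining exactly as the Drawer does.
interface ShareViewLike {
  getTabLabel(): string;
  getSubtitle?(): string | undefined;
}

// A tab that does not implement getSubtitle...
const plainTab: ShareViewLike = { getTabLabel: () => 'Export' };
// ...and one that does.
const subtitledTab: ShareViewLike = {
  getTabLabel: () => 'Export',
  getSubtitle: () => 'Export as code',
};

// Tabs without the hook yield undefined, and the Drawer simply renders no subtitle.
const plainSubtitle = plainTab.getSubtitle?.();
const subtitle = subtitledTab.getSubtitle?.();
```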

@@ -2,8 +2,9 @@ import { render, screen } from '@testing-library/react';
 import { defaultsDeep } from 'lodash';
 import { Provider } from 'react-redux';
 
-import { FieldType, getDefaultTimeRange, LoadingState } from '@grafana/data';
-import { PanelDataErrorViewProps } from '@grafana/runtime';
+import { CoreApp, EventBusSrv, FieldType, getDefaultTimeRange, LoadingState } from '@grafana/data';
+import { config, PanelDataErrorViewProps } from '@grafana/runtime';
+import { usePanelContext } from '@grafana/ui';
 import { configureStore } from 'app/store/configureStore';
 
 import { PanelDataErrorView } from './PanelDataErrorView';
@@ -16,7 +17,24 @@ jest.mock('app/features/dashboard/services/DashboardSrv', () => ({
   },
 }));
 
+jest.mock('@grafana/ui', () => ({
+  ...jest.requireActual('@grafana/ui'),
+  usePanelContext: jest.fn(),
+}));
+
+const mockUsePanelContext = jest.mocked(usePanelContext);
+const RUN_QUERY_MESSAGE = 'Run a query to visualize it here or go to all visualizations to add other panel types';
+const panelContextRoot = {
+  app: CoreApp.Dashboard,
+  eventsScope: 'global',
+  eventBus: new EventBusSrv(),
+};
+
 describe('PanelDataErrorView', () => {
+  beforeEach(() => {
+    mockUsePanelContext.mockReturnValue(panelContextRoot);
+  });
+
   it('show No data when there is no data', () => {
     renderWithProps();
 
@@ -70,6 +88,45 @@ describe('PanelDataErrorView', () => {
 
     expect(screen.getByText('Query returned nothing')).toBeInTheDocument();
   });
+
+  it('should show "Run a query..." message when no query is configured and feature toggle is enabled', () => {
+    mockUsePanelContext.mockReturnValue(panelContextRoot);
+
+    const originalFeatureToggle = config.featureToggles.newVizSuggestions;
+    config.featureToggles.newVizSuggestions = true;
+
+    renderWithProps({
+      data: {
+        state: LoadingState.Done,
+        series: [],
+        timeRange: getDefaultTimeRange(),
+      },
+    });
+
+    expect(screen.getByText(RUN_QUERY_MESSAGE)).toBeInTheDocument();
+
+    config.featureToggles.newVizSuggestions = originalFeatureToggle;
+  });
+
+  it('should show "No data" message when feature toggle is disabled even without queries', () => {
+    mockUsePanelContext.mockReturnValue(panelContextRoot);
+
+    const originalFeatureToggle = config.featureToggles.newVizSuggestions;
+    config.featureToggles.newVizSuggestions = false;
+
+    renderWithProps({
+      data: {
+        state: LoadingState.Done,
+        series: [],
+        timeRange: getDefaultTimeRange(),
+      },
+    });
+
+    expect(screen.getByText('No data')).toBeInTheDocument();
+    expect(screen.queryByText(RUN_QUERY_MESSAGE)).not.toBeInTheDocument();
+
+    config.featureToggles.newVizSuggestions = originalFeatureToggle;
+  });
 });
 
 function renderWithProps(overrides?: Partial<PanelDataErrorViewProps>) {

@@ -5,14 +5,15 @@ import {
   FieldType,
   getPanelDataSummary,
   GrafanaTheme2,
+  PanelData,
   PanelDataSummary,
   PanelPluginVisualizationSuggestion,
 } from '@grafana/data';
 import { selectors } from '@grafana/e2e-selectors';
 import { t, Trans } from '@grafana/i18n';
-import { PanelDataErrorViewProps, locationService } from '@grafana/runtime';
+import { PanelDataErrorViewProps, locationService, config } from '@grafana/runtime';
 import { VizPanel } from '@grafana/scenes';
-import { usePanelContext, useStyles2 } from '@grafana/ui';
+import { Icon, usePanelContext, useStyles2 } from '@grafana/ui';
 import { CardButton } from 'app/core/components/CardButton';
 import { LS_VISUALIZATION_SELECT_TAB_KEY } from 'app/core/constants';
 import store from 'app/core/store';
@@ -24,6 +25,11 @@ import { findVizPanelByKey, getVizPanelKeyForPanelId } from 'app/features/dashbo
 import { useDispatch } from 'app/types/store';
 
 import { changePanelPlugin } from '../state/actions';
+import { hasData } from '../suggestions/utils';
+
+function hasNoQueryConfigured(data: PanelData): boolean {
+  return !data.request?.targets || data.request.targets.length === 0;
+}
 
 export function PanelDataErrorView(props: PanelDataErrorViewProps) {
   const styles = useStyles2(getStyles);
@@ -93,8 +99,14 @@ export function PanelDataErrorView(props: PanelDataErrorViewProps) {
     }
   };
 
+  const noData = !hasData(props.data);
+  const noQueryConfigured = hasNoQueryConfigured(props.data);
+  const showEmptyState =
+    config.featureToggles.newVizSuggestions && context.app === CoreApp.PanelEditor && noQueryConfigured && noData;
+
   return (
     <div className={styles.wrapper}>
+      {showEmptyState && <Icon name="chart-line" size="xxxl" className={styles.emptyStateIcon} />}
       <div className={styles.message} data-testid={selectors.components.Panels.Panel.PanelDataErrorMessage}>
         {message}
       </div>
@@ -131,7 +143,17 @@ function getMessageFor(
     return message;
   }
 
-  if (!data.series || data.series.length === 0 || data.series.every((frame) => frame.length === 0)) {
+  const noData = !hasData(data);
+  const noQueryConfigured = hasNoQueryConfigured(data);
+
+  if (config.featureToggles.newVizSuggestions && noQueryConfigured && noData) {
+    return t(
+      'dashboard.new-panel.empty-state-message',
+      'Run a query to visualize it here or go to all visualizations to add other panel types'
+    );
+  }
+
+  if (noData) {
     return fieldConfig?.defaults.noValue ?? t('panel.panel-data-error-view.no-value.default', 'No data');
   }
 
@@ -176,5 +198,9 @@ const getStyles = (theme: GrafanaTheme2) => {
       width: '100%',
       maxWidth: '600px',
     }),
+    emptyStateIcon: css({
+      color: theme.colors.text.secondary,
+      marginBottom: theme.spacing(2),
+    }),
   };
 };
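The `hasNoQueryConfigured` helper introduced in this diff is small enough to check standalone. A sketch using a minimal stand-in for the `PanelData` type (only the fields the helper reads are modeled):

```typescript
// Stand-in for the @grafana/data PanelData type: only the fields the
// helper reads are modeled here.
interface PanelDataLike {
  request?: { targets?: unknown[] };
}

// Mirrors the logic in the diff: a missing request, a missing targets
// array, or an empty targets array all count as "no query configured".
function hasNoQueryConfigured(data: PanelDataLike): boolean {
  return !data.request?.targets || data.request.targets.length === 0;
}
```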

@@ -36,7 +36,7 @@ export const DEFAULT_ANNOTATIONS_QUERY: Omit<CloudWatchAnnotationQuery, 'refId'>
 export const DEFAULT_CWLI_QUERY_STRING = 'fields @timestamp, @message |\nsort @timestamp desc |\nlimit 20';
 export const DEFAULT_PPL_QUERY_STRING = 'fields `@timestamp`, `@message`\n| sort - `@timestamp`\n| head 25s';
 export const DEFAULT_SQL_QUERY_STRING =
-  'SELECT `@timestamp`, `@message`\nFROM `log_group`\nORDER BY `@timestamp` DESC\nLIMIT 25;';
+  'SELECT `@timestamp`, `@message`\nFROM $__logGroups\nORDER BY `@timestamp` DESC\nLIMIT 25;';
 
 export const getDefaultLogsQuery = (
   defaultLogGroups?: LogGroup[],
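The new default query relies on the `$__logGroups` macro being interpolated into the `logGroups(logGroupIdentifier: [...])` syntax before execution. A hedged sketch of what that expansion could look like — `expandLogGroupsMacro` and this `LogGroup` shape are illustrative, not the data source's actual implementation:

```typescript
interface LogGroup {
  name: string;
  arn?: string;
}

// Illustrative expansion: replace $__logGroups with the
// logGroups(logGroupIdentifier: [...]) syntax built from the selector state.
function expandLogGroupsMacro(query: string, logGroups: LogGroup[]): string {
  const identifiers = logGroups.map((g) => `'${g.arn ?? g.name}'`).join(', ');
  return query.replace(/\$__logGroups/g, `\`logGroups(logGroupIdentifier: [${identifiers}])\``);
}

const expanded = expandLogGroupsMacro(
  'SELECT `@timestamp`, `@message`\nFROM $__logGroups\nORDER BY `@timestamp` DESC\nLIMIT 25;',
  [{ name: 'LogGroup1' }, { name: 'LogGroup2' }]
);
```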

+11 -3
@@ -97,14 +97,22 @@ describe('LogsSQLCompletionItemProvider', () => {
     const suggestions = await getSuggestions(singleLineFullQuery.query, { lineNumber: 1, column: 103 });
     const suggestionLabels = suggestions.map((s) => s.label);
     expect(suggestionLabels).toEqual(
-      expect.arrayContaining([FROM, `${FROM} \`logGroups(logGroupIdentifier: [...])\``, CASE, ...ALL_FUNCTIONS])
+      expect.arrayContaining([
+        FROM,
+        `${FROM} $__logGroups`,
+        `${FROM} \`logGroups(logGroupIdentifier: [...])\``,
+        CASE,
+        ...ALL_FUNCTIONS,
+      ])
     );
   });
 
-  it('returns logGroups suggestion after from keyword', async () => {
+  it('returns logGroups and $__logGroups suggestion after from keyword', async () => {
     const suggestions = await getSuggestions(singleLineFullQuery.query, { lineNumber: 1, column: 108 });
     const suggestionLabels = suggestions.map((s) => s.label);
-    expect(suggestionLabels).toEqual(expect.arrayContaining(['`logGroups(logGroupIdentifier: [...])`']));
+    expect(suggestionLabels).toEqual(
+      expect.arrayContaining(['$__logGroups', '`logGroups(logGroupIdentifier: [...])`'])
+    );
   });
 
   it('returns where, having, limit, group by, order by, and join suggestions after from arguments', async () => {

+12
@@ -142,6 +142,12 @@ export class LogsSQLCompletionItemProvider extends CompletionItemProvider {
           command: TRIGGER_SUGGEST,
           sortText: CompletionItemPriority.MediumHigh,
         });
+        addSuggestion(`${FROM} $__logGroups`, {
+          insertText: `${FROM} $__logGroups`,
+          kind: monaco.languages.CompletionItemKind.Snippet,
+          sortText: CompletionItemPriority.High,
+          detail: 'Use selected log groups from the selector',
+        });
         addSuggestion(`${FROM} \`logGroups(logGroupIdentifier: [...])\``, {
           insertText: `${FROM} \`logGroups(logGroupIdentifier: [$0])\``,
           insertTextRules: monaco.languages.CompletionItemInsertTextRule.InsertAsSnippet,
@@ -152,6 +158,12 @@ export class LogsSQLCompletionItemProvider extends CompletionItemProvider {
         break;
 
       case SuggestionKind.AfterFromKeyword:
+        addSuggestion('$__logGroups', {
+          insertText: '$__logGroups',
+          kind: monaco.languages.CompletionItemKind.Variable,
+          sortText: CompletionItemPriority.High,
+          detail: 'Expands to selected log groups',
+        });
         addSuggestion('`logGroups(logGroupIdentifier: [...])`', {
           insertText: '`logGroups(logGroupIdentifier: [$0])`',
           insertTextRules: monaco.languages.CompletionItemInsertTextRule.InsertAsSnippet,

@@ -488,6 +488,7 @@ export const language: CloudWatchLanguage = {
   root: [
     { include: '@comments' },
     { include: '@whitespace' },
+    { include: '@macros' },
    { include: '@customParams' },
     { include: '@numbers' },
     { include: '@binaries' },
@@ -519,6 +520,7 @@ export const language: CloudWatchLanguage = {
     [/\*\//, { token: 'comment.quote', next: '@pop' }],
     [/./, 'comment'],
   ],
+  macros: [[/\$__[a-zA-Z0-9_]+/, 'type']],
   customParams: [
     [/\${[A-Za-z0-9._-]*}/, 'variable'],
     [/\@\@{[A-Za-z0-9._-]*}/, 'variable'],
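The `@macros` tokenizer rule added here highlights any `$__`-prefixed identifier such as `$__logGroups`. The same pattern can be checked on its own:

```typescript
// Same regular expression as the Monaco tokenizer rule in the diff.
const macroPattern = /\$__[a-zA-Z0-9_]+/;

const hasMacro = (line: string): boolean => macroPattern.test(line);
```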

+12 -15
@@ -1,29 +1,26 @@
 import { SelectableValue } from '@grafana/data';
 import { RadioButtonGroup } from '@grafana/ui';
 
-import { useDispatch } from '../../hooks/useStatelessReducer';
 import { EditorType } from '../../types';
 
-import { useQuery } from './ElasticsearchQueryContext';
-import { changeEditorTypeAndResetQuery } from './state';
 
 const BASE_OPTIONS: Array<SelectableValue<EditorType>> = [
   { value: 'builder', label: 'Builder' },
   { value: 'code', label: 'Code' },
 ];
 
-export const EditorTypeSelector = () => {
-  const query = useQuery();
-  const dispatch = useDispatch();
-
-  // Default to 'builder' if editorType is empty
-  const editorType: EditorType = query.editorType === 'code' ? 'code' : 'builder';
-
-  const onChange = (newEditorType: EditorType) => {
-    dispatch(changeEditorTypeAndResetQuery(newEditorType));
-  };
+interface Props {
+  value: EditorType;
+  onChange: (editorType: EditorType) => void;
+}
 
+export const EditorTypeSelector = ({ value, onChange }: Props) => {
   return (
-    <RadioButtonGroup<EditorType> fullWidth={false} options={BASE_OPTIONS} value={editorType} onChange={onChange} />
+    <RadioButtonGroup<EditorType>
+      data-testid="elasticsearch-editor-type-toggle"
+      size="sm"
+      options={BASE_OPTIONS}
+      value={value}
+      onChange={onChange}
+    />
   );
 };

+36 -14
@@ -10,9 +10,13 @@ interface Props {
   onRunQuery: () => void;
 }
 
+// This offset was chosen by testing to match Prometheus behavior
+const EDITOR_HEIGHT_OFFSET = 2;
+
 export function RawQueryEditor({ value, onChange, onRunQuery }: Props) {
   const styles = useStyles2(getStyles);
   const editorRef = useRef<monacoTypes.editor.IStandaloneCodeEditor | null>(null);
+  const containerRef = useRef<HTMLDivElement | null>(null);
 
   const handleEditorDidMount = useCallback(
     (editor: monacoTypes.editor.IStandaloneCodeEditor, monaco: Monaco) => {
@@ -22,6 +26,22 @@ export function RawQueryEditor({ value, onChange, onRunQuery }: Props) {
       editor.addCommand(monaco.KeyMod.CtrlCmd | monaco.KeyCode.Enter, () => {
         onRunQuery();
       });
+
+      // Make the editor resize itself so that the content fits (grows taller when necessary)
+      // this code comes from the Prometheus query editor.
+      // We may wish to consider abstracting it into the grafana/ui repo in the future
+      const updateElementHeight = () => {
+        const containerDiv = containerRef.current;
+        if (containerDiv !== null) {
+          const pixelHeight = editor.getContentHeight();
+          containerDiv.style.height = `${pixelHeight + EDITOR_HEIGHT_OFFSET}px`;
+          const pixelWidth = containerDiv.clientWidth;
+          editor.layout({ width: pixelWidth, height: pixelHeight });
+        }
+      };
+
+      editor.onDidContentSizeChange(updateElementHeight);
+      updateElementHeight();
     },
     [onRunQuery]
   );
@@ -65,7 +85,17 @@ export function RawQueryEditor({ value, onChange, onRunQuery }: Props) {
 
   return (
     <Box>
-      <div className={styles.header}>
+      <div ref={containerRef} className={styles.editorContainer}>
+        <CodeEditor
+          value={value ?? ''}
+          language="json"
+          width="100%"
+          onBlur={handleQueryChange}
+          monacoOptions={monacoOptions}
+          onEditorDidMount={handleEditorDidMount}
+        />
+      </div>
+      <div className={styles.footer}>
         <Stack gap={1}>
           <Button
             size="sm"
@@ -76,20 +106,8 @@ export function RawQueryEditor({ value, onChange, onRunQuery }: Props) {
           >
             Format
           </Button>
-          <Button size="sm" variant="primary" icon="play" onClick={onRunQuery} tooltip="Run query (Ctrl/Cmd+Enter)">
-            Run
-          </Button>
         </Stack>
       </div>
-      <CodeEditor
-        value={value ?? ''}
-        language="json"
-        height={200}
-        width="100%"
-        onBlur={handleQueryChange}
-        monacoOptions={monacoOptions}
-        onEditorDidMount={handleEditorDidMount}
-      />
     </Box>
   );
 }
@@ -100,7 +118,11 @@ const getStyles = (theme: GrafanaTheme2) => ({
     flexDirection: 'column',
     gap: theme.spacing(1),
   }),
-  header: css({
+  editorContainer: css({
+    width: '100%',
+    overflow: 'hidden',
+  }),
+  footer: css({
     display: 'flex',
     justifyContent: 'flex-end',
     padding: theme.spacing(0.5, 0),
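The sizing rule used by `updateElementHeight` above is simple arithmetic: the container is given the editor's content height plus a small constant offset, isolated here as a sketch:

```typescript
// The container grows to fit the editor content plus a fixed offset,
// matching the EDITOR_HEIGHT_OFFSET constant from the diff.
const EDITOR_HEIGHT_OFFSET = 2;

function containerHeightPx(contentHeight: number): string {
  return `${contentHeight + EDITOR_HEIGHT_OFFSET}px`;
}
```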

@@ -1,16 +1,16 @@
 import { css } from '@emotion/css';
-import { useEffect, useId, useState } from 'react';
+import { useCallback, useEffect, useId, useState } from 'react';
 import { SemVer } from 'semver';
 
 import { getDefaultTimeRange, GrafanaTheme2, QueryEditorProps } from '@grafana/data';
 import { config } from '@grafana/runtime';
-import { Alert, InlineField, InlineLabel, Input, QueryField, useStyles2 } from '@grafana/ui';
+import { Alert, ConfirmModal, InlineField, InlineLabel, Input, QueryField, useStyles2 } from '@grafana/ui';
 
 import { ElasticsearchDataQuery } from '../../dataquery.gen';
 import { ElasticDatasource } from '../../datasource';
 import { useNextId } from '../../hooks/useNextId';
 import { useDispatch } from '../../hooks/useStatelessReducer';
-import { ElasticsearchOptions } from '../../types';
+import { EditorType, ElasticsearchOptions } from '../../types';
 import { isSupportedVersion, isTimeSeriesQuery, unsupportedVersionMessage } from '../../utils';
 
 import { BucketAggregationsEditor } from './BucketAggregationsEditor';
@@ -20,7 +20,7 @@ import { MetricAggregationsEditor } from './MetricAggregationsEditor';
 import { metricAggregationConfig } from './MetricAggregationsEditor/utils';
 import { QueryTypeSelector } from './QueryTypeSelector';
 import { RawQueryEditor } from './RawQueryEditor';
-import { changeAliasPattern, changeQuery, changeRawDSLQuery } from './state';
+import { changeAliasPattern, changeEditorTypeAndResetQuery, changeQuery, changeRawDSLQuery } from './state';
 
 export type ElasticQueryEditorProps = QueryEditorProps<ElasticDatasource, ElasticsearchDataQuery, ElasticsearchOptions>;
 
@@ -97,31 +97,61 @@ const QueryEditorForm = ({ value, onRunQuery }: Props & { onRunQuery: () => void
   const inputId = useId();
   const styles = useStyles2(getStyles);
 
+  const [switchModalOpen, setSwitchModalOpen] = useState(false);
+  const [pendingEditorType, setPendingEditorType] = useState<EditorType | null>(null);
+
   const isTimeSeries = isTimeSeriesQuery(value);
 
   const isCodeEditor = value.editorType === 'code';
   const rawDSLFeatureEnabled = config.featureToggles.elasticsearchRawDSLQuery;
 
+  // Default to 'builder' if editorType is empty
+  const currentEditorType: EditorType = value.editorType === 'code' ? 'code' : 'builder';
+
   const showBucketAggregationsEditor = value.metrics?.every(
     (metric) => metricAggregationConfig[metric.type].impliedQueryType === 'metrics'
   );
 
+  const onEditorTypeChange = useCallback((newEditorType: EditorType) => {
+    // Show warning modal when switching modes
+    setPendingEditorType(newEditorType);
+    setSwitchModalOpen(true);
+  }, []);
+
+  const confirmEditorTypeChange = useCallback(() => {
+    if (pendingEditorType) {
+      dispatch(changeEditorTypeAndResetQuery(pendingEditorType));
+    }
+    setSwitchModalOpen(false);
+    setPendingEditorType(null);
+  }, [dispatch, pendingEditorType]);
+
+  const cancelEditorTypeChange = useCallback(() => {
+    setSwitchModalOpen(false);
+    setPendingEditorType(null);
+  }, []);
+
   return (
     <>
+      <ConfirmModal
+        isOpen={switchModalOpen}
+        title="Switch editor"
+        body="Switching between editors will reset your query. Are you sure you want to continue?"
+        confirmText="Continue"
+        onConfirm={confirmEditorTypeChange}
+        onDismiss={cancelEditorTypeChange}
+      />
       <div className={styles.root}>
         <InlineLabel width={17}>Query type</InlineLabel>
         <div className={styles.queryItem}>
           <QueryTypeSelector />
         </div>
-      </div>
-      {rawDSLFeatureEnabled && (
-        <div className={styles.root}>
-          <InlineLabel width={17}>Editor type</InlineLabel>
-          <div className={styles.queryItem}>
-            <EditorTypeSelector />
-          </div>
-        </div>
-      )}
+        {rawDSLFeatureEnabled && (
+          <div style={{ marginLeft: 'auto' }}>
+            <EditorTypeSelector value={currentEditorType} onChange={onEditorTypeChange} />
+          </div>
+        )}
+      </div>
 
       {isCodeEditor && rawDSLFeatureEnabled && (
         <RawQueryEditor

@@ -6383,12 +6383,15 @@
     },
     "resource-export": {
       "label": {
+        "advanced-options": "Advanced options",
         "classic": "Classic",
         "json": "JSON",
         "v1-resource": "V1 Resource",
         "v2-resource": "V2 Resource",
         "yaml": "YAML"
-      }
+      },
+      "share-externally": "Share dashboard with another instance",
+      "share-externally-tooltip": "Removes all instance-specific metadata and data source references from the resource before export."
     },
     "revert-dashboard-modal": {
       "body-restore-version": "Are you sure you want to restore the dashboard to version {{version}}? All unsaved changes will be lost.",
@@ -7842,7 +7845,6 @@
       "export-externally-label": "Export the dashboard to use in another instance",
       "export-format": "Format",
       "export-mode": "Model",
-      "export-remove-ds-refs": "Remove deployment details",
       "info-text": "Copy or download a file containing the definition of your dashboard",
       "title": "Export dashboard"
     },