Compare commits
6 Commits
| SHA1 |
|---|
| 2d1f51d02f |
| 9f2f93b401 |
| 9e399e0b19 |
| 2f520454ae |
| f9eb5b7360 |
| 785c578e2f |
@@ -270,7 +270,17 @@ Click **View in CloudWatch console** to interactively view, search, and analyze

### Query Log groups with OpenSearch SQL

When querying log groups with OpenSearch SQL, you **must** explicitly state the log group identifier or ARN in the `FROM` clause:
When querying log groups with OpenSearch SQL, you can use the `$__logGroups` macro to automatically reference log groups selected in the query editor's log group selector. This is the recommended approach as it allows you to manage log groups through the UI.

```sql
SELECT window.start, COUNT(*) AS exceptionCount
FROM $__logGroups
WHERE `@message` LIKE '%Exception%'
```

The `$__logGroups` macro expands to the proper `logGroups(logGroupIdentifier: [...])` syntax with the log groups you've selected in the UI.

Alternatively, you can manually specify a single log group directly in the `FROM` clause:

```sql
SELECT window.start, COUNT(*) AS exceptionCount
@@ -278,7 +288,7 @@ FROM `log_group`
WHERE `@message` LIKE '%Exception%'
```

or, when querying multiple log groups:
When querying multiple log groups you **must** use the `logGroups(logGroupIdentifier: [...])` syntax:

```sql
SELECT window.start, COUNT(*) AS exceptionCount
@@ -286,6 +296,8 @@ FROM `logGroups( logGroupIdentifier: ['LogGroup1', 'LogGroup2'])`
WHERE `@message` LIKE '%Exception%'
```

To reference log groups in a monitoring account, use ARNs instead of LogGroup names.

You can also write queries returning time series data by using the [`stats` command](https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/CWL_Insights-Visualizing-Log-Data.html).
When making `stats` queries in [Explore](ref:explore), ensure you are in Metrics Explore mode.
@@ -4,7 +4,8 @@ comments: |
This file is used in the following visualizations: candlestick, heatmap, state timeline, status history, time series.
---

You can zoom the panel time range in and out, which in turn, changes the dashboard time range.
You can pan the panel time range left and right, and zoom it in and out.
This, in turn, changes the dashboard time range.

**Zoom in** - Click and drag on the panel to zoom in on a particular time range.

@@ -16,4 +17,9 @@ For example, if the original time range is from 9:00 to 9:59, the time range cha
- Next range: 8:30 - 10:29
- Next range: 7:30 - 11:29

For screen recordings showing these interactions, refer to the [Panel overview documentation](https://grafana.com/docs/grafana/<GRAFANA_VERSION>/visualizations/panels-visualizations/panel-overview/#zoom-panel-time-range).
**Pan** - Click and drag the x-axis area of the panel to pan the time range.

The time range shifts by the distance you drag.
For example, if the original time range is from 9:00 to 9:59 and you drag 30 minutes to the right, the time range changes to 9:30 to 10:29.

For screen recordings showing these interactions, refer to the [Panel overview documentation](https://grafana.com/docs/grafana/<GRAFANA_VERSION>/visualizations/panels-visualizations/panel-overview/#pan-and-zoom-panel-time-range).
@@ -317,13 +317,16 @@ Click the **Copy time range to clipboard** icon to copy the current time range t

You can also copy and paste a time range using the keyboard shortcuts `t+c` and `t+v` respectively.

#### Zoom out (Cmd+Z or Ctrl+Z)
#### Zoom out

Click the **Zoom out** icon to view a larger time range in the dashboard or panel visualization.
- Click the **Zoom out** icon to view a larger time range in the dashboard or panel visualizations
- Double click on the panel graph area (time series family visualizations only)
- Type the `t-` keyboard shortcut

#### Zoom in (only applicable to graph visualizations)
#### Zoom in

Click and drag to select the time range in the visualization that you want to view.
- Click and drag horizontally in the panel graph area to select a time range (time series family visualizations only)
- Type the `t+` keyboard shortcut

#### Refresh dashboard

@@ -175,9 +175,10 @@ By hovering over a panel with the mouse you can use some shortcuts that will tar

- `pl`: Hide or show legend
- `pr`: Remove Panel

## Zoom panel time range
## Pan and zoom panel time range

You can zoom the panel time range in and out, which in turn, changes the dashboard time range.
You can pan the panel time range left and right, and zoom it in and out.
This, in turn, changes the dashboard time range.

This feature is supported for the following visualizations:

@@ -191,7 +192,7 @@ This feature is supported for the following visualizations:

Click and drag on the panel to zoom in on a particular time range.

The following screen recordings show this interaction in the time series and x visualizations:
The following screen recordings show this interaction in the time series and candlestick visualizations:

Time series

@@ -211,7 +212,7 @@ For example, if the original time range is from 9:00 to 9:59, the time range cha

- Next range: 8:30 - 10:29
- Next range: 7:30 - 11:29

The following screen recordings demonstrate the preceding example in the time series and x visualizations:
The following screen recordings demonstrate the preceding example in the time series and heatmap visualizations:

Time series

@@ -221,6 +222,19 @@ Heatmap

{{< video-embed src="/media/docs/grafana/panels-visualizations/recording-heatmap-panel-time-zoom-out-mouse.mp4" >}}

### Pan

Click and drag the x-axis area of the panel to pan the time range.

The time range shifts by the distance you drag.
For example, if the original time range is from 9:00 to 9:59 and you drag 30 minutes to the right, the time range changes to 9:30 to 10:29.

The following screen recordings show this interaction in the time series visualization:

Time series

{{< video-embed src="/media/docs/grafana/panels-visualizations/recording-ts-time-pan-mouse.mp4" >}}

## Add a panel

To add a panel in a new dashboard click **+ Add visualization** in the middle of the dashboard:
@@ -92,9 +92,9 @@ The data is converted as follows:

{{< figure src="/media/docs/grafana/panels-visualizations/screenshot-candles-volume-v11.6.png" max-width="750px" alt="A candlestick visualization showing the price movements of specific asset." >}}

## Zoom panel time range
## Pan and zoom panel time range

{{< docs/shared lookup="visualizations/panel-zoom.md" source="grafana" version="<GRAFANA_VERSION>" >}}
{{< docs/shared lookup="visualizations/panel-pan-zoom.md" source="grafana" version="<GRAFANA_VERSION>" >}}

## Configuration options

@@ -79,9 +79,9 @@ The data is converted as follows:

{{< figure src="/static/img/docs/heatmap-panel/heatmap.png" max-width="1025px" alt="A heatmap visualization showing the random walk distribution over time" >}}

## Zoom panel time range
## Pan and zoom panel time range

{{< docs/shared lookup="visualizations/panel-zoom.md" source="grafana" version="<GRAFANA_VERSION>" >}}
{{< docs/shared lookup="visualizations/panel-pan-zoom.md" source="grafana" version="<GRAFANA_VERSION>" >}}

## Configuration options

@@ -93,9 +93,9 @@ You can also create a state timeline visualization using time series data. To do



## Zoom panel time range
## Pan and zoom panel time range

{{< docs/shared lookup="visualizations/panel-zoom.md" source="grafana" version="<GRAFANA_VERSION>" >}}
{{< docs/shared lookup="visualizations/panel-pan-zoom.md" source="grafana" version="<GRAFANA_VERSION>" >}}

## Configuration options

@@ -85,9 +85,9 @@ The data is converted as follows:

{{< figure src="/static/img/docs/status-history-panel/status_history.png" max-width="1025px" alt="A status history panel with two time columns showing the status of two servers" >}}

## Zoom panel time range
## Pan and zoom panel time range

{{< docs/shared lookup="visualizations/panel-zoom.md" source="grafana" version="<GRAFANA_VERSION>" >}}
{{< docs/shared lookup="visualizations/panel-pan-zoom.md" source="grafana" version="<GRAFANA_VERSION>" >}}

## Configuration options

@@ -167,9 +167,9 @@ The following example shows three series: Min, Max, and Value. The Min and Max s

{{< docs/shared lookup="visualizations/multiple-y-axes.md" source="grafana" version="<GRAFANA_VERSION>" leveloffset="+2" >}}

## Zoom panel time range
## Pan and zoom panel time range

{{< docs/shared lookup="visualizations/panel-zoom.md" source="grafana" version="<GRAFANA_VERSION>" >}}
{{< docs/shared lookup="visualizations/panel-pan-zoom.md" source="grafana" version="<GRAFANA_VERSION>" >}}

## Configuration options
@@ -31,7 +31,6 @@ export interface Options extends common.SingleStatBaseOptions {
endpointMarker?: ('point' | 'glow' | 'none');
minVizHeight: number;
minVizWidth: number;
neutral?: number;
segmentCount: number;
segmentSpacing: number;
shape: ('circle' | 'gauge');

@@ -1,6 +1,4 @@
import { useMemo } from 'react';

import { colorManipulator, FALLBACK_COLOR, FieldDisplay } from '@grafana/data';
import { FALLBACK_COLOR, FieldDisplay } from '@grafana/data';

import { useTheme2 } from '../../themes/ThemeContext';

@@ -8,6 +6,7 @@ import { RadialArcPath } from './RadialArcPath';
import { RadialShape, RadialGaugeDimensions, GradientStop } from './types';

export interface RadialBarProps {
angle: number;
angleRange: number;
dimensions: RadialGaugeDimensions;
fieldDisplay: FieldDisplay;
@@ -16,12 +15,11 @@ export interface RadialBarProps {
endpointMarker?: 'point' | 'glow';
shape: RadialShape;
startAngle: number;
startValueAngle: number;
endValueAngle: number;
glowFilter?: string;
endpointMarkerGlowFilter?: string;
}
export function RadialBar({
angle,
angleRange,
dimensions,
fieldDisplay,
@@ -30,45 +28,26 @@ export function RadialBar({
endpointMarker,
shape,
startAngle,
startValueAngle,
endValueAngle,
glowFilter,
endpointMarkerGlowFilter,
}: RadialBarProps) {
const theme = useTheme2();
const colorProps = gradient ? { gradient } : { color: fieldDisplay.display.color ?? FALLBACK_COLOR };
const trackColor = useMemo(
() => colorManipulator.onBackground(theme.colors.action.hover, theme.colors.background.primary).toHexString(),
[theme]
);

return (
<>
{/** Track before value */}
{startValueAngle !== 0 && (
<RadialArcPath
arcLengthDeg={startValueAngle}
fieldDisplay={fieldDisplay}
color={trackColor}
dimensions={dimensions}
roundedBars={roundedBars}
shape={shape}
startAngle={startAngle}
/>
)}
{/** Track after value */}
{/** Track */}
<RadialArcPath
arcLengthDeg={angleRange - endValueAngle - startValueAngle}
arcLengthDeg={angleRange - angle}
fieldDisplay={fieldDisplay}
color={trackColor}
color={theme.colors.action.hover}
dimensions={dimensions}
roundedBars={roundedBars}
shape={shape}
startAngle={startAngle + startValueAngle + endValueAngle}
startAngle={startAngle + angle}
/>
{/** The colored bar */}
<RadialArcPath
arcLengthDeg={endValueAngle}
arcLengthDeg={angle}
barEndcaps={shape === 'circle' && roundedBars}
dimensions={dimensions}
endpointMarker={roundedBars ? endpointMarker : undefined}
@@ -77,7 +56,7 @@ export function RadialBar({
glowFilter={glowFilter}
roundedBars={roundedBars}
shape={shape}
startAngle={startAngle + startValueAngle}
startAngle={startAngle}
{...colorProps}
/>
</>

@@ -11,7 +11,6 @@ import {
getFieldConfigMinMax,
getFieldDisplayProcessor,
getOptimalSegmentCount,
getValuePercentageForValue,
} from './utils';

export interface RadialBarSegmentedProps {
@@ -19,8 +18,6 @@ export interface RadialBarSegmentedProps {
dimensions: RadialGaugeDimensions;
angleRange: number;
startAngle: number;
startValueAngle: number;
endValueAngle: number;
glowFilter?: string;
segmentCount: number;
segmentSpacing: number;
@@ -39,24 +36,22 @@ export const RadialBarSegmented = memo(
segmentCount,
segmentSpacing,
shape,
startValueAngle,
endValueAngle,
}: RadialBarSegmentedProps) => {
const theme = useTheme2();
const segments: React.ReactNode[] = [];
const segmentCountAdjusted = getOptimalSegmentCount(dimensions, segmentSpacing, segmentCount, angleRange);
const [min, max] = getFieldConfigMinMax(fieldDisplay);
const value = fieldDisplay.display.numeric;
const angleBetweenSegments = getAngleBetweenSegments(segmentSpacing, segmentCount, angleRange);
const segmentArcLengthDeg = angleRange / segmentCountAdjusted - angleBetweenSegments;
const displayProcessor = getFieldDisplayProcessor(fieldDisplay);

for (let i = 0; i < segmentCountAdjusted; i++) {
const value = min + ((max - min) / segmentCountAdjusted) * i;
const segmentAngle = getValuePercentageForValue(fieldDisplay, value) * angleRange;
const isTrack = segmentAngle < startValueAngle || segmentAngle >= startValueAngle + endValueAngle;
const segmentStartAngle = startAngle + (angleRange / segmentCountAdjusted) * i + 0.01;
const segmentColor = isTrack ? theme.colors.border.medium : (displayProcessor(value).color ?? FALLBACK_COLOR);
const colorProps = !isTrack && gradient ? { gradient } : { color: segmentColor };
const angleValue = min + ((max - min) / segmentCountAdjusted) * i;
const segmentAngle = startAngle + (angleRange / segmentCountAdjusted) * i + 0.01;
const segmentColor =
angleValue >= value ? theme.colors.border.medium : (displayProcessor(angleValue).color ?? FALLBACK_COLOR);
const colorProps = angleValue < value && gradient ? { gradient } : { color: segmentColor };

segments.push(
<RadialArcPath
@@ -66,7 +61,7 @@ export const RadialBarSegmented = memo(
fieldDisplay={fieldDisplay}
glowFilter={glowFilter}
shape={shape}
startAngle={segmentStartAngle}
startAngle={segmentAngle}
{...colorProps}
/>
);

@@ -50,7 +50,6 @@ const meta: Meta<StoryProps> = {
thresholdsBar: false,
colorScheme: FieldColorModeId.Thresholds,
decimals: 0,
neutral: undefined,
},
argTypes: {
barWidthFactor: { control: { type: 'range', min: 0.1, max: 1, step: 0.01 } },
@@ -76,7 +75,6 @@ const meta: Meta<StoryProps> = {
],
},
decimals: { control: { type: 'range', min: 0, max: 7 } },
neutral: { control: { type: 'number' } },
},
};

@@ -272,23 +270,6 @@ export const Examples: StoryFn<StoryProps> = (args) => {
barWidthFactor={0.7}
/>
</Stack>
<div>
Neutral <em>(range -50 to 50, neutral = 0)</em>
</div>
<Stack direction={'row'} gap={3}>
<RadialGaugeExample
min={-50}
max={50}
value={-20}
colorScheme={FieldColorModeId.Thresholds}
gradient
shape="gauge"
glowCenter={true}
roundedBars={false}
barWidthFactor={0.7}
neutral={0}
/>
</Stack>
</Stack>
);
};
@@ -349,7 +330,6 @@ interface ExampleProps {
endpointMarker?: RadialGaugeProps['endpointMarker'];
decimals?: number;
showScaleLabels?: boolean;
neutral?: number;
}

export function RadialGaugeExample({
@@ -377,7 +357,6 @@ export function RadialGaugeExample({
endpointMarker = 'glow',
decimals = 0,
showScaleLabels,
neutral,
}: ExampleProps) {
const theme = useTheme2();

@@ -463,7 +442,6 @@ export function RadialGaugeExample({
thresholdsBar={thresholdsBar}
showScaleLabels={showScaleLabels}
endpointMarker={endpointMarker}
neutral={neutral}
/>
);
}

@@ -16,7 +16,6 @@ describe('RadialGauge', () => {
{ description: 'with endpoint marker point', props: { roundedBars: true, endpointMarker: 'point' } },
{ description: 'with thresholds bar', props: { thresholdsBar: true } },
{ description: 'with sparkline', props: { sparkline: true } },
{ description: 'with neutral value', props: { neutral: 50 } },
] satisfies Array<{ description: string; props?: ComponentProps<typeof RadialGaugeExample> }>)(
'should render $description without throwing',
({ props }) => {

@@ -67,11 +67,6 @@ export interface RadialGaugeProps {
/** Specify which text should be visible */
textMode?: RadialTextMode;
showScaleLabels?: boolean;
/**
* If set, the gauge will use the neutral value instead of the min value as the starting point for a gauge.
* this is most useful when you need to show positive and negative values on a gauge.
*/
neutral?: number;
/** For data links */
onClick?: React.MouseEventHandler<HTMLElement>;
timeRange?: TimeRange;
@@ -96,7 +91,6 @@ export function RadialGauge(props: RadialGaugeProps) {
roundedBars = true,
thresholdsBar = false,
showScaleLabels = false,
neutral,
endpointMarker,
onClick,
values,
@@ -119,13 +113,7 @@ export function RadialGauge(props: RadialGaugeProps) {

for (let barIndex = 0; barIndex < values.length; barIndex++) {
const displayValue = values[barIndex];
const { startValueAngle, endValueAngle, angleRange } = getValueAngleForValue(
displayValue,
startAngle,
endAngle,
neutral
);

const { angle, angleRange } = getValueAngleForValue(displayValue, startAngle, endAngle);
const gradientStops = gradient ? buildGradientColors(theme, displayValue) : undefined;
const color = displayValue.display.color ?? FALLBACK_COLOR;
const dimensions = calculateDimensions(
@@ -152,7 +140,7 @@ export function RadialGauge(props: RadialGaugeProps) {
<SpotlightGradient
key={spotlightGradientId}
id={spotlightGradientId}
angle={endValueAngle + startAngle}
angle={angle + startAngle}
dimensions={dimensions}
roundedBars={roundedBars}
theme={theme}
@@ -168,8 +156,6 @@ export function RadialGauge(props: RadialGaugeProps) {
fieldDisplay={displayValue}
angleRange={angleRange}
startAngle={startAngle}
startValueAngle={startValueAngle}
endValueAngle={endValueAngle}
glowFilter={glowFilterRef}
segmentCount={segmentCount}
segmentSpacing={segmentSpacing}
@@ -182,10 +168,9 @@ export function RadialGauge(props: RadialGaugeProps) {
<RadialBar
key={`radial-bar-${barIndex}-${gaugeId}`}
dimensions={dimensions}
angle={angle}
angleRange={angleRange}
startAngle={startAngle}
startValueAngle={startValueAngle}
endValueAngle={endValueAngle}
roundedBars={roundedBars}
glowFilter={glowFilterRef}
endpointMarkerGlowFilter={spotlightGradientRef}

@@ -233,8 +233,7 @@ describe('RadialGauge utils', () => {
const fieldDisplay = createFieldDisplay(50, 0, 100);
const result = getValueAngleForValue(fieldDisplay, 0, 360);

expect(result.startValueAngle).toBe(0);
expect(result.endValueAngle).toBe(180); // 50% of 360°
expect(result.angle).toBe(180); // 50% of 360°
expect(result.angleRange).toBe(360);
});

@@ -242,8 +241,7 @@ describe('RadialGauge utils', () => {
const fieldDisplay = createFieldDisplay(50, 0, 100);
const result = getValueAngleForValue(fieldDisplay, 90, 270);

expect(result.startValueAngle).toBe(0);
expect(result.endValueAngle).toBe(135); // 50% of 360° range
expect(result.angle).toBe(135); // 50% of 360° range
expect(result.angleRange).toBe(270);
});

@@ -251,28 +249,28 @@ describe('RadialGauge utils', () => {
const fieldDisplay = createFieldDisplay(150, 0, 100); // value exceeds max
const result = getValueAngleForValue(fieldDisplay, 0, 360);

expect(result.endValueAngle).toBe(360); // clamped to angleRange
expect(result.angle).toBe(360); // clamped to angleRange
});

it('should handle minimum values', () => {
const fieldDisplay = createFieldDisplay(0, 0, 100);
const result = getValueAngleForValue(fieldDisplay, 0, 360);

expect(result.endValueAngle).toBe(0);
expect(result.angle).toBe(0);
});

it('should handle maximum values', () => {
const fieldDisplay = createFieldDisplay(100, 0, 100);
const result = getValueAngleForValue(fieldDisplay, 0, 360);

expect(result.endValueAngle).toBe(360);
expect(result.angle).toBe(360);
});

it('should handle values lower than min', () => {
const fieldDisplay = createFieldDisplay(-50, 0, 100);
const result = getValueAngleForValue(fieldDisplay, 240, 120);

expect(result.endValueAngle).toBe(0);
expect(result.angle).toBe(0);
});

it('should handle values higher than max', () => {
@@ -280,39 +278,7 @@ describe('RadialGauge utils', () => {
const result = getValueAngleForValue(fieldDisplay, 240, 120);

// Expect the angle to be clamped to the maximum range
expect(result.endValueAngle).toBe(240);
});

it('should handle neutral values', () => {
const fieldDisplay = createFieldDisplay(75, 0, 100);
const result = getValueAngleForValue(fieldDisplay, 0, 360, 50);

expect(result.startValueAngle).toBe(180); // Neutral at 50% of 360°
expect(result.endValueAngle).toBe(90); // 75% - 50% = 25% of 360°
});

it('should handle neutral values equal to value', () => {
const fieldDisplay = createFieldDisplay(50, 0, 100);
const result = getValueAngleForValue(fieldDisplay, 0, 360, 50);

expect(result.startValueAngle).toBe(180); // Neutral at 50% of 360°
expect(result.endValueAngle).toBe(0); // No difference
});

it('should handle neutral values greater than value', () => {
const fieldDisplay = createFieldDisplay(25, 0, 100);
const result = getValueAngleForValue(fieldDisplay, 0, 360, 150);

expect(result.startValueAngle).toBe(90);
expect(result.endValueAngle).toBe(270); // remaining angle to 360
});

it('should handle neutral values below range', () => {
const fieldDisplay = createFieldDisplay(25, 0, 100);
const result = getValueAngleForValue(fieldDisplay, 0, 360, -50);

expect(result.startValueAngle).toBe(0);
expect(result.endValueAngle).toBe(90);
expect(result.angle).toBe(240);
});
});

@@ -28,32 +28,19 @@ export function getValueAngleForValue(
fieldDisplay: FieldDisplay,
startAngle: number,
endAngle: number,
neutral?: number
value = fieldDisplay.display.numeric
) {
const angleRange = (360 % (startAngle === 0 ? 1 : startAngle)) + endAngle;
const value = fieldDisplay.display.numeric;

const valueAngle = getValuePercentageForValue(fieldDisplay, value) * angleRange;
let angle = getValuePercentageForValue(fieldDisplay, value) * angleRange;

let endValueAngle = valueAngle;

let startValueAngle = 0;
if (typeof neutral === 'number') {
const [min, max] = getFieldConfigMinMax(fieldDisplay);
const clampedNeutral = Math.min(Math.max(min, neutral), max);
const neutralAngle = getValuePercentageForValue(fieldDisplay, clampedNeutral) * angleRange;
if (neutralAngle <= valueAngle) {
startValueAngle = neutralAngle;
endValueAngle = valueAngle - neutralAngle;
} else {
startValueAngle = valueAngle;
endValueAngle = neutralAngle - valueAngle;
}
if (angle > angleRange) {
angle = angleRange;
} else if (angle < 0) {
angle = 0;
}

const clampedEndValueAngle = Math.min(Math.max(endValueAngle, 0), angleRange);

return { angleRange, startValueAngle, endValueAngle: clampedEndValueAngle };
return { angleRange, angle };
}
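The neutral handling being removed in this hunk splits the arc into a start offset (where the bar begins) and a sweep (how far it extends), so the bar grows away from the neutral point rather than from min. A Go sketch of that angle math, written only as an illustration of the removed TypeScript (not Grafana's actual code):

```go
package main

import "fmt"

// valueAngles returns the start offset and sweep of a gauge bar when a
// neutral value is set: the bar runs between the neutral angle and the
// value angle, whichever side of neutral the value falls on.
func valueAngles(min, max, value, neutral, angleRange float64) (startValueAngle, endValueAngle float64) {
	pct := func(v float64) float64 {
		p := (v - min) / (max - min)
		if p < 0 {
			p = 0
		} else if p > 1 {
			p = 1
		}
		return p
	}
	valueAngle := pct(value) * angleRange
	// Clamp neutral into [min, max] before converting it to an angle.
	n := neutral
	if n < min {
		n = min
	} else if n > max {
		n = max
	}
	neutralAngle := pct(n) * angleRange
	if neutralAngle <= valueAngle {
		return neutralAngle, valueAngle - neutralAngle
	}
	return valueAngle, neutralAngle - valueAngle
}

func main() {
	// Mirrors the removed test case: value 75 in [0, 100], neutral 50, full circle.
	s, e := valueAngles(0, 100, 75, 50, 360)
	fmt.Println(s, e) // 180 90: bar starts at the neutral midpoint and sweeps 25% of 360°
}
```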

/**

@@ -32,6 +32,8 @@ import (
var (
logger = glog.New("data-proxy-log")
client = newHTTPClient()

errPluginProxyRouteAccessDenied = errors.New("plugin proxy route access denied")
)

type DataSourceProxy struct {
@@ -308,12 +310,21 @@ func (proxy *DataSourceProxy) validateRequest() error {
if err != nil {
return err
}
// issues/116273: When we have an empty input route (or input that becomes relative to "."), we do not want it
// to be ".". This is because the `CleanRelativePath` function will never return "./" prefixes, and as such,
// the common prefix we need is an empty string.
if r1 == "." && proxy.proxyPath != "." {
r1 = ""
}
if r2 == "." && route.Path != "." {
r2 = ""
}
if !strings.HasPrefix(r1, r2) {
continue
}

if !proxy.hasAccessToRoute(route) {
return errors.New("plugin proxy route access denied")
return errPluginProxyRouteAccessDenied
}

proxy.matchedRoute = route

@@ -673,6 +673,94 @@ func TestIntegrationDataSourceProxy_routeRule(t *testing.T) {
runDatasourceAuthTest(t, secretsService, secretsStore, cfg, test)
}
})

t.Run("Regression of 116273: Fallback routes should apply fallback route roles", func(t *testing.T) {
for _, tc := range []struct {
InputPath string
ConfigurationPath string
ExpectError bool
}{
{
InputPath: "api/v2/leak-ur-secrets",
ConfigurationPath: "",
ExpectError: true,
},
{
InputPath: "",
ConfigurationPath: "",
ExpectError: true,
},
{
InputPath: ".",
ConfigurationPath: ".",
ExpectError: true,
},
{
InputPath: "",
ConfigurationPath: ".",
ExpectError: false,
},
{
InputPath: "api",
ConfigurationPath: ".",
ExpectError: false,
},
} {
orEmptyStr := func(s string) string {
if s == "" {
return "<empty>"
}
return s
}
t.Run(
fmt.Sprintf("with inputPath=%s, configurationPath=%s, expectError=%v",
orEmptyStr(tc.InputPath), orEmptyStr(tc.ConfigurationPath), tc.ExpectError),
func(t *testing.T) {
ds := &datasources.DataSource{
UID: "dsUID",
JsonData: simplejson.New(),
}
routes := []*plugins.Route{
{
Path: tc.ConfigurationPath,
ReqRole: org.RoleAdmin,
Method: "GET",
},
{
Path: tc.ConfigurationPath,
ReqRole: org.RoleAdmin,
Method: "POST",
},
{
Path: tc.ConfigurationPath,
ReqRole: org.RoleAdmin,
Method: "PUT",
},
{
Path: tc.ConfigurationPath,
ReqRole: org.RoleAdmin,
Method: "DELETE",
},
}

req, err := http.NewRequestWithContext(t.Context(), "GET", "http://localhost/"+tc.InputPath, nil)
require.NoError(t, err, "failed to create HTTP request")
ctx := &contextmodel.ReqContext{
Context: &web.Context{Req: req},
SignedInUser: &user.SignedInUser{OrgRole: org.RoleViewer},
}
proxy, err := setupDSProxyTest(t, ctx, ds, routes, tc.InputPath)
require.NoError(t, err, "failed to setup proxy test")
err = proxy.validateRequest()
if tc.ExpectError {
require.ErrorIs(t, err, errPluginProxyRouteAccessDenied, "request was not denied due to access denied?")
} else {
require.NoError(t, err, "request was unexpectedly denied access")
}
},
)
}
})
}

// test DataSourceProxy request handling.

@@ -14,6 +14,7 @@ import (
"github.com/grafana/grafana/pkg/apimachinery/validation"
"github.com/grafana/grafana/pkg/storage/unified/sql/db"
"github.com/grafana/grafana/pkg/storage/unified/sql/dbutil"
"github.com/grafana/grafana/pkg/storage/unified/sql/rvmanager"
"github.com/grafana/grafana/pkg/storage/unified/sql/sqltemplate"
gocache "github.com/patrickmn/go-cache"
)
@@ -868,10 +869,18 @@ func (d *dataStore) applyBackwardsCompatibleChanges(ctx context.Context, tx db.T
if key.Action == DataActionDeleted {
generation = 0
}

// In compatibility mode, the previous RV, when available, is saved as a microsecond
// timestamp, as is done in the SQL backend.
previousRV := event.PreviousRV
if event.PreviousRV > 0 && isSnowflake(event.PreviousRV) {
previousRV = rvmanager.RVFromSnowflake(event.PreviousRV)
}

_, err := dbutil.Exec(ctx, tx, sqlKVUpdateLegacyResourceHistory, sqlKVLegacyUpdateHistoryRequest{
SQLTemplate: sqltemplate.New(kv.dialect),
GUID: key.GUID,
PreviousRV: event.PreviousRV,
PreviousRV: previousRV,
Generation: generation,
})

@@ -900,7 +909,7 @@ func (d *dataStore) applyBackwardsCompatibleChanges(ctx context.Context, tx db.T
Name: key.Name,
Action: action,
Folder: key.Folder,
PreviousRV: event.PreviousRV,
PreviousRV: previousRV,
})

if err != nil {
@@ -916,7 +925,7 @@ func (d *dataStore) applyBackwardsCompatibleChanges(ctx context.Context, tx db.T
Name: key.Name,
Action: action,
Folder: key.Folder,
PreviousRV: event.PreviousRV,
PreviousRV: previousRV,
})

if err != nil {
@@ -938,3 +947,15 @@ func (d *dataStore) applyBackwardsCompatibleChanges(ctx context.Context, tx db.T

return nil
}

// isSnowflake returns whether the argument passed is a snowflake ID (new) or a microsecond timestamp (old).
// We try to interpret the number as a microsecond timestamp first. If it represents a time in the past,
// it is considered a microsecond timestamp. Snowflake IDs are much larger integers and would lead
// to dates in the future if interpreted as a microsecond timestamp.
func isSnowflake(rv int64) bool {
ts := time.UnixMicro(rv)
oneHourFromNow := time.Now().Add(time.Hour)
isMicroSecRV := ts.Before(oneHourFromNow)

return !isMicroSecRV
}
|
||||
|
||||
@@ -456,33 +456,27 @@ func testNotifierWatchMultipleEvents(t *testing.T, ctx context.Context, notifier
},
}

errCh := make(chan error)
go func() {
for _, event := range testEvents {
err := eventStore.Save(ctx, event)
require.NoError(t, err)
errCh <- eventStore.Save(ctx, event)
}
}()

// Receive events
receivedEvents := make([]Event, 0, len(testEvents))
for i := 0; i < len(testEvents); i++ {
receivedEvents := make([]string, 0, len(testEvents))
for len(receivedEvents) != len(testEvents) {
select {
case event := <-events:
receivedEvents = append(receivedEvents, event)
receivedEvents = append(receivedEvents, event.Name)
case err := <-errCh:
require.NoError(t, err)
case <-time.After(1 * time.Second):
t.Fatalf("Timed out waiting for event %d", i+1)
t.Fatalf("Timed out waiting for event %d", len(receivedEvents)+1)
}
}

// Verify all events were received
assert.Len(t, receivedEvents, len(testEvents))

// Verify the events match and ordered by resource version
receivedNames := make([]string, len(receivedEvents))
for i, event := range receivedEvents {
receivedNames[i] = event.Name
}

expectedNames := []string{"test-resource-1", "test-resource-2", "test-resource-3"}
assert.ElementsMatch(t, expectedNames, receivedNames)
assert.ElementsMatch(t, expectedNames, receivedEvents)
}

@@ -473,8 +473,6 @@ func (k *sqlKV) Delete(ctx context.Context, section string, key string) error {
return ErrNotFound
}

// TODO reflect change to resource table

return nil
}


@@ -347,7 +347,7 @@ func (k *kvStorageBackend) WriteEvent(ctx context.Context, event WriteEvent) (in
return 0, fmt.Errorf("failed to write data: %w", err)
}

rv = rvmanager.SnowflakeFromRv(rv)
rv = rvmanager.SnowflakeFromRV(rv)
dataKey.ResourceVersion = rv
} else {
err := k.dataStore.Save(ctx, dataKey, bytes.NewReader(event.Value))

@@ -307,7 +307,7 @@ func (m *ResourceVersionManager) execBatch(ctx context.Context, group, resource
// Allocate the RVs
for i, guid := range guids {
guidToRV[guid] = rv
guidToSnowflakeRV[guid] = SnowflakeFromRv(rv)
guidToSnowflakeRV[guid] = SnowflakeFromRV(rv)
rvs[i] = rv
rv++
}
@@ -364,12 +364,20 @@ func (m *ResourceVersionManager) execBatch(ctx context.Context, group, resource
}
}

// takes a unix microsecond rv and transforms into a snowflake format. The timestamp is converted from microsecond to
// takes a unix microsecond RV and transforms into a snowflake format. The timestamp is converted from microsecond to
// millisecond (the integer division) and the remainder is saved in the stepbits section. machine id is always 0
func SnowflakeFromRv(rv int64) int64 {
func SnowflakeFromRV(rv int64) int64 {
return (((rv / 1000) - snowflake.Epoch) << (snowflake.NodeBits + snowflake.StepBits)) + (rv % 1000)
}

// It is generally not possible to convert from a snowflakeID to a microsecond RV due to the loss in precision
// (snowflake ID stores timestamp in milliseconds). However, this implementation stores the microsecond fraction
// in the step bits (see SnowflakeFromRV), allowing us to compute the microsecond timestamp.
func RVFromSnowflake(snowflakeID int64) int64 {
microSecFraction := snowflakeID & ((1 << snowflake.StepBits) - 1)
return ((snowflakeID>>(snowflake.NodeBits+snowflake.StepBits))+snowflake.Epoch)*1000 + microSecFraction
}

// helper utility to compare two RVs. The first RV must be in snowflake format. Will convert rv2 to snowflake and retry
// if comparison fails
func IsRvEqual(rv1, rv2 int64) bool {
@@ -377,7 +385,7 @@ func IsRvEqual(rv1, rv2 int64) bool {
return true
}

return rv1 == SnowflakeFromRv(rv2)
return rv1 == SnowflakeFromRV(rv2)
}

// Lock locks the resource version for the given key

@@ -63,3 +63,13 @@ func TestResourceVersionManager(t *testing.T) {
require.Equal(t, rv, int64(200))
})
}

func TestSnowflakeFromRVRoundtrips(t *testing.T) {
// 2026-01-12 19:33:58.806211 +0000 UTC
offset := int64(1768246438806211) // in microseconds

for n := range int64(100) {
ts := offset + n
require.Equal(t, ts, RVFromSnowflake(SnowflakeFromRV(ts)))
}
}

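The microsecond-to-snowflake packing in this diff is plain bit arithmetic, so its roundtrip property can be checked outside Go. A minimal Python sketch follows; the bit widths and epoch mirror the defaults of the `snowflake` package the Go code appears to use (node bits 10, step bits 12, Twitter epoch in milliseconds), which are assumptions here rather than values stated in the diff:

```python
# Assumed constants mirroring common snowflake-ID defaults.
NODE_BITS = 10
STEP_BITS = 12
EPOCH_MS = 1288834974657  # Twitter epoch in milliseconds (assumed)

def snowflake_from_rv(rv_us: int) -> int:
    # Millisecond part goes into the timestamp field; the leftover
    # microsecond fraction (0..999) is stashed in the step bits,
    # and the node id stays 0 — as described in SnowflakeFromRV above.
    return ((rv_us // 1000 - EPOCH_MS) << (NODE_BITS + STEP_BITS)) + rv_us % 1000

def rv_from_snowflake(sf: int) -> int:
    # Recover the microsecond fraction from the step bits, then rebuild
    # the full microsecond timestamp from the millisecond field.
    micro_fraction = sf & ((1 << STEP_BITS) - 1)
    return ((sf >> (NODE_BITS + STEP_BITS)) + EPOCH_MS) * 1000 + micro_fraction

# Same roundtrip the Go test exercises.
offset = 1768246438806211  # 2026-01-12 19:33:58.806211 UTC, microseconds
for n in range(100):
    ts = offset + n
    assert rv_from_snowflake(snowflake_from_rv(ts)) == ts
```

The fraction stored in the step bits is always below 1000, well under the 2^12 step-bit capacity, which is why the conversion is lossless despite snowflake IDs nominally carrying only millisecond precision.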
@@ -200,7 +200,7 @@ func verifyKeyPath(t *testing.T, db sqldb.DB, ctx context.Context, key *resource
var keyPathRV int64
if isSqlBackend {
// Convert microsecond RV to snowflake for key_path construction
keyPathRV = rvmanager.SnowflakeFromRv(resourceVersion)
keyPathRV = rvmanager.SnowflakeFromRV(resourceVersion)
} else {
// KV backend already provides snowflake RV
keyPathRV = resourceVersion
@@ -434,9 +434,6 @@ func verifyResourceHistoryTable(t *testing.T, db sqldb.DB, namespace string, res

rows, err := db.QueryContext(ctx, query, namespace)
require.NoError(t, err)
defer func() {
_ = rows.Close()
}()

var records []ResourceHistoryRecord
for rows.Next() {
@@ -460,33 +457,34 @@ func verifyResourceHistoryTable(t *testing.T, db sqldb.DB, namespace string, res
for resourceIdx, res := range resources {
// Check create record (action=1, generation=1)
createRecord := records[recordIndex]
verifyResourceHistoryRecord(t, createRecord, res, resourceIdx, 1, 0, 1, resourceVersions[resourceIdx][0])
verifyResourceHistoryRecord(t, createRecord, namespace, res, resourceIdx, 1, 0, 1, resourceVersions[resourceIdx][0])
recordIndex++
}

for resourceIdx, res := range resources {
// Check update record (action=2, generation=2)
updateRecord := records[recordIndex]
verifyResourceHistoryRecord(t, updateRecord, res, resourceIdx, 2, resourceVersions[resourceIdx][0], 2, resourceVersions[resourceIdx][1])
verifyResourceHistoryRecord(t, updateRecord, namespace, res, resourceIdx, 2, resourceVersions[resourceIdx][0], 2, resourceVersions[resourceIdx][1])
recordIndex++
}

for resourceIdx, res := range resources[:2] {
// Check delete record (action=3, generation=0) - only first 2 resources were deleted
deleteRecord := records[recordIndex]
verifyResourceHistoryRecord(t, deleteRecord, res, resourceIdx, 3, resourceVersions[resourceIdx][1], 0, resourceVersions[resourceIdx][2])
verifyResourceHistoryRecord(t, deleteRecord, namespace, res, resourceIdx, 3, resourceVersions[resourceIdx][1], 0, resourceVersions[resourceIdx][2])
recordIndex++
}
}

// verifyResourceHistoryRecord validates a single resource_history record
func verifyResourceHistoryRecord(t *testing.T, record ResourceHistoryRecord, expectedRes struct{ name, folder string }, resourceIdx, expectedAction int, expectedPrevRV int64, expectedGeneration int, expectedRV int64) {
func verifyResourceHistoryRecord(t *testing.T, record ResourceHistoryRecord, namespace string, expectedRes struct{ name, folder string }, resourceIdx, expectedAction int, expectedPrevRV int64, expectedGeneration int, expectedRV int64) {
// Validate GUID (should be non-empty)
require.NotEmpty(t, record.GUID, "GUID should not be empty")

// Validate group/resource/namespace/name
require.Equal(t, "playlist.grafana.app", record.Group)
require.Equal(t, "playlists", record.Resource)
require.Equal(t, namespace, record.Namespace)
require.Equal(t, expectedRes.name, record.Name)

// Validate value contains expected JSON - server modifies/formats the JSON differently for different operations
@@ -513,8 +511,12 @@ func verifyResourceHistoryRecord(t *testing.T, record ResourceHistoryRecord, exp
// For KV backend operations, expectedPrevRV is now in snowflake format (returned by KV backend)
// but resource_history table stores microsecond RV, so we need to use IsRvEqual for comparison
if strings.Contains(record.Namespace, "-kv") {
require.True(t, rvmanager.IsRvEqual(expectedPrevRV, record.PreviousResourceVersion),
"Previous resource version should match (KV backend snowflake format)")
if expectedPrevRV == 0 {
require.Zero(t, record.PreviousResourceVersion)
} else {
require.Equal(t, expectedPrevRV, rvmanager.SnowflakeFromRV(record.PreviousResourceVersion),
"Previous resource version should match (KV backend snowflake format)")
}
} else {
require.Equal(t, expectedPrevRV, record.PreviousResourceVersion)
}
@@ -546,9 +548,6 @@ func verifyResourceTable(t *testing.T, db sqldb.DB, namespace string, resources

rows, err := db.QueryContext(ctx, query, namespace)
require.NoError(t, err)
defer func() {
_ = rows.Close()
}()

var records []ResourceRecord
for rows.Next() {
@@ -612,9 +611,6 @@ func verifyResourceVersionTable(t *testing.T, db sqldb.DB, namespace string, res
// Check that we have exactly one entry for playlist.grafana.app/playlists
rows, err := db.QueryContext(ctx, query, "playlist.grafana.app", "playlists")
require.NoError(t, err)
defer func() {
_ = rows.Close()
}()

var records []ResourceVersionRecord
for rows.Next() {
@@ -649,7 +645,7 @@
isKvBackend := strings.Contains(namespace, "-kv")
recordResourceVersion := record.ResourceVersion
if isKvBackend {
recordResourceVersion = rvmanager.SnowflakeFromRv(record.ResourceVersion)
recordResourceVersion = rvmanager.SnowflakeFromRV(record.ResourceVersion)
}

require.Less(t, recordResourceVersion, int64(9223372036854775807), "resource_version should be reasonable")
@@ -841,24 +837,20 @@ func runMixedConcurrentOperations(t *testing.T, sqlServer, kvServer resource.Res
}

// SQL backend operations
wg.Add(1)
go func() {
defer wg.Done()
wg.Go(func() {
<-startBarrier // Wait for signal to start
if err := runBackendOperationsWithCounts(ctx, sqlServer, namespace+"-sql", "sql", opCounts); err != nil {
errors <- fmt.Errorf("SQL backend operations failed: %w", err)
}
}()
})

// KV backend operations
wg.Add(1)
go func() {
defer wg.Done()
wg.Go(func() {
<-startBarrier // Wait for signal to start
if err := runBackendOperationsWithCounts(ctx, kvServer, namespace+"-kv", "kv", opCounts); err != nil {
errors <- fmt.Errorf("KV backend operations failed: %w", err)
}
}()
})

// Start both goroutines simultaneously
close(startBarrier)

@@ -30,6 +30,7 @@ const (
defaultLogGroupLimit = int32(50)
logIdentifierInternal = "__log__grafana_internal__"
logStreamIdentifierInternal = "__logstream__grafana_internal__"
logGroupsMacro = "$__logGroups"
)

type AWSError struct {
@@ -189,6 +190,47 @@ func (ds *DataSource) executeStartQuery(ctx context.Context, logsClient models.C
logsQuery.QueryLanguage = &cwli
}

region := logsQuery.Region
if region == "" || region == defaultRegion {
region = ds.Settings.Region
}

useARN := false
if len(logsQuery.LogGroups) > 0 && features.IsEnabled(ctx, features.FlagCloudWatchCrossAccountQuerying) && region != "" {
isMonitoringAccount, err := ds.isMonitoringAccount(ctx, region)
if err != nil {
ds.logger.FromContext(ctx).Debug("failed to determine monitoring account status", "err", err)
} else {
useARN = isMonitoringAccount
}
}

var logGroupIdentifiers []string
if len(logsQuery.LogGroups) > 0 {
// Log queries should use ARNs when querying a monitoring account because log group names are not unique across accounts.
if useARN {
for _, lg := range logsQuery.LogGroups {
if lg.Arn != "" {
// The startQuery api does not support arns with a trailing * so we need to remove it
logGroupIdentifiers = append(logGroupIdentifiers, strings.TrimSuffix(lg.Arn, "*"))
}
}
} else {
// deduplicate log group names because we only deduplicate log groups by their ARNs instead of their names when the query is created
seen := make(map[string]struct{}, len(logsQuery.LogGroups))
for _, lg := range logsQuery.LogGroups {
if lg.Name == "" {
continue
}
if _, exists := seen[lg.Name]; exists {
continue
}
seen[lg.Name] = struct{}{}
logGroupIdentifiers = append(logGroupIdentifiers, lg.Name)
}
}
}

finalQueryString := logsQuery.QueryString
// Only for CWLI queries
// The fields @log and @logStream are always included in the results of a user's query
@@ -200,6 +242,21 @@ func (ds *DataSource) executeStartQuery(ctx context.Context, logsClient models.C
logStreamIdentifierInternal + "|" + logsQuery.QueryString
}

// Expand $__logGroups macro for SQL queries
if *logsQuery.QueryLanguage == dataquery.LogsQueryLanguageSQL {
if strings.Contains(finalQueryString, logGroupsMacro) {
if len(logGroupIdentifiers) == 0 {
return nil, backend.DownstreamError(fmt.Errorf("query contains %s but no log groups are selected", logGroupsMacro))
}
quoted := make([]string, len(logGroupIdentifiers))
for i, id := range logGroupIdentifiers {
quoted[i] = fmt.Sprintf("'%s'", id)
}
replacement := fmt.Sprintf("`logGroups(logGroupIdentifier: [%s])`", strings.Join(quoted, ", "))
finalQueryString = strings.Replace(finalQueryString, logGroupsMacro, replacement, 1)
}
}

startQueryInput := &cloudwatchlogs.StartQueryInput{
StartTime: aws.Int64(startTime.Unix()),
// Usually grafana time range allows only second precision, but you can create ranges with milliseconds
@@ -213,47 +270,13 @@

// log group identifiers can be left out if the query is an SQL query
if *logsQuery.QueryLanguage != dataquery.LogsQueryLanguageSQL {
useLogGroupIdentifiers := false
logGroupsFromQuery := len(logsQuery.LogGroups) > 0
if logGroupsFromQuery && features.IsEnabled(ctx, features.FlagCloudWatchCrossAccountQuerying) {
region := logsQuery.Region
if region == "" || region == defaultRegion {
region = ds.Settings.Region
}
if region != "" {
isMonitoringAccount, err := ds.isMonitoringAccount(ctx, region)
if err != nil {
ds.logger.FromContext(ctx).Debug("failed to determine monitoring account status", "err", err)
} else if isMonitoringAccount {
// monitoring accounts require querying by log group identifiers because log group names are not unique across accounts.
var logGroupIdentifiers []string
for _, lg := range logsQuery.LogGroups {
// due to a bug in the startQuery api, we remove * from the arn, otherwise it throws an error
arn := strings.TrimSuffix(lg.Arn, "*")
logGroupIdentifiers = append(logGroupIdentifiers, arn)
}
startQueryInput.LogGroupIdentifiers = logGroupIdentifiers
useLogGroupIdentifiers = true
}
}
}

if !useLogGroupIdentifiers {
if useARN {
startQueryInput.LogGroupIdentifiers = logGroupIdentifiers
} else {
// even though logsQuery.LogGroupNames is deprecated, we still need to support it for backwards compatibility and alert queries
startQueryInput.LogGroupNames = append([]string(nil), logsQuery.LogGroupNames...)
if len(startQueryInput.LogGroupNames) == 0 && logGroupsFromQuery {
// deduplicate log group names because we only deduplicate log groups by their ARNs instead of their names when the query is created
seenLogGroupNames := make(map[string]struct{}, len(logsQuery.LogGroups))
for _, lg := range logsQuery.LogGroups {
if lg.Name == "" {
continue
}
if _, exists := seenLogGroupNames[lg.Name]; exists {
continue
}
seenLogGroupNames[lg.Name] = struct{}{}
startQueryInput.LogGroupNames = append(startQueryInput.LogGroupNames, lg.Name)
}
if len(startQueryInput.LogGroupNames) == 0 && len(logGroupIdentifiers) > 0 {
startQueryInput.LogGroupNames = logGroupIdentifiers
}
}
}

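The macro expansion step in the Go code above boils down to a string transformation: quote each selected identifier and splice a `logGroups(...)` clause over the first occurrence of `$__logGroups`, erroring when the macro is present but nothing is selected. A rough Python sketch of that transformation (the function name is illustrative, not part of the plugin):

```python
LOG_GROUPS_MACRO = "$__logGroups"

def expand_log_groups_macro(query: str, identifiers: list[str]) -> str:
    # Queries without the macro pass through untouched.
    if LOG_GROUPS_MACRO not in query:
        return query
    # Mirrors the DownstreamError path: macro present, nothing selected.
    if not identifiers:
        raise ValueError(
            f"query contains {LOG_GROUPS_MACRO} but no log groups are selected"
        )
    quoted = ", ".join(f"'{i}'" for i in identifiers)
    replacement = f"`logGroups(logGroupIdentifier: [{quoted}])`"
    # Only the first occurrence is replaced, as in the Go code.
    return query.replace(LOG_GROUPS_MACRO, replacement, 1)

print(expand_log_groups_macro("SELECT * FROM $__logGroups", ["group1", "group2"]))
# SELECT * FROM `logGroups(logGroupIdentifier: ['group1', 'group2'])`
```

The identifiers are either deduplicated log group names or, for monitoring accounts, ARNs with any trailing `*` stripped — matching the cases exercised by the tests below.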
@@ -873,6 +873,204 @@ func TestQuery_GetQueryResults(t *testing.T) {
}, resp)
}

func Test_expandLogGroupsMacro(t *testing.T) {
origNewCWLogsClient := NewCWLogsClient
t.Cleanup(func() {
NewCWLogsClient = origNewCWLogsClient
})

var cli fakeCWLogsClient

NewCWLogsClient = func(cfg aws.Config) models.CWLogsClient {
return &cli
}

t.Run("expands $__logGroups macro with log group names when not a monitoring account", func(t *testing.T) {
cli = fakeCWLogsClient{}
ds := newTestDatasource()

_, err := ds.QueryData(context.Background(), &backend.QueryDataRequest{
PluginContext: backend.PluginContext{DataSourceInstanceSettings: &backend.DataSourceInstanceSettings{}},
Queries: []backend.DataQuery{
{
RefID: "A",
TimeRange: backend.TimeRange{From: time.Unix(0, 0), To: time.Unix(1, 0)},
JSON: json.RawMessage(`{
"type": "logAction",
"subtype": "StartQuery",
"queryLanguage": "SQL",
"queryString":"SELECT * FROM $__logGroups",
"logGroups":[{"arn": "arn:aws:logs:us-east-1:123456789012:log-group:group1", "name": "group1"}, {"arn": "arn:aws:logs:us-east-1:123456789012:log-group:group2", "name": "group2"}]
}`),
},
},
})

assert.NoError(t, err)
require.Len(t, cli.calls.startQuery, 1)
assert.Equal(t, "SELECT * FROM `logGroups(logGroupIdentifier: ['group1', 'group2'])`", *cli.calls.startQuery[0].QueryString)
})

t.Run("expands $__logGroups macro with ARNs when monitoring account", func(t *testing.T) {
cli = fakeCWLogsClient{}
ds := newTestDatasource(func(ds *DataSource) {
ds.monitoringAccountCache.Store("us-east-1", true)
})

_, err := ds.QueryData(contextWithFeaturesEnabled(features.FlagCloudWatchCrossAccountQuerying), &backend.QueryDataRequest{
PluginContext: backend.PluginContext{DataSourceInstanceSettings: &backend.DataSourceInstanceSettings{}},
Queries: []backend.DataQuery{
{
RefID: "A",
TimeRange: backend.TimeRange{From: time.Unix(0, 0), To: time.Unix(1, 0)},
JSON: json.RawMessage(`{
"type": "logAction",
"subtype": "StartQuery",
"queryLanguage": "SQL",
"queryString":"SELECT * FROM $__logGroups",
"logGroups":[{"arn": "arn:aws:logs:us-east-1:123456789012:log-group:group1", "name": "group1"}, {"arn": "arn:aws:logs:us-east-1:123456789012:log-group:group2", "name": "group2"}],
"region": "us-east-1"
}`),
},
},
})

assert.NoError(t, err)
require.Len(t, cli.calls.startQuery, 1)
assert.Equal(t, "SELECT * FROM `logGroups(logGroupIdentifier: ['arn:aws:logs:us-east-1:123456789012:log-group:group1', 'arn:aws:logs:us-east-1:123456789012:log-group:group2'])`", *cli.calls.startQuery[0].QueryString)
})

t.Run("strips trailing * from ARNs when expanding macro", func(t *testing.T) {
cli = fakeCWLogsClient{}
ds := newTestDatasource(func(ds *DataSource) {
ds.monitoringAccountCache.Store("us-east-1", true)
})

_, err := ds.QueryData(contextWithFeaturesEnabled(features.FlagCloudWatchCrossAccountQuerying), &backend.QueryDataRequest{
PluginContext: backend.PluginContext{DataSourceInstanceSettings: &backend.DataSourceInstanceSettings{}},
Queries: []backend.DataQuery{
{
RefID: "A",
TimeRange: backend.TimeRange{From: time.Unix(0, 0), To: time.Unix(1, 0)},
JSON: json.RawMessage(`{
"type": "logAction",
"subtype": "StartQuery",
"queryLanguage": "SQL",
"queryString":"SELECT * FROM $__logGroups",
"logGroups":[{"arn": "arn:aws:logs:us-east-1:123456789012:log-group:group1*", "name": "group1"}],
"region": "us-east-1"
}`),
},
},
})

assert.NoError(t, err)
require.Len(t, cli.calls.startQuery, 1)
assert.Equal(t, "SELECT * FROM `logGroups(logGroupIdentifier: ['arn:aws:logs:us-east-1:123456789012:log-group:group1'])`", *cli.calls.startQuery[0].QueryString)
})

t.Run("returns error when $__logGroups macro is used but no log groups are selected", func(t *testing.T) {
cli = fakeCWLogsClient{}
ds := newTestDatasource()

resp, err := ds.QueryData(context.Background(), &backend.QueryDataRequest{
PluginContext: backend.PluginContext{DataSourceInstanceSettings: &backend.DataSourceInstanceSettings{}},
Queries: []backend.DataQuery{
{
RefID: "A",
TimeRange: backend.TimeRange{From: time.Unix(0, 0), To: time.Unix(1, 0)},
JSON: json.RawMessage(`{
"type": "logAction",
"subtype": "StartQuery",
"queryLanguage": "SQL",
"queryString":"SELECT * FROM $__logGroups"
}`),
},
},
})

assert.NoError(t, err)
assert.Contains(t, resp.Responses["A"].Error.Error(), "query contains $__logGroups but no log groups are selected")
})

t.Run("does not expand macro when query does not contain $__logGroups", func(t *testing.T) {
cli = fakeCWLogsClient{}
ds := newTestDatasource()

_, err := ds.QueryData(context.Background(), &backend.QueryDataRequest{
PluginContext: backend.PluginContext{DataSourceInstanceSettings: &backend.DataSourceInstanceSettings{}},
Queries: []backend.DataQuery{
{
RefID: "A",
TimeRange: backend.TimeRange{From: time.Unix(0, 0), To: time.Unix(1, 0)},
JSON: json.RawMessage(`{
"type": "logAction",
"subtype": "StartQuery",
"queryLanguage": "SQL",
"queryString":"SELECT * FROM ` + "`logGroups(logGroupIdentifier: ['my-log-group'])`" + `"
}`),
},
},
})

assert.NoError(t, err)
require.Len(t, cli.calls.startQuery, 1)
assert.Equal(t, "SELECT * FROM `logGroups(logGroupIdentifier: ['my-log-group'])`", *cli.calls.startQuery[0].QueryString)
})

t.Run("does not expand macro for non-SQL query languages", func(t *testing.T) {
cli = fakeCWLogsClient{}
ds := newTestDatasource()

_, err := ds.QueryData(context.Background(), &backend.QueryDataRequest{
PluginContext: backend.PluginContext{DataSourceInstanceSettings: &backend.DataSourceInstanceSettings{}},
Queries: []backend.DataQuery{
{
RefID: "A",
TimeRange: backend.TimeRange{From: time.Unix(0, 0), To: time.Unix(1, 0)},
JSON: json.RawMessage(`{
"type": "logAction",
"subtype": "StartQuery",
"queryLanguage": "CWLI",
"queryString":"fields @message | $__logGroups",
"logGroups":[{"arn": "arn:aws:logs:us-east-1:123456789012:log-group:group1", "name": "group1"}]
}`),
},
},
})

assert.NoError(t, err)
require.Len(t, cli.calls.startQuery, 1)
assert.Contains(t, *cli.calls.startQuery[0].QueryString, "$__logGroups")
})

t.Run("expands macro with single log group", func(t *testing.T) {
cli = fakeCWLogsClient{}
ds := newTestDatasource()

_, err := ds.QueryData(context.Background(), &backend.QueryDataRequest{
PluginContext: backend.PluginContext{DataSourceInstanceSettings: &backend.DataSourceInstanceSettings{}},
Queries: []backend.DataQuery{
{
RefID: "A",
TimeRange: backend.TimeRange{From: time.Unix(0, 0), To: time.Unix(1, 0)},
JSON: json.RawMessage(`{
"type": "logAction",
"subtype": "StartQuery",
"queryLanguage": "SQL",
"queryString":"SELECT * FROM $__logGroups",
"logGroups":[{"arn": "arn:aws:logs:us-east-1:123456789012:log-group:single-group", "name": "single-group"}]
}`),
},
},
})

assert.NoError(t, err)
require.Len(t, cli.calls.startQuery, 1)
assert.Equal(t, "SELECT * FROM `logGroups(logGroupIdentifier: ['single-group'])`", *cli.calls.startQuery[0].QueryString)
})
}

func TestGroupResponseFrame(t *testing.T) {
t.Run("Doesn't group results without time field", func(t *testing.T) {
frame := data.NewFrameOfFieldTypes("test", 0, data.FieldTypeString, data.FieldTypeInt32)

@@ -36,7 +36,7 @@ export const DEFAULT_ANNOTATIONS_QUERY: Omit<CloudWatchAnnotationQuery, 'refId'>
export const DEFAULT_CWLI_QUERY_STRING = 'fields @timestamp, @message |\nsort @timestamp desc |\nlimit 20';
export const DEFAULT_PPL_QUERY_STRING = 'fields `@timestamp`, `@message`\n| sort - `@timestamp`\n| head 25s';
export const DEFAULT_SQL_QUERY_STRING =
'SELECT `@timestamp`, `@message`\nFROM `log_group`\nORDER BY `@timestamp` DESC\nLIMIT 25;';
'SELECT `@timestamp`, `@message`\nFROM $__logGroups\nORDER BY `@timestamp` DESC\nLIMIT 25;';

export const getDefaultLogsQuery = (
defaultLogGroups?: LogGroup[],

@@ -97,14 +97,22 @@ describe('LogsSQLCompletionItemProvider', () => {
const suggestions = await getSuggestions(singleLineFullQuery.query, { lineNumber: 1, column: 103 });
const suggestionLabels = suggestions.map((s) => s.label);
expect(suggestionLabels).toEqual(
expect.arrayContaining([FROM, `${FROM} \`logGroups(logGroupIdentifier: [...])\``, CASE, ...ALL_FUNCTIONS])
expect.arrayContaining([
FROM,
`${FROM} $__logGroups`,
`${FROM} \`logGroups(logGroupIdentifier: [...])\``,
CASE,
...ALL_FUNCTIONS,
])
);
});

it('returns logGroups suggestion after from keyword', async () => {
it('returns logGroups and $__logGroups suggestion after from keyword', async () => {
const suggestions = await getSuggestions(singleLineFullQuery.query, { lineNumber: 1, column: 108 });
const suggestionLabels = suggestions.map((s) => s.label);
expect(suggestionLabels).toEqual(expect.arrayContaining(['`logGroups(logGroupIdentifier: [...])`']));
expect(suggestionLabels).toEqual(
expect.arrayContaining(['$__logGroups', '`logGroups(logGroupIdentifier: [...])`'])
);
});

it('returns where, having, limit, group by, order by, and join suggestions after from arguments', async () => {

@@ -142,6 +142,12 @@ export class LogsSQLCompletionItemProvider extends CompletionItemProvider {
command: TRIGGER_SUGGEST,
sortText: CompletionItemPriority.MediumHigh,
});
addSuggestion(`${FROM} $__logGroups`, {
insertText: `${FROM} $__logGroups`,
kind: monaco.languages.CompletionItemKind.Snippet,
sortText: CompletionItemPriority.High,
detail: 'Use selected log groups from the selector',
});
addSuggestion(`${FROM} \`logGroups(logGroupIdentifier: [...])\``, {
insertText: `${FROM} \`logGroups(logGroupIdentifier: [$0])\``,
insertTextRules: monaco.languages.CompletionItemInsertTextRule.InsertAsSnippet,
@@ -152,6 +158,12 @@ export class LogsSQLCompletionItemProvider extends CompletionItemProvider {
break;

case SuggestionKind.AfterFromKeyword:
addSuggestion('$__logGroups', {
insertText: '$__logGroups',
kind: monaco.languages.CompletionItemKind.Variable,
sortText: CompletionItemPriority.High,
detail: 'Expands to selected log groups',
});
addSuggestion('`logGroups(logGroupIdentifier: [...])`', {
insertText: '`logGroups(logGroupIdentifier: [$0])`',
insertTextRules: monaco.languages.CompletionItemInsertTextRule.InsertAsSnippet,

@@ -488,6 +488,7 @@ export const language: CloudWatchLanguage = {
root: [
{ include: '@comments' },
{ include: '@whitespace' },
{ include: '@macros' },
{ include: '@customParams' },
{ include: '@numbers' },
{ include: '@binaries' },
@@ -519,6 +520,7 @@ export const language: CloudWatchLanguage = {
[/\*\//, { token: 'comment.quote', next: '@pop' }],
[/./, 'comment'],
],
macros: [[/\$__[a-zA-Z0-9_]+/, 'type']],
customParams: [
[/\${[A-Za-z0-9._-]*}/, 'variable'],
[/\@\@{[A-Za-z0-9._-]*}/, 'variable'],

@@ -32,27 +32,26 @@ export function RadialBarPanel({

return (
<RadialGauge
alignmentFactors={valueProps.alignmentFactors}
values={[value]}
width={width}
height={height}
barWidthFactor={options.barWidthFactor}
endpointMarker={options.endpointMarker !== 'none' ? options.endpointMarker : undefined}
gradient={options.effects?.gradient}
glowBar={options.effects?.barGlow}
glowCenter={options.effects?.centerGlow}
gradient={options.effects?.gradient}
height={height}
nameManualFontSize={options.text?.titleSize}
neutral={options.neutral}
onClick={menuProps.openMenu}
roundedBars={options.barShape === 'rounded'}
vizCount={valueProps.count}
shape={options.shape}
segmentCount={options.segmentCount}
segmentSpacing={options.segmentSpacing}
shape={options.shape}
showScaleLabels={options.showThresholdLabels}
textMode={options.textMode}
thresholdsBar={options.showThresholdMarkers}
showScaleLabels={options.showThresholdLabels}
alignmentFactors={valueProps.alignmentFactors}
valueManualFontSize={options.text?.valueSize}
values={[value]}
vizCount={valueProps.count}
width={width}
nameManualFontSize={options.text?.titleSize}
endpointMarker={options.endpointMarker !== 'none' ? options.endpointMarker : undefined}
onClick={menuProps.openMenu}
textMode={options.textMode}
/>
);
}

@@ -161,17 +161,6 @@ export const plugin = new PanelPlugin<Options>(RadialBarPanel)
      defaultValue: defaultOptions.textMode,
    });

    builder.addNumberInput({
      path: 'neutral',
      name: t('radialbar.config.neutral.title', 'Neutral value'),
      description: t('radialbar.config.neutral.description', 'Leave empty to use Min as neutral point'),
      category,
      settings: {
        placeholder: t('radialbar.config.neutral.placeholder', 'none'),
        step: 1,
      },
    });

    builder.addBooleanSwitch({
      path: 'sparkline',
      name: t('radialbar.config.sparkline', 'Show sparkline'),
@@ -42,8 +42,7 @@ composableKinds: PanelCfg: {
        barWidthFactor: number | *0.5
        barShape: "flat" | "rounded" | *"flat"
        endpointMarker?: "point" | "glow" | "none" | *"point"
        textMode?: "auto" | "value_and_name" | "value" | "name" | "none" | *"auto"
        neutral?: number
        textMode?: "auto" | "value_and_name" | "value" | "name" | "none" | *"auto"
        effects: GaugePanelEffects | *{}
        sizing: common.BarGaugeSizing & (*"auto" | _)
        minVizWidth: uint32 | *75
@@ -29,7 +29,6 @@ export interface Options extends common.SingleStatBaseOptions {
  endpointMarker?: ('point' | 'glow' | 'none');
  minVizHeight: number;
  minVizWidth: number;
  neutral?: number;
  segmentCount: number;
  segmentSpacing: number;
  shape: ('circle' | 'gauge');
@@ -12572,11 +12572,6 @@
      "endpoint-marker-glow": "Glow",
      "endpoint-marker-none": "None",
      "endpoint-marker-point": "Point",
      "neutral": {
        "description": "Leave empty to use Min as neutral point",
        "placeholder": "none",
        "title": "Neutral value"
      },
      "segment-count": "Segments",
      "segment-spacing": "Segment spacing",
      "shape": "Style",