Compare commits


14 Commits

Author SHA1 Message Date
Dominik Prokop
f1bf478733 Packages: publish packages@6.3.0-alpha.33 2019-07-09 12:06:41 +02:00
Dominik Prokop
8dd6b18ff2 Rename packages scripts 2019-07-09 12:05:00 +02:00
ryan
cec8e94227 add next tag to releases 2019-07-08 11:39:32 -07:00
ryan
9fdd0d9aac use --no-git-tag-version --no-push 2019-07-08 11:30:56 -07:00
ryan
ca0ce6837d add skip-git to version bump 2019-07-08 11:25:02 -07:00
ryan
37ae46c791 v6.3.0-alpha.32 2019-07-08 11:20:46 -07:00
ryan
81053cc539 Merge branch 'master' into lerna
* master:
  Refactor: fix range util imports (#17988)
  Refactor: move dom utils to @grafana/ui (#17976)
  Docs: Documents new features available with Loki data source in Explore (#17984)
  Prometheus: added time range filter to series labels query (#16851)
  Explore: Adds support for new loki 'start' and 'end' params for labels endpoint (#17512)
  Chore: Removes custom debounce utility in favor of lodash/debounce (#17977)
  Api: Fix auth tokens returning wrong seenAt value (#17980)
2019-07-08 11:19:43 -07:00
ryan
06611d75fc prepare to prep 2019-07-08 11:06:56 -07:00
ryan
ce1e74aff1 v6.3.0-alpha.31 2019-07-08 10:57:28 -07:00
ryan
4859e30684 put back prepare :) I'm trying to do v31 2019-07-08 10:56:42 -07:00
Dominik Prokop
5af791e3dd Temporarily remove the prepare script 2019-07-08 17:07:59 +02:00
Dominik Prokop
eef02dc9b1 v6.3.0-alpha.30 2019-07-08 16:38:42 +02:00
Dominik Prokop
6f6d9c842c Add basic scripts 2019-07-08 16:37:22 +02:00
Dominik Prokop
4d6bcb6a14 First attempt for lerna integration 2019-07-08 16:19:50 +02:00
73 changed files with 492 additions and 1870 deletions

View File

@@ -60,9 +60,9 @@ aliases = ["/v1.1", "/guides/reference/admin", "/v3.1"]
<h4>Provisioning</h4>
<p>A guide to help you automate your Grafana setup & configuration.</p>
</a>
<a href="{{< relref "guides/whats-new-in-v6-3.md" >}}" class="nav-cards__item nav-cards__item--guide">
<h4>What's new in v6.3</h4>
<p>Article on all the new cool features and enhancements in v6.3</p>
<a href="{{< relref "guides/whats-new-in-v6-2.md" >}}" class="nav-cards__item nav-cards__item--guide">
<h4>What's new in v6.2</h4>
<p>Article on all the new cool features and enhancements in v6.2</p>
</a>
<a href="{{< relref "tutorials/screencasts.md" >}}" class="nav-cards__item nav-cards__item--guide">
<h4>Screencasts</h4>

View File

@@ -99,18 +99,3 @@ allow_sign_up = true
allowed_organizations = github google
```
### Team Sync (Enterprise only)
> Only available in Grafana Enterprise v6.3+
With Team Sync you can map your GitHub org teams to teams in Grafana so that your users will automatically be added to
the correct teams.
Your GitHub teams can be referenced in two ways:
- `https://github.com/orgs/<org>/teams/<team name>`
- `@<org>/<team name>`
Example: `@grafana/developers`
[Learn more about Team Sync]({{< relref "auth/enhanced_ldap.md" >}})
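The two reference formats above can be normalized with a small parser. This is an illustrative sketch only, not Grafana's actual matching code:

```typescript
// Illustrative sketch: normalize the two GitHub team reference formats
// described above into { org, team }. Not Grafana's actual parsing code.
function parseTeamRef(ref: string): { org: string; team: string } | null {
  // Full URL form: https://github.com/orgs/<org>/teams/<team name>
  const urlMatch = ref.match(/^https:\/\/github\.com\/orgs\/([^/]+)\/teams\/([^/]+)$/);
  if (urlMatch) {
    return { org: urlMatch[1], team: urlMatch[2] };
  }
  // Short form: @<org>/<team name>
  const shortMatch = ref.match(/^@([^/]+)\/(.+)$/);
  if (shortMatch) {
    return { org: shortMatch[1], team: shortMatch[2] };
  }
  return null;
}
```

Both forms resolve to the same team, e.g. `parseTeamRef('@grafana/developers')` yields `{ org: 'grafana', team: 'developers' }`.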

View File

@@ -1,144 +0,0 @@
+++
title = "What's New in Grafana v6.3"
description = "Feature & improvement highlights for Grafana v6.3"
keywords = ["grafana", "new", "documentation", "6.3"]
type = "docs"
[menu.docs]
name = "Version 6.3"
identifier = "v6.3"
parent = "whatsnew"
weight = -14
+++
# What's New in Grafana v6.3
For all details please read the full [CHANGELOG.md](https://github.com/grafana/grafana/blob/master/CHANGELOG.md)
## Highlights
- New Explore features
- [Loki Live Streaming]({{< relref "#loki-live-streaming" >}})
- [Loki Context Queries]({{< relref "#loki-context-queries" >}})
- [Elasticsearch Logs Support]({{< relref "#elasticsearch-logs-support" >}})
- [InfluxDB Logs Support]({{< relref "#influxdb-logs-support" >}})
- [Data links]({{< relref "#data-links" >}})
- [New Time Picker]({{< relref "#new-time-picker" >}})
- [Graph Area Gradients]({{< relref "#graph-gradients" >}}) - A new graph display option!
- Grafana Enterprise
- [LDAP Active Sync]({{< relref "#ldap-active-sync" >}})
- [SAML Authentication]({{< relref "#saml-authentication" >}})
## Explore improvements
This release adds a ton of enhancements to Explore, both general improvements and new data source specific features.
### Loki live streaming
For log queries using the Loki data source you can now stream logs live directly to the Explore UI.
### Loki context queries
After finding a log line through heavy use of query filters, it can be useful to
see the log lines surrounding the line you searched for. The `show context` feature
allows you to view lines before and after the line of interest.
### Elasticsearch logs support
This release adds support for searching and visualizing logs stored in Elasticsearch in Explore mode, with a
simplified query interface designed specifically for log search.
{{< docs-imagebox img="/img/docs/v63/elasticsearch_explore_logs.png" max-width="600px" caption="Elasticsearch logs in Explore" >}}
Please read [Using Elasticsearch in Grafana](/features/datasources/elasticsearch/#querying-logs-beta) for more detailed information on how to get started and use it.
### InfluxDB logs support
This release adds support for searching and visualizing logs stored in InfluxDB in Explore mode, with a
simplified query interface designed specifically for log search.
{{< docs-imagebox img="/img/docs/v63/influxdb_explore_logs.png" max-width="600px" caption="InfluxDB logs in Explore" >}}
Please read [Using InfluxDB in Grafana](/features/datasources/influxdb/#querying-logs-beta) for more detailed information on how to get started and use it.
## Data Links
We have simplified the UI for defining panel drilldown links (and renamed them to Panel links). We have also added a
new type of link named `Data link`. The reason for having two different types is to make it clear how they are used
and which variables you can use in each. Panel links are only shown in the top left corner of
the panel and cannot reference a series name or any data field,
while `Data links` are used by the actual visualization and can reference data fields.
Example:
```url
http://my-grafana.com/d/bPCI6VSZz/other-dashboard?var-server=${__series_name}
```
You have access to these variables:
Name | Description
------------ | -------------
*${__series_name}* | The name of the time series (or table)
*${__value_time}* | The time of the point you're clicking on (in millisecond epoch)
*${__url_time_range}* | Interpolates as the full time range (e.g. from=21312323412&to=21312312312)
*${__all_variables}* | Adds all current variables (and current values) to the url
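A minimal sketch of how these variables could be interpolated into a link URL. This is illustrative only; the variable names mirror the table above, but `LinkContext` and `interpolateDataLink` are hypothetical, not Grafana's implementation:

```typescript
// Illustrative sketch of data link variable interpolation.
// Not Grafana's actual implementation; LinkContext is a hypothetical shape.
interface LinkContext {
  seriesName: string;
  valueTime: number; // millisecond epoch of the clicked point
  timeRange: { from: number; to: number };
  variables: Record<string, string>; // current template variables
}

function interpolateDataLink(url: string, ctx: LinkContext): string {
  const allVars = Object.entries(ctx.variables)
    .map(([name, value]) => `var-${name}=${encodeURIComponent(value)}`)
    .join('&');
  return url
    .replace('${__series_name}', encodeURIComponent(ctx.seriesName))
    .replace('${__value_time}', String(ctx.valueTime))
    .replace('${__url_time_range}', `from=${ctx.timeRange.from}&to=${ctx.timeRange.to}`)
    .replace('${__all_variables}', allVars);
}
```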
You can then click on a point in the graph.
{{< docs-imagebox img="/img/docs/v63/graph_datalink.png" max-width="400px" caption="Graph data link" >}}
For now only the Graph panel supports `Data links`, but we hope to add them to more visualizations.
## New Time Picker
The time picker has been redesigned with a simpler layout that makes quick ranges easier to access.
{{< docs-imagebox img="/img/docs/v63/time_picker.png" max-width="400px" caption="New Time Picker" >}}
## Graph Gradients
Want more eye candy in your graphs? Then the fill gradient option might be for you! It works really well for
graphs with only a single series.
{{< docs-imagebox img="/img/docs/v63/graph_gradient_area.jpeg" max-width="800px" caption="Graph Gradient Area" >}}
Looks really nice in light theme as well.
{{< docs-imagebox img="/img/docs/v63/graph_gradients_white.png" max-width="800px" caption="Graph Gradient Area" >}}
## Grafana Enterprise
Substantial refactoring and improvements to the external auth systems have gone into this release, making the features
listed below possible and laying a foundation for future enhancements.
### LDAP Active Sync
This is a new Enterprise feature that enables background syncing of user information, org roles, and team memberships.
This syncing is otherwise only done at login time. With this feature you can schedule how often this user synchronization
should occur.
For example, let's say a user is removed from an LDAP group. In previous versions of Grafana, an admin would have to
wait for the user to log out or for the session to expire before the Grafana permissions updated, a process that can take days.
With active sync, the user is automatically removed from the corresponding team in Grafana, or even logged out and disabled
if they no longer belong to an LDAP group that grants them access to Grafana.
[Read more](/auth/enhanced_ldap/#active-ldap-synchronization)
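In `grafana.ini`, scheduling that background sync looks roughly like this (a hedged sketch; exact option names and defaults may differ between Enterprise versions):

```ini
[auth.ldap]
enabled = true
config_file = /etc/grafana/ldap.toml

# Enterprise only (sketch): sync LDAP users in the background on a cron schedule
active_sync_enabled = true
sync_cron = "0 0 1 * * *"  # once a day, at 01:00
```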
### SAML Authentication
Built-in support for SAML is now available in Grafana Enterprise.
### Team Sync for GitHub OAuth
When setting up OAuth with GitHub it's now possible to sync GitHub teams with Teams in Grafana.
[See docs]({{< relref "auth/github.md" >}})
### Team Sync for Auth Proxy
We've added support for enriching the Auth Proxy headers with Teams information, which makes it possible
to use Team Sync with Auth Proxy.
[See docs](/auth/auth-proxy/#auth-proxy-authentication)

View File

@@ -2,5 +2,5 @@
"npmClient": "yarn",
"useWorkspaces": true,
"packages": ["packages/*"],
"version": "6.3.0-beta.1"
"version": "6.3.0-alpha.33"
}

View File

@@ -146,9 +146,9 @@
"prettier:write": "prettier --list-different \"**/*.{ts,tsx,scss}\" --write",
"precommit": "grafana-toolkit precommit",
"themes:generate": "ts-node --project ./scripts/cli/tsconfig.json ./scripts/cli/generateSassVariableFiles.ts",
"packages:prepare": "lerna run clean && npm run test && lerna version --tag-version-prefix=\"packages@\" -m \"Packages: publish %s\" --no-push",
"packages:prepare": "npm run test && lerna version --tag-version-prefix=\"packages@\" -m \"Packages: publish %s\" --no-push",
"packages:build": "lerna run clean && lerna run build",
"packages:publish": "lerna publish from-package --contents dist --dist-tag next --tag-version-prefix=\"packages@\""
"packages:publish": "lerna publish from-package --contents dist --tag-version-prefix=\"packages@\" --dist-tag next"
},
"husky": {
"hooks": {

View File

@@ -1,15 +0,0 @@
## Grafana frontend packages
## Releasing new version
We use [Lerna](https://github.com/lerna/lerna) for packages versioning and releases
### Manual release
1. Run the `packages:prepare` script from the root directory. This performs a cleanup, runs all tests, and bumps the version for all packages. It also creates a `packages@[version]` tag and a version bump commit with the message `Packages: publish [version]`.
2. Run the `packages:build` script to prepare the distribution packages.
3. Run `packages:publish` to publish the new versions.
   - add `--dist-tag next` to publish under the `next` tag
4. Push the version commit.
### Building individual packages
To build individual packages, run `grafana-toolkit package:build --scope=<ui|toolkit|runtime|data>`.
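Put together, a manual release run from the repo root looks roughly like this (a sketch of the steps above, assuming the repo's yarn scripts shown in the diff; flags are those from the root `package.json`):

```shell
# Sketch of the manual release flow described above (run from repo root).
yarn packages:prepare          # test, bump versions, create packages@<version> tag + commit
yarn packages:build            # build distribution packages into each package's dist/
yarn packages:publish          # publish from dist/ under the `next` dist-tag
git push && git push --tags    # push the version bump commit and tag
```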

View File

@@ -1,3 +1,3 @@
# Grafana Data Library
This package holds the root data types and functions used within Grafana.
The core data components

View File

@@ -1,6 +1,6 @@
{
"name": "@grafana/data",
"version": "6.3.0-beta.1",
"version": "6.3.0-alpha.33",
"description": "Grafana Data Library",
"keywords": [
"typescript"
@@ -11,8 +11,7 @@
"typecheck": "tsc --noEmit",
"clean": "rimraf ./dist ./compiled",
"bundle": "rollup -c rollup.config.ts",
"build": "grafana-toolkit package:build --scope=data",
"postpublish": "npm run clean"
"build": "grafana-toolkit package:build --scope=data"
},
"author": "Grafana Labs",
"license": "Apache-2.0",

View File

@@ -97,9 +97,6 @@ export interface AnnotationEvent {
dashboardId?: number;
panelId?: number;
userId?: number;
login?: string;
email?: string;
avatarUrl?: string;
time?: number;
timeEnd?: number;
isRegion?: boolean;

View File

@@ -91,58 +91,4 @@ describe('Stats Calculators', () => {
expect(stats.step).toEqual(100);
expect(stats.delta).toEqual(300);
});
it('consistent results for first/last value with null', () => {
const info = [
{
rows: [[null], [200], [null]], // first/last value is null
result: 200,
},
{
rows: [[null], [null], [null]], // All null
result: undefined,
},
{
rows: [], // Empty row
result: undefined,
},
];
const fields = [{ name: 'A' }];
const stats = reduceField({
series: { rows: info[0].rows, fields },
fieldIndex: 0,
reducers: [ReducerID.first, ReducerID.last, ReducerID.firstNotNull, ReducerID.lastNotNull], // uses standard path
});
expect(stats[ReducerID.first]).toEqual(null);
expect(stats[ReducerID.last]).toEqual(null);
expect(stats[ReducerID.firstNotNull]).toEqual(200);
expect(stats[ReducerID.lastNotNull]).toEqual(200);
const reducers = [ReducerID.lastNotNull, ReducerID.firstNotNull];
for (const input of info) {
for (const reducer of reducers) {
const v1 = reduceField({
series: { rows: input.rows, fields },
fieldIndex: 0,
reducers: [reducer, ReducerID.mean], // uses standard path
})[reducer];
const v2 = reduceField({
series: { rows: input.rows, fields },
fieldIndex: 0,
reducers: [reducer], // uses optimized path
})[reducer];
if (v1 !== v2 || v1 !== input.result) {
const msg =
`Invalid ${reducer} result for: ` +
input.rows.join(', ') +
` Expected: ${input.result}` + // configured
          ` Received: Multiple: ${v1}, Single: ${v2}`;
expect(msg).toEqual(null);
}
}
}
});
});
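The removed test above exercised the distinction between raw and not-null endpoint reducers. The two behaviors can be sketched standalone (illustrative only, not the actual `reduceField` implementation):

```typescript
// Illustrative sketch of the two endpoint-reducer behaviors; not the
// actual reduceField implementation.
type Row = Array<number | null>;

// Raw endpoints: may return null if the series starts/ends with null.
const first = (rows: Row[], idx: number) => (rows.length ? rows[0][idx] : undefined);
const last = (rows: Row[], idx: number) => (rows.length ? rows[rows.length - 1][idx] : undefined);

// Not-null variants: scan inward past null values.
function firstNotNull(rows: Row[], idx: number): number | undefined {
  for (const row of rows) {
    const v = row[idx];
    if (v != null) return v;
  }
  return undefined;
}

function lastNotNull(rows: Row[], idx: number): number | undefined {
  for (let i = rows.length - 1; i >= 0; i--) {
    const v = rows[i][idx];
    if (v != null) return v;
  }
  return undefined;
}
```

For `rows = [[null], [200], [null]]`, `first` and `last` return `null` while the not-null variants return `200`, matching the expectations in the removed test.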

View File

@@ -17,9 +17,6 @@ export enum ReducerID {
delta = 'delta',
step = 'step',
firstNotNull = 'firstNotNull',
lastNotNull = 'lastNotNull',
changeCount = 'changeCount',
distinctCount = 'distinctCount',
@@ -134,29 +131,15 @@ let hasBuiltIndex = false;
function getById(id: string): FieldReducerInfo | undefined {
if (!hasBuiltIndex) {
[
{
id: ReducerID.lastNotNull,
name: 'Last (not null)',
description: 'Last non-null value',
standard: true,
alias: 'current',
reduce: calculateLastNotNull,
},
{
id: ReducerID.last,
name: 'Last',
description: 'Last Value',
description: 'Last Value (current)',
standard: true,
alias: 'current',
reduce: calculateLast,
},
{ id: ReducerID.first, name: 'First', description: 'First Value', standard: true, reduce: calculateFirst },
{
id: ReducerID.firstNotNull,
name: 'First (not null)',
description: 'First non-null value',
standard: true,
reduce: calculateFirstNotNull,
},
{ id: ReducerID.min, name: 'Min', description: 'Minimum Value', standard: true },
{ id: ReducerID.max, name: 'Max', description: 'Maximum Value', standard: true },
{ id: ReducerID.mean, name: 'Mean', description: 'Average Value', standard: true, alias: 'avg' },
@@ -248,8 +231,6 @@ function doStandardCalcs(data: DataFrame, fieldIndex: number, ignoreNulls: boole
mean: null,
last: null,
first: null,
lastNotNull: undefined,
firstNotNull: undefined,
count: 0,
nonNullCount: 0,
allIsNull: true,
@@ -265,10 +246,6 @@ function doStandardCalcs(data: DataFrame, fieldIndex: number, ignoreNulls: boole
for (let i = 0; i < data.rows.length; i++) {
let currentValue = data.rows[i][fieldIndex];
if (i === 0) {
calcs.first = currentValue;
}
calcs.last = currentValue;
if (currentValue === null) {
if (ignoreNulls) {
@@ -280,9 +257,9 @@ function doStandardCalcs(data: DataFrame, fieldIndex: number, ignoreNulls: boole
}
if (currentValue !== null) {
const isFirst = calcs.firstNotNull === undefined;
const isFirst = calcs.first === null;
if (isFirst) {
calcs.firstNotNull = currentValue;
calcs.first = currentValue;
}
if (isNumber(currentValue)) {
@@ -291,12 +268,12 @@ function doStandardCalcs(data: DataFrame, fieldIndex: number, ignoreNulls: boole
calcs.nonNullCount++;
if (!isFirst) {
const step = currentValue - calcs.lastNotNull!;
const step = currentValue - calcs.last!;
if (calcs.step > step) {
calcs.step = step; // the minimum interval
}
if (calcs.lastNotNull! > currentValue) {
if (calcs.last! > currentValue) {
// counter reset
calcs.previousDeltaUp = false;
if (i === data.rows.length - 1) {
@@ -330,7 +307,7 @@ function doStandardCalcs(data: DataFrame, fieldIndex: number, ignoreNulls: boole
calcs.allIsZero = false;
}
calcs.lastNotNull = currentValue;
calcs.last = currentValue;
}
}
@@ -354,8 +331,10 @@ function doStandardCalcs(data: DataFrame, fieldIndex: number, ignoreNulls: boole
calcs.range = calcs.max - calcs.min;
}
if (isNumber(calcs.firstNotNull) && isNumber(calcs.lastNotNull)) {
calcs.diff = calcs.lastNotNull - calcs.firstNotNull;
if (calcs.first !== null && calcs.last !== null) {
if (isNumber(calcs.first) && isNumber(calcs.last)) {
calcs.diff = calcs.last - calcs.first;
}
}
return calcs;
@@ -365,41 +344,10 @@ function calculateFirst(data: DataFrame, fieldIndex: number, ignoreNulls: boolea
return { first: data.rows[0][fieldIndex] };
}
function calculateFirstNotNull(
data: DataFrame,
fieldIndex: number,
ignoreNulls: boolean,
nullAsZero: boolean
): FieldCalcs {
for (let idx = 0; idx < data.rows.length; idx++) {
const v = data.rows[idx][fieldIndex];
if (v != null) {
return { firstNotNull: v };
}
}
return { firstNotNull: undefined };
}
function calculateLast(data: DataFrame, fieldIndex: number, ignoreNulls: boolean, nullAsZero: boolean): FieldCalcs {
return { last: data.rows[data.rows.length - 1][fieldIndex] };
}
function calculateLastNotNull(
data: DataFrame,
fieldIndex: number,
ignoreNulls: boolean,
nullAsZero: boolean
): FieldCalcs {
let idx = data.rows.length - 1;
while (idx >= 0) {
const v = data.rows[idx--][fieldIndex];
if (v != null) {
return { lastNotNull: v };
}
}
return { lastNotNull: undefined };
}
function calculateChangeCount(
data: DataFrame,
fieldIndex: number,

View File

@@ -1,11 +1,19 @@
{
"extends": "../tsconfig.json",
"extends": "../../tsconfig.json",
"include": ["src/**/*.ts", "src/**/*.tsx", "../../public/app/types/jquery/*.ts"],
"exclude": ["dist", "node_modules"],
"compilerOptions": {
"rootDirs": ["."],
"typeRoots": ["./node_modules/@types", "types"],
"module": "esnext",
"outDir": "compiled",
"declaration": true,
"declarationDir": "dist",
"outDir": "compiled"
"strict": true,
"alwaysStrict": true,
"noImplicitAny": true,
"strictNullChecks": true,
"typeRoots": ["./node_modules/@types", "types"],
"skipLibCheck": true, // Temp workaround for Duplicate identifier tsc errors,
"removeComments": false
}
}

View File

@@ -1,3 +1,3 @@
# Grafana Runtime library
This package allows access to grafana services. It requires Grafana to be running already and the functions to be imported as externals.
Interfaces that let you use the runtime...

View File

@@ -1,9 +1,11 @@
{
"name": "@grafana/runtime",
"version": "6.3.0-beta.1",
"version": "6.3.0-alpha.33",
"description": "Grafana Runtime Library",
"keywords": [
"grafana"
"typescript",
"react",
"react-component"
],
"main": "src/index.ts",
"scripts": {
@@ -11,8 +13,7 @@
"typecheck": "tsc --noEmit",
"clean": "rimraf ./dist ./compiled",
"bundle": "rollup -c rollup.config.ts",
"build": "grafana-toolkit package:build --scope=runtime",
"postpublish": "npm run clean"
"build": "grafana-toolkit package:build --scope=runtime"
},
"author": "Grafana Labs",
"license": "Apache-2.0",

View File

@@ -20,7 +20,7 @@ const buildCjsPackage = ({ env }) => {
globals: {},
},
],
external: ['lodash', '@grafana/ui', '@grafana/data'], // Use Lodash from grafana
external: ['lodash', '@grafana/ui'], // Use Lodash from grafana
plugins: [
commonjs({
include: /node_modules/,

View File

@@ -1,11 +1,19 @@
{
"extends": "../tsconfig.json",
"extends": "../../tsconfig.json",
"include": ["src/**/*.ts", "src/**/*.tsx", "../../public/app/types/jquery/*.ts"],
"exclude": ["dist", "node_modules"],
"compilerOptions": {
"rootDirs": ["."],
"typeRoots": ["./node_modules/@types", "types"],
"module": "esnext",
"outDir": "compiled",
"declaration": true,
"declarationDir": "dist",
"outDir": "compiled"
"strict": true,
"alwaysStrict": true,
"noImplicitAny": true,
"strictNullChecks": true,
"typeRoots": ["./node_modules/@types", "types"],
"skipLibCheck": true, // Temp workaround for Duplicate identifier tsc errors,
"removeComments": false
}
}

View File

@@ -84,7 +84,7 @@ Additionally, you can provide Jest config via the package.json file.
## Working with CSS & static assets
We support pure css, SASS and CSS in JS approach (via Emotion).
We support pure css, SASS and CSS in JS approach (via Emotion). All static assets referenced in your code (i.e. images) should be placed under `src/static` directory and referenced using relative paths.
1. Single css/sass file
Create your css/sass file and import it in your plugin entry point (typically module.ts):
@@ -101,8 +101,6 @@ If you want to provide different stylesheets for dark/light theme, create `dark.
TODO: add note about loadPluginCss
Note that static files (png, svg, json, html) are all copied to the dist directory when the plugin is bundled. Relative paths to those files do not change.
3. Emotion
Starting from Grafana 6.2, our suggested way of styling plugins is with [Emotion](https://emotion.sh). It's a CSS-in-JS library that we use internally at Grafana. The biggest advantage of using Emotion is that you get access to Grafana theme variables.

View File

@@ -1,11 +1,11 @@
{
"name": "@grafana/toolkit",
"version": "6.3.0-beta.1",
"version": "6.3.0-alpha.33",
"description": "Grafana Toolkit",
"keywords": [
"grafana",
"cli",
"plugins"
"typescript",
"react",
"react-component"
],
"bin": {
"grafana-toolkit": "./bin/grafana-toolkit.js"
@@ -15,8 +15,7 @@
"typecheck": "tsc --noEmit",
"precommit": "npm run tslint & npm run typecheck",
"clean": "rimraf ./dist ./compiled",
"build": "grafana-toolkit toolkit:build",
"postpublish": "npm run clean"
"build": "grafana-toolkit toolkit:build"
},
"author": "Grafana Labs",
"license": "Apache-2.0",
@@ -30,11 +29,9 @@
"@types/node": "^12.0.4",
"@types/react-dev-utils": "^9.0.1",
"@types/semver": "^6.0.0",
"@types/tmp": "^0.1.0",
"@types/webpack": "4.4.34",
"axios": "0.19.0",
"babel-loader": "8.0.6",
"babel-plugin-angularjs-annotate": "0.10.0",
"chalk": "^2.4.2",
"commander": "^2.20.0",
"concurrently": "4.1.0",
@@ -50,6 +47,7 @@
"jest-coverage-badges": "^1.1.2",
"lodash": "4.17.11",
"mini-css-extract-plugin": "^0.7.0",
"ng-annotate-webpack-plugin": "^0.3.0",
"node-sass": "^4.12.0",
"optimize-css-assets-webpack-plugin": "^5.0.3",
"ora": "^3.4.0",

View File

@@ -13,13 +13,7 @@ import { pluginTestTask } from './tasks/plugin.tests';
import { searchTestDataSetupTask } from './tasks/searchTestDataSetup';
import { closeMilestoneTask } from './tasks/closeMilestone';
import { pluginDevTask } from './tasks/plugin.dev';
import {
ciBuildPluginTask,
ciBundlePluginTask,
ciTestPluginTask,
ciDeployPluginTask,
ciSetupPluginTask,
} from './tasks/plugin.ci';
import { pluginCITask } from './tasks/plugin.ci';
import { buildPackageTask } from './tasks/package.build';
export const run = (includeInternalScripts = false) => {
@@ -147,47 +141,15 @@ export const run = (includeInternalScripts = false) => {
});
program
.command('plugin:ci-build')
.option('--platform <platform>', 'For backend task, which backend to run')
.description('Build the plugin, leaving artifacts in /dist')
.command('plugin:ci')
.option('--dryRun', "Dry run (don't post results)")
.description('Run Plugin CI task')
.action(async cmd => {
await execTask(ciBuildPluginTask)({
platform: cmd.platform,
await execTask(pluginCITask)({
dryRun: cmd.dryRun,
});
});
program
.command('plugin:ci-bundle')
.description('Create a zip artifact for the plugin')
.action(async cmd => {
await execTask(ciBundlePluginTask)({});
});
program
.command('plugin:ci-setup')
.option('--installer <installer>', 'Name of installer to download and run')
.description('Install and configure grafana')
.action(async cmd => {
await execTask(ciSetupPluginTask)({
installer: cmd.installer,
});
});
program
.command('plugin:ci-test')
.description('end-to-end test using bundle in /artifacts')
.action(async cmd => {
await execTask(ciTestPluginTask)({
platform: cmd.platform,
});
});
program
.command('plugin:ci-deploy')
.description('Publish plugin CI results')
.action(async cmd => {
await execTask(ciDeployPluginTask)({});
});
program.on('command:*', () => {
console.error('Invalid command: %s\nSee --help for a list of available commands.', program.args.join(' '));
process.exit(1);

View File

@@ -1,3 +1,4 @@
import axios from 'axios';
// @ts-ignore
import * as _ from 'lodash';
import { Task, TaskRunner } from './task';

View File

@@ -0,0 +1,191 @@
// import execa = require('execa');
// import { execTask } from '../utils/execTask';
// import { changeCwdToGrafanaUiDist, changeCwdToGrafanaUi, restoreCwd } from '../utils/cwd';
// import { ReleaseType, inc } from 'semver';
// import { prompt } from 'inquirer';
// import chalk from 'chalk';
// import { useSpinner } from '../utils/useSpinner';
// import { savePackage, buildTask, clean } from './grafanaui.build';
// import { TaskRunner, Task } from './task';
// type VersionBumpType = 'prerelease' | 'patch' | 'minor' | 'major';
// interface ReleaseTaskOptions {
// publishToNpm: boolean;
// usePackageJsonVersion: boolean;
// createVersionCommit: boolean;
// }
// const promptBumpType = async () => {
// return prompt<{ type: VersionBumpType }>([
// {
// type: 'list',
// message: 'Select version bump',
// name: 'type',
// choices: ['prerelease', 'patch', 'minor', 'major'],
// },
// ]);
// };
// const promptPrereleaseId = async (message = 'Is this a prerelease?', allowNo = true) => {
// return prompt<{ id: string }>([
// {
// type: 'list',
// message: message,
// name: 'id',
// choices: allowNo ? ['no', 'alpha', 'beta'] : ['alpha', 'beta'],
// },
// ]);
// };
// const promptConfirm = async (message?: string) => {
// return prompt<{ confirmed: boolean }>([
// {
// type: 'confirm',
// message: message || 'Is that correct?',
// name: 'confirmed',
// default: false,
// },
// ]);
// };
// // Since Grafana core depends on @grafana/ui highly, we run full check before release
// const runChecksAndTests = async () =>
// // @ts-ignore
// useSpinner<void>(`Running checks and tests`, async () => {
// try {
// await execa('npm', ['run', 'test']);
// } catch (e) {
// console.log(e);
// throw e;
// }
// })();
// const bumpVersion = (version: string) =>
// // @ts-ignore
// useSpinner<void>(`Saving version ${version} to package.json`, async () => {
// changeCwdToGrafanaUi();
// await execa('npm', ['version', version]);
// changeCwdToGrafanaUiDist();
// const pkg = require(`${process.cwd()}/package.json`);
// pkg.version = version;
// await savePackage({ path: `${process.cwd()}/package.json`, pkg });
// })();
// const publishPackage = (name: string, version: string) =>
// // @ts-ignore
// useSpinner<void>(`Publishing ${name} @ ${version} to npm registry...`, async () => {
// changeCwdToGrafanaUiDist();
// await execa('npm', ['publish', '--access', 'public']);
// })();
// const ensureMasterBranch = async () => {
// const currentBranch = await execa.stdout('git', ['symbolic-ref', '--short', 'HEAD']);
// const status = await execa.stdout('git', ['status', '--porcelain']);
// if (currentBranch !== 'master' && status !== '') {
// console.error(chalk.red.bold('You need to be on clean master branch to release @grafana/ui'));
// process.exit(1);
// }
// };
// const prepareVersionCommitAndPush = async (version: string) =>
// // @ts-ignore
// useSpinner<void>('Committing and pushing @grafana/ui version update', async () => {
// await execa.stdout('git', ['commit', '-a', '-m', `Upgrade @grafana/ui version to v${version}`]);
// await execa.stdout('git', ['push']);
// })();
// const releaseTaskRunner: TaskRunner<ReleaseTaskOptions> = async ({
// publishToNpm,
// usePackageJsonVersion,
// createVersionCommit,
// }) => {
// changeCwdToGrafanaUi();
// // @ts-ignore
// await clean(); // Clean previous build if exists
// restoreCwd();
// if (publishToNpm) {
// // TODO: Ensure release branch
// // We need to update this when we start keeping @grafana/ui releases in sync with core
// await ensureMasterBranch();
// }
// await runChecksAndTests();
// await execTask(buildTask)({} as any);
// let releaseConfirmed = false;
// let nextVersion;
// changeCwdToGrafanaUiDist();
// const pkg = require(`${process.cwd()}/package.json`);
// console.log(`Current version: ${pkg.version}`);
// do {
// if (!usePackageJsonVersion) {
// const { type } = await promptBumpType();
// console.log(type);
// if (type === 'prerelease') {
// const { id } = await promptPrereleaseId('What kind of prerelease?', false);
// nextVersion = inc(pkg.version, type, id as any);
// } else {
// const { id } = await promptPrereleaseId();
// if (id !== 'no') {
// nextVersion = inc(pkg.version, `pre${type}` as ReleaseType, id as any);
// } else {
// nextVersion = inc(pkg.version, type as ReleaseType);
// }
// }
// } else {
// nextVersion = pkg.version;
// }
// console.log(chalk.yellowBright.bold(`You are going to release a new version of ${pkg.name}`));
// if (usePackageJsonVersion) {
// console.log(chalk.green(`Version based on package.json: `), chalk.bold.yellowBright(`${nextVersion}`));
// } else {
// console.log(chalk.green(`Version bump: ${pkg.version} ->`), chalk.bold.yellowBright(`${nextVersion}`));
// }
// const { confirmed } = await promptConfirm();
// releaseConfirmed = confirmed;
// } while (!releaseConfirmed);
// if (!usePackageJsonVersion) {
// await bumpVersion(nextVersion);
// }
// if (createVersionCommit) {
// await prepareVersionCommitAndPush(nextVersion);
// }
// if (publishToNpm) {
// console.log(chalk.yellowBright.bold(`\nReview dist package.json before proceeding!\n`));
// const { confirmed } = await promptConfirm('Are you ready to publish to npm?');
// if (!confirmed) {
// process.exit();
// }
// await publishPackage(pkg.name, nextVersion);
// console.log(chalk.green(`\nVersion ${nextVersion} of ${pkg.name} successfully released!`));
// console.log(chalk.yellow(`\nUpdated @grafana/ui/package.json with version bump created.`));
// process.exit();
// } else {
// console.log(
// chalk.green(
// `\nVersion ${nextVersion} of ${pkg.name} successfully prepared for release. See packages/grafana-ui/dist`
// )
// );
// console.log(chalk.green(`\nTo publish to npm registry run`), chalk.bold.blue(`npm run gui:publish`));
// }
// };
// export const releaseTask = new Task<ReleaseTaskOptions>('@grafana/ui release', releaseTaskRunner);

View File

@@ -3,6 +3,7 @@ import execa = require('execa');
import * as fs from 'fs';
// @ts-ignore
import * as path from 'path';
import { changeCwdToGrafanaUi, restoreCwd, changeCwdToPackage } from '../utils/cwd';
import chalk from 'chalk';
import { useSpinner } from '../utils/useSpinner';
import { Task, TaskRunner } from './task';

View File

@@ -4,6 +4,7 @@ import execa = require('execa');
import path = require('path');
import fs = require('fs');
import glob = require('glob');
import util = require('util');
import { Linter, Configuration, RuleFailure } from 'tslint';
import * as prettier from 'prettier';
@@ -16,6 +17,7 @@ interface PluginBuildOptions {
export const bundlePlugin = useSpinner<PluginBundleOptions>('Compiling...', async options => await bundleFn(options));
const readFileAsync = util.promisify(fs.readFile);
// @ts-ignore
export const clean = useSpinner<void>('Cleaning', async () => await execa('rimraf', [`${process.cwd()}/dist`]));

View File

@@ -1,5 +1,6 @@
import { Task, TaskRunner } from './task';
import { pluginBuildRunner } from './plugin.build';
import { useSpinner } from '../utils/useSpinner';
import { restoreCwd } from '../utils/cwd';
import { getPluginJson } from '../../config/utils/pluginValidation';
@@ -9,8 +10,7 @@ import path = require('path');
import fs = require('fs');
export interface PluginCIOptions {
platform?: string;
installer?: string;
dryRun?: boolean;
}
const calcJavascriptSize = (base: string, files?: string[]): number => {
@@ -33,106 +33,22 @@ const calcJavascriptSize = (base: string, files?: string[]): number => {
return size;
};
const getWorkFolder = () => {
let dir = `${process.cwd()}/work`;
if (process.env.CIRCLE_JOB) {
dir = path.resolve(dir, process.env.CIRCLE_JOB);
}
if (!fs.existsSync(dir)) {
fs.mkdirSync(dir, { recursive: true });
}
return dir;
};
const writeWorkStats = (startTime: number, workDir: string) => {
const elapsed = Date.now() - startTime;
const stats = {
job: `${process.env.CIRCLE_JOB}`,
startTime,
buildTime: elapsed,
endTime: Date.now(),
};
const f = path.resolve(workDir, 'stats.json');
fs.writeFile(f, JSON.stringify(stats, null, 2), err => {
if (err) {
throw new Error('Unable to write stats: ' + f);
}
});
};
/**
* 1. BUILD
*
* when platform exists it is building backend, otherwise frontend
*
* Each build writes data:
* ~/work/build_xxx/
*
* Anything that should be put into the final zip file should be put in:
* ~/work/build_xxx/dist
*/
const buildPluginRunner: TaskRunner<PluginCIOptions> = async ({ platform }) => {
const pluginCIRunner: TaskRunner<PluginCIOptions> = async ({ dryRun }) => {
const start = Date.now();
const workDir = getWorkFolder();
await execa('rimraf', [workDir]);
fs.mkdirSync(workDir);
const distDir = `${process.cwd()}/dist`;
const artifactsDir = `${process.cwd()}/artifacts`;
await execa('rimraf', [`${process.cwd()}/coverage`]);
await execa('rimraf', [artifactsDir]);
if (platform) {
console.log('TODO, backend support?');
const file = path.resolve(workDir, 'README.txt');
fs.writeFile(file, 'TODO... build it!', err => {
if (err) {
throw new Error('Unable to write: ' + file);
}
});
} else {
// Do regular build process with coverage
await pluginBuildRunner({ coverage: true });
}
// Do regular build process
await pluginBuildRunner({ coverage: true });
const elapsed = Date.now() - start;
// Move dist to the scoped work folder
const distDir = path.resolve(process.cwd(), 'dist');
if (fs.existsSync(distDir)) {
fs.renameSync(distDir, path.resolve(workDir, 'dist'));
}
writeWorkStats(start, workDir);
};
export const ciBuildPluginTask = new Task<PluginCIOptions>('Build Plugin', buildPluginRunner);
/**
* 2. BUNDLE
*
* Take everything from `~/work/build_XXX/dist` and zip it into
* artifacts
*
*/
const bundlePluginRunner: TaskRunner<PluginCIOptions> = async () => {
const start = Date.now();
const workDir = getWorkFolder();
// Copy all `dist` folders to the root dist folder
const distDir = path.resolve(process.cwd(), 'dist');
if (!fs.existsSync(distDir)) {
fs.mkdirSync(distDir);
}
fs.mkdirSync(distDir, { recursive: true });
const dirs = fs.readdirSync(workDir);
for (const dir of dirs) {
if (dir.startsWith('build_')) {
const contents = path.resolve(dir, 'dist');
if (fs.existsSync(contents)) {
await execa('cp', ['-rp', contents, distDir]);
}
}
}
// Create an artifact
const artifactsDir = path.resolve(process.cwd(), 'artifacts');
if (!fs.existsSync(artifactsDir)) {
fs.mkdirSync(artifactsDir, { recursive: true });
fs.mkdirSync(artifactsDir);
}
// TODO: can this be typed from @grafana/ui?
const pluginInfo = getPluginJson(`${distDir}/plugin.json`);
const zipName = pluginInfo.id + '-' + pluginInfo.info.version + '.zip';
const zipFile = path.resolve(artifactsDir, zipName);
@@ -140,165 +56,23 @@ const bundlePluginRunner: TaskRunner<PluginCIOptions> = async () => {
await execa('zip', ['-r', zipFile, '.']);
restoreCwd();
const zipStats = fs.statSync(zipFile);
if (zipStats.size < 100) {
throw new Error('Invalid zip file: ' + zipFile);
}
// Shell redirection does not pass through execa's argument list, so write the checksum explicitly
const sha1 = await execa('sha1sum', [zipFile]);
fs.writeFileSync(zipFile + '.sha1', sha1.stdout);
const info = {
name: zipName,
size: zipStats.size,
};
const f = path.resolve(artifactsDir, 'info.json');
fs.writeFile(f, JSON.stringify(info, null, 2), err => {
if (err) {
throw new Error('Error writing artifact info: ' + f);
}
});
writeWorkStats(start, workDir);
};
export const ciBundlePluginTask = new Task<PluginCIOptions>('Bundle Plugin', bundlePluginRunner);
/**
* 3. Setup (install grafana and setup provisioning)
*
* install grafana and deploy the plugin zip to it
*
*/
const setupPluginRunner: TaskRunner<PluginCIOptions> = async ({ installer }) => {
const start = Date.now();
if (!installer) {
throw new Error('Missing installer path');
}
// Download the grafana installer
const workDir = getWorkFolder();
const installFile = path.resolve(workDir, installer);
if (!fs.existsSync(installFile)) {
console.log('download', installer);
const exe = await execa('wget', ['-O', installFile, 'https://dl.grafana.com/oss/release/' + installer]);
console.log(exe.stdout);
}
// Find the plugin zip file
const artifactsDir = path.resolve(process.cwd(), 'artifacts');
const artifactsInfo = require(path.resolve(artifactsDir, 'info.json'));
const pluginZip = path.resolve(workDir, 'artifacts', artifactsInfo.name);
if (!fs.existsSync(pluginZip)) {
throw new Error('Missing zip file: ' + pluginZip);
}
// Create a grafana runtime folder
const grafanaPluginsDir = path.resolve(require('os').homedir(), 'grafana', 'plugins');
await execa('rimraf', [grafanaPluginsDir]);
fs.mkdirSync(grafanaPluginsDir, { recursive: true });
// unzip package.zip -d /opt
let exe = await execa('unzip', [pluginZip, '-d', grafanaPluginsDir]);
console.log(exe.stdout);
// Write the custom settings
const customIniPath = '/usr/share/grafana/conf/custom.ini';
const customIniBody = `[paths] \n` + `plugins = ${grafanaPluginsDir}\n` + '';
fs.writeFile(customIniPath, customIniBody, err => {
if (err) {
throw new Error('Unable to write: ' + customIniPath);
}
});
console.log('Install Grafana');
exe = await execa('sudo', ['dpkg', '-i', installFile]);
console.log(exe.stdout);
exe = await execa('sudo', ['grafana-server', 'start']);
console.log(exe.stdout);
exe = await execa('grafana-cli', ['--version']);
writeWorkStats(start, workDir + '_setup');
};
export const ciSetupPluginTask = new Task<PluginCIOptions>('Setup Grafana', setupPluginRunner);
/**
* 4. Test (end-to-end)
*
* run end-to-end tests against the running grafana instance
*
*/
const testPluginRunner: TaskRunner<PluginCIOptions> = async ({ platform }) => {
const start = Date.now();
const workDir = getWorkFolder();
const args = {
withCredentials: true,
baseURL: process.env.GRAFANA_URL || 'http://localhost:3000/',
responseType: 'json',
auth: {
username: 'admin',
password: 'admin',
},
};
const axios = require('axios');
const frontendSettings = await axios.get('api/frontend/settings', args);
console.log('Grafana Version: ' + JSON.stringify(frontendSettings.data.buildInfo, null, 2));
const pluginInfo = getPluginJson(`${process.cwd()}/src/plugin.json`);
const pluginSettings = await axios.get(`api/plugins/${pluginInfo.id}/settings`, args);
console.log('Plugin Info: ' + JSON.stringify(pluginSettings.data, null, 2));
console.log('TODO puppeteer');
const elapsed = Date.now() - start;
const stats = {
job: `${process.env.CIRCLE_JOB}`,
sha1: `${process.env.CIRCLE_SHA1}`,
startTime: start,
buildTime: elapsed,
jsSize: calcJavascriptSize(distDir),
zipSize: fs.statSync(zipFile).size,
endTime: Date.now(),
};
console.log('TODO Puppeteer Tests', stats);
writeWorkStats(start, workDir);
};
export const ciTestPluginTask = new Task<PluginCIOptions>('Test Plugin (e2e)', testPluginRunner);
/**
* 5. Deploy
*
* deploy the zip to a running grafana instance
*
*/
const deployPluginRunner: TaskRunner<PluginCIOptions> = async () => {
const start = Date.now();
// TASK Time
if (process.env.CIRCLE_INTERNAL_TASK_DATA) {
const timingInfo = fs.readdirSync(`${process.env.CIRCLE_INTERNAL_TASK_DATA}`);
if (timingInfo) {
timingInfo.forEach(file => {
console.log('TIMING INFO: ', file);
});
}
}
const elapsed = Date.now() - start;
const stats = {
job: `${process.env.CIRCLE_JOB}`,
sha1: `${process.env.CIRCLE_SHA1}`,
startTime: start,
buildTime: elapsed,
endTime: Date.now(),
};
console.log('TODO DEPLOY??', stats);
console.log(' if PR => write a comment to github with difference ');
console.log(' if master | vXYZ ==> upload artifacts to some repo ');
if (!dryRun) {
console.log('TODO send info to github?');
}
};
export const ciDeployPluginTask = new Task<PluginCIOptions>('Deploy plugin', deployPluginRunner);
export const pluginCITask = new Task<PluginCIOptions>('Plugin CI', pluginCIRunner);
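Each CI stage above records timing through writeWorkStats; the per-job stats file it produces can be sketched as follows (a sketch only — the field names mirror the code above, while the synchronous write and the `'local'` fallback are simplifications of my own):

```typescript
import * as fs from 'fs';
import * as os from 'os';

// Shape of the per-job stats.json written into the work folder
// (fields mirror writeWorkStats above; values are illustrative).
interface JobStats {
  job: string;       // CIRCLE_JOB name, or a local placeholder
  startTime: number; // epoch millis when the task started
  buildTime: number; // elapsed millis
  endTime: number;   // epoch millis when the stats were written
}

export const writeStats = (startTime: number, workDir: string): JobStats => {
  const stats: JobStats = {
    job: process.env.CIRCLE_JOB || 'local',
    startTime,
    buildTime: Date.now() - startTime,
    endTime: Date.now(),
  };
  // A synchronous write sidesteps the fire-and-forget callback used above
  fs.writeFileSync(`${workDir}/stats.json`, JSON.stringify(stats, null, 2));
  return stats;
};
```

A task runner would call this at the end of each stage; a later stage (such as deploy) can then aggregate the per-stage files from the work folder.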

View File

@@ -1,3 +1,5 @@
import path = require('path');
import fs = require('fs');
import webpack = require('webpack');
import { getWebpackConfig } from '../../../config/webpack.plugin.config';
import formatWebpackMessages = require('react-dev-utils/formatWebpackMessages');

View File

@@ -1,3 +1,4 @@
import path = require('path');
import * as jestCLI from 'jest-cli';
import { useSpinner } from '../../utils/useSpinner';
import { jestConfig } from '../../../config/jest.plugin.config';

View File

@@ -46,6 +46,7 @@ export async function getTeam(team: any): Promise<any> {
}
export async function addToTeam(team: any, user: any): Promise<any> {
const members = await client.get(`/teams/${team.id}/members`);
console.log(`Adding user ${user.name} to team ${team.name}`);
await client.post(`/teams/${team.id}/members`, { userId: user.id });
}

View File

@@ -1,5 +1,6 @@
import execa = require('execa');
import * as fs from 'fs';
import { changeCwdToGrafanaUi, restoreCwd, changeCwdToGrafanaToolkit } from '../utils/cwd';
import chalk from 'chalk';
import { useSpinner } from '../utils/useSpinner';
import { Task, TaskRunner } from './task';

View File

@@ -3,7 +3,7 @@ import { getPluginJson, validatePluginJson } from './pluginValidation';
describe('pluginValidation', () => {
describe('plugin.json', () => {
test('missing plugin.json file', () => {
expect(() => getPluginJson(`${__dirname}/mocks/missing-plugin.json`)).toThrowError();
expect(() => getPluginJson(`${__dirname}/mocks/missing-plugin-json`)).toThrow('plugin.json file is missing!');
});
});

View File

@@ -1,3 +1,5 @@
import path = require('path');
// See: packages/grafana-ui/src/types/plugin.ts
interface PluginJSONSchema {
id: string;
@@ -20,24 +22,15 @@ export const validatePluginJson = (pluginJson: any) => {
if (!pluginJson.info.version) {
throw new Error('Plugin info.version is missing in plugin.json');
}
const types = ['panel', 'datasource', 'app'];
const type = pluginJson.type;
if (!types.includes(type)) {
throw new Error('Invalid plugin type in plugin.json: ' + type);
}
if (!pluginJson.id.endsWith('-' + type)) {
throw new Error('[plugin.json] id should end with: -' + type);
}
};
export const getPluginJson = (path: string): PluginJSONSchema => {
export const getPluginJson = (root: string = process.cwd()): PluginJSONSchema => {
let pluginJson;
try {
pluginJson = require(path);
pluginJson = require(path.resolve(root, 'src/plugin.json'));
} catch (e) {
throw new Error('Unable to find: ' + path);
throw new Error('plugin.json file is missing!');
}
validatePluginJson(pluginJson);
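The rules enforced above — the type must be one of panel, datasource, or app, and the plugin id must end with `-<type>` — can be restated as a small predicate (a sketch for illustration, not the toolkit's exported API):

```typescript
// Mirror of the validatePluginJson checks above (illustrative only).
const PLUGIN_TYPES = ['panel', 'datasource', 'app'];

export const isValidPluginMeta = (meta: { id: string; type: string }): boolean =>
  PLUGIN_TYPES.includes(meta.type) && meta.id.endsWith('-' + meta.type);
```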

View File

@@ -5,9 +5,9 @@ const ReplaceInFileWebpackPlugin = require('replace-in-file-webpack-plugin');
const TerserPlugin = require('terser-webpack-plugin');
const MiniCssExtractPlugin = require('mini-css-extract-plugin');
const OptimizeCssAssetsPlugin = require('optimize-css-assets-webpack-plugin');
const ngAnnotatePlugin = require('ng-annotate-webpack-plugin');
import * as webpack from 'webpack';
import { getStyleLoaders, getStylesheetEntries, getFileLoaders } from './webpack/loaders';
import { hasThemeStylesheets, getStyleLoaders, getStylesheetEntries, getFileLoaders } from './webpack/loaders';
interface WebpackConfigurationOptions {
watch?: boolean;
@@ -51,7 +51,6 @@ const getManualChunk = (id: string) => {
};
}
}
return null;
};
const getEntries = () => {
@@ -84,8 +83,8 @@ const getCommonPlugins = (options: WebpackConfigurationOptions) => {
{ from: '../LICENSE', to: '.' },
{ from: 'img/*', to: '.' },
{ from: '**/*.json', to: '.' },
{ from: '**/*.svg', to: '.' },
{ from: '**/*.png', to: '.' },
// { from: '**/*.svg', to: '.' },
// { from: '**/*.png', to: '.' },
{ from: '**/*.html', to: '.' },
],
{ logLevel: options.watch ? 'silent' : 'warn' }
@@ -115,6 +114,7 @@ export const getWebpackConfig: WebpackConfigurationGetter = options => {
const optimization: { [key: string]: any } = {};
if (options.production) {
plugins.push(new ngAnnotatePlugin());
optimization.minimizer = [new TerserPlugin(), new OptimizeCssAssetsPlugin()];
}
@@ -177,12 +177,8 @@ export const getWebpackConfig: WebpackConfigurationGetter = options => {
loaders: [
{
loader: 'babel-loader',
options: {
presets: ['@babel/preset-env'],
plugins: ['angularjs-annotate'],
},
options: { presets: ['@babel/preset-env'] },
},
'ts-loader',
],
exclude: /(node_modules)/,

View File

@@ -3,6 +3,7 @@ import { getStylesheetEntries, hasThemeStylesheets } from './loaders';
describe('Loaders', () => {
describe('stylesheet helpers', () => {
const logSpy = jest.spyOn(console, 'log').mockImplementation();
const errorSpy = jest.spyOn(console, 'error').mockImplementation();
afterAll(() => {
logSpy.mockRestore();

View File

@@ -1,3 +1,6 @@
import { getPluginJson } from '../utils/pluginValidation';
const path = require('path');
const fs = require('fs');
const MiniCssExtractPlugin = require('mini-css-extract-plugin');
@@ -119,8 +122,8 @@ export const getFileLoaders = () => {
? {
loader: 'file-loader',
options: {
outputPath: '/',
name: '[path][name].[ext]',
outputPath: 'static',
name: '[name].[hash:8].[ext]',
},
}
: // When using single css import images are inlined as base64 URIs in the result bundle

View File

@@ -1,13 +1,17 @@
{
"extends": "../tsconfig.json",
"include": ["src/**/*.ts"],
"exclude": ["dist", "node_modules"],
"compilerOptions": {
"module": "commonjs",
"rootDirs": ["."],
"outDir": "dist/src",
"declaration": false,
"strict": true,
"alwaysStrict": true,
"noImplicitAny": true,
"strictNullChecks": true,
"typeRoots": ["./node_modules/@types"],
"skipLibCheck": true, // Temp workaround for Duplicate identifier tsc errors,
"removeComments": false,
"esModuleInterop": true,
"lib": ["es2015", "es2017.string"]
}

View File

@@ -15,3 +15,37 @@ See [package source](https://github.com/grafana/grafana/tree/master/packages/gra
## Development
For development purposes we suggest using `yarn link`, which creates a symlink to the @grafana/ui lib. To do so, navigate to `packages/grafana-ui` and run `yarn link`. Then, navigate to your project and run `yarn link @grafana/ui` to use the linked version of the lib. To unlink, follow the same procedure, but use `yarn unlink` instead.
## Building @grafana/ui
To build @grafana/ui, run the `npm run gui:build` script _from the Grafana repository root_. The build is created in the `packages/grafana-ui/dist` directory. Following the steps from [Development](#development), you can test the built package.
## Releasing new version
To release a new version, run the `npm run gui:release` script _from the Grafana repository root_. This has to be done on the master branch. The script prepares the distribution package and prompts you to bump the library version and publish it to the NPM registry. When the new package is published, create a PR with the bumped version in package.json.
### Automatic version bump
When running `npm run gui:release`, the package.json file is automatically updated, committed, and pushed to the upstream branch.
### Manual version bump
Manually update the version in `package.json` and then run `npm run gui:release --usePackageJsonVersion` _from the Grafana repository root_.
### Preparing release package without publishing to NPM registry
For testing purposes, the `npm run gui:releasePrepare` task prepares the distribution package without publishing it to the NPM registry.
### V1 release process overview
1. Package is compiled with TSC. Typings are created in `/dist` directory, and the compiled js lands in `/compiled` dir
2. Rollup creates a CommonJS package based on compiled sources, and outputs it to `/dist` directory
3. Readme, changelog and index.js files are moved to `/dist` directory
4. Package version is bumped in both `@grafana/ui` package dir and in dist directory.
5. Version commit is created and pushed to master branch
6. Package is published to npm
## Versioning
To limit confusion around @grafana/ui and Grafana versioning, we decided to keep the major versions of the two in sync.
This means that the first version of @grafana/ui was tagged 6.0.0-alpha.0, in sync with the Grafana 6.0 release.
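The major-version sync policy above can be expressed as a small check (a sketch; the version strings are illustrative, and no such check script exists in the repo):

```typescript
// Keep the @grafana/ui major version in lockstep with Grafana core.
const semverMajor = (version: string): number => parseInt(version.split('.')[0], 10);

export const majorsInSync = (grafanaVersion: string, uiVersion: string): boolean =>
  semverMajor(grafanaVersion) === semverMajor(uiVersion);
```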

View File

@@ -1,9 +1,9 @@
{
"name": "@grafana/ui",
"version": "6.3.0-beta.1",
"version": "6.3.0-alpha.33",
"description": "Grafana Components Library",
"keywords": [
"grafana",
"typescript",
"react",
"react-component"
],
@@ -15,8 +15,7 @@
"storybook:build": "build-storybook -o ./dist/storybook -c .storybook",
"clean": "rimraf ./dist ./compiled",
"bundle": "rollup -c rollup.config.ts",
"build": "grafana-toolkit package:build --scope=ui",
"postpublish": "npm run clean"
"build": "grafana-toolkit package:build --scope=ui"
},
"author": "Grafana Labs",
"license": "Apache-2.0",

View File

@@ -1,11 +1,19 @@
{
"extends": "../tsconfig.json",
"extends": "../../tsconfig.json",
"include": ["src/**/*.ts", "src/**/*.tsx"],
"exclude": ["dist", "node_modules"],
"compilerOptions": {
"rootDirs": [".", "stories"],
"typeRoots": ["./node_modules/@types", "types"],
"module": "esnext",
"outDir": "compiled",
"declaration": true,
"declarationDir": "dist",
"outDir": "compiled"
"strict": true,
"alwaysStrict": true,
"noImplicitAny": true,
"strictNullChecks": true,
"typeRoots": ["./node_modules/@types", "types"],
"skipLibCheck": true, // Temp workaround for Duplicate identifier tsc errors,
"removeComments": false
}
}

View File

@@ -1,13 +0,0 @@
{
"extends": "../tsconfig.json",
"compilerOptions": {
"module": "esnext",
"declaration": true,
"strict": true,
"alwaysStrict": true,
"noImplicitAny": true,
"strictNullChecks": true,
"skipLibCheck": true, // Temp workaround for Duplicate identifier tsc errors,
"removeComments": false
}
}

View File

@@ -93,26 +93,6 @@ func (sc *scenarioContext) fakeReqWithParams(method, url string, queryParams map
return sc
}
func (sc *scenarioContext) fakeReqNoAssertions(method, url string) *scenarioContext {
sc.resp = httptest.NewRecorder()
req, _ := http.NewRequest(method, url, nil)
sc.req = req
return sc
}
func (sc *scenarioContext) fakeReqNoAssertionsWithCookie(method, url string, cookie http.Cookie) *scenarioContext {
sc.resp = httptest.NewRecorder()
http.SetCookie(sc.resp, &cookie)
req, _ := http.NewRequest(method, url, nil)
req.Header = http.Header{"Cookie": sc.resp.Header()["Set-Cookie"]}
sc.req = req
return sc
}
type scenarioContext struct {
m *macaron.Macaron
context *m.ReqContext

View File

@@ -21,14 +21,8 @@ const (
LoginErrorCookieName = "login_error"
)
var setIndexViewData = (*HTTPServer).setIndexViewData
var getViewIndex = func() string {
return ViewIndex
}
func (hs *HTTPServer) LoginView(c *models.ReqContext) {
viewData, err := setIndexViewData(hs, c)
viewData, err := hs.setIndexViewData(c)
if err != nil {
c.Handle(500, "Failed to get settings", err)
return
@@ -47,14 +41,8 @@ func (hs *HTTPServer) LoginView(c *models.ReqContext) {
viewData.Settings["samlEnabled"] = hs.Cfg.SAMLEnabled
if loginError, ok := tryGetEncryptedCookie(c, LoginErrorCookieName); ok {
//this cookie is only set whenever an OAuth login fails
//therefore the loginError should be passed to the view data
//and the view should return immediately before attempting
//to login again via OAuth and enter to a redirect loop
deleteCookie(c, LoginErrorCookieName)
viewData.Settings["loginError"] = loginError
c.HTML(200, getViewIndex(), viewData)
return
}
if tryOAuthAutoLogin(c) {

View File

@@ -1,135 +0,0 @@
package api
import (
"encoding/hex"
"errors"
"github.com/grafana/grafana/pkg/api/dtos"
"github.com/grafana/grafana/pkg/models"
"github.com/grafana/grafana/pkg/setting"
"github.com/grafana/grafana/pkg/util"
"github.com/stretchr/testify/assert"
"io/ioutil"
"net/http"
"net/http/httptest"
"strings"
"testing"
)
func mockSetIndexViewData() {
setIndexViewData = func(*HTTPServer, *models.ReqContext) (*dtos.IndexViewData, error) {
data := &dtos.IndexViewData{
User: &dtos.CurrentUser{},
Settings: map[string]interface{}{},
NavTree: []*dtos.NavLink{},
}
return data, nil
}
}
func resetSetIndexViewData() {
setIndexViewData = (*HTTPServer).setIndexViewData
}
func mockViewIndex() {
getViewIndex = func() string {
return "index-template"
}
}
func resetViewIndex() {
getViewIndex = func() string {
return ViewIndex
}
}
func getBody(resp *httptest.ResponseRecorder) (string, error) {
responseData, err := ioutil.ReadAll(resp.Body)
if err != nil {
return "", err
}
return string(responseData), nil
}
func TestLoginErrorCookieApiEndpoint(t *testing.T) {
mockSetIndexViewData()
defer resetSetIndexViewData()
mockViewIndex()
defer resetViewIndex()
sc := setupScenarioContext("/login")
hs := &HTTPServer{
Cfg: setting.NewCfg(),
}
sc.defaultHandler = Wrap(func(w http.ResponseWriter, c *models.ReqContext) {
hs.LoginView(c)
})
setting.OAuthService = &setting.OAuther{}
setting.OAuthService.OAuthInfos = make(map[string]*setting.OAuthInfo)
setting.LoginCookieName = "grafana_session"
setting.SecretKey = "login_testing"
setting.OAuthService = &setting.OAuther{}
setting.OAuthService.OAuthInfos = make(map[string]*setting.OAuthInfo)
setting.OAuthService.OAuthInfos["github"] = &setting.OAuthInfo{
ClientId: "fake",
ClientSecret: "fakefake",
Enabled: true,
AllowSignup: true,
Name: "github",
}
setting.OAuthAutoLogin = true
oauthError := errors.New("User not a member of one of the required organizations")
encryptedError, _ := util.Encrypt([]byte(oauthError.Error()), setting.SecretKey)
cookie := http.Cookie{
Name: LoginErrorCookieName,
MaxAge: 60,
Value: hex.EncodeToString(encryptedError),
HttpOnly: true,
Path: setting.AppSubUrl + "/",
Secure: hs.Cfg.CookieSecure,
SameSite: hs.Cfg.CookieSameSite,
}
sc.m.Get(sc.url, sc.defaultHandler)
sc.fakeReqNoAssertionsWithCookie("GET", sc.url, cookie).exec()
assert.Equal(t, sc.resp.Code, 200)
responseString, err := getBody(sc.resp)
assert.Nil(t, err)
assert.True(t, strings.Contains(responseString, oauthError.Error()))
}
func TestLoginOAuthRedirect(t *testing.T) {
mockSetIndexViewData()
defer resetSetIndexViewData()
sc := setupScenarioContext("/login")
hs := &HTTPServer{
Cfg: setting.NewCfg(),
}
sc.defaultHandler = Wrap(func(c *models.ReqContext) {
hs.LoginView(c)
})
setting.OAuthService = &setting.OAuther{}
setting.OAuthService.OAuthInfos = make(map[string]*setting.OAuthInfo)
setting.OAuthService.OAuthInfos["github"] = &setting.OAuthInfo{
ClientId: "fake",
ClientSecret: "fakefake",
Enabled: true,
AllowSignup: true,
Name: "github",
}
setting.OAuthAutoLogin = true
sc.m.Get(sc.url, sc.defaultHandler)
sc.fakeReqNoAssertions("GET", sc.url).exec()
assert.Equal(t, sc.resp.Code, 307)
location, ok := sc.resp.Header()["Location"]
assert.True(t, ok)
assert.Equal(t, location[0], "/login/github")
}

View File

@@ -30,6 +30,23 @@ func GetTeamMembers(c *m.ReqContext) Response {
return JSON(200, query.Result)
}
func GetAuthProviderLabel(authModule string) string {
switch authModule {
case "oauth_github":
return "GitHub"
case "oauth_google":
return "Google"
case "oauth_gitlab":
return "GitLab"
case "oauth_grafana_com", "oauth_grafananet":
return "grafana.com"
case "ldap", "":
return "LDAP"
default:
return "OAuth"
}
}
// POST /api/teams/:teamId/members
func (hs *HTTPServer) AddTeamMember(c *m.ReqContext, cmd m.AddTeamMemberCommand) Response {
cmd.OrgId = c.OrgId
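For reference, the GetAuthProviderLabel mapping above can be mirrored in TypeScript for frontend display (a sketch, not an existing Grafana API):

```typescript
// Mirrors the Go GetAuthProviderLabel switch above.
export const authProviderLabel = (authModule: string): string => {
  switch (authModule) {
    case 'oauth_github':
      return 'GitHub';
    case 'oauth_google':
      return 'Google';
    case 'oauth_gitlab':
      return 'GitLab';
    case 'oauth_grafana_com':
    case 'oauth_grafananet':
      return 'grafana.com';
    case 'ldap':
    case '': // an empty auth module is treated as LDAP, as in the Go code
      return 'LDAP';
    default:
      return 'OAuth';
  }
};
```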

View File

@@ -29,11 +29,8 @@ func getUserUserProfile(userID int64) Response {
}
getAuthQuery := m.GetAuthInfoQuery{UserId: userID}
query.Result.AuthLabels = []string{}
if err := bus.Dispatch(&getAuthQuery); err == nil {
authLabel := GetAuthProviderLabel(getAuthQuery.Result.AuthModule)
query.Result.AuthLabels = append(query.Result.AuthLabels, authLabel)
query.Result.IsExternal = true
query.Result.AuthModule = []string{getAuthQuery.Result.AuthModule}
}
return JSON(200, query.Result)
@@ -280,12 +277,6 @@ func searchUser(c *m.ReqContext) (*m.SearchUsersQuery, error) {
for _, user := range query.Result.Users {
user.AvatarUrl = dtos.GetGravatarUrl(user.Email)
user.AuthLabels = make([]string, 0)
if user.AuthModule != nil && len(user.AuthModule) > 0 {
for _, authModule := range user.AuthModule {
user.AuthLabels = append(user.AuthLabels, GetAuthProviderLabel(authModule))
}
}
}
query.Result.Page = page
@@ -324,20 +315,3 @@ func ClearHelpFlags(c *m.ReqContext) Response {
return JSON(200, &util.DynMap{"message": "Help flag set", "helpFlags1": cmd.HelpFlags1})
}
func GetAuthProviderLabel(authModule string) string {
switch authModule {
case "oauth_github":
return "GitHub"
case "oauth_google":
return "Google"
case "oauth_gitlab":
return "GitLab"
case "oauth_grafana_com", "oauth_grafananet":
return "grafana.com"
case "ldap", "":
return "LDAP"
default:
return "OAuth"
}
}

View File

@@ -62,7 +62,7 @@ func EncryptDatasourcePaswords(c utils.CommandLine, sqlStore *sqlstore.SqlStore)
}
func migrateColumn(session *sqlstore.DBSession, column string) (int, error) {
var rows []map[string][]byte
var rows []map[string]string
session.Cols("id", column, "secure_json_data")
session.Table("data_source")
@@ -78,7 +78,7 @@ func migrateColumn(session *sqlstore.DBSession, column string) (int, error) {
return rowsUpdated, errutil.Wrapf(err, "failed to update column: %s", column)
}
func updateRows(session *sqlstore.DBSession, rows []map[string][]byte, passwordFieldName string) (int, error) {
func updateRows(session *sqlstore.DBSession, rows []map[string]string, passwordFieldName string) (int, error) {
var rowsUpdated int
for _, row := range rows {
@@ -94,7 +94,7 @@ func updateRows(session *sqlstore.DBSession, rows []map[string][]byte, passwordF
newRow := map[string]interface{}{"secure_json_data": data, passwordFieldName: ""}
session.Table("data_source")
session.Where("id = ?", string(row["id"]))
session.Where("id = ?", row["id"])
// Setting both columns while having value only for secure_json_data should clear the [passwordFieldName] column
session.Cols("secure_json_data", passwordFieldName)
@@ -108,20 +108,16 @@ func updateRows(session *sqlstore.DBSession, rows []map[string][]byte, passwordF
return rowsUpdated, nil
}
func getUpdatedSecureJSONData(row map[string][]byte, passwordFieldName string) (map[string]interface{}, error) {
encryptedPassword, err := util.Encrypt(row[passwordFieldName], setting.SecretKey)
func getUpdatedSecureJSONData(row map[string]string, passwordFieldName string) (map[string]interface{}, error) {
encryptedPassword, err := util.Encrypt([]byte(row[passwordFieldName]), setting.SecretKey)
if err != nil {
return nil, err
}
var secureJSONData map[string]interface{}
if len(row["secure_json_data"]) > 0 {
if err := json.Unmarshal(row["secure_json_data"], &secureJSONData); err != nil {
return nil, err
}
} else {
secureJSONData = map[string]interface{}{}
if err := json.Unmarshal([]byte(row["secure_json_data"]), &secureJSONData); err != nil {
return nil, err
}
jsonFieldName := util.ToCamelCase(passwordFieldName)

View File

@@ -20,30 +20,19 @@ func TestPasswordMigrationCommand(t *testing.T) {
datasources := []*models.DataSource{
{Type: "influxdb", Name: "influxdb", Password: "foobar"},
{Type: "graphite", Name: "graphite", BasicAuthPassword: "foobar"},
{Type: "prometheus", Name: "prometheus"},
{Type: "elasticsearch", Name: "elasticsearch", Password: "pwd"},
{Type: "prometheus", Name: "prometheus", SecureJsonData: securejsondata.GetEncryptedJsonData(map[string]string{})},
}
// set required default values
for _, ds := range datasources {
ds.Created = time.Now()
ds.Updated = time.Now()
if ds.Name == "elasticsearch" {
ds.SecureJsonData = securejsondata.GetEncryptedJsonData(map[string]string{
"key": "value",
})
} else {
ds.SecureJsonData = securejsondata.GetEncryptedJsonData(map[string]string{})
}
ds.SecureJsonData = securejsondata.GetEncryptedJsonData(map[string]string{})
}
_, err := session.Insert(&datasources)
assert.Nil(t, err)
// force secure_json_data to be null to verify that migration can handle that
_, err = session.Exec("update data_source set secure_json_data = null where name = 'influxdb'")
assert.Nil(t, err)
//run migration
err = EncryptDatasourcePaswords(&commandstest.FakeCommandLine{}, sqlstore)
assert.Nil(t, err)
@@ -52,7 +41,7 @@ func TestPasswordMigrationCommand(t *testing.T) {
var dss []*models.DataSource
err = session.SQL("select * from data_source").Find(&dss)
assert.Nil(t, err)
assert.Equal(t, len(dss), 4)
assert.Equal(t, len(dss), 3)
for _, ds := range dss {
sj := ds.SecureJsonData.Decrypt()
@@ -74,15 +63,5 @@ func TestPasswordMigrationCommand(t *testing.T) {
if ds.Name == "prometheus" {
assert.Equal(t, len(sj), 0)
}
if ds.Name == "elasticsearch" {
assert.Equal(t, ds.Password, "")
key, exist := sj["key"]
assert.True(t, exist)
password, exist := sj["password"]
assert.True(t, exist)
assert.Equal(t, password, "pwd", "expected password to be moved to securejson")
assert.Equal(t, key, "value", "expected existing key to be kept intact in securejson")
}
}
}

View File

@@ -85,7 +85,7 @@ func InstallPlugin(pluginName, version string, c utils.CommandLine) error {
}
logger.Infof("installing %v @ %v\n", pluginName, version)
logger.Infof("from: %v\n", downloadURL)
logger.Infof("from url: %v\n", downloadURL)
logger.Infof("into: %v\n", pluginFolder)
logger.Info("\n")
@@ -145,27 +145,18 @@ func downloadFile(pluginName, filePath, url string) (err error) {
}
}()
var bytes []byte
resp, err := http.Get(url) // #nosec
if err != nil {
return err
}
defer resp.Body.Close()
if _, err := os.Stat(url); err == nil {
bytes, err = ioutil.ReadFile(url)
if err != nil {
return err
}
} else {
resp, err := http.Get(url) // #nosec
if err != nil {
return err
}
defer resp.Body.Close()
bytes, err = ioutil.ReadAll(resp.Body)
if err != nil {
return err
}
body, err := ioutil.ReadAll(resp.Body)
if err != nil {
return err
}
return extractFiles(bytes, pluginName, filePath)
return extractFiles(body, pluginName, filePath)
}
func extractFiles(body []byte, pluginName string, filePath string) error {

View File

@@ -182,10 +182,6 @@ func initContextWithBasicAuth(ctx *models.ReqContext, orgId int64) bool {
}
func initContextWithToken(authTokenService models.UserTokenService, ctx *models.ReqContext, orgID int64) bool {
if setting.LoginCookieName == "" {
return false
}
rawToken := ctx.GetCookie(setting.LoginCookieName)
if rawToken == "" {
return false

View File

@@ -216,8 +216,7 @@ type UserProfileDTO struct {
OrgId int64 `json:"orgId"`
IsGrafanaAdmin bool `json:"isGrafanaAdmin"`
IsDisabled bool `json:"isDisabled"`
IsExternal bool `json:"isExternal"`
AuthLabels []string `json:"authLabels"`
AuthModule []string `json:"authModule"`
}
type UserSearchHitDTO struct {
@@ -230,8 +229,7 @@ type UserSearchHitDTO struct {
IsDisabled bool `json:"isDisabled"`
LastSeenAt time.Time `json:"lastSeenAt"`
LastSeenAtAge string `json:"lastSeenAtAge"`
AuthLabels []string `json:"authLabels"`
AuthModule AuthModuleConversion `json:"-"`
AuthModule AuthModuleConversion `json:"authModule"`
}
type UserIdDTO struct {

View File

@@ -31,8 +31,7 @@ type IConnection interface {
type IServer interface {
Login(*models.LoginUserQuery) (*models.ExternalUserInfo, error)
Users([]string) ([]*models.ExternalUserInfo, error)
Bind() error
UserBind(string, string) error
Auth(string, string) error
Dial() error
Close()
}
@@ -44,23 +43,6 @@ type Server struct {
log log.Logger
}
// Bind authenticates the connection with the LDAP server
// - with the username and password setup in the config
// - or, anonymously
func (server *Server) Bind() error {
if server.shouldAuthAdmin() {
if err := server.AuthAdmin(); err != nil {
return err
}
} else {
err := server.Connection.UnauthenticatedBind(server.Config.BindDN)
if err != nil {
return err
}
}
return nil
}
// UsersMaxRequest is the maximum number of users we can request via Users(),
// since many LDAP servers have limitations
// on how many items can be returned in one request
@@ -158,19 +140,15 @@ func (server *Server) Login(query *models.LoginUserQuery) (
*models.ExternalUserInfo, error,
) {
var err error
var authAndBind bool
// Check if we can use a search user
// Do we need to authenticate the "admin" user first?
// The admin user should have access to perform the user search in the LDAP server
if server.shouldAuthAdmin() {
if err := server.AuthAdmin(); err != nil {
return nil, err
}
} else if server.shouldSingleBind() {
authAndBind = true
err = server.UserBind(server.singleBindDN(query.Username), query.Password)
if err != nil {
return nil, err
}
// Otherwise, if anyone can perform the search in LDAP, bind anonymously
} else {
err := server.Connection.UnauthenticatedBind(server.Config.BindDN)
if err != nil {
@@ -195,25 +173,15 @@ func (server *Server) Login(query *models.LoginUserQuery) (
return nil, err
}
if !authAndBind {
// Authenticate user
err = server.UserBind(user.AuthId, query.Password)
if err != nil {
return nil, err
}
// Authenticate user
err = server.Auth(user.AuthId, query.Password)
if err != nil {
return nil, err
}
return user, nil
}
func (server *Server) singleBindDN(username string) string {
return fmt.Sprintf(server.Config.BindDN, username)
}
func (server *Server) shouldSingleBind() bool {
return strings.Contains(server.Config.BindDN, "%s")
}
// getUsersIteration is a helper function for Users() method.
// It divides the users by equal parts for the anticipated requests
func getUsersIteration(logins []string, fn func(int, int) error) error {
@@ -398,9 +366,9 @@ func (server *Server) shouldAuthAdmin() bool {
return server.Config.BindPassword != ""
}
// UserBind authenticates the connection with the LDAP server
func (server *Server) UserBind(username, password string) error {
err := server.userBind(username, password)
// Auth authentificates user in LDAP
func (server *Server) Auth(username, password string) error {
err := server.auth(username, password)
if err != nil {
server.log.Error(
fmt.Sprintf("Cannot authentificate user %s in LDAP", username),
@@ -415,7 +383,7 @@ func (server *Server) UserBind(username, password string) error {
// AuthAdmin authentificates LDAP admin user
func (server *Server) AuthAdmin() error {
err := server.userBind(server.Config.BindDN, server.Config.BindPassword)
err := server.auth(server.Config.BindDN, server.Config.BindPassword)
if err != nil {
server.log.Error(
"Cannot authentificate admin user in LDAP",
@@ -428,8 +396,8 @@ func (server *Server) AuthAdmin() error {
return nil
}
// userBind authenticates the connection with the LDAP server
func (server *Server) userBind(path, password string) error {
// auth is helper for several types of LDAP authentification
func (server *Server) auth(path, password string) error {
err := server.Connection.Bind(path, password)
if err != nil {
if ldapErr, ok := err.(*ldap.Error); ok {


@@ -19,7 +19,7 @@ func TestLDAPLogin(t *testing.T) {
}
Convey("Login()", t, func() {
Convey("Should get invalid credentials when userBind fails", func() {
Convey("Should get invalid credentials when auth fails", func() {
connection := &MockConnection{}
entry := ldap.Entry{}
result := ldap.SearchResult{Entries: []*ldap.Entry{&entry}}
@@ -198,37 +198,5 @@ func TestLDAPLogin(t *testing.T) {
So(username, ShouldEqual, "test")
So(password, ShouldEqual, "pwd")
})
Convey("Should bind with user if %s exists in the bind_dn", func() {
connection := &MockConnection{}
entry := ldap.Entry{
DN: "test",
}
connection.setSearchResult(&ldap.SearchResult{Entries: []*ldap.Entry{&entry}})
authBindUser := ""
authBindPassword := ""
connection.BindProvider = func(name, pass string) error {
authBindUser = name
authBindPassword = pass
return nil
}
server := &Server{
Config: &ServerConfig{
BindDN: "cn=%s,ou=users,dc=grafana,dc=org",
SearchBaseDNs: []string{"BaseDNHere"},
},
Connection: connection,
log: log.New("test-logger"),
}
_, err := server.Login(defaultLogin)
So(err, ShouldBeNil)
So(authBindUser, ShouldEqual, "cn=user,ou=users,dc=grafana,dc=org")
So(authBindPassword, ShouldEqual, "pwd")
So(connection.BindCalled, ShouldBeTrue)
})
})
}


@@ -145,7 +145,7 @@ func TestLDAPPrivateMethods(t *testing.T) {
})
Convey("shouldAuthAdmin()", t, func() {
Convey("it should require admin userBind", func() {
Convey("it should require admin auth", func() {
server := &Server{
Config: &ServerConfig{
BindPassword: "test",
@@ -156,7 +156,7 @@ func TestLDAPPrivateMethods(t *testing.T) {
So(result, ShouldBeTrue)
})
Convey("it should not require admin userBind", func() {
Convey("it should not require admin auth", func() {
server := &Server{
Config: &ServerConfig{
BindPassword: "",


@@ -102,7 +102,7 @@ func TestPublicAPI(t *testing.T) {
})
})
Convey("UserBind()", t, func() {
Convey("Auth()", t, func() {
Convey("Should use provided DN and password", func() {
connection := &MockConnection{}
var actualUsername, actualPassword string
@@ -119,7 +119,7 @@ func TestPublicAPI(t *testing.T) {
}
dn := "cn=user,ou=users,dc=grafana,dc=org"
err := server.UserBind(dn, "pwd")
err := server.Auth(dn, "pwd")
So(err, ShouldBeNil)
So(actualUsername, ShouldEqual, dn)
@@ -141,7 +141,7 @@ func TestPublicAPI(t *testing.T) {
},
log: log.New("test-logger"),
}
err := server.UserBind("user", "pwd")
err := server.Auth("user", "pwd")
So(err, ShouldEqual, expected)
})
})


@@ -109,10 +109,6 @@ func (multiples *MultiLDAP) User(login string) (
defer server.Close()
if err := server.Bind(); err != nil {
return nil, err
}
users, err := server.Users(search)
if err != nil {
return nil, err
@@ -146,10 +142,6 @@ func (multiples *MultiLDAP) Users(logins []string) (
defer server.Close()
if err := server.Bind(); err != nil {
return nil, err
}
users, err := server.Users(logins)
if err != nil {
return nil, err


@@ -11,15 +11,12 @@ type MockLDAP struct {
loginCalledTimes int
closeCalledTimes int
usersCalledTimes int
bindCalledTimes int
dialErrReturn error
loginErrReturn error
loginReturn *models.ExternalUserInfo
bindErrReturn error
usersErrReturn error
usersFirstReturn []*models.ExternalUserInfo
usersRestReturn []*models.ExternalUserInfo
@@ -43,8 +40,8 @@ func (mock *MockLDAP) Users([]string) ([]*models.ExternalUserInfo, error) {
return mock.usersRestReturn, mock.usersErrReturn
}
// UserBind test fn
func (mock *MockLDAP) UserBind(string, string) error {
// Auth test fn
func (mock *MockLDAP) Auth(string, string) error {
return nil
}
@@ -59,11 +56,6 @@ func (mock *MockLDAP) Close() {
mock.closeCalledTimes = mock.closeCalledTimes + 1
}
func (mock *MockLDAP) Bind() error {
mock.bindCalledTimes++
return mock.bindErrReturn
}
// MockMultiLDAP represents testing struct for multildap testing
type MockMultiLDAP struct {
LoginCalledTimes int


@@ -11,14 +11,9 @@ import { TagBadge } from './TagBadge';
import { NoOptionsMessage, IndicatorsContainer, resetSelectStyles } from '@grafana/ui';
import { escapeStringForRegex } from '../FilterInput/FilterInput';
export interface TermCount {
term: string;
count: number;
}
export interface Props {
tags: string[];
tagOptions: () => Promise<TermCount[]>;
tagOptions: () => any;
onChange: (tags: string[]) => void;
}
@@ -30,7 +25,7 @@ export class TagFilter extends React.Component<Props, any> {
}
onLoadOptions = (query: string) => {
return this.props.tagOptions().then(options => {
return this.props.tagOptions().then((options: any[]) => {
return options.map(option => ({
value: option.term,
label: option.term,


@@ -28,7 +28,7 @@ describe('file_export', () => {
describe('when exporting series as rows', () => {
it('should export points in proper order', () => {
const text = fileExport.convertSeriesListToCsv(ctx.seriesList, { dateTimeFormat: ctx.timeFormat });
const text = fileExport.convertSeriesListToCsv(ctx.seriesList, ctx.timeFormat);
const expectedText =
'"Series";"Time";"Value"\r\n' +
'"series_1";"1500026100";1\r\n' +
@@ -48,7 +48,7 @@ describe('file_export', () => {
describe('when exporting series as columns', () => {
it('should export points in proper order', () => {
const text = fileExport.convertSeriesListToCsvColumns(ctx.seriesList, { dateTimeFormat: ctx.timeFormat });
const text = fileExport.convertSeriesListToCsvColumns(ctx.seriesList, ctx.timeFormat);
const expectedText =
'"Time";"series_1";"series_2"\r\n' +
'"1500026100";1;11\r\n' +
@@ -65,7 +65,7 @@ describe('file_export', () => {
const expectedSeries1DataPoints = ctx.seriesList[0].datapoints.slice();
const expectedSeries2DataPoints = ctx.seriesList[1].datapoints.slice();
fileExport.convertSeriesListToCsvColumns(ctx.seriesList, { dateTimeFormat: ctx.timeFormat });
fileExport.convertSeriesListToCsvColumns(ctx.seriesList, ctx.timeFormat);
expect(expectedSeries1DataPoints).toEqual(ctx.seriesList[0].datapoints);
expect(expectedSeries2DataPoints).toEqual(ctx.seriesList[1].datapoints);


@@ -1,7 +1,7 @@
import { isBoolean, isNumber, sortedUniq, sortedIndexOf, unescape as htmlUnescaped } from 'lodash';
import { saveAs } from 'file-saver';
import { isNullOrUndefined } from 'util';
import { dateTime, TimeZone } from '@grafana/data';
import { dateTime } from '@grafana/data';
const DEFAULT_DATETIME_FORMAT = 'YYYY-MM-DDTHH:mm:ssZ';
const POINT_TIME_INDEX = 1;
@@ -12,19 +12,7 @@ const END_ROW = '\r\n';
const QUOTE = '"';
const EXPORT_FILENAME = 'grafana_data_export.csv';
interface SeriesListToCsvColumnsOptions {
dateTimeFormat: string;
excel: boolean;
timezone: TimeZone;
}
const defaultOptions: SeriesListToCsvColumnsOptions = {
dateTimeFormat: DEFAULT_DATETIME_FORMAT,
excel: false,
timezone: '',
};
function csvEscaped(text: string) {
function csvEscaped(text) {
if (!text) {
return text;
}
@@ -37,13 +25,13 @@ function csvEscaped(text: string) {
}
const domParser = new DOMParser();
function htmlDecoded(text: string) {
function htmlDecoded(text) {
if (!text) {
return text;
}
const regexp = /&[^;]+;/g;
function htmlDecoded(value: string) {
function htmlDecoded(value) {
const parsedDom = domParser.parseFromString(value, 'text/html');
return parsedDom.body.textContent;
}
@@ -70,19 +58,14 @@ function formatRow(row, addEndRowDelimiter = true) {
return addEndRowDelimiter ? text + END_ROW : text;
}
export function convertSeriesListToCsv(seriesList, options: Partial<SeriesListToCsvColumnsOptions>) {
const { dateTimeFormat, excel, timezone } = { ...defaultOptions, ...options };
export function convertSeriesListToCsv(seriesList, dateTimeFormat = DEFAULT_DATETIME_FORMAT, excel = false) {
let text = formatSpecialHeader(excel) + formatRow(['Series', 'Time', 'Value']);
for (let seriesIndex = 0; seriesIndex < seriesList.length; seriesIndex += 1) {
for (let i = 0; i < seriesList[seriesIndex].datapoints.length; i += 1) {
text += formatRow(
[
seriesList[seriesIndex].alias,
timezone === 'utc'
? dateTime(seriesList[seriesIndex].datapoints[i][POINT_TIME_INDEX])
.utc()
.format(dateTimeFormat)
: dateTime(seriesList[seriesIndex].datapoints[i][POINT_TIME_INDEX]).format(dateTimeFormat),
dateTime(seriesList[seriesIndex].datapoints[i][POINT_TIME_INDEX]).format(dateTimeFormat),
seriesList[seriesIndex].datapoints[i][POINT_VALUE_INDEX],
],
i < seriesList[seriesIndex].datapoints.length - 1 || seriesIndex < seriesList.length - 1
@@ -92,13 +75,12 @@ export function convertSeriesListToCsv(seriesList, options: Partial<SeriesListTo
return text;
}
export function exportSeriesListToCsv(seriesList, options: Partial<SeriesListToCsvColumnsOptions>) {
const text = convertSeriesListToCsv(seriesList, options);
export function exportSeriesListToCsv(seriesList, dateTimeFormat = DEFAULT_DATETIME_FORMAT, excel = false) {
const text = convertSeriesListToCsv(seriesList, dateTimeFormat, excel);
saveSaveBlob(text, EXPORT_FILENAME);
}
export function convertSeriesListToCsvColumns(seriesList, options: Partial<SeriesListToCsvColumnsOptions>) {
const { dateTimeFormat, excel, timezone } = { ...defaultOptions, ...options };
export function convertSeriesListToCsvColumns(seriesList, dateTimeFormat = DEFAULT_DATETIME_FORMAT, excel = false) {
// add header
let text =
formatSpecialHeader(excel) +
@@ -114,13 +96,7 @@ export function convertSeriesListToCsvColumns(seriesList, options: Partial<Serie
// make text
for (let i = 0; i < extendedDatapointsList[0].length; i += 1) {
const timestamp =
timezone === 'utc'
? dateTime(extendedDatapointsList[0][i][POINT_TIME_INDEX])
.utc()
.format(dateTimeFormat)
: dateTime(extendedDatapointsList[0][i][POINT_TIME_INDEX]).format(dateTimeFormat);
const timestamp = dateTime(extendedDatapointsList[0][i][POINT_TIME_INDEX]).format(dateTimeFormat);
text += formatRow(
[timestamp].concat(
extendedDatapointsList.map(datapoints => {
@@ -167,8 +143,8 @@ function mergeSeriesByTime(seriesList) {
return result;
}
export function exportSeriesListToCsvColumns(seriesList, options: Partial<SeriesListToCsvColumnsOptions>) {
const text = convertSeriesListToCsvColumns(seriesList, options);
export function exportSeriesListToCsvColumns(seriesList, dateTimeFormat = DEFAULT_DATETIME_FORMAT, excel = false) {
const text = convertSeriesListToCsvColumns(seriesList, dateTimeFormat, excel);
saveSaveBlob(text, EXPORT_FILENAME);
}

View File

@@ -179,7 +179,7 @@ export default class AdminEditUserCtrl {
const user = $scope.user;
// External user can not be disabled
if (user.isExternal) {
if (user.authModule) {
event.preventDefault();
event.stopPropagation();
return;


@@ -1,6 +1,5 @@
import { BackendSrv } from 'app/core/services/backend_srv';
import { NavModelSrv } from 'app/core/core';
import tags from 'app/core/utils/tags';
export default class AdminListUsersCtrl {
users: any;
@@ -33,8 +32,6 @@ export default class AdminListUsersCtrl {
for (let i = 1; i < this.totalPages + 1; i++) {
this.pages.push({ page: i, current: i === this.page });
}
this.addUsersAuthLabels();
});
}
@@ -43,29 +40,10 @@ export default class AdminListUsersCtrl {
this.getUsers();
}
addUsersAuthLabels() {
for (const user of this.users) {
user.authLabel = getAuthLabel(user);
user.authLabelStyle = getAuthLabelStyle(user.authLabel);
getAuthModule(user: any) {
if (user.authModule && user.authModule.length) {
return user.authModule[0];
}
return undefined;
}
}
function getAuthLabel(user: any) {
if (user.authLabels && user.authLabels.length) {
return user.authLabels[0];
}
return '';
}
function getAuthLabelStyle(label: string) {
if (label === 'LDAP' || !label) {
return {};
}
const { color, borderColor } = tags.getTagColorsFromName(label);
return {
'background-color': color,
'border-color': borderColor,
};
}


@@ -118,52 +118,48 @@
<h3 class="page-heading">Sessions</h3>
<div class="gf-form-group">
<div class="gf-form">
<table class="filter-table form-inline">
<thead>
<tr>
<th>Last seen</th>
<th>Logged on</th>
<th>IP address</th>
<th>Browser &amp; OS</th>
<th></th>
</tr>
</thead>
<tbody>
<tr ng-repeat="session in sessions">
<td ng-if="session.isActive">Now</td>
<td ng-if="!session.isActive">{{session.seenAt}}</td>
<td>{{session.createdAt}}</td>
<td>{{session.clientIp}}</td>
<td>{{session.browser}} on {{session.os}} {{session.osVersion}}</td>
<td>
<button class="btn btn-danger btn-small" ng-click="revokeUserSession(session.id)">
<i class="fa fa-power-off"></i>
</button>
</td>
</tr>
</tbody>
</table>
</div>
<div class="gf-form-button-row">
<button ng-if="sessions.length" class="btn btn-danger" ng-click="revokeAllUserSessions()">
Logout user from all devices
</button>
</div>
<table class="filter-table form-inline">
<thead>
<tr>
<th>Last seen</th>
<th>Logged on</th>
<th>IP address</th>
<th>Browser &amp; OS</th>
<th></th>
</tr>
</thead>
<tbody>
<tr ng-repeat="session in sessions">
<td ng-if="session.isActive">Now</td>
<td ng-if="!session.isActive">{{session.seenAt}}</td>
<td>{{session.createdAt}}</td>
<td>{{session.clientIp}}</td>
<td>{{session.browser}} on {{session.os}} {{session.osVersion}}</td>
<td>
<button class="btn btn-danger btn-small" ng-click="revokeUserSession(session.id)">
<i class="fa fa-power-off"></i>
</button>
</td>
</tr>
</tbody>
</table>
</div>
<h3 class="page-heading">User status</h3>
<button ng-if="sessions.length" class="btn btn-danger" ng-click="revokeAllUserSessions()">
Logout user from all devices
</button>
<div class="gf-form-group">
<h3 class="page-heading">User status</h3>
<div class="gf-form-button-row">
<button
type="submit"
class="btn btn-danger"
ng-if="!user.isDisabled"
ng-click="disableUser($event)"
bs-tooltip="user.isExternal ? 'External user cannot be enabled or disabled' : ''"
ng-class="{'disabled': user.isExternal}"
bs-tooltip="user.authModule ? 'External user cannot be activated or deactivated' : ''"
ng-class="{'disabled': user.authModule}"
>
Disable
</button>
@@ -172,8 +168,8 @@
class="btn btn-primary"
ng-if="user.isDisabled"
ng-click="disableUser($event)"
bs-tooltip="user.isExternal ? 'External user cannot be enabled or disabled' : ''"
ng-class="{'disabled': user.isExternal}"
bs-tooltip="user.authModule ? 'External user cannot be activated or deactivated' : ''"
ng-class="{'disabled': user.authModule}"
>
Enable
</button>


@@ -55,9 +55,7 @@
</a>
</td>
<td class="text-right">
<span class="label label-tag" ng-style="user.authLabelStyle" ng-if="user.authLabel">
{{user.authLabel}}
</span>
<span class="label label-tag" ng-class="{'muted': user.isDisabled}" ng-if="ctrl.getAuthModule(user) === 'ldap'">LDAP</span>
</td>
<td class="text-right">
<span class="label label-tag label-tag--gray" ng-if="user.isDisabled">Disabled</span>


@@ -1,7 +1,6 @@
import angular from 'angular';
import * as fileExport from 'app/core/utils/file_export';
import appEvents from 'app/core/app_events';
import { DashboardSrv } from 'app/features/dashboard/services/DashboardSrv';
export class ExportDataModalCtrl {
private data: any[];
@@ -10,23 +9,14 @@ export class ExportDataModalCtrl {
dateTimeFormat = 'YYYY-MM-DDTHH:mm:ssZ';
excel = false;
/** @ngInject */
constructor(private dashboardSrv: DashboardSrv) {}
export() {
const timezone = this.dashboardSrv.getCurrent().timezone;
const options = {
excel: this.excel,
dateTimeFormat: this.dateTimeFormat,
timezone,
};
if (this.panel === 'table') {
fileExport.exportTableDataToCsv(this.data, this.excel);
} else {
if (this.asRows) {
fileExport.exportSeriesListToCsv(this.data, options);
fileExport.exportSeriesListToCsv(this.data, this.dateTimeFormat, this.excel);
} else {
fileExport.exportSeriesListToCsvColumns(this.data, options);
fileExport.exportSeriesListToCsvColumns(this.data, this.dateTimeFormat, this.excel);
}
}


@@ -22,7 +22,6 @@ import * as graphPanel from 'app/plugins/panel/graph/module';
import * as dashListPanel from 'app/plugins/panel/dashlist/module';
import * as pluginsListPanel from 'app/plugins/panel/pluginlist/module';
import * as alertListPanel from 'app/plugins/panel/alertlist/module';
import * as annoListPanel from 'app/plugins/panel/annolist/module';
import * as heatmapPanel from 'app/plugins/panel/heatmap/module';
import * as tablePanel from 'app/plugins/panel/table/module';
import * as table2Panel from 'app/plugins/panel/table2/module';
@@ -60,7 +59,6 @@ const builtInPlugins = {
'app/plugins/panel/dashlist/module': dashListPanel,
'app/plugins/panel/pluginlist/module': pluginsListPanel,
'app/plugins/panel/alertlist/module': alertListPanel,
'app/plugins/panel/annolist/module': annoListPanel,
'app/plugins/panel/heatmap/module': heatmapPanel,
'app/plugins/panel/table/module': tablePanel,
'app/plugins/panel/table2/module': table2Panel,


@@ -45,10 +45,6 @@
</div>
</div>
<div class="clearfix"></div>
<a class="btn btn-medium btn-service btn-service--github login-btn" href="login/saml" target="_self" ng-if="samlEnabled">
<i class="btn-service-icon fa fa-key"></i>
Sign in with SAML
</a>
<div class="login-oauth text-center" ng-show="oauthEnabled">
<a class="btn btn-medium btn-service btn-service--google login-btn" href="login/google" target="_self" ng-if="oauth.google">
<i class="btn-service-icon fa fa-google"></i>
@@ -72,6 +68,10 @@
<i class="btn-service-icon fa fa-sign-in"></i>
Sign in with {{oauth.generic_oauth.name}}
</a>
<a class="btn btn-medium btn-service btn-service--github login-btn" href="login/saml" target="_self" ng-if="samlEnabled">
<i class="btn-service-icon fa fa-key"></i>
Sign in with SAML
</a>
</div>
<div class="login-signup-box" ng-show="!disableUserSignUp">
<div class="login-signup-title p-r-1">


@@ -22,7 +22,7 @@ const DEFAULT_KEYS = ['job', 'namespace'];
const EMPTY_SELECTOR = '{}';
const HISTORY_ITEM_COUNT = 10;
const HISTORY_COUNT_CUTOFF = 1000 * 60 * 60 * 24; // 24h
const NS_IN_MS = 1000000;
const NS_IN_MS = 1_000_000;
export const LABEL_REFRESH_INTERVAL = 1000 * 30; // 30sec
const wrapLabel = (label: string) => ({ label });


@@ -1,194 +0,0 @@
// Libraries
import React, { PureComponent, ChangeEvent } from 'react';
// Components
import { PanelEditorProps, PanelOptionsGroup, PanelOptionsGrid, Switch, FormField, FormLabel } from '@grafana/ui';
import { toIntegerOrUndefined, toNumberString } from '@grafana/data';
// Types
import { AnnoOptions } from './types';
import { TagBadge } from 'app/core/components/TagFilter/TagBadge';
interface State {
tag: string;
}
export class AnnoListEditor extends PureComponent<PanelEditorProps<AnnoOptions>, State> {
constructor(props: PanelEditorProps<AnnoOptions>) {
super(props);
this.state = {
tag: '',
};
}
// Display
//-----------
onToggleShowUser = () =>
this.props.onOptionsChange({ ...this.props.options, showUser: !this.props.options.showUser });
onToggleShowTime = () =>
this.props.onOptionsChange({ ...this.props.options, showTime: !this.props.options.showTime });
onToggleShowTags = () =>
this.props.onOptionsChange({ ...this.props.options, showTags: !this.props.options.showTags });
// Navigate
//-----------
onNavigateBeforeChange = (event: ChangeEvent<HTMLInputElement>) => {
this.props.onOptionsChange({ ...this.props.options, navigateBefore: event.target.value });
};
onNavigateAfterChange = (event: ChangeEvent<HTMLInputElement>) => {
this.props.onOptionsChange({ ...this.props.options, navigateAfter: event.target.value });
};
onToggleNavigateToPanel = () =>
this.props.onOptionsChange({ ...this.props.options, navigateToPanel: !this.props.options.navigateToPanel });
// Search
//-----------
onLimitChange = (event: ChangeEvent<HTMLInputElement>) => {
const v = toIntegerOrUndefined(event.target.value);
this.props.onOptionsChange({ ...this.props.options, limit: v });
};
onToggleOnlyFromThisDashboard = () =>
this.props.onOptionsChange({
...this.props.options,
onlyFromThisDashboard: !this.props.options.onlyFromThisDashboard,
});
onToggleOnlyInTimeRange = () =>
this.props.onOptionsChange({ ...this.props.options, onlyInTimeRange: !this.props.options.onlyInTimeRange });
// Tags
//-----------
onTagTextChange = (event: ChangeEvent<HTMLInputElement>) => {
this.setState({ tag: event.target.value });
};
onTagClick = (e: React.SyntheticEvent, tag: string) => {
e.stopPropagation();
const tags = this.props.options.tags.filter(item => item !== tag);
this.props.onOptionsChange({
...this.props.options,
tags,
});
};
renderTags = (tags: string[]): JSX.Element => {
if (!tags || !tags.length) {
return null;
}
return (
<>
{tags.map(tag => {
return (
<span key={tag} onClick={e => this.onTagClick(e, tag)} className="pointer">
<TagBadge label={tag} removeIcon={true} count={0} />
</span>
);
})}
</>
);
};
render() {
const { options } = this.props;
const labelWidth = 8;
return (
<PanelOptionsGrid>
<PanelOptionsGroup title="Display">
<Switch
label="Show User"
labelClass={`width-${labelWidth}`}
checked={options.showUser}
onChange={this.onToggleShowUser}
/>
<Switch
label="Show Time"
labelClass={`width-${labelWidth}`}
checked={options.showTime}
onChange={this.onToggleShowTime}
/>
<Switch
label="Show Tags"
labelClass={`width-${labelWidth}`}
checked={options.showTags}
onChange={this.onToggleShowTags}
/>
</PanelOptionsGroup>
<PanelOptionsGroup title="Navigate">
<FormField
label="Before"
labelWidth={labelWidth}
onChange={this.onNavigateBeforeChange}
value={options.navigateBefore}
/>
<FormField
label="After"
labelWidth={labelWidth}
onChange={this.onNavigateAfterChange}
value={options.navigateAfter}
/>
<Switch
label="To Panel"
labelClass={`width-${labelWidth}`}
checked={options.navigateToPanel}
onChange={this.onToggleNavigateToPanel}
/>
</PanelOptionsGroup>
<PanelOptionsGroup title="Search">
<Switch
label="Only This Dashboard"
labelClass={`width-12`}
checked={options.onlyFromThisDashboard}
onChange={this.onToggleOnlyFromThisDashboard}
/>
<Switch
label="Within Time Range"
labelClass={`width-12`}
checked={options.onlyInTimeRange}
onChange={this.onToggleOnlyInTimeRange}
/>
<div className="form-field">
<FormLabel width={6}>Tags</FormLabel>
{this.renderTags(options.tags)}
<input
type="text"
className={`gf-form-input width-${8}`}
value={this.state.tag}
onChange={this.onTagTextChange}
onKeyPress={ev => {
if (this.state.tag && ev.key === 'Enter') {
const tags = [...options.tags, this.state.tag];
this.props.onOptionsChange({
...this.props.options,
tags,
});
this.setState({ tag: '' });
ev.preventDefault();
}
}}
/>
</div>
<FormField
label="Limit"
labelWidth={6}
onChange={this.onLimitChange}
value={toNumberString(options.limit)}
type="number"
/>
</PanelOptionsGroup>
</PanelOptionsGrid>
);
}
}


@@ -1,304 +0,0 @@
// Libraries
import React, { PureComponent } from 'react';
// Types
import { AnnoOptions } from './types';
import { dateTime, DurationUnit, AnnotationEvent } from '@grafana/data';
import { PanelProps, Tooltip } from '@grafana/ui';
import { getBackendSrv } from 'app/core/services/backend_srv';
import { AbstractList } from '@grafana/ui/src/components/List/AbstractList';
import { TagBadge } from 'app/core/components/TagFilter/TagBadge';
import { getDashboardSrv } from 'app/features/dashboard/services/DashboardSrv';
import appEvents from 'app/core/app_events';
import { updateLocation } from 'app/core/actions';
import { store } from 'app/store/store';
import { cx, css } from 'emotion';
interface UserInfo {
id: number;
login: string;
email: string;
}
interface Props extends PanelProps<AnnoOptions> {}
interface State {
annotations: AnnotationEvent[];
timeInfo: string;
loaded: boolean;
queryUser?: UserInfo;
queryTags: string[];
}
export class AnnoListPanel extends PureComponent<Props, State> {
constructor(props: Props) {
super(props);
this.state = {
annotations: [],
timeInfo: '',
loaded: false,
queryTags: [],
};
}
componentDidMount() {
this.doSearch();
}
componentDidUpdate(prevProps: Props, prevState: State) {
const { options, timeRange } = this.props;
const needsQuery =
options !== prevProps.options ||
this.state.queryTags !== prevState.queryTags ||
this.state.queryUser !== prevState.queryUser ||
timeRange !== prevProps.timeRange;
if (needsQuery) {
this.doSearch();
}
}
async doSearch() {
// http://docs.grafana.org/http_api/annotations/
// https://github.com/grafana/grafana/blob/master/public/app/core/services/backend_srv.ts
// https://github.com/grafana/grafana/blob/master/public/app/features/annotations/annotations_srv.ts
const { options } = this.props;
const { queryUser, queryTags } = this.state;
const params: any = {
tags: options.tags,
limit: options.limit,
type: 'annotation', // Skip the Annotations that are really alerts. (Use the alerts panel!)
};
if (options.onlyFromThisDashboard) {
params.dashboardId = getDashboardSrv().getCurrent().id;
}
let timeInfo = '';
if (options.onlyInTimeRange) {
const { timeRange } = this.props;
params.from = timeRange.from.valueOf();
params.to = timeRange.to.valueOf();
} else {
timeInfo = 'All Time';
}
if (queryUser) {
params.userId = queryUser.id;
}
if (options.tags && options.tags.length) {
params.tags = options.tags;
}
if (queryTags.length) {
params.tags = params.tags ? [...params.tags, ...queryTags] : queryTags;
}
const annotations = await getBackendSrv().get('/api/annotations', params);
this.setState({
annotations,
timeInfo,
loaded: true,
});
}
onAnnoClick = (e: React.SyntheticEvent, anno: AnnotationEvent) => {
e.stopPropagation();
const { options } = this.props;
const dashboardSrv = getDashboardSrv();
const current = dashboardSrv.getCurrent();
const params: any = {
from: this._timeOffset(anno.time, options.navigateBefore, true),
to: this._timeOffset(anno.time, options.navigateAfter, false),
};
if (options.navigateToPanel) {
params.panelId = anno.panelId;
params.fullscreen = true;
}
if (current.id === anno.dashboardId) {
store.dispatch(
updateLocation({
query: params,
partial: true,
})
);
return;
}
getBackendSrv()
.get('/api/search', { dashboardIds: anno.dashboardId })
.then((res: any[]) => {
if (res && res.length && res[0].id === anno.dashboardId) {
const dash = res[0];
store.dispatch(
updateLocation({
query: params,
path: dash.url,
})
);
return;
}
appEvents.emit('alert-warning', ['Unknown Dashboard: ' + anno.dashboardId]);
});
};
_timeOffset(time: number, offset: string, subtract = false): number {
let incr = 5;
let unit = 'm';
const parts = /^(\d+)(\w)/.exec(offset);
if (parts && parts.length === 3) {
incr = parseInt(parts[1], 10);
unit = parts[2];
}
const t = dateTime(time);
if (subtract) {
incr *= -1;
}
return t.add(incr, unit as DurationUnit).valueOf();
}
onTagClick = (e: React.SyntheticEvent, tag: string, remove: boolean) => {
e.stopPropagation();
const queryTags = remove ? this.state.queryTags.filter(item => item !== tag) : [...this.state.queryTags, tag];
this.setState({ queryTags });
};
onUserClick = (e: React.SyntheticEvent, anno: AnnotationEvent) => {
e.stopPropagation();
this.setState({
queryUser: {
id: anno.userId,
login: anno.login,
email: anno.email,
},
});
};
onClearUser = () => {
this.setState({
queryUser: undefined,
});
};
renderTags = (tags: string[], remove: boolean): JSX.Element => {
if (!tags || !tags.length) {
return null;
}
return (
<>
{tags.map(tag => {
return (
<span key={tag} onClick={e => this.onTagClick(e, tag, remove)} className="pointer">
<TagBadge label={tag} removeIcon={remove} count={0} />
</span>
);
})}
</>
);
};
renderItem = (anno: AnnotationEvent, index: number): JSX.Element => {
const { options } = this.props;
const { showUser, showTags, showTime } = options;
const dashboard = getDashboardSrv().getCurrent();
return (
<div className="dashlist-item">
<span
className="dashlist-link pointer"
onClick={e => {
this.onAnnoClick(e, anno);
}}
>
<span
className={cx([
'dashlist-title',
css`
margin-right: 8px;
`,
])}
>
{anno.text}
</span>
<span className="pluginlist-message">
{anno.login && showUser && (
<span className="graph-annotation">
<Tooltip
content={
<span>
Created by:
<br /> {anno.email}
</span>
}
theme="info"
placement="top"
>
<span onClick={e => this.onUserClick(e, anno)} className="graph-annotation__user">
<img src={anno.avatarUrl} />
</span>
</Tooltip>
</span>
)}
{showTags && this.renderTags(anno.tags, false)}
</span>
<span className="pluginlist-version">{showTime && <span>{dashboard.formatDate(anno.time)}</span>}</span>
</span>
</div>
);
};
render() {
const { height } = this.props;
const { loaded, annotations, queryUser, queryTags } = this.state;
if (!loaded) {
return <div>loading...</div>;
}
// Previously we showed indication that it covered all time
// { timeInfo && (
// <span className="panel-time-info">
// <i className="fa fa-clock-o" /> {timeInfo}
// </span>
// )}
const hasFilter = queryUser || queryTags.length > 0;
return (
<div style={{ height, overflow: 'scroll' }}>
{hasFilter && (
<div>
<b>Filter: &nbsp; </b>
{queryUser && (
<span onClick={this.onClearUser} className="pointer">
{queryUser.email}
</span>
)}
{queryTags.length > 0 && this.renderTags(queryTags, true)}
</div>
)}
{annotations.length < 1 && <div className="panel-alert-list__no-alerts">No Annotations Found</div>}
<AbstractList
items={annotations}
renderItem={this.renderItem}
getItemKey={item => {
return item.id + '';
}}
className="dashlist"
/>
</div>
);
}
}


@@ -1,4 +0,0 @@
# Annotation List Panel - Native Plugin
This Annotations List panel is **included** with Grafana.


@@ -1,119 +0,0 @@
<?xml version="1.0" encoding="iso-8859-1"?>
<!-- Generator: Adobe Illustrator 19.1.0, SVG Export Plug-In . SVG Version: 6.00 Build 0) -->
<svg version="1.1" id="Layer_1" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" x="0px" y="0px"
width="100px" height="100px" viewBox="0 0 100 100" style="enable-background:new 0 0 100 100;" xml:space="preserve">
<g>
<g>
<path style="fill:#666666;" d="M8.842,11.219h0.1c1.228,0,2.227-0.999,2.227-2.227v-0.1L8.842,11.219z"/>
<path style="fill:#666666;" d="M0.008,2.113l2.054-2.054C0.966,0.139,0.089,1.016,0.008,2.113z"/>
<polygon style="fill:#666666;" points="0,2.998 0,5.533 5.484,0.05 2.948,0.05 "/>
<polygon style="fill:#666666;" points="6.361,0.05 0,6.411 0,8.946 8.896,0.05 "/>
<path style="fill:#666666;" d="M11.169,2.277c0-0.068-0.004-0.134-0.01-0.2l-9.132,9.132c0.066,0.006,0.133,0.01,0.2,0.01h2.325
l6.617-6.617V2.277z"/>
<path style="fill:#666666;" d="M9.654,0.169L0.119,9.704c0.201,0.592,0.643,1.073,1.211,1.324l9.649-9.649
C10.728,0.812,10.247,0.37,9.654,0.169z"/>
<polygon style="fill:#666666;" points="11.169,5.479 5.429,11.219 7.964,11.219 11.169,8.014 "/>
</g>
<path style="fill:#898989;" d="M88.146,11.031H14.866c-1.011,0-1.83-0.82-1.83-1.83v-7.37c0-1.011,0.82-1.831,1.83-1.831h73.281
c1.011,0,1.83,0.82,1.83,1.831v7.37C89.977,10.212,89.157,11.031,88.146,11.031z"/>
<g>
<path style="fill:#666666;" d="M8.842,23.902h0.1c1.228,0,2.227-0.999,2.227-2.227v-0.1L8.842,23.902z"/>
<path style="fill:#666666;" d="M0.008,14.796l2.054-2.054C0.966,12.822,0.089,13.699,0.008,14.796z"/>
<polygon style="fill:#666666;" points="0,15.681 0,18.216 5.484,12.733 2.948,12.733 "/>
<polygon style="fill:#666666;" points="6.361,12.733 0,19.094 0,21.629 8.896,12.733 "/>
<path style="fill:#666666;" d="M11.169,14.96c0-0.068-0.004-0.134-0.01-0.2l-9.132,9.132c0.066,0.006,0.133,0.01,0.2,0.01h2.325
l6.617-6.617V14.96z"/>
<path style="fill:#666666;" d="M9.654,12.852l-9.536,9.536c0.201,0.592,0.643,1.073,1.211,1.324l9.649-9.649
C10.728,13.495,10.247,13.053,9.654,12.852z"/>
<polygon style="fill:#666666;" points="11.169,18.162 5.429,23.902 7.964,23.902 11.169,20.697 "/>
</g>
<path style="fill:#898989;" d="M88.146,23.714H14.866c-1.011,0-1.83-0.82-1.83-1.83v-7.37c0-1.011,0.82-1.83,1.83-1.83h73.281
c1.011,0,1.83,0.82,1.83,1.83v7.37C89.977,22.895,89.157,23.714,88.146,23.714z"/>
<g>
<path style="fill:#666666;" d="M8.842,36.585h0.1c1.228,0,2.227-0.999,2.227-2.227v-0.1L8.842,36.585z"/>
<path style="fill:#666666;" d="M0.008,27.479l2.054-2.054C0.966,25.505,0.089,26.382,0.008,27.479z"/>
<polygon style="fill:#666666;" points="0,28.364 0,30.899 5.484,25.416 2.948,25.416 "/>
<polygon style="fill:#666666;" points="6.361,25.416 0,31.777 0,34.312 8.896,25.416 "/>
<path style="fill:#666666;" d="M11.169,27.643c0-0.068-0.004-0.134-0.01-0.2l-9.132,9.132c0.066,0.006,0.133,0.01,0.2,0.01h2.325
l6.617-6.617V27.643z"/>
<path style="fill:#666666;" d="M9.654,25.535L0.119,35.07c0.201,0.592,0.643,1.073,1.211,1.324l9.649-9.649
C10.728,26.178,10.247,25.736,9.654,25.535z"/>
<polygon style="fill:#666666;" points="11.169,30.845 5.429,36.585 7.964,36.585 11.169,33.38 "/>
</g>
<path style="fill:#898989;" d="M88.146,36.397H14.866c-1.011,0-1.83-0.82-1.83-1.831v-7.37c0-1.011,0.82-1.83,1.83-1.83h73.281
c1.011,0,1.83,0.82,1.83,1.83v7.37C89.977,35.578,89.157,36.397,88.146,36.397z"/>
<g>
<path style="fill:#666666;" d="M8.842,49.268h0.1c1.228,0,2.227-0.999,2.227-2.227v-0.1L8.842,49.268z"/>
<path style="fill:#666666;" d="M0.008,40.162l2.054-2.054C0.966,38.188,0.089,39.065,0.008,40.162z"/>
<polygon style="fill:#666666;" points="0,41.047 0,43.582 5.484,38.099 2.948,38.099 "/>
<polygon style="fill:#666666;" points="6.361,38.099 0,44.46 0,46.995 8.896,38.099 "/>
<path style="fill:#666666;" d="M11.169,40.326c0-0.068-0.004-0.134-0.01-0.2l-9.132,9.132c0.066,0.006,0.133,0.01,0.2,0.01h2.325
l6.617-6.617V40.326z"/>
<path style="fill:#666666;" d="M9.654,38.218l-9.536,9.536c0.201,0.592,0.643,1.073,1.211,1.324l9.649-9.649
C10.728,38.861,10.247,38.419,9.654,38.218z"/>
<polygon style="fill:#666666;" points="11.169,43.528 5.429,49.268 7.964,49.268 11.169,46.063 "/>
</g>
<path style="fill:#898989;" d="M88.146,49.08H14.866c-1.011,0-1.83-0.82-1.83-1.831v-7.37c0-1.011,0.82-1.831,1.83-1.831h73.281
c1.011,0,1.83,0.82,1.83,1.831v7.37C89.977,48.261,89.157,49.08,88.146,49.08z"/>
<g>
<path style="fill:#666666;" d="M8.842,61.951h0.1c1.228,0,2.227-0.999,2.227-2.227v-0.1L8.842,61.951z"/>
<path style="fill:#666666;" d="M0.008,52.845l2.054-2.054C0.966,50.871,0.089,51.748,0.008,52.845z"/>
<polygon style="fill:#666666;" points="0,53.73 0,56.265 5.484,50.782 2.948,50.782 "/>
<polygon style="fill:#666666;" points="6.361,50.782 0,57.143 0,59.678 8.896,50.782 "/>
<path style="fill:#666666;" d="M11.169,53.009c0-0.068-0.004-0.134-0.01-0.2l-9.132,9.132c0.066,0.006,0.133,0.01,0.2,0.01h2.325
l6.617-6.617V53.009z"/>
<path style="fill:#666666;" d="M9.654,50.901l-9.536,9.536c0.201,0.592,0.643,1.073,1.211,1.324l9.649-9.649
C10.728,51.544,10.247,51.102,9.654,50.901z"/>
<polygon style="fill:#666666;" points="11.169,56.211 5.429,61.951 7.964,61.951 11.169,58.746 "/>
</g>
<path style="fill:#898989;" d="M88.146,61.763H14.866c-1.011,0-1.83-0.82-1.83-1.83v-7.37c0-1.011,0.82-1.831,1.83-1.831h73.281
c1.011,0,1.83,0.82,1.83,1.831v7.37C89.977,60.944,89.157,61.763,88.146,61.763z"/>
<g>
<path style="fill:#666666;" d="M8.842,74.634h0.1c1.228,0,2.227-0.999,2.227-2.227v-0.1L8.842,74.634z"/>
<path style="fill:#666666;" d="M0.008,65.528l2.054-2.054C0.966,63.554,0.089,64.431,0.008,65.528z"/>
<polygon style="fill:#666666;" points="0,66.413 0,68.948 5.484,63.465 2.948,63.465 "/>
<polygon style="fill:#666666;" points="6.361,63.465 0,69.826 0,72.361 8.896,63.465 "/>
<path style="fill:#666666;" d="M11.169,65.692c0-0.068-0.004-0.134-0.01-0.2l-9.132,9.132c0.066,0.006,0.133,0.01,0.2,0.01h2.325
l6.617-6.617V65.692z"/>
<path style="fill:#666666;" d="M9.654,63.584l-9.536,9.536c0.201,0.592,0.643,1.073,1.211,1.324l9.649-9.649
C10.728,64.227,10.247,63.785,9.654,63.584z"/>
<polygon style="fill:#666666;" points="11.169,68.894 5.429,74.634 7.964,74.634 11.169,71.429 "/>
</g>
<path style="fill:#898989;" d="M88.146,74.446H14.866c-1.011,0-1.83-0.82-1.83-1.83v-7.37c0-1.011,0.82-1.831,1.83-1.831h73.281
c1.011,0,1.83,0.82,1.83,1.831v7.37C89.977,73.627,89.157,74.446,88.146,74.446z"/>
<g>
<path style="fill:#666666;" d="M8.842,87.317h0.1c1.228,0,2.227-0.999,2.227-2.227v-0.1L8.842,87.317z"/>
<path style="fill:#666666;" d="M0.008,78.211l2.054-2.054C0.966,76.237,0.089,77.114,0.008,78.211z"/>
<polygon style="fill:#666666;" points="0,79.096 0,81.631 5.484,76.148 2.948,76.148 "/>
<polygon style="fill:#666666;" points="6.361,76.148 0,82.509 0,85.044 8.896,76.148 "/>
<path style="fill:#666666;" d="M11.169,78.375c0-0.068-0.004-0.134-0.01-0.2l-9.132,9.132c0.066,0.006,0.133,0.01,0.2,0.01h2.325
l6.617-6.617V78.375z"/>
<path style="fill:#666666;" d="M9.654,76.267l-9.536,9.536c0.201,0.592,0.643,1.073,1.211,1.324l9.649-9.649
C10.728,76.91,10.247,76.468,9.654,76.267z"/>
<polygon style="fill:#666666;" points="11.169,81.577 5.429,87.317 7.964,87.317 11.169,84.112 "/>
</g>
<path style="fill:#898989;" d="M88.146,87.129H14.866c-1.011,0-1.83-0.82-1.83-1.83v-7.37c0-1.011,0.82-1.831,1.83-1.831h73.281
c1.011,0,1.83,0.82,1.83,1.831v7.37C89.977,86.31,89.157,87.129,88.146,87.129z"/>
<g>
<path style="fill:#666666;" d="M8.842,100h0.1c1.228,0,2.227-0.999,2.227-2.227v-0.1L8.842,100z"/>
<path style="fill:#666666;" d="M0.008,90.894l2.054-2.054C0.966,88.92,0.089,89.797,0.008,90.894z"/>
<polygon style="fill:#666666;" points="0,91.779 0,94.314 5.484,88.831 2.948,88.831 "/>
<polygon style="fill:#666666;" points="6.361,88.831 0,95.192 0,97.727 8.896,88.831 "/>
<path style="fill:#666666;" d="M11.169,91.058c0-0.068-0.004-0.134-0.01-0.2L2.027,99.99c0.066,0.006,0.133,0.01,0.2,0.01h2.325
l6.617-6.617V91.058z"/>
<path style="fill:#666666;" d="M9.654,88.95l-9.536,9.536c0.201,0.592,0.643,1.073,1.211,1.324l9.649-9.649
C10.728,89.593,10.247,89.151,9.654,88.95z"/>
<polygon style="fill:#666666;" points="11.169,94.26 5.429,100 7.964,100 11.169,96.795 "/>
</g>
<path style="fill:#898989;" d="M88.146,99.812H14.866c-1.011,0-1.83-0.82-1.83-1.83v-7.37c0-1.011,0.82-1.83,1.83-1.83h73.281
c1.011,0,1.83,0.82,1.83,1.83v7.37C89.977,98.993,89.157,99.812,88.146,99.812z"/>
<circle style="fill:#F7941E;" cx="96.125" cy="5.637" r="3.875"/>
<circle style="fill:#898989;" cx="96.125" cy="18.37" r="3.875"/>
<circle style="fill:#898989;" cx="96.125" cy="31.104" r="3.875"/>
<circle style="fill:#F7941E;" cx="96.125" cy="43.837" r="3.875"/>
<circle style="fill:#F7941E;" cx="96.125" cy="56.57" r="3.875"/>
<circle style="fill:#898989;" cx="96.125" cy="69.304" r="3.875"/>
<circle style="fill:#F7941E;" cx="96.125" cy="82.037" r="3.875"/>
<circle style="fill:#898989;" cx="96.125" cy="94.77" r="3.875"/>
</g>
</svg>



@@ -1,16 +0,0 @@
import { AnnoListPanel } from './AnnoListPanel';
import { AnnoOptions, defaults } from './types';
import { AnnoListEditor } from './AnnoListEditor';
import { PanelPlugin } from '@grafana/ui';
export const plugin = new PanelPlugin<AnnoOptions>(AnnoListPanel)
.setDefaults(defaults)
.setEditor(AnnoListEditor)
// TODO, we should support this directly in the plugin infrastructure
.setPanelChangeHandler((options: AnnoOptions, prevPluginId: string, prevOptions: any) => {
if (prevPluginId === 'ryantxu-annolist-panel') {
return prevOptions as AnnoOptions;
}
return options;
});
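
The `setPanelChangeHandler` call above migrates options when a dashboard switches from the external `ryantxu-annolist-panel` to this built-in panel. The migration logic can be exercised in isolation (a sketch: the handler body is lifted from the plugin definition above, with `AnnoOptions` reduced to a single field for brevity):

```typescript
type AnnoOptions = { limit: number };

// Panel-change handler from the plugin definition above: when the
// previous plugin was the external annolist panel, keep its options
// as-is; otherwise fall back to the current options.
function onPanelChange(options: AnnoOptions, prevPluginId: string, prevOptions: any): AnnoOptions {
  if (prevPluginId === 'ryantxu-annolist-panel') {
    return prevOptions as AnnoOptions;
  }
  return options;
}

// Switching from the external plugin keeps the old options (limit 99);
// switching from any other panel keeps the current options (limit 10).
console.log(onPanelChange({ limit: 10 }, 'ryantxu-annolist-panel', { limit: 99 }).limit); // 99
console.log(onPanelChange({ limit: 10 }, 'graph', { limit: 99 }).limit); // 10
```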


@@ -1,20 +0,0 @@
{
"type": "panel",
"name": "Annotations list (alpha)",
"id": "annolist",
"state": "alpha",
"skipDataQuery": true,
"info": {
"description": "List annotations",
"author": {
"name": "Grafana Project",
"url": "https://grafana.com"
},
"logos": {
"small": "img/icn-annolist-panel.svg",
"large": "img/icn-annolist-panel.svg"
}
}
}


@@ -1,29 +0,0 @@
export interface AnnoOptions {
limit: number;
tags: string[];
onlyFromThisDashboard: boolean;
onlyInTimeRange: boolean;
showTags: boolean;
showUser: boolean;
showTime: boolean;
navigateBefore: string;
navigateAfter: string;
navigateToPanel: boolean;
}
export const defaults: AnnoOptions = {
limit: 10,
tags: [],
onlyFromThisDashboard: false,
onlyInTimeRange: false,
showTags: true,
showUser: true,
showTime: true,
navigateBefore: '10m',
navigateAfter: '10m',
navigateToPanel: true,
};


@@ -2283,10 +2283,10 @@
resolved "https://registry.yarnpkg.com/@types/json-schema/-/json-schema-7.0.3.tgz#bdfd69d61e464dcc81b25159c270d75a73c1a636"
integrity sha512-Il2DtDVRGDcqjDtE+rF8iqg1CArehSK84HZJCT7AMITlyXRBpuPhqGLDQMowraqqu1coEaimg4ZOqggt6L6L+A==
"@types/lodash@4.14.123":
version "4.14.123"
resolved "https://registry.yarnpkg.com/@types/lodash/-/lodash-4.14.123.tgz#39be5d211478c8dd3bdae98ee75bb7efe4abfe4d"
integrity sha512-pQvPkc4Nltyx7G1Ww45OjVqUsJP4UsZm+GWJpigXgkikZqJgRm4c48g027o6tdgubWHwFRF15iFd+Y4Pmqv6+Q==
"@types/lodash@4.14.119", "@types/lodash@4.14.123":
version "4.14.119"
resolved "https://registry.yarnpkg.com/@types/lodash/-/lodash-4.14.119.tgz#be847e5f4bc3e35e46d041c394ead8b603ad8b39"
integrity sha512-Z3TNyBL8Vd/M9D9Ms2S3LmFq2sSMzahodD6rCS9V2N44HUMINb75jNkSuwAx7eo2ufqTdfOdtGQpNbieUjPQmw==
"@types/marked@0.6.5":
version "0.6.5"
@@ -2568,11 +2568,6 @@
resolved "https://registry.yarnpkg.com/@types/tinycolor2/-/tinycolor2-1.4.2.tgz#721ca5c5d1a2988b4a886e35c2ffc5735b6afbdf"
integrity sha512-PeHg/AtdW6aaIO2a+98Xj7rWY4KC1E6yOy7AFknJQ7VXUGNrMlyxDFxJo7HqLtjQms/ZhhQX52mLVW/EX3JGOw==
"@types/tmp@^0.1.0":
version "0.1.0"
resolved "https://registry.yarnpkg.com/@types/tmp/-/tmp-0.1.0.tgz#19cf73a7bcf641965485119726397a096f0049bd"
integrity sha512-6IwZ9HzWbCq6XoQWhxLpDjuADodH/MKXRUIDFudvgjcVdjFknvmR+DNsoUeer4XPrEnrZs04Jj+kfV9pFsrhmA==
"@types/uglify-js@*":
version "3.0.4"
resolved "https://registry.yarnpkg.com/@types/uglify-js/-/uglify-js-3.0.4.tgz#96beae23df6f561862a830b4288a49e86baac082"
@@ -3437,7 +3432,7 @@ babel-plugin-add-react-displayname@^0.0.5:
version "0.0.5"
resolved "https://registry.yarnpkg.com/babel-plugin-add-react-displayname/-/babel-plugin-add-react-displayname-0.0.5.tgz#339d4cddb7b65fd62d1df9db9fe04de134122bd5"
babel-plugin-angularjs-annotate@0.10.0, babel-plugin-angularjs-annotate@^0.10.0:
babel-plugin-angularjs-annotate@0.10.0:
version "0.10.0"
resolved "https://registry.yarnpkg.com/babel-plugin-angularjs-annotate/-/babel-plugin-angularjs-annotate-0.10.0.tgz#4213b3aaae494a087aad0b8237c5d0716d22ca76"
dependencies:
@@ -4221,11 +4216,6 @@ caniuse-api@^3.0.0:
lodash.memoize "^4.1.2"
lodash.uniq "^4.5.0"
caniuse-db@1.0.30000772:
version "1.0.30000772"
resolved "https://registry.yarnpkg.com/caniuse-db/-/caniuse-db-1.0.30000772.tgz#51aae891768286eade4a3d8319ea76d6a01b512b"
integrity sha1-UarokXaChureSj2DGep21qAbUSs=
caniuse-lite@^1.0.0, caniuse-lite@^1.0.30000929, caniuse-lite@^1.0.30000947, caniuse-lite@^1.0.30000957, caniuse-lite@^1.0.30000963:
version "1.0.30000966"
resolved "https://registry.yarnpkg.com/caniuse-lite/-/caniuse-lite-1.0.30000966.tgz#f3c6fefacfbfbfb981df6dfa68f2aae7bff41b64"
@@ -10763,7 +10753,7 @@ ng-annotate-loader@0.6.1:
normalize-path "2.0.1"
source-map "0.5.6"
ng-annotate-webpack-plugin@0.3.0:
ng-annotate-webpack-plugin@0.3.0, ng-annotate-webpack-plugin@^0.3.0:
version "0.3.0"
resolved "https://registry.yarnpkg.com/ng-annotate-webpack-plugin/-/ng-annotate-webpack-plugin-0.3.0.tgz#2e7f5e29c6a4ce26649edcb06c1213408b35b84a"
dependencies:
@@ -15762,13 +15752,6 @@ tmp@^0.0.33:
dependencies:
os-tmpdir "~1.0.2"
tmp@^0.1.0:
version "0.1.0"
resolved "https://registry.yarnpkg.com/tmp/-/tmp-0.1.0.tgz#ee434a4e22543082e294ba6201dcc6eafefa2877"
integrity sha512-J7Z2K08jbGcdA1kkQpJSqLF6T0tdQqpR2pnSUXsIchbPdTI9v3e85cLW0d6WDhwuAleOV71j2xWs8qMPfK7nKw==
dependencies:
rimraf "^2.6.3"
tmpl@1.0.x:
version "1.0.4"
resolved "https://registry.yarnpkg.com/tmpl/-/tmpl-1.0.4.tgz#23640dd7b42d00433911140820e5cf440e521dd1"