From ed473c94ff574a3c79f942665936a05482906550 Mon Sep 17 00:00:00 2001
From: Aliaksandr Valialkin
Date: Mon, 21 Sep 2020 21:49:02 +0300
Subject: [PATCH] docs/vmagent.md: typo fix

---
 app/vmagent/README.md | 2 +-
 docs/vmagent.md       | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/app/vmagent/README.md b/app/vmagent/README.md
index d596cc59fb..0ed6484b08 100644
--- a/app/vmagent/README.md
+++ b/app/vmagent/README.md
@@ -41,7 +41,7 @@ Just download `vmutils-*` archive from [releases page](https://github.com/Victor
 and pass the following flags to `vmagent` binary in order to start scraping Prometheus targets:
 
 * `-promscrape.config` with the path to Prometheus config file (it is usually located at `/etc/prometheus/prometheus.yml`)
-* `-remoteWrite.url` with the remote storage endpoint such as VictoriaMetrics. The `-remoteWrite.url` argument can be specified multiple times in order to replicate data concurrently to an arbitrary amount of remote storage systems.
+* `-remoteWrite.url` with the remote storage endpoint such as VictoriaMetrics. The `-remoteWrite.url` argument can be specified multiple times in order to replicate data concurrently to an arbitrary number of remote storage systems.
 
 Example command line:
 
diff --git a/docs/vmagent.md b/docs/vmagent.md
index d596cc59fb..0ed6484b08 100644
--- a/docs/vmagent.md
+++ b/docs/vmagent.md
@@ -41,7 +41,7 @@ Just download `vmutils-*` archive from [releases page](https://github.com/Victor
 and pass the following flags to `vmagent` binary in order to start scraping Prometheus targets:
 
 * `-promscrape.config` with the path to Prometheus config file (it is usually located at `/etc/prometheus/prometheus.yml`)
-* `-remoteWrite.url` with the remote storage endpoint such as VictoriaMetrics. The `-remoteWrite.url` argument can be specified multiple times in order to replicate data concurrently to an arbitrary amount of remote storage systems.
+* `-remoteWrite.url` with the remote storage endpoint such as VictoriaMetrics. The `-remoteWrite.url` argument can be specified multiple times in order to replicate data concurrently to an arbitrary number of remote storage systems.
 
 Example command line: