# Docker compose Vector integration with VictoriaLogs for Docker

The folder contains an example of integrating Vector with VictoriaLogs for collecting Docker logs.
To spin up the environment, run the following command:

```sh
docker compose up -d
```
To shut down the docker-compose environment, run the following commands:

```sh
docker compose down
docker compose rm -f
```
The docker compose file contains the following components:

- vector - Vector is configured to collect logs from Docker; you can find the configuration in `vector.toml`. It writes the log data to VictoriaLogs and pushes its own metrics to VictoriaMetrics.
- VictoriaLogs - the log database; it accepts the data from `vector` via the Elasticsearch protocol.
- VictoriaMetrics - collects metrics from `VictoriaLogs` and `VictoriaMetrics` (a quick check of this wiring is shown below).
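Once the stack is up, you can verify that the services are running and that VictoriaLogs exposes metrics for VictoriaMetrics to scrape. This is a minimal sketch assuming the default port mapping of `9428` for VictoriaLogs; adjust it if your compose file maps the port differently:

```sh
# List the services started by docker compose
docker compose ps

# VictoriaLogs exposes Prometheus-compatible metrics, which VictoriaMetrics scrapes
curl -s http://localhost:9428/metrics | head
```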
## Querying the data
- vmui - a web UI accessible at `http://localhost:9428/select/vmui`
- for querying the data via the command line, please check these docs (a minimal example is shown below)
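As a quick command-line sketch, VictoriaLogs can be queried over HTTP with `curl` against its `/select/logsql/query` endpoint; the example below assumes the default local setup on port `9428`:

```sh
# Query VictoriaLogs for log entries containing the word "error"
curl -s http://localhost:9428/select/logsql/query -d 'query=error'
```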
The example of Vector configuration (`vector.toml`):
```toml
[sources.docker]
type = "docker_logs"

[transforms.msg_parser]
type = "remap"
inputs = ["docker"]
source = '''
.log = parse_json!(.message)
del(.message)
'''

[sinks.vlogs]
type = "elasticsearch"
inputs = [ "msg_parser" ]
endpoints = [ "http://victorialogs:9428/insert/elasticsearch/" ]
mode = "bulk"
api_version = "v8"
compression = "gzip"
healthcheck.enabled = false

[sinks.vlogs.query]
_msg_field = "log.msg"
_time_field = "timestamp"
_stream_fields = "source_type,host,container_name"

[sinks.vlogs.request.headers]
AccountID = "0"
ProjectID = "0"
```
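For illustration, the sink above writes to VictoriaLogs' Elasticsearch-compatible bulk API. The following sketch pushes a single hand-crafted log entry to the same endpoint; the field values, the container name and the `_time` format are hypothetical and only meant to show the shape of the request (here the default `_msg` and `_time` field names are used instead of the query parameters set in the sink config):

```sh
# Manually push one log entry via the Elasticsearch-compatible bulk API
# (the same endpoint the Vector sink writes to)
curl -X POST 'http://localhost:9428/insert/elasticsearch/_bulk' \
  -H 'Content-Type: application/json' \
  -H 'AccountID: 0' -H 'ProjectID: 0' \
  --data-binary '{"create":{}}
{"_msg":"manual test entry","_time":"2024-01-01T00:00:00Z","container_name":"manual-test"}
'
```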
Please note that the `_stream_fields` parameter must follow the recommended best practices for log stream fields in order to achieve better performance.
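Because `container_name` is declared as a stream field above, queries can be narrowed to a single log stream, which is where the performance benefit shows up. A hedged example, assuming a container named `vector` is producing logs:

```sh
# Query only the log stream of the "vector" container using a LogsQL stream filter
curl -s http://localhost:9428/select/logsql/query \
  -d 'query=_stream:{container_name="vector"} error'
```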