---
weight: 4
title: Promtail setup
disableToc: true
menu:
  docs:
    parent: "victorialogs-data-ingestion"
    weight: 4
aliases:
  - /VictoriaLogs/data-ingestion/Promtail.html
  - /victorialogs/data-ingestion/Promtail.html
  - /victorialogs/data-ingestion/promtail.html
---

Promtail is the default log shipper for Grafana Loki. Promtail can be configured to send the collected logs to VictoriaLogs according to the following docs.

Specify the `clients` section in the Promtail configuration file for sending the collected logs to VictoriaLogs:

```yaml
clients:
  - url: http://localhost:9428/insert/loki/api/v1/push?_stream_fields=instance,job,host,app
```

Substitute the `localhost:9428` address inside `clients` with the real TCP address of VictoriaLogs.
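For reference, a complete minimal Promtail configuration could look like the sketch below. The `positions` file path, the `system` job name, the `varlogs` label and the `/var/log/*.log` glob are illustrative assumptions; only the `clients` url is specific to VictoriaLogs:

```yaml
server:
  http_listen_port: 9080          # Promtail's own HTTP port (assumed)
  grpc_listen_port: 0             # disable the gRPC listener

positions:
  filename: /tmp/positions.yaml   # assumed location for tracking read offsets

clients:
  # Loki-compatible push endpoint of VictoriaLogs
  - url: http://localhost:9428/insert/loki/api/v1/push?_stream_fields=instance,job,host,app

scrape_configs:
  - job_name: system              # assumed scrape job for illustration
    static_configs:
      - targets:
          - localhost
        labels:
          job: varlogs
          __path__: /var/log/*.log
```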

By default VictoriaLogs stores all the ingested logs in a single log stream. Storing all the logs in a single log stream may be inefficient, so it is recommended to specify the `_stream_fields` query arg with the list of labels, which uniquely identify log streams. There is no need to specify all the labels Promtail generates there - it is usually enough to specify the `instance` and `job` labels, as in the sketch below. See these docs for details.
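For example, a variant of the `clients` section which uses only the `instance` and `job` labels as stream fields could look like this (the remaining Promtail labels are then ingested as regular log fields):

```yaml
clients:
  # only the instance and job labels are used for log stream identification
  - url: http://localhost:9428/insert/loki/api/v1/push?_stream_fields=instance,job
```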

See also these docs for details on other supported query args. There is no need to specify the `_msg_field` and `_time_field` query args, since VictoriaLogs automatically extracts the log message and the timestamp from the ingested Loki data.

It is recommended to verify whether the initial setup generates the needed log fields and uses the correct stream fields. This can be done by specifying the `debug` query arg and then inspecting VictoriaLogs logs:

```yaml
clients:
  - url: http://localhost:9428/insert/loki/api/v1/push?_stream_fields=instance,job,host,app&debug=1
```

If some log fields must be skipped during data ingestion, then they can be put into the `ignore_fields` query arg. For example, the following config instructs VictoriaLogs to ignore the `filename` and `stream` fields in the ingested logs:

```yaml
clients:
  - url: http://localhost:9428/insert/loki/api/v1/push?_stream_fields=instance,job,host,app&ignore_fields=filename,stream
```

By default the ingested logs are stored in the `(AccountID=0, ProjectID=0)` tenant. If you need to store logs in another tenant, then specify the needed tenant via the `tenant_id` field in the Loki client configuration. The `tenant_id` must have the `AccountID:ProjectID` format, where `AccountID` and `ProjectID` are arbitrary uint32 numbers. For example, the following config instructs VictoriaLogs to store logs in the `(AccountID=12, ProjectID=34)` tenant:

```yaml
clients:
  - url: http://localhost:9428/insert/loki/api/v1/push?_stream_fields=instance,job,host,app&debug=1
    tenant_id: "12:34"
```

The ingested log entries can be queried according to these docs.

See also data ingestion troubleshooting docs.