Fix doc: vmbackup splits by 1 GiB not 100 MB (#1756)

This is a follow-up for bdd0a1cdb2
Dima Lazerka 2021-10-26 20:19:49 +04:00 committed by GitHub
parent d282a7593b
commit e706fb5686


@@ -107,7 +107,7 @@ The backup algorithm is the following:
    These are usually the biggest and the oldest files, which are shared between backups.
 5. Upload the remaining files from step 3 from `-snapshotName` to `-dst`.
-   The algorithm splits source files into 100 MB chunks in the backup. Each chunk stored as a separate file in the backup.
+   The algorithm splits source files into 1 GiB chunks in the backup. Each chunk stored as a separate file in the backup.
    Such splitting minimizes the amounts of data to re-transfer after temporary errors.
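The chunking arithmetic behind the corrected doc line can be sketched as follows. This is illustrative only, assuming a fixed 1 GiB chunk size; `splitOffsets` is a hypothetical helper, not vmbackup's actual implementation:

```go
package main

import "fmt"

// chunkSize mirrors the 1 GiB chunk size described in the docs
// (illustrative; vmbackup's real chunking lives in its backup code).
const chunkSize int64 = 1 << 30 // 1 GiB

// splitOffsets returns (offset, length) pairs a file of the given size
// would be split into. Each pair maps to one chunk file in the backup,
// so a temporary upload error forces re-transfer of a single chunk only.
func splitOffsets(fileSize int64) [][2]int64 {
	var chunks [][2]int64
	for off := int64(0); off < fileSize; off += chunkSize {
		n := chunkSize
		if fileSize-off < n {
			n = fileSize - off
		}
		chunks = append(chunks, [2]int64{off, n})
	}
	return chunks
}

func main() {
	// A 2.5 GiB file splits into two full 1 GiB chunks plus a 0.5 GiB tail.
	for _, c := range splitOffsets(5 << 29) {
		fmt.Println(c[0], c[1])
	}
	// 0 1073741824
	// 1073741824 1073741824
	// 2147483648 536870912
}
```

With 100 MB chunks the same file would produce dozens of backup objects; the 1 GiB size documented by this commit keeps the object count low while still bounding the cost of a retried upload to one chunk.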
 `vmbackup` relies on [instant snapshot](https://medium.com/@valyala/how-victoriametrics-makes-instant-snapshots-for-multi-terabyte-time-series-data-e1f3fb0e0282) properties: