Fix doc: vmbackup splits by 1 GiB not 100 MB (#1756)

This is a follow-up for bdd0a1cdb2
Dima Lazerka 2021-10-26 20:19:49 +04:00 committed by Aliaksandr Valialkin
parent 56970caded
commit 464f5d0910
GPG Key ID: A72BEC6CD3D0DED1


@@ -107,7 +107,7 @@ The backup algorithm is the following:
    These are usually the biggest and the oldest files, which are shared between backups.
 5. Upload the remaining files from step 3 from `-snapshotName` to `-dst`.
-   The algorithm splits source files into 100 MB chunks in the backup. Each chunk stored as a separate file in the backup.
+   The algorithm splits source files into 1 GiB chunks in the backup. Each chunk stored as a separate file in the backup.
    Such splitting minimizes the amounts of data to re-transfer after temporary errors.
 `vmbackup` relies on [instant snapshot](https://medium.com/@valyala/how-victoriametrics-makes-instant-snapshots-for-multi-terabyte-time-series-data-e1f3fb0e0282) properties:
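The re-transfer benefit described above comes from fixed-size chunking: after a temporary error, only the chunks that failed need to be re-uploaded, not the whole source file. A minimal sketch of the offset math with a 1 GiB chunk size (`chunkOffsets` is a hypothetical helper for illustration, not part of vmbackup's API):

```go
package main

import "fmt"

// chunkSize mirrors the 1 GiB chunk size the doc describes.
const chunkSize = 1 << 30 // 1 GiB

// chunkOffsets returns the (offset, length) of each chunk for a source
// file of fileSize bytes; the final chunk may be shorter than chunkSize.
func chunkOffsets(fileSize int64) [][2]int64 {
	var chunks [][2]int64
	for off := int64(0); off < fileSize; off += chunkSize {
		n := fileSize - off
		if n > chunkSize {
			n = chunkSize
		}
		chunks = append(chunks, [2]int64{off, n})
	}
	return chunks
}

func main() {
	// A 2.5 GiB file splits into three chunks: 1 GiB, 1 GiB, 0.5 GiB.
	for _, c := range chunkOffsets(5 << 29) {
		fmt.Println(c[0], c[1])
	}
}
```

Each (offset, length) pair would correspond to one separately stored file in the backup, which is what allows a retry to resume at chunk granularity.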