gzip - Single-File Compression
gzip compresses single files into the .gz format. It is commonly used on a WordPress VPS to compress SQL dumps, rotate logs, and reduce transfer size. For directory backups, you typically combine tar (archiving) with gzip (.tar.gz).
- Compress in-place: gzip file.sql → file.sql.gz
- Keep original: gzip -k file.sql
- Decompress: gunzip file.sql.gz or gzip -d file.sql.gz
- Stream to stdout: gzip -c file.sql > /backups/file.sql.gz
- Test integrity: gzip -t file.sql.gz
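These commands compose into a quick round trip you can rehearse with a throwaway file; a minimal sketch (the /tmp paths and sample data are illustrative):

```shell
# Fresh scratch directory for the demo (illustrative path).
rm -rf /tmp/gzip-demo && mkdir -p /tmp/gzip-demo && cd /tmp/gzip-demo

# Create a small stand-in for a SQL dump.
printf 'CREATE TABLE wp_posts (id INT);\n' > file.sql

gzip -k file.sql                      # compress, keep the original
gzip -t file.sql.gz                   # exit status 0 = archive is valid
zcat file.sql.gz                      # view contents without extracting

gunzip -c file.sql.gz > restored.sql  # decompress to a new name
cmp file.sql restored.sql && echo "round trip OK"
```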
When to use gzip
- Compress single files: SQL dumps (.sql.gz) and logs (.log.gz).
- Use tar.gz for directories (WordPress files are a directory tree).
- Choose gzip when you want maximum compatibility across Linux systems.
Prerequisites
- VPS with Linux (Ubuntu).
- Basic knowledge of file handling.
- Access to WordPress paths (/var/www/html, /home/backups, etc.).
Core syntax
# ──────────────────────────────
# 1. Compress a single file (same location)
gzip [OPTION] [INPUT_FILE]
# Example:
# gzip /home/dev_wpstrategist/public_html/error.log
# → Output: /home/dev_wpstrategist/public_html/error.log.gz
# ──────────────────────────────
# 2. Compress file from input location and save to specific output location
gzip -c [INPUT_FILE] > [OUTPUT_DIRECTORY]/[OUTPUT_FILE].gz
# Example:
# gzip -c /home/dev_wpstrategist/db.sql > /home/wpbackup/db_$(date +%F).sql.gz
# Input file: /home/dev_wpstrategist/db.sql
# Output file: /home/wpbackup/db_2025-10-07.sql.gz
# ──────────────────────────────
# 3. Compress all files in an input directory (recursive)
gzip -r [INPUT_DIRECTORY]
# Example:
# gzip -r /home/dev_wpstrategist/public_html/wp-content/uploads
# Input directory: /home/dev_wpstrategist/public_html/wp-content/uploads
# Output: Each file inside becomes .gz in same folder
# ──────────────────────────────
# 4. Decompress a .gz file (restore in same folder)
gunzip [INPUT_FILE.gz]
# Example:
# gunzip /home/wpbackup/db_2025-10-07.sql.gz
# Input file: /home/wpbackup/db_2025-10-07.sql.gz
# Output file: /home/wpbackup/db_2025-10-07.sql
# ──────────────────────────────
# 5. Decompress file from input location and save to different output location
gunzip -c [INPUT_FILE.gz] > [OUTPUT_DIRECTORY]/[OUTPUT_FILE]
# Example:
# gunzip -c /home/wpbackup/db_2025-10-07.sql.gz > /home/dev_wpstrategist/db_restored.sql
# Input file: /home/wpbackup/db_2025-10-07.sql.gz
# Output file: /home/dev_wpstrategist/db_restored.sql
# ──────────────────────────────
# 6. Decompress all .gz files inside a directory
gunzip -r [INPUT_DIRECTORY]
# Example:
# gunzip -r /home/wpbackup/
# Input directory: /home/wpbackup
# Output directory: same location (each .gz restored)
# ──────────────────────────────
# 7. View compressed file content without extracting
zcat [INPUT_FILE.gz]
# Example:
# zcat /home/wpbackup/db_2025-10-07.sql.gz | head -n 20
# Input file: /home/wpbackup/db_2025-10-07.sql.gz
# Output: text printed on screen only
# ──────────────────────────────
# 8. Compress a whole directory as one .tar.gz archive (Export)
tar -cvf - [INPUT_DIRECTORY] | gzip > [OUTPUT_DIRECTORY]/[ARCHIVE_NAME].tar.gz
# Example:
# tar -cvf - /home/dev_wpstrategist/public_html | gzip > /home/wpbackup/sitefiles_$(date +%F).tar.gz
# Input directory: /home/dev_wpstrategist/public_html
# Output file: /home/wpbackup/sitefiles_2025-10-07.tar.gz
# ──────────────────────────────
# 9. Extract a .tar.gz archive to target directory (Import)
gunzip -c [INPUT_FILE.tar.gz] | tar -xvf - -C [OUTPUT_DIRECTORY]
# Example:
# gunzip -c /home/wpbackup/sitefiles_2025-10-07.tar.gz | tar -xvf - -C /home/dev_wpstrategist/public_html
# Input file: /home/wpbackup/sitefiles_2025-10-07.tar.gz
# Output directory: /home/dev_wpstrategist/public_html
# ──────────────────────────────
# 10. Export database and compress (Database → Compressed Backup)
mysqldump -u [USER] -p'[PASSWORD]' [DATABASE] | gzip > [OUTPUT_DIRECTORY]/[DB_NAME]_$(date +%F).sql.gz
# Example:
# mysqldump -u DB_USER -p'DB_PASS' DB_NAME \
# | gzip > /home/wpbackup/db-dev_wpstrategist_$(date +%F).sql.gz
# Input: database content
# Output file: /home/wpbackup/db-dev_wpstrategist_2025-10-07.sql.gz
# ──────────────────────────────
# 11. Import database from compressed backup (Compressed Backup → Database)
gunzip -c [INPUT_FILE.gz] | mysql -u [USER] -p'[PASSWORD]' [DATABASE]
# Example:
# gunzip -c /home/wpbackup/db-dev_wpstrategist_2025-10-07.sql.gz \
# | mysql -u DB_USER -p'DB_PASS' DB_NAME
# Input file: /home/wpbackup/db-dev_wpstrategist_2025-10-07.sql.gz
# Output: data restored into MySQL database
Key options
| Option | Description | Example | Use Case |
|---|---|---|---|
| -c | Write output to stdout, keep original file | gzip -c wp.sql > wp.sql.gz | Backup without deleting source |
| -d | Decompress (same as gunzip) | gzip -d wp.sql.gz | Restore WordPress DB backup |
| -k | Keep original file after compression | gzip -k wp.log | Keep both compressed & raw logs |
| -r | Compress recursively in directories | gzip -r /var/log/ | Compress all logs in subfolders |
| -1 … -9 | Compression levels (fastest -1, best -9) | gzip -9 backup.tar | Save max space on big archives |
| -l | List compression stats | gzip -l backup.sql.gz | Check ratio & space saved |
| -t | Test integrity of compressed file | gzip -t backup.sql.gz | Verify before restore |
| -v | Verbose mode | gzip -v backup.sql | Show compression details |
Examples with expected output
Compress a file
gzip wp-config.php
Output:
ls
wp-config.php.gz
Explanation: Original file replaced with compressed .gz.
Use Case: Shrink config files before sending.
Keep original file (-k)
gzip -k wp-config.php
Output:
ls
wp-config.php wp-config.php.gz
Explanation: Keeps both raw and compressed file.
Use Case: Useful when you still need the source file.
Compress a WordPress SQL dump
gzip -9 wpdb.sql
Output:
wpdb.sql.gz # much smaller size
Use Case: Store DB backup with max compression.
Decompress a file
gunzip wpdb.sql.gz
Output:
wpdb.sql
Explanation: Restores original file.
View a compressed file without extracting
zcat wpdb.sql.gz | head -n 5
Output:
-- MySQL dump 10.13 Distrib 8.0.22
--
-- Host: localhost Database: wpdb
Use Case: Peek into compressed DB file.
Compress all logs in /var/log/
gzip -r /var/log/
Output:
All .log files become .log.gz.
Use Case: Free up VPS disk space.
Test integrity of a .gz file
gzip -t wpdb.sql.gz && echo "Valid file"
Output:
Valid file
Use Case: Ensures backup is not corrupted.
Show compression ratio
gzip -l wpdb.sql.gz
Output:
compressed uncompressed ratio uncompressed_name
1.2M 10.5M 88.6% wpdb.sql
Use Case: See how much disk space you saved.
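To see these numbers for yourself, you can generate a repetitive file and compare sizes; a minimal sketch (illustrative paths, GNU stat assumed):

```shell
rm -rf /tmp/gzip-ratio && mkdir -p /tmp/gzip-ratio && cd /tmp/gzip-ratio

# Repetitive data (like a SQL dump) compresses extremely well.
yes "INSERT INTO wp_options VALUES (1,'x');" | head -n 10000 > dump.sql

gzip -k dump.sql
gzip -l dump.sql.gz   # compressed size, uncompressed size, ratio

# The same comparison via stat (GNU coreutils):
echo "raw: $(stat -c%s dump.sql) bytes, gz: $(stat -c%s dump.sql.gz) bytes"
```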
WordPress VPS use cases
- Backups: Compress MySQL dumps (wpdb.sql → wpdb.sql.gz).
- Logs: Reduce space by compressing logs in /var/log/.
- Uploads: Package and compress wp-content/uploads/.
- Transfer: Faster scp/rsync of compressed files.
Best practices
- Always use -9 for database backups.
- Use gzip -t before restoring backups.
- Pair with tar for directories (tar -czf backup.tar.gz /var/www/html).
- Use -k when you want to keep the original.
- Automate with cron for daily backups.
Quick lab
- Create a SQL dump of the WordPress DB: mysqldump wpdb > wpdb.sql
- Compress it: gzip -9 wpdb.sql
- Verify: gzip -t wpdb.sql.gz && echo "Backup OK"
- Restore test: gunzip wpdb.sql.gz, then mysql -u root -p wpdb < wpdb.sql
Cheat sheet
gzip file # compress file
gunzip file.gz # decompress
gzip -k file # keep original
gzip -9 file # max compression
zcat file.gz # view without extracting
gzip -l file.gz # show compression stats
gzip -t file.gz # test compressed file
gzip -r dir/ # compress directory files
Mini quiz
- What does gzip -k do?
- How do you check if wpdb.sql.gz is valid?
- Which command compresses all files in /var/log/?
- Which option gives the highest compression level?
- How can you read inside a .gz file without extracting?
Deep dive: gzip, gunzip, zcat, and splitting
The sections below explain how gzip and related tools behave on a VPS. Use them to inspect compressed backups without extracting, restore safely, and split large archives when transport limits require it.
Core syntax (expanded)
gzip [OPTION] [FILE...]
gunzip [OPTION] [FILE.gz...]
zcat [FILE.gz...]
gzip - the compressor
- Purpose: Compress one or more files using the GNU zip algorithm.
- Default Behavior:
  - Replaces the original file with a new one that has a .gz extension.
  - Example: wp-config.php → wp-config.php.gz
- Compression Ratio: Typically reduces file size by 50–90%, depending on file type.
- Common Options:
  - -k → Keep original file.
  - -v → Show compression progress.
  - -9 → Maximum compression.
Example:
gzip -9 wpdb.sql
Result:
Creates wpdb.sql.gz and removes the original wpdb.sql (unless -k used).
Use Case:
Compress MySQL dumps or logs before backup or transfer.
gunzip - the decompressor
- Purpose: Decompress .gz files back to their original state.
- Equivalent Command: gzip -d
- Default Behavior:
  - Removes the .gz suffix.
  - Restores the original file name.
- Common Options:
  - -k → Keep compressed file after extraction.
  - -v → Show decompression details.
Example:
gunzip wpdb.sql.gz
Output:
wpdb.sql
Explanation:
The compressed backup is restored to its original .sql file.
Use Case:
Used before restoring WordPress database or viewing old logs.
zcat - view without extracting
- Purpose: View contents of a compressed file without decompressing it physically on disk.
- Equivalent Command: gunzip -c
- Usage: Ideal for checking logs or SQL files on the fly.
Example:
zcat wpdb.sql.gz | head -n 10
Output:
Displays the first 10 lines of the compressed database dump.
Use Case:
Check content integrity before restoration or processing.
How to decompress a .gz file
There are four ways to decompress (un-gzip) a file:
| Method | Command | Description |
|---|---|---|
| 1 | gunzip file.gz | Standard decompression; removes .gz |
| 2 | gzip -d file.gz | Same as gunzip |
| 3 | zcat file.gz > file | Writes decompressed content to new file |
| 4 | zgrep pattern file.gz | Search text inside .gz directly |
Example:
gzip -d wpdb.sql.gz
Expected Output:
wpdb.sql
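The zcat-to-file and zgrep methods from the table can be tried on a throwaway file; a minimal sketch (paths and table name are illustrative):

```shell
rm -rf /tmp/gzip-methods && mkdir -p /tmp/gzip-methods && cd /tmp/gzip-methods
printf 'CREATE TABLE wp_users (id INT);\n' > schema.sql
gzip -k schema.sql

# Method 3: write decompressed content to a new file; the .gz stays intact.
zcat schema.sql.gz > schema_copy.sql

# Method 4: search inside the .gz without extracting it.
zgrep -c 'wp_users' schema.sql.gz   # prints: 1
```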
Can we split a .gz file?
Yes, but not natively with gzip alone.
gzip compresses one file into a single .gz stream. To split it safely:
| Method | Command | Explanation |
|---|---|---|
| split | split -b 100M backup.tar.gz part_ | Splits into 100 MB parts |
| zipsplit (for .zip) | — | Works only for ZIP, not GZIP |
| Combine again | cat part_* > backup.tar.gz | Merges parts back before un-gzipping |
| Extract after merge | gunzip backup.tar.gz | Decompress full file |
Tip:
If you need to compress and split a large backup, use tar with gzip and split:
tar -czf - /var/www/html | split -b 500M - backup.tar.gz.part-
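The pipeline can be rehearsed end to end at toy scale before trusting it with a real site; a minimal sketch with kilobyte-sized parts (paths and sizes are illustrative):

```shell
rm -rf /tmp/split-demo && mkdir -p /tmp/split-demo/site && cd /tmp/split-demo
yes "sample content" | head -n 5000 > site/page.html

# Stream a .tar.gz to stdout and split it into 4 KB parts.
tar -czf - site | split -b 4k - backup.tar.gz.part-
ls backup.tar.gz.part-*   # part-aa, part-ab, ... (count depends on data)

# Recombine and list the archive to prove the parts are intact.
cat backup.tar.gz.part-* > backup.tar.gz
tar -tzf backup.tar.gz    # lists site/ and site/page.html
```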
Summary table
| Command | Function | Extension | Keeps Original | Output Example |
|---|---|---|---|---|
| gzip file | Compress | .gz | ❌ | file.gz |
| gzip -k file | Compress + keep | .gz | ✅ | file, file.gz |
| gunzip file.gz | Decompress | Removes .gz | ❌ | file |
| gzip -d file.gz | Same as above | Removes .gz | ❌ | file |
| zcat file.gz | View content | None | ✅ | Printed output |
| split -b 100M file.gz part_ | Split large .gz | Multiple parts | ✅ | part_aa, part_ab, etc. |
So which one has native split support? Is it tar?
Excellent — this is a very important distinction when managing large backups in Linux, especially for WordPress sites.
Let’s clarify this precisely
Split support by tool
| Tool | Native Split Feature | Explanation |
|---|---|---|
| gzip | ❌ No | gzip compresses a single file into one continuous stream. It cannot split files natively. You must pair it with split for large files. |
| gunzip / zcat | ❌ No | These only decompress or view .gz files; they can’t split or segment compressed data. |
| tar | ⚠️ Partially (with external tools) | tar itself doesn’t natively split, but it can output to stdout, making it easy to pipe into split. |
| zip / zipsplit | ✅ Yes (native) | The zip format supports native multi-part volumes via zipsplit (only for .zip, not .gz). |
Best practical approach for splitting large archives
Since gzip doesn’t support splitting, the most reliable workflow for large WordPress site backups (10GB–50GB+) is to combine tar, gzip, and split.
Example: compress and split a WordPress folder
tar -czf - /var/www/html | split -b 500M - backup.tar.gz.part-
Explanation:
- tar -czf - → creates a compressed .tar.gz stream on standard output (no file yet).
- split -b 500M - backup.tar.gz.part- → splits that stream into 500 MB parts (backup.tar.gz.part-aa, backup.tar.gz.part-ab, etc.).
Expected Output:
backup.tar.gz.part-aa
backup.tar.gz.part-ab
backup.tar.gz.part-ac
...
Recombine and extract later
cat backup.tar.gz.part-* > backup.tar.gz
tar -xzf backup.tar.gz
Explanation:
- cat merges all split parts in the correct order.
- tar -xzf decompresses and extracts the original directory structure.
Key takeaways
| Purpose | Recommended Command | Notes |
|---|---|---|
| Compress single file | gzip file | Simple .gz output |
| Compress folder | tar -czf backup.tar.gz folder/ | Combines + compresses |
| Split large backup | tar -czf - folder/ \| split -b 500M - backup.tar.gz.part- | Pipe the stream into split |
| Merge parts | cat part-* > full.tar.gz | Before extraction |
| Extract | tar -xzf full.tar.gz | Restores entire site |
Real WordPress Use Case
Imagine you are backing up your production site:
tar -czf - /var/www/html/metaxenith.com | split -b 1000M - /home/backups/metaxenith_$(date +%F).tar.gz.part-
- Your backup is automatically compressed and chunked into 1GB parts.
- Perfect for uploading to Google Drive / Rclone, as many cloud systems have file size limits.
- Easy to rejoin and extract anytime.
In short
| Task | Native Split Support |
|---|---|
| gzip | ❌ No |
| gunzip | ❌ No |
| zcat | ❌ No |
| tar | ⚠️ Via pipe + split |
| zip | ✅ Yes (via zipsplit) |
What is a good archiving strategy for a WordPress site of about 100 GB (folder plus database)? We need to send the backup with rclone.
WordPress 100 GB backup and archive strategy (rclone transfer)
Goal
Efficiently compress, split, and transfer a large WordPress site (≈ 100 GB total including /wp-content/ and database) from your VPS to cloud storage using Rclone — safely, resumably, and recoverably.
Understand the structure
Typical 100 GB breakdown:
| Component | Size | Example Path | Notes |
|---|---|---|---|
| WordPress files | 90 GB | /var/www/html/ | themes, plugins, uploads, cache |
| Database | 10 GB | /var/lib/mysql/ or dump file | mysqldump output |
| Total | ≈100 GB | — | Needs splitting & compression |
Step-by-step strategy
Prepare a dedicated backup folder
mkdir -p /home/wpbackup
cd /home/wpbackup
Dump the database
Use mysqldump with gzip on the fly to save space:
mysqldump -u root -p wpdb | gzip > wpdb_$(date +%F).sql.gz
Output Example:
wpdb_2025-10-04.sql.gz (≈ 2–5 GB after compression)
Compress WordPress files (stream + split)
Since 90 GB is too large for a single archive, use tar + gzip + split pipeline:
tar -czf - /var/www/html | split -b 2000M - /home/wpbackup/wpfiles_$(date +%F).tar.gz.part-
Explanation:
- tar -czf - → stream a .tar.gz archive to stdout.
- split -b 2000M → split into 2 GB chunks (wpfiles_2025-10-04.tar.gz.part-aa, part-ab, etc.).
- The lone - before the output prefix tells split to read from standard input.
- Safe for very large directories (100+ GB).
Expected Output:
wpfiles_2025-10-04.tar.gz.part-aa
wpfiles_2025-10-04.tar.gz.part-ab
wpfiles_2025-10-04.tar.gz.part-ac
...
Verify the archive integrity
Run this to confirm all parts are valid gzip streams:
cat wpfiles_2025-10-04.tar.gz.part-* | gzip -t && echo "Archive verified"
Expected output:
Archive verified
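Beyond a whole-stream gzip -t, a per-part checksum manifest lets you re-verify each chunk after upload. A minimal sketch using sha256sum (assumed present on the VPS; toy-sized data, illustrative paths):

```shell
rm -rf /tmp/verify-demo && mkdir -p /tmp/verify-demo && cd /tmp/verify-demo
yes "row data" | head -n 2000 | gzip | split -b 1k - dump.sql.gz.part-

# Record one checksum per part; ship the manifest with the backup.
sha256sum dump.sql.gz.part-* > parts.sha256

# After transfer (or any time later), verify every part.
sha256sum -c parts.sha256

# Confirm the recombined stream is still a valid gzip file.
cat dump.sql.gz.part-* | gzip -t && echo "stream OK"
```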
Transfer using rclone
Use Rclone with resumable uploads, checksum verification, and bandwidth control.
Example: upload to Google Drive remote named gdrive
rclone copy /home/wpbackup gdrive:wpbackup --progress --transfers=4 --checkers=8 --bwlimit=8M --retries=5
Explanation:
- --transfers=4 → upload 4 files in parallel.
- --checkers=8 → use 8 checksum threads.
- --bwlimit=8M → limit bandwidth (optional).
- --retries=5 → retry failed uploads automatically.
Tip: For long transfers on unstable connections, raise --retries (e.g. --retries 10) and consider --low-level-retries.
Clean up old backups (optional)
Keep only last 7 days:
find /home/wpbackup -type f -mtime +7 -delete
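You can check what -mtime +7 will match before letting it delete anything; a minimal sketch that backdates a file with GNU touch (names and paths are illustrative):

```shell
rm -rf /tmp/retention-demo && mkdir -p /tmp/retention-demo && cd /tmp/retention-demo
touch fresh.sql.gz
touch -d '10 days ago' stale.sql.gz    # backdate a file (GNU touch)

# -mtime +7 matches files last modified more than 7 days ago.
find . -type f -mtime +7               # prints: ./stale.sql.gz
find . -type f -mtime +7 -delete
ls                                     # only fresh.sql.gz remains
```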
Restoration workflow
To restore from Rclone cloud:
rclone copy gdrive:wpbackup /home/restore --progress
cd /home/restore
cat wpfiles_2025-10-04.tar.gz.part-* > wpfiles.tar.gz
tar -xzf wpfiles.tar.gz -C /var/www/html/
gunzip -c wpdb_2025-10-04.sql.gz | mysql -u root -p wpdb
Explanation:
- Merge split parts with cat.
- Extract using tar.
- Restore the DB from the .gz dump.
Recommended compression and split strategy
| Type | Command | Recommended | Why |
|---|---|---|---|
| gzip | tar -czf | ✅ | Fast, stable, widely supported |
| xz | tar -cJf | ⚠️ | Higher compression, slower |
| zstd | tar -I zstd -cf | ⚙️ | Modern, very fast (if installed) |
| split size | -b 2000M | ✅ | Cloud upload friendly (2 GB per file) |
Example cron automation
Schedule nightly backup at 2 AM:
0 2 * * * tar -czf - /var/www/html | split -b 2000M - /home/wpbackup/wpfiles_$(date +\%F).tar.gz.part- && mysqldump -u root -pYOURPASS wpdb | gzip > /home/wpbackup/wpdb_$(date +\%F).sql.gz && rclone copy /home/wpbackup gdrive:wpbackup --progress --transfers=4 --checkers=8 --retries=5 && find /home/wpbackup -type f -mtime +7 -delete
- Fully automated
- Split archives for large size
- Upload to cloud
- Auto-delete older backups
Performance tips
| Optimization | Description |
|---|---|
| Use nice or ionice | Prevents backup from hogging CPU/IO |
| Exclude cache directories | e.g., --exclude='wp-content/cache/*' |
| Use gzip level -6 | Good balance of speed vs. compression |
| Run during low-traffic hours | Reduces server load |
| Monitor with htop / iotop | Check CPU/disk usage |
| Test restore regularly | Never trust backups you haven’t tested |
Example folder layout
/home/wpbackup/
├── wpfiles_2025-10-04.tar.gz.part-aa
├── wpfiles_2025-10-04.tar.gz.part-ab
├── wpdb_2025-10-04.sql.gz
└── upload_log.txt
Final checklist before sending
| Step | Command | Status |
|---|---|---|
| Check total size | du -sh /home/wpbackup | ✅ |
| Verify gzip integrity | gzip -t file.gz | ✅ |
| Verify tar structure | tar -tzf file.tar.gz \| head | ✅ |
| Confirm split count | ls \| grep part- | ✅ |
| Test rejoin & extract | cat part-* \| tar -tzf - > /dev/null | ✅ |
Summary table
| Task | Recommended Tool | Command |
|---|---|---|
| Compress + split large folder | tar + gzip + split | tar -czf - folder \| split -b 2000M - backup.tar.gz.part- |
| Compress + dump database | mysqldump + gzip | mysqldump db \| gzip > db.sql.gz |
| Upload to cloud | rclone copy | rclone copy /home/wpbackup remote:wpbackup |
| Merge + extract | cat + tar | cat part-* > full.tar.gz && tar -xzf full.tar.gz |
Verdict
Best combination for 100 GB+ WordPress sites:
tar + gzip + split + rclone
- Stable and widely supported.
- Compatible with all VPS and cloud storage systems.
- Works for automation (cron).
- Recoverable even if 1 part fails (only re-upload that part).
Can we combine gzip with the mysqldump syntax?
Yes — 100% you can combine mysqldump with gzip in a single command.
This is a very common best practice for VPS and WordPress database backups, especially when you plan to upload the resulting .gz file using Rclone or move it between servers.
Let’s go step-by-step so you fully understand the syntax, behavior, expected output, and use cases.
Combine mysqldump with gzip
Standard MySQL dump
Normally you might run:
mysqldump -u root -p wpdatabase > wpdatabase.sql
But this can produce a huge plain SQL file (10–20 GB for large sites).
So instead, we pipe the dump output directly into gzip
Compressed MySQL dump command
mysqldump -u root -p wpdatabase | gzip > /home/wpbackup/wpdatabase_$(date +%F).sql.gz
Explanation of each component
| Part | Function |
|---|---|
| mysqldump | Exports the database as plain text (SQL statements). |
| \| (pipe) | Sends the dump stream straight into gzip, with no intermediate file. |
| gzip | Compresses the incoming stream. |
| > | Redirects the compressed stream to a .gz file. |
| $(date +%F) | Adds a date stamp for versioned backups. |
Expected output
In /home/wpbackup/, you’ll see:
wpdatabase_2025-10-04.sql.gz
The file will usually be 60–90% smaller than the raw SQL file.
For a 10 GB database, you might get a 1–3 GB .gz file.
Restore from .gz later
Restore by decompressing first
gunzip wpdatabase_2025-10-04.sql.gz
mysql -u root -p wpdatabase < wpdatabase_2025-10-04.sql
Restore by streaming into mysql
gunzip -c wpdatabase_2025-10-04.sql.gz | mysql -u root -p wpdatabase
Explanation:
- -c tells gunzip to send decompressed content to stdout, which is then piped directly into MySQL.
Add compression level
You can specify gzip level -1 (fastest) to -9 (best compression):
mysqldump -u root -p wpdatabase | gzip -9 > /home/wpbackup/wpdatabase_$(date +%F).sql.gz
Tip:
Use -6 for good speed/space balance on large databases.
Combine with split for huge databases
If your SQL dump exceeds several GB, combine gzip with split:
mysqldump -u root -p wpdatabase | gzip | split -b 2000M - /home/wpbackup/wpdb_$(date +%F).sql.gz.part-
Expected Output:
wpdb_2025-10-04.sql.gz.part-aa
wpdb_2025-10-04.sql.gz.part-ab
...
To restore:
cat wpdb_2025-10-04.sql.gz.part-* | gunzip | mysql -u root -p wpdatabase
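The dump → split → restore cycle can be rehearsed without touching MySQL by substituting a plain text file for the dump; a minimal sketch (paths and line count are illustrative):

```shell
rm -rf /tmp/dbsplit-demo && mkdir -p /tmp/dbsplit-demo && cd /tmp/dbsplit-demo

# Stand-in for: mysqldump ... | gzip | split
yes "INSERT INTO wp_posts VALUES (1);" | head -n 3000 \
  | gzip | split -b 1k - wpdb.sql.gz.part-

# Stand-in for: cat part-* | gunzip | mysql ...
cat wpdb.sql.gz.part-* | gunzip > restored.sql
wc -l restored.sql   # prints: 3000 restored.sql
```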
Automated cron example
Schedule nightly compressed DB backups:
0 2 * * * mysqldump -u root -pYOURPASS wpdatabase | gzip -9 > /home/wpbackup/wpdatabase_$(date +\%F).sql.gz && rclone copy /home/wpbackup/wpdatabase_$(date +\%F).sql.gz gdrive:wpbackup --progress
Automatically compresses
Automatically uploads
Automatically adds date stamp
Summary table
| Task | Command | Notes |
|---|---|---|
| Dump & compress | mysqldump db \| gzip > db.sql.gz | Single pipeline, no temp file |
| Dump, compress, and split | mysqldump db \| gzip \| split -b 2000M - db.sql.gz.part- | For multi-GB dumps |
| Restore (compressed) | gunzip -c db.sql.gz \| mysql db | Streams directly into MySQL |
| Restore (split parts) | cat part-* \| gunzip \| mysql db | Merge, decompress, import |
Key takeaways
- gzip can be safely combined with mysqldump using a pipe (|).
- This saves massive disk space and speeds up Rclone transfers.
- You can restore directly from .gz or even from split archives.
- Don’t forget to test your restore command at least once a month.
Example: daily MySQL + gzip + rclone backup script
This is a practical template for compressing a WordPress database dump and pushing it off-host.
Do not hardcode database passwords in scripts. Prefer an interactive prompt, or a dedicated MySQL config file (for example --defaults-extra-file) with tight permissions.
Backup script (template)
#!/usr/bin/env bash
set -euo pipefail
BACKUP_DIR="/backups"
DB_NAME="DB_NAME"
MYSQL_DEFAULTS="/root/.my.cnf"
RCLONE_REMOTE="remote:wp-backups"
mkdir -p "$BACKUP_DIR"
TS="$(date +%F)"
OUT_FILE="$BACKUP_DIR/wp-db-$TS.sql.gz"
mysqldump --defaults-extra-file="$MYSQL_DEFAULTS" --single-transaction "$DB_NAME" \
| gzip > "$OUT_FILE"
gzip -t "$OUT_FILE"
rclone copy "$OUT_FILE" "$RCLONE_REMOTE/" --checksum
Example cron entry
0 2 * * * /usr/local/bin/wp-db-backup-gzip-rclone.sh