gzip - Single-File Compression

gzip compresses single files into the .gz format. It is commonly used on a WordPress VPS to compress SQL dumps, rotate logs, and reduce transfer size. For directory backups, you typically combine tar (archiving) with gzip (.tar.gz).

Quick Summary
  • Compress in-place: gzip file.sql → file.sql.gz
  • Keep original: gzip -k file.sql
  • Decompress: gunzip file.sql.gz or gzip -d file.sql.gz
  • Stream to stdout: gzip -c file.sql > /backups/file.sql.gz
  • Test integrity: gzip -t file.sql.gz

When to use gzip

  • Compress single files: SQL dumps (.sql.gz) and logs (.log.gz).
  • Use tar.gz for directories (WordPress files are a directory tree).
  • Choose gzip when you want maximum compatibility across Linux systems.
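
A quick demonstration of the second point: plain gzip skips directories, which is why directory trees need tar (a sketch using a throwaway folder name):

```sh
# gzip ignores a bare directory; wrap the tree with tar instead
mkdir -p demo_site
gzip demo_site || true              # gzip warns and skips the directory
tar -czf demo_site.tar.gz demo_site # correct approach for directory trees
ls demo_site.tar.gz
```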

Prerequisites

  • VPS with Linux (Ubuntu).
  • Basic knowledge of file handling.
  • Access to WordPress paths (/var/www/html, /home/backups, etc.).

Core syntax

gzip-core-syntax.sh
# ──────────────────────────────
# 1. Compress a single file (same location)
gzip [OPTION] [INPUT_FILE]

# Example:
# gzip /home/dev_wpstrategist/public_html/error.log
# → Output: /home/dev_wpstrategist/public_html/error.log.gz

# ──────────────────────────────
# 2. Compress file from input location and save to specific output location
gzip -c [INPUT_FILE] > [OUTPUT_DIRECTORY]/[OUTPUT_FILE].gz

# Example:
# gzip -c /home/dev_wpstrategist/db.sql > /home/wpbackup/db_$(date +%F).sql.gz
# Input file: /home/dev_wpstrategist/db.sql
# Output file: /home/wpbackup/db_2025-10-07.sql.gz

# ──────────────────────────────
# 3. Compress all files in an input directory (recursive)
gzip -r [INPUT_DIRECTORY]

# Example:
# gzip -r /home/dev_wpstrategist/public_html/wp-content/uploads
# Input directory: /home/dev_wpstrategist/public_html/wp-content/uploads
# Output: Each file inside becomes .gz in same folder

# ──────────────────────────────
# 4. Decompress a .gz file (restore in same folder)
gunzip [INPUT_FILE.gz]

# Example:
# gunzip /home/wpbackup/db_2025-10-07.sql.gz
# Input file: /home/wpbackup/db_2025-10-07.sql.gz
# Output file: /home/wpbackup/db_2025-10-07.sql

# ──────────────────────────────
# 5. Decompress file from input location and save to different output location
gunzip -c [INPUT_FILE.gz] > [OUTPUT_DIRECTORY]/[OUTPUT_FILE]

# Example:
# gunzip -c /home/wpbackup/db_2025-10-07.sql.gz > /home/dev_wpstrategist/db_restored.sql
# Input file: /home/wpbackup/db_2025-10-07.sql.gz
# Output file: /home/dev_wpstrategist/db_restored.sql

# ──────────────────────────────
# 6. Decompress all .gz files inside a directory
gunzip -r [INPUT_DIRECTORY]

# Example:
# gunzip -r /home/wpbackup/
# Input directory: /home/wpbackup
# Output directory: same location (each .gz restored)

# ──────────────────────────────
# 7. View compressed file content without extracting
zcat [INPUT_FILE.gz]

# Example:
# zcat /home/wpbackup/db_2025-10-07.sql.gz | head -n 20
# Input file: /home/wpbackup/db_2025-10-07.sql.gz
# Output: text printed on screen only

# ──────────────────────────────
# 8. Compress a whole directory as one .tar.gz archive (Export)
tar -cvf - [INPUT_DIRECTORY] | gzip > [OUTPUT_DIRECTORY]/[ARCHIVE_NAME].tar.gz

# Example:
# tar -cvf - /home/dev_wpstrategist/public_html | gzip > /home/wpbackup/sitefiles_$(date +%F).tar.gz
# Input directory: /home/dev_wpstrategist/public_html
# Output file: /home/wpbackup/sitefiles_2025-10-07.tar.gz

# ──────────────────────────────
# 9. Extract a .tar.gz archive to target directory (Import)
gunzip -c [INPUT_FILE.tar.gz] | tar -xvf - -C [OUTPUT_DIRECTORY]

# Example:
# gunzip -c /home/wpbackup/sitefiles_2025-10-07.tar.gz | tar -xvf - -C /home/dev_wpstrategist/public_html
# Input file: /home/wpbackup/sitefiles_2025-10-07.tar.gz
# Output directory: /home/dev_wpstrategist/public_html

# ──────────────────────────────
# 10. Export database and compress (Database → Compressed Backup)
mysqldump -u [USER] -p'[PASSWORD]' [DATABASE] | gzip > [OUTPUT_DIRECTORY]/[DB_NAME]_$(date +%F).sql.gz

# Example:
# mysqldump -u DB_USER -p'DB_PASS' DB_NAME \
# | gzip > /home/wpbackup/db-dev_wpstrategist_$(date +%F).sql.gz
# Input: database content
# Output file: /home/wpbackup/db-dev_wpstrategist_2025-10-07.sql.gz

# ──────────────────────────────
# 11. Import database from compressed backup (Compressed Backup → Database)
gunzip -c [INPUT_FILE.gz] | mysql -u [USER] -p'[PASSWORD]' [DATABASE]

# Example:
# gunzip -c /home/wpbackup/db-dev_wpstrategist_2025-10-07.sql.gz \
# | mysql -u DB_USER -p'DB_PASS' DB_NAME
# Input file: /home/wpbackup/db-dev_wpstrategist_2025-10-07.sql.gz
# Output: data restored into MySQL database


Key options

| Option | Description | Example | Use Case |
|---|---|---|---|
| -c | Write output to stdout, keep original file | gzip -c wp.sql > wp.sql.gz | Backup without deleting source |
| -d | Decompress (same as gunzip) | gzip -d wp.sql.gz | Restore WordPress DB backup |
| -k | Keep original file after compression | gzip -k wp.log | Keep both compressed & raw logs |
| -r | Compress recursively in directories | gzip -r /var/log/ | Compress all logs in subfolders |
| -1 to -9 | Compression level (fastest -1, best -9) | gzip -9 backup.tar | Save max space on big archives |
| -l | List compression stats | gzip -l backup.sql.gz | Check ratio & space saved |
| -t | Test integrity of compressed file | gzip -t backup.sql.gz | Verify before restore |
| -v | Verbose mode | gzip -v backup.sql | Show compression details |

Examples with expected output

Compress a file

gzip-compress-a-file-002.sh
gzip wp-config.php

Output:

gzip-compress-a-file-004.sh
ls
wp-config.php.gz

Explanation: Original file replaced with compressed .gz.

Use Case: Shrink config files before sending.


Keep original file (-k)

gzip-keep-original-file-k-006.sh
gzip -k wp-config.php

Output:

gzip-keep-original-file-k-008.sh
ls
wp-config.php wp-config.php.gz

Explanation: Keeps both raw and compressed file.

Use Case: Useful when you still need the source file.


Compress a WordPress SQL dump

gzip-compress-a-wordpress-sql-dump-010.sh
gzip -9 wpdb.sql

Output:

gzip-compress-a-wordpress-sql-dump-012.sh
wpdb.sql.gz # much smaller size

Use Case: Store DB backup with max compression.


Decompress a file

gzip-decompress-a-file-014.sh
gunzip wpdb.sql.gz

Output:

gzip-decompress-a-file-016.sh
wpdb.sql

Explanation: Restores original file.


View a compressed file without extracting

gzip-view-a-compressed-file-without-extracting-018.sh
zcat wpdb.sql.gz | head -n 5

Output:

gzip-view-a-compressed-file-without-extracting-020.sql
-- MySQL dump 10.13 Distrib 8.0.22
--
-- Host: localhost Database: wpdb

Use Case: Peek into compressed DB file.


Compress all logs in /var/log/

gzip-compress-all-logs-in-var-log-022.sh
gzip -r /var/log/

Output:

All .log files become .log.gz.

Use Case: Free up VPS disk space.
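
On a busy server, avoid compressing a log that a service is still writing to. A more conservative variant (path and age threshold are placeholders) targets only older logs:

```sh
# Compress only .log files untouched for 7+ days; active logs are left alone
find /var/log -name '*.log' -mtime +7 -exec gzip -v {} +
```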


Test integrity of a .gz file

gzip-test-integrity-of-a-gz-file-024.sh
gzip -t wpdb.sql.gz && echo "Valid file"

Output:

gzip-test-integrity-of-a-gz-file-026.sh
Valid file

Use Case: Ensures backup is not corrupted.


Show compression ratio

gzip-show-compression-ratio-028.sh
gzip -l wpdb.sql.gz

Output:

gzip-show-compression-ratio-030.sh
compressed  uncompressed  ratio  uncompressed_name
      1.2M         10.5M  88.6%  wpdb.sql

Use Case: See how much disk space you saved.


WordPress VPS use cases

  • Backups: Compress MySQL dumps (wpdb.sql → wpdb.sql.gz).
  • Logs: Reduce space by compressing logs in /var/log/.
  • Uploads: Package and compress wp-content/uploads/.
  • Transfer: Faster scp/rsync of compressed files.
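
The transfer point is easy to check yourself: compress a copy, compare sizes, and ship the smaller file (the file name is a placeholder):

```sh
ls -lh wpdb.sql      # original size
gzip -k wpdb.sql     # keep the source, produce wpdb.sql.gz
ls -lh wpdb.sql.gz   # usually far smaller; scp/rsync this file instead
```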

Best practices

  • Always use -9 for database backups.
  • Use gzip -t before restoring backups.
  • Pair with tar for directories (tar -czf backup.tar.gz /var/www/html).
  • Use -k when you want to keep the original.
  • Automate with cron for daily backups.

Quick lab

  1. Create a SQL dump of WordPress DB.

    mysqldump wpdb > wpdb.sql

  2. Compress it:

    gzip -9 wpdb.sql

  3. Verify:

    gzip -t wpdb.sql.gz && echo "Backup OK"

  4. Restore test:

    gunzip wpdb.sql.gz
    mysql -u root -p wpdb < wpdb.sql


Cheat sheet

gzip-cheat-sheet-032.sh
gzip file # compress file
gunzip file.gz # decompress
gzip -k file # keep original
gzip -9 file # max compression
zcat file.gz # view without extracting
gzip -l file.gz # show compression stats
gzip -t file.gz # test compressed file
gzip -r dir/ # compress directory files


Mini quiz

  1. What does gzip -k do?
  2. How do you check if wpdb.sql.gz is valid?
  3. Which command compresses all files in /var/log/?
  4. Which option gives the highest compression level?
  5. How can you read inside a .gz file without extracting?

Deep dive: gzip, gunzip, zcat, and splitting

The sections below explain how gzip and related tools behave on a VPS. Use them to inspect compressed backups without extracting, restore safely, and split large archives when transport limits require it.


Core syntax (expanded)

gzip-core-syntax-expanded-034.sh
gzip [OPTION] [FILE...]
gunzip [OPTION] [FILE.gz...]
zcat [FILE.gz...]


gzip - the compressor

  • Purpose: Compress one or more files using the GNU zip algorithm.
  • Default Behavior:
    • Replaces the original file with a new one that has a .gz extension.
    • Example: wp-config.php → wp-config.php.gz
  • Compression Ratio: Typically reduces file size by 50–90%, depending on file type.
  • Common Options:
    • -k → Keep original file.
    • -v → Show compression progress.
    • -9 → Maximum compression.

Example:

gzip-gzip-the-compressor-036.sh
gzip -9 wpdb.sql

Result:

Creates wpdb.sql.gz and removes the original wpdb.sql (unless -k used).

Use Case:

Compress MySQL dumps or logs before backup or transfer.


gunzip - the decompressor

  • Purpose: Decompress .gz files back to their original state.
  • Equivalent Command: gzip -d
  • Default Behavior:
    • Removes .gz suffix.
    • Restores original file name.
  • Common Options:
    • -k → Keep compressed file after extraction.
    • -v → Show decompression details.

Example:

gzip-gunzip-the-decompressor-038.sh
gunzip wpdb.sql.gz

Output:

gzip-gunzip-the-decompressor-040.sh
wpdb.sql

Explanation:

The compressed backup is restored to its original .sql file.

Use Case:

Used before restoring WordPress database or viewing old logs.


zcat - view without extracting

  • Purpose: View contents of a compressed file without decompressing it physically on disk.
  • Equivalent Command: gunzip -c
  • Usage: Ideal for checking logs or SQL files on the fly.

Example:

gzip-zcat-view-without-extracting-042.sh
zcat wpdb.sql.gz | head -n 10

Output:

Displays the first 10 lines of the compressed database dump.

Use Case:

Check content integrity before restoration or processing.


How to decompress a .gz file

There are four ways to decompress (un-gzip) a file:

| Method | Command | Description |
|---|---|---|
| 1 | gunzip file.gz | Standard decompression; removes .gz |
| 2 | gzip -d file.gz | Same as gunzip |
| 3 | zcat file.gz > file | Writes decompressed content to a new file |
| 4 | zgrep pattern file.gz | Search text inside .gz directly |

Example:

gzip-how-to-decompress-a-gz-file-044.sh
gzip -d wpdb.sql.gz

Expected Output:

gzip-how-to-decompress-a-gz-file-046.sh
wpdb.sql
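
Method 4 deserves its own example: zgrep searches inside the compressed file without writing anything to disk (the dump name matches the examples above):

```sh
# Count CREATE TABLE statements inside a compressed dump
zgrep -c "CREATE TABLE" wpdb.sql.gz
```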


Can we split a .gz file?

Yes, but not natively with gzip alone.

gzip compresses one file into a single .gz stream. To split it safely:

| Method | Command | Explanation |
|---|---|---|
| split | split -b 100M backup.tar.gz part_ | Splits into 100 MB parts |
| zipsplit | zipsplit archive.zip | Works only for ZIP, not GZIP |
| Combine again | cat part_* > backup.tar.gz | Merges parts back before un-gzipping |
| Extract after merge | gunzip backup.tar.gz | Decompress the full file |

Tip:

If you need to compress and split a large backup, use tar with gzip and split:

gzip-can-we-split-a-gz-file-048.sh
tar -czf - /var/www/html | split -b 500M - backup.tar.gz.part-


Summary table

| Command | Function | Extension | Keeps Original | Output Example |
|---|---|---|---|---|
| gzip file | Compress | .gz | No | file.gz |
| gzip -k file | Compress + keep | .gz | Yes | file, file.gz |
| gunzip file.gz | Decompress | Removes .gz | No | file |
| gzip -d file.gz | Same as above | Removes .gz | No | file |
| zcat file.gz | View content | None | Yes | Printed output |
| split -b 100M file.gz part_ | Split large .gz | Multiple parts | Yes | part_aa, part_ab, etc. |

So which tool has native split support? Is it tar?

This is an important distinction when managing large backups in Linux, especially for WordPress sites. Let's clarify it precisely.
Let’s clarify this precisely


Split support by tool

| Tool | Native Split Feature | Explanation |
|---|---|---|
| gzip | ❌ No | gzip compresses a single file into one continuous stream. It cannot split files natively; pair it with split for large files. |
| gunzip / zcat | ❌ No | These only decompress or view .gz files; they can't split or segment compressed data. |
| tar | ⚠️ Partially (with external tools) | tar itself doesn't split natively, but it can write to stdout, making it easy to pipe into split. |
| zip / zipsplit | ✅ Yes (native) | The zip format supports native multi-part volumes via zipsplit (only for .zip, not .gz). |

Best practical approach for splitting large archives

Since gzip doesn’t support splitting, the most reliable workflow for large WordPress site backups (10GB–50GB+) is to combine tar, gzip, and split.

Example: compress and split a WordPress folder

gzip-example-compress-and-split-a-wordpress-folder-050.sh
tar -czf - /var/www/html | split -b 500M - backup.tar.gz.part-

Explanation:

  • tar -czf - → Creates a compressed .tar.gz stream to standard output (no file yet).
  • split -b 500M - backup.tar.gz.part- → Splits that stream into 500MB parts (backup.tar.gz.part-aa, backup.tar.gz.part-ab, etc.).

Expected Output:

gzip-example-compress-and-split-a-wordpress-folder-052.txt
backup.tar.gz.part-aa
backup.tar.gz.part-ab
backup.tar.gz.part-ac
...


Recombine and extract later

gzip-recombine-and-extract-later-054.sh
cat backup.tar.gz.part-* > backup.tar.gz
tar -xzf backup.tar.gz

Explanation:

  • cat merges all split parts in correct order.
  • tar -xzf decompresses and extracts the original directory structure.

Key takeaways

| Purpose | Recommended Command | Notes |
|---|---|---|
| Compress single file | gzip file | Simple .gz output |
| Compress folder | tar -czf backup.tar.gz folder/ | Combines + compresses |
| Split large backup | tar -czf - folder/ \| split -b 500M - backup.tar.gz.part- | Stream, then chunk |
| Merge parts | cat part-* > full.tar.gz | Before extraction |
| Extract | tar -xzf full.tar.gz | Restores entire site |

Real WordPress Use Case

Imagine you are backing up your production site:

gzip-real-wordpress-use-case-056.sh
tar -czf - /var/www/html/metaxenith.com | split -b 1000M - /home/backups/metaxenith_$(date +%F).tar.gz.part-

  • Your backup is automatically compressed and chunked into 1GB parts.
  • Perfect for uploading to Google Drive / Rclone, as many cloud systems have file size limits.
  • Easy to rejoin and extract anytime.

In short

TaskNative Split Support
gzip❌ No
gunzip❌ No
zcat❌ No
tar⚠️ Via pipe + split
zip✅ Yes (via zipsplit)

Strategy question: for a WordPress site with a 100 GB folder plus its database, what archiving strategy should we use so we can send the backup with rclone?


WordPress 100 GB backup and archive strategy (rclone transfer)


Goal

Efficiently compress, split, and transfer a large WordPress site (≈ 100 GB total including /wp-content/ and database) from your VPS to cloud storage using Rclone — safely, resumably, and recoverably.


Understand the structure

Typical 100 GB breakdown:

| Component | Size | Example Path | Notes |
|---|---|---|---|
| WordPress files | 90 GB | /var/www/html/ | themes, plugins, uploads, cache |
| Database | 10 GB | /var/lib/mysql/ or dump file | mysqldump output |
| Total | ≈100 GB | | Needs splitting & compression |

Step-by-step strategy

Prepare a dedicated backup folder

gzip-prepare-a-dedicated-backup-folder-058.sh
mkdir -p /home/wpbackup
cd /home/wpbackup


Dump the database

Use mysqldump with gzip on the fly to save space:

gzip-dump-the-database-060.sh
mysqldump -u root -p wpdb | gzip > wpdb_$(date +%F).sql.gz

Output Example:

gzip-dump-the-database-062.txt
wpdb_2025-10-04.sql.gz (≈ 2–5 GB after compression)


Compress WordPress files (stream + split)

Since 90 GB is too large for a single archive, use tar + gzip + split pipeline:

gzip-compress-wordpress-files-stream-split-064.sh
tar -czf - /var/www/html | split -b 2000M - /home/wpbackup/wpfiles_$(date +%F).tar.gz.part-

Explanation:

  • tar -czf - → stream a .tar.gz archive to stdout.
  • split -b 2000M → split into 2 GB chunks (wpfiles_2025-10-04.tar.gz.part-aa, part-ab, etc.).
  • The - before the output prefix tells split to read from standard input.
  • Safe for very large directories (100+ GB).

Expected Output:

gzip-compress-wordpress-files-stream-split-066.txt
wpfiles_2025-10-04.tar.gz.part-aa
wpfiles_2025-10-04.tar.gz.part-ab
wpfiles_2025-10-04.tar.gz.part-ac
...


Verify the archive integrity

Run this to confirm all parts are valid gzip streams:

gzip-verify-the-archive-integrity-068.sh
cat wpfiles_2025-10-04.tar.gz.part-* | gzip -t && echo "Archive verified"

Expected output:

gzip-verify-the-archive-integrity-070.txt
Archive verified


Transfer using rclone

Use Rclone with resumable uploads, checksum verification, and bandwidth control.

Example: upload to Google Drive remote named gdrive

gzip-transfer-using-rclone-072.sh
rclone copy /home/wpbackup gdrive:wpbackup --progress --transfers=4 --checkers=8 --bwlimit=8M --retries=5

Explanation:

  • --transfers=4 → upload 4 files in parallel.
  • --checkers=8 → use 8 checksum threads.
  • --bwlimit=8M → limit bandwidth (optional).
  • --retries=5 → retry failed uploads automatically.

Tip: rclone retries failed transfers on its own; raise --retries (for example --retries 10) for long transfers on unstable connections.


Clean up old backups (optional)

Keep only last 7 days:

gzip-clean-up-old-backups-optional-074.sh
find /home/wpbackup -type f -mtime +7 -delete
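
Because -delete is irreversible, it is worth previewing the match list first (same path and age threshold as above):

```sh
# Dry run: list what the cleanup would remove, then delete for real
find /home/wpbackup -type f -mtime +7 -print
find /home/wpbackup -type f -mtime +7 -delete
```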


Restoration workflow

To restore from Rclone cloud:

gzip-restoration-workflow-076.sh
rclone copy gdrive:wpbackup /home/restore --progress
cd /home/restore
cat wpfiles_2025-10-04.tar.gz.part-* > wpfiles.tar.gz
tar -xzf wpfiles.tar.gz -C /var/www/html/
gunzip -c wpdb_2025-10-04.sql.gz | mysql -u root -p wpdb

Explanation:

  1. Merge split parts with cat.
  2. Extract using tar.
  3. Restore DB from .gz dump.

Compression choices at a glance:

| Type | Command | Recommended | Why |
|---|---|---|---|
| gzip | tar -czf | ✅ | Fast, stable, widely supported |
| xz | tar -cJf | ⚠️ | Higher compression, slower |
| zstd | tar -I zstd -cf | ⚙️ | Modern, very fast (if installed) |
| split size | -b 2000M | | Cloud upload friendly (2 GB per file) |

Example cron automation

Schedule nightly backup at 2 AM:

gzip-example-cron-automation-078.sh
0 2 * * * tar -czf - /var/www/html | split -b 2000M - /home/wpbackup/wpfiles_$(date +\%F).tar.gz.part- && mysqldump -u root -pYOURPASS wpdb | gzip > /home/wpbackup/wpdb_$(date +\%F).sql.gz && rclone copy /home/wpbackup gdrive:wpbackup --progress --transfers=4 --checkers=8 --retries=5 && find /home/wpbackup -type f -mtime +7 -delete

  • Fully automated
  • Split archives for large size
  • Upload to cloud
  • Auto-delete older backups


Performance tips

| Optimization | Description |
|---|---|
| Use nice or ionice | Prevents backup from hogging CPU/IO |
| Exclude cache directories | e.g., --exclude='wp-content/cache/*' |
| Use gzip level -6 | Good balance of speed vs. compression |
| Run during low-traffic hours | Reduces server load |
| Monitor with htop / iotop | Check CPU/disk usage |
| Test restore regularly | Never trust backups you haven't tested |
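
The first two tips combined might look like this (paths, chunk size, and the exclude pattern are placeholders; ionice -c3 only has an effect with I/O schedulers that honor idle priority):

```sh
# Low-priority, cache-skipping variant of the split backup pipeline
nice -n 19 ionice -c3 tar -czf - --exclude='wp-content/cache/*' /var/www/html \
  | split -b 2000M - /home/wpbackup/wpfiles_$(date +%F).tar.gz.part-
```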

Example folder layout

gzip-example-folder-layout-080.txt
/home/wpbackup/
├── wpfiles_2025-10-04.tar.gz.part-aa
├── wpfiles_2025-10-04.tar.gz.part-ab
├── wpdb_2025-10-04.sql.gz
└── upload_log.txt


Final checklist before sending

| Step | Command | Status |
|---|---|---|
| Check total size | du -sh /home/wpbackup | |
| Verify gzip integrity | gzip -t file.gz | |
| Verify tar structure | tar -tzf file.tar.gz \| head | |
| Confirm split count | ls \| grep part- \| wc -l | |
| Test rejoin & extract (sample) | cat part-* \| tar -tzf - \| head | |

Summary table

| Task | Recommended Tool | Command |
|---|---|---|
| Compress + split large folder | tar + gzip + split | tar -czf - folder/ \| split -b 500M - backup.tar.gz.part- |
| Compress + dump database | mysqldump + gzip | mysqldump db \| gzip > db.sql.gz |
| Upload to cloud | rclone copy | rclone copy /home/wpbackup remote:wpbackup |
| Merge + extract | cat + tar | cat part-* > full.tar.gz && tar -xzf full.tar.gz |

Verdict

Best combination for 100 GB+ WordPress sites:

tar + gzip + split + rclone

  • Stable and widely supported.
  • Compatible with all VPS and cloud storage systems.
  • Works for automation (cron).
  • Recoverable even if 1 part fails (only re-upload that part).

Can we combine gzip with the mysqldump syntax?

Yes. You can combine mysqldump with gzip in a single command.

This is a very common best practice for VPS and WordPress database backups, especially when you plan to upload the resulting .gz file using Rclone or move it between servers.

Let’s go step-by-step so you fully understand the syntax, behavior, expected output, and use cases.


Combine mysqldump with gzip


Standard MySQL dump

Normally you might run:

gzip-standard-mysql-dump-082.sh
mysqldump -u root -p wpdatabase > wpdatabase.sql

But this can produce a huge plain SQL file (10–20 GB for large sites).

So instead, we pipe the dump output directly into gzip.


Compressed MySQL dump command

gzip-compressed-mysql-dump-command-084.sh
mysqldump -u root -p wpdatabase | gzip > /home/wpbackup/wpdatabase_$(date +%F).sql.gz

Explanation of each component

| Part | Function |
|---|---|
| mysqldump | Exports the database as plain text (SQL statements). |
| \| (pipe) | Sends the dump output into gzip. |
| gzip | Compresses the incoming stream. |
| > | Redirects the compressed stream to a .gz file. |
| $(date +%F) | Adds a date stamp for versioned backups. |

Expected output

In /home/wpbackup/, you’ll see:

gzip-expected-output-086.txt
wpdatabase_2025-10-04.sql.gz

The file will usually be 60–90% smaller than the raw SQL file.

For a 10 GB database, you might get a 1–3 GB .gz file.
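
Once the dump exists, gzip -l confirms the actual ratio (the filename follows the example above):

```sh
# Inspect compressed vs. uncompressed size and the ratio
gzip -l /home/wpbackup/wpdatabase_2025-10-04.sql.gz
```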


Restore from .gz later

Restore by decompressing first

gzip-restore-by-decompressing-first-088.sh
gunzip wpdatabase_2025-10-04.sql.gz
mysql -u root -p wpdatabase < wpdatabase_2025-10-04.sql

Restore by streaming into mysql

gzip-restore-by-streaming-into-mysql-090.sh
gunzip -c wpdatabase_2025-10-04.sql.gz | mysql -u root -p wpdatabase

Explanation:

  • -c tells gunzip to send decompressed content to stdout, which is then piped directly into MySQL.

Add compression level

You can specify gzip level -1 (fastest) to -9 (best compression):

gzip-add-compression-level-092.sh
mysqldump -u root -p wpdatabase | gzip -9 > /home/wpbackup/wpdatabase_$(date +%F).sql.gz

Tip:

Use -6 for good speed/space balance on large databases.
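
A quick local benchmark makes the tradeoff concrete (the file name is a placeholder; actual sizes depend on the data):

```sh
# Compare fastest vs. best compression on the same dump
gzip -1 -c wpdb.sql > wpdb.fast.sql.gz
gzip -9 -c wpdb.sql > wpdb.best.sql.gz
ls -l wpdb.fast.sql.gz wpdb.best.sql.gz
```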


Combine with split for huge databases

If your SQL dump exceeds several GB, combine gzip with split:

gzip-combine-with-split-for-huge-databases-094.sh
mysqldump -u root -p wpdatabase | gzip | split -b 2000M - /home/wpbackup/wpdb_$(date +%F).sql.gz.part-

Expected Output:

gzip-combine-with-split-for-huge-databases-096.txt
wpdb_2025-10-04.sql.gz.part-aa
wpdb_2025-10-04.sql.gz.part-ab
...

To restore:

gzip-combine-with-split-for-huge-databases-098.sh
cat wpdb_2025-10-04.sql.gz.part-* | gunzip | mysql -u root -p wpdatabase


Automated cron example

Schedule nightly compressed DB backups:

gzip-automated-cron-example-100.sh
0 2 * * * mysqldump -u root -pYOURPASS wpdatabase | gzip -9 > /home/wpbackup/wpdatabase_$(date +\%F).sql.gz && rclone copy /home/wpbackup/wpdatabase_$(date +\%F).sql.gz gdrive:wpbackup --progress

  • Automatically compresses
  • Automatically uploads
  • Automatically adds a date stamp


Summary table

| Task | Command | Notes |
|---|---|---|
| Dump & compress | mysqldump db \| gzip > db.sql.gz | One step, no intermediate file |
| Dump, compress, and split | mysqldump db \| gzip \| split -b 2000M - db.sql.gz.part- | For very large dumps |
| Restore (compressed) | gunzip -c db.sql.gz \| mysql db | Streams directly into MySQL |
| Restore (split parts) | cat part-* \| gunzip \| mysql db | Rejoin, then stream |

Key takeaways

  • gzip can be safely combined with mysqldump using a pipe (|).
  • This saves massive disk space and speeds up Rclone transfers.
  • You can restore directly from .gz or even split archives.
  • Don’t forget to test your restore command at least once a month.

Example: daily MySQL + gzip + rclone backup script

This is a practical template for compressing a WordPress database dump and pushing it off-host.

warning

Do not hardcode database passwords in scripts. Prefer an interactive prompt, or a dedicated MySQL config file (for example --defaults-extra-file) with tight permissions.

Backup script (template)
wp-db-backup-gzip-rclone.sh
#!/usr/bin/env bash
set -euo pipefail

BACKUP_DIR="/backups"
DB_NAME="DB_NAME"
MYSQL_DEFAULTS="/root/.my.cnf"
RCLONE_REMOTE="remote:wp-backups"

mkdir -p "$BACKUP_DIR"

TS="$(date +%F)"
OUT_FILE="$BACKUP_DIR/wp-db-$TS.sql.gz"

mysqldump --defaults-extra-file="$MYSQL_DEFAULTS" --single-transaction "$DB_NAME" \
| gzip > "$OUT_FILE"

gzip -t "$OUT_FILE"

rclone copy "$OUT_FILE" "$RCLONE_REMOTE/" --checksum

Example cron entry

crontab-entry.txt
0 2 * * * /usr/local/bin/wp-db-backup-gzip-rclone.sh