I automated Linux backups with a simple bash script and cron (and it’s better than a GUI)


Summary

  • Create a bash script that controls what is backed up, where it goes, and when it runs.

  • Schedule the backup.sh executable with cron and examine the logs and destination after the runs.

  • Test the script, use absolute paths to avoid cron and runtime failures, and use comments for documentation.

A tool like Déjà Dup is a simple and excellent graphical way to automate Linux backups. However, writing your own backup script gives you far more control over what is backed up, where the data goes, and how often backup tasks run. It's also a fun introduction to basic scripting. Here's how I created a simple backup script and used cron to automate it.

Create a simple backup script and automate it

As long as you edit and update the script with your absolute paths, your save location can be a local folder, a new disk partition, or an external drive.

Choose a save destination

The plan was to create a simple backup script, but I also wanted a copy of the essential Linux directories, just in case I needed to restore the system. My save location for this project was a 128 GB USB drive.


Preparing the save location

Whether your save location is a USB stick, an external hard drive, or a new partition, pay attention to absolute directory paths and mount points; you'll need them once we start writing scripts.

This backup script uses rsync, which comes preinstalled on most distributions. Confirm that rsync is present by checking its version; if it is missing, install it with your package manager.

I started the project by creating a new directory called “Backup” on the mounted USB drive. You can create this folder graphically by navigating to your save location and then using the shortcut Ctrl+SHIFT+N. I opted for the terminal and used the lsblk, cd and mkdir commands.

lsblk
cd "/media/htg/DATA BACKUP"
mkdir backup
rsync --version
sudo apt update -y && sudo apt install rsync

Decide what to back up and create a backup.sh script

For system recovery purposes, I chose to back up a directory containing consolidated personal files plus essential Linux directories like /home, /etc, /var, /usr/local, /root, and /opt. Most Linux terminal text editors support basic scripting. I used nano to create and save a simple backup.sh bash script in the home directory. Copy and paste the following script into your favorite terminal text editor:

#!/bin/bash
# This bash script backs up Linux recovery directories and personal files
# Preserves ownership, permissions, timestamps, ACLs, and xattrs

set -euo pipefail

#------------This is the CONFIG script-----------

HOSTNAME="$(hostname)"
DATE="$(date +%F)"
BACKUP_ROOT="/media/htg/DATA BACKUP/Backup"
DEST_DIR="$BACKUP_ROOT/archives/${HOSTNAME}_${DATE}"
LOG_FILE="$BACKUP_ROOT/logs/backup_${HOSTNAME}_${DATE}.log"
TMP_DIR="$BACKUP_ROOT/tmp"

#Don't forget to edit the BACKUP_ROOT, DEST_DIR, LOG_FILE and TMP_DIR paths

#Customize the SOURCES array to include the files and directories you want to back up

SOURCES=(
  "/home"
  "/etc"
  "/var"
  "/root"
  "/opt"
  "/usr/local"
  "/home/htg/Backup"
)

EXCLUDES=(
  "--exclude=/var/cache/"
  "--exclude=/var/tmp/"
  "--exclude=/var/lib/apt/lists/"
  "--exclude=/home/*/.cache/"
  "--exclude=/home/*/Downloads/"
  "--exclude=/var/lib/docker/"
  "--exclude=/var/lib/containers/"
)

# Add these flags for backup summary and progress:
# --info=progress2: Total progress line
# --info=name0: Hide individual filenames
# --stats: Final summary block
# --no-inc-recursive: Better progress accuracy

RSYNC_FLAGS=(-aAXH --numeric-ids --delete --human-readable --inplace --partial --info=progress2 --info=name0 --stats --no-inc-recursive)

# ---------- PREPARATION ----------
# Create destination folders on the flash drive
mkdir -p "$DEST_DIR" "$TMP_DIR" "$(dirname "$LOG_FILE")"
touch "$LOG_FILE"

# ---------- BACKUP ----------
echo "[$(date)] Starting backup to $DEST_DIR" | tee -a "$LOG_FILE"

for SRC in "${SOURCES[@]}"; do
  echo "[$(date)] Backing up $SRC ..." | tee -a "$LOG_FILE"
  
  # Run rsync and log output
  rsync "${RSYNC_FLAGS[@]}" "${EXCLUDES[@]}" "$SRC" "$DEST_DIR" >>"$LOG_FILE" 2>&1
done

echo "[$(date)] Backup completed" | tee -a "$LOG_FILE"

# ---------- VERIFY ----------
echo "[$(date)] Listing destination sizes:" | tee -a "$LOG_FILE"
du -sh "$DEST_DIR"/* 2>/dev/null | tee -a "$LOG_FILE"

exit 0

Edit the script by replacing BACKUP_ROOT, DEST_DIR, LOG_FILE, and TMP_DIR with your absolute paths, then write and exit; in nano, press Ctrl+O, then Enter, then Ctrl+X. The next step is to use the chmod command to make the script executable and the ls command to confirm the execute permission (look for x in the permission string).

nano ~/backup.sh
chmod +x ~/backup.sh
ls -l ~/backup.sh

Test the backup script and automate it using cron jobs

Although any backup strategy can eventually fail, testing your backup systems and scripts greatly reduces the risk of catastrophic data loss. A test run catches errors that would otherwise break the process; it's a simple but essential step that protects the integrity of any backup workflow. On some systems, you may need sudo to run the script.

~/backup.sh
sudo ~/backup.sh
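If you'd rather not write anything during the first test, rsync's --dry-run flag reports what would be transferred without touching the destination, which is a low-risk way to sanity-check your sources and excludes. Here's a small self-contained sketch (the /tmp paths are throwaway examples):

```shell
#!/bin/bash
# Demonstrate rsync --dry-run: it reports what would change but copies nothing.
mkdir -p /tmp/demo-src /tmp/demo-dest
echo "hello" > /tmp/demo-src/file.txt

rsync -a --dry-run --stats /tmp/demo-src/ /tmp/demo-dest/

# The destination is still empty: a dry run never writes files.
ls -A /tmp/demo-dest
```

For the real script, you could temporarily add --dry-run to RSYNC_FLAGS before the first scheduled run, then remove it once the output looks right.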

If the script hits an error, it will stop and print the error message, and you can troubleshoot from there. If all is well, you will see the messages "Starting backup" and "Backing up /home ...". Navigate to your save destination and you should see the backed-up directories.

If you are backing up a lot of data to an external drive, the process may take a while. Be patient, even if the terminal appears frozen; unless an error message appears, the backup is still running.
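While the backup runs, you can keep an eye on it from a second terminal by tailing the log file. The snippet below demonstrates tail on a throwaway log; for a live backup, you would point tail -f at the script's actual log path:

```shell
#!/bin/bash
# One-shot peek at the last lines of a log file.
printf 'copied /home\ncopied /etc\ncopied /var\n' > /tmp/demo-backup.log
tail -n 2 /tmp/demo-backup.log

# For a running backup, follow the log live instead (Ctrl+C to stop):
#   tail -f "/media/htg/DATA BACKUP/Backup/logs/backup_$(hostname)_$(date +%F).log"
```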

After testing the script and confirming that it works, I used cron to schedule it to run at 8:00 p.m., crontab -l and the ls command to confirm the scheduled job and the script's permissions, and the systemctl command to check the status of the cron service.

sudo crontab -e
0 20 * * * /usr/bin/bash /home/htg/backup.sh >> "/media/htg/DATA BACKUP/Backup/logs/cron_backup.log" 2>&1
sudo crontab -l
ls -l /home/htg/backup.sh
systemctl status cron

At this point I had successfully created and tested my backup script and used cron to automate it. The last step was to confirm the backup process and check the logs after 8:00 p.m. the next day.

The 3 lessons I learned the hard way

Creating this backup script was a fun and educational experience. The project taught me three lessons that have stuck with me to this day.

Always test backup scripts

I didn't test my first backup script. Can you guess what happened? That's right: when I ran a test backup, the script failed. It took me a while to realize that I had messed up a shell variable; I had typed DATE+"$(date +%F)" instead of DATE="$(date +%F)", which caused a "command not found" error on line 10.

Testing backup scripts catches syntax errors and permission problems before they cause runtime failures. It's a simple habit worth keeping.
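One cheap check worth adding to that habit: bash -n parses a script without executing it, so it flags pure syntax errors (an unclosed if, a stray quote) before the script ever touches your data. Here's a sketch using a deliberately broken throwaway script:

```shell
#!/bin/bash
# Create a script with a syntax error (missing "fi") to demonstrate bash -n.
cat > /tmp/broken.sh <<'EOF'
if true; then
  echo "this if is never closed"
EOF

if bash -n /tmp/broken.sh 2>/dev/null; then
  echo "syntax OK"
else
  echo "syntax error found"
fi
```

Note that bash -n cannot catch runtime problems like a wrong path or a missing variable, so it complements a real test run rather than replacing it.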

Use absolute paths

Cron runs jobs with a minimal environment and a different working directory than your interactive shell, so relative paths often fail silently. Get in the habit of using full, absolute directory and file paths in your scripts; it minimizes errors and makes scheduled backups far more predictable.
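If you're unsure of a file's absolute path, readlink -f (from GNU coreutils) resolves a relative path to its canonical absolute form, even if the file doesn't exist yet, which is handy when filling in the script's config variables:

```shell
#!/bin/bash
# Resolve a relative path to the absolute form a cron-safe script needs.
# The filename is a throwaway example.
cd /tmp
readlink -f ./backup-demo.log
```

This prints the full /tmp-based path, ready to paste into a variable like LOG_FILE.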

Comment your scripts

Commenting your script is like leaving notes for your future self (and other people) that explain the intention behind a specific part or section of the script. In bash scripts, the hash symbol (#) starts a comment. You may have noticed several comments in my backup script.


Let's take an example where you need to modify a script six months after its creation. Without comments, the chances of incorrectly editing essential parts and breaking your script are high, because the script is no longer fresh in your mind.
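As a concrete illustration, here is how a commented config section might look; the variable name is made up for the example:

```shell
#!/bin/bash
# ---------- CONFIG ----------
# KEEP_DAYS: how many daily archives to retain before pruning old ones.
# (Illustrative name; adapt to your own script.)
KEEP_DAYS=14

# Comments can also sit at the end of a line:
echo "Retaining $KEEP_DAYS days of backups"  # everything after # is ignored
```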


This is how I automated Linux backups with a simple bash script and the three lessons I learned the hard way. Although scripts aren’t the easiest way to automate backups, they are fun and educational, especially if you want to learn the basics of bash scripting.
