Generic Directory Backup Script

I regularly have directories of files I want to back up in case of disaster. Usually these are files that change often, and I only want to keep recent versions in case I need to revert or recover changes. (Git? Git who?)

I have used this script over and over as a simple way to archive a directory to a backup destination under a date-stamped filename. It also cleans up after itself by deleting archives older than X days. I stick it in cron and let it run on a schedule (see the crontab example after the script), and I always have an archive of the last X days of the files in my directory.

I use variations of this script on my home lab computers to back up persistent Docker data.

#!/usr/bin/env bash
#=====================================================================
#
#          FILE: wiki-backup.sh
#        AUTHOR: C Hawley
#       CREATED: 2022-11-30
#      REVISION: 2022-11-30
#
#=====================================================================
 
set -o nounset                   # Treat unset variables as an error
 
# Backup Source
bsource=/mnt/data/wiki-data
 
# Backup Destination
bdest=/mnt/backups
 
# Backup Filename (no extension)
bfilename=wiki-data-backup
 
# Get today's date
bdate=$(date +"%Y-%m-%d")
 
# Archive directory to the destination
tar czf "$bdest/$bfilename-$bdate.tgz" "$bsource"
 
# Prune backups older than 7 days
find "$bdest" -maxdepth 1 -type f -iname "$bfilename*.tgz" -mtime +7 -delete
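
To run it on a schedule, a crontab entry along these lines does the job. The script path and log path here are examples; adjust them to wherever you actually keep the script.

# Run the backup every night at 02:30 (add via crontab -e)
30 2 * * * /usr/local/bin/wiki-backup.sh >> /var/log/wiki-backup.log 2>&1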

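Recovering is just extracting an archive. A minimal sketch, using an example date-stamped filename; note that GNU tar strips the leading "/" when archiving, so the extracted tree lands under whatever directory you pass to -C:

# See what a given backup contains
tar tzf /mnt/backups/wiki-data-backup-2022-11-30.tgz

# Extract into a scratch directory for inspection before copying files back
mkdir -p /tmp/restore
tar xzf /mnt/backups/wiki-data-backup-2022-11-30.tgz -C /tmp/restore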