Date created: Sunday, June 19, 2011 1:35:27 PM. Last modified: Sunday, January 27, 2019 11:45:33 AM

Web Backup Scripts

Basic MySQL Dump and Webroot Zip

Back up basic LAMP stack sites. The script dumps a MySQL DB, zips the web root, and removes backups older than 30 days;

#!/bin/bash
# Use one timestamp for both backup files
thedate=`date '+%Y-%m-%d--%H-%M-%S'`
zip -r /backup/dir/webroot_$thedate.zip /path/to/public_html
/path/to/mysql/bin/mysqldump -S /path/to/mysql/socket -u db_read_only_user -pP455W0RD db_name > /backup/dir/database_$thedate.sql
# Delete backups older than 30 days
find /backup/dir/ -type f -mtime +30 | xargs rm -f
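
Something like the following crontab entry would run this unattended (the schedule, script path, and log file are just placeholders):

# Hypothetical crontab line: run the backup at 02:30 every night
# (adjust the path to wherever the script above is saved)
30 2 * * * /usr/local/bin/basic_web_backup.sh >> /var/log/basic_web_backup.log 2>&1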

 

MySQL Dump, Webroot Tar, and SCP (with ionice/taskset)

Dump and scp a website to another host, using ionice (and, in the wrapper script further down, taskset) so as not to affect the site's performance;

backup.sh;

#!/bin/bash

# As the DB and the web root have grown, backing them up has become
# quite I/O intensive on the hard drive. It was causing HostTracker
# alerts to come in at night whilst backups were running, so ionice is
# used here to drop the backup to the idle I/O class, so that web
# requests are still being served
ionice -c3 -p$$

thedate=`date '+%Y-%m-%d--%H-%M-%S'`
backupdir="/root/backups"

# Delete MySQL dumps older than 30 days 
find $backupdir/*.gz -type f -mtime +30 | xargs rm -f

# Back up the MySQL databases
mysqldump -S /var/lib/mysql/mysql.sock -u read-only-user -pP455W0RD db1 | gzip -1 > $backupdir/db1-$thedate.gz
mysqldump -S /var/lib/mysql/mysql.sock -u read-only-user -pP455W0RD db2 | gzip -1 > $backupdir/db2-$thedate.gz

# Backup MySQL config
cp /etc/my.cnf $backupdir/my-$thedate.cnf

# Delete MySQL config files older than 90 days 
find $backupdir/*.cnf -type f -mtime +90 | xargs rm -f


if [ `date +"%u"` -eq 7 ]; then
 # Only back up the httpdocs dir once a week (Sunday, day 7); it's too big otherwise!
 tar -zcvf $backupdir/httpdocs-$thedate.tar.gz /vhosts/www.a-site.com/httpdocs
fi

# Note here, the "username" account has passwordless SSH keys set up to an account on the backup server
su --session-command="$backupdir/scp_to_backup_server.sh $thedate $backupdir" username
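
The passwordless key mentioned above could be set up roughly like this, run once as the "username" account; the key type and file name are assumptions, only the username and hostname match the scripts here:

# Generate a key pair with no passphrase (key type and path are assumptions)
ssh-keygen -t ed25519 -N "" -f ~/.ssh/id_ed25519
# Install the public key on the account used on the backup server
ssh-copy-id username@backup-server.net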

scp_to_backup_server.sh;

#!/bin/bash

thedate=$1
backupdir=$2

# Delete files older than 30 days on the backup server; it has less space
ssh username@backup-server.net "find /store -type f -mtime +30 | xargs rm -f"

scp $backupdir/db1-$thedate.gz username@backup-server.net:/store
scp $backupdir/db2-$thedate.gz username@backup-server.net:/store
scp $backupdir/httpdocs-$thedate.tar.gz username@backup-server.net:/store
scp $backupdir/my-$thedate.cnf username@backup-server.net:/store
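
scp exits non-zero when a transfer fails, so a variant like the one below would at least surface a failed copy in the cron output; this is just a sketch:

# Hypothetical variant: report a failed copy on stderr so it shows up in cron mail
scp $backupdir/db1-$thedate.gz username@backup-server.net:/store || echo "scp of db1 dump failed" >&2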

Start backup.sh from a wrapper script, in /etc/cron.daily or similar.

wrapper.sh;

#!/bin/bash

# This script is simply a wrapper so we can call other scripts or programs and change
# environment options before we call the backup script.

# Call the backup script. As it's single threaded it will, by default, land on core
# 0 or core 8 (the first physical core of each physical processor), so let's override
# that and run the backup script on the last virtual thread of the last core, otherwise
# it can interrupt the threads serving web requests
taskset -c 15 /path/to/backup.sh
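
As a side note, the CPU affinity and I/O priority could be combined into a single invocation instead of calling ionice inside backup.sh; a sketch, assuming the same core and the idle I/O class used above:

# Pin to CPU 15 and run with the idle I/O scheduling class in one go
taskset -c 15 ionice -c3 /path/to/backup.sh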
