Thursday 27 August 2009

My simple Website Backup Method

Since I've been playing around with this new Drupal site, and I have a lot of my personal photos stored in the gallery, I really wanted some sort of backup strategy. For now I am using lftp to send the selected data to a NAS device connected to my home network.

I am using FTP on the NAS device as it means I don't need to have a computer on to send data to it. I like lftp because it lets you write a queue of commands to perform as a script; you can launch lftp with the -f option and specify a simple text file containing all the commands you wish to run.
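So the invocation ends up as simple as this (the command file itself is shown further down):

lftp -f /home/jonr/bin/jcr-lftp-script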

I back up the contents of /home and /var/www/. Also, before backing these up, I tar up /etc and dump the MySQL databases into a directory under my home directory.

I put all this into a little script which runs once a week using cron.
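For reference, the crontab entry looks something like this (the script path, day and time here are just an illustration; edit your own crontab with 'crontab -e'):

#min hour dom mon dow command
0 3 * * 0 /home/jonr/bin/backup-script

...which would run the script at 3am every Sunday.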

Here is my script:

#!/bin/bash
#Backup script for server

#set variable of date for labelling
date=`date +%F`

cd /home/jonr/backupdata/

#remove oldest etc backup
rm `ls -t *etc-backup* | tail -n 1`

#Backup latest /etc/
tar -czvf /home/jonr/backupdata/${date}-etc-backup.tar /etc/

#remove oldest mysql backup
rm `ls -t *mysql* | tail -n 1`

#Dump mysql databases
mysqldump --all-databases > /home/jonr/backupdata/${date}_mysql_backup

#Backup to home NAS using lftp
lftp -f /home/jonr/bin/jcr-lftp-script

#Calculate the size of the amount of data sent during the backup from lftp's transfer log.
grep $date /home/jonr/.lftp/transfer_log | sed -r s/^.+0-// | sed -r s/\ [0-9].+$// > temp.log
i=0
for n in `cat temp.log`
do
i=`expr $i + $n`
done
echo $i'b'
rm temp.log

exit

Hopefully most of the comments explain what I am doing. Some things to note:

The tar of /etc is being run as user 'jonr', not root, so there are some files/directories that jonr does not have permission to read... I need to sort this out.

Performing the mysqldump requires root permissions. Normally you would pass the -uroot and -p options on the command line; to avoid putting these credentials in plain text in my script, you can create a file called '.my.cnf' in your home directory.

This simply (in my case) contains this info:

[client]
user = root
password = (password)

So when any mysql command is called without the -u and -p options, it uses these credentials instead.
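Since this file holds the root password in plain text, it's worth making sure only your own user can read it:

chmod 600 ~/.my.cnf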

Similarly, for lftp, if I create a file called .netrc in my home directory containing:

machine server.address.com login jonr password (password)

...then when lftp is launched and asked to connect to 'server.address.com', it will not prompt for a user and password because it finds them in .netrc.
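As with .my.cnf, it's sensible to lock the permissions down, since this file also contains a password in plain text:

chmod 600 ~/.netrc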

Here are the contents of the lftp script I call:

open server.address.com
cd PUBLIC/jcrdev/
mirror -Ren /home/jonr/ home-backup
mirror -Ren /var/www/ varwww-backup
exit

Simply put, it connects using the above credentials, changes to the appropriate directory, then uses the mirror command to 'put' data on the FTP server.

The -R option means 'reverse mirror', i.e. 'put' data instead of 'get' data. So put it on the FTP drive, rather than pull it off the FTP drive.

The -e option deletes files on the FTP drive which are not present in the source.

The -n option puts only NEWER data on the FTP drive, so essentially only things that have changed get transferred.

This basically behaves like rsync, only transferring the data it needs to keep the destination up to date; I just can't use rsync with an FTP target.
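For comparison, if the NAS did speak SSH, the rsync equivalent would look something like this (the host name 'nas' is just an illustration):

rsync -a --delete /home/jonr/ nas:/PUBLIC/jcrdev/home-backup/

Here -a preserves permissions and timestamps, --delete mirrors deletions like lftp's -e, and rsync skips unchanged files by default, much like -n.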

Things I need to improve on:

Don't use FTP (not very secure; SFTP would be better).

Configure certain commands, like tar'ing up /etc, to run with root privileges without transmitting a password (see the sketch below).
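On the second point, one option (just a sketch, assuming sudo is available; the sudoers file name is illustrative) is a NOPASSWD rule limited to the tar command:

#added with 'visudo -f /etc/sudoers.d/backup'
#lets jonr run tar as root without being asked for a password
jonr ALL=(root) NOPASSWD: /bin/tar

The backup script could then call 'sudo tar -czvf ...' for the /etc archive.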

If anyone spots problems or just wants to feedback some comments/advice, please do, I would appreciate it.

1 comment:

  1. Looks good, find herein some small changes:

    # if the cd fails, we do not want to continue to delete files!
    cd /home/jonr/backupdata/ || exit 22

    ...

    # added '.gz' to the extension so it is more accurate
    tar -czvf /home/jonr/backupdata/${date}-etc-backup.tar.gz /etc/

    ...

    # removed the directory from the path - since we should already be in that dir
    mysqldump --all-databases > ${date}_mysql_backup

    ... reduce the number of processes and obviate the need for a temporary file...

    i=0
    for n in `sed -rn /$date/'{s/^.+0-//;s/ [0-9].+$//;p;}' ~jonr/.lftp/transfer_log`
    do
    let i=i+n
    done
    echo $i'b'

    ... but we can let awk (which is a programmatic step up from sed) do *all* of that for us...

    awk /$date/'{sub(/^.+0-/,"");sub(/ [0-9].+$/,"");i+=$0}END{print i "b"}' ~jonr/.lftp/transfer_log

    ...

    # remove the exit line - end of file indicates exit :)
    exit
