Amazon S3 Backup script with encryption



With the advent of cloud computing, commercial cloud offerings have advanced considerably, most notably Amazon's EC2 computing platform and its S3 storage platform.

Backing up to Amazon S3 has become a popular way for many organizations to achieve true offsite backups.

Fast data transfer speeds and a low per-gigabyte storage cost make it an attractive option.

There are several free software solutions that can connect to S3 and transfer files. The one that shows the most promise is s3sync.

There are already a few guides that show you how to implement s3sync on your system.

The good thing is that s3sync can be used on Windows, Linux, and FreeBSD, among other operating systems.
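Before wiring s3sync into a backup script, it helps to verify your credentials and connectivity by hand. Here is a minimal sketch, assuming s3sync is installed under /usr/local/bin/s3sync and that the bundled s3cmd.rb helper came along with it (adjust the paths to match your install):

export AWS_ACCESS_KEY_ID="YOUR-ACCESS-KEY"
export AWS_SECRET_ACCESS_KEY="YOUR-SECRET-ACCESS-KEY"

# List your buckets to confirm the keys and the connection work
/usr/local/bin/ruby /usr/local/bin/s3sync/s3cmd.rb listbuckets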

We have written a simple script that utilizes the s3sync program in a scheduled offsite backup scenario. Find our script below, and modify it as you wish. Hopefully it will help you get your data safely offsite ;)

#!/bin/sh
# OffSite Backup script

currentdate=`date "+%Y-%m-%d %H:%M:%S"`

export AWS_ACCESS_KEY_ID="YOUR-ACCESS-KEY"
export AWS_SECRET_ACCESS_KEY="YOUR-SECRET-ACCESS-KEY"

echo "Offsite Backup Log: " $currentmonth > /var/log/offsite-backup.log
echo -e "----------------------------------------" >> /var/log/offsite-backup.log
echo -e "" >> /var/log/offsite-backup.log

# Remove local archive files older than 3 days
/usr/bin/find /home/offsite-backup-files -type f -mtime +3 -delete

# Compress and archive a few select key folders for archival and transfer to S3
tar -czvf /home/offsite-backup-files/offsite-backup-`date "+%Y-%m-%d"`.tar.gz /folder1 /folder2 /folder3 >> /var/log/offsite-backup.log 2>&1

# Transfer the files to Amazon S3 Storage via HTTPS
/usr/local/bin/ruby /usr/local/bin/s3sync/s3sync.rb --ssl -v --delete -r /home/offsite-backup-files your-node:your-sub-node/your-sub-sub-node >> /var/log/offsite-backup.log 2>&1

# Some simple error checking and email alert logging
if [ "$?" -eq 1 ]
then
        echo -e "***OFFSITE BACKUP JOB, THERE WERE ERRORS***" >> /var/log/offsite-backup.log 2>&1
        cat /var/log/offsite-backup.log | mail -s "Offsite Backup Job failed" you@yourdomain.com
        exit 1
else
        echo -e "Script Completed Successfully!" >> /var/log/offsite-backup.log 2>&1
        cat /var/log/offsite-backup.log | mail -s "Offsite Backup Job Completed" your@yourdomain.com
        exit 0
fi
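
To make this a true scheduled offsite backup, run the script from cron. A minimal sketch, assuming you saved the script to the hypothetical path /usr/local/bin/offsite-backup.sh and want it to run nightly at 2:00 AM (add it with crontab -e):

# minute hour day-of-month month day-of-week command
0 2 * * * /usr/local/bin/offsite-backup.sh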

Now, if your data happens to be sensitive (most of it usually is), encrypting it in transit (with the --ssl flag) alone is often not enough.

As an alternative, you can encrypt the archive itself before it is sent to S3. This would be incorporated into the tar command in the above script; that line would look something like this:

/usr/bin/tar -czvf - /folder1 /folder2 /folder3 | /usr/local/bin/gpg --encrypt -r you@yourdomain.com > /home/offsite-backup-files/offsite-backup-`date "+%Y-%m-%d"`.tar.gz.gpg
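
Restoring is the same pipeline in reverse: decrypt with the matching private key and unpack. A sketch, assuming an archive named as above (substitute the actual date) and the private key for you@yourdomain.com in your keyring:

# Decrypt the archive and extract it in one step
/usr/local/bin/gpg --decrypt /home/offsite-backup-files/offsite-backup-2010-01-01.tar.gz.gpg | tar -xzv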

As an alternative to gpg, you could use openssl to encrypt the data.
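
For example, you could pipe the tar output through openssl's symmetric encryption instead. A sketch, assuming a passphrase stored in the hypothetical file /root/.backup-passphrase (keep it readable only by root):

# Encrypt with AES-256-CBC using a passphrase file
/usr/bin/tar -czvf - /folder1 /folder2 /folder3 | openssl enc -aes-256-cbc -salt -pass file:/root/.backup-passphrase > /home/offsite-backup-files/offsite-backup-`date "+%Y-%m-%d"`.tar.gz.enc

# Decrypt and extract (substitute the actual date)
openssl enc -d -aes-256-cbc -pass file:/root/.backup-passphrase -in /home/offsite-backup-files/offsite-backup-2010-01-01.tar.gz.enc | tar -xzv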

Hopefully this has been helpful!

