The importance of backing up a database needs no introduction.
The setup below creates daily, weekly and monthly backups of our database and copies them to an Amazon S3 bucket (MyBucketName), into the MyBucketName/Daily, MyBucketName/Weekly and MyBucketName/Monthly folders respectively.
To control the cost of our backup solution, we want each file removed from S3 automatically: after 15 days for daily, 60 for weekly and 180 for monthly backups.
Note that the --expires flag used in the scripts below only sets the object's Expires HTTP header (cache metadata); it does not delete anything. Automatic deletion is handled by S3 lifecycle rules [1]: delete files under MyBucketName/Daily after 15 days, under MyBucketName/Weekly after 60 days, and under MyBucketName/Monthly after 180 days. Lifecycle rules can also transition files to Amazon Glacier instead of deleting them.
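Lifecycle rules can be set up in the S3 console, or from the CLI with aws s3api put-bucket-lifecycle-configuration. A sketch, assuming the bucket and folder names used in this tutorial (the apply step is commented out because it needs valid credentials):

```shell
# Write the lifecycle policy to a file: three rules, one per backup folder.
cat > lifecycle.json <<'EOF'
{
  "Rules": [
    {"ID": "expire-daily",   "Filter": {"Prefix": "Daily/"},   "Status": "Enabled", "Expiration": {"Days": 15}},
    {"ID": "expire-weekly",  "Filter": {"Prefix": "Weekly/"},  "Status": "Enabled", "Expiration": {"Days": 60}},
    {"ID": "expire-monthly", "Filter": {"Prefix": "Monthly/"}, "Status": "Enabled", "Expiration": {"Days": 180}}
  ]
}
EOF
# Apply it to the bucket (requires configured credentials):
# aws s3api put-bucket-lifecycle-configuration --bucket MyBucketName --lifecycle-configuration file://lifecycle.json
```

Once applied, S3 removes expired objects on its own, with no cron job needed for cleanup.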
In this tutorial we will use the AWS Command Line Interface [2] and its high-level S3 commands [3]; another tool such as s3cmd can also be used [4].
1. Install and configure AWS Command Line Interface
$ sudo yum install awscli
or, for Ubuntu/Debian:
$ sudo apt-get install awscli
$ aws configure
You will need to enter your AWS Access Key ID and AWS Secret Access Key; the other parameters can be left blank, though I recommend setting a default region.
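For reference, aws configure simply writes two small files under ~/.aws, which can also be created by hand (the values below are placeholders; us-east-1 is just an example region):

```ini
# ~/.aws/credentials
[default]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY

# ~/.aws/config
[default]
region = us-east-1
```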
2. Write scripts for backups
We will create three scripts; each dumps the database, names the compressed dump after today's date, and copies it to Amazon S3.
2.1. Script for daily backup – DailyBackupDB.sh
#!/bin/bash
NOW="$(date +'%m-%d-%Y')"
EXPIRES="$(date +'%m-%d-%Y' --date='15 days')"
mysqldump -uUSER -pPASSWORD database | gzip > /path/to/databasebackups/$NOW.sql.gz
aws s3 cp /path/to/databasebackups/$NOW.sql.gz s3://MyBucketName/Daily/$NOW.sql.gz --expires $EXPIRES --region YourRegion
2.2. Script for weekly backup – WeeklyBackupDB.sh
#!/bin/bash
NOW="$(date +'%m-%d-%Y')"
EXPIRES="$(date +'%m-%d-%Y' --date='60 days')"
mysqldump -uUSER -pPASSWORD database | gzip > /path/to/databasebackups/$NOW.sql.gz
aws s3 cp /path/to/databasebackups/$NOW.sql.gz s3://MyBucketName/Weekly/$NOW.sql.gz --expires $EXPIRES --region YourRegion
2.3. Script for monthly backup – MonthlyBackupDB.sh
#!/bin/bash
NOW="$(date +'%m-%Y')"
EXPIRES="$(date +'%m-%d-%Y' --date='180 days')"
mysqldump -uUSER -pPASSWORD database | gzip > /path/to/databasebackups/$NOW.sql.gz
aws s3 cp /path/to/databasebackups/$NOW.sql.gz s3://MyBucketName/Monthly/$NOW.sql.gz --expires $EXPIRES --region YourRegion
The three scripts above can be consolidated into a single script.
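One way to do that is to map a period name to its folder and expiry window. A sketch of such a consolidated script, shown in dry-run form (it echoes the two commands it would run, so you can check the mapping; swap the echos for the real mysqldump and aws lines to go live — the hypothetical function name backup_db, and USER/PASSWORD/paths, are placeholders):

```shell
#!/bin/bash
# Hypothetical single replacement for the three scripts above.
# Usage: backup_db daily|weekly|monthly
backup_db() {
  local period="${1:-daily}" folder days
  case "$period" in
    daily)   folder=Daily;   days=15  ;;
    weekly)  folder=Weekly;  days=60  ;;
    monthly) folder=Monthly; days=180 ;;
    *) echo "Usage: backup_db daily|weekly|monthly" >&2; return 1 ;;
  esac
  local now expires
  now="$(date +'%m-%d-%Y')"
  expires="$(date +'%m-%d-%Y' --date="$days days")"   # GNU date
  # Dry run: print the commands instead of executing them.
  echo "mysqldump -uUSER -pPASSWORD database | gzip > /path/to/databasebackups/$now.sql.gz"
  echo "aws s3 cp /path/to/databasebackups/$now.sql.gz s3://MyBucketName/$folder/$now.sql.gz --expires $expires --region YourRegion"
}
backup_db daily
```

The three cron entries in step 4 would then all call this one script, each with a different argument.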
3. Make the scripts executable
$ sudo chmod +x DailyBackupDB.sh
$ sudo chmod +x WeeklyBackupDB.sh
$ sudo chmod +x MonthlyBackupDB.sh
4. Run CRON tasks to automate the backup process
Cron tasks for daily, weekly and monthly backups respectively:
$ sudo nano /etc/crontab
(The entries below include a user field, so they belong in /etc/crontab rather than a per-user crontab.)
Add the following lines:
0 0 * * * root /path/to/script/DailyBackupDB.sh # Run every day at midnight
0 0 * * 0 root /path/to/script/WeeklyBackupDB.sh # Run every Sunday at midnight
0 0 1 * * root /path/to/script/MonthlyBackupDB.sh # Run every 1st of the month at midnight
And that is all: you now have a cost-effective, reliable backup solution.
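A backup is only as good as its restore. One way to restore is to stream the object straight from S3 through gunzip into mysql, with no temporary file (aws s3 cp accepts "-" as the destination to write to stdout). A hypothetical helper, with the same USER/PASSWORD/bucket placeholders as above:

```shell
#!/bin/bash
# Hypothetical restore helper: restore_db <Daily|Weekly|Monthly> <date-used-in-filename>
restore_db() {
  local folder="$1" day="$2"
  aws s3 cp "s3://MyBucketName/$folder/$day.sql.gz" - --region YourRegion \
    | gunzip \
    | mysql -uUSER -pPASSWORD database
}
# Example: restore_db Daily 01-31-2015
```

It is worth running a test restore into a scratch database periodically to confirm the dumps are usable.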
Links:
[1] – Objects lifecycle management
[2] – AWS Command Line Interface
[3] – AWS high-level S3 commands
[4] – s3cmd tool