Server-Level Cron Backups

Backing up files and storing them in a remote location should be done on a regular basis. Backups are vital: they ensure your code and assets survive if your server is hacked or destroyed, and that your data can be restored if it is lost or corrupted.

In WordPress, you could use a plugin like BackupBuddy to manage your backups. However, not everyone uses WordPress or wants a plugin handling the job.

You do not need to be a server guru to manage your backups. All you need is a shell script on your Linux server, a cron job to run it, and an Amazon S3 bucket.

If you don’t feel comfortable managing your own backups, use a plugin, a system administrator, or a backup service. Backups are too crucial to skip.

In this tutorial

I’m using a basic Ubuntu server. I run everything as root at my own risk; you should use your own sudo user.

  • Install the AWS CLI so we can upload your files to S3
  • Create a shell script to automate backups
  • Create a server cron job to backup your site every day

Installing AWS CLI

To install the AWS CLI, make sure you have Python and pip installed (Python 2 version 2.6.5+ or Python 3 version 3.3+).

Here are the simple steps:

  1. Install the AWS CLI via pip: sudo pip install awscli
  2. Add export PATH=~/.local/bin:$PATH to your server's profile. This allows you to run aws from the command line.
  3. Configure AWS with your credentials: aws configure
AWS Access Key ID [None]: AKI3IOSF3DNN7EXAMPLE
AWS Secret Access Key [None]: wJalrXUtn4MI/K7MDGENG/bPxRfiCY4EXAMPLEKEY
Default region name [None]: us-west-2
Default output format [None]: (press Enter to keep the default)
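
Depending on your AWS CLI version, aws configure stores these values in ~/.aws/config and/or ~/.aws/credentials; the contents look roughly like this (the keys above are AWS's example placeholders):

[default]
region = us-west-2
aws_access_key_id = AKI3IOSF3DNN7EXAMPLE
aws_secret_access_key = wJalrXUtn4MI/K7MDGENG/bPxRfiCY4EXAMPLEKEY

The path to this config file matters later, when we point cron at it.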

To test that you can access your buckets, run aws s3api list-buckets. The command lists your buckets in JSON format if you have permission.
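
The output will look roughly like this (names, dates, and IDs will be your own):

{
    "Buckets": [
        {
            "Name": "your-bucket",
            "CreationDate": "2016-01-15T10:00:00.000Z"
        }
    ],
    "Owner": {
        "DisplayName": "you",
        "ID": "..."
    }
}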

AWS CLI Path

Now, get the AWS CLI's location.

Run,

which aws

Returns,

/usr/local/bin/aws

Use this path in your script to access aws.

Backing Up Files

To back up your website or app files on the server, use tar. Everything will be on S3, so keep only the last four days' archives on the server with some basic cleanup. Take a look at the shell script located at /root/bck_files_script.sh:

#!/bin/bash

BCKTIME=$(date +"%y-%m-%d-%H%M%S")
BCKFILE="full_backup_$BCKTIME.tar.gz"
echo "backup to file: $BCKFILE"

echo "backing up files..."
cd /var/www/
tar czf "$BCKFILE" html --exclude='.git'

echo "moving backup..."
find /var/www/full_backup_* -type f -exec mv {} /root/bck/ \;

echo "moving to a safe place..."
cd /root/bck/

echo "cleanup old backups..."
find /root/bck/full_backup_* -type f -mtime +3 -exec rm {} \;

echo "uploading new file to S3..."
/usr/local/bin/aws s3 cp "/root/bck/$BCKFILE" s3://your-bucket/

echo "finished!"

Remember to make the script executable: chmod +x /root/bck_files_script.sh.
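
The script also assumes the /root/bck directory exists, so create it once and give the script a manual test run before trusting it to cron:

mkdir -p /root/bck
/root/bck_files_script.sh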

Crontab

Crontab, or cron jobs, are straightforward once you know what the five asterisks (*) stand for.

minute hour day month day-of-week [command]
* * * * *
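
A few illustrative schedules:

# every day at 23:45
45 23 * * * /path/to/script.sh

# every Monday at 09:00
0 9 * * 1 /path/to/script.sh

# every 15 minutes
*/15 * * * * /path/to/script.sh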

For the files backup, we want the cron job to run at 11:30 every day. You can access your user’s crontab with crontab -e.

30 11 * * * /root/bck_files_script.sh >/dev/null

Allow cron access to AWS CLI

Now you need to give cron access to your AWS credentials. Locate your AWS config file and edit your crontab.

crontab -e

Add the env to the top of your crontab.

AWS_CONFIG_FILE="/root/.aws/config"
30 11 * * * /root/bck_files_script.sh >/dev/null

Backing Up PostgreSQL

Backing up the database is also important. Since I'm using PostgreSQL, I'll use pg_dump. These backups will stay on the server for two days before cleanup removes them. Take a look at the shell script located at /root/bck_db_script.sh:

#!/bin/bash

BCKTIME=$(date +"%y-%m-%d-%H%M%S")
BCKFILE="db_backup_$BCKTIME.sql"
GZFILE="db_backup_$BCKTIME.tar.gz"

echo "backup db to file: $BCKFILE"
pg_dump -U username database -f "/root/db_bck/$BCKFILE"

echo "compressing..."
cd /root/db_bck/
tar czf "$GZFILE" "$BCKFILE"
rm "/root/db_bck/$BCKFILE"

echo "cleanup old backups..."
find /root/db_bck/db_backup_* -type f -mtime +1 -exec rm {} \;

echo "uploading new file to S3..."
/usr/local/bin/aws s3 cp "/root/db_bck/$GZFILE" s3://your-bucket/

echo "finished!"

Again, make the script executable: chmod +x /root/bck_db_script.sh.
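
As with the files script, the backup directory must exist. Also, pg_dump will prompt for a password unless authentication is already handled; a ~/.pgpass file is one common approach (the host, port, and credentials below are placeholders):

mkdir -p /root/db_bck

# /root/.pgpass -- must be chmod 600
# format: hostname:port:database:username:password
localhost:5432:database:username:yourpassword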

Edit the crontab one last time to run the database backup at the top of every hour. Again, crontab -e.

AWS_CONFIG_FILE="/root/.aws/config"
30 11 * * * /root/bck_files_script.sh >/dev/null
0 * * * * /root/bck_db_script.sh >/dev/null

To verify the final result, list your crontab.

crontab -l

Backing Up MySQL

If you want to back up MySQL as well, replace:

pg_dump -U [username] [database] -f "/root/db_bck/$BCKFILE"

With,

mysqldump -u [username] -p[root_password] [database] > "/root/db_bck/$BCKFILE"

Note that there is no space after -p in the MySQL command.
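
Passing the password on the command line exposes it to other users via the process list. A safer option, assuming your MySQL installation supports option files, is to store the credentials in a root-owned option file and drop the flags entirely:

# /root/.my.cnf -- chmod 600
[client]
user=username
password=yourpassword

mysqldump reads [client] credentials from this file automatically, so the dump line becomes mysqldump [database] > "/root/db_bck/$BCKFILE".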

Backup Security

If your backups include credit card numbers, email addresses, or passwords, you will want to encrypt your backup before sending it to a remote destination. Local backups need no encryption as long as your server itself is secure.
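
Here is a minimal sketch of one way to do this, assuming gpg is installed (the passphrase file path is a placeholder you would create and chmod 600 yourself):

# encrypt the archive before upload; produces $GZFILE.gpg
# newer gpg versions may also need --pinentry-mode loopback
gpg --batch --symmetric --cipher-algo AES256 \
    --passphrase-file /root/.backup_passphrase "$GZFILE"

# then upload the encrypted file instead of the plain archive
/usr/local/bin/aws s3 cp "/root/db_bck/$GZFILE.gpg" s3://your-bucket/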

Cleanup S3 Backups

Chances are you will not want to keep your backups forever. On the local server, we have already set a removal date for the old backups. In S3 we need to do the same.

Within S3, locate your bucket's properties and find the “Lifecycle” option. There you can create a rule that deletes old backups a set number of days after they were created.

Here I created a rule called “Delete Old Backups” that applies to the whole bucket and permanently deletes items three days after creation.
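
If you prefer the command line to the console, the same rule can be applied with the AWS CLI; this is a sketch, with the bucket name as a placeholder:

# lifecycle.json
{
    "Rules": [
        {
            "ID": "Delete Old Backups",
            "Status": "Enabled",
            "Filter": { "Prefix": "" },
            "Expiration": { "Days": 3 }
        }
    ]
}

aws s3api put-bucket-lifecycle-configuration --bucket your-bucket --lifecycle-configuration file://lifecycle.json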
