
There are lots of WordPress backup plugins out there, and in some cases (for many sites on cheap shared hosts without any real server access) a plugin is your only real choice for backing up your WordPress website.
However, if you are on a quality host, or your own VPS, that gives you SSH and WP-CLI access, then you can back up your WordPress site without using a plugin.
My host does backups, why should I bother?
A good question. If you think your host is always going to be there, will never go bust and will never get hacked, then great. Host-based backup systems can also, in some cases, be very slow to recover from.
As I manage many client WordPress websites I simply can’t take such risks.
Why not use a plugin for WordPress backups?
One major reason is that plugins run as PHP and as such are limited in terms of processing time. The plugins have to cater for this by restarting themselves, and if they don't handle it well the result is a corrupted, unusable backup. Unless you test your backups regularly, you may be shocked to find, right when you need it, that the backup is no good.
A plugin, by the nature of running PHP, will take longer and consume more resources than direct commands.
Cost is another reason: many backup plugins have premium features, and often that premium feature is automatic transfer of the backup elsewhere, e.g. to some cloud storage. See the next section.
Why should you not save your backup on your host?
This is fairly simple: imagine your website gets hacked, the hacker gets access to your file system and decides just to delete everything. All gone, nothing to recover.
Let's get ready
As mentioned, there are some assumptions:
- Your server is Linux based
- You have access to the command line and a basic understanding of how to use it
- You have WP CLI installed
- You have the ability to add cron jobs
- You have cloud-based or other off-server storage available
For this example I will use Google Cloud Storage and Amazon S3, but you can use another off-server storage solution as long as you have a method to access it, e.g. scp.
Set up Google Cloud Storage ( optional if you use something else )
Account
First you need a Google Cloud account, which is free to set up; at the time of writing they actually give you 5GB free forever, or $300 credit if you need more. In any case it is very cheap. https://cloud.google.com/storage/
Buckets
Then follow their tutorials to understand Cloud Storage. You will need to set up ‘buckets’ for your backups; in this example I have a ‘daily’ bucket and a ‘weekly’ bucket.
I would recommend using ‘nearline storage’ as it is cheaper.
For your ‘buckets’ I would recommend setting up a ‘lifecycle’ rule. In my daily bucket I auto-delete after 7 days and in my weekly bucket after 31 days, keeping approximately 4 weeks of backups.
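If you prefer the command line to the web console, the same lifecycle rule can also be set with gsutil once it is installed (see the next step). As a sketch, using an example bucket name, create a file lifecycle-daily.json containing:

{
  "rule": [
    { "action": {"type": "Delete"}, "condition": {"age": 7} }
  ]
}

and apply it to the bucket with

gsutil lifecycle set lifecycle-daily.json gs://backups-daily-bucket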

GSutil
Google require you to have specific software (gsutil) to upload to their cloud storage. The good news is, if you are on shared hosting, you can install this without root access by following these instructions.
Find your way over to https://cloud.google.com/storage/docs/gsutil_install, navigate down to ‘Alternative installation methods’ and look at ‘Installing from tar or zip archive’.
wget https://storage.googleapis.com/pub/gsutil.tar.gz
tar xfz gsutil.tar.gz -C $HOME
# Add the following line to your ~/.bashrc
export PATH=${PATH}:$HOME/gsutil
# reload
source ~/.bashrc
# set up gsutil config - follow the instructions to link your project
gsutil config
The documentation is comprehensive and, whilst initially daunting, if you follow it step by step you should end up with a working gsutil command.
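A quick way to confirm gsutil is working is to list one of the buckets you created earlier (the bucket name here is just an example, use your own):

gsutil ls gs://backups-daily-bucket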
Set up Amazon S3 ( optional if you use something else )
Account
Head over to Amazon Web Services and create your account if you don’t already have one.
Buckets
Create non-public buckets for daily and weekly backups and add lifecycle rules to delete after 7 and 31 days respectively. Amazon bucket setup is straightforward; you are guided by a wizard. Adding the lifecycle rule isn't hard either, just edit the bucket and go to ‘Management’.
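If you would rather script this too, once the aws cli from the next step is installed, a rough equivalent for the daily bucket (the bucket name is just an example) is to create a file lifecycle-daily.json containing:

{
  "Rules": [
    { "ID": "expire-daily", "Status": "Enabled", "Filter": {"Prefix": ""}, "Expiration": {"Days": 7} }
  ]
}

and apply it with

aws s3api put-bucket-lifecycle-configuration --bucket backups-daily-bucket --lifecycle-configuration file://lifecycle-daily.json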
AWS
Amazon require you to use their software (the aws command line tool), rather than scp, to copy to buckets. Installing aws is straightforward enough. Amazon is bringing out aws2, so there may be better ways of doing this. This method means you can install it even if you don't have root access (e.g. on shared hosting).
wget https://s3.amazonaws.com/aws-cli/awscli-bundle.zip
unzip awscli-bundle.zip
./awscli-bundle/install -b ~/bin/aws
Once installed, assuming ~/bin is on your path (if not, add it), you should be able to run
aws --version
Then you need to authenticate the aws cli with aws configure, e.g.
aws configure
>AWS Access Key ID [None]: AKIAIOSFODNN7EXAMPLE
>AWS Secret Access Key [None]: wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
>Default region name [None]: ENTER
>Default output format [None]: ENTER
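Once configured, a quick way to confirm it works is to list one of your buckets (again, the bucket name is just an example):

aws s3 ls s3://backups-daily-bucket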
The basic script ( Google Storage Version )
The hard part is over. So now you just need a script.
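Something like the following will do (a minimal sketch written to match the line-by-line notes below; the WordPress path and temp file names are examples you will need to adjust for your own setup):

#!/bin/bash
# lines 2-7: check a bucket name was supplied
if [ -z "$1" ]; then
  echo "Usage: $0 bucket-name"
  exit 1
fi
BUCKET=$1
NOW=$(date +%Y-%m-%d)                # date to use on the file names
WPROOT=/home/myaccount/public_html   # your WordPress root folder - modify as required

# export the database, copy it to the bucket, then do the same with the files
wp db export /tmp/db-$NOW.sql --path=$WPROOT
gsutil cp /tmp/db-$NOW.sql gs://$BUCKET/
rm /tmp/db-$NOW.sql
tar -czf /tmp/files-$NOW.tar.gz -C $WPROOT .
gsutil cp /tmp/files-$NOW.tar.gz gs://$BUCKET/
rm /tmp/files-$NOW.tar.gz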
Lines 2-7 just validate that you provided a bucket name to the script.
Line 8 is just getting a date to use on the file names
Line 9 is your WordPress root folder – modify this as required
Line 12 uses WP CLI to export the database into a temp location
Line 13 copies this to your remote bucket – you may want to change this if you are using another location e.g. via scp or even gdrive
Line 14 removes the temp file after transfer
Line 15 uses tar to create a compressed archive into a temp area (if you prefer zip files and zip is installed you could replace this with zip -r …)
Line 16 copies to storage
Line 17 tidies up by deleting the temp file
So create a file, e.g. backup.sh, with the above content, edit as required (if at all), set it to executable and put it somewhere in your user's path (on my system that would be /home/myaccount/bin – echo $PATH to see yours)
chmod +x ~/bin/backup.sh
Then test by running
backup.sh backups-daily-bucket
backup.sh backups-weekly-bucket
The basic script ( AWS S3 Version )
This is a variation using aws commands
I won’t break down every line, look at the above Google Storage version for details.
The differences are simply lines 13 & 16, which now use aws s3 cp to transfer the files.
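Again as a sketch, with the same assumed paths and file names as the Google Storage version above:

#!/bin/bash
# lines 2-7: check a bucket name was supplied
if [ -z "$1" ]; then
  echo "Usage: $0 bucket-name"
  exit 1
fi
BUCKET=$1
NOW=$(date +%Y-%m-%d)                # date to use on the file names
WPROOT=/home/myaccount/public_html   # your WordPress root folder - modify as required

# export the database, copy it to the bucket, then do the same with the files
wp db export /tmp/db-$NOW.sql --path=$WPROOT
aws s3 cp /tmp/db-$NOW.sql s3://$BUCKET/
rm /tmp/db-$NOW.sql
tar -czf /tmp/files-$NOW.tar.gz -C $WPROOT .
aws s3 cp /tmp/files-$NOW.tar.gz s3://$BUCKET/
rm /tmp/files-$NOW.tar.gz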
Now to automate it
To automate it we put it into a cron job. There are different ways of setting up cron depending on your control panel or command line, but you should end up with something like this
0 0 * * 0 source ~/.bashrc;backup.sh backups-weekly-bucket > logs/weekly.log 2>&1
30 2 * * * source ~/.bashrc;backup.sh backups-daily-bucket > logs/daily.log 2>&1
The first line runs once a week, at midnight on Sunday: it sources the .bashrc (to get the paths, as cron does not do that) and runs the backup script, outputting to the log file.
The second is similar, but runs the script into the daily bucket every day at 2:30 am.
Your comments or improvements
Feel free to ask questions, suggest improvements or point out issues using the comments below.