Backing up cPanel without hitting logout
Posted by Admin • Thursday, April 21, 2011 • Category: Linux
cPanel-based hosting presents some challenges for automatic backups - there is no built-in way of scheduling local backups, nor any standard way of triggering their creation remotely. Numerous scripts exist, yet none were quite the solution I was looking for - I am primarily interested in the databases and mail forwarders, though files wouldn't hurt either. Moreover, you can do this in a single line!
My goal, therefore: create and retrieve backups nightly. I don't want cPanel to push files to my off-site box; I'd rather initiate everything remotely and not have to open up access to some other system. cPanel does permit one to do this through a browser, hence it can be scripted. Really, scripting isn't even necessary - wget is all that is required! That said, I had a very hard time convincing wget not to visit the logout link on each page - once you log out, you're not getting anywhere anymore. So here is how I did it.
Solution
My hosting company uses the x3 theme for cPanel, whose logout URL looks like this: /logout/. Because of this I can't rely on wget's file-exclusion switch -R'*logout*' alone - wget doesn't treat the link as a file, because it's a directory. Hence the -X (exclude directories) switch as well.
I run this on my own server, which is the off-site backup location for the hosted space.
So, to back up all databases and all the forwarder/etc. configs, but not files (substitute everything starting with THE_ with your own values):
MYDOMAIN=THE_DOMAIN; wget -nd -l1 -e robots=off -A'*.gz' -Xlogout -R"backup-$MYDOMAIN*,*logout*,*index.html*" -r --no-check-certificate --auth-no-challenge --user THE_USERNAME --password THE_PASSWORD "https://$MYDOMAIN:2083/frontend/x3/backup/index.html"
If you do want the master tarball of the entire home directory too, remove the backup-$MYDOMAIN* pattern:
MYDOMAIN=THE_DOMAIN; wget -nd -l1 -e robots=off -A'*.gz' -Xlogout -R'*logout*,*index.html*' -r --no-check-certificate --auth-no-challenge --user THE_USERNAME --password THE_PASSWORD "https://$MYDOMAIN:2083/frontend/x3/backup/index.html"
I am keeping the logout pattern in the -R file exclusions as well, in case the URL changes at some point and is no longer a directory.
What this does
- Visits the main Backup page in cPanel, sending Basic auth credentials preemptively (--auth-no-challenge)
- Scrapes the HTML of that page for links matching *.gz, following them no further than one level away
- Downloads all matches into the current directory
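For reference, here is what each switch contributes (a summary paraphrased from the wget man page, not new behavior):
# -r -l1                  recurse from the start page, but only one level deep
# -nd                     no directories: save every file flat into the current directory
# -e robots=off           ignore robots.txt rules during the crawl
# -A'*.gz'                accept only files whose names match *.gz
# -Xlogout                exclude the logout directory from recursion (the crucial part)
# -R'...'                 reject files matching these patterns (index pages, optionally the master tarball)
# --auth-no-challenge     send Basic auth credentials up front rather than waiting for a 401
# --no-check-certificate  accept the certificate on port 2083 even if it doesn't validate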
How I actually use it
Because I have a backup methodology already that rotates archive directories periodically, I want the daily downloads to be dated:
#!/bin/bash
# Download tonight's cPanel backups into a dated directory.
MYDOMAIN=THE_DOMAIN
BACKUPDIR=/some/directory
DATE=$(date +%Y-%m-%d)
DIR=cpanel-$DATE
cd "$BACKUPDIR" || exit 1
mkdir -p "$DIR" || exit 1
cd "$DIR" || exit 1
wget -nd -l1 -e robots=off -A'*.gz' -Xlogout -R"backup-$MYDOMAIN*,*logout*,*index.html*" -r --no-check-certificate --auth-no-challenge --user THE_USERNAME --password THE_PASSWORD "https://$MYDOMAIN:2083/frontend/x3/backup/index.html"
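To make it actually run nightly, schedule the script from cron; a minimal sketch, assuming it is saved as /usr/local/bin/cpanel-backup.sh (a hypothetical path):
# Hypothetical crontab entry: fetch the backups at 03:30 every night
30 3 * * * /usr/local/bin/cpanel-backup.sh >/var/log/cpanel-backup.log 2>&1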
Considerations
- Any command that includes the password on the command line can be seen in plain text by anyone running ps on the machine. If this is a concern, read the wget man page and look into moving your credentials into ~/.wgetrc (see the sketch after this list).
- If you have a sizeable home directory, pulling down a full tarball each night is too heavy. An alternative is to use rsync or unison if the provider allows scp/sftp access to files. You can set up SSH keys for password-less login and intelligently synchronize only what has changed with your own off-site machine (see the second sketch after this list).
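On the first point: wget reads credentials from ~/.wgetrc via its user and password settings, so both --user and --password can be dropped from the command line. A minimal sketch, using the same THE_ placeholders as above:
# ~/.wgetrc -- keep this file private: chmod 600 ~/.wgetrc
user = THE_USERNAME
password = THE_PASSWORD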
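On the second point: a minimal sketch of the rsync alternative, assuming the host permits SSH logins and an SSH key is already installed; the destination path is a placeholder:
# Mirror the remote home directory; only files that changed are transferred
rsync -az --delete -e ssh THE_USERNAME@THE_DOMAIN:./ /some/directory/home-mirror/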