Backing Up a Website Using SSH and Tarring on the Fly

Non-Techie Warning – you probably will want to skip this post!

Here’s a neat tip for those of you who have a Linux web site AND a second computer running Linux. With one command you can back up your site to the second Linux machine using tar, with the compression done on the fly so the archive never takes up disk space on your web server.

1) This assumes you have SSH enabled on both your remote server (which I’ll call A) and a separate backup Linux server (which I’ll call B).  I won’t get into the details of setting that up now, as it would take too much time, but I may do so in the future.

2) Log in to remote server A using SSH.

3) Go to the directory you want to back up (with all its subdirectories) and enter the following:

tar cvf - . | gzip | ssh -e none [user name of Server B]@[domain name of Server B] "cat > [linux path to the directory on Server B that will hold the backup]/[name of website]-date.tar.gz"

Replace the bracketed words, of course, with the correct names/descriptions. Once you enter that, you’ll see the tar job go to town, but notice that the backup file isn’t accumulating on Server A: the stream is sent over SSH and assembled on Server B.  It’s a great way to save disk space on the remote computer as well as bandwidth.
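To make the pipeline concrete, here is a sketch with hypothetical names filled in (the user, host, and paths are made up for illustration), followed by a local stand-in you can actually run: tar writes the archive to stdout (the `-` file name), gzip compresses the stream, and a plain redirect takes the place of the ssh hop.

```shell
#!/bin/sh
# Hypothetical filled-in version of the command from the post.
# "bob", "backup.example.com", and the paths are placeholders:
#
#   tar cvf - . | gzip | ssh -e none bob@backup.example.com \
#       "cat > /home/bob/backups/mysite-$(date +%F).tar.gz"
#
# Local demonstration of the same idea, minus the ssh hop:
mkdir -p /tmp/demo-site
echo "hello" > /tmp/demo-site/index.html
cd /tmp/demo-site

# tar streams the archive to stdout, gzip compresses it,
# and the redirect "catches" the bytes the way cat does on Server B.
tar cf - . | gzip > /tmp/demo-backup.tar.gz

# Verify the compressed stream is intact.
gzip -t /tmp/demo-backup.tar.gz
```

The `$(date +%F)` substitution is one way to get the `-date` part of the file name stamped automatically; `-e none` simply disables ssh's escape character so no byte in the binary stream is misinterpreted.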

Just to keep a little bit of the “wired” in Wired Catholic.

One Reply to “Backing Up a Website Using SSH and Tarring on the Fly”

  1. Cool – great tip. Here’s an example of the command line I use in my local Cygwin window:
    1. Open up Cygwin on your local Windows machine.
    2. Enter the following, where example.com is the remote server and local-backup.tar.gz is the target backup file:
    ssh [email protected] "(cd /path/to/parent/directory; tar cvfzX - /path/to/exclude/file directory-to-backup)" | cat > ./local-backup.tar.gz
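The commenter's version runs in the opposite direction: tar executes on the remote server and the compressed stream comes back over ssh to a local file. Below is a runnable local sketch of that remote half of the pipeline (directory names and the exclude file are hypothetical); note that the trailing `| cat >` is redundant, since a plain shell redirect captures ssh's stdout just as well.

```shell
#!/bin/sh
# Hypothetical pull-style backup, as in the comment:
#
#   ssh user@example.com \
#       "(cd /path/to/parent/directory; tar czf - -X /path/to/exclude/file directory-to-backup)" \
#       > ./local-backup.tar.gz
#
# Local stand-in for what runs on the remote side:
mkdir -p /tmp/parent/site
echo "page" > /tmp/parent/site/index.html
touch /tmp/parent/site/skip.log
printf '%s\n' 'site/skip.log' > /tmp/excludes

# The subshell cd keeps archive paths relative to the parent directory;
# -X reads exclude patterns from a file, so skip.log never enters the archive.
sh -c "(cd /tmp/parent; tar czf - -X /tmp/excludes site)" > /tmp/pull-backup.tar.gz
```

With `tar cvfzX - excludefile dir`, GNU tar assigns the bundled arguments in order (`f` takes `-`, `X` takes the exclude file); spelling it `tar czf - -X excludefile dir` as above makes that pairing explicit.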
