Using rsync as an incremental backup

Carlos Carvalho carlos at fisica.ufpr.br
Fri Jun 24 08:38:51 MDT 2011


Scott Baker (scott at perturb.org) wrote on 23 June 2011 15:30:
 >I'm using rsync to do an incremental backup of my desktop here, to a
 >remote server as follows:
 >
 >#!/usr/bin/bash
 >
 >old=$(date -d 'now - 1 week' +%Y-%m-%d)
 >new=$(date +%Y-%m-%d)
 >
 >rsync -avP --delete --link-dest=../$old /home/bakers \
 >    bakers@perturb.org:/home/bakers/backup/$new/
 >
 >This is actually working GREAT! The only problem is that sometimes the
 >cronjob won't complete (internet is down, something like that). When it
 >tries to run the next week it does --link-dest against a dir that
 >doesn't exist. It happily complies and transfers EVERY file because
 >there is no source to hardlink from.
 >
 >I'd really like rsync to exit and throw an error if the --link-dest
 >isn't present. I can't find anything in the man page about any fancy
 >--list-dest options. Am I missing it?

You should handle that in your script. You can pass several --link-dest
directories and rsync will search for each file in all of them, in the
order you gave them. If it doesn't find a copy to hardlink against, it
transfers the file over the network.
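As a minimal sketch of that approach, assuming the same host and naming
scheme as in your script (the remote path and the four-week lookback are
just illustrative), you could collect only the previous backups that
actually exist and bail out if none are found:

#!/usr/bin/bash

remote=bakers@perturb.org
base=/home/bakers/backup
new=$(date +%Y-%m-%d)

# Collect --link-dest options for the last few weekly backups that
# actually exist on the remote side, skipping any missed runs.
link_opts=()
for weeks in 1 2 3 4; do
    old=$(date -d "now - $weeks weeks" +%Y-%m-%d)
    if ssh "$remote" test -d "$base/$old"; then
        link_opts+=(--link-dest="../$old")
    fi
done

# Exit with an error instead of silently re-transferring everything
# when no previous backup is available to hardlink from.
if [ ${#link_opts[@]} -eq 0 ]; then
    echo "no previous backup found under $base" >&2
    exit 1
fi

rsync -avP --delete "${link_opts[@]}" /home/bakers "$remote:$base/$new/"

Since the --link-dest paths are relative to the destination directory,
../$old resolves to the older backups under $base, and a missed week only
costs you the files that changed since the oldest directory you list.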

We do this in our backup script here to avoid useless copying, both when
the backup machine initiates the run and when the machine holding the
source files does.
