Problem with hard links

Matt McCutchen hashproduct+rsync at gmail.com
Fri Sep 28 12:08:27 GMT 2007


Fabian's suggestion to use the CVS version of rsync with incremental
recursion is a good one; that will be an improvement.  However, rsync
still has to remember every file in S1 that has more than one hard
link, in case another link to the same file shows up in S2.  If
remembering the contents of even one of the directories makes rsync
run out of memory, you'll have to do something different.

On 9/28/07, limule pika <limulezzz at gmail.com> wrote:
> Is there a solution to keep the hard links between S2 and S1 when running
> two separate commands?

Not in the general case, but if the hard links are between
corresponding files (e.g., S1/path/to/X and S2/path/to/X; often the
case in incremental backups), you can simply use --link-dest on the
second run, like this:

rsync <options> P/S1/ remote:P/S1/
rsync <options> --link-dest=../S1/ P/S2/ remote:P/S2/

(Note the ../S1/, because basis directory paths are interpreted
relative to the destination directory.)  If you do this and use the
incremental recursion mode, rsync will remember only up to a few
thousand files at a time and won't run out of memory.  You can even do
the copy in a single pass if you like: create a directory "P/basis"
containing a symlink "S2" -> "../S1", and then run something like:

rsync <options> --link-dest=basis/ P/ remote:P/
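
My reading is that the basis directory and its symlink have to exist
on the receiving side before the S2 files are processed, since the
--link-dest path is resolved relative to remote:P/.  One way to set
that up ahead of the run is something like:

ssh remote 'mkdir -p P/basis && ln -s ../S1 P/basis/S2'

With that in place, the single run should hard-link each S2 file that
is identical to its S1 counterpart, since the S1 copy arrives earlier
in the same transfer.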

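Either way, one way to check afterward that the links were actually
made is to compare inode numbers on the destination (path/to/X is
just an example path), e.g.:

ssh remote ls -i P/S1/path/to/X P/S2/path/to/X

If both names show the same inode number, they are hard links to the
same file.
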
Matt

