Problem with hard links

limule pika limulezzz at gmail.com
Fri Oct 5 08:15:33 GMT 2007


On 9/28/07, Matt McCutchen <hashproduct+rsync at gmail.com> wrote:
>
> Fabian's suggestion to use the CVS rsync with incremental recursion is
> good; that will be an improvement.  However, rsync still has to
> remember all files in S1 that had multiple hard links in case they
> show up again in S2.  If remembering the contents of even one of the
> directories makes rsync run out of memory, you'll have to do something
> different.


Thanks for your reply. I think there are too many files in S1 ...
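
To give an idea of the scale, one could count the files that rsync -H
has to remember, i.e. the ones with more than one hard link (a rough
sketch; P/S1 is the path from the example above):

# count the multiply-linked regular files under S1 -- these are the
# entries rsync must keep in memory for hard-link preservation
find P/S1 -type f -links +1 | wc -l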



> Not in the general case, but if the hard links are between
> corresponding files (e.g., S1/path/to/X and S2/path/to/X; often the
> case in incremental backups), you can simply use --link-dest on the
> second run, like this:
>
> rsync <options> P/S1/ remote:P/S1/
> rsync <options> --link-dest=../S1/ P/S2/ remote:P/S2/



I'm using rsync to make a copy of a set of backups generated by
BackupPC, and unfortunately the structures of S1 and S2 are absolutely
not the same ...
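
If I understand --link-dest correctly, it only consults the given
directory for a file at the same relative path, so it cannot match
files that BackupPC has pooled under different names. A sketch with
hypothetical paths:

# remote:P/S2/some/file is hard-linked to remote:P/S1/some/file only
# when the relative paths match; anything stored under another name
# is transferred again
rsync <options> --link-dest=../S1/ P/S2/ remote:P/S2/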