Rsync'ing lists of files
wayned at users.sourceforge.net
Mon Jun 10 12:54:01 EST 2002
On Fri, 7 Jun 2002, Stephane Paltani wrote:
> I have 5 million files on one side of the ocean, 100000 of which must
> be copied to the other side.
This is the sort of problem that would benefit from the rsync_xfer.c
program I'm working on (I mentioned an early version on the list a week
or so ago). It allows total control of what gets sent by an external
program, so there's no directory scan and no include/exclude processing.
I could imagine writing a simple perl script that would take a list of
files and turn it into a series of "cput" commands, followed by any
"del" commands needed to remove names that have vanished from the list
since the last run. Unfortunately, the code is still at a very early
stage, so it's not yet ready for use in a production environment.
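To make the idea concrete, here is a minimal sketch of such a helper script, written in Python rather than perl. The "cput" and "del" command names come from the description above; the function name, the command-line syntax, and the idea of keeping the previous run's list around are all assumptions for illustration, not the actual rsync_xfer.c interface.

```python
#!/usr/bin/env python3
"""Sketch: turn a file list into "cput"/"del" commands (assumed format)."""


def make_commands(current_files, previous_files):
    """Build the command lines to feed the transfer program.

    Every file in the current list gets a "cput" command; any name that
    was in the previous run's list but is now gone gets a "del" command.
    """
    commands = ["cput %s" % name for name in current_files]
    vanished = sorted(set(previous_files) - set(current_files))
    commands.extend("del %s" % name for name in vanished)
    return commands


if __name__ == "__main__":
    # Hypothetical example: one file replaced between runs.
    prev = ["data/a.fits", "data/b.fits", "data/old.fits"]
    curr = ["data/a.fits", "data/b.fits", "data/new.fits"]
    for line in make_commands(curr, prev):
        print(line)
```

The deletion step is what makes this a mirror rather than a one-way copy: without remembering the previous list, there is no way to know which names to remove on the far side.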
I've been working on a new version of the program that is able to
transfer trees of files and will also have an improved socket protocol.
It works through the tree incrementally, and thus it shouldn't use as
much memory as the current rsync implementation. After I get the code
into a little better shape, I'm planning to compare its performance with
the current implementation and figure out whether rsync would benefit
most from adding support for a new (internal) protocol, or whether the
current one just needs some tweaks.