Syncing large amounts of data

Adam Herbert aherbert at
Wed Feb 12 18:13:45 EST 2003

I need some suggestions. Here's my setup:

	800GB of Data
	14,000,000+ Files
	No changes, just additions
	Files range in size from 30k - 190k

The files are laid out in a tree fashion like:

   \-Directory ( Numerical Directory name from 0 - 1023 )
     \-Directory ( Numerical Directory name from 0 - 1023 )
       \- Files ( Up to 1024 files each directory )

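For illustration, the bucketing above can be sketched like this (a guess at how a numeric file ID might map onto the two directory levels; the ID-to-path scheme is an assumption, not something stated in the post):

```shell
# Hypothetical mapping from a numeric file id to its bucket path.
# Two levels of 1024 directories, up to 1024 files per leaf directory.
file_id=1048576
top=$(( (file_id / (1024 * 1024)) % 1024 ))   # first-level directory, 0-1023
sub=$(( (file_id / 1024) % 1024 ))            # second-level directory, 0-1023
echo "$top/$sub/$file_id"                     # e.g. 1/0/1048576
```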
This allows for a maximum of about a billion files (1024^3). I need to
limit the memory usage and the processor/IO time it takes to build the
list of files to transmit. Is there a better solution than rsync? Are
there patches that would help rsync in my particular situation?

Any suggestions are appreciated.

Adam Herbert
aherbert at