huge number of files
Ming Zhang
blackmagic02881 at gmail.com
Fri Jun 22 21:42:41 GMT 2007
On Fri, 2007-06-22 at 21:54 +0200, Christoph Biedl wrote:
> Ming Zhang wrote...
>
> > I wonder if rsync could have an option so that when it scans the file
> > system tree and accumulates N files, it processes those files before
> > scanning further.
>
> I'd suggest running rsync in each directory, without the --recursive
> option. But this should happen within rsync, otherwise the forking and
> parameter parsing would take too many resources.
That would require writing a file system scanner and forking rsync for
each directory, which is too painful.
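
For illustration, here is a rough Python sketch of that per-directory
approach. SRC, DEST, and the rsync flags are placeholders, not anything
specific to our setup; rsync's real -d/--dirs option copies a directory's
own files without recursing, which is what makes this work.

#!/usr/bin/env python
# Rough sketch: walk the tree ourselves and run one non-recursive rsync
# per directory.  One fork+exec per directory is exactly the overhead
# Christoph mentions above.
import os
import subprocess

SRC = "/data/src"          # hypothetical source tree
DEST = "remote:/data/dst"  # hypothetical destination

for dirpath, dirnames, filenames in os.walk(SRC):
    rel = os.path.relpath(dirpath, SRC)
    target = DEST if rel == "." else os.path.join(DEST, rel)
    # -d copies this directory's plain files and creates its subdirectory
    # entries, but does not recurse; the walk handles the subdirectories
    # on later iterations.
    subprocess.check_call(["rsync", "-dlpt", dirpath + "/", target + "/"])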
Good to know that rsync 3.0 will do this by default, though I checked the
code and cannot find any option to tune it yet.
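
In the meantime, the batching can be approximated from userspace with
rsync's existing --files-from option: accumulate N relative paths while
walking the tree, then feed each batch to rsync on stdin. A minimal
sketch follows; SRC, DEST, N, and the flags are placeholder assumptions.

#!/usr/bin/env python
# Approximation of "process every N files": collect N relative paths,
# then hand each batch to rsync via --files-from=-.
import os
import subprocess

SRC = "/data/src"          # hypothetical source tree
DEST = "remote:/data/dst"  # hypothetical destination
N = 10000                  # batch size

def flush(batch):
    if not batch:
        return
    # --files-from=- reads one path per line from stdin, relative to SRC;
    # rsync only examines and transfers the listed files.
    proc = subprocess.Popen(
        ["rsync", "-lpt", "--files-from=-", SRC + "/", DEST],
        stdin=subprocess.PIPE)
    proc.communicate(("\n".join(batch) + "\n").encode())
    if proc.returncode != 0:
        raise RuntimeError("rsync failed on a batch")

batch = []
for dirpath, dirnames, filenames in os.walk(SRC):
    for name in filenames:
        batch.append(os.path.relpath(os.path.join(dirpath, name), SRC))
        if len(batch) >= N:
            flush(batch)
            batch = []
flush(batch)

This does not preserve hard links across batches, but as noted below we
are fine with that.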
>
> > This might break operation on files with multiple hard links or
> > rename detection, but we are fine with that.
>
> Indeed. Somehow I get the feeling such a feature is requested at least
> once per month.
>
> Christoph
--
Ming Zhang
@#$%^ purging memory... (*!%
http://blackmagic02881.wordpress.com/
http://www.linkedin.com/in/blackmagic02881