Can Rsync handle large exclude lists without slowdown?

Matthias Schniedermeyer ms at
Sun Jul 29 07:05:08 GMT 2007


Let's say I wanted to exclude 100,000 files by naming them one by one in 
a file to be used with --exclude-from.

Can rsync cope with that without major problems?
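For concreteness, a minimal sketch of what that would look like: generate an exclude file with 100,000 patterns (the paths here are invented for illustration) and feed it to rsync via --exclude-from. A small temporary tree stands in for the real data, and the rsync call is guarded in case rsync isn't installed.

```shell
#!/bin/sh
# Hypothetical sketch: a 100,000-entry exclude list fed to rsync.
set -e

excludes=$(mktemp)

# One pattern per line, exactly as --exclude-from expects. The leading
# slash anchors each pattern at the root of the transfer.
seq 100000 | sed 's|.*|/usr/share/example/file-&|' > "$excludes"

wc -l "$excludes"

# Guarded, since this is only a sketch and rsync may be absent here.
if command -v rsync >/dev/null 2>&1; then
    src=$(mktemp -d); dest=$(mktemp -d)
    mkdir -p "$src/usr/share/example"
    touch "$src/usr/share/example/file-1" "$src/keep-me"

    # file-1 matches an exclude pattern and is skipped; keep-me is copied.
    rsync -a --exclude-from="$excludes" "$src"/ "$dest"/

    rm -rf "$src" "$dest"
fi

rm -f "$excludes"
```

Whether rsync stays fast while matching every transferred file against a list that long is exactly the open question here.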

I'm currently thinking about how I could make backing up my computer 
more efficient. If I exclude every single file that I can reproduce 
another way, the number of files I need to back up would shrink by 
a large amount.

Or to be more precise: my distribution is Debian SID, which uses 
packages in .deb format.

So if I keep all the .deb files and make a list of all files provided by 
each .deb package, I only need to back up the .deb files instead of the 
unpacked files. And since I have several computers, I can save even more, 
because I only need a single copy of each .deb file.
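On Debian, the per-package file list would come from `dpkg -L`. A sketch of turning such a listing into an exclude file, under one assumption worth noting: directories must be filtered out, because excluding a directory like /usr would hide everything beneath it. A temporary tree stands in for real dpkg output so the script runs anywhere; the package name and paths are made up.

```shell
#!/bin/sh
# Hypothetical sketch: package file listing -> rsync exclude list.
set -e

root=$(mktemp -d)
mkdir -p "$root/bin" "$root/usr/share/doc/pkg"
touch "$root/bin/tool" "$root/usr/share/doc/pkg/README"

# Stand-in for `dpkg -L pkg` output: absolute paths, dirs and files mixed.
listing="$root
$root/bin
$root/bin/tool
$root/usr/share/doc/pkg
$root/usr/share/doc/pkg/README"

# Keep regular files only; excluding a directory would also exclude
# files in it that dpkg does not provide.
printf '%s\n' "$listing" | while IFS= read -r path; do
    if [ -f "$path" ]; then
        printf '%s\n' "$path"
    fi
done > "$root/excludes.txt"

cat "$root/excludes.txt"

# On a real Debian system, roughly:
#   dpkg -L some-package   (per package, filtered the same way)
#   rsync -a --exclude-from=excludes.txt / backup-host:/backups/

rm -rf "$root"
```

Run over every installed package, this is how an exclude list on the order of 100,000 entries would arise in the first place.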

So can I go forward with my idea, or does rsync stand in the way of my 
happiness? ;-)

See you then

Real Programmers consider "what you see is what you get" to be just as 
bad a concept in Text Editors as it is in women. No, the Real Programmer
wants a "you asked for it, you got it" text editor -- complicated, 
cryptic, powerful, unforgiving, dangerous.
