sync a lot of files

Clint Byrum cbyrum at
Sun Feb 6 18:08:55 GMT 2005

ciprian niculescu wrote:

> Hello,
> I have a problem with "out of memory" while trying to sync around 30
> million files. The sync is on the same host but to a different
> directory. At roughly 100 bytes per file, that comes to about 3G of
> RAM. I have 3G of RAM and have added 12G of swap (6 partitions of 2G
> each). The last I saw, the rsync process had reached 2100M of SIZE and
> 1.8G RSS; I don't know exactly how far it got before the error, but
> not much past 2G.
> How can I solve this problem? I am using version 2.6.3.
> Thanks for any advice,
> Ciprian

You don't say which OS you're running, but I'm going to presume Linux. On 
32-bit architectures, Linux limits each process to roughly 2-3G of 
address space (depending on the kernel's user/kernel split), no matter 
how much RAM or swap the machine has. You've made 15G of virtual memory 
available, but there's just no way a single rsync can use it.
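A quick back-of-the-envelope check, using the ~100 bytes of file-list
overhead per file mentioned in the report above (the exact per-file cost
varies by rsync version and path length):

```python
# Rough estimate of rsync's in-memory file list for 30 million files,
# assuming ~100 bytes of bookkeeping per file (figure from the report above).
files = 30_000_000
bytes_per_file = 100
gib = files * bytes_per_file / 2**30
print(round(gib, 2))  # ~2.79 GiB, already near a 32-bit process's ceiling
```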

Why are you using rsync for that many files anyway? If I were in your 
position, I'd chop the job up into many smaller chunks (if possible), 
or, if I had control over how files are added, I'd have the process that 
adds/changes files keep a list of the changed files, then copy only 
those.
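As a rough sketch of both ideas (the paths and the changed-file list
name are made up; the leading "echo" lets you preview the commands
before running anything):

```shell
#!/bin/sh
# Sketch only: /data/src, /data/dest and changed.list are hypothetical.

# One rsync per top-level subdirectory keeps each run's file list
# (and therefore its memory footprint) small.
sync_chunks() {
    src=$1; dest=$2
    for d in "$src"/*/; do
        [ -d "$d" ] || continue
        # Drop the leading "echo" to actually run the transfers.
        echo rsync -a "$d" "$dest/$(basename "$d")/"
    done
}
sync_chunks /data/src /data/dest

# If the producing process can log what it touched, feed that list to
# rsync (--files-from, available since 2.6.0) so it never walks the
# whole tree at all.
echo rsync -a --files-from=changed.list /data/src /data/dest
```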

Rsync is an amazing tool, but like any other tool it has a window of 
usefulness. You may have gone beyond that window.
