Core dump - Cannot sync big data folders of size 800 GB

Prasad prasad at i2r.a-star.edu.sg
Mon Jul 12 11:48:27 GMT 2004


Hi,


I was trying to synchronize about 800 GB of data sitting on our Sun Solaris 8
server to a remote Linux server on our LAN. Either the rsync process hangs
forever, or I get a core dump after about 15 minutes on the source host (the
Solaris 8 server). I used rsync 2.6.2 with the basic command options shown
below:


# rsync -avz -e rsh <source folder name> remote_server_name:<destination folder name>


I have used this command many times to synchronize smaller amounts of data
without any problem.


Can anyone tell me what limit, if any, there is on the amount of data rsync
can handle? Also, when I previously tried data sizes of 50-100 GB, rsync took
a very long time to build the file list on the source host. Is this a
limitation of rsync or of the server's resources? Are there other faster /
higher-performance tools available for synchronizing larger amounts of data?


Thanks in advance.


Prasad  



More information about the rsync mailing list