large file error is now SIGUSR1 or SIGINT error

Dave Dykstra dwd at bell-labs.com
Wed Feb 13 03:53:55 EST 2002


The SIGUSR1 or SIGINT message is just a secondary message in this case and
can be ignored.  The receiver side of rsync splits into two processes, and
that's just what the second one prints after the first one kills it off
because the first one hit a problem.  The real problem is your write failure.

I don't have any experience with using >2GB files on Solaris 7 or 8, so
hopefully somebody else can help you with that or you can figure out the
problem yourself.  The Solaris tools I distribute inside my company are all
compiled on Solaris 2.5.1 (because I need to support users on that OS version
and up), so I'm stuck with the 32-bit limit.
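If you want to check whether a particular build environment really has
large file support, one quick sanity check (assuming a Solaris 7/8 box
with a C compiler) is to compile a tiny program with the flags reported
by `getconf LFS_CFLAGS` (typically -D_LARGEFILE_SOURCE
-D_FILE_OFFSET_BITS=64) and see whether off_t comes out as 64 bits:

/* Not rsync code -- just a sanity check for large-file support.
 * Compile with: cc `getconf LFS_CFLAGS` -o offcheck offcheck.c */
#include <stdio.h>
#include <sys/types.h>

int main(void)
{
    /* With a 64-bit off_t, offsets beyond 2GB are representable;
     * with a 32-bit off_t they are not, hence the >2GB failures. */
    printf("off_t is %d bits\n", (int)(sizeof(off_t) * 8));
    return 0;
}

If it prints 32, a binary built the same way won't be able to handle
files over 2GB either.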

- Dave Dykstra

On Tue, Feb 12, 2002 at 11:31:55AM -0500, Granzow, Doug (NCI) wrote:
> I just ran this again and got this error:
> 
> leelab/NCBI_Data_old/GenBank/htg
> write failed on leelab/NCBI_Data_old/GenBank/htg : Error 0
> rsync error: error in file IO (code 11) at receiver.c(243)
> 
> Received signal 16. (no core)
> rsync error: received SIGUSR1 or SIGINT (code 20) at rsync.c(229)
> 
> The command I am running is:
> 
> /usr/local/bin/rsync -auv --delete --rsh=/usr/bin/ssh
> lpgfs104:/share/group/* /share/group/
> 
> 
> > An update on this problem...  I get the error below (and the error I
> > reported previously) when running rsync 2.5.2 compiled from source.
> > I saw different behavior when I used the rsync 2.5.2 binary compiled
> > on Solaris 2.5.1 by Dave Dykstra.  That binary complained of "Value
> > too large for defined data type" whenever it encountered a large
> > file (over 2GB), but did not exit.  The impression I got was that
> > the Solaris 2.5.1 binary did not support or even try to support
> > files over 2 GB, whereas the binary compiled on Solaris 7 or 8
> > *thinks* it can support large files but fails, since it exits as
> > soon as it encounters the large file.
> > 
> > So the problem still remains:  rsync is dying when it encounters a
> > large file.  One person suggested using --exclude, but this only
> > matches against file names, not file sizes.  (I can't do
> > "--exclude=size>2GB" for example.)
> > 
> > Questions I still have:
> > 
> > - Is rsync supposed to support files >2GB on Solaris 7 and Solaris 8?
> > 
> > - If so, what is causing the errors I am seeing?  Is there something
> > I can do at compile time?
> > 
> > - If not, is there a way for it to skip large files gracefully so
> > that at least the rsync process completes?
> > 
> > leelab/NCBI_Data_old/GenBank/htg
> > write failed on leelab/NCBI_Data_old/GenBank/htg : Error 0
> > rsync error: error in file IO (code 11) at receiver.c(243)
> > 
> > Received signal 16. (no core)
> > rsync: connection unexpectedly closed (23123514 bytes read so far)
> > rsync error: error in rsync protocol data stream (code 12) at 
> > io.c(140)
> > 
> > 



