Problem rsyncing 450GB file to my NAS: 'connection unexpectedly closed'

rsyncml.frucade at
Fri Oct 12 13:36:00 MDT 2012

Wow: Thanks for your fast responses, Justin and Karl!

However, the NAS is located in my home network, so the network 
connection shouldn't be the problem.

The pointer to the --inplace argument was really helpful. That is 
exactly what I wanted in this particular use case.
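For anyone following along, here is roughly the difference as I understand it (the module name and file name below are just placeholders from my setup):

```
# -P is shorthand for --partial --progress: keep a partial file on
# interruption and show progress, but the receiver still reconstructs
# the updated file via a temporary copy before renaming it into place.
rsync -av -P container.img nas::backup/

# --inplace writes changed blocks directly into the existing
# destination file, so no temporary copy is created on the NAS.
rsync -av --inplace container.img nas::backup/
```

One caveat from the man page: with --inplace the destination file is updated as the transfer runs, so an aborted run leaves it in a mixed old/new state until the next successful sync.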

I also enabled file logging on the target NAS:
2012/10/12 18:48:17 [6390] rsync error: timeout in data send/receive 
(code 30) at io.c(137) [receiver=3.0.7]

Aha! But the rsyncd.conf on the NAS did not contain a timeout entry?! I 
also stumbled over an RLimitRate entry that is not listed in the rsyncd 
man page. Maybe this rsyncd is patched by QNAP and the timeout no 
longer defaults to 0?

So I added "timeout = 0" and started a new run with --inplace and -vvvv: 
so far it's still running fine!
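For reference, the module section on the NAS now looks something like this (module name and path are just examples from my setup):

```
[backup]
    path = /share/backup
    read only = no
    timeout = 0
```

According to the rsyncd.conf man page, timeout = 0 means no timeout, and 0 is supposed to be the default anyway; that's why a stock daemon shouldn't time out when the entry is missing.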

I'll report back when it finishes....
So long - thanks for the quick and helpful pointers!

- Ben

On 12.10.2012 20:06, Justin T Pryzby - justinp at wrote:
> Not sure, but some ideas:
> -P means to retain partial files, and doesn't have anything to do with
> /tmp; I wonder if you mean --inplace (not sure)?
> It sounds like a daemon may be timing out; is there a timeout
> specified in rsyncd.conf?  Is there a remote logfile with any useful
> content?
> Justin
> On Fri, Oct 12, 2012 at 07:29:58PM +0200, rsyncml.frucade at wrote:
>> Hi list!
>> I'm trying to use rsync to do a regular remote backup of a 450GB
>> sized container file located on my squeeze server onto a colocated
>> QNap NAS device.
>> Both use rsync 3.0.7. While this setup works for most files, it
>> fails on this large file (see log below).
>> I'm already trying to help rsync by using "-P" to indicate that rsync
>> should reuse the current copy and not try to create a copy on the
>> /tmp partition of the NAS.
>> Any hints what might be the problem?
>> Thanks
>> - Ben
