Big timeout time

Fabian Cenedese Cenedese at indel.ch
Mon Jun 29 07:18:29 GMT 2009


At 09:07 15.06.2009 +0200, Fabian Cenedese wrote:
>Hi
>
>I'm using rsync 3.0.3 on a NAS. In the parameter list I use --timeout=1800.
>However sometimes I get very big timeout times like this one:
>
>io timeout after 12220 seconds -- exiting
>rsync error: timeout in data send/receive (code 30) at io.c(239) [sender=3.0.3pre1]
>
>rsyncd.conf on the receiving side only defines modules, no timing parameters.
>How can that happen? What can I do to prevent this? As this is running on a
>NAS it's not that easy to update rsync.

Does nobody know what to do about this? I had another error like that.
The rsync command was running for quite some time until it dropped out:

io timeout after 2598 seconds -- exiting
rsync error: timeout in data send/receive (code 30) at io.c(239) [sender=3.0.3pre1]

real    279m16.896s
user    62m48.660s
sys     2m18.113s

This command is part of several rsync commands for a daily backup.
All commands sync to the same server; the others run without
problems.

Here is the full command (with some protection):
RSYNC_ARGS="-rptgo --stats --modify-window=1 --timeout=1800"
RSYNC_ARGS="$RSYNC_ARGS -z --exclude=repo.*.svndmp.gz"
SSH="ssh -l mylogin -p myport -ax -i /path/to/serverkey -o ClearAllForwardings=yes -o StrictHostKeyChecking=no"
time $RSYNC_BIN $RSYNC_ARGS --delete -e "$SSH" -l $DATA/module $BACKUPSERVER:$DATA 2>&1 | grep -v "skipping non-regular file"

The other commands look the same except for different modules, so if anything
it must be something about the data (big files, many files, etc.). But if I run the
same command from the console it finishes without problems. Of course it could
be a problem with the Internet connection itself, but that still leaves
the question of the big timeout times.

I'd be thankful for any hint on this, as it prevents the automatic backup from
completing.
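Since updating rsync on the NAS isn't easy, one workaround I could try in the
backup script is to simply re-run the transfer whenever rsync exits with the
timeout code (30). A rough, untested sketch (the run_with_retry helper name
is my own, not anything from rsync):

```shell
#!/bin/sh
# Sketch of a retry wrapper: re-run the given command while it exits
# with rsync's timeout exit code (30), up to a fixed number of attempts.
# run_with_retry is a made-up helper name, not part of rsync.
run_with_retry() {
    max_tries=$1; shift
    try=1
    while :; do
        "$@"
        rc=$?
        # Success, or a failure other than a timeout: stop retrying.
        [ "$rc" -ne 30 ] && return "$rc"
        # Out of attempts: give up with the timeout code.
        [ "$try" -ge "$max_tries" ] && return "$rc"
        try=$((try + 1))
    done
}
```

In the script above it would be used like
`run_with_retry 3 $RSYNC_BIN $RSYNC_ARGS --delete -e "$SSH" -l $DATA/module $BACKUPSERVER:$DATA`
(each retry re-scans, but already-transferred files are skipped, so a retry
after a late timeout should be much cheaper than the first run).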

bye  Fabi
