Run rsync through intermediary server with SSH

Jeroen van der Vegt jeroen.van.der.vegt at
Tue Dec 9 08:10:29 GMT 2008

You could set up SSH port forwarding to tunnel the rsync connection to your
remote server; that way rsync is guaranteed to fail when the ssh session is
killed. The commands should be something like this:

ssh -L 10000:sr-pdf:873 gsync@cppdf &

rsync --port 10000 gsync@localhost::
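Spelled out as a complete sketch (the host names cppdf and sr-pdf, the gsync user, and local port 10000 come from the commands above; the daemon module name "cad" is an assumption, so substitute whatever module the rsync daemon on sr-pdf actually exports):

```shell
#!/bin/sh
# Forward local port 10000 through cppdf to the rsync daemon (port 873)
# on sr-pdf.  -N: run no remote command; -f: background after auth;
# ExitOnForwardFailure makes ssh exit if the forward cannot be set up.
ssh -f -N -o ExitOnForwardFailure=yes -L 10000:sr-pdf:873 gsync@cppdf

# Run the transfer against the local end of the tunnel.  Because all
# data flows through the ssh process, killing that ssh makes rsync fail
# with a connection error instead of lingering on cppdf.
# "cad" is a hypothetical daemon module on sr-pdf.
rsync -ritzOK --delete-after --timeout=600 --port=10000 \
    /s/cad/Revit/ gsync@localhost::cad/Revit

# Tear the tunnel down when done.
pkill -f 'ssh -f -N -o ExitOnForwardFailure=yes -L 10000:sr-pdf'
```

Note that this talks to an rsync daemon on sr-pdf (hence the `::` syntax and port 873), so sr-pdf must run rsyncd with a suitable module; the single-colon ssh transport used in the original command does not listen on port 873.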

See e.g. rding.html for more info.


Jeroen van der Vegt
System designer

Technolution B.V.
Telephone:        +31(0)182 59 40 00
Fax:              +31(0)182 53 97 36
E-mail:           Jeroen.van.der.Vegt at
Visit us at:
Mailing address:  P.O. Box 2013 - 2800 BD Gouda - The Netherlands
Address:          Zuidelijk Halfrond 1 - 2801 DD Gouda - The Netherlands


This e-mail is intended exclusively for the addressee(s), and may not be
passed on to, or made available for use by any person other than the
addressee(s). Technolution B.V. rules out any and every liability resulting
from any electronic transmission. 
> -----Original Message-----
> From: at
> [ at]
> On Behalf Of Jimmie Fulton
> Sent: Tuesday, 9 December 2008 3:15
> To: rsync at
> Subject: Run rsync through intermediary server with SSH
> I'm using rsync, ssh, and cron glued together with Python as a
> push-based synchronization system. From a single location, I push
> content out to various offices.  I log stdout/stderr on the master
> server to make sure everything is running smoothly.
>
> I would now like for some of our "regional hubs" to take on some of the
> load (bandwidth-wise), while still retaining my centralized master for
> command execution and logging.  So far, I simply send the desired rsync
> command as a remote ssh command.  It looks like this:
>
> ssh gsync@cppdf "rsync -ritzOK --delete-after --timeout=600 --progress
> --stats --verbose --bwlimit='350'
> --exclude-from='/home/gsync/.gsync/config/office-filters/sr' --filter='.
> /home/gsync/.gsync/config/general-filters/junk' --filter=':-
> gsync-filter' '/s/cad/Revit/' gsync@sr-pdf:'\"/s/cad/Revit\"'"
>
> The idea is that I can push to our regional offices, and then have them
> push to their near-by peers using their own bandwidth, yet still collect
> output at the master.
>
> The only problem I'm running into is this:  if I kill the ssh session on
> the master, the remote rsync process continues to run on the
> intermediary server to the final destination.  This can lead to multiple
> rsync jobs running from the intermediary to the same destination server,
> flooding it.  I'm having difficulty figuring out what the appropriate
> mojo is to ensure that the rsync jobs die with the ssh connections.
>
> Hope this is clear.  Any advice?
>
> Thanks,
> Jimmie
