Run rsync through intermediary server with SSH

Jimmie Fulton jimmie_fulton at
Tue Dec 9 02:14:55 GMT 2008

I'm using rsync, ssh, and cron glued together with Python as a
push-based synchronization system. From a single location, I push
content out to various offices.  I log stdout/stderr on the master
server to make sure everything is running smoothly.
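As a rough sketch of that glue (host, paths, and log location here are
illustrative stand-ins, not my real configuration), each cron-driven push is
essentially a wrapper that runs the job and appends its combined output to a
per-office log on the master:

```shell
#!/bin/sh
# Minimal sketch of one push job.  The real job runs rsync over ssh;
# here the remote command is a stand-in held in REMOTE_CMD so the
# wrapper's logging behavior can be seen on its own.
LOG="${LOG:-/tmp/gsync-cppdf.log}"
REMOTE_CMD="${REMOTE_CMD:-echo rsync: done}"   # stand-in for: ssh gsync@cppdf "rsync ..."

{
    echo "=== $(date -u '+%Y-%m-%d %H:%M:%S') push to cppdf ==="
    sh -c "$REMOTE_CMD"
} >>"$LOG" 2>&1
```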

I would now like some of our "regional hubs" to take on part of the
load (bandwidth-wise), while still retaining my centralized master for
command execution and logging.  So far, I simply send the desired rsync
command as a remote ssh command.  It looks like this:

ssh gsync@cppdf "rsync -ritzOK --delete-after --timeout=600 --progress \
    --stats --verbose --bwlimit='350' \
    --exclude-from='/home/gsync/.gsync/config/office-filters/sr' \
    --filter='. /home/gsync/.gsync/config/general-filters/junk' \
    --filter=':- gsync-filter' \
    '/s/cad/Revit/' gsync@sr-pdf:'\"/s/cad/Revit\"'"
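
The nested quoting in that command exists because each hop's shell consumes
one layer: my login shell on the master strips the outer double quotes, the
hub's shell strips the next layer, and the escaped quotes survive for the
destination-path argument on the far side.  The peeling is easy to see
locally with echo standing in for each hop:

```shell
# Each sh -c layer consumes one level of quoting, just as each ssh hop does.
sh -c "echo 'inner \"arg with spaces\"'"
# prints: inner "arg with spaces"
```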

The idea is that I can push to our regional offices, and then have them
push to their near-by peers using their own bandwidth, yet still collect
output at the master.

The only problem I'm running into is this:  if I kill the ssh session on
the master, the rsync process on the intermediary server keeps running
to the final destination.  This can lead to multiple rsync jobs running
from the intermediary to the same destination server, flooding it.  I'm
having difficulty figuring out what the appropriate mojo is to ensure
that the rsync jobs die with their ssh connections.
Hope this is clear.  Any advice?
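
One approach I'm considering (untested, so treat it as an assumption) is
forcing a pseudo-terminal with `ssh -tt`: when the master-side ssh dies,
sshd hangs up the remote tty and the remote process group, rsync included,
receives SIGHUP.  The underlying mechanism, signalling a whole process
group at once, can be sketched locally with sleep standing in for the
remote rsync (uses the Linux `setsid` utility):

```shell
#!/bin/sh
# Sketch of group-kill: 'sleep' stands in for the remote rsync.  With
# 'ssh -tt', sshd delivers SIGHUP to the remote process group the same
# way when the master-side connection drops.
setsid sh -c 'sleep 60 & wait' &   # new session/group, like a remote job
PGID=$!                            # setsid exec'd in place, so this is the group id
sleep 1
kill -- -"$PGID"                   # signal the whole group, children included
wait "$PGID" 2>/dev/null           # reap the leader
if kill -0 "$PGID" 2>/dev/null; then
    echo "group still running"
else
    echo "group dead"
fi
```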


