crypting remote data
david reinares
reinareslara at gmail.com
Mon Mar 10 21:55:59 GMT 2008
After testing a bit more I discovered that it fails when I pass the command to
restore and decrypt with --dest-filter (on the client side). It is always the
same file, no matter how many times I run rsync. But after testing different
folders, I can't see the connection between the failed files: html, java,
etc., and each folder contains more files exactly like them that are
rsync'd and decrypted perfectly.
If I do the same with --source-filter (server side) it seems to work OK; I can
restore all files. But that is a problem, because we don't want to have the
files decrypted on the server even for a second, apart from the fact that
having a big bunch of clients restoring at the same time, with all the hard
work of decrypting done on the server side, would overload the server.
----------------------------------------------------------------------
This patch is very good... thank you.
I've been testing it after patching rsync, and it works fine for backup... but
when I'm restoring the encrypted data, after a while rsync shows:
rsync: Failed to close: Bad file descriptor (9)
rsync: Failed dup/close: Bad file descriptor (9)
rsync error: error in IPC code (code 14) at pipe.c(208) [receiver=3.0.0]
rsync error: error in IPC code (code 14) at pipe.c(195) [receiver=3.0.0]
rsync: connection unexpectedly closed (55 bytes received so far) [generator]
rsync error: error in rsync protocol data stream (code 12) at io.c(600) [generator=3.0.0]
It's a bit strange: it restores some files before failing, and those are
perfectly decrypted... I'm using openssl as the filter command.
----------------------------------------------------------------------
On Sat, 2008-03-08 at 18:33 +0100, david reinares wrote:
> rsyncrypto looks fine, but it is still not what we're looking for.
>
> Having a complete file updated if a little change happens doesn't
> bother me. We're performing daily rsync, so not many files can be
> changed in a day.
>
> The real problem is about space and performance. If you want good
> performance you have to sacrifice space, and vice versa.
>
> We decided to save space on the client, so we copy each file, encrypt
> it, rsync it, and then delete it... hell for performance, since it
> starts an rsync connection for each file.
> And worst of all, we lose the -b functionality, which was really good
> (having not just a copy of the day before but an extra day)... keeping
> a previous version of the destination data
> on a file-by-file basis is not a good idea.
I don't understand what problem you are having with -b; could you please
clarify? Suffixed backups should work exactly the same way when
rsyncing one file at a time as they did when you rsynced all the files
at once. The same is true of backups with --backup-dir, provided that
you use --relative so you can specify the same destination argument for
every run.
> Any idea how to get the -b functionality back and reach a compromise
> between space and performance?
To fix the performance while keeping the space usage low, look into the
"source-filter_dest-filter" branch of rsync:
http://rsync.samba.org/ftp/rsync/dev/patches/source-filter_dest-filter.diff
You could run rsync once for all the files and specify your encryption
program as the --source-filter, and rsync would call your encryption
program once per file as needed.
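With that branch applied, the invocation might look roughly like the
following; the filter script, paths, and host name are hypothetical, and the
exact option syntax should be checked against the patch itself:

```shell
# Hypothetical sketch: encrypt.sh reads plaintext on stdin and writes
# ciphertext on stdout; rsync runs it once per transferred file.
rsync -a --source-filter='/usr/local/bin/encrypt.sh' /data/ backuphost:/backups/data/
```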
Matt