transferring large encrypted images.
list at xenhideout.nl
Tue Oct 13 16:54:04 UTC 2015
I was wondering if I could ask this question here.
Initially, when I was thinking up how to do this, I expected block
encryption to stay consistent from one 'encryption run' to the next, but I
found out later that most schemes randomize the result by injecting a
random block or seed (an IV) at the beginning and basing all subsequent
encrypted data on it. I guess this is done to prevent known-plaintext
attacks (the block at the beginning of many file formats is always the
same?) and also to prevent an attacker from learning the key from multiple
encryptions made with the same key.
However, the downside is that any delta-transfer optimization, such as
rsync's, is rendered useless.
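To illustrate (a sketch, assuming OpenSSL is available): encrypting the same file twice with the same passphrase produces two completely different ciphertexts, because a fresh random salt is picked each run and the key/IV are derived from it. That is exactly what leaves rsync's rolling-checksum matching with nothing to match.

```shell
# Encrypt the same plaintext twice with the same passphrase.
# openssl picks a fresh random salt each run, so the derived key/IV,
# and therefore the entire ciphertext, differ between runs.
printf 'the same plaintext' > plain.txt
openssl enc -aes-256-cbc -pbkdf2 -pass pass:secret -in plain.txt -out run1.enc
openssl enc -aes-256-cbc -pbkdf2 -pass pass:secret -in plain.txt -out run2.enc
cmp -s run1.enc run2.enc || echo "ciphertexts differ"
# → ciphertexts differ
```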
What is a best practice for this, if any?
The backup software I'm currently using (I'm on Windows) does encryption,
but since it has the key, it can create differentials/incrementals so the
whole image does not need to be retransferred. That's if it works, but
that's another story.
Still, differentials and incrementals are all fine (grandfather, father,
son), but updating the main full image file itself would perhaps be much
more efficient still.
For some reason my host and rsync on Windows are rather slow: I get about
500 KB/s upload for a 20 GB file, which takes rather long.
I might start splitting the files into gigabyte-sized chunks as well.
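Splitting could look something like this (a sketch; `-d` numeric suffixes assume GNU split), so that a dropped transfer only costs one chunk rather than the whole image:

```shell
# Cut the image into 1 GiB numbered pieces; each piece can be
# (re)transferred independently if the connection drops.
split -b 1G -d backup.img backup.img.part.

# On the receiving end, reassemble the pieces in order:
cat backup.img.part.* > backup.img
```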
Currently I'm sending it to another host at 1 MB/s, which then rsyncs it
to the real target, where I'm less concerned about how long it takes.
But I'm sending it over with scp (pscp), because for some reason rsync is
also rather slow here (maybe it's my computer). scp has no partial/resume
option (how silly), but I can just fall back to rsync if it fails.
Still, I wonder how other people are doing this, and whether they do
something similar.