Problems while transferring big files
shachar at shemesh.biz
Sun Mar 8 12:48:43 GMT 2009
Wayne Davison wrote:
> We hypothesize that there can be an accidental match in the checksum
> data, which would cause the two sides to put different streams of data
> into their gzip compression algorithm, and eventually get out of sync
> and blow up. If you have a repeatable case of a new file overwriting an
> existing file that always fails, and if you can share the files, make
> them available somehow (e.g. put them on a web server) and send the list
> (or me) an email on how to grab them, and we can run some tests.
> If the above is the cause of the error, running without -z should indeed
> avoid the issue.
If I understand the scenario you describe correctly, won't running
without -z merely cause actual, undetected data corruption?
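For readers following along: the "accidental match" concern is easy to demonstrate with the weak rolling checksum described in the rsync technical report. The sketch below is a simplified illustration (not rsync's actual implementation) showing two different blocks that produce the same weak checksum, which is exactly the kind of collision the strong per-block hash is meant to catch:

```python
def rolling_checksum(block):
    # Simplified rsync-style weak checksum (per the rsync paper):
    # s1 = plain byte sum, s2 = position-weighted byte sum, each mod 2**16.
    s1 = sum(block) % 65536
    s2 = sum((len(block) - i) * b for i, b in enumerate(block)) % 65536
    return (s2 << 16) | s1

# Two distinct 3-byte blocks crafted so both sums agree:
a = bytes([1, 4, 1])   # s1 = 6, s2 = 3*1 + 2*4 + 1*1 = 12
b = bytes([2, 2, 2])   # s1 = 6, s2 = 3*2 + 2*2 + 1*2 = 12
assert a != b
print(rolling_checksum(a) == rolling_checksum(b))  # prints True
```

With tiny blocks collisions like this are trivial to construct; over real file data they are rare but not impossible, which is why a stronger hash is checked before a block match is trusted.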
Lingnu Open Source Consulting Ltd.