[clug] Anyone using 'snappy', Google's fast compression?
steve jenkin
sjenkin at canb.auug.org.au
Thu May 12 00:55:30 MDT 2011
Mike Carden wrote on 12/05/11 4:49 PM:
>> There exists no lossless compression algorithm that can reduce
>> the size of all inputs (if it did exist, run it on the output again and
>> again).
>
>
> Step 3. Profit!
In his Turing Award speech, "Reflections on Trusting Trust", Ken Thompson
talks about writing programs that print themselves.
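Those self-printing programs are what we'd now call quines. A minimal sketch in Python (my own illustration, not Thompson's original, which was in C):

```python
# A minimal quine: a program whose output is exactly its own source text.
# src holds the quine's source; we exec it and capture what it prints.
import io
import contextlib

src = "s = 's = %r\\nprint(s %% s)'\nprint(s % s)"

buf = io.StringIO()
with contextlib.redirect_stdout(buf):
    exec(src)  # run the quine, capturing its stdout

# The program reproduced itself: output == source (plus print's newline).
assert buf.getvalue() == src + "\n"
```

The point for compression: a quine is a fixed point of "run the program, take its output", just as a stable size under re-compression would be a fixed point of the compressor.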
This nicely answers the question "If I multi-compress a file, will it
reach a stable size?"
A: good algorithms *can*. The proof is by construction, along the lines of
Thompson's self-printing program: an input that compresses to itself is a
fixed point of the compressor.
But if you are hitting them with streamed content (so they can't
backtrack), then I don't have an answer...
I think not, but there are much smarter people out there than me :-(
--
Steve Jenkin, Info Tech, Systems and Design Specialist.
0412 786 915 (+61 412 786 915)
PO Box 48, Kippax ACT 2615, AUSTRALIA
sjenkin at canb.auug.org.au http://members.tip.net.au/~sjenkin