[clug] Another thought on Cleanfeed

Paul Matthews plm at netspace.net.au
Tue Oct 28 12:33:54 GMT 2008


Michael Cohen wrote:

Thank you for your response Michael. Have been working on collecting a
few facts for my own letter to the Minister, Opposition and the Greens.

My main thought is that by interfering with legitimate sites and
http traffic, as well as reducing the speed of existing contracted
connections, the Government is (perhaps unbeknownst to it) opening
itself up to almost limitless litigation. I just haven't seen anyone
tackle that side of the equation before.

All those kiddies (especially the bigger 40-year-old kiddies) with their
well-loved World of Warcraft / Guild Wars / whatever avatars, into which
they have sunk hundreds of hours, and who can no longer engage in their
favourite time sink, are going to be really upset. Lawsuit upset.
> Should the proxy uncompress the page before scanning it? Maximum
> compression ratio for gzip is about 1000:1 so a 1mb page will
> decompress to 1gb in the proxy ram before it can be scanned - does
> this scale to a country?
>   
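The quoted concern is easy to demonstrate. A minimal sketch in Python (scaled down to 10 MB so it is safe to run; the filenames and sizes are illustrative, not anything from an actual filter deployment): highly repetitive input compresses near gzip's maximum ratio, so a filtering proxy that naively decompresses before scanning must hold the full expanded payload in memory for every request.

```python
import gzip

# 10 MB of zeroes: pathologically compressible input, standing in
# for the 1 GB / 10 GB payloads discussed above.
plain = b"\x00" * (10 * 1024 * 1024)
compressed = gzip.compress(plain)

ratio = len(plain) / len(compressed)
print(f"compressed: {len(compressed)} bytes, ratio roughly {ratio:.0f}:1")

# A scanning proxy that decompresses before filtering pays the full
# expanded size in RAM, per request, before it can inspect a byte.
expanded = gzip.decompress(compressed)
print(f"expanded: {len(expanded)} bytes")
```

Multiply that per-request cost by a country's worth of concurrent connections and the scaling problem in the quote is apparent.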
I see a great opportunity for almost limitless evil here:

Step 1) Set up web pages with 10 GB .bz2's, which end in a four-letter word
Step 2) Generate lots of http requests back and forth
Step 3) ...
Step 4) Profit

-- 
Fools ignore complexity. Pragmatists suffer it.
Some can avoid it. Geniuses remove it.


