[clug] How you know your Free or Open Source Software Project is doomed to FAIL

Scott Ferguson scott.ferguson.clug at gmail.com
Thu Jul 30 12:13:35 UTC 2015

On 30/07/15 16:36, Paul Harvey wrote:
> On 30 July 2015 at 16:30, James Ring <sjr at jdns.org> wrote:
>> On Wed, Jul 29, 2015 at 11:23 PM, Alex Satrapa
>> <grail at goldweb.com.au> wrote:
>>> On 30 Jul 2015, at 16:05, James Ring <sjr at jdns.org> wrote:
>>>> The possibility that somebody out there is going to somehow 
>>>> modify the encrypted shell script response in-flight is just
>>>> not a concern to me. Also I'd think Google has more to lose by
>>>> publishing bad scripts than I do running them.
>>> It won’t be Google that publishes the bad script. By definition
>>> the actor in the “Man in the Middle” attack is neither end of a
>>> presumably two-way conversation.
>>> You *think* you’ve connected to Google, but the attacker poisoned
>>> your DNS so you’re actually connected to g00gle, and the script
>>> you’re piping into shell sets up a rootkit rather than an
>>> Internet cat picture archive.
>> Well, they'd have to poison the DNS and also convince one of the 
>> certificate authorities trusted by wget to issue a SSL certificate 
>> with Google's name on it to the attacker.
> Which begs the question: if giving tutorial users instructions to do 
> the PGP/shasums dance is the alternative, that's information which 
> will be delivered over https anyway, and if it's good enough for
> one, why not just pipe it to a shell in the first place...

Why not give the user of the project both choices, and list the safest
choice first?
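
For instance, the "safest choice first" instructions might look like the sketch below. The URLs and filenames are illustrative, and a local stand-in file is used here so the verification step can be shown end-to-end:

```shell
# Safest choice: fetch the script AND its published checksum, verify,
# and only then run it. (Illustrative URLs -- substitute the project's.)
#   wget https://someproject.example/install.sh
#   wget https://someproject.example/install.sh.sha256
# Demonstrated below with a local stand-in for the downloaded script:
printf 'echo hello from someproject\n' > install.sh
sha256sum install.sh > install.sh.sha256

# Refuse to run the script unless the checksum matches:
sha256sum -c install.sh.sha256 && sh install.sh
```

The second, lazier choice is the usual `wget -O - ... | sh` one-liner; listing the checksum dance first at least puts the plate on the table.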

Force-feeding saves labour and chewing, but choices are good.

Personally, I want it on a plate - then I can push it around for a bit,
check the bathrooms, smell the waiters, and sacrifice a chicken to see
if it's safe to eat the food.

> .. some responses to this line of thinking are mentioned in the 
> Georgiev paper, curl/wget & friends don't do cert pinning etc.

Curl can (it's version dependent):

# get prerequisites
apt-get install ca-certificates curl

# Option 1: just get the cert
openssl s_client -connect $someproject:443 > ./x.cert </dev/null

# Option 2: get the cert and pin the CA
openssl s_client -connect $someproject:443 \
  -CAfile /usr/share/ca-certificates/mozilla/DigiCert_Assured_ID_Root_CA.crt \
  > ./x.cert </dev/null

# Option 3: get the cert, pin the CA, and extract the cert
echo -n | openssl s_client -connect $someproject:443 \
  -CAfile /usr/share/ca-certificates/mozilla/DigiCert_Assured_ID_Root_CA.crt \
  | sed -ne '/-BEGIN CERTIFICATE-/,/-END CERTIFICATE-/p' > ./$someproject.pem

# verify SHA-1 fingerprint
openssl x509 -noout -in $someproject.pem -fingerprint -sha1

# verify SHA-256 fingerprint
openssl x509 -noout -in $someproject.pem -fingerprint -sha256
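
For actual public-key pinning with curl (version dependent; the sha256// form needs curl >= 7.44), the saved cert can be turned into a --pinnedpubkey value. A sketch only - the throwaway self-signed cert generated here stands in for the $someproject.pem saved above:

```shell
# Generate a throwaway self-signed cert as a stand-in for the real
# $someproject.pem (illustrative; use the cert saved in Option 3).
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=someproject" \
  -keyout key.pem -out someproject.pem -days 1 2>/dev/null

# SHA-256 of the DER-encoded public key, base64-encoded -- the format
# curl's --pinnedpubkey "sha256//<hash>" expects:
PIN=$(openssl x509 -in someproject.pem -pubkey -noout \
  | openssl pkey -pubin -outform DER 2>/dev/null \
  | openssl dgst -sha256 -binary | base64)
echo "sha256//$PIN"

# Then pin later fetches to that key (example invocation):
#   curl --pinnedpubkey "sha256//$PIN" https://$someproject/
```

Unlike pinning the whole cert, pinning the public key survives a routine certificate renewal as long as the project keeps the same key pair.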

AFAIK wget does not yet allow cert pinning - but I haven't really looked
into it. It does allow CA pinning, so "import the CA if it's a private
one"? (Thoughts, anyone?)
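
So for wget the best available seems to be CA pinning via --ca-certificate, which restricts trust to a single CA file for the transfer. A sketch of the invocation only - the CA path and host are illustrative, and the command is printed rather than run here:

```shell
# Pin the CA (not the leaf cert) for a single wget transfer.
# Paths and host are illustrative; the command is printed, not executed.
CA=/usr/share/ca-certificates/mozilla/DigiCert_Assured_ID_Root_CA.crt
CMD="wget --ca-certificate=$CA https://$someproject/install.sh"
echo "$CMD"
```

For a private CA, the same flag works: point it at the project's own CA file instead of a Mozilla-bundle one.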

> but at the end of the day, you really have to pick your battles.

Agreed. A weighted decision matrix is good. Shakespearean oracles and
cauldrons are hard to find these days.

> On this occasion perhaps the hypothetical scalpel that's been
> splashed with a bit of hot water (referring to Scott's reply earlier)
> is actually not the worst that could happen in this case :-)

As long as it's a life-or-death scenario rather than a false choice
being offered in a modern hospital(?)

Kind regards
