[clug] Offline snooping

Scott Ferguson scott.ferguson.clug at gmail.com
Wed Feb 5 23:57:52 MST 2014


On 06/02/14 17:33, Bob Edwards wrote:
> On 06/02/14 17:18, Scott Ferguson wrote:
>> Re-sending to the list
>>
>> On 06/02/14 16:03, Bryan Kilgallin wrote:
>>> Thanks, Scott:
>>>
>>>> Much of the
>>>> software used by the NSA is either hidden in commercial proprietary
>>>> software (Closed Source), or proprietary firmware (i.e. embedded coded
>>>> used for devices e.g. your hard drive controller, your network card,
>>>> your BIOS etc).
>>>
>>> How much of a hassle is the firmware problem?
>>
>> Define hassle? :)
>>
>> If you want control of your computer you need control of all the code -
>> that includes the firmware. If you don't have control then you cede it
>> to others - that is my definition of hassle in this instance.
>>
> 
> Of course, trusting the firmware, or otherwise, is not the end of the
> story. Certainly, there are demonstrated exploits using firmware-
> resident malware, so it may well be reasonable not to trust it anymore.
> 
> However, why should you trust your CPU and its microcode? If not the
> main CPU, what about the GPU or other processing elements inside your
> computer (network card, block device, usb controller, wifi processor
> etc.)? Who's to say that there isn't a dormant agent waiting for a
> wake-up signal to start exfiltrating the directory listings of your
> block device(s) etc.?

Silicon poisoning? (Stealthy Dopant-Level Hardware Trojans, not the
pulmonary disease.)

IMO the rational thing to do is assess the risks, then manage them.
Assessing the risks of your hardware (as opposed to the firmware) being
compromised leads to the conclusion that it is likely to be done only by
powerful actors - and likely only one actor per device.
How do you manage the risks posed by actors powerful enough to implant
*working* exploits in your hardware (in a world where Ford can't make
a car that doesn't need a parts recall and M$ issues a greater volume
of updates than the software it updates)? I'd refer to my cognitive
dissonance rule of risk management - worry and work should stop on the
sane side of it (IMO) ;)

NOTE: aluminium foil is no good, it must be tin. Tin and copper are
toxic. :)



> 
> What about running a "soft CPU" built on an FPGA, because then you
> sidestep any such risk, right? But almost all FPGA code is compiled
> using proprietary tools - who's to say that they can't add a "little
> extra" to your home-brew CPU design?

There's a long-running discussion on Schneier's blog about just that.

My first questions are usually "why?" and "when?". The answer to the first
question often negates the need to answer the second - as many of the
things people wish to entrust to the computer *and* the network should
*not* be. The only answer to their requirements specification entails
pixie dust (but, but, but I'll pay for it... how hard can it be?)

> 
> Where do you start from if you want a truly trusted computing platform?

Looking up the definition of "oxymoron"?  ;D

> What would you need to have in order to stand up in court and testify
> that you know beyond a reasonable doubt that your computing platform
> is completely trustworthy?

That's an easier question: a court that is not technically trained.

> 
> Cheers,
> 
> Bob Edwards.
> 


