[clug] Private Browsing?
mike.carden at gmail.com
Mon Jun 24 03:58:59 UTC 2019
> "How we are tracked over the Internet" would be a great topic for
> CLUG, if anyone had the knowledge.
Well, those who attended LCA2019 would have had the opportunity to see a
talk on this subject from Martin Krafft. Here is his follow-up email:
Thanks to those who attended (or will watch the video of) my talk
on fighting Web trackers and reducing your footprint while browsing.
Here are the browser extensions I introduced, so that you can check
them out at your leisure. I am using Firefox, but most of these
should be available for Chrome as well. Most importantly, however,
these are all maintained and Free, so you can consider this list as
bootstrapping your due diligence towards a more private browsing
experience.
Please let me know if you have any comments or additions.
1. https://github.com/gorhill/uMatrix, comprehensive
resource/sub-request blocker, which eclipses your standard
ad-blocker, and can do a whole lot more. By the author of uBlock
Origin (https://github.com/gorhill/uBlock), but more bare-bones.
2. https://decentraleyes.org/, serve commonly used Web 2.0 fabric
(e.g. jQuery) from localhost to avoid pinging 3rd parties/CDNs
helpfully hosting that stuff.¹
3. Offers flexible white-/greylisting, and removes cookies on
blacklisted sites after a configurable amount of time.
4. https://github.com/kkapsner/CanvasBlocker/, fuzz two
commonly used fingerprinting methods to make it harder for the
remote end to profile you.
5. https://www.eff.org/https-everywhere, ensure you don't leak
plain text information to snoops on your way.
6. There are several extensions to spoof and fuzz your user-agent.
I haven't quite made up my mind as to which one is best yet.
7. Dis-allows those potentially long-running threads that can
persist way beyond your web site visit from registering.
Complements uMatrix's control of Web Workers.
8. Handy means to control Firefox's built-in containers, which
isolate your browsing of certain data-hungry websites from the
rest of your activity.
9. https://github.com/mozilla/lightbeam-we, visualise the 3rd-party
connections your browser makes as you browse.
10. An introduction to using the network monitor to trace what your
browser is doing on the wire.
11. https://browserleaks.com/, a frightening collection of
fingerprinting methods you can use to track your progress.
12. https://panopticlick.eff.org/, EFF's anti-tracking checker.
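The local-serving approach of item 2 boils down to a lookup that prefers a shipped copy of a resource over a network fetch. A minimal sketch of that idea (the URL-to-bytes mapping and function names are purely illustrative, not Decentraleyes internals):

```python
# Sketch of the Decentraleyes idea: answer requests for well-known CDN
# resources from a local bundle so the CDN is never pinged.
# LOCAL_BUNDLE and resolve() are illustrative assumptions, not the
# extension's actual data or API.

LOCAL_BUNDLE = {
    # hypothetical mapping of CDN URLs to locally shipped copies
    "https://code.jquery.com/jquery-3.6.0.min.js": b"/* local jQuery copy */",
}

def resolve(url, fetch_remote):
    """Return (body, source) for `url`, preferring the local bundle.

    `fetch_remote` is only called -- and thus the 3rd party only
    contacted -- when no local copy is shipped. That is the privacy win.
    """
    local = LOCAL_BUNDLE.get(url)
    if local is not None:
        return local, "local"
    return fetch_remote(url), "network"

def never(url):
    # Stand-in that proves the network was not touched.
    raise RuntimeError("network hit -- should not happen for a bundled URL")

body, source = resolve("https://code.jquery.com/jquery-3.6.0.min.js", never)
print(source)  # -> local
```

A known CDN asset is answered entirely from the bundle; only unknown URLs fall through to the network.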
Finally, Ben asked what to use for the less
technically inclined. EFF's Privacy Badger
(https://www.eff.org/privacybadger) uses machine-learning to figure
out whom you trust, and while I personally want more control and
transparency of what's going on, this "privacy-by-default" approach
is great for people who don't want to configure anything. Privacy
Possum (https://github.com/cowlicks/privacypossum) is an attempt
to improve on that by someone who worked on Privacy Badger.
Stay safe, keep private,
¹) There are people who use transparent proxies for this, but SSL
makes that harder and harder. So what about the browser cache?
It's true that your browser should be able to just indefinitely
cache these immutable resources. However, I don't trust that, nor
the companies to set the expiry headers correctly; and apart from
that, caching really only prevents re-transfer, but still pings the
HTTP host to find out what the current timestamp/eTag is.
For instance, I picked a random static piece of content from
about:cache: https://assets-cdn.github.com/favicon.ico, which is set
to expire a year from now. When I load it, there's a genuine
connection with Github.com/Fastly, including Referer and User-Agent
and several other bits about me that the other side could use to
correlate their requests:
>Accept-Encoding: gzip, deflate, br
>If-Modified-Since: Sat, 01 Jan 2000 00:00:00 GMT
<HTTP/1.1 304 Not Modified
<Date: Tue, 21 Jan 2019 09:22:55 GMT
<Via: 1.1 varnish
<Cache-Control: max-age=31536000, public
<Expires: Tue, 20 Jan 2020 19:16:02 GMT
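That conditional-GET round trip can be reproduced locally with the standard library. A minimal sketch in which a toy server stands in for the CDN (nothing here contacts a real site): even though the "cached" resource is unchanged and no body is re-transferred, the client still makes a request carrying If-Modified-Since, User-Agent, etc.

```python
# Demonstrate the revalidation ping: a cached resource still triggers a
# request (revealing headers to the host), answered with 304 Not Modified.
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        if "If-Modified-Since" in self.headers:
            # Resource unchanged: no body goes back, but the client has
            # already pinged us and exposed its request headers.
            self.send_response(304)
            self.end_headers()
        else:
            self.send_response(200)
            self.send_header("Cache-Control", "max-age=31536000, public")
            self.end_headers()
            self.wfile.write(b"icon-bytes")

    def log_message(self, *args):  # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
conn.request("GET", "/favicon.ico", headers={
    "If-Modified-Since": "Sat, 01 Jan 2000 00:00:00 GMT",
    "User-Agent": "demo/1.0",
})
resp = conn.getresponse()
print(resp.status)  # -> 304
server.shutdown()
```

The 304 saves the transfer, but the correlation opportunity (who asked, when, with which headers) remains; that is exactly the leak described above.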
I've now had the idea that we could have an extension that simply
auto-answers such outbound requests for resources that we determine
to be valid if present in the local cache. For all that matters, this
could be a list of hashes of those resources, which would be one
step closer to simply asking the peers around you whether they
have a certain hash in their caches, so that you can procure it
completely offline. How awesome would that be?