Hardening Firefox – a checklist for improved browser privacy
Comments
cookiengineer
schiffern
Is this available as pastable text, ideally with the explanation parts as comment blocks?
Happy to see you recommend uBlock Origin and LocalCDN. I would humbly suggest ClearURLs might belong. Another excellent "set it and forget it" extension that skips common tracking redirects.
cookiengineer
Funny that you mention ClearURLs.
I was actually reading its codebase and wasn't happy with it due to potential sanitization problems with its regex usage and other things. So I kind of wrote it from scratch and it got to be something different.
But as always with my projects, nothing is ever really finished or usable.
binaryturtle
Yes, sadly Little Snitch (or a similar app) is required to tame Firefox. It's a real shame since they use "privacy" as a selling point, but for me that starts with being transparent about what they do behind users' backs, with very clear ways to disable any nonsense (no about:config or policy BS, but proper GUI-exposed options), or better yet a proper opt-in to those "security" and comfort features.
It pretty much eroded any trust I had in this browser and Mozilla (they are no better than Google, Meta, or Apple in that regard). If it weren't for uBlock Origin and availability for older OS X versions I would ditch it (the Dynasty build is the only option I have for a recent browser on my old Mac).
wackget
My dream is a user-friendly firewall of some kind which can selectively block requests to domains across the entire network. Something like uMatrix, but for your whole network.
Imagine being able to block `ads.google.com` or whatever from all of your devices at once but without having to rely on local DNS. Or being able to block `pornhub.com` from just some of your devices but not all of them.
I assume the technology to do this is readily available in the form of parental control software or enterprise/office firewalls. However on the consumer level I don't know of anything which does this effectively.
mahoro
Neat article.
I would add `layout.css.font-visibility=1` to hide all non-default fonts (makes a canvas font rendering test less useful).
gruez
>No, this will not effectively help to reduce the fingerprint of your Browser.
Ironically, many of the fingerprinting tweaks in your article make you more fingerprintable, because disabling random web APIs makes you stick out like a sore thumb (think https://xkcd.com/1105/). Besides, most of the configs you're modifying for anti-fingerprinting purposes are already covered by RFP.
>A LOT more tracking services are integrated into the Firefox browser in various places (like New Tab page, Sync, Pocket, Shavar, Google Safebrowsing, OCSP, etc pp).
Can you elaborate on how these services are "tracking"? Except for maybe Safebrowsing and OCSP, none of these services actually send information on what sites you visit. Unless you mean "tracking" to mean "make connections to the internet".
cookiengineer
The real question is on what OSI layer are you willing to die.
TCP fingerprinting is a real threat and most surveillance systems can identify your unique connection pretty easily, thanks to the quantum surveillance technique where closer surrounding and compromised hops will send you packets faster than the actual endpoint because they are geographically closer to you.
A real privacy aware browser caches everything, and scatters requests as much as possible through different network paths, and farbles Web APIs of the most common system and browser combination (which is Microsoft Edge or Google Chrome on Windows/Android).
I tried to implement all that, but I gave up working on that after I've been targeted in 2021. Maybe I have the time to get back to it after I am done with my current mission.
nilslindemann
Notice, if you have all these settings enabled, you can still be fingerprinted. Test here:
In my tests only Tor was able to prevent that, but using Tor will give you bad rankings on payment sites like PayPal; you may even get banned there.
I learned this from here:
https://news.ycombinator.com/item?id=35243355
That site is now black, surely a coincidence. Here the archive.org link:
https://web.archive.org/web/20250801173508/https://www.bites...
Have a local copy.
LeoPanthera
> fingerprint dot com
Is this an ad? Of all the things I was expecting to see when I clicked that, "Contact Sales" was not one of them.
styanax
Use the EFF version, it's been around a long time: https://coveryourtracks.eff.org/
neandrake
Looks like the source to Bitestring's blog is still up, maybe domain registration just lapsed?
https://github.com/bitestring/bitestring.github.io/blob/main...
mzajc
AFAIK none of these check for changing fingerprints. Your browser could report a very unique screen resolution, but could be configured to change it periodically. How much does that fool fingerprinting algorithms?
nilslindemann
I guess it would, but the problem of getting "bad karma" points on payment processors, etc. remains.
Further, this is not the only form of fingerprinting, there is also e.g. TLS fingerprinting [1].
Programmers should tell people that browsers and the internet are not private, and that everyone who claims otherwise does not tell the truth.
There should be more discussion among people more skilled than me about whether and how such methods can be prevented. And that should be documented well, including how to avoid getting blocked on sites.
A creative approach would be for millions or billions of users to run software (self-chosen!) that randomly visits sites when the computer is not busy. This would not prevent fingerprinting, but the collected data would be useless. (Someone in the other thread suggested that.)
Another method would be to declare it illegal and require workers to report such methods to the authorities.
olivergregory
Set browser.ml.chat.enabled and browser.ml.enabled to false, as they use the processor intensively and drain the battery. All that just to find the best name for your tab groups. I prefer to have my laptop last one more hour instead.
yunruse
I took a brief gander at its code [0] and saw it mainly focusses on k-means clustering algorithms (in JS, no less). To my ken this is likely for suggesting new tabs, something a user is even less likely to use than renaming them.
Its constant drain even when not 'in use' seems to imply it's classifying tabs as they change page (though it might be telemetry or uncommented testing). If so, it's an example of premature optimisation gone very wrong.
It's a shame, because it overshadows the fact that naming tab groups is a perfect use case for an LLM, alongside keyboard suggestions and reverse dictionaries [1]. I'm ardently distrustful of LLMs for many, many purposes, but for the tiny parameter and token usage needed it's hard not to like, which makes the constant drain all the more baffling.
[0] https://github.com/mozilla-firefox/firefox/blob/7b42e629fdef... exports a SmartTabGroupingManager, though how or why that is used without being asked eludes me
[1] https://www.onelook.com/thesaurus/ Can be helpful in a pinch when a word's on the tip of your tongue, though its synonyms aren't always perfect.
Vinnl
People drew their own conclusions about the drain being caused by tab group suggestions, but that wasn't the cause: https://bugzilla.mozilla.org/show_bug.cgi?id=1982278#c4
aragilar
I recall an extension (I think by a Mozilla dev) which could do automatic grouping of tabs (back before tab groups were removed). I'm surprised this hasn't come back.
l8rlump
Tab grouping is here, but not sure about automatic grouping.
squigz
Does anyone here struggle so much with naming a group of tabs that you'd reach for an LLM? I mean... really? How often does a group of tabs need a more complex name than "Work", "Gaming", etc? Maybe a suffix for the work project?
bstsb
i think the implementation is more that when you connect two or more tabs, it automatically names it for you, meaning you don't have to rename it (at least, that's my experience with the feature in Edge)
st3fan
Wasn't that a bug that was fixed weeks ago? Like early August? If you are not averse to this feature then it is better to simply make sure you are running the latest version.
neobrain
It was also caught during progressive rollout, i.e. it never affected anyone who had disabled "studies" in their preferences.
olivergregory
I literally gained an hour of battery life when I switched these two settings off, just a week ago, and I keep my browser up to date. So not for me.
privatelypublic
On an 80 Wh battery, say you go from 7 hrs to 8 hrs: average draw drops from ~11.4 W (80/7) to 10 W (80/8). That's about a 1.4 watt difference.
I propose the below as various factors that can be larger:
Slower fan speed because of lower ambient temperature.
Different dark/light ratio and/or adaptive screen brightness.
Wifi spectrum congestion, variable power levels to maintain proper SNR.
Wifi/ethernet- broadcast packets.
The list goes on. Most of these are below a watt, but demonstrate the point that you've got a lot more variables than just one setting in a browser.
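As a quick sanity check of the arithmetic, using only the hypothetical figures above (an 80 Wh battery going from 7 to 8 hours of runtime):

```python
# Sanity check of the battery math above.
# All figures are the hypothetical ones from this comment:
# an 80 Wh battery going from 7 h to 8 h of runtime.
battery_wh = 80.0
draw_at_7h = battery_wh / 7  # average draw over a 7-hour runtime
draw_at_8h = battery_wh / 8  # average draw over an 8-hour runtime
saved = draw_at_7h - draw_at_8h
print(f"{draw_at_7h:.2f} W -> {draw_at_8h:.2f} W, a {saved:.2f} W difference")
# prints: 11.43 W -> 10.00 W, a 1.43 W difference
```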
craftkiller
You sound like 1.4 watts is insignificant to a laptop, but my laptop idles around 6 watts and it is currently using 8 watts since I've got some stuff running. Shaving off 1.4 watts is a 17-23% improvement.
Nab443
The point is that the shaving might not be due to the firefox variable changes, but rather to other environmental differences.
privatelypublic
Exactly. And honestly, the screen is way, way more than 1 watt. According to RAPL readings and a USB-PD power analyzer, changing the brightness on my 15" 4K OLED laptop screen can reduce power usage by 15-20 W. The nature of OLED makes it hard to get a clear picture.
marc_abonce
I didn't know about these two settings, but they were already disabled in my about:config. I wonder if Debian distributes a non-default about:config with Firefox.
tremon
They do, see /etc/firefox-esr/firefox-esr.js -- but the aforementioned settings are not in that file by default, and [0] seems to suggest Debian does not alter the compiled-in defaults either.
Some quick digging in the source suggests that it's simply not enabled by default in ESR 128. I don't know if that's because it's only enabled by default in a later release, or because it's disabled in all ESR releases; I suspect the former. Compare [1] and [2]:
-pref("browser.ml.enable", false); # in upstream/128.14.0esr
+pref("browser.ml.enable", true); # in upstream/142.0.1
The other pref, browser.ml.chat.enable[d], is not mentioned in that file at all. (edit: according to [3a] and [3b], it's browser.ml.enable and browser.ml.chat.enabled... yay for consistency, I guess)
[0] https://sources.debian.org/src/firefox-esr/128.14.0esr-1~deb...
[1] https://salsa.debian.org/mozilla-team/firefox/-/blame/upstre...
[2] https://salsa.debian.org/mozilla-team/firefox/-/blame/upstre...
[3a] https://salsa.debian.org/mozilla-team/firefox/-/blame/esr128...
[3b] https://salsa.debian.org/mozilla-team/firefox/-/blame/esr128...
marc_abonce
Thanks for the heads-up! Yeah, I'm running ESR 128 right now so when I upgrade to the next ESR I'll keep an eye on these settings.
styanax
You can preload them now in your profile's `user.js` - FF will ignore any settings it does not know about, so it's "safe" to leave old things that got deleted and to add new things coming in the next ESR without harm (as far as I'm aware; been doing it for years). A user.js is portable, not relying on any given vendor's configuration.
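For example, a minimal user.js along these lines might use the pref names discussed elsewhere in this thread; treat it as a sketch rather than a canonical list, since the exact spellings vary between releases (which is exactly why stale entries being ignored is useful):

```js
// Sketch of a portable user.js. Firefox ignores prefs it doesn't know,
// so entries for past or future releases can coexist harmlessly.
user_pref("browser.ml.enable", false);        // spelling seen in ESR 128-era code
user_pref("browser.ml.enabled", false);       // spelling used in later releases
user_pref("browser.ml.chat.enabled", false);  // AI chatbot sidebar
```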
geekamongus
I've been a Firefox die-hard since it was called Phoenix a couple of decades ago. That said, over the last two months I've been testing Orion Browser (from Kagi, to which I subscribe), and am smitten with it. It's Apple-only at the moment, which is a drawback, but if you live in that ecosystem, it's worth a look.
Orion is WebKit-based, can install extensions from Chrome OR Firefox, respects privacy, and has a whole lotta niceties for per-website tweaks and other customizations.
thisislife2
Orion indeed is a decent option for the privacy-conscious, as it is one of the few browsers that doesn't make any automated connections on startup (with the right config). But, if I remember right, they are still trying to get uBlock Origin to work perfectly on it (i.e. WebExtensions support is still incomplete on Orion).
PaleMoon ( http://www.palemoon.org/ ) is a hard fork of Firefox, with a mix of old tech (XUL) and new tech (from the current Gecko codebase), that is another full-featured zero-telemetry browser that doesn't make any automated connections. But here too, the full feature set of uBlock Origin isn't supported, as it is based on the abandoned uBlock Origin (legacy) codebase (and though the legacy codebase has been updated by some PaleMoon developers, the original developers of uBlock Origin do not wish to support PaleMoon, as it doesn't support WebExtensions).
Then there's the Tor Browser ( https://www.torproject.org/ ) - a soft fork of Firefox that supports the Tor network and has been configured by default to be "privacy hardened" - it has none of the crap that Mozilla bundles into Firefox, like Pocket, AI, ads etc. The Tor software bundled in it can easily be deleted, to use it as a privacy-hardened Firefox. However, there are two issues with it - it does make unauthorised and unwanted automated connections (to SecureDrop), and you can no longer remove the NoScript browser extension that is bundled in it (you could in previous versions). When a browser maker forcefully bundles something in (however useful it may be) and does not allow you to modify it, that's well-founded ground to be suspicious of it. (Note: I did finally figure out that one can stop the automated phoning to SecureDrop after disabling it in about:rulesets.)
As the Tor Browser laid a good foundation for a privacy-hardened Firefox, there are many other browsers that are forks of it - the Mullvad Browser ( https://mullvad.net/en/browser ) is a popular one, and Mullvad bundles its VPN service in it instead of the Tor network. (Last I checked, it made some automated connections on startup, so I didn't bother to explore it further.)
noman-land
Curious you specified "hard" fork. What exactly would a soft fork look like for a git repo?
comprev
A soft fork would still be able to merge changes in the upstream project and then add their own changes on top. The most basic example would be a soft fork that only changes the default search engine - everything else is the same.
A hard fork - as I understand it - means the development takes a new direction and integrating the original upstream code becomes more difficult as projects diverge, to the point where they are basically incompatible with each other.
iknowstuff
I just need it to stop using Safari’s slow ass animation for the two-finger trackpad swipe back gesture
ProAm
[flagged]
rsync
A fool's errand.
No matter how effective this list is, the settings will either revert, change, or be silently undone.
New settings will alter the efficacy of the old ones.
Existing settings will disappear.
The behavior you hoped to configure will change to its opposite.
Remember: there was one morning when we all woke up and saw every DNS query sent to Cloudflare DoH by default, with no opt-in.
ekianjo
> saw every DNS query sent to Cloudflare DoH by default, with no opt-in
True. And most people don't even know it.
amarder
[dead]
userbinator
If the first item isn't "whitelist JS", you're doing it wrong. So many problems arise from letting any site run programs on your computer that it's best to reserve the privilege to the most trusted of sites.
stusmall
Meanwhile if I see that I just move on. It just isn't practical to have a workable browser with JS whitelisting for the general case. I doubt people who do this actually do any kind of thoughtful review before hitting "accept". It just adds manual toil with limited benefit.
If they are doing meaningful review, I question how much they actually get done in life.
Sophira
When it was developed, uMatrix was a brilliant method of being cautious about what runs, and it had a logger so you could easily see what domains you should enable the current domain to have access to.
I still use it honestly, but I'll need to move on at some point - not just because it's MV2-only, but also I've found a way in which uMatrix can be bypassed if a website were to specifically target it. (It doesn't affect uBlock Origin, although I haven't tested the Lite MV3 version.)
schiffern
uMatrix can be (somewhat) replicated by setting up uBlock Origin with multiple modes and configuring the "Relax Blocking Mode" hotkey.
So for instance you can start with an extremely restrictive mode like noJS/3rd-party/images; each press of the hotkey then relaxes it to noJS/3rd-party, then noJS/embeds, then no embeds, then full access (i.e. how uBO comes configured out of the box).
https://github.com/gorhill/uBlock/wiki/Keyboard-shortcuts
https://github.com/gorhill/uBlock/wiki/Advanced-settings#blo...
https://github.com/gorhill/uBlock/wiki/Blocking-mode
You still need a solution for cookies (eg CookieBro), and I still long for an "expanded expanded" mode on uBO's menu that reveals uMatrix columns, but this might help replace some of your use cases that currently require uMatrix.
neandrake
I'm a huge fan of uMatrix too, and have debated getting involved to help revive it.
Can you share more information on the bypass you mention?
Sophira
Given that uMatrix isn't being developed any more, I've been a bit wary about sharing explicit details. I can say that the bypass works on uMatrix 1.4.4 (the latest release) and that even if you've disabled JavaScript from running via uMatrix - whether via a blacklist or via a whitelist - using this bypass will allow JavaScript to run on the page according to your browser settings.
I haven't tested whether it allows the other elements that uMatrix can block - XHR, frames, etc - but I'm pretty sure that it does.
I've been holding onto this info since the GitHub repository has been archived and read-only for years, and I'm not sure of the best way to handle it given that it's not being developed any more. I've wanted to get this out there but I want to make sure that people are safe, especially now that MV2 is deprecated, so there may be even less chance of an update. This is kinda new territory for me.
SahAssar
> I've found a way in which uMatrix can be bypassed if a website were to specifically target it
Please do tell.
Sophira
I've been a bit wary of giving details due to it not getting updated. See my other comment: https://news.ycombinator.com/item?id=45085342
braiamp
I have NoScript set to block scripts by default. Some sites work better without them.
memcg
NoScript also allows you to select which scripts you want to allow. It's not all or none. You can also view the source before you decide to let it run.
userbinator
I very clearly remember, many years ago, a site (which was otherwise perfectly usable) nagging me to "enable JS for a better experience"; curious, I did and was immediately assaulted with all manner of hostile and irritating crap like popups, text selection hijacking, and even attempts to disable the right-click menu. Hurriedly disabled JS again to regain sanity. Nope. I'm never falling for that again... Of course the problem these days is with sites that don't work at all without JS even if they're just static content, and I suspect part of the reason is to force-feed you the crap along with the real content.
integralid
>and I suspect part of the reason is to force-feed you the crap along with the real content.
Insert the quote about malice and incompetence. Modern frontend frameworks like React make sure that your site won't work without JS at all, unless you intentionally put in some work for that 0.1% of internet users who browse with JS disabled.
userbinator
It's quite telling that even the mobile version of Chrome, well known for being the most user-hostile browser, has the option to whitelist or blacklist JS and various other features like location access.
Chrome didn't have anything other than a global JS on/off at first, so they clearly added this feature later.
mixmastamyk
You only have to whitelist your top sites once, not every day.
elcapitan
I've also found that since using NoScript that way, whitelisting only the few sites I actually use interactively, all the cookie-warning garbage and clicking away of subscribe dialogs is gone, so all in all I do less annoying manual interaction on the sites I visit.
1oooqooq
and it's trivial to do with uBlock.
it has both a global option to disable JS, and an option to set a keyboard shortcut to re-enable it as needed for each site.
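As a sketch, the per-site JS switch is stored as uBlock Origin's no-scripting dynamic rule, so the equivalent in the "My rules" pane looks roughly like this; the first line disables JavaScript everywhere, the second re-enables it for a trusted site (example.com is a placeholder):

```
no-scripting: * true
no-scripting: example.com false
```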
gdgghhhhh
Also consider putting Firefox itself into a jail. E.g. using bubblewrap on Linux: https://gist.github.com/richardweinberger/cae9edeafeec4cdf65...
amarder
This checklist is a work in progress, would love to hear your feedback.
Bender
Good work. There are some hardening options that you may be able to glean from ArkenFox [1] and Betterfox [2]. Another addon to consider listing is CSS Exfil protection [3a] CSS Exfil Test Site [3b].
[1] - https://github.com/arkenfox/user.js
[2] - https://github.com/yokoffing/Betterfox
[3a] - https://addons.mozilla.org/en-US/firefox/addon/css-exfil-pro...
[3b] - https://www.mike-gualtieri.com/css-exfil-vulnerability-teste...
amarder
Awesome, will check these out, thank you!
backscratches
Librefox is the most robust/maintained fork I've come across.
backscratches
Typo! sorry. Librewolf is what I meant.
frm88
Thank you, this helps a lot.
jvdvegt
Nice site, thanks!
arcfour
Personally I leave the anonymous daily usage ping enabled in the (perhaps naive) hope that my use of Firefox being counted might help keep it afloat/popular. I guess that's not really in the spirit of a privacy-focused hardening guide but it is something that some may wish to consider.
Some may argue that the data that is included is a bit much for a "daily usage ping," an assertion that I won't dispute—but I will say that I appreciate the fact that Firefox even provides this level of transparency in the first place:
https://dictionary.telemetry.mozilla.org/apps/firefox_deskto...
touristtam
NoScript to automatically disable JS on first load, something to deal with Cookies (like cookie auto delete) and making use of MultiAccount containers. (defo privacy badger installed as well).
usr1106
I have used Cookie Auto delete for years. Last time I checked development seemed to have stalled.
david_draco
I'm surprised Firefox Multi-Account Containers isn't mentioned. Seems ideal to me to keep Web Universes separate.
ris
Disable WebGL. Not in a funny javascripty extension, in about:config.
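In user.js form that's typically a single pref; webgl.disabled is the commonly documented master toggle, but verify against your release before relying on it:

```js
// Master WebGL toggle via about:config / user.js
// (commonly documented pref name; check your Firefox version).
user_pref("webgl.disabled", true);
```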
tremon
Do you have an authoritative link for how to do so? The wisdom of the web variously refers to webgl.disabled, webgl.disable-wgl, or webgl.enable-webgl2. I know I can simply disable them all, but I'd still like to know why there are three toggles and what each one covers.
trod1234
This is quite a rudimentary checklist, and it won't provide much in terms of privacy protections, but it will break a number of sites.
The current state of browser fingerprinting is off the rails: sites deny service if they don't get those fingerprints, and the browser's security/privacy protections have, to a lesser degree, been gradually degraded.
Stock Firefox will not be able to provide any sufficient guarantees. Some protections require patches to be compiled back in, because certain about:config options have been removed.
I highly suggest you review Arkenfox's work; most of the hardening features he recommends provide a better defense than nothing. He also regularly contributes to the Mullvad Browser, which implements most of his hardening and then some, with some differentiation from the Tor Browser but many of the same protections.
The TL;DR of the problem scope is that there are artifacts that must be randomized within a certain range. There are also artifacts that must be non-distinct, so as not to provide entropy for identification (system fonts and such that are shared among many people in a cohort).
JS and several other components, if active, will negate a lot of the defenses that have been developed to date.
Additionally, it seems that in some regional localities Eclipse attacks may be happening (multi-path transparent MITM), by terminating encryption early or through Raptor.
At a bare minimum, there seem to be some bad actors that have mixed themselves into the root PKI pool. I've seen validly issued Google Trust certs floating around that were not authorized by the owner of the SAN being visited; it was transparent and targeted to that blog, but it has also happened with vendors (providing VoIP-related telco services).
It seems some ISPs may be doing this to collect sensitive data for surveillance capitalism or other unknown malign purposes. In either case, TLS can't be trusted.
michaelt
> I've seen valid issued Google Trust certs floating around that were not authorized by the owner of the SAN being visited
Did you confirm with the owner that they were unauthorized?
And can you point to the certificates in the Certificate Transparency logs?
trod1234
> Did you confirm with the owner that they were unauthorized?
I confirmed with their support. I provided the certificate chain and SHA-256 fingerprint being served, and they said it didn't match and that they use a different provider for their certificates; which I suppose is GoDaddy, at least that's what shows up in the crt.sh logs.
I don't run nor have access to a CT log for auditing. I was told it was revoked though. If you want to look into it you can; I'm including the CRT chain below.
There have been a number of issues uncovered while investigating the silently failing calls, ranging from silent-fail denial of service to unauthorized password changes after the fact and, with login credentials, what seems to be some form of MITM translation; these are consistent across many devices when accessing the site or services.
The issues seem to clear up for about 1-2 weeks every month or so, starting on the 4th, and a new set of certs shows up every couple of months.
The translation thing is that voip.ms doesn't allow @ symbols in passwords. About 2-4 hours after a lost-password recovery, the password that was set stops working, with no change logged server-side. After that period, replacing the token I used instead of @ with an actual @ logs in without error from the edge, despite their password policy/validator silently failing and forbidding that character (a policy they have confirmed is still in effect). Craziness.
I can only conclude that this is some form of MITM. I've seen similar issues across other vendors as well, but they haven't noticed failures yet, or have been completely non-responsive (with no phone contact), so they haven't been looking into it too hard, if at all.
www.voip.ms
-----BEGIN CERTIFICATE-----
MIIDmjCCA0GgAwIBAgIRALnZP1MTVuRgEWRq2GuA7BkwCgYIKoZIzj0EAwIwOzELMAkGA1UEBhMCVVMxHjAcBgNVBAoTFUdvb2dsZSBUcnVzdCBTZXJ2aWNlczEMMAoGA1UEAxMDV0UxMB4XDTI1MDYwNjA2MzQxOFoXDTI1MDkwNDA3MzM1M1owEjEQMA4GA1UEAxMHdm9pcC5tczBZMBMGByqGSM49AgEGCCqGSM49AwEHA0IABHNo2vDB8rWItKKAgiIWPUU0T7upGdVUZE5uF24AjT9KmZhZBpdrXeOWJqWuA4jPWXBUzGrVzUGYsO6B/CvLkKqjggJNMIICSTAOBgNVHQ8BAf8EBAMCB4AwEwYDVR0lBAwwCgYIKwYBBQUHAwEwDAYDVR0TAQH/BAIwADAdBgNVHQ4EFgQUkFJpHoJsK+n+gV1HFtpEg27jxGAwHwYDVR0jBBgwFoAUkHeSNWfE/6jMqeZ72YB5e8yT+TgwXgYIKwYBBQUHAQEEUjBQMCcGCCsGAQUFBzABhhtodHRwOi8vby5wa2kuZ29vZy9zL3dlMS91ZGswJQYIKwYBBQUHMAKGGWh0dHA6Ly9pLnBraS5nb29nL3dlMS5jcnQwHQYDVR0RBBYwFIIHdm9pcC5tc4IJKi52b2lwLm1zMBMGA1UdIAQMMAowCAYGZ4EMAQIBMDYGA1UdHwQvMC0wK6ApoCeGJWh0dHA6Ly9jLnBraS5nb29nL3dlMS9fLTRpRndmQ2FjTS5jcmwwggEGBgorBgEEAdZ5AgQCBIH3BIH0APIAdwDd3Mo0ldfhFgXnlTL6x5/4PRxQ39sAOhQSdgosrLvIKgAAAZdEKXt9AAAEAwBIMEYCIQDjYC10JgSqWCbCE23l++70zgoHwTPUYsAf56DrZiWJdQIhANPwfZiTkV0N5eAVGYlRpPpQ88KovS80pPmThB8VHHzFAHcAfVkeEuF4KnscYWd8Xv340IdcFKBOlZ65Ay/ZDowuebgAAAGXRCl7agAABAMASDBGAiEAzfEhazBYmOhzSujGbLErjeTwKQvV3/ASvWENwXycXCoCIQDM+tYWt/xzqBcYd4Ivs2Pba/EIuBMhRY9Rq2CdntkqYDAKBggqhkjOPQQDAgNHADBEAiBzcp1G0vLRX+ZvWJFnRG83/pt+0fx4j1uXu66R4nbVyAIgekwYAEhhA7aJ19uykBfTG/wesrmcrkLxX6XjqEzE2L8=
-----END CERTIFICATE-----
-----BEGIN CERTIFICATE-----
MIICnzCCAiWgAwIBAgIQf/MZd5csIkp2FV0TttaF4zAKBggqhkjOPQQDAzBHMQswCQYDVQQGEwJVUzEiMCAGA1UEChMZR29vZ2xlIFRydXN0IFNlcnZpY2VzIExMQzEUMBIGA1UEAxMLR1RTIFJvb3QgUjQwHhcNMjMxMjEzMDkwMDAwWhcNMjkwMjIwMTQwMDAwWjA7MQswCQYDVQQGEwJVUzEeMBwGA1UEChMVR29vZ2xlIFRydXN0IFNlcnZpY2VzMQwwCgYDVQQDEwNXRTEwWTATBgcqhkjOPQIBBggqhkjOPQMBBwNCAARvzTr+Z1dHTCEDhUDCR127WEcPQMFcF4XGGTfn1XzthkubgdnXGhOlCgP4mMTG6J7/EFmPLCaY9eYmJbsPAvpWo4H+MIH7MA4GA1UdDwEB/wQEAwIBhjAdBgNVHSUEFjAUBggrBgEFBQcDAQYIKwYBBQUHAwIwEgYDVR0TAQH/BAgwBgEB/wIBADAdBgNVHQ4EFgQUkHeSNWfE/6jMqeZ72YB5e8yT+TgwHwYDVR0jBBgwFoAUgEzW63T/STaj1dj8tT7FavCUHYwwNAYIKwYBBQUHAQEEKDAmMCQGCCsGAQUFBzAChhhodHRwOi8vaS5wa2kuZ29vZy9yNC5jcnQwKwYDVR0fBCQwIjAgoB6gHIYaaHR0cDovL2MucGtpLmdvb2cvci9yNC5jcmwwEwYDVR0gBAwwCjAIBgZngQwBAgEwCgYIKoZIzj0EAwMDaAAwZQIxAOcCq1HW90OVznX+0RGU1cxAQXomvtgM8zItPZCuFQ8jSBJSjz5keROv9aYsAm5VsQIwJonMaAFi54mrfhfoFNZEfuNMSQ6/bIBiNLiyoX46FohQvKeIoJ99cx7sUkFN7uJW
-----END CERTIFICATE-----
-----BEGIN CERTIFICATE-----
MIICCTCCAY6gAwIBAgINAgPlwGjvYxqccpBQUjAKBggqhkjOPQQDAzBHMQswCQYDVQQGEwJVUzEiMCAGA1UEChMZR29vZ2xlIFRydXN0IFNlcnZpY2VzIExMQzEUMBIGA1UEAxMLR1RTIFJvb3QgUjQwHhcNMTYwNjIyMDAwMDAwWhcNMzYwNjIyMDAwMDAwWjBHMQswCQYDVQQGEwJVUzEiMCAGA1UEChMZR29vZ2xlIFRydXN0IFNlcnZpY2VzIExMQzEUMBIGA1UEAxMLR1RTIFJvb3QgUjQwdjAQBgcqhkjOPQIBBgUrgQQAIgNiAATzdHOnaItgrkO4NcWBMHtLSZ37wWHO5t5GvWvVYRg1rkDdc/eJkTBa6zzuhXyiQHY7qca4R9gq55KRanPpsXI5nymfopjTX15YhmUPoYRlBtHci8nHc8iMai/lxKvRHYqjQjBAMA4GA1UdDwEB/wQEAwIBhjAPBgNVHRMBAf8EBTADAQH/MB0GA1UdDgQWBBSATNbrdP9JNqPV2Py1PsVq8JQdjDAKBggqhkjOPQQDAwNpADBmAjEA6ED/g94D9J+uHXqnLrmvT/aDHQ4thQEd0dlq7A/Cr8deVl5c1RxYIigL9zC2L7F8AjEA8GE8p/SgguMh1YQdc4acLa/KNJvxn7kjNuK8YAOdgLOaVsjh4rsUecrNIdSUtUlD
-----END CERTIFICATE-----
SHA-256 Fingerprint:
FB:4E:10:D3:58:0A:01:1A:9E:82:92:5B:33:AE:1C:E3:6D:5C:B3:97:53:73:B4:1C:4A:7E:30:8B:49:44:BA:24
Support staff said they were investigating the issue, but it's been almost 90 days now without next steps, an explanation, or anything actionable. I've been getting stonewalled for quite a while now.
I've seen this enough times recently that TLS doesn't seem trustworthy anymore. It's quite maddening, too: at a fairly fundamental level of troubleshooting, what you see on one end isn't what is actually being hosted on the other.
SahAssar
The cert you mention is this one, right? https://crt.sh/?id=18844641499
Seems like they use cloudflare as their DNS provider, which uses Google as their cert provider and this has happened before with them. See for example https://news.ycombinator.com/item?id=40452307 where I got into the same discussion but where it was due to porkbun using cloudflare as their DNS backend.
I would not treat this as TLS being untrustworthy, I would treat it as cloudflare issuing certs for you even if you just want to use their DNS (and not their WAF or other products).
trod1234
If that is the one that matches what was posted, then yes. At a cursory glance those fingerprints match, so I'd say yes, that is one of the certificates we've narrowed the issues down to.
I would think that a large company like voip.ms would have their certificate provider documented and available to check when there is a significant issue, so when their customers report a problem and they say it isn't a match, that's exactly what they mean.
Also, the only indicator of any of these issues which prompted all this, with any real explanation, is the cert and by extension the secure tunnel, which cannot be trusted. The issues extend not just to this one vendor but to several others as well, across multiple devices and network connections. The translation issue appears visible only with this provider, though, due I suspect to their non-standard password policy, which appears contradictory in function at the edge.
Saying TLS is trustworthy, when things that should never happen under TLS guarantees are happening, with no viable alternative explanation for the issues, where they have been troubleshot over months at both ends, all the way down to the physical layer of the OSI model (at least at the edge)... that doesn't leave anyone with anywhere to go.
Still trust TLS? If there were a reasonable alternative explanation that ties together all the issues, both mentioned and unmentioned, I'd be the first to consider it.
Clearly there are objective issues where service cannot be relied upon for a business, let alone for anything less demanding. The issues are also not vendor specific and seem to be coupled loosely to geographical region. The only commonality are these Google Trust certificates.
Communications services fail silently across multiple providers: contact forms either fail to submit with weird HTTP error codes at large providers or report success only to get no response and leave no verifiable record of submission after the fact, support chats fail to load or load with a chatbot pretending to be a human and leave no record afterwards, emails disappear, and many other things that, taken in aggregate, effectively rely on only one thing in common.
When it's one thing that happens in isolation at a single vendor, sure, I'd be more receptive to it being something else on the vendor side, but when every single path fails regularly in the same chaotic way within narrow time horizons, there's a significant issue, and one must question not only the guarantees but the only common links.
Three or more path failures related to communication, within a short time horizon, all leading back to TLS guarantees, is an astronomically improbable coincidence; the Bayesian weight says something is silently happening over those links that shouldn't be.
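For what it's worth, the probability claim can be made concrete with a back-of-envelope calculation; the per-channel failure rate here is invented purely for illustration:

```shell
# If three independent channels each fail, say, 1% of the time in a
# given window, the chance of all three failing together by pure
# coincidence in that window is p^3:
awk 'BEGIN { p = 0.01; printf "%.0e\n", p^3 }'
```

Of course, the argument only holds if the failures really are independent; shared infrastructure (the same CDN, the same DNS provider) is the usual mundane explanation for correlated failures.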
ranger_danger
> JS, and several other components, if its active will negate a lot of the defenses that have been developed to-date.
I thought if you disabled JS, then that would greatly narrow down which user on the internet you are, since very few people (in comparison to everyone else in the world) actually do this.
> not authorized by the owner of the SAN being visited
Source?
> TLS can't be trusted
Do you have more info on this? Why are more people not worried about it?
trod1234
> I thought if you disabled JS, then that would greatly narrow down which user on the internet you are...
It is a fundamentally cursed problem that has a lot of nuance.
You have buckets of people, and the entropy, or difference, between your collected artifacts and others' must be sufficient to uniquely identify a single person; that is the point of fingerprinting. Your natural defense is not sticking out of that group/crowd, so that others in the group carry the same range of fingerprints.
At the same time, if you homogenize the artifacts to limit it down to a single fingerprint the sites will simply deny access.
Disabling JS altogether doesn't identify you beyond placing you in the overall group that has it disabled; the trade-off is that all the entropy JS would normally collect cannot be collected. So while they cannot identify you uniquely, they can identify the group and deny it access, and that is the fundamental weakness of binary switches. It's a constant cat-and-mouse game.
> not authorized by the owner of the SAN being visited
> Source?
Firsthand experience with a large VoIP provider where communications would fail intermittently but in targeted ways that avoid common test failures. Call tests would intermittently but routinely fail in the silent-fail domain of interrupt-driven calling (where you wouldn't know a call was inbound), and the failures would occur only in that domain. The issues were narrowed down to a mismatch in certificates through a lengthy support correspondence where the hosted certificate and what was being served at the edge were different. The artifacts were compared manually through correspondence.
The certificate was revoked within 48h once the vendor reached out to Google, but we've seen it happen twice now. The standards in general use have no means aside from revocation to handle bad acting at the root-PKI level. Chain-of-trust issues like this have been known about for over two decades in the respective fields.
> Do you have any more info on this? Why are more people not worried about it?
On the specifics? The Princeton Raptor attack paper (2015) covers the details. Early termination of encryption and traffic analysis are pretty bad.
Why aren't more people worried? I suppose it's because most of the security industry (not all) has accepted the fact that device security is porous, and there isn't really much you can do to hold the manufacturer responsible or to force changes. Surveillance capitalism is also incentivized through the profit motive to impose a state of complete and total dependency/compromise.
The state of security today, with near-routine data breaches every quarter, is a direct consequence of the lack of liability, accountability, and regulation; and honestly, the broader media have stopped listening to many of the experts. They don't want to know how bad, bad is.
The breadth and depth of scale is enough to drive one a bit crazy when looking at the unvarnished reality; it's such a complete departure from what is told that it becomes disbelief. People are largely powerless to mitigate the issues, as most of the market is silently nationalized in one form or another. It's no longer about the features people need but about coercing the market, where the only choice is what gets shoveled.
Do you suppose the average middle-class worker has the headspace to worry about their county tracking their minute movements through suites of radio sensors (TPMS/OBD-II), or someone hacking into their car through the telematics unit while they're driving and disabling the brakes, or inducing race conditions in safety-critical systems?
While we may not care domestically about many of these things when we are told, given our stance on free speech, if you're a critic of China, they might care, and no one's stopping them, because the security deficits are imposed almost as much through inaction as through action.
Many of these uses are also not commonly disclosed, and manipulated rhetoric is jamming communication channels.
Cable modem security, for instance, mandates backward compatibility with a 48-bit RSA key (see the Cyphercon talk below); and while there are elevated security modes, the modem boots in the legacy mode and pulls its config down remotely, making it vulnerable to Eclipse.
Money-printing is largely what drives these incentives towards a dysfunctional market.
https://cyphercon.com/portfolio/exposing-the-threat-uncoveri...
temp0826
I just want something (config or extension or instructions or whatever) to give me the best (rather, most common/average) fingerprint possible according to that EFF tool. Does that exist?
olivergregory
That’s the extension Privacy Badger.
HelloUsername
> Privacy Badger
UBO is enough; https://github.com/arkenfox/user.js/wiki/4.1-Extensions#-don...
henrixd
You have to choose one of two strategies: either you go the Tor Browser route (which also covers Mullvad Browser) and try to make your browser indistinguishable from others, or you randomize values to make stable fingerprinting impossible.
When trying to be similar to everyone else, even small changes to the browser, like changing the window size, can make you easily identifiable. Randomizing allows you to modify your browser. None of the fingerprinting protections matter if you use the same browser and session to log in to sites.
I use multiple browsers. One is for logging in to sites, and Tor Browser is for most of my browsing.
This is easily the best fingerprinting extension that I have found so far: https://jshelter.org/
temp0826
I think you're right with having a two-browser setup and I actually want to give it a spin. I envision I'll need to change some habits but it really does seem like the cleanest way to go about it.
ranger_danger
IMO the EFF tool is a bad test because it only compares you against other people that have used the tool.
A better test would be CreepJS in my opinion: https://abrahamjuliot.github.io/creepjs/
I'm not aware of any FOSS browser setup that can actually result in a random FP ID shown in creepjs on every page load (please prove me wrong).
efilife
This probably won't be perfect on the EFF tool but try arkenfox
temp0826
I think it just makes me a little sad that despite the effort I've put in, that tool (called Cover Your Tracks btw; or other ones like amiunique) still reports that I am indeed unique.
navigate8310
Or use LibreWolf and call it a day.
Dwedit
LibreWolf randomizes your time zone data on every page load, which screws with websites. It's on by default but can be turned off.
mixmastamyk
It’s not directly in popular distributions unfortunately.
backscratches
True, but the next best thing is arkenfox, which is even more of a pain. LibreWolf makes a lot of the flags toggleable/visible in settings, which is convenient too.
pndy
It's been available as a flatpak for a while, if that changes anything.
ris
The paradox being that everything you customize about your browser config becomes another thing that can potentially be fingerprinted and makes you stand out as one of the 1% who have ever looked in about:config.
styanax
That's a common thought, but it depends on what you touch. I have hundreds of user.js customizations related to local browser behaviour (e.g. nulling out a lot of upstream URLs; I caught FF making DNS queries to services I had disabled), and https://coveryourtracks.eff.org/ reports I have extremely strong anti-fingerprinting. The "failures" are not related to Firefox.
Reading the details of the results, my unique values come from factors which are hard to address: I have an Arch Linux user-agent (small population) and Linux fonts installed (we fail the span-font test easily compared to Windows or macOS); those are the two huge outliers. These two are my heavily identifying traits; the rest are a wash of normality ("1 in 3").
The fonts one is funny: the span-font metric for my system is 16.82 of 115876.67, showing just how easily you can pick a Linux user out of the results using fonts alone. I have "the usual" font packages installed, nothing too fancy, just enough to see CJK/UTF-8 around the web like everyone else. (For completeness, I do remap 2 or 3 esoteric fonts on my side due to a site using them.)
Side note: I have WebGL disabled in user.js; the site reports I'm 1 of 85 statistically, this being the third-largest outlier and the only other one outside normality.
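For readers unfamiliar with user.js: it is a file in the Firefox profile directory whose prefs override about:config on every start. A few lines of the kind discussed above; pref names are current in recent Firefox releases, but verify them in about:config before relying on them:

```
// user.js (in the Firefox profile directory)
user_pref("webgl.disabled", true);               // drop the WebGL fingerprinting surface
user_pref("privacy.resistFingerprinting", true); // Tor-uplifted anti-fingerprinting mode
user_pref("network.trr.mode", 5);                // hard-disable built-in DNS-over-HTTPS
```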
integralid
>factors which are hard to address; I have an Arch Linux user-agent
Is this hard to address? Sounds like an easy thing to fix.
>These two are my heavily identified traits, the rest are a wash or normality ("1 in 3").
With enough 1-in-3s you can still be unique, sadly. Your fingerprint is the AND of every indicator.
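The "AND of every indicator" point is easy to quantify: each independent 1-in-3 trait contributes log2(3) ≈ 1.58 bits, and singling out one person among roughly 8 billion takes about 33 bits:

```shell
awk 'BEGIN {
  bits_per_trait = log(3) / log(2)    # ~1.58 bits per 1-in-3 trait
  bits_needed    = log(8e9) / log(2)  # ~32.9 bits to isolate one person
  printf "%.2f bits/trait, %d traits needed\n",
         bits_per_trait, int(bits_needed / bits_per_trait) + 1
}'
```

So around twenty-one independent 1-in-3 indicators already suffice, assuming independence, which trackers approximate by collecting many weakly correlated signals.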
cxplay
Firefox doesn't "help your privacy" or keep its promises just because it's developed by Mozilla, and Chromium doesn't become worse than Firefox just because it's developed by Google. As others have said, this article feels like it was written by an LLM.
merek
Will enabling "HTTPS-Only Mode" block http://localhost? If so, it would interfere with web development.
sltkr
No, it doesn't block localhost.
Also, you can add exceptions, so if you have e.g. an HTTP-only server on your local network, you can whitelist it manually.
Refreeze5224
No, you can always continue on to non-HTTPS pages.
qingcharles
Can you create some certs for yourself?
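Yes; for local development you can mint your own. A sketch using plain openssl (flags assume OpenSSL 1.1.1 or newer); Firefox will warn once about a self-signed cert and let you add a permanent exception, and tools like mkcert automate the trusted-local-CA variant:

```shell
# Self-signed cert for https://localhost, valid for one year:
openssl req -x509 -newkey rsa:2048 -nodes -days 365 \
  -subj "/CN=localhost" \
  -addext "subjectAltName=DNS:localhost,IP:127.0.0.1" \
  -keyout localhost-key.pem -out localhost-cert.pem 2>/dev/null
# Sanity-check what was generated:
openssl x509 -in localhost-cert.pem -noout -subject
```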
Dwedit
How would you know if DuckDuckGo actually respected privacy? It's a black box.
mixmastamyk
DDG reports all clicks to links.duckduckgo.com and improving.duckduckgo.com, etc. Which AdGuard seems to block, and maybe one of my browser settings/extensions as well.
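Those hostnames can also be blocked directly in uBlock Origin's "My filters" pane. A sketch; note that blocking links.duckduckgo.com may break clicking search results in configurations where DDG still routes clicks through redirects:

```
! block the DDG click/telemetry endpoints named above
||improving.duckduckgo.com^
||links.duckduckgo.com^
```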
Dwedit
Allegedly they do that for "referer protection" reasons to hide the search term that was used to get to the site.
wizzwizz4
This is separate from referrer protection, which is only active as needed; in most modern browsers, DuckDuckGo's referrer protection never kicks in, because they support rel="noopener noreferrer".
patrakov
Important: this is a checklist for privacy, not for general security.
clircle
I just install librewolf.
50208
Thanks for this ... great start. Mozilla Firefox COULD be an even more powerful force for good. Stop focusing on BS like VPNs, AI, etc ... focus on a great browser, security, and privacy. There is a possible niche for a centrally managed, security-focused browser for companies ... like Island Browser ... as an option.
bmacho
Basic things that browsers lack:
- hooks between network steps
- hooks between steps while rendering/interacting with a website
Things that I want to do but can't: - catch a request and modify it; e.g. when a webpage tells my browser to visit ajax.googleapis.com/jquery.js, my browser SHOULD NOT DO IT. Seriously, just don't start running shit on my computer when I click something. No one wants that, apart from Google. Not the users. I should be able to modify that request and serve jquery from somewhere else.
- stop the browser's javascript execution
- run my own javascript (these two are currently unavailable together, if you don't allow javascript on a webpage, then you can't run your own) (or modify HTML/DOM in some other language)
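The first wish is at least partly achievable today: uBlock Origin's webRequest-based filters can block or neuter a script request before it runs, and LocalCDN serves bundled local copies of common libraries instead of fetching them from Google. A sketch of static filters for the jquery example; the noop.js redirect resource name is uBO's, so check your version's resource list:

```
! refuse Google-hosted jQuery outright:
||ajax.googleapis.com^$script
! ...or answer the request with an empty stub instead:
||ajax.googleapis.com/ajax/libs/jquery/*$script,redirect=noop.js
```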
I don't think Firefox is worth supporting, I believe it is a Trojan Horse of Google (or at least a Useful Idiot), and its existence is the main reason we have exactly 0 browsers (open source or proprietary) right now. It should die, so something else might flourish.
captainepoch
If you want a hardened version of Firefox, download LibreWolf.
procaryote
Yeah, LibreWolf does a lot of the article's suggested things by default and is less likely to introduce new misfeatures to opt out of.
BaudouinVH
or Waterfox
BaudouinVH
Privacy Possum is better than Privacy Badger imho
barnabee
It appears not to have been updated since 2019?
BaudouinVH
Oops - I had no idea. Thanks for noticing. I'm back to the badger. :)
dotcoma
Shouldn’t Firefox come hardened out of the box ?
jdlshore
There are tradeoffs between privacy and convenience. Mozilla makes a particular set of tradeoffs, based on their judgment of what the average user will put up with; checklists like this allow you to make more aggressive ones.
amarder
Yes, but a lot of Mozilla's money comes from Google. https://www.pcworld.com/article/2772034/googles-search-monop...
50208
Isn't that just to provide the search engine default? Which is easily changed?
50208
That would be a great move by Mozilla. Have a "secure" version: Firefox, Firefox ESR, and Firefox SECURE. Or maybe just provide a switch to turn it on.
542458
90% of the stuff in the OP will break certain sites. The problem is that non-technical users will think "oh, privacy, that's good" (which it is, don't get me wrong), click the "max privacy" option, but then be unable to fix things when they don't work, and switch back to Chrome.
1oooqooq
Remember that "firefox -p" opens the profile manager, so you can have one profile without the last two items on that list, just for when you need one or two sites with broken login code that requires third-party cookies (it's always for malicious reasons rather than incompetence, but if you have to log in, you have to log in).
pndy
Mozilla introduced a new profile manager for Firefox somewhere around May. It uses a new storage format and ignores existing "old" profiles, except for the default one. Data remains untouched; profiles created in the past are still accessible via about:profiles, and if you don't want to use the new profile manager, set the browser.profiles.enabled entry to false.
From what I've seen, people using the popular customized Firefox variants like Floorp and LibreWolf were surprised by this and not fond of the change.
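The opt-out mentioned above is a single pref, set in about:config or persistently via user.js:

```
// revert to the classic profile behaviour (pref name from the comment above)
user_pref("browser.profiles.enabled", false);
```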
panarky
Hypersegregate browsing with profiles.
One profile for banks, a different profile for Amazon, a third profile for Google sites, a fourth for news sites I log into, a fifth for news sites I don't log into, a sixth that automatically forgets everything on exit for sites that UBO breaks.
Then delete all data on each profile periodically, weekly for news sites, monthly for Amazon and banking sites.
It's a giant pain in the ass juggling all these profiles. Seems like there should be a browser that automatically and transparently isolates every site in its own profile.
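The manual juggling can at least be scripted; the profile names below are examples, created beforehand in the profile manager. (The Multi-Account Containers and Temporary Containers extensions approximate the automatic per-site isolation described here, within a single profile.)

```
# one isolated instance per context
firefox -P banking --no-remote
firefox -P news --no-remote
firefox -p            # no name: opens the profile chooser
```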
Pooge
Good luck remembering to do that every time Firefox updates. Hopefully, that's the only time it changes settings, right? Right...?
Go for a Firefox fork and jump ship to Ladybird once it comes out. Forks wouldn't exist if it were trivial to revert Mozilla's mistakes.
mixmastamyk
There are things I also do like removing sponsored links on the about page and url bar. Also disable type-ahead to search engine.
My understanding is that Privacy Badger no longer learns by default. I never wanted that; just block known things, like search-engine click hijacks.
I'm not sure what to do about the User-Agent header. Changing or simplifying it tends to break sites. Also, I'd like to promote Linux there, but that's at odds with privacy.
mixmastamyk
Sorry, not the about page, the newtab page.
piskov
After the shit Mozilla pulled with ad/tracking this summer, the first step for improved privacy should be to delete firefox and switch to brave / what have you.
creesch
> switch to brave
Fun suggestion: switch to a browser backed by a company that has pulled a lot of shady stuff related to ads and tracking, a company where privacy is more marketing than a core value.
Edit: Since people are going to ask anyway, here is an article that covers a lot of the shady stuff brave pulled https://thelibre.news/no-really-dont-use-brave/
If you are one of those folks who don't care about the political arguments, feel free to skip paragraphs one and two. Paragraphs three through ten cover actual shady stuff done by Brave the company itself.
There is one more thing I can add to the list, though it wasn't as widely publicized. At some point the team behind Brave decided to implement browser extension support from scratch and only support specific extensions. Which sounds okay in theory until you realize how they did it. Without involving the extension creator, they would fork a version of the extension and bake it into Brave. They did so without informing the extension creator, while users would still go to the extension creator for support, who couldn't fix a thing.
Every time one of these things comes up, the Brave team is either irked (but changes it anyway) or goes "oh yeah, we'll remove it in the future." To me this indicates a company culture with no thinking ahead about the impact of features, or one where they simply don't care as long as they aren't called out.
This consistent pattern over a period of years has, to me anyway, shown that issues such as privacy or even being user centered are not a core part of their thinking but merely a marketing gimmick.
And to be ahead of the curve on some other things I have heard people say about this: just because Mozilla sucks doesn't mean alternatives can't be worse.
antonok
Are you referring to the Manifest V2 extensions supported by Brave? The original extension creators were made fully aware of those plans ahead of time and have been in contact with Brave since then, e.g.:
https://github.com/hackademix/noscript/issues/359 https://github.com/uBlockOrigin/uBlock-issues/discussions/29... https://github.com/brave/brave-browser/issues/41173#issuecom...
I'm not sure how you can interpret forking open-source codebases as "shady" behavior (it's one of the most important reasons to use open source in the first place), but in this case there is high demand for said extensions and Brave has provided the only way to keep using them on a Chromium rendering engine.
(I am one of the devs who worked on the spec for this feature)
piskov
Could you actually cite something from Brave's privacy policy (as Firefox has now) to corroborate these claims?
creesch
See my updated comment; it contains all the details you should need. Unfortunately nothing from their privacy policy; I personally feel that the actions taken by Brave speak louder than whatever is in their written policy.
frm88
Thank you for that article. I've considered Nobara 42 for my new PC but it comes with Brave as the only browser capable of VSS and now I'm worried.
backscratches
A fork of Firefox like LibreWolf has even better incentives, I think.
positron26
My entire feeling about privacy is that, while surveillance economy tends to amplify the worst parts, ultimately, Richard Thieme's presentation is right: https://www.youtube.com/watch?v=atDgnkvzD8I
Thought experiment: in 100 years or even ten, can you imagine that there will not be tiny little camera robots that can get into the home of every person alive? Wouldn't every single living person be prone to having nude and unflattering, private moments leaked all over the internet?
Socially, if privacy is a construct, then so is the fallout we expect others and ourselves to feel when privacy is violated. To some extent, not all, this is self-inflicted Victorian thinking. To the extent that it's true, part of the answer is, in the words of the brave (lol) Michael Cohen, "So what?" Really, so what? I hope we can get to that kind of reaction to adults having their privacy upended because it just takes so much of the bite out of the problem, the shame that relatively innocent people would experience for something completely out of their control.
As far as the getting it back under control thing, we may also be coming to a point that more technologies are so dangerous or impactful that there becomes a need for more strict control so that powerful tech like miniaturization produces paper trails and the use of such technology comes with an implicit requirement for openness. I don't really care that people can use miniaturization, but I care if they can anonymize it to the extent that we create a lawless society with no remaining means of accountability.
What *will* Russia and North Korea do when it becomes plausible to unleash little robot assassins either in small numbers to target individuals or mass numbers to carry out what is essentially nuclear scale death without nuclear scale fallout and destruction? It is plausible that this is a new facet of WMDs and MAD-based deterrence.
Privacy, robots, and the inevitable slide into world war 3.
henrixd
[dead]
This is kind of a stupid ChatGPT article.
No, this will not effectively help to reduce the fingerprint of your Browser.
A LOT more tracking services are integrated into the Firefox browser in various places (like the New Tab page, Sync, Pocket, Shavar, Google Safe Browsing, OCSP, etc.).
I wrote a more detailed article about this [1], and got to "as good as possible" as a result.
But yeah, please start using a host firewall that can block on a per-domain, per-port, and per-process basis (like Little Snitch or OpenSnitch) to validate your assumptions. UIs will always lie to you, including Firefox's.
[1] https://cookie.engineer/weblog/articles/firefox-privacy-guid...