
Can someone paste their results (or at least bits of fingerprinting entropy) from https://panopticlick.eff.org with the latest Firefox?

With the fancy new anti-fingerprinting Safari on macOS Mojave I get just over 14.5 bits of entropy with the most entropic source being my canvas fingerprint (1 in 600).

With Safari on iOS I get 11.71 bits of entropy, with the most entropic value being my screen size and color depth.



I think it's funny that panopticlick gives me a little red X for not allowing trackers from companies that have "promised" not to track me. I have no incentive to do so, as I do not get any sort of compensation if they are found to be in violation of those terms.


Just sticking this comment here since I think most people would like to see how the site works. People are interpreting the numbers incorrectly, as I also did at first even though it says at the top exactly how they're measuring these numbers.

Your entropy is determined exclusively by the people that have used the site in the past 45 days. For now that number is about 204k. So for instance, if you see something has an entropy of 9.08 (as my user agent does), you'd also see that it says 1 in 542.15 browsers have this value. 2^9.08 ~= 542.15. The ~ is there only because the thousandths and onward digits are not shown; my exact entropy would be about 9.08254825596. All that means is that of the ~204k people that have used the site in the past 45 days, about 377 had the same user agent.
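To make that arithmetic concrete, here's a sketch of the calculation (the 204k and 377 figures are just the example numbers from above):

```javascript
// Entropy of a fingerprint attribute, given how many of the site's
// recent visitors share your value: log2(total / matching).
function attributeEntropy(totalBrowsers, matchingBrowsers) {
  return Math.log2(totalBrowsers / matchingBrowsers);
}

// "1 in 542.15 browsers" out of ~204k visitors is about 377 matches,
// which works out to roughly 9.08 bits.
const bits = attributeEntropy(204000, 377);
console.log(bits.toFixed(2)); // "9.08"
```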

The problem with this is that the people using this site are going to be a heavily biased sample. And so by tuning to reduce your entropy, you are not actually reducing your trackability but instead making yourself look more like the subset of people that are actively using this site. And this becomes an even bigger problem since I do imagine this site is actively shared on more technically oriented sites, such as this one. But the settings of technically oriented users are often going to vary somewhat significantly from the settings of the other 99% of users.

The point of this is that by working to reduce your entropy on this site you may, ironically, end up making yourself more trackable. So the numbers should be taken not as a measurement of trackability, but rather as an interesting insight into how your browser and settings compare with those of the site's other users.

---

Also, 100% agreed on the silliness of them marking you down for not allowing cookies marked Do Not Track friendly. Until such things are enforced, in code and ubiquitously, they're meaningless unenforceable promises that rely on tracking and advertising corporations never lying.


17.62 bits on firefox, 11.0 on Tor, 17.63 on chrome.

On firefox, the big contributors are HTTP headers (my native language is announced), hash of WebGl fingerprint and time zone.

On Tor big contributors are hash of webGL fingerprint, screen size.

On chrome, they are system fonts, hash of canvas fingerprint, user agent, and time zone.

I am not too concerned about the fingerprinting in firefox since I have strict blocking on, ublock origin, and separate containers for facebook and google. Based on the small amount of data facebook has on me, all the blocking is working pretty well.


Similar results for me. Does anyone know if it's possible to turn off WebGL, and if so, how? AFAIK I never use it for anything and I'd rather have increased anonymity. (Assuming disabling it prevents it from being used for fingerprinting.)

Edit: Answering my own question. In `about:config`, change the `webgl.disabled` preference from `false` to `true`. This reduced the "bits of identifying information" from WebGL from 11.26 to 2.56.

Edit 2: Apparently the CanvasBlocker add-on is a better solution as it randomizes the data used for fingerprinting on each read, and works for several exploitable APIs, not just WebGL. https://addons.mozilla.org/en-US/firefox/addon/canvasblocker...


CanvasBlocker actually increases your trackability, because the consistent factor is now that you have a changing canvas fingerprint (which almost no one does).

This is why Safari tries to give a universal canvas fingerprint so you can "blend in" with other users.


I agree that a universal canvas fingerprint is better in principle, but practically who is going to write a script to search for all visitors who only differ by their canvas fingerprint and then identify them as one browser because the fingerprints are non-standard?


Practically, it requires little more work than creating the canvas fingerprinting framework itself! If someone puts in the effort to write a framework that tracks you via canvas fingerprints, it's little extra work to add a step that performs a simple diff to find people trying to evade it.
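As a hedged sketch of what that diff might look like (the field names here are hypothetical), a tracker could link visits whose stable attributes match exactly while the canvas hash keeps changing:

```javascript
// Sketch: a visitor whose canvas hash differs on every visit while
// everything else is identical stands out, precisely because almost
// nobody randomizes their canvas. Field names are made up for the example.
function likelySameBrowser(visitA, visitB) {
  const stable = ['userAgent', 'screen', 'timezone', 'fonts'];
  const stableMatch = stable.every(k => visitA[k] === visitB[k]);
  const canvasChanged = visitA.canvasHash !== visitB.canvasHash;
  return stableMatch && canvasChanged;
}
```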


Panopticlick's numbers are extremely confusing and borderline useless.

On my initial run, I got an overall entropy of 17.63. My two biggest identifiers were screen resolution (1000x595x24 which was approx 1/22000 browsers) and webgl hash (approx 1/3800 browsers). I fixed screen resolution to 1000x600x24 (approx 1/85 browsers) and disabled webgl hashing (approx 1/6 browsers) and the overall entropy did not change one iota, despite also closing browser, flushing cache and cookies, etc. I gave it another run with a deliberately weird resolution (1420x701 which was something like 1/105000 browsers) and once again, the overall entropy was exactly 17.63. So based on my experiment, it seems that screen resolution and webgl hash have no effect whatsoever on [Panopticlick's] overall entropy score.


An update on last night's experiment, if anyone cares. The next largest identifier was system fonts (approx 1/1300 browsers). I set `browser.display.use_document_fonts=0` which hid the system fonts (now the same as approx 1/10 browsers) and my overall entropy dropped to just below 11 bits. At this point, none of the metrics were less common than 1/10 browsers, so I figured I wouldn't be able to do better than that.

As a side note, I ended up re-enabling system fonts because disabling them broke a large percentage of web sites' CSS.


> Based on the small amount of data facebook has on me ...

How did you get all the data that Facebook has on you?


Settings -> In list of links on the right, Your Facebook Information -> Access Your Information -> At the bottom, Information About You -> Ads


The numbers don't make much sense to me. On FF I get 14.05 with NoScript active. Curiously the headers increase from 1.68 bits to 3.47 when NoScript is running.


NoScript is likely a valuable fingerprinting indicator, given that the vast majority of browsers have JavaScript enabled but you don’t.


Of course NoScript increases the entropy. Most people don't run NoScript, so you are more identifiable when you do.


It increases the entropy for the JavaScript tests as would be expected. It shouldn't affect the HTTP_ACCEPT header.


I'm curious about the difference between things like NoScript and native Brave script blocking.

In particular I was going to make a snarky comment that the site seems to, appropriately, not work when script blocking is enabled on Brave. I do get the site to do the refresh business a couple of times, but no results are ever displayed.


> On Tor big contributors are hash of webGL fingerprint, screen size.

Doesn't tor randomise the window size on startup? Though I guess it chooses some sensible size for your screen which is then leaking info about your screen size (in a pretty indirect way).


Not quite correct. It automatically picks the browser window size based on the monitor it's being displayed on, in some multiple of 200x100. There is no randomization on every run.

https://tor.stackexchange.com/questions/15705/why-does-tor-b...
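Based on that description, the sizing logic might look something like this (the 200x100 granularity and 1000px cap are taken from the linked answer, not from Tor Browser's source):

```javascript
// Round the usable screen area down to multiples of 200x100, capped at
// 1000x1000. Treat the exact numbers as assumptions from the linked
// explanation, not as Tor Browser's actual implementation.
function torLikeWindowSize(screenW, screenH) {
  const w = Math.min(1000, Math.floor(screenW / 200) * 200);
  const h = Math.min(1000, Math.floor(screenH / 100) * 100);
  return [w, h];
}

console.log(torLikeWindowSize(1366, 768)); // [1000, 700]
```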


That's false. Tor Browser actually advises the user to keep the window at the default size to avoid it being used as a fingerprinting vector.


I wonder if these fingerprint checks look for the more stealthy and sinister approaches, like localhost port scanning [1] and specific CSS selector behavior...?

[1] https://twitter.com/davywtf/status/1132026581038190592


There's so much fingerprinting that can't really be disabled. Think about it:

Performance

- Single-threaded CPU performance

- Multi-threaded CPU performance

- WebGL performance

- Video performance

- Network performance (how long does it take to transfer data to various locations, what's the lag, is the lag consistent, etc.)

- (Maybe) Time it takes to execute certain JavaScript functions

User behavior

- How does the user use their mouse when navigating web pages?

- Not at all?

- Jerky movements?

- Smooth movements?

- If the user uses the keyboard, do they appear to be advanced keyboard users, do they have an IME, etc.

- Does the user press X buttons on tiny annoying popups that wouldn't interfere with the page's browsing experience?

- Does the user appear to block access to certain resources? (ad blocker)

- Does the user's workplace/country/etc. appear to block anything?


True, in the current situation we can only "limit" fingerprinting. This is a result of the characteristics of the sandbox we use for the Web. Remove JavaScript, and most of these problems go away.

This is why I stay attached to making simple HTTP apps that don't require JS, but this is clearly not the direction the web is going at this time.


Please note: the fingerprinting protection in this blog post is different from the resistFingerprinting about:config pref which would affect your entropy bits on panopticlick.


Interestingly enough, uBlock Origin actually stops that site from working, seems to break the fingerprinting step. If i disable uBlock, I get 16.63 bits of identifying information. Likewise the canvas fingerprint is the biggest, in my case 1 in 101154.


Can anyone explain why canvas fingerprinting is so difficult to eradicate without breaking canvas?


You can draw with different fonts and background colors, then grab the raw pixel values and hash them. The hash will be different depending on the versions of fonts installed, the OS, the GPU, the browser's text rendering algorithms, and the subpixel order/orientation of the display.

See https://en.wikipedia.org/wiki/Canvas_fingerprinting for more info.
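A rough sketch of the hashing step (in a real browser the bytes would come from `ctx.getImageData(...).data`; FNV-1a here is just one simple hash choice, not what any particular tracker uses):

```javascript
// FNV-1a over a byte array: tiny differences in anti-aliasing or
// subpixel rendering change a few bytes, which changes the whole hash.
function fnv1a(bytes) {
  let h = 0x811c9dc5;
  for (const b of bytes) {
    h ^= b;
    h = Math.imul(h, 0x01000193) >>> 0;
  }
  return h.toString(16);
}

// In a browser you would draw text and read the pixels back, e.g.:
//   const ctx = canvas.getContext('2d');
//   ctx.font = '16px Arial';
//   ctx.fillText('fingerprint me', 2, 18);
//   const pixels = ctx.getImageData(0, 0, 200, 24).data;
//   const fingerprint = fnv1a(pixels);
```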


They should have standardized the font rendering for canvas. For those rare graphs using canvas I can live without ClearType and only normal anti-aliasing, and with modern high-DPI displays you hardly even need that in the first place anyway.

I mean, one of the major reasons for using canvas over DOM is to get pixel-perfect placement of things, like text connecting to an arrow. If your fonts suddenly change size that won't work anymore. SVG has the same problem: on some computers with slightly larger letter spacing a line might become too long, so it wraps and becomes two lines, totally spoiling the desired diagram.


Thanks. I'd read the Wikipedia page, I'm just not clear why this process is allowed (or more importantly, why it can't be removed).

Is there a legitimate use case for being able to read back pixels?


> I'm just not clear why this process is allowed (or more importantly, why it can't be removed).

Because we don't know how to make CPUs do pixel perfect images every single time. (I wrote a little more above)


Literally any kind of image or photo manipulation, from an MS Paint-like webapp to Instagram-like photo filters.


There are many legitimate uses. Vendors have experimented with making canvas readback opt-in (with a popup) but I don't know if it'll ever ship because it simply breaks too many websites. Sometimes it's used at page load to generate variants of a single image to reduce file sizes, or used by games to prepare image assets before they start up.


A lot actually, for example you can do easy image resizing (like taking a selected image and resizing before uploading to the server.)


tldr: drawings aren't pixel perfect.

Longer: this is actually a viable way to do many types of fingerprinting, not just canvas. I'll give an example. In a graphics class I took our professor gave us output images to compare to. Two people with the same model computer, same specs, would frequently have a pixel or two different from one another. Change the specs and you're easily a dozen off. Worse than that, the pixels that are off from the original image can be different pixels. This comes down to the silicon lottery. So if you can think of anything that you can access where you can get the user's computer to do some sort of floating point calculation, you can probably get a fingerprint out of that.

So to fix this problem, you'd have to figure out not just how to make pixel-perfect images, but how to get two CPUs of different types (which even the same type doesn't currently do) to always calculate the same answer to the same precision, every time. There are tricks that can be done, like rounding, but it gets hairy really fast and becomes impractical. But if you do know how to solve the problem, I'm sure people would really appreciate the answer.
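To be fair, plain IEEE-754 addition and multiplication are specified bit-exactly; the cross-machine variation comes from things like transcendental functions, GPU rasterization, and fused multiply-adds. But the underlying precision issue is easy to see even in a single runtime:

```javascript
// IEEE-754 rounding means "obvious" identities fail. Once results also
// depend on hardware-specific math (GPU shaders, libm variants behind
// Math.sin, fused multiply-add), the low bits can differ per machine.
console.log(0.1 + 0.2 === 0.3);           // false
console.log((0.1 + 0.2).toPrecision(17)); // "0.30000000000000004"
```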


Random, probably uninformed thought: I wonder if the solution could be LESS determinism rather than more. If you could make it so the same hardware rendered pixels in a slightly different (random) way each time, it would no longer be possible to determine if you were looking at the same machine.


That's an interesting idea. It might be a good way to circumvent this problem, but there are drawbacks. Maybe there are a lot of cases where we could get away with FP16 accuracy even when the API promises FP32 (iterative methods can sometimes do this, especially in ML), but there are plenty of times where FP32 matters. So it is highly dependent on those issues and where you can get away with them. And further, how do you enforce that? I think it is interesting, though.


So just go with the uniform return value someone wrote Apple is going for with Safari: Return a pitch-black rectangle every time.


I have ublock origin. The website works fine. I have all the default filters on.


This website works even with uBlock Origin with the disable-JavaScript option on. 9.81 bits for me.


Firefox's fingerprinting protection has nothing to do with real fingerprinting protection, and does not affect Panopticlick results in any way.

It just blocks a couple of known scripts based on the Disconnect list.

Of course they don't tell us in their marketing posts.


Firefox deploys two different forms of fingerprint protection:

1) blocking known fingerprinting scripts.

2) blocking underlying techniques.

The second is the one people here are talking about, and it can be enabled by going to your settings and turning on resistFingerprinting. Keep in mind it will do things like normalize your time zone and decrease timer precision.
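The timer part can be sketched like this (the 100 ms granularity is illustrative, not Firefox's actual value):

```javascript
// Clamp a high-resolution timestamp to a coarse granularity so timing
// side-channels (and timing-based fingerprints) lose precision.
function coarsen(timestampMs, granularityMs = 100) {
  return Math.floor(timestampMs / granularityMs) * granularityMs;
}

console.log(coarsen(1234.567)); // 1200
```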


Nope. The linked mozilla blog post clearly talks about the fingerprinting settings, and the comment I replied to did not specify anything related to "blocking underlying techniques".


This entire thread is filled with people comparing browser results on a site that breaks down underlying fingerprint techniques, while asking questions like, "is there a way for me to disable WebGL?", and "I wonder if they block localhost port scanning?".

I understand what the original posted link is talking about, but the specific thread you're currently on is very clearly talking about more than whether or not Panopticlick's tracking script is blocked. They're talking about how well different browsers can resist the techniques it uses[0]. Why else would anyone be comparing their results to Tor?

[0]: https://wiki.mozilla.org/Security/Fingerprinting


If the thing in the OP doesn't do what a bunch of the comments here are discussing -- it is indeed important to point that out.

I had assumed it did, because why else would we be discussing it here, and neither the OP post nor the FF setting is very clear about what it does. So I would have been thinking it was protecting me.


Resist fingerprinting will also stop sensor data (e.g. accelerometer) from being exposed on mobile, which can be used to identify[1].

You can verify if it's exposed on this site[2].

[1] https://www.zdnet.com/article/android-and-ios-devices-impact... [2] http://www.albertosarullo.com/demos/accelerometer/


mine is ~17. language, platform, screen size, time zone, user agent, and plugin info are the most identifying.

anyone know of a list of the most used values for these so we could lower our uniqueness by setting our browser values to them?


Likewise. It makes no difference whether I enable or disable the Fingerprinters checkbox.

Maybe due to the "uBlock Origin stops it from working" mentioned elsewhere, or some other glitch. Disabling uBlock Origin on panopticlick.eff.org didn't make a difference.


>With the fancy new anti-fingerprinting Safari on macOS Mojave I get just over 14.5 bits of entropy with the most entropic source being my canvas fingerprint (1 in 600).

That's actually pretty good, considering tor browser (which has resistfingerprinting enabled) with default window size (1000x1000) has 14.82 bits of entropy.


"Currently, we estimate that your browser has a fingerprint that conveys at least 17.66 bits of identifying information."

However, some of the information sent by my (stock) browser is clearly false:

User Agent: Mozilla/5.0 (Windows NT 6.1; rv:60.0) Gecko/20100101 Firefox/60.0

Platform: Linux x86_64

I didn't mess with my user agent, so I assume this is related to "resistfingerprinting" in about:config.


Firefox with uBlock Origin and JavaScript disabled: 9.81 bits, one in 900.21 browsers. Google Chrome with uBlock Origin: at least 17.68 bits, unique among the 209,744. iPhone 8, Safari, some adblocker: the same (unique). Probably I should update the OS; I'm using a pretty old version.


> 9.93 bits of identifying information.

with this new setting turned on + uBlock Origin + NoScript


You can disable JS with uBlock Origin. No need for NoScript.


I'd rather have finer control on a site by site basis than wholly disabling js though.


You can do that with uBlock Origin.


Oh, did not know that. Will look into it, ty.


If you haven't already, change your user agent to get it down further.


Following this advice, I discovered an extension that automatically spoofs the user agent and does some other things. Haven't tested yet, but it seems to be actively maintained.

Since I'm on the topic, two other lesser-known extensions I have are CanvasBlocker (fakes canvas fingerprinting, or disables it) and Privacy Badger (heuristically detects and disables trackers; complements uBlock).


You have to stick with known popular user agents. To mitigate tracking by UA you need a randomized user agent that changes periodically. Panopticlick won't be able to account for that in its stats. It's not a good idea to switch UA on every request since it will be hard to diagnose breakage caused by a site that rejects particular UAs.
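A sketch of the "rotate per session among popular UAs" idea (the UA strings below are illustrative examples, not a curated popularity list):

```javascript
// Pick one common user agent per session and keep it, rather than
// changing per request: per-request rotation is itself identifying and
// makes it hard to debug sites that reject particular UAs.
const POPULAR_UAS = [
  'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/74.0.3729.169 Safari/537.36',
  'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_5) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/74.0.3729.169 Safari/537.36',
];

function pickSessionUA(uas = POPULAR_UAS) {
  return uas[Math.floor(Math.random() * uas.length)];
}
```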


I dunno...

"Periodically changing user agent" sounds pretty unique if you ask me. Especially if the extension isn't super-clever and changes the user agent for accesses that happen on the same page.

And the fingerprinter could be super clever and look for features that your purported browser isn't supposed to support... And if your browser does support them, that's a strong identifier.


Unless you use some obscure browser, it is better to use your real user agent. If you keep your browser and operating system up to date, chances are it will be one of the most popular ones.

Your UA will correlate with other means of fingerprinting you, making the combination more common. Being clever can make things worse.

For example, the most common UA is from an iPhone, but the most common screen width is 1920 pixel. If you decide to make your UA an iPhone with a 1920 pixel screen, then you will be easily identified.


Most people don't keep their browsers up to date, let alone their OS. I'd say using FF is rare enough that switching to a chrome based UA would help.


I switched to Windows Chrome, from FF Linux and it actually increased my score by 0.02 bits.


Panopticlick says I have "strong protection against Web tracking" but amiunique.org says I'm unique. Though amiunique also claims my Tor fingerprint is one of six.


I get very different results from each. Some don't quite make sense to me. For example, amiunique says Timezone 3.37%. Panopticlick says 1 in 16, so half as bad. But how the hell is that timezone so identifying? I live on the west coast, how is that timezone so identifying?


I have 3 more bits of identifying information than you. (block fingerprinting enabled)



