What’s Brave Done For My Privacy Lately? Episode #3: Fingerprint Randomization
Brave now protects users from being fingerprinted by making them appear subtly different to each website.
Note: This is the third in an ongoing, regular series of blog posts, describing new privacy-related features in Brave. This post describes work done by Senior Software Engineer Mark Pilgrim, Senior Privacy Researcher Peter Snyder, and Chief Scientist Ben Livshits.
Brave is releasing a new form of browser fingerprinting protection, available today in our Nightly version. These new protections both provide the strongest fingerprinting protections of any popular browser, and work without introducing bothersome permission prompts or breaking websites.
Brave’s approach differs from existing fingerprinting protection tools by randomizing fingerprintable values in ways that are imperceptible to humans, but which confuse fingerprinters. As trackers shift from traditional cookie-based tracking to fingerprinting, practical and effective fingerprinting protections will be an increasingly important way of maintaining a user-serving, privacy-respecting Web.
Problem: Trackers Use “Browser Fingerprints” To Follow You
Tracking on the Web is moving from cookie-based to fingerprinting-based techniques, and most Web browsers lack useful defenses against browser fingerprinting. Historically, online tracking relied on cookies: identifiers originally intended for Web authentication that have since been hijacked by advertisers and trackers to follow you around the Web.
To fight cookie-based tracking, privacy-focused tools like Brave place tight restrictions on which sites can place cookies in your browser, and thus limit which sites can identify you. Other privacy-preserving tools, like Safari’s “Intelligent Tracking Prevention” or Firefox’s “Enhanced Tracking Protection”, place similar restrictions on cookies, though less stringent than Brave’s. The result of these restrictions is that people can enjoy the Web at much lower privacy risk.
Unfortunately, online trackers haven’t gotten the hint; instead of adapting to respect people’s privacy, many advertisers have looked for alternative ways to track people online. The most common post-cookie way people are tracked on the Web is now browser fingerprinting.
Browser fingerprinting works by building a large collection of things that are a little bit unique about your browser and environment (e.g. your operating system, the size of your browser window, details about your computer hardware), and combining them into one single “mega identifier”, or “fingerprint”.
Put differently: while many people use Windows, many people have their browser resized to 1494×943, and many people use an “Intel HD Graphics 630” graphics card, all three of those things together will be true for only a very small number of people. Combine enough of these “semi-identifiers”, or fingerprinting data points, and you can uniquely identify almost anyone.
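To make this concrete, here is a minimal sketch (in Python, with made-up data points; real fingerprinting scripts collect far more) of how many common semi-identifiers can be combined into one nearly unique “mega identifier”:

```python
import hashlib

# Hypothetical semi-identifiers a fingerprinting script might read.
# Each value is common on its own, but the combination is nearly unique.
data_points = {
    "platform": "Windows",
    "window_size": "1494x943",
    "gpu": "Intel HD Graphics 630",
    "timezone": "UTC-8",
    "fonts": "Arial,Calibri,Consolas",
}

# Concatenate the values in a fixed order and hash them into a single
# stable identifier -- the browser "fingerprint".
combined = "|".join(f"{k}={v}" for k, v in sorted(data_points.items()))
fingerprint = hashlib.sha256(combined.encode()).hexdigest()

print(fingerprint[:16])  # a short, stable identifier for this browser
```

Because every data point comes from the browser and machine themselves, the same browser produces the same fingerprint on every visit, with no cookie required.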
Fingerprinting Is Very Difficult to Defend Against
Fingerprinting-based tracking is very difficult to protect against, for several reasons. First, browsers cannot block fingerprinting data points the way they can block a cookie. Preventing a site from learning your screen dimensions, or the fonts installed on your system, is either impossible or would break a large number of websites.
Second, blocking fingerprinting is difficult because of the large number of possible data points in the Web platform. For example, one popular fingerprinting script, fingerprint2.js, uses dozens of data points to compute a browser fingerprint, and the list of additional possible fingerprinting data points is at least as long. Removing all of these data points without breaking websites is difficult to impossible.
Because of this difficulty, privacy tools have generally not been effective guards against browser fingerprinting. As discussed in a previous blog post on privacy budgets, there are several approaches for defending against fingerprinting.
The most common defense is to try to make different browsers look as similar as possible, by having all copies of the browser report the same value for each fingerprinting data point. In some cases this is done by removing fingerprintable values entirely (Brave, for example, by default removes the canvas and WebGL APIs from third parties). Other tools try to protect against fingerprinting by having all copies of the browser report similar values (Safari, for example, reduces differences between browsers by limiting which fonts websites can access).
Another approach to limiting fingerprinting is to only allow sites to access certain features after the user has explicitly granted permission (Firefox in “resist fingerprinting” mode and the Tor Browser Bundle both require permission before sites can access canvas).
The unfortunate truth is that, despite being well intentioned, none of these approaches is very effective at preventing fingerprinting. The enormous diversity of fingerprinting surface in modern browsers makes these “block”, “lie”, or “permission” approaches somewhere between insufficient and useless.
Solution: Prevent Fingerprinting By Randomizing
Starting now in Brave Nightly, Brave is deploying a new type of fingerprinting defense that we expect will be uniquely effective. Instead of removing or modifying browser differences, or adding troublesome permission prompts, the Brave browser will start adding subtle randomization to some fingerprinting endpoints. This privacy-through-randomization approach has been studied by privacy-focused computer scientists, most recently in the PriVaricator (Nikiforakis et al., WWW 2015) and FPRandom (Laperdrix et al., ESSoS 2017) projects. This is the first time those approaches have been implemented in a mainstream browser.
This approach is fundamentally different from existing fingerprinting defenses, which attempt to make all browsers look identical to websites (an impossible goal). Brave’s new approach instead aims to make every browser look completely unique, both across websites and across browsing sessions. Because your browser constantly appears different while browsing, websites are unable to link your browsing behavior together, and are thus unable to track you on the Web.
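As a rough illustration of the idea (this is not Brave’s actual implementation; the function names, sample values, and noise scale here are invented), randomizing a fingerprintable value can be as simple as adding session-seeded noise that is far below human perception but still changes the hashed fingerprint:

```python
import hashlib
import random

def fingerprint_hash(samples):
    """Hash a sequence of audio-like sample values, as a fingerprinter would."""
    return hashlib.sha256(",".join(repr(s) for s in samples).encode()).hexdigest()

def randomized_samples(samples, session_seed):
    """Add tiny, imperceptible per-session noise to each sample.

    The noise magnitude (1e-7) is an illustrative choice: far too small
    to hear, but large enough to change the hashed fingerprint.
    """
    rng = random.Random(session_seed)
    return [s + rng.uniform(-1e-7, 1e-7) for s in samples]

true_samples = [0.25, -0.5, 0.125, 0.75]

# Two different browsing sessions see two different fingerprints...
fp_a = fingerprint_hash(randomized_samples(true_samples, session_seed=1))
fp_b = fingerprint_hash(randomized_samples(true_samples, session_seed=2))
assert fp_a != fp_b

# ...while within one session the value stays consistent, so sites keep working.
assert fp_a == fingerprint_hash(randomized_samples(true_samples, session_seed=1))
```

The key design point is that the noise is deterministic within a session but differs across sessions and sites, so functionality is preserved while linkability is destroyed.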
You can see these defenses at work by visiting fingerprinting demonstration websites (e.g., web audio, canvas). First, to demonstrate how fingerprinting can identify you across sessions, try the following steps in any current browser (Chrome, Firefox, Safari, Edge, or even the Tor Browser Bundle).
- Visit audiofingerprint.openwpm.com or browserleaks.com/canvas
- Note the fingerprinted values
- Reload the browser after clearing storage, either by deleting all browser data or opening a new private window
- Note the same fingerprint is assigned, despite all storage, cookies, etc being cleared.
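The steps above can be simulated in miniature: because a fingerprint is computed from the browser’s own characteristics rather than stored in the browser, clearing storage has no effect on it. A minimal sketch (with hypothetical data points):

```python
import hashlib

def compute_fingerprint(traits):
    """Derive a fingerprint purely from browser characteristics."""
    combined = "|".join(sorted(traits))
    return hashlib.sha256(combined.encode()).hexdigest()

browser_traits = ["Windows", "1494x943", "Intel HD Graphics 630"]

before_clearing = compute_fingerprint(browser_traits)
# "Clearing storage" deletes cookies and site data, but the traits are
# unchanged -- so the recomputed fingerprint is identical.
after_clearing = compute_fingerprint(browser_traits)
assert before_clearing == after_clearing
```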
This cross-storage fingerprint value is how fingerprinters track you on the Web. If you now perform the same four steps in Brave Nightly, you’ll see a different fingerprint value on each visit, demonstrating that your fingerprint cannot be used to link the two visits, protecting your privacy. Additionally, because these fingerprinting APIs still work the way sites expect, Brave users can continue to enjoy sites that use audio, canvas, and WebGL for user-serving purposes, without the risk of being tracked.
If you’re interested in how this privacy-through-randomization feature is implemented, the details are available on our wiki. But this is just one of the many ways Brave is working to improve privacy on the Web. In addition to new privacy features we’ll be sharing shortly, Brave also works in standards bodies to promote privacy for all browser users, conducts research on Web privacy (among other topics), and is developing a privacy-preserving alternative for funding the Web.
We’re eager to have you give our new privacy protections a try, and we look forward to sharing more about our privacy-preserving work with you in the future.
This post was updated on June 2nd, 2020. The post originally linked to fingerprintjs.com/demo as a demonstration of how Brave randomizes fingerprints. That fingerprinting site has since been updated to ignore canvas and audio fingerprints, as a countermeasure to Brave’s fingerprint randomization. This is a good thing for Brave users, as it means fingerprinters are consuming less information about users, and so are less able to track people on the Web. It does, though, make for a harder-to-understand demonstration.
The post now links to audiofingerprint.openwpm.com and browserleaks.com/canvas as demonstrations of audio and canvas fingerprinting. Brave users can instead visit those sites to verify that Brave prevents users from being identified through canvas and audio operations.