Peter Snyder, Senior Privacy Researcher at Brave
Brendan Eich, CEO and co-founder of Brave
A Step in the Wrong Direction
FLoC is a recent Google proposal that would have your browser share your browsing behavior and interests by default with every site and advertiser with which you interact. Brave opposes FLoC, along with any other feature designed to share information about you and your interests without your fully informed consent. To protect Brave users, Brave has removed FLoC in the Nightly version of both Brave for desktop and Android. The privacy-affecting aspects of FLoC have never been enabled in Brave releases; the additional implementation details of FLoC will be removed from all Brave releases with this week’s stable release. Brave is also disabling FLoC on our websites, to protect Chrome users learning about Brave.
Companies are finally being forced to respect user privacy (even if only minimally), pushed by trends such as increased user education, the success of privacy-first tools like Brave, and the growth of legislation including the CCPA and GDPR. In the face of these trends, it is disappointing to see Google, instead of taking the present opportunity to help design and build a user-first, privacy-first Web, propose and immediately ship in Chrome a set of smaller, ad-tech-conserving changes that explicitly prioritize maintaining the structure of the Web advertising ecosystem as Google sees it.
For the Web to be trusted and to flourish, we hold that much more is needed than the complex yet conservative chair-shuffling embodied by FLoC and Privacy Sandbox. Deeper changes to how creators pay their bills via ads are not only possible, but necessary. The success of Brave’s privacy-respecting, performance-maintaining, and site-supporting advertising system shows that more radical approaches work. We invite Google to join us in fixing the fundamentals, undoing the harm that ad-tech has caused, and building a Web that serves users first.
The rest of this post explains why we believe FLoC is bad for Web users, bad for sites, and a bad direction for the Web in general.
FLoC is Harmful to Web Users
The worst aspect of FLoC is that it materially harms user privacy, under the guise of being privacy-friendly. Others have already detailed many of the ways FLoC harms privacy. We note here just three aspects of FLoC that are particularly harmful and concerning.
FLoC Tells Sites and Third Parties About Your Browsing History
FLoC harms privacy directly and by design: it shares information about your browsing behavior with sites and advertisers that otherwise wouldn’t have access to that information. Unambiguously, FLoC tells sites about your browsing history in a way that no browser does today.
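Concretely, Chrome’s origin trial exposed the cohort through a single document API, readable by any script running on the page. The sketch below is illustrative: the `readCohort` wrapper is ours, and the example cohort values are made up, but the `document.interestCohort()` call is the API Chrome shipped.

```javascript
// Sketch: reading the FLoC cohort the way any in-page script could
// during Chrome's origin trial. Wrapper and example values are ours.
async function readCohort(doc) {
  // Browsers without FLoC (or with it disabled) simply lack the method.
  if (typeof doc.interestCohort !== "function") {
    return null;
  }
  // In the trial, this resolved to an object along the lines of
  // { id: "14159", version: "chrome.2.1" }.
  const { id } = await doc.interestCohort();
  return id;
}
```

In a FLoC-enabled Chrome, `readCohort(document)` hands the same cohort ID to the site itself and to every third-party script the site embeds.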
Worse, Google neglects the harm done on sites that already know you, because you have created an account or otherwise identified yourself. For example, you may have an existing account with Walgreens, possibly to fill prescriptions. Walgreens necessarily knows who you are. FLoC tells Walgreens things that Walgreens has no business knowing about you (not a pseudonymous you or a cohort including you, but you as identified by your Walgreens login), based on your browsing behavior.
Chrome sharing this information with Walgreens (and X (formerly Twitter), GitHub, Facebook, and any other site you have an account with) unquestionably harms your privacy: it tells sites things about you that they otherwise wouldn’t know, that you didn’t decide to share with them, and that are likely unrelated to why you chose to visit those sites in the first place.
Worse yet, FLoC exposes this information to every third party on these sites. To build on the above example, FLoC doesn’t tell only Walgreens your interests and behaviors; at the time of writing, it also tells each of the following ad-tech companies:
- Monetate
- Adobe
- Bing
- Branch.io
- InMomentum
All of these are ad-tech companies that track, record, and profile you across the Web. Google’s response to this concern in the FLoC proposal, that it may harm privacy but is still better than current Chrome behavior, is damning.
Google claims that FLoC is privacy-improving, despite intentionally telling sites more about you, for broadly two reasons, each of which conflates unrelated topics. First, Google says FLoC is privacy-preserving compared to sending third-party cookies. But this is a misleading baseline. Many browsers don’t send third-party cookies at all; Brave never has. Saying a new Chrome feature is privacy-improving only when compared to status-quo Chrome (the most privacy-harming popular browser on the market) is misleading, self-serving, and a further reason for users to run away from Chrome.
Second, Google defends FLoC as not privacy-harming because interest cohorts are designed not to be unique to a user, using k-anonymity protections. This reflects a mistaken idea of what privacy is. Many things about a person are i) not unique, but still ii) personal and important, and shouldn’t be shared without consent. Whether I prefer to wear “men’s” or “women’s” clothes, whether I live according to my professed religion, whether I believe vaccines are a scam, whether I am a gun owner or a Brony fan, or a million other things, are all aspects of our lives that we might like to share with some people but not others, on our terms and under our control.
In general, the idea that privacy is, and is only, the absence of cross-site tracking, is wrong. Any useful concept of privacy should include some concept of “don’t tell others things you know about me, without my permission.” FLoC is only “privacy protecting” by cynically ruling out common-sense understandings of what privacy is.
FLoC Makes it Easier For Sites To Track You Across The Web
FLoC adds an enormous amount of fingerprinting surface to the browser, as the whole point of the feature is for sites to be able to distinguish between user interest-group cohorts. This undermines the work Brave is doing to protect users against browser fingerprinting and the statistically inferred cohort tracking enabled by fingerprinting attack surface.
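A rough entropy estimate shows why this matters. The cohort count below is an assumption based on reports of roughly 33,000 cohorts in the origin trial, and the figure for other fingerprinting surface is illustrative, not measured:

```javascript
// Back-of-the-envelope sketch: how many identifying bits FLoC adds to
// a browser fingerprint. The cohort count is an assumed figure from
// origin-trial reports; the "other surface" bits are illustrative.
const cohortCount = 33000;
const cohortBits = Math.log2(cohortCount);       // ~15 bits from the cohort alone
const otherSurfaceBits = 18;                     // e.g., user agent, timezone, screen
const totalBits = cohortBits + otherSurfaceBits; // ~33 bits combined
const distinguishablePopulation = 2 ** totalBits; // ~8.6 billion distinct fingerprints
```

Under these assumptions, the cohort alone contributes nearly half the bits needed to single out one person among everyone on Earth, which is why treating it as mere fingerprinting surface is no exaggeration.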
Google’s proposed solution to the increased fingerprinting risk from FLoC is both untestable and unlikely to work. Google proposes using a “privacy budget” approach to prevent FLoC from being used to track users. First, Brave has previously detailed why we do not think a “budget” approach is workable to prevent fingerprinting-based tracking. We stand by those concerns, and have not received any response from Google, despite having raised the concerns over a year ago. And second, Google has yet to specify how their “privacy budget” approach will work; the approach is still in “feasibility-testing” stages.
Shipping a privacy-harming feature while still working out how to fix the privacy harm is exactly the “keep digging the hole deeper” anti-pattern that has made browser fingerprinting such a difficult problem to solve.
FLoC Promotes A False Notion of What Privacy Is, and Why Privacy Is Important
Google is aware of some of these concerns, but gives them shallow treatment in their proposal. For example, Google notes that some categories (sexual orientation, medical issues, political party, etc.) will be exempt from FLoC, and that they are looking into other ways of preventing “sensitive” categories from being used in FLoC. Google’s approach here is fundamentally wrong.
First, Google’s approach to determining whether a FLoC cohort is sensitive requires (in most cases) Google to record and collect that sensitive cohort in the first place! A system that determines whether a cohort is “sensitive” by recording how many people are in that sensitive cohort doesn’t pass the laugh test.
Second, and more fundamentally, the idea of creating a global list of “sensitive categories” is illogical and immoral. Whether a behavior is “sensitive” varies wildly across people. One’s mom may not find her interest in “women’s clothes” a private part of her identity, but one’s dad might (or might not! but, plainly, Google isn’t the appropriate party to make that choice). Similarly, an adult happily expecting a child might not find their interest in “baby goods” particularly sensitive, but a scared and nervous teenager might. More broadly, interests that are banal to one person might be sensitive, private, or even dangerous to another.
The point isn’t that Google’s list of “sensitive cohorts” will be missing important items. The point, rather, is that a “privacy preserving system” that relies on a single, global determination of what behaviors are “privacy sensitive,” fundamentally doesn’t protect privacy, or even understand why privacy is important.
FLoC is Harmful to Sites and Publishers
While our primary concerns with FLoC are the privacy harms to users, FLoC is also harmful to some sites. By default, FLoC leaks and shares user behavior on your site with other sites, which harms sites that have high-trust or highly private relationships with their users.
Here is a synthetic but demonstrative example. Say I run a website selling polka music, and I serve a dedicated community of die-hard polka fans. My site is successful because I’ve identified a niche market that is poorly served elsewhere, which allows me to charge higher than, say, Amazon prices. However, FLoC may place users browsing in Chrome into a “polka music lover” cohort, and my users would then broadcast their “polka love” to other sites, including Amazon. Amazon could then peel off my polka-record buyers, leaving me worse off. This kind of audience stealing is common in the ad-tech that Brave blocks.
Many similar examples are possible, but the general point is that FLoC will have your users broadcast their interest in your site (and sites like your site) to unrelated sites on the Web. Those other sites may use this information to engage in forms of price discrimination, or otherwise more aggressively market to your users. Programmatic ad-tech has done exactly this for years, and FLoC would continue this practice into the “post third-party cookies” era.
We Encourage All Sites to Disable FLoC
Given that FLoC can be harmful for site operators too, we recommend that all sites disable FLoC. In general, any new privacy-risking features on the web should be opt-in. This is a common-sense principle to respect Web users by default. One might wonder why Google isn’t making FLoC opt-in. We suspect that Google has made FLoC opt-out (for sites and users) because Google knows that an opt-in, privacy harming system would likely never reach the scale needed to induce advertisers to use it.
Given the wrong-headed opt-out design, all sites should disable FLoC; a few already have. It’s difficult to come up with any reason right now, prior to Google disabling third-party cookies, why a site would benefit from enabling FLoC. As discussed above, there are concrete ways in which leaving FLoC on by default could harm a site.
Conclusions
Overall, FLoC, along with many other elements of Google’s “Privacy Sandbox” proposal, is a step backward from the more fundamental, privacy- and user-focused changes the Web needs. Instead of deep change to enforce real privacy and eliminate conflicts of interest, Google is proposing Titanic-level deckchair-shuffling that largely maintains the current, harmful, inefficient system the Web has evolved into, a system that has been disastrous for the Web, its users, and publishers.
What the Web desperately needs is radical change, where “would users want this?” is the most important question asked of each new feature. Instead, FLoC and “Privacy Sandbox” ask “how can we make this work for ad-tech, in a way that users will tolerate or not notice?” Brave is proof that more radical changes can result in a better Web for users, publishers, and even advertisers. We urge Google to join the other browsers, experts, and activists working to make the Web user-first.