WebStandards@Brave

Privacy And Competition Concerns with Google’s Privacy Sandbox

By the Brave Web Standards Team

This is the sixth in a series of blog posts discussing proposed Web standards. This post was written by Director of Privacy Peter Snyder.

Brave recently submitted concerns and comments to the UK Competition and Markets Authority (CMA) as part of the CMA’s effort to secure commitments from Google regarding Google’s Privacy Sandbox plans. Brave appreciates the CMA’s critical work, and applauds the CMA’s efforts to prevent Google’s long overdue privacy protections from harming the open Web.

But the CMA (along with other regulators) seems to be evaluating Google’s Privacy Sandbox as an isolated, independent set of features. This misses the bigger picture: how Privacy Sandbox will interact with other Google proposals, and how radical and harmful it will be in practice.

Google presents Privacy Sandbox as a system for protecting privacy and openness on the Web. In practice, we expect that Privacy Sandbox will harm Web privacy, and further cement Google’s control over the Web. This post presents concerns with Privacy Sandbox that we have not seen discussed elsewhere, which we worry may go unconsidered in the CMA’s analysis.

We present our concerns with Privacy Sandbox not only as a browser maker, but as individuals worried that Privacy Sandbox threatens what makes the Web special and unique: that users can modify their Web experience to best suit their needs and wants, and that features in the Web are designed first and foremost to benefit users.

Privacy Sandbox Will Harm User Choice

It’s prudent to doubt someone selling antidotes after they’ve gotten rich by poisoning wells. The Web community should similarly doubt Google’s privacy promises (and intentions) after Google has spent decades profiting from (and entrenching) the Web’s worst privacy violations.

Google’s Privacy Sandbox proposal improves privacy only when your baseline for comparison is Google Chrome, the browser with the unambiguously worst privacy protections of any major browser.

But while Chrome is terrible for privacy, it’s at least designed to work with a Web where other browsers and privacy tools can deliver very high levels of privacy. Privacy Sandbox achieves its (modest) privacy improvements in ways that restrict, break or weaken other more robust privacy tools. It kills user choice.

Here are just some examples of upcoming Google proposals in, or adjacent to, Privacy Sandbox that will weaken users’ ability to protect their privacy online.

Manifest v3

Manifest v3 broadly refers to Google’s planned changes to how browser extensions work. These plans have been heavily criticized by privacy organizations and developers. Among other concerns, Manifest v3 weakens how extensions can block trackers, including (unsurprisingly) Google’s new generation of tracking scripts. Google has begun promoting “server-side tagging” capabilities in its tracking libraries, which can be used to circumvent privacy tools. Clever extensions can use Manifest v2 capabilities to block server-side tagging trackers [1]; Manifest v3 removes these capabilities, making server-side tagging effectively unblockable.
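To make the difference concrete, here is a minimal TypeScript sketch, assuming the standard chrome.webRequest extension API (with Chrome extension typings). The helper looksLikeServerSideTagging is hypothetical, standing in for the heuristics real filter-list engines apply at request time.

```typescript
// Hypothetical heuristic, standing in for real filter-list logic.
function looksLikeServerSideTagging(url: string): boolean {
  return /collect|tag/.test(new URL(url).pathname);
}

// Under Manifest v2, an extension can inspect every outgoing request at
// runtime and cancel it on the spot:
chrome.webRequest.onBeforeRequest.addListener(
  (details) => {
    if (looksLikeServerSideTagging(details.url)) {
      return { cancel: true }; // per-request, code-driven decision
    }
    return {};
  },
  { urls: ["<all_urls>"] },
  ["blocking"] // the blocking capability Manifest v3 removes for ordinary extensions
);

// Under Manifest v3, extensions must instead pre-declare static rules via
// declarativeNetRequest; there is no opportunity to run analysis code like
// the function above, which is why disguised trackers become unblockable.
```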

First-Party Sets

Google’s First-Party Sets proposal would (intentionally) weaken privacy boundaries between sites, making it easier for sites to track users while they browse. The proposal also makes it difficult-to-impossible for users to predict how their identity will be linked while they browse.
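To illustrate the mechanism, here is a rough sketch; the field names are drawn from early drafts of the First-Party Sets explainer and may not match the current proposal, and the domains are made up. A set “owner” publishes a declaration along these lines, after which browsers would treat the listed domains as a single party:

```typescript
// Sketch only: field names come from early drafts of the First-Party Sets
// explainer and may differ from the current proposal. Conceptually, an
// "owner" site declares which other domains browsers should treat as the
// same party for purposes like cookie and storage access.
const firstPartySetDeclaration = {
  owner: "https://brand-a.example",
  members: ["https://brand-b.example", "https://brand-c.example"],
};

// A user visiting brand-c.example gets no obvious signal that their activity
// there may be linked with brand-a.example and brand-b.example, which is the
// predictability problem described above.
console.log(firstPartySetDeclaration.members);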

Signed Exchange

Google’s Signed Exchange (SXG) proposal is designed to allow one organization to serve sites on behalf of another organization. Today, when you load pages from (say) news.example, you know you’re speaking with, and fetching content from, news.example; SXG is designed to make it appear that you’re talking with news.example, when in fact you’re talking with Google (who will likely use such information to further profile and track you on the Web).
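As a conceptual sketch only (the real format is a binary signed HTTP exchange with certificate requirements we elide here; the interface and function below are our own illustration, not any browser API), the core property is that the origin the user sees is the one named in the signed package, not the server the bytes actually came from:

```typescript
// Illustrative types only: not the real SXG wire format or a browser API.
interface SignedExchangeSketch {
  claimedUrl: string;       // the URL the browser will display and attribute content to
  responseBody: Uint8Array; // the content being vouched for
  signature: string;        // publisher's signature covering the URL and response
}

// After validating the signature, the browser attributes the content to the
// claimed origin, even though the bytes were fetched from somewhere else
// (for example, a Google-operated cache).
function displayedOrigin(fetchedFrom: string, sxg: SignedExchangeSketch): string {
  console.log(`bytes fetched from ${fetchedFrom}, shown to the user as ${sxg.claimedUrl}`);
  return new URL(sxg.claimedUrl).origin;
}

displayedOrigin("https://cache.google.example/news.example/article.sxg", {
  claimedUrl: "https://news.example/article",
  responseBody: new Uint8Array(),
  signature: "…",
});
```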

WebBundles

Google’s WebBundles proposal, part of Privacy Sandbox proposals like FLEDGE, would allow a website to fetch multiple Web resources (images, scripts, etc.) from a single URL, similar to how a zip file contains multiple files. As Brave discussed over a year ago, WebBundles are extremely harmful to Web privacy. In the same way attackers sneak malware onto your computer by packaging harmful software in benign-looking attachments (think an email attachment labeled “family-photos.zip” that contains a computer virus), websites will use WebBundles to sneak trackers into your browser, by naming a WebBundle “site-resources” and filling the bundle with harmful scripts.

Not surprisingly, Manifest v3 prevents extensions from meaningfully blocking resources in bundles. Brave previously warned that trackers would use WebBundles functionality to remove meaningful information (i.e., URLs) from bundled resources: we now see Google proposing exactly this feature. Google’s planned approach also prevents clients from choosing which kinds of resources in a bundle they want to download, thus resulting in clients downloading unnecessary bytes (especially bad for users for whom bandwidth is expensive or scarce).
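The blocking problem can be shown with a small sketch (not any real blocker’s code; the filter patterns, URLs, and bundle name are made up): a URL-based filter that works per resource today has nothing to act on once everything arrives inside a single bundle.

```typescript
// Hypothetical filter patterns, standing in for real filter lists.
const filterPatterns = [/analytics/, /tracker/];

function shouldBlock(url: string): boolean {
  return filterPatterns.some((pattern) => pattern.test(url));
}

// Unbundled page: each subresource has its own URL, so the blocker can drop
// just the tracker and keep the rest.
const unbundled = [
  "https://news.example/app.js",
  "https://news.example/analytics-tracker.js",
];
const kept = unbundled.filter((url) => !shouldBlock(url)); // only app.js survives

// Bundled page: the blocker sees one opaque URL. The tracker inside the
// bundle is invisible to URL-based rules, so the choice is all or nothing.
shouldBlock("https://news.example/site-resources.wbn"); // false: the tracker slips through
console.log(kept);
```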

User Choice Concerns Conclusion

We note that the above is an incomplete list; other proposals in or around Privacy Sandbox carry other well-known privacy harms. FLoC intentionally shares your browsing interests with the sites you visit; Privacy Budget is a fool’s errand that would let Google continue packing the Web with functionality that enables browser fingerprinting; and so on.

We urge the CMA, and the wider Web community, to note the common themes:

  • Privacy Sandbox (and the features attached to it) improves privacy only by the narrow, cynical standard of “better than Google Chrome”

  • These new features push Web standards in a direction that will make it more difficult for other browsers and privacy tools to protect privacy

Privacy Sandbox Encourages Centralization

Privacy Sandbox would further cement Google’s position as a Web monopolist. As others have noted, true privacy would improve competitiveness and openness on the Web. Privacy Sandbox is the opposite: a cynical proposal adopting just enough of the language and colors of the privacy community to keep regulators at bay, while in practice benefitting Google’s monopoly, all to the detriment of the Web at large.

Below is a partial list of features in and around Privacy Sandbox that favor centralization in general, and Google in particular.

FLEDGE, TURTLEDOVE and Aggregate Reporting

Google’s FLEDGE and TURTLEDOVE proposals require centralized trusted servers to collect sensitive, identifying data [2]. These systems would have a small number of servers collect sensitive information from millions of browser users, in the hopes that if enough sensitive information is collected, it will be impossible for the final recipients of the data to identify individuals [3]. It’s absurd that the Web should trust the companies that harm Web privacy today to be honest and privacy-respecting stewards of our data tomorrow.

But worse, trusted-server approaches require centralization, and so will inevitably favor monopolists and harm privacy-respecting innovators. For the same reason that it is difficult to create new social networks, it’ll be difficult to create non-giant-favoring implementations of Privacy Sandbox features. By design, these are systems that become more powerful and useful the more users they have, and (intentionally or otherwise) will make it nearly impossible for competitors to emerge.
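For readers who haven’t seen the proposal, here is a hedged sketch based on the FLEDGE explainer as it stood at the time of writing; the method and field names have shifted across drafts, and the parties (dsp.example, ssp.example) are illustrative. The detail to notice is the “trusted” signal-server URLs wired into both sides of the auction.

```typescript
// Hedged sketch of the FLEDGE browser API as described in the explainer at
// the time of writing. The navigator methods are not in standard TypeScript
// DOM typings, hence the cast.
const nav = navigator as any;

// On an advertiser's site: ask the browser to remember an interest group.
// Note trustedBiddingSignalsUrl, a centralized "trusted" server the browser
// will query during later auctions.
nav.joinAdInterestGroup(
  {
    owner: "https://dsp.example",
    name: "running-shoes",
    biddingLogicUrl: "https://dsp.example/bid.js",
    trustedBiddingSignalsUrl: "https://dsp.example/bidding-signals",
  },
  30 * 24 * 60 * 60 // membership duration, in seconds
);

// On a publisher's site: run the on-device auction. Again, a centralized
// "trusted" scoring server sits in the middle of the flow.
async function showAd(): Promise<void> {
  const winningAd = await nav.runAdAuction({
    seller: "https://ssp.example",
    decisionLogicUrl: "https://ssp.example/score.js",
    trustedScoringSignalsUrl: "https://ssp.example/scoring-signals",
    interestGroupBuyers: ["https://dsp.example"],
  });
  console.log("auction winner:", winningAd);
}

void showAd();
```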

“AppStore-ification” of The Web: SXG, WebBundles, and AMP 2.0

Privacy Sandbox and its related features threaten to turn the Web into the Google Play store. Google has pushed sites to use AMP, a system which allows Google to track you across an even greater percentage of the Web; the system is so hated that Web users pay money to avoid it.

Just as on the Google Play store, Google will control what users can find (by monopolizing search), and learn even more about what users are interested in. With Signed Exchange and WebBundles, Google is building “AMP 2.0,” a system where, no matter what search result you choose, or which news item you click on, you never leave Google’s grip. Privacy Sandbox and related features will create a Web with the privacy, performance and lock-in nightmares that plague mobile apps.

And, just as on the Google Play store, users will no longer be able to meaningfully customize or control the content they view on the Web. Today, people can save mobile data and battery life [4], protect privacy, and generally modify websites as they see fit. Privacy Sandbox will weaken or remove these vital abilities. Apps [5] are “all or nothing” affairs, with little ability for each user to control their experience. Signed Exchange and WebBundles will do the same for websites: change websites into all-or-nothing “bundles,” and put sites in control, reducing users (and the tools users select) to passive observers [6].

Finally, we note that the above concerns and claims are not just Brave’s worry and speculation. They are mirrored in Google’s stated use cases for WebBundles and Signed Exchange, in the standards proposals around WebBundles, and in Google’s internal documents made public through antitrust filings.

Privacy Sandbox Is Harmful to User Autonomy

Privacy Sandbox will erode the Web’s most important and unique principle: that the Web is for people first and foremost. This principle, that the Web is for users first and that it should give users control and power, is enshrined in core W3C documents.

In ethical systems, people arrive with their own preferences and desires. Companies then compete by designing products and systems that are compatible with those preferences and desires. With Privacy Sandbox, Google has done the opposite: built a system intentionally designed to perpetuate a specific business strategy [7], and used its monopoly power to force people to participate in that system.

The difference between Privacy Sandbox on one hand, and user-serving browser features on the other, is stark. Browsers include many features that people want: features that allow people to play games they love, safely buy the things they need, or communicate privately with the people they care about. Browser vendors eagerly advertise these kinds of features, since they know people want them, and might even switch browsers to take advantage of them.

The features in Privacy Sandbox are the opposite; no user has ever requested that their browser passively broadcast their interests across the Web (à la FLoC), conduct ad auctions to benefit organizations the user will never know (à la FLEDGE), or remove and restrict the ability to customize their browsing (à la WebBundles and Manifest v3), among many others. That Google discusses its feature plans with advertisers and developers, but is vague about them to its users, is telling.

The Web is the one remaining popular application platform that puts users first (thanks in no small part to the work of activists, market competition, and the civil society participation in standards bodies). Privacy Sandbox would be an enormous, potentially irreversible step away from what makes the Web important, and towards the Web being just another system where the people are batteries carrying out tasks to power monopolists’ goals.

Things That Start in Chrome Don’t Stay in Chrome

A Privacy Sandbox apologist might reply to the above examples by noting “these are changes in Chrome that other browsers can implement as (and if) they see fit.” We’re familiar with these replies from working with Google in standards bodies and in our interactions with regulators. Such replies don’t pass the laugh test.

A change (as for example with Manifest v3) that reduces the capabilities of privacy extensions will similarly reduce the usefulness of those tools, and so reduce the number of contributors to the filter lists that power them. Plus, changes to how the dominant browser handles privacy boundaries (as with SXG and First-Party Sets) won’t affect only the majority browser. Such changes will leak into privacy-respecting browsers too, which are repeatedly forced into a no-win game by Google’s monopoly: implement Google’s privacy-harming features, or ship a Web browser that breaks on sites designed to work with Chrome.

Conclusion

2022 will be a make-or-break year for the Web, and there are reasons for both optimism and pessimism.

On the hopeful side, we’re encouraged that regulators have finally noticed the enormous harm online surveillance (or its euphemism, “real-time behavioral advertising”) does to society, and how that harm is enabled by online tech monopolists. The UK CMA’s concerns around Privacy Sandbox, alongside meaningful privacy legislation like California’s CCPA, and long overdue skepticism from governments around Facebook’s and Google’s business strategies, all give reason for hope.

However, unless Web users and regulators consider all of Google’s activities (Privacy Sandbox and otherwise), 2022 will be as disappointing as it was promising. We worry that the Web community, privacy activists, and regulators are too narrowly focused on what Google is proposing today, and missing the far-more-concerning trajectory Google plans for the Web of tomorrow.

We encourage and are ready to support efforts to make sure the Web stays open, user-first and competitive. The urgently needed (and long overdue) privacy protections the Web deserves can’t come at the expense of what makes the Web uniquely valuable to users.

We’ve included Brave’s full response to the CMA below.


Brave’s Response to CMA Consultation on modified commitments in respect of Google’s ‘Privacy Sandbox’ browser changes

Authors: Pete Snyder; Shivan Kaul Sahib; Pat Walshe

Date: 17 December, 2021

Brave Software welcomes the opportunity to make brief comments in response to the UK CMA’s consultation on Google’s modified commitments regarding Privacy Sandbox browser changes.

  1. CMA should discourage and disapprove of privacy approaches that limit the amount of information available to privacy-focused clients. For example, proposals like Google’s “Signed Exchanges” (SXG) or WebBundles (the latter of which is deeply integrated into Privacy Sandbox) come with promises of improving user privacy, by making Google a proxy for advertisements and websites coming from other parties. This has the privacy upside of preventing non-Google advertisers from learning certain kinds of information about Web users.

    However, the cost of these privacy improvements is that privacy-focused tools and Web browsers are also prevented from learning which parties are involved in sending content to Web users, restricting these tools (which include tracker blockers, anonymizers, anti-fingerprinting tools, tools designed to filter out pornography or other age-restricted material, etc.) from understanding where Web content comes from, and preventing entire categories of privacy protection techniques (see also EFF’s concerns with Google pushing Manifest V3 for Web Extensions).

    We urge the CMA to examine this serious privacy harm Privacy Sandbox poses to the Web, and to make explicit that Google should not pursue approaches to web privacy (where “privacy” is defined in a very narrow, cynical manner) that prevent other tools from protecting their users, particularly tools designed to protect against privacy threats that are too niche, too population-specific, or too at odds with Google’s business model for Google to address.

  2. In order for the quarterly summaries to be accurate representations of 3rd party feedback received at W3C and other fora (para 12, 32a), we strongly urge the CMA to clarify what the process is for ensuring that the feedback given by 3rd parties is fairly captured and addressed by Google. We also suggest that these summaries be made public for transparency and to allow for other fora participants to add context.

  3. We encourage the CMA to explicitly discourage / disapprove of “make the user pay for their own privacy” approaches to privacy, for example, by moving expensive ad auction computation to the user’s device without compensating the user for this additional on-device computation.

  4. CMA should explicitly state that the goal is to increase competitiveness within a privacy-respecting legal framework, not to discard or roll back privacy protections to allow current privacy-harming parties to compete. It should be made explicit that “competition must happen within privacy protections, not as a shield against privacy protections.”

  5. CMA should discourage and disapprove of privacy approaches that require centralization for their privacy protections (FLEDGE / PARAKEET, AMP / SXG, etc.). There seems to be a growing use of centralized “trusted servers” in the still-developing Privacy Sandbox proposals, which will promote large parties at the expense of small ones, and make it difficult or impossible for people to verify the privacy properties of the system.

  6. CMA should be on guard against Google using Privacy Sandbox to create a moat against competitors, and should discourage Google from leveraging its universal first-party status as an ill-gotten competitive advantage against other, more privacy-respecting organizations.

    Google’s public statements seem to commit it to not using Google’s first-party data to improve the ads Google serves on non-Google sites. However, to the best of our knowledge, Google has made no such commitment to not use data learned from Google operating in a first-party context on non-Google sites (e.g., information learned by serving Web users AMP pages, information learned from the wide range of data-collecting features included in Chrome, information learned from YouTube and Google Maps embeds, etc.) to “improve” ads Google places on Google sites.

    We urge the CMA to have Google clarify whether information Google collects from user behaviors on non-Google first-party sites will be used to “enrich” the ads Google shows on its own sites. If this is Google’s intent, then we expect Google’s dominant position on the Web will in effect restrict the kinds of information other advertisers can collect (a good thing), but not restrict the kinds of information Google can collect for its own advertising purposes.

  7. Google has offered to “appoint an independent Monitoring Trustee.” We would ask that the CMA provide clarity as to the legal status, independence, and authority of the Trustee, especially given Section 1.10 (h). Also, what criteria does the CMA envisage for the appointment of a Trustee, and how will reporting and compliance criteria be determined (given that, as the UK ICO and the CMA have commented, Google has not sufficiently identified the privacy impacts of the Sandbox)?


  1. By looking at request patterns, co-occurring-but-unordered query parameters, or contextual page information, among other techniques.

  2. Microsoft’s PARAKEET proposal similarly requires centralized servers to collect sensitive, identifying information.

  3. Depending on the implementation details, this approach might use k-anonymity, differential privacy, or several other approaches. The browser-side features in Privacy Sandbox are roughly generalized into the Aggregate Reporting API.

  4. By blocking images, videos, scripts, and audio. Disabling these kinds of page sub-resources is a popular way for users on low-bandwidth or low-data internet plans to use the Web.

  5. Android and iOS alike.

  6. Websites today are conglomerations of dozens or hundreds of resources, each of which the browser can fetch or block as the user sees fit. WebBundles will remove this à la carte ability and turn websites into, effectively, zip files, forcing users to download either all of a site or none of it. Similarly, today, browser tools can make blocking decisions based on the URL a resource is fetched from; WebBundles turn sub-resource URLs into meaningless identifiers, preventing browser tools from protecting their users.

  7. Specifically, online real-time behavioral advertising, a system that Google just so happens to dominate.
