Google Is Testing Its Controversial New Ad Targeting Tech in Millions of Browsers. Here's What We Know. – EFF

Update, April 9, 2021 : We’ve launched Am I FLoCed, a new site that will tell you whether your Chrome browser has been turned into a guinea pig for Federated Learning of Cohorts or FLoC, Google’s latest targeted advertising experiment. 
Today, Google launched an “origin trial” of Federated Learning of Cohorts (aka FLoC), its experimental new technology for targeting ads. A switch has silently been flipped in millions of instances of Google Chrome: those browsers will begin sorting their users into groups based on behavior, then sharing group labels with third-party trackers and advertisers around the web. A random set of users have been selected for the trial, and they can currently only opt out by disabling third-party cookies.
Although Google announced this was coming, the company has been sparse with details about the trial until now. We’ve pored over blog posts, mailing lists, draft web standards, and Chromium’s source code to figure out exactly what’s going on.

EFF has already written that FLoC is a terrible idea.  Google’s launch of this trial—without notice to the individuals who will be part of the test, much less their consent—is a concrete breach of user trust in service of a technology that should not exist.

Below we describe how this trial will work, and some of the most important technical details we’ve learned so far.
Google designed FLoC to help advertisers target ads once third-party cookies go away. During the trial, trackers will be able to collect FLoC IDs in addition to third-party cookies. 
That means all the trackers who currently monitor your behavior across a fraction of the web using cookies will now receive your FLoC cohort ID as well. The cohort ID is a direct reflection of your behavior across the web. This could supplement the behavioral profiles that many trackers already maintain.
We’ve been told that the trial is currently deployed to 0.5% of Chrome users in some regions—for now, that means Australia, Brazil, Canada, India, Indonesia, Japan, Mexico, New Zealand, the Philippines, and the U.S. Users in eligible regions will be chosen completely at random, regardless of most ad and privacy settings. Only users who have turned off third-party cookies in Chrome will be opted out by default.
Furthermore, the team behind FLoC has requested that Google bump up the sample to 5% of users, so that ad tech companies can better train models using the new data. If that request is granted, tens or hundreds of millions more users will be enrolled in the trial.
As described above, a random portion of Chrome users will be enrolled in the trial without notice, much less consent. Those users will not be asked to opt in. In the current version of Chrome, users can only opt out of the trial by turning off all third-party cookies.
Future versions of Chrome will add dedicated controls for Google’s “privacy sandbox,” including FLoC. But it’s not clear when these settings will go live, and in the meantime, users wishing to turn off FLoC must turn off third-party cookies as well.
Turning off third-party cookies is not a bad idea in general. After all, cookies are at the heart of the privacy problems that Google says it wants to address. But turning them off altogether is a crude countermeasure, and it breaks many conveniences (like single sign-on) that web users rely on. Many privacy-conscious users of Chrome employ more targeted tools, including extensions like Privacy Badger, to prevent cookie-based tracking. Unfortunately, Chrome extensions cannot yet control whether a user exposes a FLoC ID.
FLoC calculates a label based on your browsing history. For the trial, Google will default to using every website that serves ads—which is the majority of sites on the web. Sites can opt out of being included in FLoC calculations by sending an HTTP header, but some hosting providers don’t give their customers direct control of headers. Many site owners may not be aware of the trial at all.
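For reference, the opt-out mechanism Google has documented for the trial is a `Permissions-Policy` response header. A site that wants its pages excluded from visitors’ cohort calculations can send:

```http
Permissions-Policy: interest-cohort=()
```

Site operators who control their own server configuration can add this directly (for example, via an `add_header` directive in nginx); those on managed hosting are at the mercy of whether their provider exposes header settings at all, which is exactly the control gap described above.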

This is an issue because it means that sites lose some control over how their visitors’ data is processed. Right now, a site administrator has to make a conscious decision to include code from an advertiser on their page. Sites can, at least in theory, choose to partner with advertisers based on their privacy policies. But now, information about a user’s visit to that site will be wrapped up in their FLoC ID, which will be made widely available (more on that in the next section). Even if a website has a strong privacy policy and relationships with responsible advertisers, a visit there may affect how trackers see you in other contexts.
Any website can sign up for Chrome’s origin trial. Once it has, it can access FLoC IDs for users who have been chosen for the trial wherever it can run JavaScript. This includes the vast ecosystem of nameless advertisers to whom your browser connects whenever you visit most ad-serving sites. If you’re part of the trial, dozens of companies may be able to gather your FLoC ID from each site you visit.
One of the most important portions of the FLoC specification left undefined is exactly how many cohorts there are. Google ran a preliminary experiment with 8-bit cohort IDs, which meant there were just 256 possible groups. This limited the amount of information trackers could learn from a user’s cohort ID. 
However, an examination of the latest version of Chrome reveals that the live version of FLoC uses 50-bit cohort identifiers. The cohorts are then batched together into 33,872 total cohorts, over 100 times more than in Google’s first experiment. Google has said that it will ensure “thousands” of people are grouped into each cohort, so nobody can be identified using their cohort alone. But cohort IDs will still expose lots of new information—around 15 bits—and will give fingerprinters a massive leg up.
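The arithmetic behind those figures is straightforward: the identifying information a tracker gains from a cohort ID is roughly the base-2 logarithm of the number of possible cohorts, assuming users are spread evenly across them. A quick sketch using the cohort counts reported above:

```python
import math

def cohort_bits(num_cohorts: int) -> float:
    """Bits of identifying information revealed by a cohort ID,
    assuming users are distributed evenly across cohorts."""
    return math.log2(num_cohorts)

print(round(cohort_bits(256), 2))     # preliminary 8-bit experiment -> 8.0
print(round(cohort_bits(33_872), 2))  # live trial -> 15.05
```

Fifteen bits on their own don’t identify anyone, but a fingerprinter can combine them with other browser signals (screen size, fonts, user agent), and each additional bit halves the size of the crowd a user can hide in.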
Any tracker, advertiser, or other third party can sign up through Google’s Origin Trial portal to begin collecting FLoCs from users. The page currently indicates that the trial may last until July 13. Google has also made it clear that the exact details of the technology—including how cohorts are calculated—will be subject to change, and we could see several iterations of the FLoC grouping algorithm between now and then.
Google has pledged to make sure that cohorts aren’t too tightly correlated with “sensitive categories” like race, sexuality, or medical conditions. In order to monitor this, Google plans to collect data about which sites are visited by users in each cohort. It has released a whitepaper describing its approach. 
We’re glad to see a specific proposal, but the whitepaper sidesteps the most pressing issues. The question Google should be asking is “can you target people in vulnerable groups?”; the whitepaper reduces this to “can you target people who visited a specific site?” This is a dangerous oversimplification. Rather than working on the hard problem, Google has chosen to focus on an easier version that it believes it can solve. Meanwhile, it’s failed to address FLoC’s worst potential harms.
During the trial, any user who has turned on “Chrome Sync” (letting Google collect their browsing history), and who has not disabled any of several default sharing settings, will now share their cohort ID attached to their browsing history with Google. 
Google will then check to see if each user visited any sites that it considers part of a “sensitive category.” For example, WebMD might be labelled in the “medical” category, or PornHub in the “adult” category. If too many users in one cohort have visited a particular kind of “sensitive” site, Google will block that cohort. Any users that are part of “sensitive” cohorts will be placed into an “empty” cohort instead. Of course, trackers will still be able to see that said users are part of the “empty” cohort, revealing that they were originally classified as some kind of “sensitive.”
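The auditing step described above amounts to a threshold check over each cohort. The following is an illustrative sketch only: the 10% threshold, the data shapes, and the `EMPTY_COHORT` label are assumptions for the example, not Google’s actual parameters.

```python
# Illustrative sketch of the cohort-blocking audit described above.
# The 10% threshold and the data shapes are hypothetical assumptions.
EMPTY_COHORT = 0

def audit_cohorts(visits_by_cohort, sensitive_sites, threshold=0.10):
    """Map each cohort ID to itself, or to EMPTY_COHORT if too large a
    share of its members visited a 'sensitive' site."""
    result = {}
    for cohort, user_histories in visits_by_cohort.items():
        flagged = sum(
            1 for history in user_histories
            if any(site in sensitive_sites for site in history)
        )
        too_sensitive = flagged / len(user_histories) > threshold
        result[cohort] = EMPTY_COHORT if too_sensitive else cohort
    return result

# Two toy cohorts: in cohort 7, half the users visited a medical site.
cohorts = {
    7: [["webmd.example", "news.example"], ["shop.example"]],
    8: [["news.example"], ["shop.example"]],
}
print(audit_cohorts(cohorts, {"webmd.example"}))  # {7: 0, 8: 8}
```

Note that the remapping itself leaks information, as the article points out: any tracker can observe that a user’s cohort was zeroed out, which reveals the user was originally classified as “sensitive” in some way.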
For the origin trial, Google is relying on its massive cache of personalized browsing data to perform the audit. In the future, Google plans to use other privacy-preserving technology to do the same thing without knowing individuals’ browsing history.
Regardless of how Google does it, this plan won’t solve the bigger issues with FLoC, discrimination, and predatory targeting. The proposal rests on the assumption that people in “sensitive categories” will visit specific “sensitive” websites, and that people who aren’t in those groups will not visit said sites. But behavior correlates with demographics in unintuitive ways. It’s highly likely that certain demographics are going to visit a different subset of the web than other demographics are, and that such behavior will not be captured by Google’s “sensitive sites” framing. For example, people with depression may exhibit similar browsing behaviors, but not necessarily via anything as explicit and direct as visiting a website about depression. Meanwhile, tracking companies are well-equipped to gather traffic from millions of users, link it to data about demographics or behavior, and decode which cohorts are linked to which sensitive traits. Google’s website-based system, as proposed, has no way of stopping that.
As we said before, “Google can choose to dismantle the old scaffolding for surveillance without replacing it with something new and uniquely harmful.” Google has failed to address the harms of FLoC, or even to convince us that they can be addressed. Instead, it’s running a test that will share new data about millions of unsuspecting users. This is another step in the wrong direction.
