It barely exists so far and is only implemented by a single browser I'd never heard of (Puma). Hardly fair to ask whether people are using it yet.
> how do you propose things work?
We go back to advertising without tracking.
If I were to receive an unwanted phone call from a travel agency while I am browsing plane tickets on the net, that would be creepy and annoying to me: I prefer to make thoughtful decisions by myself, thank you.
I realize not everyone thinks the same way. But in my opinion, advertisement has a severe net negative impact on our society, and I would like to get rid of it altogether.
I already pay for the targeted advertising that comes with the news articles I read; there's no need to force-feed me.
I've seen a fun video (in French [1]) in which a person asks various advertisers their opinion on the role of advertising in society, then asks them about an "electric knife" ad that was running at the time. The cognitive dissonance that follows is hilarious.
[1] (1990, no subs): https://www.dailymotion.com/video/x869qr
https://chrome.google.com/webstore/detail/coil/locbifcbeldmn...
I agree that their web presentation leaves a lot to be desired.
[0] https://twitter.com/__jakub_g/status/1365400306767581185
> "Whether the browser sends a real FLoC or a random one is user controllable."
FLoC stuff is client side. You can send nil FLoC IDs. You can randomize them on every request. You can swap them with your friends. Whatever.
Vanilla Chrome might not let you (my money would be on an off-switch but not anything fun) but that's hardly going to be a blocker.
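A hedged sketch of what "randomize them on every request" could look like, say in an extension's content script. The `{ id, version }` promise shape matches the FLoC origin-trial API; the 16-bit cohort space and the version string here are illustrative assumptions, not Chrome's actual parameters:

```typescript
// Illustrative only: replace the proposed document.interestCohort()
// with one that returns a freshly randomized cohort on every call.
type InterestCohort = { id: string; version: string };

function randomCohortId(bits: number = 16): string {
  // Pick uniformly from an assumed 2^bits cohort space.
  return String(Math.floor(Math.random() * 2 ** bits));
}

function installRandomFloc(
  doc: { interestCohort?: () => Promise<InterestCohort> }
): void {
  doc.interestCohort = async () => ({
    id: randomCohortId(),
    version: "chrome.2.1", // version string is illustrative
  });
}

// In a real content script this would be: installRandomFloc(document);
```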
(googler but works on something completely unrelated)
I tried this, but after some time your IP just gets flagged for click fraud.
It's interesting because you won't see ads anymore, but you also won't be able to pollute more datasets.
Overall it’s been a “meh”
> A browser with FLoC enabled would collect information about its user’s browsing habits, then use that information to assign its user to a “cohort” or group. Users with similar browsing habits—for some definition of “similar”—would be grouped into the same cohort. Each user’s browser will share a cohort ID, indicating which group they belong to, with websites and advertisers. According to the proposal, at least a few thousand users should belong to each cohort (though that’s not a guarantee).
> If that sounds dense, think of it this way: your FLoC ID will be like a succinct summary of your recent activity on the Web.
> Google’s proof of concept used the domains of the sites that each user visited as the basis for grouping people together. It then used an algorithm called SimHash to create the groups. SimHash[0] can be computed locally on each user’s machine, so there’s no need for a central server to collect behavioral data. However, a central administrator could have a role in enforcing privacy guarantees. In order to prevent any cohort from being too small (i.e. too identifying), Google proposes that a central actor could count the number of users assigned each cohort. If any are too small, they can be combined with other, similar cohorts until enough users are represented in each one.
[0] https://en.wikipedia.org/wiki/SimHash
> In computer science, SimHash is a technique for quickly estimating how similar two sets are. The algorithm is used by the Google Crawler to find near duplicate pages. It was created by Moses Charikar.
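The scheme described above can be sketched concretely: hash each visited domain, tally +1/−1 per bit position across the hashes, and let the sign of each tally give one bit of the cohort ID; similar browsing histories then tend to land on nearby IDs. The 16-bit FNV-style hash, the bit width, and the minimum cohort size here are my illustrative choices, not Google's actual parameters:

```typescript
// Illustrative 16-bit FNV-1a-style hash of a domain name.
function hash16(s: string): number {
  let h = 2166136261;
  for (let i = 0; i < s.length; i++) {
    h ^= s.charCodeAt(i);
    h = Math.imul(h, 16777619);
  }
  return (h >>> 16) & 0xffff;
}

// SimHash over visited domains: per-bit majority vote of the hashes.
function simhash(domains: string[], bits: number = 16): number {
  const counts = new Array(bits).fill(0);
  for (const d of domains) {
    const h = hash16(d);
    for (let b = 0; b < bits; b++) {
      counts[b] += ((h >> b) & 1) ? 1 : -1;
    }
  }
  let out = 0;
  for (let b = 0; b < bits; b++) {
    if (counts[b] > 0) out |= 1 << b;
  }
  return out;
}

// The k-anonymity check the "central actor" could run: cohorts with
// fewer than k members are too identifying and must be merged.
function cohortTooSmall(
  counts: Map<number, number>,
  cohortId: number,
  k: number = 2000
): boolean {
  return (counts.get(cohortId) ?? 0) < k;
}
```

Note that `simhash` runs entirely on local data, which is why no central server needs to see the browsing history itself; only the final cohort counts need central coordination.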
So, on top of everything else, Google gets to use its users' CPUs instead of its own.
You may want to see https://en.wikipedia.org/wiki/Draft:Effects_of_the_2007-2008...
A browser leaking browsing history was considered an outright bug (https://blog.mozilla.org/security/2010/03/31/plugging-the-cs...).
One could standardize a list of things people might want to fill out about themselves (e.g. please enter your age and sex, or else we can't guarantee you won't be hassled with ads of things completely irrelevant to you), but your software inferring things about you and snitching to the world is outright malware.
https://github.com/WICG/floc describes something open source and running on the client. Will that be sufficient, or is there additional disclosure you'd like to see?
We had the same choice on Read the Docs, but didn't really have any other way to make money besides advertising. We decided to build ethical advertising, so that we could be proud of the ads we show, knowing we weren't adding to the massive pool of data out there. I talked a bit more about it here: https://www.ericholscher.com/blog/2016/aug/31/funding-oss-ma...
Last time FLoC came up, I commented that the idea of FLoC missed the point of why we oppose tracking: https://news.ycombinator.com/item?id=25906791
The EFF writes:
> The power to target is the power to discriminate.
I would extend this point: the power to target based on information the user has not chosen to share is the power to discriminate. Part of recognizing people's agency online is giving them the ability to choose how they present themselves and what they share. It's not inherently wrong for someone to want to signal something about themselves that they find important, or even just convenient, to share. But that should always be their choice; it should not be a top-down decision about what information is "safe" or "dangerous".
FLoC has some benefits (although they won't matter once every website decides to use FLoC as a fingerprinting vector), but even saying that FLoC has benefits, it is still based on the idea that users should not be in charge of their identities. It's got to be automated, it's got to happen in the background, it's got to use machine learning and be something that users can't inspect. I oppose the philosophy behind both current tracking systems and proposals like FLoC.
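To make the fingerprinting concern concrete: membership in a cohort of k users out of N carries about log2(N/k) bits of identifying information, and those bits stack with every other fingerprinting signal a site already collects. A minimal sketch, with illustrative population figures:

```typescript
// Bits of identifying information revealed by membership in a cohort
// of `cohortSize` users out of `totalUsers` (figures illustrative).
function cohortEntropyBits(totalUsers: number, cohortSize: number): number {
  return Math.log2(totalUsers / cohortSize);
}

// e.g. cohorts of ~2000 among ~3 billion Chrome users would add
// roughly 20 bits on top of the existing fingerprinting surface.
```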
Last time this came up I also theorized about what a privacy-respecting version of FLoC could look like for people who do want to see ads or who do want personalized content online -- what a version of FLoC could be that I would be more supportive of: https://news.ycombinator.com/item?id=25907079
None of those ideas are fleshed out, but they try to get at the heart of what the fundamental difference is between allowing a user to easily signal that they want to see personalized content about shoes, and trying to intuit behind a user's back that they will buy shoes if you show them a particular ad.
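One way to read that difference in code: a signal the user explicitly chooses and can inspect, rather than one inferred behind their back. Everything here, the topic vocabulary and the function name alike, is hypothetical, just to show the shape of the idea:

```typescript
// Hypothetical sketch: the user picks interest topics from a small
// public vocabulary; only explicitly chosen, recognized topics are
// ever sent, and nothing is inferred from behavior.
const KNOWN_TOPICS = new Set(["shoes", "cycling", "cooking"]); // illustrative

function interestSignal(chosen: string[]): string {
  // Drop anything outside the public vocabulary; never infer topics.
  return chosen.filter((t) => KNOWN_TOPICS.has(t)).sort().join(",");
}

// interestSignal(["shoes", "somethingPrivate"]) sends only "shoes".
```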
What you get in return is that ad-supported sites you visit are better funded because they can show better-targeted advertising.