zlacker

[return to "Testimony to House committee by former Facebook executive Tim Kendall"]
1. save_f+q8 2020-09-24 16:05:14
>>aaronb+(OP)
> In 2016, internal analysis at Facebook found 64% of all extremist group joins were due to their own recommendation tools. Yet repeated attempts to counteract this problem were ignored or shut down.

That's pretty damning. Facebook execs knew that extremist groups were using their platform and that Facebook's own tooling was catalyzing their growth, and yet they did nothing about it.

2. dcolki+Ki 2020-09-24 16:50:27
>>save_f+q8
On the surface it sounds pretty outrageous. My question, though, would be: what should Facebook do instead?

A recommendation engine is just an algorithm that maximizes an objective function: match users with content they enjoy and engage with. The algorithm has no built-in notion of political extremism. It's almost assuredly the case that people with radical opinions prefer to consume media that matches their views. If Bob is a Three Percenter, it's highly unlikely he'd prefer to read the latest center-left think piece from The Atlantic.
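To make that concrete, the core of such an engine is basically the following (a minimal Python sketch; the Candidate type and predicted_engagement field are my stand-ins, not anything Facebook actually ships):

    from dataclasses import dataclass

    @dataclass
    class Candidate:
        item_id: str
        predicted_engagement: float  # model's estimate of click/like/share probability

    def rank_feed(candidates: list[Candidate], k: int = 20) -> list[Candidate]:
        # Pure engagement maximization: sort by the objective, keep the top k.
        # Nothing here knows or cares what the content is actually *about*.
        return sorted(candidates, key=lambda c: c.predicted_engagement, reverse=True)[:k]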

Unless you're willing to ban recommendation engines entirely, the only alternative I can see is for Facebook to intentionally tip the scales: extremist political opinions would have to be explicitly penalized in the objective function.
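Concretely, tipping the scales means bolting a penalty term onto that same objective (hypothetical sketch; both the classifier behind extremism_score and the weight lam would be human, editorial choices):

    def penalized_score(predicted_engagement: float,
                        extremism_score: float,
                        lam: float = 0.5) -> float:
        # Someone at Facebook has to build the classifier that emits
        # extremism_score, and someone has to pick lam. Neither choice is neutral.
        return predicted_engagement - lam * extremism_score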

But now you've turned Facebook from a neutral platform into an explicit arbiter of political opinion. It means some humans at Facebook are intentionally deciding what people should and should not read, watch, and listen to. Remember, Facebook as an organization is not terribly representative of the country as a whole. Fewer than 5% of Facebook employees vote Republican, compared to 50% of the country. Virtually no one is over 50. Males are over-represented relative to females. Blacks and Hispanics are heavily under-represented. And that doesn't even get into international markets, where the Facebook org is even less representative.

The cure sounds worse than the disease. I really think it's a bad idea to pressure Facebook into the game of explicitly picking political winners and losers. A social media platform powerful enough to give you everything you want is strong enough to destroy everything you value.

3. Const-+rl 2020-09-24 17:04:42
>>dcolki+Ki
> what should Facebook do instead?

The same thing social networks did before.

If I subscribed to 1000 people, show me whatever the hell they wrote, all of it, in chronological order.

Don't show me what my friends wrote on other pages; if they think it's important or interesting, they'll link to it or share it manually.
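The entire "algorithm" I'm asking for fits in a few lines (sketch; the Post type and the subscription set are stand-ins):

    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class Post:
        author: str
        created_at: datetime
        body: str

    def chronological_feed(posts: list[Post], subscriptions: set[str]) -> list[Post]:
        # Everything from the people I follow, newest first. No ranking model.
        mine = [p for p in posts if p.author in subscriptions]
        return sorted(mine, key=lambda p: p.created_at, reverse=True)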

4. notaco+3g1 2020-09-24 22:19:45
>>Const-+rl
It's the linking and sharing manually that matters; chronological order doesn't. If a hundred things are shared with you, who cares what order they're shown in? If you want to have any real effect, you have to change the distribution patterns: change what's in the list, not what order it's in. (Note: recommendation systems are a whole different matter. I'm talking about what's left after they're taken out of the picture.)

Limit shares/retweets. Limit group sizes. Surface more information as topics/tags/whatever so that users can do more sophisticated filtering themselves, or collaboratively. I want to mute my uncle when he talks about politics, not all the time. Facebook already does more sophisticated analyses than just extracting topic information (I know because I work there and I can see the pipelines). Show those results to me so I can use them as I see fit. That's how you make things better. Chronological vs. algorithmic order? Pfft. In fact, I do want the most interesting things out of that set shown first. I just want more control over what's in the set.
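To illustrate the kind of control I mean: once topic tags are exposed, per-user rules like "mute my uncle on politics" are trivial (sketch; the TaggedPost type and the mute list are made up for the example):

    from dataclasses import dataclass

    @dataclass
    class TaggedPost:
        author: str
        topics: set[str]
        body: str

    # user-defined rules: suppress this author only on these topics
    MUTES = {("uncle_bob", "politics")}

    def apply_mutes(feed: list[TaggedPost]) -> list[TaggedPost]:
        # Keep a post unless one of its (author, topic) pairs matches a mute rule.
        return [p for p in feed
                if not any((p.author, t) in MUTES for t in p.topics)]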
