zlacker

[return to "Testimony to House committee by former Facebook executive Tim Kendall"]
1. save_f+q8 2020-09-24 16:05:14
>>aaronb+(OP)
> In 2016, internal analysis at Facebook found 64% of all extremist group joins were due to their own recommendation tools. Yet repeated attempts to counteract this problem were ignored or shut down.

That's pretty damning. Facebook execs knew that extremist groups were using their platform and that Facebook's own recommendation tools were catalyzing their growth, yet they did nothing about it.

2. johnch+Za 2020-09-24 16:16:13
>>save_f+q8
I don't like Facebook, but why would they configure a recommendation engine to stop suggesting extremist groups to people with extremist affinities? There are no humans behind the wheel.

What else besides outright banning should they have done? (I think banning extremists wouldn't have hurt their revenue much, so they should have, but that's another debate.)

3. rsynno+sc 2020-09-24 16:23:29
>>johnch+Za
> I don't like Facebook but why would they configure a recommendation engine to stop suggesting extremist groups to people with extremist affinities

That's almost certainly not what they did. When you see someone ranting about 5G turning the coronavirus communist or whatever, that person generally didn't come up with the idea themselves; they were exposed to it online, either via friends or via recommendations like this.

Their algorithm is likely pushing extremist nonsense on people it determines are vulnerable to believing it, which isn't the same as having an affinity for it. Obviously this isn't what they set out to do; they presumably set out to increase engagement, and if pushing that content happens to increase engagement, well...
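To make that concrete, here's a toy sketch (purely hypothetical, not anything from Facebook's actual system; all names and numbers are made up) of a recommender that ranks candidate groups only by a predicted-engagement score. Nothing in it is "configured" to target extremists, yet a user whose history correlates with engaging on fringe content gets the fringe group ranked first, as a side effect of the objective:

  from dataclasses import dataclass

  @dataclass
  class Group:
      name: str
      topic_vector: tuple[float, float]  # toy 2-d "content embedding"

  def predicted_engagement(user_vector, group):
      # Stand-in for a learned click/join-probability model:
      # just a dot product between user and group embeddings.
      return sum(u * g for u, g in zip(user_vector, group.topic_vector))

  def recommend(user_vector, candidates, k=3):
      # The only objective is the engagement score; the ranker never
      # inspects what the group actually is.
      return sorted(candidates,
                    key=lambda g: predicted_engagement(user_vector, g),
                    reverse=True)[:k]

  candidates = [
      Group("Local gardening", (0.9, 0.1)),
      Group("Conspiracy memes", (0.2, 0.95)),
      Group("Neighborhood watch", (0.5, 0.4)),
  ]
  # A user whose history correlates with engaging on fringe content:
  susceptible_user = (0.1, 1.0)
  print([g.name for g in recommend(susceptible_user, candidates)])
  # -> ['Conspiracy memes', 'Neighborhood watch', 'Local gardening']

The point is that "we never told it to recommend extremist groups" is compatible with it doing exactly that, as long as the thing being maximized is engagement.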
