zlacker

[parent] [thread] 34 comments
1. save_f+(OP)[view] [source] 2020-09-24 16:05:14
> In 2016, internal analysis at Facebook found 64% of all extremist group joins were due to their own recommendation tools. Yet repeated attempts to counteract this problem were ignored or shut down.

That's pretty damning. Facebook execs knew that extremist groups were using their platform and Facebook's own tooling catalyzed their growth, and yet they did nothing about it.

replies(11): >>emilse+91 >>ballen+n2 >>johnch+z2 >>boston+83 >>rsynno+l3 >>justic+B3 >>ipsum2+68 >>dcolki+ka >>anigbr+Ma >>chinat+we >>antpls+MI4
2. emilse+91[view] [source] 2020-09-24 16:09:53
>>save_f+(OP)
Well, based on the original quote, it's not that they passively did nothing about it.

They consciously and proactively blocked attempts to fix it.

replies(1): >>eximiu+32
◧◩
3. eximiu+32[view] [source] [discussion] 2020-09-24 16:13:59
>>emilse+91
> attempts to counteract this problem were ignored or shut down.

I think you misinterpreted this?

replies(1): >>emilse+b3
4. ballen+n2[view] [source] 2020-09-24 16:15:32
>>save_f+(OP)
How did they define "extremist" in that analysis? And how many total people are we talking about?

Seems like the relevance of that line really depends on answers to both. I.e., if extremist is super narrow we may be talking about 64 people out of 100. If extremist is overly broad, then maybe all the recommendations were for groups that a majority of the population would not find offensive.

Just saying the line by itself without context doesn't convey as much information as it first appears.

5. johnch+z2[view] [source] 2020-09-24 16:16:13
>>save_f+(OP)
I don't like Facebook, but why would they configure a recommendation engine to stop suggesting extremist groups to people with extremist affinities? There are no humans behind those wheels.

What else besides outright banning should they have done? (I think banning extremists wouldn't have impacted their revenues much, so they should have, but that's another debate)

replies(1): >>rsynno+24
6. boston+83[view] [source] 2020-09-24 16:19:05
>>save_f+(OP)
Do we know what percentage of group joins in general are due to recommendations? Would be an interesting data point to have.
◧◩◪
7. emilse+b3[view] [source] [discussion] 2020-09-24 16:19:18
>>eximiu+32
Maybe. How do you interpret it?
replies(2): >>whymau+94 >>ghayes+an
8. rsynno+l3[view] [source] 2020-09-24 16:20:03
>>save_f+(OP)
But doing so would hurt engagement, and hence the bottom line!

Facebook have, perhaps accidentally, created a monster of perverse incentives. Not sure what the solution is, besides regulation (which would be extremely difficult).

replies(2): >>goatin+l4 >>shawn-+Z4
9. justic+B3[view] [source] 2020-09-24 16:21:59
>>save_f+(OP)
Well, I don't think anyone should be surprised...

Why would anyone think capitalists can actually practice morality? That's never been done in the hundreds of years of capitalism's history.

And capitalists can be quite moral personally. Throughout history, the rich and powerful have always had a positive image. But their enterprises have always required regulation.

◧◩
10. rsynno+24[view] [source] [discussion] 2020-09-24 16:23:29
>>johnch+z2
> I don't like Facebook but why would they configure a recommendation engine to stop suggesting extremist groups to people with extremist affinities

That's almost certainly not what they did. When you see someone ranting about the 5G turning the coronavirus communist or whatever, that person didn't generally come up with that idea themselves; they were exposed to it online, either via friends, or via this.

Their algorithm is likely pushing extremist nonsense on people whom it determines are vulnerable to believing it, which isn't the same as having an affinity for it. Obviously this isn't what they set out to do; they presumably set out to increase engagement, and if that happens to increase engagement, well...

◧◩◪◨
11. whymau+94[view] [source] [discussion] 2020-09-24 16:23:47
>>emilse+b3
I'm interpreting that the business development culture at Facebook disincentivized taking action against extremist groups.
replies(1): >>lallys+K6
◧◩
12. goatin+l4[view] [source] [discussion] 2020-09-24 16:24:48
>>rsynno+l3
> Not sure what the solution is, besides regulation (which would be extremely difficult).

The solution is only difficult if you start from the basis that Facebook must continue to exist. If they cannot run a profitable business that isn't harmful, that's no one's problem but theirs.

replies(1): >>dylan6+96
◧◩
13. shawn-+Z4[view] [source] [discussion] 2020-09-24 16:27:49
>>rsynno+l3
I would be very careful about considering any actions that Facebook, in particular, takes to be accidental. From the very beginning, intentional recklessness ("Move fast and break things") has been their credo.

When you're being reckless on purpose, none of the damage you create is accidental.

replies(1): >>bognit+By
◧◩◪
14. dylan6+96[view] [source] [discussion] 2020-09-24 16:32:06
>>goatin+l4
I would love to see Facebook labeled as a domestic terrorist group for how much they have aided and abetted, and sanctioned accordingly. Make it illegal for companies to do business with them. You don't have to shut them down, but if you make it so they can't earn enough to survive, then oh well. Sorry, not sorry: your business model wasn't a good one.
◧◩◪◨⬒
15. lallys+K6[view] [source] [discussion] 2020-09-24 16:34:46
>>whymau+94
Attempts to prevent it were shut down. Shutting something down is an activity, thus /active/.
replies(1): >>whymau+Pl
16. ipsum2+68[view] [source] 2020-09-24 16:40:37
>>save_f+(OP)
According to Tim Kendall's LinkedIn, he stopped working at Facebook in 2010. So it's interesting that he claims to have internal information from 2016.
replies(2): >>save_f+E8 >>evgen+zH
◧◩
17. save_f+E8[view] [source] [discussion] 2020-09-24 16:43:00
>>ipsum2+68
Maybe, but it's pretty common for people to keep up with former coworkers and happenings in the company, especially since he was an early employee with (I'm assuming) a fair amount of equity.

I could see an employee giving him that data out of concern, but that's a fair point.

18. dcolki+ka[view] [source] 2020-09-24 16:50:27
>>save_f+(OP)
On the surface it sounds pretty outrageous. My question would be though, what should Facebook do instead?

A recommendation engine is just an algorithm to maximize an objective function. That objective function is matching users with content that they enjoy and engage with. The algorithm has no in-built notion of political extremism. It almost assuredly is the case that people with radical opinions prefer to consume media that matches their views. If Bob is a Three Percenter, it's highly unlikely he'd prefer to read the latest center-left think piece from The Atlantic.

Unless you're willing to ban recommendation engines entirely, the only possible alternative I can see is for Facebook to intentionally tip the scales. Extremist political opinions would have to be explicitly penalized in the objective function.

But now you've turned Facebook from a neutral platform into an explicit arbiter of political opinion. It means some humans at Facebook are intentionally deciding what people should and should not read, watch and listen to. Remember Facebook as an organization is not terribly representative of the country as a whole. Fewer than 5% of Facebook employees vote Republican, compared to 50% of the country. Virtually no one is over 50. Males are over-represented relative to females. Blacks and hispanics are heavily under-represented. And that doesn't even get into international markets, where the Facebook org is even less representative.

The cure sounds worse than the disease. I really think it's a bad idea to pressure Facebook into the game of explicitly picking political winners and losers. A social media platform powerful enough to give you everything you want is strong enough to destroy everything you value.
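To make the parent's point concrete, here is a minimal sketch (in Python, with invented field names and weights) of what "explicitly penalizing extremist opinions in the objective function" would mean: a human-chosen penalty term subtracted from the predicted-engagement score before ranking.

```python
# Hypothetical sketch of "tipping the scales" in a ranking objective.
# All names and numbers are made up for illustration; this is not
# Facebook's actual system.

def score(engagement_pred: float, flagged_extremist: bool,
          penalty: float = 0.5) -> float:
    """Rank by predicted engagement, minus an explicit policy penalty."""
    base = engagement_pred
    if flagged_extremist:
        base -= penalty  # a human-chosen policy weight, not a learned one
    return base

posts = [
    {"id": "a", "engagement": 0.9, "flagged": True},
    {"id": "b", "engagement": 0.6, "flagged": False},
]
ranked = sorted(
    posts,
    key=lambda p: score(p["engagement"], p["flagged"]),
    reverse=True,
)
```

The point of contention in the thread is exactly the `penalty` term: who decides what gets flagged, and how heavily it counts.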

replies(3): >>JoshTk+Wc >>Const-+1d >>munifi+Kk
19. anigbr+Ma[view] [source] 2020-09-24 16:53:00
>>save_f+(OP)
Sadly unsurprising. I have Facebook sockpuppet accounts that I use just for researching extremist types and I am constantly amazed at how much of the work FB does for me.
◧◩
20. JoshTk+Wc[view] [source] [discussion] 2020-09-24 17:04:22
>>dcolki+ka
We probably should ban recommendation engines for content.
replies(1): >>nitrog+Uw
◧◩
21. Const-+1d[view] [source] [discussion] 2020-09-24 17:04:42
>>dcolki+ka
> what should Facebook do instead?

The same thing social networks did before.

If I subscribed to 1000 people, show me whatever the hell they wrote, all of it, in chronological order.

Don't show me what my friends wrote on other pages; if they think that's important or interesting, they will link or share manually.
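The feed model described above is simple enough to sketch in a few lines (field names here are assumptions for illustration): every post from explicit subscriptions, in chronological order, with no ranking model at all.

```python
# Minimal sketch of the "old" social feed: all posts from people you
# subscribed to, oldest first, nothing hidden and nothing reordered.
# Data shapes are invented for illustration.

def chronological_feed(posts, subscriptions):
    """Return every post by a subscribed author, in timestamp order."""
    return sorted(
        (p for p in posts if p["author"] in subscriptions),
        key=lambda p: p["timestamp"],
    )

posts = [
    {"author": "alice", "timestamp": 3, "text": "hi"},
    {"author": "bob", "timestamp": 1, "text": "hello"},
    {"author": "carol", "timestamp": 2, "text": "hey"},
]
feed = chronological_feed(posts, subscriptions={"alice", "bob"})
```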

replies(1): >>notaco+D71
22. chinat+we[view] [source] 2020-09-24 17:12:51
>>save_f+(OP)
Who will sue the execs Zuckerberg/Sandberg at this point? It's about time.
◧◩
23. munifi+Kk[view] [source] [discussion] 2020-09-24 17:48:11
>>dcolki+ka
> My question would be though, what should Facebook do instead?

What should Big Tobacco do? If your business is a net negative for the world... get out of business. This is not hard. Corporations are not precious endangered species that we have some moral obligation to keep alive.

> A recommendation engine is just an algorithm to maximize an objective function.

A cigarette is just dried leaves wrapped in paper. If the use and production of that device harms the world, stop using and producing it.

> But now you've turned Facebook from a neutral platform into an explicit arbiter of political opinion.

Facebook is already a non-neutral platform. Humans at Facebook chose to use an algorithm to decide recommendations and chose which datasets to use to train that algorithm.

Playing Russian roulette and pointing the gun at someone else before pulling the trigger does not absolve you of responsibility. Sure, the revolver randomly decided which chamber to stop at, but you chose to play Russian roulette with it.

replies(1): >>dcolki+Uq
◧◩◪◨⬒⬓
24. whymau+Pl[view] [source] [discussion] 2020-09-24 17:54:03
>>lallys+K6
What? This literally makes no sense, lol.

Edit: I see the confusion now. emilsedgh, you, and I all agree. I thought emilsedgh was saying the opposite of what they wrote.

◧◩◪◨
25. ghayes+an[view] [source] [discussion] 2020-09-24 17:59:58
>>emilse+b3
Your original statement can be casually read to seem like you're disagreeing with OP's message (that Facebook did not quell extremist groups), due to the structure of your message (a lead statement followed by a delayed contradiction). You could make it clearer with better emphasis that you are stating "agree, and" instead of "no, because".
◧◩◪
26. dcolki+Uq[view] [source] [discussion] 2020-09-24 18:17:17
>>munifi+Kk
The difference is that it's unquestionable that cigarettes are enormously harmful. To claim that the case against social media is anywhere near as clear-cut as tobacco is to do a disservice to the heroic public health efforts it took to cut down on smoking.

With social media, anecdotal accusations abound of negative impacts on mental health or political polarization. Yet the most carefully conducted research shows no evidence that either of these charges[1][2] is true to any meaningful degree. Simply put, the academic evidence is not congruent with the journalistic outrage.

What's more likely is the panic over social media is mirroring previous generations' moral panic over new forms of media. When the literary novel first gained popularity, social guardians in the older generation worried that it would corrupt the youth.[3]

The same story played out with movies, rock music, video games, and porn among other things. The dynamic is propelled by old media having a vested interest in whipping up a frenzy against its new media competitors. In almost every case the concerns proved unfounded or overblown. I'd be pretty surprised if social media proved the exception, when we've always seen the same story again and again.

[1]https://twitter.com/DegenRolf/status/1217307200517033986 [2]https://twitter.com/degenrolf/status/986146855007539201 [3]https://www.economist.com/1843/2020/01/20/an-18th-century-mo...

replies(2): >>munifi+ZF >>kthejo+nH1
◧◩◪
27. nitrog+Uw[view] [source] [discussion] 2020-09-24 18:45:16
>>JoshTk+Wc
Social platform recommendation engines are designed to optimize "revealed preference." I've commented in the past that "revealed preference" is just a new name for exploitation of vice.

People's higher goals are often counter to their day-to-day instinctive behaviors. We should find ways to optimize those goals, rather than momentary metrics.

◧◩◪
28. bognit+By[view] [source] [discussion] 2020-09-24 18:53:24
>>shawn-+Z4
"Move fast and break things," put another way, is: hurry up, let's undermine democracy.
◧◩◪◨
29. munifi+ZF[view] [source] [discussion] 2020-09-24 19:36:08
>>dcolki+Uq
> The difference is that it's unquestionable that cigarettes are enormously harmful.

It was certainly questioned for many decades before we got to that point. Meanwhile, millions died. And during that entire time Big Tobacco had no difficulty drumming up doctors and scientists willing to argue against the negative health consequences of smoking.

replies(1): >>karmel+iy1
◧◩
30. evgen+zH[view] [source] [discussion] 2020-09-24 19:45:14
>>ipsum2+68
It is really not that surprising. People talk. When you work for a company like this, you end up with a large portion of your circle of friends being current or former co-workers. I knew things that were not NDA-cleared for years after I left various startups, because people chat, and if you know the right questions to ask, or are reasonably good at appearing to know just a bit more than you actually do, people will often fill in the blanks for you. The well only runs dry when most of that cohort have also left the company.
◧◩◪
31. notaco+D71[view] [source] [discussion] 2020-09-24 22:19:45
>>Const-+1d
It's the linking or sharing manually that matters. Chronological order doesn't. If a hundred things are shared with you, who cares what order they're shown? If you want to have any real effect, you have to change the distribution patterns. Change what's in the list, not in what order. (Note: recommendation systems are a whole different matter. I'm talking about what's left after they're taken out of the picture.)

Limit shares/retweets. Limit group sizes. Surface more information as topics/tags/whatever so that users can do more sophisticated filtering themselves or collaboratively. I want to mute my uncle when he talks about politics, not all the time. Facebook already does more sophisticated analyses than just extracting topic information (I know because I work there and I can see the pipelines). Show those results to me so I can use them as I see fit. That's how you make things better. Chronological vs. algorithmic order? Pfft. In fact, I do want the most interesting things out of that set shown first. I just want to have more control over what's in the set.
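The "mute my uncle on politics, not all the time" idea amounts to filtering by (author, topic) pairs instead of by author alone. A minimal sketch, with invented data shapes:

```python
# Sketch of per-topic muting: drop posts whose (author, topic) pair is
# muted, keep everything else from that author. Field names are
# assumptions for illustration.

def filter_feed(posts, muted_pairs):
    """Remove posts matching a muted (author, topic) pair."""
    return [
        p for p in posts
        if (p["author"], p["topic"]) not in muted_pairs
    ]

posts = [
    {"author": "uncle", "topic": "politics", "text": "..."},
    {"author": "uncle", "topic": "fishing", "text": "..."},
]
kept = filter_feed(posts, muted_pairs={("uncle", "politics")})
```

The design point is that the platform already extracts the topic signal; the comment is asking that it be exposed as a user-controllable filter rather than fed only into engagement ranking.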

replies(1): >>Const-+ru1
◧◩◪◨
32. Const-+ru1[view] [source] [discussion] 2020-09-25 01:42:17
>>notaco+D71
> who cares what order they're shown?

Sorting something to the 1000th page is censorship. Legally probably not; it's still available, you just need to click page down 1000 times, but IANAL and don't care.

I don't want algorithms to do any filtering. If someone shares crap every 10 minutes I can always unfollow. Still, I like your idea about manual filtering with tags.

◧◩◪◨⬒
33. karmel+iy1[view] [source] [discussion] 2020-09-25 02:27:29
>>munifi+ZF
Precisely this. Many people denied the idea that smoking was unhealthy. It sounds hard to believe, but I personally know many people who said these things to me in the 1990s.

Rejection of science in favor of something you personally want to be true isn’t a new internet age development.

◧◩◪◨
34. kthejo+nH1[view] [source] [discussion] 2020-09-25 04:21:28
>>dcolki+Uq
The difference between social media and all the other media you mentioned isn't the format (still mostly just images, text, and video like the old media) or in its content (Sturgeon's Law is universal); it's in the ability to disseminate messages to a global audience instantaneously, and the careful curation of that content to drive engagement.

The clear result of this algorithm has been to happily send lies, misinformation, emotionally manipulative opinions, and other content at a scale and speed that was never achieved by a New York Times bestseller, MTV, or Rockstar Games.

All media has always exploited our cognitive biases and irrationality to its end; but to do it worldwide and simultaneously, 24 hours a day, 7 days a week, without rest or remorse, is pure stochastic terror.

Move fast and break things indeed.

35. antpls+MI4[view] [source] 2020-09-26 12:06:56
>>save_f+(OP)
Having all extremists in one directory must be handy for FBI/police's investigation