That's pretty damning. Facebook execs knew that extremist groups were using their platform and Facebook's own tooling catalyzed their growth, and yet they did nothing about it.
They consciously and proactively blocked attempts to fix it.
I think you misinterpreted this?
Seems like the relevance of that line really depends on answers to both. I.e., if extremist is super narrow we may be talking about 64 people out of 100. If extremist is overly broad, then maybe all the recommendations were for groups that a majority of the population would not find offensive.
Just saying the line by itself without context doesn't convey as much information as it first appears.
What else besides outright banning should they have done? (I think banning extremists wouldn't have impacted their revenue much, so they should have, but that's another debate.)
Facebook have, perhaps accidentally, created a monster of perverse incentives. Not sure what the solution is, besides regulation (which would be extremely difficult).
Why does anyone think capitalists can actually practice morality? That's never been done in capitalism's hundreds of years of history.
And capitalists can be quite moral personally. Throughout history, the rich and powerful have often had a positive image. But their enterprises have always required regulation.
That's almost certainly not what they did. When you see someone ranting about the 5G turning the coronavirus communist or whatever, that person didn't generally come up with that idea themselves; they were exposed to it online, either via friends, or via this.
Their algorithm is likely pushing extremist nonsense on people whom it determines are vulnerable to believing it, which isn't the same as having an affinity for it. Obviously this isn't what they set out to do; they presumably set out to increase engagement, and if that happens to increase engagement, well...
The solution is only difficult if you start from the premise that Facebook must continue to exist. If they cannot run a profitable business that isn't harmful, that's no one's problem but theirs.
When you're being reckless on purpose, none of the damage you create is accidental.
I could see an employee giving him that data out of concern, but that's a fair point.
A recommendation engine is just an algorithm that maximizes an objective function. That objective function is, roughly, matching users with content that they enjoy and engage with. The algorithm has no built-in notion of political extremism. It almost assuredly is the case that people with radical opinions prefer to consume media that matches their views. If Bob is a Three Percenter, it's highly unlikely he'd prefer to read the latest center-left think piece from The Atlantic.
Unless you're willing to ban recommendation engines entirely, the only possible alternative I can see is for Facebook to intentionally tip the scales. Extremist political opinions would have to be explicitly penalized in the objective function.
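To make the "tip the scales" idea concrete, here is a minimal sketch of a ranking function with an explicit penalty term. Everything here is invented for illustration (the function names, the scores, the penalty weight); real systems are vastly more complex, but the structure is the same: predicted engagement minus a penalty on flagged content.

```python
# Hypothetical sketch: rank items by predicted engagement, minus an
# explicit penalty for content flagged as extremist. All names and
# numbers are made up for illustration.

def rank(items, predicted_engagement, extremism_score, penalty_weight=0.5):
    """Return items sorted by engagement score minus an extremism penalty."""
    def score(item):
        return predicted_engagement[item] - penalty_weight * extremism_score[item]
    return sorted(items, key=score, reverse=True)

items = ["a", "b", "c"]
engagement = {"a": 0.9, "b": 0.8, "c": 0.5}  # "a" is the most engaging...
extremism = {"a": 1.0, "b": 0.0, "c": 0.0}   # ...but it is flagged

print(rank(items, engagement, extremism))  # ['b', 'c', 'a']: the penalty demotes "a"
```

The point of the sketch is that the penalty term is a human editorial judgment smuggled into the math: someone has to decide what counts as extremist and how heavy `penalty_weight` should be, which is exactly the concern raised below.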
But now you've turned Facebook from a neutral platform into an explicit arbiter of political opinion. It means some humans at Facebook are intentionally deciding what people should and should not read, watch and listen to. Remember Facebook as an organization is not terribly representative of the country as a whole. Fewer than 5% of Facebook employees vote Republican, compared to 50% of the country. Virtually no one is over 50. Males are over-represented relative to females. Blacks and hispanics are heavily under-represented. And that doesn't even get into international markets, where the Facebook org is even less representative.
The cure sounds worse than the disease. I really think it's a bad idea to pressure Facebook into the game of explicitly picking political winners and losers. A social media platform powerful enough to give you everything you want is strong enough to destroy everything you value.
The same thing social networks did before.
If I subscribed to 1000 people, show me whatever the hell they wrote, all of it, in chronological order.
Don't show me what my friends wrote on other pages; if they think that's important or interesting, they will link or share it manually.
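The feed described above fits in a few lines. This is a toy sketch with invented data structures, but it shows how little logic is actually required: only posts from accounts the user explicitly follows, newest first, no algorithmic reordering.

```python
# Toy sketch of a purely chronological feed. Post and the sample data
# are invented for illustration.

from dataclasses import dataclass

@dataclass
class Post:
    author: str
    timestamp: int  # e.g. Unix time
    text: str

def chronological_feed(posts, following):
    """Posts from followed accounts only, in reverse-chronological order."""
    subscribed = [p for p in posts if p.author in following]
    return sorted(subscribed, key=lambda p: p.timestamp, reverse=True)

posts = [
    Post("alice", 100, "hello"),
    Post("mallory", 150, "viral bait"),  # not followed: never shown
    Post("bob", 200, "news"),
]
feed = chronological_feed(posts, following={"alice", "bob"})
print([p.author for p in feed])  # ['bob', 'alice']
```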
What should Big Tobacco do? If your business is a net negative for the world... get out of business. This is not hard. Corporations are not precious endangered species that we have some moral obligation to keep alive.
> A recommendation engine is just an algorithm to maximize an objective function.
A cigarette is just dried leaves wrapped in paper. If the use and production of that device harms the world, stop using and producing it.
> But now you've turned Facebook from a neutral platform into an explicit arbiter of political opinion.
Facebook is already a non-neutral platform. Humans at Facebook chose to use an algorithm to decide recommendations and chose which datasets to use to train that algorithm.
Playing Russian roulette and pointing the gun at someone else before pulling the trigger does not absolve you of responsibility. Sure, the revolver randomly decided which chamber to stop at, but you chose to play Russian roulette with it.
Edit: I see the confusion now. emilsedgh, you, and I all agree. I thought emilsedgh was saying the opposite of what they wrote.
With social media, anecdotal accusations abound of negative impacts on mental health or political polarization. Yet the most carefully conducted research shows no evidence that either of these charges[1][2] is true to any meaningful degree. Simply put, the academic evidence is not congruent with the journalistic outrage.
What's more likely is the panic over social media is mirroring previous generations' moral panic over new forms of media. When the literary novel first gained popularity, social guardians in the older generation worried that it would corrupt the youth.[3]
The same story played out with movies, rock music, video games, and porn, among other things. The dynamic is propelled by old media having a vested interest in whipping up a frenzy against its new media competitors. In almost every case the concerns proved unfounded or overblown. I'd be pretty surprised if social media proved the exception, given how consistently the same story repeats.
[1]https://twitter.com/DegenRolf/status/1217307200517033986 [2]https://twitter.com/degenrolf/status/986146855007539201 [3]https://www.economist.com/1843/2020/01/20/an-18th-century-mo...
People's higher goals are often counter to their day-to-day instinctive behaviors. We should find ways to optimize those goals, rather than momentary metrics.
It was certainly questioned for many decades before we got to that point. Meanwhile, millions died. And during that entire time Big Tobacco had no difficulty drumming up doctors and scientists willing to argue against the negative health consequences of smoking.
Limit shares/retweets. Limit groups sizes. Surface more information as topics/tags/whatever so that users can do more sophisticated filtering themselves or collaboratively. I want to mute my uncle when he talks about politics, not all the time. Facebook already does more sophisticated analyses than just extracting topic information (I know because I work there and I can see the pipelines). Show those results to me so I can use them as I see fit. That's how you make things better. Chronological vs. algorithmic order? Pfft. In fact, I do want the most interesting things out of that set shown first. I just want to have more control over what's in the set.
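The "mute my uncle about politics, not all the time" idea can be sketched as per-author topic filters. The tags here are hand-assigned for illustration; the comment above assumes the platform would expose its own topic classification instead.

```python
# Hypothetical sketch of selective muting: hide a specific author's
# posts only when they carry a muted topic tag. All names and tags are
# invented for illustration.

def filter_feed(posts, mutes):
    """posts: list of (author, tags, text); mutes: {author: set of muted tags}."""
    visible = []
    for author, tags, text in posts:
        if mutes.get(author, set()) & set(tags):
            continue  # muted topic from this author: skip
        visible.append((author, tags, text))
    return visible

posts = [
    ("uncle", ["politics"], "rant"),
    ("uncle", ["fishing"], "caught a bass"),
    ("friend", ["politics"], "election thread"),
]
kept = filter_feed(posts, mutes={"uncle": {"politics"}})
print([text for _, _, text in kept])  # ['caught a bass', 'election thread']
```

Note the filter is user-controlled and per-author, so the uncle's fishing posts and the friend's politics posts both survive; that's the difference between this and a platform-wide topic ban.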
Sorting something to the 1000th page is censorship. Legally probably not, since it's still available and you just need to click page down 1000 times, but IANAL and don't care.
I don't want algorithms to do any filtering. If someone shares crap every 10 minutes I can always unfollow. Still, I like your idea about manual filtering with tags.
Rejection of science in favor of something you personally want to be true isn’t a new internet age development.
The clear result of this algorithm has been to happily send lies, misinformation, emotionally manipulative opinions, and other content at a scale and speed that was never achieved by a New York Times bestseller, MTV, or Rockstar Games.
All media has always exploited our cognitive biases and irrationality to its end; but to do it worldwide and simultaneously, 24 hours a day, 7 days a week, without rest or remorse, is pure stochastic terror.
Move fast and break things indeed.