zlacker

[return to "YouTube’s Algorithm Incentivizes the Wrong Behavior"]
1. roster+gh 2019-06-14 18:23:37
>>furcyd+(OP)
Ah, the classic “think of the children!” argument. It is no one’s responsibility other than the parent’s to ensure their child isn’t watching inappropriate content (which will be different for every family and individual).

This article suggests that machine learning and collaborative filtering are incapable of producing healthy recommendations. I beg to differ; the New York Times may not like the result, but they work for the vast majority of users on any service with too much content to manually curate.
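For context, collaborative filtering in its simplest item-based form works roughly like this. This is a toy sketch with made-up watch data, not a description of YouTube's actual system:

```python
# Toy item-based collaborative filtering: recommend what similar users watched.
# All usernames and video IDs below are illustrative placeholders.
import math

# Hypothetical watch-history matrix: user -> {video: 1 if watched}.
ratings = {
    "alice": {"video_a": 1, "video_b": 1},
    "bob":   {"video_a": 1, "video_b": 1, "video_c": 1},
    "carol": {"video_b": 1, "video_c": 1},
}

def cosine(u, v):
    """Cosine similarity between two sparse watch-history vectors."""
    common = set(u) & set(v)
    dot = sum(u[k] * v[k] for k in common)
    norm = (math.sqrt(sum(x * x for x in u.values()))
            * math.sqrt(sum(x * x for x in v.values())))
    return dot / norm if norm else 0.0

def recommend(user):
    """Score each unseen video by the similarity of the users who watched it."""
    seen = ratings[user]
    scores = {}
    for other, their_history in ratings.items():
        if other == user:
            continue
        sim = cosine(seen, their_history)
        for video in their_history:
            if video not in seen:
                scores[video] = scores.get(video, 0.0) + sim
    return max(scores, key=scores.get) if scores else None

print(recommend("alice"))  # -> "video_c"
```

The core point of the debate is that nothing in this mechanism judges whether the recommended item is healthy; it only surfaces what similar users engaged with.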

2. rspeer+Vh 2019-06-14 18:27:35
>>roster+gh
There are healthy recommender systems, like Spotify.

YouTube is a _disastrously_ unhealthy recommender system, and they've let it go completely out of control.

3. ihuman+vi 2019-06-14 18:32:15
>>rspeer+Vh
How is Spotify's recommender different from YouTube's?
4. notrid+9z 2019-06-14 20:34:58
>>ihuman+vi
I'm pretty sure all content on Spotify gets manually curated first, so abusive tagging doesn't happen, and some of the worst content simply doesn't get uploaded at all. Spotify also doesn't try to be a news site, so they can afford a couple weeks' lag between a song being uploaded and it showing up in people's recommendation feeds.