zlacker

[return to "YouTube’s Algorithm Incentivizes the Wrong Behavior"]
1. roster+gh 2019-06-14 18:23:37
>>furcyd+(OP)
Ah, the classic “think of the children!” argument. It's no one's responsibility but the parent's to ensure their child isn't watching inappropriate content (and what counts as inappropriate will differ for every family and individual).

This article suggests that machine learning and collaborative filtering are incapable of producing healthy recommendations. I beg to differ: the New York Times may not like the results, but these techniques work for the vast majority of users on any service with too much content to curate manually.
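To be concrete about what "collaborative filtering" means here: at its core it just recommends items that co-occur in users' histories. A minimal item-based sketch in Python (the toy watch matrix and cosine similarity are my own illustrative choices, not any particular service's pipeline):

```python
import numpy as np

# Toy user-item watch matrix (rows = users, cols = videos); 1 = watched.
# Real systems use implicit signals like watch time, but the idea is the same.
R = np.array([
    [1, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 1, 1],
    [0, 0, 1, 1],
], dtype=float)

def item_similarity(R):
    """Cosine similarity between item (column) vectors."""
    norms = np.linalg.norm(R, axis=0, keepdims=True)
    norms[norms == 0] = 1.0  # avoid division by zero for unwatched items
    Rn = R / norms
    return Rn.T @ Rn

def recommend(R, user, k=2):
    """Score unseen items by their similarity to what the user already watched."""
    sim = item_similarity(R)
    scores = sim @ R[user]          # aggregate similarity to watched items
    scores[R[user] > 0] = -np.inf   # don't re-recommend already-seen items
    return np.argsort(scores)[::-1][:k]

print(recommend(R, user=0))  # videos most similar to user 0's watch history
```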

2. rspeer+Vh 2019-06-14 18:27:35
>>roster+gh
There are healthy recommender systems, like Spotify.

YouTube is a _disastrously_ unhealthy recommender system, and they've let it go completely out of control.

3. throw2+Bj 2019-06-14 18:39:56
>>rspeer+Vh
Spotify's recommendation system deals mostly with artists who have recording contracts and professional production; its problem shouldn't be compared to YouTube's, which has to handle a mix of professional, semi-pro, and amateur-created content. There's also a "freshness" aspect to a lot of YT videos that isn't quite the same as anything Spotify faces (pop songs are usually good for a few months, but many vlogs are stale after a week). Not only that, but many channels have a mix of content, some that goes stale quickly and some that is still relevant after many months. How does a recommendation engine figure that out?
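One plausible answer (a sketch of my own, not anything YouTube has documented) is to apply an exponential time decay to each video's relevance score, with a half-life estimated per video or per channel from how quickly engagement drops off after upload:

```python
def decayed_score(base_score: float, age_days: float, half_life_days: float) -> float:
    """Exponentially decay a relevance score by the item's age.

    half_life_days is the age at which the score halves; it would be
    estimated per video (or per channel) from how fast engagement falls
    off after upload, e.g. ~7 days for a vlog vs. ~180 for an evergreen
    tutorial. These numbers are illustrative, not YouTube's.
    """
    return base_score * 0.5 ** (age_days / half_life_days)

# A week-old vlog vs. a week-old evergreen video, same raw relevance:
print(decayed_score(1.0, age_days=7, half_life_days=7))    # 0.5   (already stale)
print(decayed_score(1.0, age_days=7, half_life_days=180))  # ~0.97 (still fresh)
```

Under a scheme like this, a channel mixing both kinds of content isn't a problem: the half-life is learned per video, so the vlogs fall out of rotation while the evergreen uploads keep surfacing.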

It's better to compare Spotify's recommendations to Netflix's, which also deal mostly with professional content. Those two systems have comparable performance, in my opinion.
