zlacker

[return to "YouTube’s Algorithm Incentivizes the Wrong Behavior"]
1. roster+gh 2019-06-14 18:23:37
>>furcyd+(OP)
Ah, the classic “think of the children!” argument. It is no one’s responsibility other than the parent to ensure their child isn’t watching inappropriate content (which will be different for every family and individual).

This article suggests that machine learning and collaborative filtering are incapable of producing healthy recommendations. I beg to differ: the New York Times may not like the results, but these techniques work for the vast majority of users on any service with far too much content to curate manually.
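For readers unfamiliar with the term, here is a minimal sketch of item-based collaborative filtering, the technique the comment names: score each unseen item by its similarity to items the user already rated. All names, ratings, and the toy dataset below are illustrative, not anything from YouTube or the article.

```python
# Toy item-based collaborative filtering sketch (illustrative data only).
# ratings: user -> {item: rating}. We recommend unseen items that are
# similar (by cosine over user-rating vectors) to items the user liked.
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two {user: rating} vectors."""
    common = set(a) & set(b)
    num = sum(a[u] * b[u] for u in common)
    den = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return num / den if den else 0.0

def recommend(ratings, user, k=2):
    # Invert to item -> {user: rating} vectors.
    by_item = {}
    for u, items in ratings.items():
        for i, r in items.items():
            by_item.setdefault(i, {})[u] = r
    seen = ratings[user]
    scores = {}
    for i in by_item:
        if i in seen:
            continue
        # Score an unseen item by its similarity to each item the user
        # rated, weighted by that rating.
        scores[i] = sum(cosine(by_item[i], by_item[j]) * r for j, r in seen.items())
    return sorted(scores, key=scores.get, reverse=True)[:k]

ratings = {
    "alice": {"jazz_tutorial": 5, "piano_chords": 4},
    "bob":   {"piano_chords": 5, "music_theory": 4},
    "carol": {"jazz_tutorial": 4, "music_theory": 5, "cat_video": 2},
}
print(recommend(ratings, "alice"))  # → ['music_theory', 'cat_video']
```

Whether a system built on this idea is "healthy" depends less on the math than on what signal it optimizes; the sketch uses explicit ratings, whereas YouTube famously optimizes watch time.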

2. rspeer+Vh 2019-06-14 18:27:35
>>roster+gh
There are healthy recommender systems, like Spotify.

YouTube is a _disastrously_ unhealthy recommender system, and they've let it go completely out of control.

3. ariwil+ri 2019-06-14 18:31:33
>>rspeer+Vh
Spotify has about 40M tracks in total; on YouTube, users watch more than 5B videos every day. Problems at different scales demand different solutions.

4. amphib+Zi 2019-06-14 18:35:26
>>ariwil+ri
I don't know what the comment you are replying to meant, I interpreted it to mean the algo takes you down a rabbit hole to darker content, however for me I miss the days when it actually recommended relevant videos, similar to the one I was watching.

My entire sidebar is now just a random assortment of irrelevant interests. For instance, I once wanted to learn to play a denser piano chord. I learned it ages ago, but I still get something like 20 videos explaining how to add extensions to a 7th chord, even when I'm watching a video on the F-35 fighter pilot.
