zlacker

[return to "YouTube’s Algorithm Incentivizes the Wrong Behavior"]
1. roster+gh[view] [source] 2019-06-14 18:23:37
>>furcyd+(OP)
Ah, the classic “think of the children!” argument. It is no one’s responsibility other than the parent to ensure their child isn’t watching inappropriate content (which will be different for every family and individual).

This article suggests that machine learning and collaborative filtering are incapable of producing healthy recommendations. I beg to differ: the New York Times may not like the results, but these systems work for the vast majority of users on any service with too much content to curate manually.
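For anyone unfamiliar, item-based collaborative filtering at its simplest is just "people who watched X also watched Y." Here's a minimal sketch with toy data (my own made-up example, nothing resembling YouTube's actual system):

    # Minimal item-based collaborative filtering sketch (toy data).
    import numpy as np

    # Rows = users, columns = videos; 1 means the user watched that video.
    watch_matrix = np.array([
        [1, 1, 0, 0, 1],
        [1, 1, 1, 0, 0],
        [0, 0, 1, 1, 0],
        [0, 1, 1, 1, 0],
    ], dtype=float)

    # Cosine similarity between video columns: "watched together" signal.
    norms = np.linalg.norm(watch_matrix, axis=0)
    similarity = (watch_matrix.T @ watch_matrix) / np.outer(norms, norms)

    def recommend(user_history, top_n=2):
        """Score unwatched videos by similarity to what the user already watched."""
        scores = similarity @ user_history
        scores[user_history > 0] = -np.inf  # don't re-recommend watched videos
        return np.argsort(scores)[::-1][:top_n]

    print(recommend(np.array([1, 0, 0, 0, 1], dtype=float)))  # e.g. suggests video 1

Scale that same idea up to billions of watch events and you get the rough shape of the system; the argument is over what it optimizes for, not whether the math can produce useful suggestions.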

◧◩
2. rspeer+Vh[view] [source] 2019-06-14 18:27:35
>>roster+gh
There are healthy recommender systems, like Spotify.

YouTube is a _disastrously_ unhealthy recommender system, and they've let it go completely out of control.

◧◩◪
3. jasode+sk[view] [source] 2019-06-14 18:45:01
>>rspeer+Vh
>YouTube is a _disastrously_ unhealthy recommender system,

Can you explain with more details?

I use Youtube as a crowdsourced "MOOC"[0] and the algorithms usually recommend excellent follow-up videos for most topics.

(On the other hand, their attempt at matching "relevant" advertising to the video is often terrible. E.g., Sephora makeup ads for women shown to the male-dominated audience of audiophile gear videos. Leaving aside the weird ads, the algorithm works very well for educational vids that interest me.)

[0] https://en.wikipedia.org/wiki/Massive_open_online_course

◧◩◪◨
4. ilikeh+Kl[view] [source] 2019-06-14 18:52:24
>>jasode+sk
Yes. Elsagate is one example: the creepy, computer-generated, violent and disturbing videos that eventually follow children's content. Another is the fact that just about every gaming-related video carries a recommendation for a far-right rant against feminism or a Ben Shapiro screaming segment. There's also the Amazon problem, where everything related to the thing you watched once out of curiosity follows you everywhere around the site.
◧◩◪◨⬒
5. jasode+xm[view] [source] 2019-06-14 18:56:55
>>ilikeh+Kl
>Elsagate is one example

Yes, I was aware of Elsagate.[0] I don't play games so didn't realize every gaming video ends up with unwanted far-right and Ben Shapiro videos.

I guess I should have clarified my question. I thought gp's "unhealthy" meant Youtube's algorithm was bad for somebody like me who views mainstream, non-controversial videos. (An analogy might be gp (rspeer) warning me that asbestos and lead paint are actually carcinogenic but the public doesn't know it.)

[0] https://news.ycombinator.com/item?id=20090157

◧◩◪◨⬒⬓
6. Reedx+3o[view] [source] 2019-06-14 19:07:23
>>jasode+xm
> I don't play games so didn't realize every gaming video ends up with unwanted far-right and Ben Shapiro videos.

They don't. That's confirmation bias at work.

◧◩◪◨⬒⬓⬔
7. smt88+eq[view] [source] 2019-06-14 19:23:57
>>Reedx+3o
It's not 100%, but I'd consider "video games" => "Ben Shapiro" to be a pretty awful recommendation system, regardless of the reasoning behind it. As far as I know, the group "video gamers" doesn't have a political lean in either direction.

I've definitely seen this with comics. I watched a few videos criticizing Avengers: Infinity War, and now I see mostly Ben Shapiro recs. It makes no sense. I have never sought out (and never plan to seek out) political content on YouTube.
