This article suggests that machine learning and collaborative filtering are incapable of producing healthy recommendations. I beg to differ: the New York Times may not like the results, but these techniques work for the vast majority of users on any service with too much content to curate manually.
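To be concrete about the technique under debate, here's a minimal item-item collaborative filtering sketch (all the data, video titles, and numbers are invented for illustration): videos co-watched by similar users get recommended, and nothing in the objective says whether the result is "healthy".

```python
# Toy sketch of item-item collaborative filtering. All data invented.
import numpy as np

# Rows = users, columns = videos; 1 means the user watched the video.
watch = np.array([
    [1, 1, 0, 0],   # user A: piano lesson, jazz chords
    [1, 1, 1, 0],   # user B: piano lesson, jazz chords, fighter jets
    [0, 0, 1, 1],   # user C: fighter jets, cooking
])
videos = ["piano lesson", "jazz chords", "fighter jets", "cooking"]

# Cosine similarity between video columns.
norms = np.linalg.norm(watch, axis=0)
sim = (watch.T @ watch) / np.outer(norms, norms)

# Recommend for user A: score unseen videos by similarity to watched ones.
user = watch[0]
scores = sim @ user
scores[user == 1] = -np.inf   # don't re-recommend what they've already seen
print(videos[int(np.argmax(scores))])  # -> "fighter jets"
```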
YouTube is a _disastrously_ unhealthy recommender system, and they've let it go completely out of control.
My entire sidebar is now just a random assortment of stale interests. For instance, I once wanted to learn a denser piano chord. I learned it ages ago, but I still get 20-odd videos explaining how to add extensions to a 7th chord, even while I'm watching a video about an F-35 fighter pilot.
YouTube is a paperclip maximizer (where paperclips correspond to eyeball-hours spent watching YouTube) and at some point optimizing paperclips becomes orthogonal to human existence, and then anticorrelated with it.
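A toy sketch of that objective (everything below is invented for illustration): if the only quantity the ranker ever sees is predicted watch-time, the longest rabbit hole always wins, whatever it does to the viewer.

```python
# Toy sketch of the "paperclip" objective: rank purely by predicted
# watch-time, with no term for whether the viewer is better off.
# All titles and numbers are invented for illustration.
candidates = {
    "calm piano tutorial":             {"pred_minutes": 6,  "wellbeing": +2},
    "outrage compilation":             {"pred_minutes": 22, "wellbeing": -3},
    "autoplay conspiracy rabbit hole": {"pred_minutes": 45, "wellbeing": -8},
}

def rank(items):
    # The only optimization target is eyeball-hours ("paperclips").
    return sorted(items, key=lambda v: items[v]["pred_minutes"], reverse=True)

print(rank(candidates))
# -> the rabbit hole wins; wellbeing never enters the objective.
```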
I think it's perfectly fair to say that, at present, the negatives may outweigh the positives.
(This argument doesn't apply solely to YouTube, of course.)
It's better to compare Spotify's recommendations to Netflix's, since Netflix also deals mostly in professional content. Those two systems perform comparably, in my opinion.
Parents are the best placed to know at an individual level. But "responsibility" is a cop-out if you're just dropping it on someone.
Granted, I agree it is a hard problem. Not even sure it is solvable. :(
Yes, but you _must_ understand that most (no, ALL) of the millennial generation grew up with public content over the airwaves that was curated and had to pass certain guidelines. So many parents think that the YouTube Kids app is the same thing. It's not!
If YouTube wants to be the next television, it's going to have to assume the responsibilities and expectations surrounding the appliances it intends to replace. Pulling a Pontius Pilate and tossing the issue to another algorithm to fail at figuring out is not going to fix the problem.
Thankfully, there's much more out there than YouTube when it comes to children's entertainment, actually curated by human beings with eyeballs and brains rather than algorithms. The problem is that parents don't know these apps even exist, because YouTube has that much of a foothold as the "place to see things that shut my kid up, so I can see straight."
Can you explain in more detail?
I use YouTube as a crowdsourced "MOOC"[0], and the algorithm usually recommends excellent follow-up videos for most topics.
(On the other hand, their attempt at matching "relevant" advertising to the video is often terrible, e.g. Sephora makeup videos shown to the male-dominated audience of audiophile gear channels. Leaving aside the weird ads, the algorithm works very well for educational videos that interest me.)
[0] https://en.wikipedia.org/wiki/Massive_open_online_course
Society has long had laws in place to prevent children from viewing things they shouldn't (inappropriate movies, magazines, etc.).
Yes, I was aware of Elsagate.[0] I don't play games, so I didn't realize every gaming video ends up surrounded by unwanted far-right and Ben Shapiro videos.
I guess I should have clarified my question. I thought gp's "unhealthy" meant YouTube's algorithm was bad for somebody like me who views mainstream, non-controversial videos. (An analogy might be gp (rspeer) warning me that asbestos and lead paint are actually carcinogenic but the public doesn't know it.)
I have never seen any Nazi or far-right videos in my recommendations.
They don't. That's confirmation bias at work.
I've definitely seen this with comics. I watched a few videos criticizing Avengers: Infinity War, and now I see mostly Ben Shapiro recommendations. It makes no sense. I have never sought out (and never plan to seek out) political content on YouTube.
On the internet it is much more difficult, of course, and we can't realistically expect some shady offshore site to implement age checks, let alone recommendation algorithms. But Google is a public, respected company from a first-world country that claims to be promoting social good (which, of course, is marketing BS, and even if it weren't, I would not want their idea of social good, but still). You'd think they would invest some effort into at least not showing inappropriate content to kids. But no, they throw up their hands and go on ideological witch hunts instead.
It could be the type of games involved, since I usually watch strategy, 4x, city-building, and military sims. I usually get history-channel documentaries or "here's how urban planning works in the real world" videos recommended, which suits me fine. Somebody whose gaming preferences involve killing Nazis in a WW2-era FPS might be more likely to get videos that have neo-Nazis suggesting we kill people.
The article cites actual instances and recurring problems showing that "machine learning and collaborative filtering are incapable of producing healthy recommendations": even when YouTube tried to produce child-friendly content, they failed. You can't just say "it's fine" after the article shows it not being fine.
If nobody gives a fuck enough to affect the business, you can show the complete Saw series to 3-year-olds, and all the offended can do is yelp indignantly.
But then, I try not to let my mom on YouTube either. Or myself, for that matter.