Imagine if every time you went to the library to read about history, science, or poetry, you had to pass by a Playboy/Maxim/Pornhub version of science and history pulling you away. And if you succumbed and opened one of those volumes just once, then the next time you visited that aisle you'd find they had removed 10% of the educational books and added 10% more softcore porn.
China banned many types of content from the children's version of its local TikTok, but honestly we need bifurcated apps, or ways to filter thirst traps out of search results, for adults too. It's no different from putting chocolate bars in the health-foods aisle at the grocery store or alcohol vending machines in a rehab facility.
How many of our next would-be Einsteins, Edisons, Teslas, etc. are being distracted by TikTok, mobile games, and the like?
I mean, this is a game of cat and mouse. YT is already very aggressive on sexual content. Content creators just find the next line and push right up against it. You could require everyone making videos to wear suits and people would still find ways to be evocative.
At some point your best move is to self-moderate: ignore those thumbnails, or aggressively click "not interested" on any content like that.
>But the labeling could be done by users.
In these times? It'd be a bloodbath of political labels being thrown at various ideologies. It'd be a wreck.
No, this can't be automated. YT would need to pay staff and set a rubric. But I'm guessing it isn't financially impacting YT as is.
It could also go the other way: add a "misinformation" label and suddenly everything is misinformation, from Jordan Peterson speeches to live-recorded NASA space launches to SpongeBob clips.