zlacker

[return to "YouTube’s Algorithm Incentivizes the Wrong Behavior"]
1. strike+jh 2019-06-14 18:23:48
>>furcyd+(OP)
"If YouTube won’t remove the algorithm, it must, at the very least, make significant changes, and have greater human involvement in the recommendation process.", man does this person know how many videos and how many users YouTube has? They cannot use anything except an algorithm to recommend videos. They cannot use anything except an algorithm to detect videos inappropriate for children. It seems YouTube is working on this, and this opinion seems like a ill thought out fluff piece to enrage readers and sell this persons book.
2. robbro+l01 2019-06-15 02:49:16
>>strike+jh
I would think having humans more involved in training the algorithm, rather than reviewing every video directly, could scale much better.

Also, detecting videos that are inappropriate for children is a lot harder than determining which content creators can be trusted to post appropriate videos (and to tag them correctly). That trust can be learned from the creator's history: how many times their uploads have been flagged, whether they get upvotes from users who are themselves deemed credible, and so on. The more layers of indirection, the better, a la PageRank.
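
Roughly, a toy version of that PageRank-style trust propagation might look like the following (the names, numbers, and input shape are all made up for illustration, not how YouTube actually does anything):

    # "endorsements" is a hypothetical list of (endorser, endorsed) pairs,
    # e.g. derived from upvotes by users on a creator's videos.
    def trust_scores(endorsements, iterations=20, damping=0.85):
        """PageRank-style iteration: you are trustworthy if
        trustworthy users endorse you, recursively."""
        users = {u for pair in endorsements for u in pair}
        n = len(users)
        trust = {u: 1.0 / n for u in users}
        out_degree = {u: 0 for u in users}
        for endorser, _ in endorsements:
            out_degree[endorser] += 1
        for _ in range(iterations):
            incoming = {u: 0.0 for u in users}
            for endorser, endorsed in endorsements:
                # each user splits their trust across everyone they endorse
                incoming[endorsed] += trust[endorser] / out_degree[endorser]
            trust = {u: (1 - damping) / n + damping * incoming[u]
                     for u in users}
        return trust

    scores = trust_scores([("alice", "bob"), ("bob", "carol"), ("alice", "carol")])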

So even without analyzing the videos themselves, the algorithm would have a much smaller set of videos to recommend from, but still potentially millions. You still need some staff to train the algorithm, but you don't need paid staff to look at every single video to build a good recommendation pool. The staff might spend most of their time on anomalous videos, such as one posted by a user the algorithm trusts but flagged by a user the algorithm considers credible. They would then tag that video with rich information to help the algorithm in the future, beyond just removing the video, reducing the poster's trust, or reducing the flagger's credibility.
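
The triage rule for what lands in front of staff could be as simple as this sketch (thresholds and field names invented for illustration):

    TRUSTED = 0.8    # posters above this score are in the recommendation pool
    CREDIBLE = 0.8   # flags from users above this score are taken seriously

    def review_queue(videos, trust):
        """Queue videos where the signals disagree: a trusted poster but
        at least one credible flagger. Humans then add rich tags instead
        of the system silently removing the video or docking scores."""
        queue = []
        for video in videos:
            trusted_poster = trust.get(video["poster"], 0.0) >= TRUSTED
            credible_flag = any(trust.get(f, 0.0) >= CREDIBLE
                                for f in video["flaggers"])
            if trusted_poster and credible_flag:
                queue.append(video)
        return queue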

3. ehsank+B41 2019-06-15 04:36:36
>>robbro+l01
The algorithm works really damn well in 99.999% of cases. It manages to show me great recommendations for very niche things I'm interested in. But it's that very same behavior that can, in some cases, lead to issues.
4. nrayna+7i1 2019-06-15 10:05:57
>>ehsank+B41
For me it always pulls towards television or Hollywood garbage, and videos I have already watched, hundreds of them.