I'm reminded of how Google Images had an issue where dark-skinned people sometimes turned up in a search for gorilla. 99.9% of the time the image recognition algorithm did really well, but here was a case where an error was really offensive. What was (probably) needed was for a human to step in, not to tag every gorilla image, but to give the model some extra training around dark-skinned humans and gorillas, or otherwise tweak things for that specific case, so the chance of it happening dropped to nearly nothing. The rough idea is sketched in the snippet below.
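To be clear, I don't know what Google actually did internally; this is just a minimal sketch of one way to "tweak things for that specific case": a cost-sensitive loss where one particular confusion is penalized far more heavily than ordinary mistakes. The class names and cost values here are made-up assumptions for illustration.

```python
import torch
import torch.nn.functional as F

# Hypothetical label indices; these names and numbers are illustrative
# assumptions, not anyone's actual production setup.
PERSON, GORILLA, CAT, DOG = 0, 1, 2, 3
NUM_CLASSES = 4

# Cost matrix: cost[true][predicted]. Most confusions cost 1.0,
# but labelling a person as a gorilla is treated as far worse.
cost = torch.ones(NUM_CLASSES, NUM_CLASSES)
cost.fill_diagonal_(0.0)
cost[PERSON, GORILLA] = 100.0

def expected_cost_loss(logits, targets):
    """Mean expected misclassification cost under the model's predicted
    distribution, instead of treating every error as equally bad."""
    probs = F.softmax(logits, dim=-1)   # (batch, num_classes)
    per_class_cost = cost[targets]      # (batch, num_classes)
    return (probs * per_class_cost).sum(dim=-1).mean()

# Usage with any classifier that produces logits:
#   loss = expected_cost_loss(model(images), labels)
#   loss.backward()
```

The point isn't this exact formula; it's that the training signal (or some post-hoc rule) has to encode which errors are merely annoying and which are unacceptable.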
There are probably a ton of situations like that on YouTube, where certain kinds of mistakes are hardly noticed (it shows you a video you weren't remotely interested in), but others can be really bad and need special handling to avoid (such as showing violent or sexual content to someone who likes nursery rhymes and Peppa Pig). Something like the guardrail sketched below.
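Again, I have no idea how YouTube's pipeline actually works; this is just a sketch of the simplest version of that special handling: a hard post-ranking filter for the high-severity case, rather than trusting the ranking model to get it right 99.9% of the time. The tags and field names are hypothetical.

```python
# Hypothetical content-rating tags attached to each candidate video.
UNSAFE_FOR_KIDS = {"violence", "sexual", "horror"}

def filter_recommendations(candidates, viewer_is_child):
    """Drop the catastrophic-mistake category outright for child viewers,
    leaving the ordinary 'meh, not interested' errors to the ranker."""
    if not viewer_is_child:
        return candidates
    return [c for c in candidates
            if UNSAFE_FOR_KIDS.isdisjoint(c.get("tags", ()))]

recs = filter_recommendations(
    [{"id": "abc123", "tags": ["nursery_rhymes"]},
     {"id": "xyz789", "tags": ["violence"]}],
    viewer_is_child=True,
)
# recs keeps only the nursery-rhyme video, no matter how highly
# the ranking model scored the other one.
```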