This is like when Amazon tried to build a hiring bot, and the bot learned that if "harvard" appeared on your resume, you should be hired.
Or when certain courts used sentencing tools that recommended sentences for defendants, and those tools inevitably drew on historical statistics we already know were racially biased.
I agree safety is not "stop the Terminator 2 timeline," but there are serious safety concerns in simply embedding historical data into systems that make future decisions.