Having a picture, or even just a description of a face, is one of the most important pieces of information the police have in order to do actual policing. You can be arrested for merely broadly matching a description if you happen to be in the vicinity.
Had the guy been convicted of anything based on that evidence alone, this would be a scandal. As it is, a suspect is just a suspect, and this kind of thing happens all the time, because humans are just as fallible. It's simply not news when there's no AI involved.
Faces generated by AI should not count as 'probable cause' to go and arrest people. They should count as fantasy.
They don't:
https://wfdd-live.s3.amazonaws.com/styles/story-full/s3/imag...
There was further work involved: there was a witness who identified the man in a photo lineup, and so on. The AI did not identify anyone; it gave a "best effort" match. All the actual mistakes were made by humans.