https://pages.nist.gov/frvt/html/frvt11.html
https://www.ftc.gov/news-events/news/press-releases/2023/12/...
https://www.theguardian.com/technology/2020/jan/24/met-polic...
https://link.springer.com/article/10.1007/s00146-023-01634-z
https://www.mozillafoundation.org/en/blog/facial-recognition...
https://surface.syr.edu/cgi/viewcontent.cgi?article=2479&con...
Yeah it's pretty fucking shit, actually.
Here's the science.
LLMs are sycophants; how you ask matters.
While asking an LLM can be OK as a starting point, you should verify what it tells you through non-LLM sources as well.
Otherwise you'll end up misunderstanding things that you _think_ you've learned about. :(