I think one of the most common themes in science fiction is the talking computer being utterly, horribly wrong, often resulting in the complete annihilation of all life on Earth.
Unless I have been reading very different science fiction, I think it's definitely not that.
I think it's more the confidence and seeming plausibility of LLM answers.