zlacker
1. wpietr+(OP)
2026-01-23 13:38:55
I think it's better to say that LLMs only hallucinate: all the text they produce is equally unverified. It's the humans reading the text who construct the meaning.