zlacker

[parent] [thread] 1 comments
1. anon_a+(OP)[view] [source] 2026-01-23 12:55:25
People's trust in LLMs imo stems from a lack of awareness that AI hallucinates. Hallucination benchmarks are often hidden or glossed over hastily in marketing videos.
replies(1): >>wpietr+77
2. wpietr+77[view] [source] 2026-01-23 13:38:55
>>anon_a+(OP)
I think it's better to say that LLMs only hallucinate. All the text they produce is entirely unverified; humans are the ones reading the text and constructing meaning from it.
[go to top]