zlacker

1. myname (OP) 2025-05-06 18:19:13
I mean, it clearly does, based on your own comments showing the need for a correctness check to disambiguate between made-up "hallucinations" and actual "knowledge" (together, a "consistently correct guess").

The fact that you are humanizing an LLM is honestly just plain weird. It does not have feelings. It doesn't care that it has to answer "is it correct?", and saying "poor LLM" is just trying to tug on heartstrings to make your point.

2. ajross+06 2025-05-06 18:56:12
>>myname+(OP)
FWIW "asking the poor <system> to do <requirement>" is an extremely common idiom. It's used as a metaphor for an inappropriate or unachievable design requirement. Nothing to do with LLMs. I work on microcontrollers for a living.