zlacker

1. ajross+(OP) 2025-05-06 17:55:14
This is a circular semantic argument. You're saying knowledge is knowledge because it's correct, whereas guessing is guessing because it's a guess. But "is it correct?" is precisely the question you're asking the poor LLM to answer in the first place. It's not helpful to just demand that a computing device work the way you want; you need to actually make it work.

Also, too, there are whole subfields of philosophy that make your statement here kinda laughably naive. Suffice it to say that, no, knowledge as rigorously understood does not have "an objective correctness".

replies(2): >>myname+Q3 >>Volund+4x
2. myname+Q3 2025-05-06 18:19:13
>>ajross+(OP)
I mean, it clearly does, based on your comments showing a need for a correctness check to disambiguate between made-up "hallucinations" and actual "knowledge" (together, a "consistently correct guess").

The fact that you are humanizing an LLM is honestly just plain weird. It does not have feelings. It doesn't care that it has to answer "is it correct?", and saying "poor LLM" is just trying to tug on heartstrings to make your point.

replies(1): >>ajross+Q9
3. ajross+Q9 2025-05-06 18:56:12
>>myname+Q3
FWIW "asking the poor <system> to do <requirement>" is an extremely common idiom. It's used as a metaphor for an inappropriate or unachievable design requirement. Nothing to do with LLMs. I work on microcontrollers for a living.
4. Volund+4x 2025-05-06 21:36:30
>>ajross+(OP)
> You're saying knowledge is knowledge because it's correct, where guessing is guessing because it's a guess.

Knowledge is knowledge because the knower knows it to be correct. I know I'm typing this into my phone, because it's right here in my hand. I'm guessing you typed your reply into some electronic device. I'm guessing this is true for all your comments. Am I 100% accurate? You'll have to answer that for me. I don't know it to be true; it's a highly informed guess.

Being wrong sometimes is not what makes a guess a guess. It's the difference between pulling something from your memory banks, be they biological or mechanical, vs. inferring it from some combination of your knowledge (what's in those memory banks), statistics, intuition, and whatever other fairy dust you sprinkle on.
