Also, there are whole subfields of philosophy that make your statement here look laughably naive. Suffice it to say that, no, knowledge as rigorously understood does not have "an objective correctness".
The fact that you are humanizing an LLM is honestly just plain weird. It does not have feelings. It doesn't care that it has to answer "is it correct?", and saying "poor LLM" is just tugging on heartstrings to make your point.
Knowledge is knowledge because the knower knows it to be correct. I know I'm typing this into my phone, because it's right here in my hand. I'm guessing you typed your reply into some electronic device. I'm guessing this is true for all your comments. Am I 100% accurate? You'll have to answer that for me. I don't know it to be true; it's a highly informed guess.
Being wrong sometimes is not what makes a guess a guess. It's the difference between pulling something from your memory banks, be they biological or mechanical, vs. inferring it from some combination of your knowledge (what's in those memory banks), statistics, intuition, and whatever other fairy dust you sprinkle on.