zlacker

1. dev2ro+ (OP) 2025-10-29 12:22:57
Kinda funny how Grokipedia looks like an encyclopedia but clearly talks like an LLM. Lots of confidence, not so much evidence.

It’s not that it’s trying to lie — that’s just how these models work. They’re great at making language sound right, not necessarily at being right. It feels more like a mirror of what the internet “thinks” than an actual source of truth.

If they framed it that way — more experiment, less Wikipedia — I think people would take it a lot better.
