- much less resource-hungry / planet-warming
- auditable chains of inference
But then again, humans "hallucinate" in this sense all the time, too.
The thing is, if we produce something that is worse than humans (and right now LLMs are worse than humans equipped with good search indexes), there's not much point in doing it. It's demonstrably less expensive to bear, raise, and educate actual humans. And to educate a human, somehow you don't need to dump the entire internet and all pirated content ever created into their head.