> But “hallucination” is the first thing developers bring up when someone suggests using LLMs, despite it being (more or less) a solved problem.
Really? What is the author smoking to consider it a solved problem? This statement alone invalidates the entire article in its casual disregard for the truth.
I use Copilot every day, and I know where it shines. Please don't try to sell it to me with false advertising.
If it uses a function, then you can be sure that function is real.
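The claim is that an invented function name is self-exposing: calling it fails, so the hallucination cannot survive contact with the toolchain. As a minimal sketch of that idea (the function and snippet names here are hypothetical, not from the article), one can even catch it statically by collecting the names a generated snippet calls and checking them against what is actually defined or built in:

```python
import ast
import builtins

def undefined_calls(source: str, known: set) -> set:
    """Return names of functions called in `source` that are neither
    defined in the snippet, nor in `known`, nor Python builtins.
    A hallucinated API call surfaces here immediately."""
    tree = ast.parse(source)
    # Names the snippet itself defines
    defined = {
        node.name
        for node in ast.walk(tree)
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef))
    }
    # Simple-name calls like foo(...); attribute calls are out of scope here
    called = {
        node.func.id
        for node in ast.walk(tree)
        if isinstance(node, ast.Call) and isinstance(node.func, ast.Name)
    }
    return called - defined - known - set(dir(builtins))

# `frobnicate` is an invented (hallucinated) function in this example
snippet = "x = len(frobnicate([1, 2, 3]))"
print(undefined_calls(snippet, known=set()))  # -> {'frobnicate'}
```

This only covers plain calls in one file, but it illustrates the point: a fabricated function is not a subtle error, it is an immediately checkable one.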
Was this not clear? The explanation I'm paraphrasing sits right between the line Aurornis quoted and the line you quoted, except for the crack at Copilot up at the top.
Can you show me one PR put out by any agent in any widely used open-source repo?
[1] https://www.reddit.com/r/ExperiencedDevs/comments/1krttqo/my...
[1]: https://github.com/Aider-AI/aider/blob/main/HISTORY.md#aider...