It’s as if half the arguments are designed as engagement bait, with logical consistency a distant concern:
> If hallucination matters to you, your programming language has let you down.
This doesn’t even make sense. LLMs hallucinate things well beyond simple programming language constructs. I commonly deal with allusions to functions or library methods that would be great if they existed, but that the LLM made up on the spot.
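To make that concrete, here's a minimal sketch of the sanity check I end up doing by hand whenever a suggestion smells off. The helper and the `dumps_pretty` example are mine and purely hypothetical, chosen because it's exactly the kind of plausible-sounding API an LLM invents:

```python
# Minimal sketch: verify that an LLM-suggested attribute actually exists
# before trusting generated code. `json.dumps_pretty` is a hypothetical
# hallucination used for illustration; it is not a real API.
import importlib

def attribute_exists(module_name: str, dotted_attr: str) -> bool:
    """Return True if dotted_attr (e.g. 'dumps') resolves on the module."""
    try:
        obj = importlib.import_module(module_name)
    except ImportError:
        return False
    for part in dotted_attr.split("."):
        if not hasattr(obj, part):
            return False
        obj = getattr(obj, part)
    return True

print(attribute_exists("json", "dumps"))         # True: real function
print(attribute_exists("json", "dumps_pretty"))  # False: plausible, but made up
```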
The thing is, the author clearly must know this. Anyone who uses LLMs knows this. So why put such a bizarre claim in the article other than as engagement bait to make readers angry?
There are numerous other odd claims throughout the article, like waving away the IP rights argument because some programmers pirate TV shows. It’s all so bizarre.
I guess I shouldn’t be surprised to scroll to the bottom and see that the author is an HN comment-section veteran, because this entire article feels like it started as a reasonable discussion point and then got twisted into Hacker News engagement bait for the company blog. And it’s working well, judging by the engagement counts.
> But “hallucination” is the first thing developers bring up when someone suggests using LLMs, despite it being (more or less) a solved problem.
Really? What is the author smoking to consider it a solved problem? This statement alone invalidates the entire article with its casual disregard for the truth.
I use Copilot every day, and I know where it shines. Please don’t try to sell it to me with false advertising.
If it uses a function, then you can be sure that function is real, because the agent actually compiles and runs the code, and a made-up function fails immediately.
Was this not clear? The explanation I'm paraphrasing is right in between the line Aurornis quoted and the line you quoted, except for the crack at Copilot that's up at the top.
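Roughly, the loop the article describes looks like this. This is my sketch, not the author's tooling, and `run_candidate` is a name I made up: generated code gets executed, and a hallucinated function surfaces as an immediate error that gets fed back to the model instead of shipped.

```python
# Rough sketch of an agent-style check (assumed names, not the article's code):
# running the generated code surfaces hallucinated functions as hard errors.
import subprocess, sys, tempfile

def run_candidate(code: str) -> tuple[bool, str]:
    """Execute generated code in a subprocess and capture stderr."""
    with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
        f.write(code)
        path = f.name
    result = subprocess.run([sys.executable, path],
                            capture_output=True, text=True, timeout=30)
    return result.returncode == 0, result.stderr

# A hallucinated API fails loudly; an agent loop would feed `err` back to the model.
ok, err = run_candidate("import json\nprint(json.dumps_pretty({}))")
assert not ok and "AttributeError" in err
```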