Interns don’t cost 20 bucks a month, but training users in the specifics of your org is still important. Knowing what's important and what's pointless comes with understanding the skill set.
This roughly matches my experience too, but I don't think it applies to this one. It has a few ideas that were new to me, and I'm glad I read it.
> I’m ready to write a boilerplate response because I already know what they’re going to say
If you have one that addresses what this one talks about, I'd be interested in reading it.
> This roughly matches my experience too, but I don't think it applies to this one.
I'm not so sure. The claim that any good programming language would inherently eliminate the concern about hallucinations seems pretty weak to me.
It seems obviously true to me: code hallucinations are where the LLM outputs code with incorrect details - syntax errors, incorrect class methods, invalid imports, etc.
If you have a strong linter in a loop, those mistakes can be automatically detected and passed back into the LLM to get fixed.
Surely that's a solution to hallucinations?
It won't catch logic errors, but I would classify those as bugs, not hallucinations.
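
To make that concrete, here's a rough Python sketch of the loop I mean. The `call_llm` helper is a hypothetical stand-in for whatever model API you're using, and I'm using `compile()` plus pyflakes (which you'd need installed) as the "strong linter" - a real setup would swap in your project's actual lint/type-check command:

```python
# Rough sketch of "a strong linter in a loop", assuming a hypothetical
# call_llm(prompt) -> str helper. compile() catches syntax errors;
# pyflakes catches undefined names and unused imports. A fuller loop
# might also run a type checker or the test suite.
import os
import subprocess
import sys
import tempfile


def check_code(code: str) -> str | None:
    """Return an error report, or None if the code passes the checks."""
    try:
        compile(code, "<llm-output>", "exec")  # syntax check only, never executes
    except SyntaxError as e:
        return f"SyntaxError: {e}"
    with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
        f.write(code)
        path = f.name
    try:
        result = subprocess.run(
            [sys.executable, "-m", "pyflakes", path],
            capture_output=True, text=True,
        )
    finally:
        os.unlink(path)
    return result.stdout or None


def generate_with_lint_loop(task: str, call_llm, max_rounds: int = 3) -> str:
    prompt = task
    for _ in range(max_rounds):
        code = call_llm(prompt)
        errors = check_code(code)
        if errors is None:
            return code  # invented names and broken syntax were caught above
        # Feed the linter output back so the model can fix its own mistakes.
        prompt = (f"{task}\n\nYour previous attempt failed these checks:\n"
                  f"{errors}\nPlease fix and resend the full code.")
    raise RuntimeError("code still failing lint checks after max retries")
```

The point is just that the checker's output goes back into the prompt, so hallucinated details get caught mechanically instead of by a human reviewer.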