zlacker

[return to "Cursor IDE support hallucinates lockout policy, causes user cancellations"]
1. cs702+Xk4 2025-04-15 23:32:49
>>scared+(OP)
Yup, hallucinations are still a big problem for LLMs.

Nope, there's no reliable solution for them as of yet.

There's hope that hallucinations will be solved by someone, somehow, soon... but hope is not a strategy.

There's also hype about non-stop progress in AI. Hype is more of a strategy... but it can only work for so long.

If no solution materializes soon, many early-adopter LLM projects/trials will be cancelled. Sigh.

2. instag+1J4 2025-04-16 03:34:36
>>cs702+Xk4
One workaround when using RAG, mentioned on a podcast I listened to, is to employ a second LLM agent to check the first LLM's work: the first LLM is required to cite sources for its response, and the second agent then tries to locate those sources and confirm they actually support the answer, flagging the response as a hallucination when they don't.
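To make that concrete, here's a minimal sketch of the two-agent check. Everything here is hypothetical: `call_llm` is a stand-in for whatever model client you use (not any specific vendor API), and the function names and prompts are illustrative, not from the podcast.

```python
# Minimal sketch of a citation-based verifier for RAG answers.
# `call_llm(prompt) -> str` is a placeholder you must wire to your own model.
import re

def call_llm(prompt: str) -> str:
    """Stand-in for your LLM client (hosted API, local model, etc.)."""
    raise NotImplementedError("wire up your own model client here")

def answer_with_citations(question: str, documents: dict[str, str]) -> str:
    """Agent 1: answer the question, citing retrieved docs as [doc_id]."""
    context = "\n".join(f"[{doc_id}] {text}" for doc_id, text in documents.items())
    prompt = (
        "Answer the question using ONLY the sources below. "
        "Cite every claim with its [doc_id].\n\n"
        f"Sources:\n{context}\n\nQuestion: {question}"
    )
    return call_llm(prompt)

def verify_answer(answer: str, documents: dict[str, str]) -> bool:
    """Agent 2: reject the answer if any cited source doesn't exist,
    or if a cited source doesn't actually support the claim."""
    cited_ids = set(re.findall(r"\[([^\]]+)\]", answer))
    if not cited_ids:
        return False  # no citations at all -> treat as unverified
    for doc_id in cited_ids:
        if doc_id not in documents:
            return False  # hallucinated citation: source doesn't exist
        verdict = call_llm(
            "Does this source support the answer? Reply YES or NO.\n"
            f"Source: {documents[doc_id]}\nAnswer: {answer}"
        )
        if not verdict.strip().upper().startswith("YES"):
            return False
    return True
```

The cheap part (does the cited doc_id exist in the retrieved set?) catches fabricated citations outright; the second LLM call only handles the softer question of whether the source supports the claim, which is where this approach remains fallible.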