zlacker

1. cyanyd+(OP)[view] [source] 2023-11-21 22:34:37
if you're using reddit logic, the user needs to present the wrong answer first, before getting the right answer

anyway, LLMs aren't thinking. they're pattern matching, and they don't seem to be doing recursion.

I'd say the only way you're getting error correction is taking multiple LLMs and running them in chains and in parallel, cross-checking each other's outputs.
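the parallel part could be sketched as a simple majority vote across independent models. this is just a toy illustration, not anyone's actual system — the ask_model_* functions are hypothetical stubs standing in for separate LLM calls:

```python
# Toy sketch of parallel cross-checking: ask several independent
# models the same question and take the majority answer.
# The ask_model_* stubs are hypothetical placeholders for real LLM calls.
from collections import Counter

def ask_model_a(question): return "4"
def ask_model_b(question): return "4"
def ask_model_c(question): return "5"   # one model gets it wrong

def majority_answer(question, models):
    """Query each model and return the most common answer plus agreement rate."""
    answers = [m(question) for m in models]
    winner, count = Counter(answers).most_common(1)[0]
    return winner, count / len(answers)

answer, agreement = majority_answer("2 + 2 = ?", [ask_model_a, ask_model_b, ask_model_c])
print(answer, agreement)
```

the chained version would instead feed one model's answer into the next as a draft to critique, but the voting variant is the easier one to show in a few lines.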
