zlacker

1. Rover2+(OP) 2025-12-06 16:14:52
haha fair point, you can get the expected results with the right prompt, but I think it still reveals a general lack of true reasoning ability (or something)
replies(1): >>ithkui+KA
2. ithkui+KA 2025-12-06 21:23:47
>>Rover2+(OP)
Or it just shows that it tries to overcorrect the prompt, which is generally a good idea in most cases, since the prompter usually isn't intentionally asking something weird.

This happens all the time with humans. Imagine you work at a call center and get all sorts of weird descriptions of problems with a product: every agent is expected not to assume the caller is an expert, and will actually try to infer what they might mean from the odd wording they use.
