1. theoa+(OP) 2025-12-06 08:32:42
Draw a millipede as a dog:

Gemini responds:

Conceptualizing the "Millipup"

https://gemini.google.com/share/b6b8c11bd32f

Draw the five legs of a dog as if the body is a pentagon

https://gemini.google.com/share/d74d9f5b4fa4

And animal legs are quite standardized

https://en.wikipedia.org/wiki/List_of_animals_by_number_of_l...

It's all about the prompt. Example:

Can you imagine a dog with five legs?

https://gemini.google.com/share/2dab67661d0e

And generally, the issue sits between the computer and the chair.

;-)

replies(2): >>Rover2+MI >>vunder+5P
2. Rover2+MI 2025-12-06 16:14:52
>>theoa+(OP)
haha fair point, you can get the expected results with the right prompt, but I think it still reveals a general lack of true reasoning ability (or something)
replies(1): >>ithkui+wj1
3. vunder+5P 2025-12-06 17:06:08
>>theoa+(OP)
This is basically the "Rhinos are just fat unicorns" approach. Totally fine if you want to go that route but a bit goofy. You can get SOTA models to generate a 5-legged dog simply by being more specific about the placement of the fifth leg.

https://imgur.com/a/jNj98Pc

Asymmetry is as hard for AI models as it is for evolution to "prompt for", but they're getting better at it.

4. ithkui+wj1 2025-12-06 21:23:47
>>Rover2+MI
Or it just shows that the model overcorrects the prompt, which is generally a good idea in most cases, since the prompter usually isn't intentionally asking for something weird.

This happens all the time with humans. Imagine you're at a call center and get all sorts of weird descriptions of problems with a product: every agent is expected not to assume the caller is an expert, and to infer what the caller actually means from the odd wording they use.
