1. root_a (OP) 2024-05-15 17:22:25
Indeed. It's also obvious when the "hallucinations" create contradictory responses that a conceptual understanding would always preclude. For example, "In a vacuum, 100g of feathers and 100g of iron would fall at the same rate due to the constant force of gravity, thus the iron would hit the ground first". Only a language model makes this type of mistake because its output is statistical, not conceptual.
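To make the contradiction concrete: in a vacuum the fall time follows from h = ½gt², which contains no mass term at all. A minimal Python sketch (the 10 m drop height and g = 9.81 m/s² are illustrative assumptions, not from the quote):

    import math

    def fall_time(height_m, g=9.81):
        # In vacuum: h = (1/2) * g * t^2, so t = sqrt(2h / g).
        # Mass cancels out of Newton's second law (a = F/m = m*g/m = g),
        # so it never appears in the formula.
        return math.sqrt(2 * height_m / g)

    # 100 g of feathers vs. 100 g of iron, dropped from an assumed 10 m:
    print(fall_time(10.0))  # ~1.43 s, identical for both objects

Because mass cancels, "falls at the same rate" and "the iron hits first" cannot both be true, which is exactly the kind of internal contradiction a conceptual model would catch.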