1. bombca+(OP) 2023-03-18 18:03:46
Wrong is saying that 2 + 2 is 5.

Wrong is saying that the sun rises in the west.

By calling it "hallucinating," they're trying to convey that it didn't just get something wrong, but instead dreamed up an alternate world where what you wanted existed, and then described that world.

Or, to look at it another way: it gave an answer that looks plausible enough that you can't immediately tell it's wrong.

replies(1): >>permo-+Yde
2. permo-+Yde 2023-03-22 19:47:04
>>bombca+(OP)
This isn't a good explanation. These LLMs are essentially statistical models: when they "hallucinate," they're not "imagining" or "dreaming," they're simply producing a string of tokens that your prompt, combined with the training corpus, implies to be likely.
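
To make that concrete, here's a toy sketch in Python (my own illustration, not anything from a real LLM; the `corpus` and `generate` names are made up, and a word-level bigram model is vastly simpler than a transformer, but the principle is the same). It only knows which word tends to follow which, yet it can emit "the sun rises in the west" because each local step is likely, even though the whole sentence is false:

    import random

    # Hypothetical toy training text, chosen to echo the thread's example.
    corpus = "the sun rises in the east . the sun sets in the west .".split()

    # Count next-word frequencies: counts[w][nxt] = times nxt followed w.
    counts = {}
    for prev, nxt in zip(corpus, corpus[1:]):
        counts.setdefault(prev, {})
        counts[prev][nxt] = counts[prev].get(nxt, 0) + 1

    def generate(prompt_word, length=6):
        out = [prompt_word]
        for _ in range(length):
            dist = counts.get(out[-1])
            if not dist:
                break
            words = list(dist)
            weights = [dist[w] for w in words]
            # Sample the next word in proportion to how often the
            # corpus implies it follows the previous one.
            out.append(random.choices(words, weights=weights)[0])
        return " ".join(out)

    print(generate("the"))
    # Can print "the sun rises in the west ." -- fluent-looking output
    # stitched from individually likely fragments.

There's no concept of true or false anywhere in there; fluent-but-wrong output falls straight out of sampling likely continuations.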