zlacker

3 comments
1. oscarb+(OP) 2024-02-14 08:34:07
Thanks. How do we know none of this is a hallucination?
replies(3): >>simonw+Y9 >>jsemra+gt >>livshi+I32
2. simonw+Y9 2024-02-14 10:33:36
>>oscarb+(OP)
Prompt leaks like this are never hallucinations in my experience.

LLMs are extremely good at repeating text back out again.

Every time this kind of thing comes up, multiple people are able to reproduce the exact same results using many different prompt variants, which reinforces that this is the real prompt.

3. jsemra+gt 2024-02-14 13:37:45
>>oscarb+(OP)
Hallucinations are caused by missing context; in this case, enough context should be available to the model. But I haven't kicked the tires on it yet.
4. livshi+I32 2024-02-14 21:28:20
>>oscarb+(OP)
If you repeat the process twice and the exact same text comes out both times, it's very unlikely to be a hallucination.
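The consistency check described above can be sketched in a few lines. This is a hypothetical illustration, not anyone's actual tooling: `responses` stands in for the text returned by several independent extraction attempts, and the check is simply exact-match agreement.

```python
def all_identical(responses):
    """True when every extraction attempt returned the exact same text."""
    return len(set(responses)) == 1

# Simulated outputs from repeated extraction attempts (placeholder data):
responses = [
    "You are a helpful assistant.",
    "You are a helpful assistant.",
    "You are a helpful assistant.",
]

# Identical outputs across runs (and across different prompt wordings)
# are the signal the commenters describe; a hallucination would drift.
print(all_identical(responses))  # True
```

In practice you'd want more than two runs and ideally different prompt wordings, since the claim in the thread is that agreement across *varied* prompts is what rules out confabulation.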