zlacker

[parent] [thread] 3 comments
1. NoOn3+(OP)[view] [source] 2023-11-18 09:34:19
I just noticed that when I ask really difficult technical questions that nevertheless have an exact answer, it often gives a plausible but incorrect answer instead of saying "I don't know". But over time it seems to get smarter, and there are fewer and fewer such questions...
replies(2): >>ben_w+j >>davegu+je2
2. ben_w+j[view] [source] 2023-11-18 09:37:17
>>NoOn3+(OP)
Have you tried setting a custom instruction in the settings? I find it helps, albeit with weaker impact than the prompt itself.
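If you're going through the API instead, a system message plays roughly the same role. A minimal sketch with the openai Python client (1.x style; the model name and instruction wording here are just placeholders):

    # Approximating ChatGPT's custom instructions via a system message.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    response = client.chat.completions.create(
        model="gpt-4",  # placeholder model name
        messages=[
            {"role": "system",
             "content": "If you are not certain of an answer, "
                        "say 'I don't know' instead of guessing."},
            {"role": "user", "content": "..."},  # your hard question here
        ],
    )
    print(response.choices[0].message.content)

In my experience the system message nudges it toward admitting uncertainty, but it still doesn't eliminate confident wrong answers.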
replies(1): >>NoOn3+ta
3. NoOn3+ta[view] [source] [discussion] 2023-11-18 11:03:56
>>ben_w+j
It's not a problem for me. If anything it's useful: that tell lets me detect ChatGPT output.
4. davegu+je2[view] [source] 2023-11-18 23:24:03
>>NoOn3+(OP)
It doesn't get smarter except when new models are released. It's an inference engine.