zlacker

[parent] [thread] 2 comments
1. uLogMi+(OP)[view] [source] 2024-05-15 14:30:12
Imagine trying to keep something so far above us in intelligence caged. Scary stuff...
replies(1): >>binary+if
2. binary+if[view] [source] 2024-05-15 15:41:16
>>uLogMi+(OP)
I’m genuinely curious - do you actually believe that GPT is a superintelligence? Because I have the opposite experience. It consistently fails to correctly follow even the most basic instructions. For a little while I thought maybe I was doing it wrong and needed better prompts, but then I realized that its zero-shot and few-shot capabilities are really hit and miss. Furthermore, a superior intelligence shouldn’t need us to conform to its persnickety requirements; it should be able to adapt far better than it actually does.
replies(1): >>uLogMi+kx
3. uLogMi+kx[view] [source] [discussion] 2024-05-15 17:00:21
>>binary+if
GPT does not need super-alignment. That term refers to aligning artificial general intelligence and superintelligence, not current models.