zlacker

1. NemoNo+ (OP) 2024-10-21 15:55:07
It has everything to do with it. The commenter you are replying to has correctly identified why LLMs are lacking: they have not actually learned, they have no actual experience, and they have no frame of reference to reality and how it actually functions. What we expect of them is not possible given how reality works.

I'm not saying we cannot create a self-conscious entity. I'm saying that none of the things we've made so far can become self-aware or conscious as we are, because we haven't built them the right way. Nobody worried about AGI has anything to worry about right now. At best, the models we have today may someday trick us into perceiving them as aware, but fundamentally they cannot attain that state from what they are now, so it will be bullshit if one "wakes up" soon.
