zlacker

1. quonn+(OP) 2025-07-07 09:40:18
> LLMs are not conscious because unlike human brains they don't learn or adapt (yet).

That's neither a necessary nor sufficient condition.

Learning may not be needed in order to be conscious, but a perception of the passing of time may be, and that in turn may require some short-term memory. People with severe dementia often can't even remember the start of a sentence they are reading, and they can't learn, but they are certainly conscious because they retain just enough short-term memory.

And learning is not sufficient either. Consciousness is about being a subject, about having a subjective experience of "being there", and learning by itself does not create that experience. There is plenty of software that can do some form of real-time learning, yet it has no subjective experience.

replies(1): >>cootsn+TJ
2. cootsn+TJ 2025-07-07 15:20:56
>>quonn+(OP)
You should note that the question "what is consciousness?" is still very much an unsettled debate.
replies(1): >>quonn+7l1
3. quonn+7l1 2025-07-07 19:04:42
>>cootsn+TJ
But nobody would dispute my basic definition: consciousness is the subjective feeling or perception of being in the world.

There are unsettled questions, but that definition holds regardless.
