
cperki (OP) 2025-04-09 16:49:25
I think we are careless in how we use terms. We often say "intelligence" when we mean "sentience". We have studied intelligence for a long time, and we have IQ tests that can measure it. The various LLMs (like ChatGPT and Gemini) score pretty well on IQ tests. Given that, I think we can conclude that they are intelligent, at least by the measure we have.

But while we have measurements for "intelligence", we don't for "sentience", "agency", "consciousness", or these other things. And I'd argue that there is a lot of intelligent life on Earth (take crows as an example) that is sentient to a degree that the LLMs are not. My guess is this is because of their "agency" - their drive for survival. The LLMs we have now are clearly smarter than crows and cats, but not sentient in the way those animals are. So I think it's safe to say that "sentience" (whatever that is) is not an emergent property of neural net or training-data size. If it were, it would be evident already.

So Gemini/ChatGPT seem to be "intelligence", but in tool form. Very unexpected - something I would not have believed possible 5 or 10 years ago, but there it is.

As to whether we could create a "sentient" AI, an AGI, I don't see any reason we shouldn't be able to. But it's clear to me that something else is needed besides intelligence. Maybe it's agency, maybe it's something else (the experience of time's passage?). We probably need ways of measuring and evaluating these other things before we can progress further.
