But while we have measurements for "intelligence", we don't for "sentience", "agency", "consciousness", or these other things. And I'd argue that there are lots of intelligent animals on earth (take crows as an example) that are sentient to a degree that the LLMs are not. My guess is this is because of their "agency" - their drive for survival. The LLMs we have now are clearly smarter than crows and cats, but not sentient in the way those animals are. So I think it's safe to say that "sentience" (whatever that is) is not an emergent property of neural net/training data size. If it were, it'd be evident already.
So Gemini/ChatGPT seem to be "intelligence", but in tool form. Very unexpected. Something I would not have believed possible 5 or 10 years ago, but there it is.
As to whether we could create a "sentient" AI, an AGI, I don't see any reason we shouldn't be able to. But it's clear to me that something else is needed, besides intelligence. Maybe it's agency, maybe it's something else (the experience of time's passage?). We probably need ways of measuring and evaluating these other things before we can progress further.