Good lord, we are screwed. And yet somehow I bet even this isn't going to kill off the "they're just statistical interpolators" meme.
[1] https://www.deepmind.com/blog/tackling-multiple-tasks-with-a...
These definitions are all fundamentally anthropocentric: people argue until they are blue in the face about what “intelligent” means, but it’s always implicit that what they really mean is “how much like me is this other thing?”
Language models, even more so than the vision models that got them funded, have empirically demonstrated that knowing the probability of two things being adjacent in some latent space is, at the boundary, indistinguishable from creating and understanding language.
I think the burden is on the bright hominids, equipped with both a reflexive language model and a sex drive, to explain their pre-Copernican, unique place in the theory of computation, rather than vice versa.
A lot of these problems just aren’t problems anymore if performance on tasks supersedes “consciousness” as the thing we’re studying.