Ilya
Jan Leike
William Saunders
Leopold Aschenbrenner
All gone
“I think AGI will probably be here by 2029, and could indeed arrive this year”
Kokotajlo too.
We are so fucked
I really, really doubt that transformers will become AGI. Maybe I'm wrong, I'm no expert in this field, but I would love to understand the reasoning behind this "could arrive this year", because it reminds me of cold fusion :X
edit: maybe the term has changed again. AGI to me means truly understanding, maybe even some kind of consciousness, not just probability... when I explain something, I have understood it. It's not that I have soaked up so many books that I can just use a probabilistic function to "guess" which word should come next.
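For what it's worth, the "probabilistic function to guess the next word" being described can be sketched with a toy bigram model. This is a deliberate oversimplification (real LLMs use transformers over subword tokens, not word-pair counts), but the core idea of predicting the next word from observed probabilities is the same:

```python
# Toy bigram "next word guesser": count which word follows which,
# then predict the most frequent continuation. A vast simplification
# of what LLMs do, but it illustrates the probabilistic mechanism.
from collections import Counter, defaultdict

corpus = "the cat sat on the mat the cat ate the fish".split()

# Count how often each word follows each preceding word.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def predict_next(word):
    """Return the most frequent next word after `word`, or None."""
    counts = follows[word]
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))  # "cat" follows "the" twice; "mat"/"fish" once
```

Whether scaling this kind of prediction up (with far richer architectures) amounts to "understanding" is exactly the disagreement in this thread.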
When you have that feeling of understanding, it is important to recognize that it is a feeling.
We hope it’s correlated with some kind of ability to reason, but at the end of the day, you can have the ability to reason about things without realising it, and you can feel that you understand something and be wrong.
It’s not clear to me why this feeling would be necessary for superhuman-level general performance. Nor is it clear to me that a feeling of understanding isn’t what being an excellent token predictor feels like from the inside.
If it walks and talks like an AGI, at some point, don’t we have to concede it may be an AGI?