zlacker

[return to "Ilya Sutskever to leave OpenAI"]
1. ascorb+6C 2024-05-15 05:45:41
>>wavela+(OP)
Jan Leike has said he's leaving too https://twitter.com/janleike/status/1790603862132596961
2. DalasN+BC 2024-05-15 05:51:45
>>ascorb+6C
There goes the so-called superalignment team:

Ilya Sutskever
Jan Leike
William Saunders
Leopold Aschenbrenner

All gone.

3. reduce+sF 2024-05-15 06:23:14
>>DalasN+BC
Daniel Kokotajlo too, who says he “quit OpenAI due to losing confidence that it would behave responsibly around the time of AGI” and thinks “AGI will probably be here by 2029, and could indeed arrive this year.”

We are so fucked

4. Otomot+hG 2024-05-15 06:32:48
>>reduce+sF
I'm sorry, but there must be some hidden tech, or some completely different sense of what people mean by AGI.

I really, really doubt that transformers will become AGI. Maybe I'm wrong, since I'm no expert in this field, but I would love to understand the reasoning behind this "could arrive this year", because it reminds me of cold fusion :X

edit: maybe the term has changed again. AGI, to me, means truly understanding, maybe even some kind of consciousness, not just probability... when I explain something, I have understood it. It's not that I have soaked up so many books that I can just use a probabilistic function to "guess" which word should come next.
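
To make that "probabilistic function" picture concrete, here is a toy sketch; the prefix, words, and probabilities are entirely made up and nothing here is a real model. It only shows what "guess which word comes next" means mechanically, not whether doing it extremely well amounts to understanding.

    # Toy illustration only: "guessing the next word" = sampling from a
    # probability distribution over candidate continuations.
    # The prefix, words, and probabilities below are all made up.
    import random

    # hypothetical P(next word | "the cat sat on the")
    next_word_probs = {
        "mat": 0.62,
        "floor": 0.21,
        "roof": 0.12,
        "moon": 0.05,
    }

    def predict_next_word(probs):
        """Sample one candidate word in proportion to its probability."""
        words = list(probs)
        weights = list(probs.values())
        return random.choices(words, weights=weights, k=1)[0]

    print("the cat sat on the", predict_next_word(next_word_probs))

A real LLM does the same kind of thing at scale: a learned distribution over tens of thousands of tokens, conditioned on the whole preceding context.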

5. trucul+RY 2024-05-15 09:53:21
>>Otomot+hG
> truly understanding… when I explain something, I have understood it

When you have that feeling of understanding, it is important to recognize that it is a feeling.

We hope it’s correlated with some actual ability to reason, but at the end of the day you can reason about things without realising it, and you can feel that you understand something and be wrong.

It’s not clear to me why this feeling would be necessary for superhuman-level general performance. Nor is it clear to me that a feeling of understanding isn’t what being an excellent token predictor feels like from the inside.

If it walks and talks like an AGI, at some point, don’t we have to concede it may be an AGI?
