zlacker

2 comments
1. Number+(OP) 2023-05-16 19:01:06
I honestly don't question Altman's motivations that much. I think he's blinded a bit by optimism. I also think he's genuinely worried about existential risks, which is a big reason why he's asking for regulation. He's specifically said on his podcast appearance with Lex Fridman that he thinks it's safer to invent AGI now, when we have less computing power, than to wait until we have more and the risk of a fast takeoff is greater. That's why he's working so hard on AI.
replies(1): >>collab+9d
2. collab+9d 2023-05-16 19:59:23
>>Number+(OP)
He's just cynical and greedy. The guy has a bunker with an airstrip and is eagerly waiting for the collapse he knows will come if the likes of him get their way.

They claim to serve the world, but secretly want the world to serve them. Scummy 101

replies(1): >>Number+wl
3. Number+wl 2023-05-16 20:43:10
>>collab+9d
Having a bunker is also consistent with expecting that there's a good chance of apocalypse but working to stop it.