zlacker

[parent] [thread] 1 comment
1. tsimio+(OP)[view] [source] 2023-11-20 08:05:46
There are at least three competing perspectives.

One is Sutskever, who believes AI is very dangerous and must be slowed down and kept closed source (edit: clarified so that it doesn't sound like "closed down"). He believes this is in line with OpenAI's original charter.

Another is the HN open source crowd, who believe AI should be developed quickly and be open to everyone. They believe this is in line with OpenAI's original charter.

Then there is Altman, who agrees that AI should be developed rapidly, but wants it to stay closed so he can directly profit by selling it. He probably believes this is in line with OpenAI's original charter too, or at least the most realistic way to achieve it, effective altruism "earn to give" style.

Karpathy may be more amenable to the second perspective, which he may think Altman is closest to achieving.

replies(1): >>dmix+Zh
2. dmix+Zh[view] [source] 2023-11-20 09:33:57
>>tsimio+(OP)
Now regardless, the new CEO Shear is also very much in the camp that the current development of AI is dangerous (not just hypothetically in the future as AGI becomes more plausible), comparing it to a nuclear weapon, and wants to slow it down. This will definitely split researchers into camps and leave plenty eyeing the door.

https://x.com/amir/status/1726503822925930759?s=46&t=
