zlacker

[parent] [thread] 4 comments
1. risho+(OP)[view] [source] 2023-11-22 19:15:26
you just don't understand how markets work. if openai slows down, they will just be driven out by competition. that's fine if that's what you think they should do, but it won't make ai any safer; it will just kill openai and get them replaced by someone else.
replies(2): >>zeroha+41 >>Wander+o1
2. zeroha+41[view] [source] 2023-11-22 19:20:02
>>risho+(OP)
you're right about market forces, however:

1) openAI was explicitly founded to NOT develop AI based on "market forces"; it's just that they "pivoted" (aka abandoned their mission) once they struck gold in order to become driven by the market

2) this is exactly the reasoning behind nuclear arms races

3. Wander+o1[view] [source] 2023-11-22 19:21:48
>>risho+(OP)
You can still be a force for decentralization by creating actually open ai. For now it seems like Meta AI research is the real open ai.
replies(1): >>insani+F3
4. insani+F3[view] [source] [discussion] 2023-11-22 19:32:51
>>Wander+o1
What does "actually open" mean? And how is that more responsible? If the ethical concern of AI is that it's too powerful or whatever, isn't building it in the open worse?
replies(1): >>Wander+G4
5. Wander+G4[view] [source] [discussion] 2023-11-22 19:38:24
>>insani+F3
Depends on how you interpret the mission statement of building ai for all of humanity. It’s questionable whether humanity is better off if ai only accrues to one or a few centralised entities.