zlacker

[parent] [thread] 4 comments
1. bambax+(OP)[view] [source] 2024-05-15 15:09:05
Oh well... It seems at least one of two things has to be true: either AGI is so far away that "alignment" (whatever that means) is unnecessary, or, as you suggest, Altman et al. have decided it's a hindrance to commercial success.

I tend to believe the former, but it's possible those two things are true at the same time.

replies(2): >>nickle+r1 >>Liquix+Q4
2. nickle+r1[view] [source] 2024-05-15 15:16:05
>>bambax+(OP)
Specifically I am supposing the superalignment people were generally more concerned about AI safety and ethics than Altman/etc. I don't think this has anything to do with superalignment itself.
3. Liquix+Q4[view] [source] 2024-05-15 15:31:48
>>bambax+(OP)
or C) the first AGI was/is being/will be carried away by men with earpieces to a heavily fortified underground compound. no government - least of all the US government - is going to twiddle its thumbs while tech that will change human history is released to the unwitting public. at best they'll want to prepare for and control the narrative surrounding the event; at worst AGI will be weaponized against humans before the majority are even aware it exists.

if OAI is motivated by money, uncle sam can name any figure to buy them out. if OAI is motivated by power, it becomes "a matter of national security" and they do what the gov tells them. more likely the two parties' interests are aligned and the public will hear about it when It's Time™. not saying C) is what's happening - A) seems likely too - but it's a real possibility.

replies(1): >>wins32+G9
4. wins32+G9[view] [source] [discussion] 2024-05-15 15:53:21
>>Liquix+Q4
Why do you think that the US government has the state capacity to do anything like that these days?
replies(1): >>Gud+Eh
5. Gud+Eh[view] [source] [discussion] 2024-05-15 16:28:54
>>wins32+G9
By observing reality.