zlacker

2 comments
1. notabe+(OP) 2023-11-19 20:58:29
Well, now we know that once any putative superintelligence learns how to wave money around, we're doomed. All the preparatory "alignment" isn't going to save us from our own greed.
replies(1): >>pjc50+Y
2. pjc50+Y 2023-11-19 21:02:52
>>notabe+(OP)
I mean, obviously? Any distant, hypothetical, unlikely AI apocalypse is going to need a lot of willing human accomplices.
replies(1): >>ben_w+Hd2
3. ben_w+Hd2 2023-11-20 11:23:27
>>pjc50+Y
An apocalypse needs power. That power can come from offering money to humans, but it doesn't have to: other (thankfully not yet existent) possibilities include hacking a single clanking replicator, or offering rewards to the subjects of an unrelated non-human uplift project.