1. frabcu+(OP) 2023-11-19 23:12:35
Worse. Which is exactly why superintelligence is scary - it'll make the humans around it go wild for power, and then it will be impossible (by definition) to predict.
replies(1): >>TeMPOr+P2
2. TeMPOr+P2 2023-11-19 23:30:48
>>frabcu+(OP)
Huh. I imagined many scenarios, including the more obvious and dangerous one, "AI manipulating people unaware of its existence" - but I never considered a scenario in which the AI makes its existence widely known, perhaps presents itself as more dangerous than it is, and then just starts slightly nudging all the people racing to take control of it.
replies(1): >>hotnfr+ot
3. hotnfr+ot 2023-11-20 02:27:03
>>TeMPOr+P2
Both are too complicated. All a true AI has to do to get control of everything is promise 10% annual returns and guaranteed victory in battle. Limited-time offer, sign up today.

Done.

Any actual AI takeover will be boring and largely voluntary. For certain definitions of voluntary.

replies(1): >>jacque+0G
4. jacque+0G 2023-11-20 04:20:12
>>hotnfr+ot
That's the playbook of any dictator. Hitch your horse to my wagon and we'll go places. But stray from the wagon and I'll have you shot by someone who is loyal to me. And it works. Without their henchmen, little creeps wouldn't get out of the gate, because they are invariably complete cowards.