zlacker

1. throwa+(OP)[view] [source] 2023-11-19 15:36:09
That's not really a great encapsulation of the AI safety concerns that those who think AGI poses a threat to humanity are referring to.

The bigger concern is something like the Paperclip Maximizer. Alignment is about how to ensure that a superintelligence has the right goals.
