1. goneho+ (OP) 2023-07-06 12:37:50
Low-probability events do happen sometimes, though, and a heuristic that says they never happen can let you down, especially when the outcome is very bad.

Most of the time a new virus is not a pandemic, but sometimes it is.

Nothing in our (human) history has caused an extinction-level event for us, but such events do happen and have happened on Earth a handful of times.

The arguments about superintelligent AGI and alignment risk are not that complex: if we can make an AGI, the rest follows, and an extinction-level event from an unaligned superintelligent AGI looks like the most likely default outcome.

I’d love to read a persuasive argument for why that’s not the case, but frankly the dismissals I’ve seen have been really bad and don’t hold up to 30 seconds of scrutiny.

People are also very bad at predicting when something like this will arrive. Shortly before the first nuclear chain reaction, some of those closest to the problem thought it was decades away, and the same was true of heavier-than-air flight.

What we’re seeing right now doesn’t look like failure to me; it looks like what you might expect to see right before AGI is developed. That isn’t good when alignment is unsolved.
