zlacker

[parent] [thread] 0 comments
1. staunt+(OP)[view] [source] 2024-01-08 23:39:56
I think human extinction due to climate change is extremely unlikely. However, civilization collapsing is bad enough. We can't be certain it will actually happen, but we do know it will if we do nothing. We even have a pretty good idea of when; it's not too late yet, and we have an actionable scientific consensus about what to do about it.

In many ways AI risk looks like the opposite. It might actually cause extinction, but we have no idea how likely that is, nor how likely any bad not-quite-extinction outcome is. The outcome might even be very positive. We have no idea when anything will happen, and the only realistic plan that's sure to avoid the bad outcome is to stop building AI, which also means we don't get the potential good outcome. And there's no scientific consensus about that (or anything else) being a good plan, because it's almost impossible to gather concrete empirical evidence about the risk. By the time such evidence is available, it might be too late (this could also have happened with climate change; we got lucky there...)
