Technology changes things, though. Things aren't "the same as it ever was." The Napoleonic Wars killed around 6.5 million people with muskets and cannons. The total warfare of WWII killed 70 to 85 million people with tanks, piston-engined bombers, aircraft carriers, and atomic bombs totaling about 36 kilotons of TNT, among other weaponry.
Total war today includes modern thermonuclear weapons. In 60 seconds, just one Ohio-class submarine can launch 80 independently targetable warheads, totaling over 36 megatons of TNT. That is over 20 times the yield of all explosives used by all sides in all of WWII, including both atomic bombs.
AGI is a leap in power comparable to what thermonuclear weapons were for warfare. Humans have been trying to destroy each other for all of history, but we can only have one nuclear war, and it is likely we can only have one AGI revolt.
If you're truly afraid of this, what are you doing here on HN? Go organize and try to do something about it.
It is the same with Gen AI. We will either find a way to control an entity that rapidly becomes orders of magnitude more intelligent than us, or we won’t. We will either find a way to prevent the rich and powerful from controlling a Gen AI that can build and operate anything they need, including an army to protect them from everyone who lacks a powerful Gen AI of their own, or we won’t.
I hope for a future of abundance for all, brought to us by technology. But I understand that some existential threats only need to go wrong once, and then there will be no second chance, ever.
>It is the same with Gen AI. We will either find a way to control an entity that rapidly becomes orders of magnitude more intelligent than us, or we won’t. We will either find a way to prevent the rich and powerful from controlling a Gen AI that can build and operate anything they need, including an army to protect them from everyone who lacks a powerful Gen AI of their own, or we won’t
Okay, you've laid out two paths here. What are *you* doing to influence which one we take? That's my point. Enumerating all the possible ways humanity could face extinction is nothing more than doomerism if you aren't taking any meaningful steps to lessen the likelihood that any of them occur.