It's unfortunate that the AGI debate still hasn't made its way very far into these parts. We still have people going, "well, this would be bad too." Yes! That is the existential problem a lot of people are grappling with. There is currently, and likely will be, no good way out of this. Too much "Don't Look Up" going on.
What might be scary is using AI for a mass influence operation: propaganda to convince people that, for example, using a weapon is necessary.
Sam Altman: "Development of superhuman machine intelligence is probably the greatest threat to the continued existence of humanity."
From the essay "Why You Should Fear Machine Intelligence": https://blog.samaltman.com/machine-intelligence-part-1
So, more than nukes, then...
Elon Musk: "There’s a strong probability that it [AGI] will make life much better and that we’ll have an age of abundance. And there’s some chance that it goes wrong and destroys humanity."