Sam Altman:"Development of superhuman machine intelligence is probably the greatest threat to the continued existence of humanity."
From the essay "Why You Should Fear Machine Intelligence": https://blog.samaltman.com/machine-intelligence-part-1
So, more than nukes then...
Elon Musk: "There’s a strong probability that it [AGI] will make life much better and that we’ll have an age of abundance. And there’s some chance that it goes wrong and destroys humanity."