The problems it raises - alignment, geopolitics, lack of societal safeguards - are all real, and happening now (just replace “AGI” with “corporations”, and voila, you have a story about the climate crisis and regulatory capture). We should be solving these problems before AGI or job-replacing AI becomes commonplace, lest we run the very real risk of societal collapse or species extinction.
The point of these stories is to incite alarm: they aim to provoke proactive responses while time is still on our side, rather than leaving the response to self-interested individuals in the middle of a crisis.
No, there is no risk of species extinction in the near future due to climate change, and repeating that line will only deepen the divide and make people dismiss others' warnings, including those of actual climate scientists.
There is a non-zero chance that the ineffable quantum foam will cause a mature hippopotamus to materialize above your bed tonight, and you’ll be crushed. It is incredibly, amazingly, limits-of-math unlikely. Still a non-zero risk.
Better to think of “no risk” as meaning “negligible risk”. But I’m with you that climate change is not a negligible risk; maybe way up in the 20% range, IMO. And I wouldn’t be sleeping in my bed tonight if sudden hippos over beds were a 20% risk.