But yeah, Skynet is who we need to watch out for, not biased AIs making decisions today.
It's like saying: "Who cares about the hypothetical effects climate change allegedly has in the far future? Let's focus on the effects that the local highway has on our frog population today."
The existential risk of an AI running amok and enslaving or exterminating all of humanity rests on a series of low-probability events that we can’t even put error bars on, let alone understand what would be needed to achieve such a feat.
HOWEVER, the existential risk posed by the sun running out of hydrogen and destroying all life on planet Earth has a known probability (1.0), with known error bars around the best estimate of when that will happen. Furthermore, the problems involved with moving masses through space are quite well understood. As such, this is merely an engineering problem, rather than a problem of inadequate theories and philosophies. Therefore, it is undeniably logical that we must immediately redirect all industrial and scientific output to building a giant engine to move the Earth to another star with a longer lifespan, and any argument otherwise is an intentional ploy to make humanity go extinct. Any sacrifices made by the current and near-future generations are a worthy price to pay for the untold sextillions of humans and all their evolutionary descendants who would otherwise be condemned to certain death by starvation if we did not immediately start building the Earth Engine.
This is the only logical, mathematically provable, and morally correct answer.