Idk. Folks much smarter than I am seem worried, so maybe I should be too, but it just seems like such a long shot.
The vast majority of datacenters currently in production are, or soon will be, entirely powered by carbon-free energy. From best to worst:
1. Meta: 100% renewable [1]
2. AWS: 90% renewable [2]
3. Google: 64% renewable, with 100% matched by renewable energy credits [3]
4. Azure: 100% carbon neutral [4]
[1]: https://sustainability.fb.com/energy/
[2]: https://sustainability.aboutamazon.com/products-services/the...
[3]: https://sustainability.google/progress/energy/
[4]: https://azure.microsoft.com/en-us/explore/global-infrastruct...
If imaginary cloud provider "ZFQ" uses 10MW of electricity on a grid and pays for it to magically come from green generation, that means 10MW of other loads on the grid are no longer powered by green energy, or, put another way, 10MW of non-green generation that could otherwise have been throttled down or shut off keeps running.
There is no free lunch here; "we buy our electricity from green sources" is greenwashing bullshit.
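Here's a toy sketch of that accounting in Python. The numbers (50MW of green generation, 90MW of other load) are made up purely for illustration; the point is that shuffling who gets to claim the green megawatts doesn't change how much fossil generation the grid physically needs.

```python
# Toy grid-accounting sketch (illustrative only; all numbers are hypothetical).
# Buying "green" power credits on a shared grid reassigns labels; it does not
# change how much fossil generation is required to meet total demand.

GREEN_CAPACITY_MW = 50      # hypothetical green generation on the grid
OTHER_LOAD_MW = 90          # everyone else's demand
ZFQ_LOAD_MW = 10            # the imaginary cloud provider's new datacenter

def fossil_needed(total_load_mw: float, green_mw: float) -> float:
    """Fossil generation required to cover whatever green generation can't."""
    return max(0.0, total_load_mw - green_mw)

total_load = OTHER_LOAD_MW + ZFQ_LOAD_MW

# Scenario A: ZFQ buys no green power or credits.
fossil_a = fossil_needed(total_load, GREEN_CAPACITY_MW)

# Scenario B: ZFQ "buys" 10MW of green power via credits.
# Physically the same generators run; only the attribution changes.
zfq_green_claimed = ZFQ_LOAD_MW
green_left_for_others = GREEN_CAPACITY_MW - zfq_green_claimed
fossil_b = fossil_needed(total_load, zfq_green_claimed + green_left_for_others)

print(f"Fossil generation without credits: {fossil_a} MW")              # 50 MW
print(f"Fossil generation with credits:    {fossil_b} MW")              # still 50 MW
print(f"Green power left for other loads:  {green_left_for_others} MW") # 40 MW
```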
Even if they install solar on the roofs and wind turbines nearby, that's still electrical generation capacity that could have served existing loads. And by buying solar panels in such quantities, they affect the availability and pricing of those components for everyone else.
The US, for example, has about 5GW of solar manufacturing capacity per year. NVIDIA sold half a million H100 chips in one quarter, each of which uses ~350W, which means a year of sales is enough chips to draw about 700MW of continuous power. That figure does not include power conversion losses, distribution, cooling, or the power draw of the host systems, storage, networking, etc.
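A rough back-of-envelope version of that arithmetic, using the figures above. The overhead multiplier and the solar capacity factor are my own illustrative assumptions, not numbers from this thread:

```python
# Back-of-envelope arithmetic for the figures above. The overhead multiplier
# and solar capacity factor are assumptions for illustration, not measured data.

H100_PER_QUARTER = 500_000   # chips sold in one quarter
WATTS_PER_H100 = 350         # approximate per-chip draw cited above
QUARTERS_PER_YEAR = 4

chips_per_year = H100_PER_QUARTER * QUARTERS_PER_YEAR
gpu_draw_mw = chips_per_year * WATTS_PER_H100 / 1e6
print(f"GPU-only draw from a year of sales: {gpu_draw_mw:.0f} MW")  # ~700 MW

# Assumed overhead for hosts, networking, cooling, and conversion losses
# (a PUE-style multiplier; purely an assumption).
OVERHEAD_MULTIPLIER = 1.5
facility_draw_mw = gpu_draw_mw * OVERHEAD_MULTIPLIER
print(f"With assumed facility overhead:     {facility_draw_mw:.0f} MW")

# Compare against ~5 GW/year of US solar manufacturing capacity,
# assuming a ~25% solar capacity factor (again, an assumption).
US_SOLAR_MFG_GW_PER_YEAR = 5.0
SOLAR_CAPACITY_FACTOR = 0.25
nameplate_needed_gw = facility_draw_mw / 1000 / SOLAR_CAPACITY_FACTOR
share_of_mfg = nameplate_needed_gw / US_SOLAR_MFG_GW_PER_YEAR
print(f"Nameplate solar needed: {nameplate_needed_gw:.1f} GW "
      f"(~{share_of_mfg:.0%} of a year's US solar manufacturing)")
```

Under those assumptions, a single year of H100 sales would soak up a meaningful chunk of an entire year of US solar panel production just to run carbon-free.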
And that doesn't even get into the water usage and carbon impact of manufacturing those chips; the IC industry uses a massive amount of water and generates a substantial amount of toxic waste.
It's hilarious how HN will wring its hands over the rare earth metals in a Prius and the emissions from shipping it to the US from Japan, but ask about the environmental impacts of AI and it's all "pshhtt, whatever".