We currently make around 1 TW of photovoltaic cells per year, globally. The proposal here is to launch that much to space every 9 hours, complete with attached computers, continuously, from the moon.
edit: Also, this would capture only a trivial fraction of the Sun's output: a few trillionths per year.
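The scale claims above can be sanity-checked with rough arithmetic. All figures here are assumptions for illustration: ~1 TW of PV cells manufactured globally per year, the Sun's total luminosity of ~3.8e26 W, and the proposed cadence of launching one year's worth of production every 9 hours:

```python
# Back-of-envelope check of the scale claims above.
# Assumed figures, not authoritative: ~1 TW of PV manufactured per year,
# Sun's luminosity ~3.8e26 W, one year's production launched every 9 hours.

HOURS_PER_YEAR = 8766            # average calendar year, including leap years
pv_per_year_w = 1e12             # ~1 TW of cells manufactured annually today
launch_interval_h = 9

# Power launched per year at that cadence
launches_per_year = HOURS_PER_YEAR / launch_interval_h
power_launched_w = launches_per_year * pv_per_year_w
print(f"launches per year: {launches_per_year:.0f}")          # ~974
print(f"power launched per year: {power_launched_w:.2e} W")   # ~1e15 W

# Fraction of the Sun's total output that represents
sun_luminosity_w = 3.8e26
fraction = power_launched_w / sun_luminosity_w
print(f"fraction of solar output: {fraction:.1e}")            # ~2.6e-12
```

So the cadence works out to roughly a thousand launches a year, and the captured power is indeed a few trillionths of the Sun's output, consistent with the comment above.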
Think about it. Elon conjures up a vision of the future where we've managed to increase our solar cell manufacturing capacity by two whole orders of magnitude and have the space launch capability for all of it along with tons and tons of other stuff and the best he comes up with is...GPUs in orbit?
This is essentially the superhero gadget technology problem, where comic books and movies gloss over the civilization-changing implications of some technology the hero invents to punch bad guys harder. Don't get me wrong, the idea of orbiting data centers is kind of cool if we can pull it off. But being able to pull it off implies an ability to do a lot more interesting things. The problem is that this is both wildly overambitious and somehow incredibly myopic at the same time.
The tale of computers is even more absurd. The first programmable, electric, general-purpose digital computer was ENIAC. [1] It was built to... calculate artillery firing tables. I expect that in the future the idea of putting a bunch of solar into space to run GPUs for LLMs will seem, at minimum, quaint, but that doesn't mean the story ends there.
However, I'm curious how many solar panels you would need to power a typical data center. Are we talking something like a large satellite, or rather a huge satellite with ISS-size solar arrays bolted on? Getting rid of the copious amounts of heat that data centers generate might also be a challenge (https://en.wikipedia.org/wiki/Spacecraft_thermal_control)...
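To put rough numbers on that question (all assumptions mine: a ~100 MW data center, ~1361 W/m² solar flux above the atmosphere, ~20% panel efficiency, ~2,500 m² as a ballpark for the ISS array wings, and an idealized flat radiator at 300 K sized via the Stefan-Boltzmann law):

```python
# Rough sizing sketch for a space data center's solar arrays and radiators.
# Every figure below is an assumption for illustration, not a design.

SOLAR_CONSTANT = 1361.0      # W/m^2, solar flux above the atmosphere
CELL_EFFICIENCY = 0.20       # assumed panel efficiency
ISS_ARRAY_AREA = 2500.0      # m^2, rough total for the ISS solar wings
STEFAN_BOLTZMANN = 5.67e-8   # W/(m^2 K^4)

datacenter_power_w = 100e6   # assume a ~100 MW facility

# Solar array area needed to supply that power
array_area = datacenter_power_w / (SOLAR_CONSTANT * CELL_EFFICIENCY)
print(f"array area: {array_area:,.0f} m^2 "
      f"(~{array_area / ISS_ARRAY_AREA:.0f} ISS-size arrays)")

# Radiator area to reject the same power as heat:
# idealized one-sided radiator at 300 K, ignoring absorbed sunlight
radiator_temp_k = 300.0
emissivity = 0.9
radiator_area = datacenter_power_w / (
    emissivity * STEFAN_BOLTZMANN * radiator_temp_k**4)
print(f"radiator area: {radiator_area:,.0f} m^2")
```

Under these assumptions you end up in the hundreds of thousands of square meters for both arrays and radiators, i.e. on the order of a hundred ISS-size arrays plus a comparably huge radiator farm, so "a huge satellite" is the right mental picture, and the thermal side is roughly as large a problem as the power side.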
For inference it can work well. One satellite could contain a handful of CPUs and do batch inference of even very large models, perhaps at low speeds in the beginning. Currently most AI workloads are interactive, but I can't see that staying true for long: as models improve and can be trusted to work independently for longer, it makes more sense to just queue work up and not worry about exactly how high your TTFT is.
For training I don't see it today; in the future, maybe. But then, most AI workloads in the future should be inference, not training, anyway.