The problem is essentially that everything you do releases waste heat, so you either reject it, or everything keeps heating up until something breaks. Extracting useful work from that heat only helps to the extent that it also helps you reject it, and in practice it's more efficient to just reject the heat directly.
A better, more direct way to think about this might be to look at the Seebeck effect. If you have a giant radiator, you could put a Peltier module between it and your GPU cooling loop and generate a little electricity, but the module can only convert a small fraction of the heat flowing through it (and it adds thermal resistance to the loop), so you're better off cooling the GPU directly.
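To get a rough sense of "a little electricity", here's a back-of-envelope sketch. The coolant and ambient temperatures, the GPU wattage, and the "20% of Carnot" device efficiency are all illustrative assumptions, not measurements, but they're in the right ballpark for a water loop and an off-the-shelf thermoelectric module:

```python
# Back-of-envelope: electricity recoverable by a thermoelectric generator (TEG)
# sitting between a GPU cooling loop and a radiator. All numbers are
# illustrative assumptions, not measurements.

T_hot = 273.15 + 70.0   # assumed coolant temperature at the TEG hot side (K)
T_cold = 273.15 + 25.0  # assumed radiator/ambient temperature at the cold side (K)
gpu_power = 400.0       # assumed GPU heat output (W)

# Carnot limit: the best any heat engine can do between these two temperatures.
eta_carnot = 1.0 - T_cold / T_hot

# Real thermoelectric modules reach only a modest fraction of Carnot
# (assumed ~20% of the limit here, roughly what ZT ~ 1 materials manage).
eta_teg = 0.2 * eta_carnot

electric_out = gpu_power * eta_teg
heat_rejected = gpu_power - electric_out  # still has to leave via the radiator

print(f"Carnot limit:        {eta_carnot:.1%}")   # ~13%
print(f"Assumed TEG output:  {electric_out:.0f} W of electricity")
print(f"Heat still rejected: {heat_rejected:.0f} W")
```

Even under these fairly optimistic assumptions you recover on the order of 10 W from a 400 W GPU, well over 95% of the heat still has to go out through the radiator, and the module sits in the heat path making that harder.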
I think I get it. If we could convert 100% of the waste heat into useful power, then we'd be all good. And that would get interesting because it would effectively become "free" compute - you'd put enough power into the system to start it, and then it could continue running on its own waste heat. A perpetual motion machine, but for computing.
But we can't do that, because of the second law of thermodynamics. Any process that extracts useful energy from waste heat is Carnot-limited, so it always leaves behind some heat that the same process cannot capture. That residual heat can't be converted to useful energy and has to be rejected, or it accumulates and everything melts.
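You can also see why partial recycling never turns into the "free compute" loop above. Here's a small sketch (the efficiency and wattage are assumed for illustration): feeding the recovered electricity back into the GPU just converges to a slightly higher total, and at steady state you still reject exactly as much heat as you draw from the wall.

```python
# What happens if you feed the recovered electricity back into the GPU?
# A geometric-series sketch, assuming a fixed overall efficiency eta for
# turning waste heat back into electricity (illustrative value).

eta = 0.026          # assumed heat-to-electricity efficiency (~2.6%, from the sketch above)
p_external = 400.0   # assumed power drawn from the wall (W)

# Steady state: total dissipation P satisfies P = p_external + eta * P,
# i.e. P = p_external / (1 - eta). The loop only runs away ("free" compute
# forever) if eta == 1, which the second law forbids.
p_total = p_external / (1.0 - eta)
heat_rejected = (1.0 - eta) * p_total  # works out to exactly p_external

print(f"Total compute power: {p_total:.1f} W  (a ~{p_total / p_external - 1:.1%} boost)")
print(f"Heat to reject:      {heat_rejected:.1f} W (everything you put in)")
```

With eta well below 1, recycling buys you a few percent more compute at best, and the radiator still has to dump every watt you feed in.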