https://www.anthropic.com/engineering/advanced-tool-use
They discussed how running generated code is better for context management in many cases: the AI can generate code to retrieve, process, and filter the data it needs rather than doing it in-context, which cuts down how much context it consumes. And if that code runs right next to the server where the data lives, it's that much faster too.
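To make that concrete, here's a minimal sketch in Java of what such generated code might look like (the access.log file, its tab-separated format, and the 5xx filter are all made up for illustration): the code filters and aggregates the data where it lives, and only the short summary line would ever go back into the model's context.

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.util.Map;
    import java.util.stream.Collectors;

    public class FilterBeforeContext {
        public static void main(String[] args) throws IOException {
            // Hypothetical input: one access-log line per row, "status<TAB>path".
            try (var lines = Files.lines(Path.of("access.log"))) {
                Map<String, Long> errorsByPath = lines
                        .map(line -> line.split("\t", 2))
                        .filter(parts -> parts.length == 2 && parts[0].startsWith("5")) // 5xx only
                        .collect(Collectors.groupingBy(parts -> parts[1], Collectors.counting()));

                // Only this short summary goes back into the model's context,
                // not the full log file.
                System.out.println("5xx errors by path: " + errorsByPath);
            }
        }
    }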
I see Bun as a kind of Skynet: if it can run anywhere, the AI can run anywhere.
But the majority of projects have been on a JDK newer than 8 for quite a few years now.
Java is the running joke of verbosity, and you are too if you seriously argue that it's not.
Evidence is easy - think of a problem and ask an LLM to generate idiomatic examples in both Go and Java (leveraging Java streams, functional decomposition, etc.), with error handling. You will find that more often than not, the Java line count is far smaller.
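For what it's worth, here's a rough sketch of the Java half of that comparison (the input strings and the price-parsing task are invented): streams plus a small helper fold parsing, error handling, and aggregation into one pipeline, where idiomatic Go would need an explicit error check after each parse.

    import java.util.List;
    import java.util.Optional;
    import java.util.OptionalDouble;

    public class StreamsExample {
        // Error handling lives in one place; failures become empty Optionals.
        static Optional<Integer> parsePrice(String raw) {
            try {
                return Optional.of(Integer.parseInt(raw.trim()));
            } catch (NumberFormatException e) {
                return Optional.empty();
            }
        }

        public static void main(String[] args) {
            List<String> raw = List.of("120", " 95", "oops", "310");

            OptionalDouble averagePrice = raw.stream()
                    .map(StreamsExample::parsePrice)
                    .flatMap(Optional::stream)      // drop the failed parses
                    .mapToInt(Integer::intValue)
                    .average();

            System.out.println(averagePrice.orElse(0.0));
        }
    }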