It seems strange to me that a few good language designers and ML folks haven't grouped together to work on this.
It's clear there's space for an LLM meta-language designed to compile to bytecode, binary, JS, etc.
It also doesn't need to be textual like the code we write; it could be some form of AST that Llama can manipulate with ease.
Plenty of training data to go on, I'd imagine.
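To illustrate the AST idea, here's a minimal toy sketch (the node types and the to_js lowering are my own invented example, not any real project or Llama API): the model emits and edits nested nodes instead of source text, and small backends lower them to JS, bytecode, whatever.

    # Toy, non-textual program representation (hypothetical example).
    from dataclasses import dataclass
    from typing import Union

    @dataclass
    class Num:
        value: float

    @dataclass
    class Var:
        name: str

    @dataclass
    class BinOp:
        op: str            # "+", "-", "*", "/"
        left: "Expr"
        right: "Expr"

    Expr = Union[Num, Var, BinOp]

    def to_js(node: Expr) -> str:
        # Lower the AST to a JavaScript expression string.
        if isinstance(node, Num):
            return repr(node.value)
        if isinstance(node, Var):
            return node.name
        if isinstance(node, BinOp):
            return f"({to_js(node.left)} {node.op} {to_js(node.right)})"
        raise TypeError(f"unknown node: {node!r}")

    # e.g. the model proposes x * (y + 2) as structure rather than text:
    ast = BinOp("*", Var("x"), BinOp("+", Var("y"), Num(2)))
    print(to_js(ast))  # -> (x * (y + 2))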
Who will write the useful training data without LLMs? I feel we're getting fewer and fewer genuinely new things; changes will be smaller and more incremental.
To me it seems no different in kind from image or audio generation.