zlacker

[return to "Which programming languages are most token-efficient?"]
1. protoc+Qj 2026-01-12 04:02:55
>>tehnub+(OP)
I have always had concerns about physical robots making my work less safe in the real world.

But I had never considered that a programming language might be created that's less human-readable/auditable in order to better serve LLMs.

Scares me a bit.

2. make3+Hq 2026-01-12 05:19:43
>>protoc+Qj
LLMs in their current form rely heavily on the vast amount of human-generated data that's available; learning from it is the first step (the second step is RL).

We're not building a language for LLMs just yet.

3. jagged+vy 2026-01-12 06:37:42
>>make3+Hq
> We're not building a language for LLMs just yet.

Working on it, actually! I think it's a really interesting problem space: being token-efficient, readable by humans for review, strongly and statically typed for reasoning purposes, and having extremely regular syntax. One of the biggest issues with symbol-heavy syntax is that matching parentheses is relatively easy for a human, but models still struggle with it.
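To illustrate the parenthesis point: verifying balance is mechanically trivial for a deterministic tool, even though generating deeply nested, correctly balanced code is a known weak spot for models. A toy sketch (plain Python, names are my own, illustrative only):

```python
# Toy bracket-balance checker: a stack makes verification trivial,
# which is why a language/toolchain can catch what a model mis-generates.
PAIRS = {")": "(", "]": "[", "}": "{"}

def balanced(src: str) -> bool:
    stack = []
    for ch in src:
        if ch in "([{":
            stack.append(ch)          # remember the opener
        elif ch in PAIRS:
            # a closer must match the most recent unmatched opener
            if not stack or stack.pop() != PAIRS[ch]:
                return False
    return not stack                  # no unmatched openers left

print(balanced("(foo [bar {baz}])"))  # True
print(balanced("(foo [bar)]"))        # False
```

A more regular syntax reduces how often the model has to get this right in the first place, rather than relying on a checker to catch it after the fact.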

I expect a language like the one I'm playing with to mature enough over the next couple of years that models with a knowledge cutoff around 1/2027 will know how to program in it well enough for it to start being viable.

One of the things I plan to do is build evals so I can validate the performance of various models on my as-yet only partially baked language. I'm also using only LLMs to build out the entire infrastructure, mostly to see if that's possible.

4. zcw100+6G2 2026-01-12 19:27:28
>>jagged+vy
Working on it too. Mine is actually more like a metalanguage that is very token-efficient.