zlacker

[return to "Which programming languages are most token-efficient?"]
1. protoc+Qj[view] [source] 2026-01-12 04:02:55
>>tehnub+(OP)
I have always had concerns about physical robots making my work in the real world less safe.

But I had never considered that a programming language might be created that's less human-readable/auditable in order to enable LLMs.

Scares me a bit.

2. make3+Hq[view] [source] 2026-01-12 05:19:43
>>protoc+Qj
LLMs in their current form rely heavily on the vast amount of human data that's available: learning from it is the first step (the second step is RL).

We're not building a language for LLMs just yet.

3. energy+ux[view] [source] 2026-01-12 06:28:58
>>make3+Hq
It's worth asking why we haven't had the AlphaZero moment for general learning yet, where no human data is needed.
4. make3+n33[view] [source] 2026-01-12 21:20:53
>>energy+ux
That's easy: AlphaZero had a perfect simulator of the world it existed in (chess, which is trivial to simulate), so it could run simulations of that world ad infinitum and learn from them.

That's simply not the case for the real world: you can't simulate it perfectly and see what happens when you act.
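
To make that concrete, here's a minimal sketch of the "perfect simulator + self-play" idea, using tic-tac-toe as a stand-in for chess. Everything in it (the tabular value function, the epsilon-greedy move choice, the names) is illustrative only, not AlphaZero's actual method, which uses MCTS plus a neural network:

    # Minimal sketch: a perfect, free simulator (tic-tac-toe) plus self-play.
    # Illustrative only -- tabular Monte Carlo values, not AlphaZero's MCTS + network.
    import random
    from collections import defaultdict

    WINS = [(0,1,2),(3,4,5),(6,7,8),(0,3,6),(1,4,7),(2,5,8),(0,4,8),(2,4,6)]

    def winner(board):
        for a, b, c in WINS:
            if board[a] and board[a] == board[b] == board[c]:
                return board[a]
        return None

    value = defaultdict(float)   # board state -> estimated value for "X"
    counts = defaultdict(int)

    def play_one_game(epsilon=0.2):
        """Self-play one game, greedy w.r.t. learned values with some exploration."""
        board, player, history = [""] * 9, "X", []
        while True:
            moves = [i for i, c in enumerate(board) if c == ""]
            if random.random() < epsilon:
                move = random.choice(moves)
            else:
                def score(m):
                    nxt = tuple(board[:m] + [player] + board[m+1:])
                    return value[nxt] if player == "X" else -value[nxt]
                move = max(moves, key=score)
            board[move] = player
            history.append(tuple(board))
            w = winner(board)
            if w or "" not in board:
                return history, (1.0 if w == "X" else -1.0 if w == "O" else 0.0)
            player = "O" if player == "X" else "X"

    # The simulator is perfect and costs nothing, so experience is unlimited.
    for _ in range(20000):
        states, outcome = play_one_game()
        for s in states:                     # Monte Carlo update toward the result
            counts[s] += 1
            value[s] += (outcome - value[s]) / counts[s]

The point is the last loop: because the simulator is exact and free, the agent can generate as much training data as it wants without ever touching the real world.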
