zlacker

[return to "Show HN: GlyphLang – An AI-first programming language"]
1. p0w3n3+dz[view] [source] 2026-01-11 07:07:25
>>goose0+(OP)
I think the gain is very little. Almost every English word is one token, and the same goes for programming language keywords. So you're just replacing one keyword with another. The only gain in the example given is > instead of jsonify(), which would be ~4 tokens.

Please check your idea against tiktokenizer

2. p0w3n3+7g1[view] [source] 2026-01-11 14:23:22
>>p0w3n3+dz
I've checked and you get a decrease from 36 to 30 tokens, but no human readability. Sounds like a poor trade
3. goose0+3U3[view] [source] 2026-01-12 09:42:13
>>p0w3n3+7g1
Looks like my tokenization review method was incorrect, which is honestly a little embarrassing on my part. It probably would have been a lot longer before I discovered it on my own, so thanks for the comment!

I just went through and ran equivalent code samples from the GlyphLang repo (vs. the sample code I posted, which I'm assuming you ran) through tiktoken and found slightly lower percentages, but still not insignificant ones: on average 35% fewer tokens than Python and 56% fewer than Java. I've updated the README with the corrected figures and methodology if you want to check: https://github.com/GlyphLang/GlyphLang/blob/main/README.md#a...

4. p0w3n3+Ac7[view] [source] 2026-01-13 09:25:32
>>goose0+3U3
Yeah Java IS verbose. Thanks!