So basically you have a very low density of representable numbers (2^64 / w218). I wonder how quickly it grows as you use more and more 1-bits, and whether there is even a correlation between the bit pattern and the corresponding number value.
This is now (at least) "The largest number representable by a Turing machine of fixed parameters that can then be squeezed into 64 bits."
(I don't remember my lambda calc, so … eh.)
Edit: to be clear, it's not like I've researched this proposal since I don't work for social media companies. It's just a feature I wish I could have on my posts.
> What if we allow representations beyond plain data types? Since we want representations to remain computable, the most general kind of representation would be a program in some programming language. But the program must be small enough to fit in 64 bits.
If you bring in a whole programming language, you bring in far more than 64 bits of information. This all seems to be the math equivalent of claiming you wrote a program in 2k when its first line is "import 'megalibrary'".
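To make that concrete, here's a toy sketch (my own construction, not from the article): interpret a bit string as a program for a two-instruction machine acting on an accumulator that starts at 2. The values it denotes depend entirely on the interpreter, which is exactly where the "extra" information beyond the raw bits lives.

```python
def decode(bits: str) -> int:
    """Toy interpreter: bit '0' squares the accumulator,
    bit '1' replaces it with 2 ** accumulator."""
    n = 2
    for b in bits:
        n = n * n if b == "0" else 2 ** n
    return n

# Even a handful of bits denotes values far beyond 2**64:
print(decode("000"))  # ((2^2)^2)^2 = 256
print(decode("001"))  # 2^16 = 65536
```

Swap in a richer instruction set (say, iterated exponentiation) and the very same 64-bit patterns denote vastly larger numbers, which is the sense in which the language, not the payload, is doing the work.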