Computing is fundamentally about decoding bit strings under arbitrary representations that are meaningful to humans.
The course on reading and using lambda calculus is similarly longer than the actual lambda calculus expression.
I'm not sure what a "course on reading and using" has to do with description complexity? In any case, it takes 206 bits to implement a binary lambda calculus interpreter (that's Theorem 1 in http://tromp.github.io/cl/LC.pdf )
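For concreteness, the encoding the paper uses is small enough to sketch: with terms in de Bruijn form, an abstraction is written as 00 followed by its body, an application as 01 followed by the two subterms, and a variable with index n as n ones followed by a zero. A rough Python sketch of just the encoder (the tuple representation of terms is my own choice here, not anything from the paper):

    # Minimal sketch of the BLC term encoding, assuming terms are given in
    # de Bruijn form as: int (1-based variable index), ("lam", body), or
    # ("app", function, argument). An illustration, not Tromp's reference code.
    def encode(term):
        if isinstance(term, int):    # variable n -> n ones, then a zero
            return "1" * term + "0"
        if term[0] == "lam":         # abstraction -> 00 + body
            return "00" + encode(term[1])
        if term[0] == "app":         # application -> 01 + function + argument
            return "01" + encode(term[1]) + encode(term[2])
        raise ValueError("unknown term: %r" % (term,))

    print(encode(("lam", 1)))           # identity: 0010 (4 bits)
    print(encode(("lam", ("lam", 2))))  # K combinator: 0000110 (7 bits)

As I read Theorem 1, the 206 bits quoted above are the length, in exactly this encoding, of a self-interpreter: a term that parses such a bit string and behaves like the term it encodes.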