zlacker

1. baron_+ (OP) 2020-04-26 22:08:59
Entropy is measured in bits (the exact same bits we talk about in computing). This can help unify the ideas of information and randomness:

Which disk can contain more information: a 1.44 MB floppy disk or a 1 TB hard disk?

Which password is more random (i.e. harder to guess): one that can be stored in only 1 byte of memory or one that needs 64 bytes?

Information theory deals with determining exactly how many bits it would take to encode a given message.
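To make the password example concrete, here's a minimal sketch: for a uniformly random value, the entropy in bits is just log2 of the number of equally likely possibilities, so storage size puts a hard ceiling on randomness.

```python
import math

def entropy_bits(num_possibilities: int) -> float:
    # Entropy of a uniformly random choice among N equally likely
    # possibilities, measured in bits: log2(N).
    return math.log2(num_possibilities)

# A password that fits in 1 byte has at most 2**8 = 256 possible values:
print(entropy_bits(2**8))    # 8.0 bits of entropy
# One stored in 64 bytes can have up to 2**512 possible values:
print(entropy_bits(2**512))  # 512.0 bits of entropy
```

So the 64-byte password can be (at most) 64x more random in the entropy sense, which is exactly why it's harder to guess.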
