zlacker

[return to "Ask HN: What scientific phenomenon do you wish someone would explain better?"]
1. pjungw+Jk[view] [source] 2020-04-26 21:48:43
>>qqqqqu+(OP)
Entropy. Sometimes you read that it's a measure of randomness; sometimes, information. Aren't randomness and information opposites?
2. baron_+1n[view] [source] 2020-04-26 22:08:59
>>pjungw+Jk
Entropy is measured in bits (the exact same bits we talk about in computing), which can help unify the ideas of information and randomness:

Which can hold more information: a 1.44 MB floppy disk or a 1 TB hard disk?

Which password is more random (i.e. harder to guess): one that fits in 1 byte of memory or one that takes 64 bytes?

Information theory deals with determining exactly how many bits it would take to encode a given message.
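To make the bits-as-entropy idea concrete, here is a small sketch (my own illustration, not from the comment above) computing Shannon entropy, the standard measure of how many bits a distribution carries:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A password chosen uniformly from 256 possibilities carries 8 bits --
# exactly the 1 byte of memory needed to store which one was picked.
print(shannon_entropy([1 / 256] * 256))  # 8.0

# A fair coin flip carries 1 bit of information.
print(shannon_entropy([0.5, 0.5]))  # 1.0

# A biased coin is more predictable, so it carries less entropy.
print(shannon_entropy([0.9, 0.1]))  # ~0.469
```

The uniform case shows the link directly: more equally likely possibilities means more randomness, which means more bits needed to pin down the answer.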
