Entropy and information
The term entropy is often avoided because it carries a complexity that cannot be argued away. But when we talk about information, we also have to talk about entropy, because entropy is the measure of the amount of information. We cannot understand what information is without understanding what entropy is.

Information is always relative

We believe that we can pack information away, just as we store bits in a storage medium. Bits then appear to be the information that is objectively there, like little beads on a chain, each of which can say yes or no. For us, this is information. But this image is misleading.
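The claim that entropy measures the amount of information is made precise by Shannon's formula H = -Σ p·log₂(p), which the text does not spell out; the following minimal sketch (the function name and example distributions are my own choices, not from the text) illustrates how it assigns bits to a probability distribution.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero p."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: each toss carries 1 bit.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A biased coin is more predictable, so each toss carries less information.
print(shannon_entropy([0.9, 0.1]))   # about 0.469 bits

# A certain outcome carries no information at all.
print(shannon_entropy([1.0]))        # 0.0
```

Note how the same physical coin yields different entropy values depending on the probabilities we assign to it, which already hints at the section's point: the amount of information is relative to a description, not an objective property of the object itself.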






