Entropy and information

The term entropy is often avoided because it carries a complexity that cannot be argued away. But when we talk about information, we also have to talk about entropy, because entropy is the measure of the amount of information. We cannot understand what information is without understanding what entropy is. Information is always relative. We believe that we can pack information, just as we store bits in a storage medium. Bits are then the information that is objectively available, like little beads in a chain that can each say yes or no. For us, this is information. But this image is
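The claim that entropy measures the amount of information can be made concrete with Shannon's formula, which the post alludes to. A minimal Python sketch (my illustration, not taken from the article itself):

```python
import math

def shannon_entropy(probabilities):
    """Entropy in bits: the average amount of information per outcome."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin carries 1 bit of information per toss; a heavily biased coin
# carries less, because its outcomes are more predictable.
fair = shannon_entropy([0.5, 0.5])    # 1.0 bit
biased = shannon_entropy([0.9, 0.1])  # about 0.47 bits
print(fair, biased)
```

The more predictable the source, the less information each outcome delivers: this is the sense in which entropy quantifies information.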

By | 11 September 2024 | Categories: Information | 0 Comments

Information Reduction 8: Different Macro States

Two states at the same time. In my last article I showed how a system can be described at two levels: that of the micro state and that of the macro state. At the micro level, all the information is present in full detail; at the macro level there is less information, but what there is, is more stable. We have already discussed the example of the glass of water, where the micro state describes the movement of the individual water molecules, whereas the macro state encompasses the temperature of the liquid. In this paper I would like to discuss how

Information Reduction 7: Micro and Macro State

Examples of information reduction. In previous texts we looked at examples of information reduction in the following areas: coding/classification, sensory perception, DRG (flat rate per case), opinion formation, and thermodynamics. What do they have in common? Micro and macro state. What all these examples have in common is that, in terms of information, there are two states: a micro state with a great many details and a macro state with much less information. One very clear example that many of us will remember from our school days is the relationship between the two levels in thermodynamics. The two states
