What is Entropy?

Definition of Entropy

The term entropy is often avoided because it carries a certain complexity. The phenomenon of entropy, however, is constitutive for everything that goes on in our lives. A closer look is worth the effort. Entropy is a measure of information, and it is defined as: entropy is the information that is known at the micro level but unknown at the macro level. The challenge of this definition is to understand what is meant by the micro and macro states, and to understand why entropy is a difference.

What is Meant by Micro and Macro Level?

The micro level contains the

4. September 2024 | Categories: Entropy | 0 Comments
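To make the definition above concrete, here is a minimal sketch, not from the post itself: the macro state is assumed to be a simple coin-count description, and the entropy is the information, in bits, still missing about the exact microstate. The names entropy_bits, n_coins and n_heads are illustrative choices, not terminology from the article.

```python
from math import comb, log2

def entropy_bits(n_coins: int, n_heads: int) -> float:
    """Information (in bits) known at the micro level but unknown at the macro level.

    Macro level: we only know how many of the coins show heads.
    Micro level: the exact heads/tails pattern of every individual coin.
    The entropy is log2 of the number of microstates compatible with the
    macrostate, i.e. the information still needed to pin the microstate down.
    """
    microstates = comb(n_coins, n_heads)  # micro-arrangements consistent with the macro description
    return log2(microstates)

print(entropy_bits(100, 50))   # ~96.3 bits unknown: many patterns fit "50 of 100 heads"
print(entropy_bits(100, 100))  # 0.0 bits unknown: only one pattern fits "all heads"
```

In this toy picture the entropy is exactly the difference between what the micro description fixes and what the macro description reveals, which is the sense of "entropy as a difference" used above.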

Paradoxes and Logic (Part 2)

Continues Paradoxes and Logic (Part 1).

"Draw a Distinction"

Spencer-Brown introduces the elementary building block of his formal logic with the words "Draw a Distinction". Figure 1 shows this very simple formal element:

Fig. 1: The form of Spencer-Brown

A Radical Abstraction

In fact, his logic consists exclusively of this building block. Spencer-Brown has thus achieved an extreme abstraction, more abstract than anything mathematicians and logicians have found so far. What is the meaning of this form? Spencer-Brown is aiming at an elementary process, namely the "drawing of a distinction". This elementary process now divides the world into

22. August 2024 | Categories: Information, Logic, Paradoxes | 0 Comments
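As an illustration of how much this single building block can carry, here is a small sketch in Python. The nested-tuple encoding is an assumption of this sketch, not Spencer-Brown's own notation; the evaluation rules are the two laws of his primary arithmetic from Laws of Form, which the excerpt above does not yet reach: the law of calling (two marks side by side count as one) and the law of crossing (a mark inside a mark cancels to the unmarked state).

```python
# A distinction (mark) is modelled as a tuple holding whatever marks it contains;
# the outermost space is itself just a tuple of marks.

def is_marked(space: tuple) -> bool:
    """Evaluate a space of Spencer-Brown's primary arithmetic.

    Law of crossing: a mark whose inside evaluates to the marked state is
    itself unmarked ((()) = void); a mark with an unmarked inside stands as a mark.
    Law of calling: several marks in the same space condense to one, hence any().
    """
    return any(not is_marked(inside) for inside in space)

# The empty space is the unmarked state; a single empty mark is the marked state.
print(is_marked(()))         # False: nothing distinguished
print(is_marked(((),)))      # True:  one distinction drawn
print(is_marked((((),),)))   # False: law of crossing, (()) cancels
print(is_marked(((), ())))   # True:  law of calling, ()() condenses to ()
```

The point of the sketch is only that nesting and juxtaposing one and the same mark already suffices to compute with, which is the radical abstraction the post refers to.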