Ijon Tichy Meets Artificial Intelligence

Stanislaw Lem on entropy (littering)

Littering in space has been a concern since long before Elon Musk's Starlink programme, and various methods for cleaning up the growing clutter in Earth's orbit are currently under discussion. The task is not easy because, due to the second law and its inevitable increase in entropy, all littering tends to grow exponentially. If one of the thousands of pieces of scrap metal in orbit is hit by another piece, the piece that was hit shatters into many new fragments that fly around at insane speeds. Space pollution is therefore a self-perpetuating …
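The cascade dynamics can be illustrated with a toy calculation. A minimal sketch in Python; the collision probability and fragment count below are invented for illustration, not empirical values:

```python
def debris_growth(years=50, pieces=10_000.0,
                  hit_probability=0.01, fragments_per_hit=10):
    """Toy model of a collision cascade in orbit.

    Each year a fraction of the pieces is hit; every hit replaces
    one piece with several fragments, so the population multiplies
    by a constant factor per year, i.e. it grows exponentially.
    """
    growth_factor = 1 + hit_probability * (fragments_per_hit - 1)
    for _ in range(years):
        pieces *= growth_factor
    return pieces

for horizon in (10, 25, 50):
    print(f"after {horizon:2d} years: {debris_growth(years=horizon):,.0f} pieces")
```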

Entropy between Micro and Macro Level

Two Levels Define Entropy: Micro and Macro

The conventional physical definition of entropy characterises it as a difference between two levels: a detail level and an overview level.

Example: Coffee Cup

Boltzmann's thermal entropy is the classic case, illustrated with an ideal gas: the temperature (one value) is directly linked to the kinetic energies of the individual gas molecules (about 10²³ values). With certain adjustments, this applies to any material object, e.g. also to a coffee cup. Thermal macro state: the temperature of the liquid in the cup. Thermal micro state: the kinetic energies of all …
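The many-to-one relationship between the two levels can be made concrete in a few lines. A minimal sketch in Python, using the translational kinetic energy of gas molecules (particle count and mass chosen for illustration): one macro value, the temperature, is recovered from millions of micro values.

```python
import numpy as np

k_B = 1.380649e-23   # Boltzmann constant, J/K
m = 4.65e-26         # mass of one N2 molecule, kg

rng = np.random.default_rng(0)
T_true = 300.0       # the macro value we put in, in kelvin

# Micro state: one velocity vector per molecule. We use 10**6 molecules
# instead of the 10**23 of a real cup, but the principle is the same.
sigma = np.sqrt(k_B * T_true / m)            # Maxwell-Boltzmann spread per component
v = rng.normal(0.0, sigma, size=(10**6, 3))  # millions of micro values

# Macro state: a single value, recovered by averaging over the micro level:
# <E_kin> = (3/2) k_B T  =>  T = 2 <E_kin> / (3 k_B)
E_kin = 0.5 * m * (v**2).sum(axis=1)
T_macro = 2.0 * E_kin.mean() / (3.0 * k_B)
print(f"temperature from the micro state: {T_macro:.1f} K")  # ~300 K
```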

Entropy and Information

The term entropy is often avoided because it contains a certain complexity that cannot be argued away. But when we talk about information, we also have to talk about entropy, because entropy is the measure of the amount of information. We cannot understand what information is without understanding what entropy is.

Information is Always Relative

We believe that we can pack up information, just as we store bits on a storage medium. Bits then seem to be information that exists objectively, like little beads on a chain, each of which can say yes or no. For us, this is information. But this image is …
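That entropy measures an amount of information is what Shannon's formula H = −Σ pᵢ log₂ pᵢ expresses. A minimal sketch in Python (the example distributions are chosen purely for illustration):

```python
from math import log2

def shannon_entropy(probabilities):
    """Average information per symbol, in bits."""
    return -sum(p * log2(p) for p in probabilities if p > 0)

# A fair coin carries one full bit per toss ...
print(shannon_entropy([0.5, 0.5]))     # 1.0
# ... a heavily biased coin carries much less ...
print(shannon_entropy([0.99, 0.01]))   # ~0.081
# ... and a certain outcome carries no information at all.
print(shannon_entropy([1.0]))          # 0.0
```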

11 September 2024 | Categories: Information

What is Entropy?

Definition of Entropy

The term entropy is often avoided because it contains a certain complexity. The phenomenon of entropy, however, is constitutive for everything that goes on in our lives. A closer look is worth the effort. Entropy is a measure of information, and it is defined as follows: entropy is the information that is known at the micro level but unknown at the macro level. The challenge of this definition is to understand what is meant by the micro and macro states, and to understand why entropy is a difference.

What is Meant by Micro and Macro Level?

The micro level contains the …
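The definition "known at micro, unknown at macro level" can be made tangible by counting. A minimal sketch in Python, assuming coin sequences as micro states and the number of heads as the macro state: the entropy of a macro state is the information still missing to pin down the exact micro state, i.e. log₂ of the number of compatible micro states.

```python
from math import comb, log2

def entropy_of_macrostate(n_coins, n_heads):
    """Bits missing at the macro level to identify the exact micro state."""
    microstates = comb(n_coins, n_heads)  # sequences compatible with the count
    return log2(microstates)

# The macro state "50 heads out of 100" leaves many sequences open ...
print(f"{entropy_of_macrostate(100, 50):.1f} bits")   # ~96.3 bits
# ... while "100 heads out of 100" fixes the micro state completely.
print(f"{entropy_of_macrostate(100, 100):.1f} bits")  # 0.0 bits
```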

4 September 2024 | Categories: Entropy

Georg Spencer-Brown's Distinction and the Bit

Continues Paradoxes and Logic (Part 2)

History

Before we take up Georg Spencer-Brown's (GSB's) distinction as a basic element for logic, physics, biology and philosophy, it is helpful to compare it with another, much better-known basic form, namely the bit. This allows us to better understand the nature of GSB's distinction and the revolutionary character of his innovation. Bits and GSB forms can both be regarded as basic building blocks for information processing. Software structures are technically based on bits, but GSB's forms ('draw a distinction') are just as simple, just as fundamental, and astonishingly similar. Nevertheless, there are characteristic differences. Fig. 1: …
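The comparison can be made concrete in code. A minimal sketch in Python, under the assumption that nested tuples stand in for GSB forms (the mark as a container) and that the two arithmetic rules of Laws of Form, condensation ( ()() = () ) and cancellation ( (()) = blank ), govern evaluation; the tuple encoding is my own illustration, not GSB's notation:

```python
# A form is a tuple of sub-forms; the empty tuple () is the empty mark.
# The top-level sequence with no mark at all is the unmarked state.

def is_marked(form):
    """Evaluate a space of GSB forms to marked/unmarked.

    A cross evaluates to 'marked' iff its content is unmarked;
    a space is marked iff it contains at least one such cross.
    This reproduces condensation ( ()() = () ) and
    cancellation ( (()) = blank ).
    """
    return any(not is_marked(sub) for sub in form)

MARK = ()  # the empty cross

print(is_marked((MARK,)))       # ()   at top level -> marked:   True
print(is_marked((MARK, MARK)))  # ()() condenses to ()        -> True
print(is_marked(((MARK,),)))    # (()) cancels to blank       -> False
print(is_marked(()))            # empty space     -> unmarked: False
```

Note the contrast the post draws: a bit presupposes two given states (0/1), while the form generates its two values from a single act, the drawing of a distinction.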

23 August 2024 | Categories: Information, Logic, Bits

Five Preconceptions about Entropy

Which of these Preconceptions do you Share?

1. Entropy is for nerds
2. Entropy is incomprehensible
3. Entropy is thermodynamics
4. Entropy is noise
5. Entropy is absolute

Details

1. Entropy is the Basis of our Daily Lives

Nerds like to be interested in complex topics, and entropy fits in well, doesn't it? It helps them to portray themselves as superior intellectuals. This is not your game, and you might not see any practical reason to occupy yourself with entropy. This attitude is very common and quite wrong. Entropy is not a nerdy topic; it has a fundamental impact on our lives, from elementary physics …

16 August 2024 | Categories: Information, Entropy

How real is the probable?

AI Can Only See What is in the Corpus

Corpus-based systems are on the road to success. They are "disruptive", i.e. they change our society substantially within a very short period of time. Reason enough to recall how these systems really work. In previous blog posts I explained that these systems consist of two parts, namely a data corpus and a neural network. Naturally, the network is unable to recognise anything that is not already in the corpus. The blindness of the corpus automatically carries over into the neural network, and the AI is ultimately only able …
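The blindness of the corpus is visible even in the most primitive corpus-based generator. A minimal sketch in Python, using a bigram model as a stand-in for the corpus/network pair (real systems use neural networks and vastly larger corpora, but the limitation in question is of the same kind): everything the model can ever produce was already in its corpus.

```python
import random
from collections import defaultdict

corpus = "the cat sees the dog and the dog sees the cat".split()

# "Training": record which word follows which in the corpus.
follows = defaultdict(list)
for a, b in zip(corpus, corpus[1:]):
    follows[a].append(b)

def generate(start="the", length=8, seed=42):
    """Generate text by walking the recorded word transitions."""
    random.seed(seed)
    words = [start]
    for _ in range(length - 1):
        words.append(random.choice(follows[words[-1]]))
    return " ".join(words)

print(generate())
# Every generated word stems from the corpus; a word like "elephant"
# can never appear, because the model has never seen it.
print(set(generate().split()) <= set(corpus))  # True
```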
