
Knowledge, articles, thoughts, contributions
Information
What do signals mean? Is a bit, as a signal, already information, or does context determine its meaning? How does the receiver influence the perception and evaluation of signals? And what is the connection between information in information theory and entropy in physics?
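A pointer for the last question: Shannon's information entropy and the Gibbs entropy of statistical physics share the same mathematical form (standard notation, not taken from the articles below). For a probability distribution $p_i$,

$$H = -\sum_i p_i \log_2 p_i \qquad\text{and}\qquad S = -k_B \sum_i p_i \ln p_i,$$

differing only in Boltzmann's constant $k_B$ and the base of the logarithm.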
Selected articles on the topic «Information»
Logic, Paradoxes, Self-Referentiality
Simple instruction for generating paradoxes
The trick with which classical logical systems can be invalidated consists of two instructions: 1. A statement refers to itself. 2. The reference or the statement contains a negation. This combination always results in a paradox. A famous example is the barber who shaves all the …
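The two instructions can be made tangible outside formal logic. A minimal sketch in Python (an illustration of the recipe, not code from the article): a statement that refers to itself and negates that reference can never settle on a truth value.

```python
def liar() -> bool:
    # Instruction 1: the statement refers to itself (liar calls liar).
    # Instruction 2: the reference contains a negation (not ...).
    # If liar() were True it would have to be False, and vice versa:
    # there is no consistent truth value, so evaluation never terminates.
    return not liar()

try:
    liar()
except RecursionError:
    print("No fixed point: the self-negating statement has no truth value.")
```

The non-terminating recursion plays the role of the missing truth value: the system cannot assign True or False without contradicting itself.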
Information, Entropy
Which of these Preconceptions do you Share?
Entropy is for nerds
Entropy is incomprehensible
Entropy is thermodynamics
Entropy is noise
Entropy is absolute
Details
1. Entropy is the Basis of our Daily Lives
Nerds like to be interested in complex topics, and entropy fits in well, doesn't it? It helps them to portray themselves as …
Information, Semantics, Bits
What does a bit mean?
The question may seem trivial – after all, everyone knows that a bit represents a choice between 0 and 1. Or does it? So what's the problem? The problem is that 0 and 1 are not the only answers: they are just one pair of possible instantiations for a …
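A minimal sketch (illustrative only, not from the article) of where this teaser is heading: the bit itself is an abstract binary choice, and 0/1 is just one of many possible labellings of its two states.

```python
# The same one-bit choice under different labels: the information content
# (one binary distinction) is identical; only the instantiation differs.
instantiations = [
    (0, 1),
    (False, True),
    ("low voltage", "high voltage"),
    ("dot", "dash"),
]

choice = 1  # one bit of information: which of the two states was selected
for pair in instantiations:
    print(f"state: {pair[choice]!r}")
```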