Life, Statistics, Biology discovery
Understand knowledge and ignorance in terms of the concept of information, as it arises in mathematics, computer science, and physics.
- What is the definition of information?
- How are information and entropy related?
- How are information and communication related?
- How does information relate to the levels of knowledge in the foursome?
- How does information entropy relate to the eigenvalues of a {$1\times 1$} random matrix and, more generally, of an {$n\times n$} random matrix? (Explored in the sketch below.)
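The last question invites a numerical experiment. The sketch below (Python with NumPy) is one possible reading, and an assumption on my part rather than a standard definition: treat the normalized absolute eigenvalues of a random symmetric matrix as a probability distribution and measure its Shannon entropy. For a {$1\times 1$} matrix the distribution is a single point, so the entropy is 0; for an {$n\times n$} matrix the entropy stays below the uniform maximum of {$\log_2 n$} bits.

```python
# Exploratory sketch (assumption: the normalized absolute eigenvalues of
# a random symmetric matrix are read as a probability distribution; this
# is one possible interpretation of the question, not a definition).
import numpy as np

def eigenvalue_entropy(n, rng):
    """Shannon entropy (bits) of the normalized |eigenvalues| of an
    n x n random symmetric matrix."""
    a = rng.normal(size=(n, n))
    sym = (a + a.T) / 2               # symmetrize so eigenvalues are real
    lam = np.abs(np.linalg.eigvalsh(sym))
    p = lam / lam.sum()               # normalize into a distribution
    p = p[p > 0]                      # 0 * log 0 = 0 by convention
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(0)
print(eigenvalue_entropy(1, rng))     # 1x1 case: single outcome, entropy 0
print(eigenvalue_entropy(100, rng))   # below log2(100) ≈ 6.64 bits
```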
Information
- The resolution of uncertainty (ambiguity, symmetry); made concrete in the sketch after this list.
- Symmetry breaking.
- A bridge from nondeterminism to determinism.
- The reversal of irreversibility. Thus related to the duality of body and mind.
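One concrete reading of the first bullet: to identify one possibility among {$N$} equally likely candidates, a questioner needs {$\log_2 N$} yes/no answers, each answer resolving one bit of uncertainty. A minimal sketch (Python; the guessing-game framing is my illustration, not from the notes above):

```python
# Minimal sketch: information as the resolution of uncertainty.
# Identifying a secret number among N equally likely candidates by
# halving (binary search) takes ceil(log2 N) yes/no answers -- each
# answer supplies one bit of information.
import math

def questions_needed(secret, n):
    """Count the yes/no questions ('is it below m?') needed to pin
    down `secret` among the candidates 0..n-1."""
    lo, hi, count = 0, n, 0
    while hi - lo > 1:
        mid = (lo + hi) // 2
        count += 1
        if secret < mid:        # one yes/no answer = one bit
            hi = mid
        else:
            lo = mid
    return count

n = 1024
print(questions_needed(777, n))   # 10 questions
print(math.log2(n))               # 10.0 bits of initial uncertainty
```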
Entropy
- Quantifies, as {$S=k_B\ln\Omega$}, the number {$\Omega$} of microscopic configurations (microstates) that are consistent with the macroscopic quantities characterizing the system (such as its volume, pressure, and temperature). Here we assume that each microstate is equally probable. (Made concrete in the sketch after this list.)
- Expresses the number of possible solutions to a given set of constraints.
- Expresses the ambiguity in the possible explanations of the given situation.
- Expresses the degree of irreversibility.
- Shannon entropy of a random variable: {$H=-\sum_i p_i \log_2 p_i$}. The average level of uncertainty (equivalently, of surprise or information) across the variable's possible outcomes; the average amount of information a single outcome of the variable conveys.
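The two formulas above meet in the simplest case. The sketch below (Python; the system of {$N$} two-state coins with a fixed number of heads is my own toy example) counts the microstates {$\Omega$} of a macrostate, computes the Boltzmann entropy {$S=k_B\ln\Omega$}, and then computes the Shannon entropy of the uniform distribution over those microstates, which comes out to {$\log_2\Omega$} bits: when every microstate is equally probable, the two notions agree up to units.

```python
# Sketch linking Boltzmann and Shannon entropy. Toy model (my choice):
# N two-state "coins", macrostate = the number of heads.
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def omega(n, heads):
    """Microstate count: ways to place `heads` heads among n coins."""
    return math.comb(n, heads)

def boltzmann_entropy(n, heads):
    """S = k_B ln(Omega), assuming each microstate is equally probable."""
    return K_B * math.log(omega(n, heads))

def shannon_entropy(probs):
    """H = -sum_i p_i log2(p_i), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

n, heads = 10, 5
w = omega(n, heads)                   # 252 microstates
print(boltzmann_entropy(n, heads))    # S ≈ 7.63e-23 J/K
uniform = [1.0 / w] * w               # equal probability per microstate
print(shannon_entropy(uniform))       # ≈ log2(252) ≈ 7.98 bits
print(math.log2(w))                   # H = log2(Omega) exactly
```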
Tendency towards maximum entropy
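A standard toy illustration of this tendency, chosen here for concreteness, is the Ehrenfest urn model: {$N$} particles sit in two chambers, and at each step one particle, picked uniformly at random, hops to the other chamber. Starting from the all-left macrostate, the Shannon entropy of the (left, right) occupation fractions climbs toward its maximum of 1 bit, the 50/50 split, simply because vastly more microstates realize the balanced macrostate.

```python
# Sketch: tendency toward maximum entropy (Ehrenfest urn model;
# an illustrative toy simulation, my choice of example).
import math
import random

def two_state_entropy(left, total):
    """Shannon entropy (bits) of the (left, right) occupation fractions."""
    h = 0.0
    for k in (left, total - left):
        p = k / total
        if p > 0:
            h -= p * math.log2(p)
    return h

random.seed(0)
n = 1000
left = n            # start far from equilibrium: everything on the left
for step in range(1, 5001):
    # Pick a particle uniformly at random; it hops to the other chamber.
    if random.random() < left / n:
        left -= 1
    else:
        left += 1
    if step % 1000 == 0:
        print(step, left, round(two_state_entropy(left, n), 4))
# The entropy climbs toward its maximum of 1 bit (left ≈ n/2) and then
# fluctuates near it.
```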
Constructor theory