Introduction

Understand knowledge and ignorance in terms of the concept of information as relevant in mathematics, computer science and physics.


Questions

  • What is the definition of information?
  • How are information and entropy related?
  • How are information and communication related?
  • How does information relate to the levels of knowledge in the foursome?
  • How does information entropy relate to a {$1\times 1$} random matrix, to an {$n\times n$} random matrix, and to their eigenvalues?

Information

  • The resolution of uncertainty (ambiguity, symmetry); see the sketch after this list.
  • Symmetry breaking.
  • A bridge from nondeterminism to determinism.
  • The reversal of irreversibility. Thus related to the duality of body and mind.
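
A minimal sketch of the first bullet, in Python (my own illustration, not from this page; the function name bits_to_resolve is hypothetical): singling out one of {$N$} equally likely alternatives resolves {$\log_2 N$} bits of uncertainty, and each yes/no answer can at best halve the remaining ambiguity.

    import math

    def bits_to_resolve(n_alternatives: int) -> float:
        # Bits needed to single out one of n equally likely alternatives.
        return math.log2(n_alternatives)

    for n in (2, 8, 64, 1000):
        print(f"{n:5d} alternatives -> {bits_to_resolve(n):.3f} bits")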

Entropy

  • Quantifies as {$S=k_{B}\ln\Omega$} the number {$\Omega$} of microscopic configurations (microstates) that are consistent with the macroscopic quantities that characterize the system (such as its volume, pressure and temperature). Here we assume that each microstate is equally probable.
  • Expresses the number of possible solutions to a given set of constraints.
  • Expresses the ambiguity in the possible explanations of the given situation.
  • Expresses the degree of irreversibility.
  • Shannon entropy of a random variable: {$H=-\sum_{i}p_i \log_2 p_i$}. The average level of uncertainty (surprise, information) in the variable's possible outcomes; the average amount of information contributed by the variable. (A worked sketch of this formula and the Boltzmann formula above follows this list.)
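
A worked sketch of the two formulas above, in Python (my own illustration; the function names are hypothetical). For {$\Omega$} equally probable microstates the Shannon entropy in bits reduces to {$\log_2\Omega$}, paralleling the Boltzmann form {$S=k_{B}\ln\Omega$} up to the choice of units.

    import math

    K_B = 1.380649e-23  # Boltzmann constant, J/K

    def boltzmann_entropy(omega: int) -> float:
        # S = k_B ln(Omega) for Omega equally probable microstates.
        return K_B * math.log(omega)

    def shannon_entropy(probabilities) -> float:
        # H = -sum p_i log2(p_i), in bits; zero-probability outcomes contribute nothing.
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    omega = 16
    print(boltzmann_entropy(omega))                    # about 3.83e-23 J/K
    print(shannon_entropy([1 / omega] * omega))        # 4.0 bits = log2(16)
    print(shannon_entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75 bits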

Tendency towards maximum entropy
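
One standard illustration of this tendency (my addition, using the Ehrenfest urn model; nothing below is from the original page): {$N$} labelled balls sit in two urns, at each step one ball is chosen uniformly at random and moved to the other urn, and the entropy {$S=\ln\binom{N}{k}$} of the macrostate given by the number {$k$} of balls in the left urn drifts toward its maximum near {$k=N/2$}, whatever the starting state.

    import math
    import random

    def ehrenfest(n_balls=100, steps=2000, seed=0):
        random.seed(seed)
        k = 0  # all balls start in the right urn: the lowest-entropy macrostate
        for step in range(steps + 1):
            if step % 500 == 0:
                entropy = math.log(math.comb(n_balls, k))  # S = ln C(N, k)
                print(f"step {step:5d}  left urn: {k:3d}  S = {entropy:6.2f}")
            # the chosen ball is in the left urn with probability k / n_balls
            if random.random() < k / n_balls:
                k -= 1
            else:
                k += 1

    ehrenfest()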

Constructor theory
