
Introduction

Understand knowledge and ignorance in terms of the concept of information as relevant in mathematics, computer science and physics.

Questions

  • What is the definition of information?
  • How are information and entropy related?
  • How are information and communication related?
  • How does information relate to the levels of knowledge in the foursome?
  • How does information entropy relate to a {$1\times 1$} random matrix, an {$n\times n$} random matrix, and their eigenvalues?

Information

  • The resolution of uncertainty (ambiguity, symmetry); see the worked example after this list.
  • Symmetry breaking.
  • A bridge from nondeterminism to determinism.
  • The reversal of irreversibility, and thus related to the duality of body and mind.
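
A minimal worked example of information as the resolution of uncertainty, under the usual Shannon reading (my numbers, for illustration): a fair coin has two equally likely outcomes, so learning the result resolves {$\log_2 2 = 1$} bit; a fair eight-sided die resolves {$\log_2 8 = 3$} bits. In each case the entropy before the outcome, minus the entropy after (zero, since the outcome is then certain), is the information gained.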

Entropy

  • Quantifies, as {$S=k_{B}\ln\Omega$}, the number {$\Omega$} of microscopic configurations (microstates) that are consistent with the macroscopic quantities that characterize the system (such as its volume, pressure and temperature). Here we assume that each microstate is equally probable.
  • Expresses the number of possible solutions to a given set of constraints.
  • Expresses the ambiguity in the possible explanations of the given situation.
  • Expresses the degree of irreversibility.
  • Shannon entropy of a random variable: {$H=-\sum_{i}p_i \log_2 p_i$}. The average level of uncertainty (surprise, information) in the variable's possible outcomes; the average amount of information contributed by the variable. See the sketch after this list.
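
A minimal sketch in Python of both formulas above; the function names are mine, for illustration:

    import math

    K_B = 1.380649e-23  # Boltzmann constant, in joules per kelvin

    def boltzmann_entropy(omega):
        """S = k_B ln(Omega), for omega equally probable microstates."""
        return K_B * math.log(omega)

    def shannon_entropy(probs):
        """H = -sum_i p_i log2(p_i), in bits.
        Outcomes with p = 0 contribute nothing, since p log p -> 0."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(shannon_entropy([0.5, 0.5]))    # 1.0: a fair coin carries one bit
    print(shannon_entropy([1.0]))         # -0.0: a certain outcome carries none
    print(shannon_entropy([0.25] * 4))    # 2.0: four equal outcomes, two bits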

Tendency towards maximum entropy

Constructor theory

Information is what you learn. What you learn grows at the boundary and has the shape of the boundary. A shape can thus be thought of as created by integrating over these boundaries as they grow.
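
One concrete instance of this picture (my example, not the source's): the volume of a ball arises by integrating its boundary spheres as the radius grows, {$V(R)=\int_0^R 4\pi r^2\,dr=\tfrac{4}{3}\pi R^3$}; the shape is the accumulation of its successive boundaries.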

Gregory Chaitin = Shannon + Turing = Compression-Decompression as understanding.
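
A hedged illustration of that equation in Python, using zlib compression as a crude, computable stand-in for Chaitin's algorithmic information content (which is uncomputable); the names are mine:

    import random
    import string
    import zlib

    def compressed_size(s):
        """Bytes after DEFLATE compression: a rough upper bound on the
        string's algorithmic information content."""
        return len(zlib.compress(s.encode()))

    patterned = "ab" * 500  # 1000 characters, but a very short description
    randomish = "".join(random.choice(string.ascii_lowercase) for _ in range(1000))

    print(compressed_size(patterned))  # small: the pattern is "understood"
    print(compressed_size(randomish))  # much larger: little structure to exploit

Understanding, on this view, is finding the short program (compression) and running it back (decompression).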

Philosophy of computation

Information capacity is zero if the probabilities are the same for all cases regardless of what is chosen, but also if one case is given 100%. Information transmission requires asymmetry; otherwise you cannot define choice. See the sketch below.
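
One way to make this precise, offered as a sketch rather than the author's own formalism: read "capacity" as the mutual information between what is chosen (the input) and what is received (the output). It vanishes both when the output is distributed identically for every input and when a single outcome is certain. A minimal Python sketch, with names of my own choosing:

    import math

    def mutual_information(joint):
        """I(X;Y) = sum_{x,y} p(x,y) log2( p(x,y) / (p(x) p(y)) ), in bits.
        joint[x][y] is the joint probability of input x and output y."""
        px = [sum(row) for row in joint]
        py = [sum(col) for col in zip(*joint)]
        return sum(
            pxy * math.log2(pxy / (px[i] * py[j]))
            for i, row in enumerate(joint)
            for j, pxy in enumerate(row)
            if pxy > 0
        )

    # Output independent of input: every case equally likely, nothing transmitted.
    print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0
    # Output copies input: maximal asymmetry, one full bit per use.
    print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))      # 1.0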

Probability

Quantum computing

https://en.wikipedia.org/wiki/Koopman%E2%80%93von_Neumann_classical_mechanics

https://www.amazon.com/Quantum-Challenge-Foundations-Mechanics-Astronomy/dp/076372470X

In what sense are Feynman diagrams relativistic given that they have directions for time and for space?

Instead of thinking of the speed of light, think of a clock that doesn't tick, so that {$t=0$} always. This is the case for the quantum harmonic oscillator and for particle-clocks with no steps.
