Grace and Justice: The Deliberate Ambiguity of Whether We Live in a System Which is Open or Closed
Entropy
- Make a diagram of the many concepts that relate to entropy and its definition.
- What, intuitively, is the difference between classical entropy and quantum entropy? Nobody knows.
- How to define increasing stability in terms of probability?
- Entropy = deliberateness - causal state - how instrumental? "interest" as an instrument for a cause.
- How might increasing stability complement increasing entropy?
- How is entropy related to Brownian motion?
- Entropy + information = conservation ?
- How do you get the control back (God's will)? Why would there be a return to control?
- Is investigation an expression of entropy? For example, I throw a ball onto the top of a mountain and watch how it rolls down, then I throw it again, and so on. What is the probability that this would happen backwards, in the reverse direction of time?
- Understand the difference between grace and justice in entropy.
Learn about:
Readings
Suggested by John Baez
Suggested by a fan of Peter Neuman
Other
- Information Geometry by John Baez
- D. E. Stevenson. Exploring an Information-Based Approach to Computation and Computational Complexity. Category theory used to describe computation, validation, the trace of a calculation.
- J. Vigneaux. Topology of Statistical Systems: A Cohomological Approach to Information Theory. Doctoral dissertation, Paris 7 Diderot University, Paris, France, 2019.
- Roger Penrose. The Road to Reality
- Entropy Demystified
- David Ellerman. Logical Information Theory: New Foundations for Information Theory. Compare with his thoughts on adjunctions.
- David Ellerman. An Introduction to Logical Entropy and Its Relation to Shannon Entropy
- David Ellerman. Counting Distinctions: On the Conceptual Foundations of Shannon's Information Theory
- nLab: John Baez, Tobias Fritz, Tom Leinster. Entropy as a functor.
- Tai-Danae Bradley. A New Perspective of Entropy.
Definitions
- Entropy is the degree of deliberateness, willfulness.
- Intuitively, think of entropy in terms of "deliberateness" and "nondeliberateness". Googling on "entropy deliberateness" doesn't yield much, so perhaps that's novel.
- Wikipedia: Entropy expresses the number Ω of different configurations that a system defined by macroscopic variables could assume. (See the sketch at the end of this list.)
- Deliberateness is unambiguity.
- Entropy is the number of solutions to a particular problem, a particular set of equations or constraints.
- When we have an interaction, (the ambiguity of) the outputs can exceed (the ambiguity of) the inputs, in which case it would require an enormous amount of coincidence to reverse it. So entropy is the measure of that coincidence.
- Entropy is a statistical property.
- Thermodynamical entropy is what is observed.
- On the quantum level, there is entanglement entropy.
- Entropy: information is what would be needed to restore the state, to transform the coordinates.
- Entropy is the amplitude of the possibility of divergence.
- Increase in entropy is a measure (a symptom) of freedom.
- Irreversibility.
- Entropy: {$S = Q/T$}.
- Entropy is related to the number of possible arrangements that yield the observed result.
- Entropy is a measure of the precision of measurement.
- {$\Delta S = \Delta Q/T$}
- Nondeliberate = uncontrolled
- The part of the thermal energy in a system that cannot be used to perform work.
- How evenly energy is distributed across a system.
- The definition of entropy depends on how you choose it: your unit of phase space determines your unit of entropy. Thus the observer defines the phase space.
- Deciding is a subsystem issue. We are between what has been decided and what has not yet been decided, and we can participate in between. There are subsystems within us, and we are within other subsystems. It is a parallel process on hierarchical levels, and thus ambiguous.
- John Harland: What is learnable? What is discernible and what is not? Levels like computability, complexity. A classification of dynamical systems.
- John Harland: Coin toss experiment.
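To make the counting definitions above concrete, here is a minimal sketch (the coin-toss setup is an assumption, echoing John Harland's coin toss experiment): a macrostate's entropy is the logarithm of the number Ω of microstate configurations that realize it, as in Boltzmann's {$S = k \ln \Omega$} with {$k = 1$}.

```python
from math import comb, log

def boltzmann_entropy(omega):
    """Entropy as the log of the number of equivalent configurations (k = 1)."""
    return log(omega)

n = 100  # coin flips; a microstate is a particular sequence of heads and tails
for heads in (0, 10, 50):
    omega = comb(n, heads)  # number of sequences realizing the macrostate "heads out of n"
    print(heads, boltzmann_entropy(omega))
# 0 heads: omega = 1, entropy 0 - a single, highly "deliberate" configuration.
# 50 heads: entropy is maximal - the typical, "nondeliberate" macrostate.
```

Reversing the process then amounts to the coincidence of landing back in a macrostate realized by far fewer configurations.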
Conservation
- If entropy decreases, then there must be other possibilities in which it increased. So it is merely a possibility. And this rule - this divergence - is like the collapse of the quantum wave function. For in quantum phenomena all the possibilities exist together.
- Decreasing entropy requires that there be increasing entropy. Similarly, in order to allow refining, one must allow straying.
- Entropy increases. Over what? Over time? But time, as it unfolds, becomes more deliberate, more determined, more established, more stable. In what way does the universe change?
Ambiguity
- Entropy harbors an ambiguity: is God necessarily good, or not? There is an ambiguity, a liminal instability - a fluctuation - for maintaining justice requires that leaders be purified and tested through unjust treatment of them.
- The representations of the twosome are the conditions under which each perspective applies. For example, inside and outside. Sameness and difference. Compare with entropy.
- Disentropy and love. Somebody cares, somebody safeguards, so that things are one way and not another. Do the goals coincide - the lover's goal and the beloved's benefit?
Time
- Deliberateness decreases with time (repetition).
- Navier-Stokes equations: the Reynolds number relates time-symmetric (low Reynolds number) and time-asymmetric (high Reynolds number) situations. (See the sketch at the end of this list.)
- Through expectations we distinguish the past and the future - we expect - and at the same time we distinguish the outside and the inside, what we know and what we do not know, what is familiar and what is unfamiliar.
- Time is like a partition between distinct systems - the system of one moment is separated from the system of another moment. This recalls the problem of calculating the entropy of separated (and joined) systems.
- The purpose of a boundary is to create the conditions for noncontradiction. As time flows, noncontradiction increases. Correspondingly, entropy grows in space.
- Time - is an external clock - recurring between.
- Ordering of time = acceptance of control.
- In an investigation, time flows in one direction; the reverse is simply impossible.
- The issue of "deciding" necessitates a framework given by "the division of everything into five perspectives": Every effect has had its cause, but not every cause has had its effects. And the boundary/present is where these two causal directions coincide. This framework, cognitively, has two representations: we imagine it either as time (cause in past, effect in future) or space (cause outside a subsystem, effect inside a subsystem).
- Coincidence is a concept that brings together time and space, although I have yet to understand it.
- Unfolding - macrosystem open, time asymmetric; microsystem closed, time symmetric.
- Number systems are cognitive - the laws of nature relate continua but presumably not by way of number systems as we do.
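As a sketch of the Reynolds number mentioned in this list (the formula {$Re = \rho v L / \mu$} is standard; the values for water are rough, assumed numbers): low-Re creeping flow is effectively time-reversible, while high-Re turbulent flow is time-asymmetric.

```python
def reynolds(density, velocity, length, viscosity):
    """Re = rho * v * L / mu: the ratio of inertial to viscous forces."""
    return density * velocity * length / viscosity

# Rough values for water: density in kg/m^3, viscosity in Pa*s.
water = dict(density=1000.0, viscosity=1.0e-3)
print(reynolds(velocity=1e-4, length=1e-4, **water))  # ~0.01: reversible creeping flow
print(reynolds(velocity=1.0, length=1.0, **water))    # ~1e6: irreversible turbulence
```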
Periodicity
- Engine: recurring activity, a driving force. Carnot cycle.
- A) You can convert heat into work a single time by expanding.
- B) But if you want recurring behavior (as with a clock, or a wave) you need to contract.
- Recurring activity can be manifested as a single particle over a very long time (so that fluctuations get averaged out); as different but similar particles acting in parallel; or as a hierarchy of combinations of similar particles across a surface. This gives a range of models.
- Instrumentality - distribution of agency - is one-way disentanglement. Equilibrium - recurring activity. Defines "object" like an "ideal gas". Not just a random meaningless subset. An additional science of recurring activity - deliberateness - interest/intentionality.
- Gap between time-asymmetric macrodynamics (one science - "not every cause has had its effects") and time-symmetric microdynamics (another science - "every effect has had its cause"). The crucial role of the gap, as with the rearrangements (pertvarkymai).
Entropy increase
- The entropy of a system increases when you measure the system. Increase in entropy is a symptom of freedom. Measurement is the establishment of such a symptom. Measurement involves external probing of a system.
- Penrose: Entropy increase <=> balanced by expansion of universe's fine scaleness
Decreasing entropy
- Prayer is (if it is anything) a way of engineering, of increasing the likelihood of miracles. I think it does this by increasing the ambiguity required for God or external forces to intervene (without breaking any laws too badly). So explaining these dynamics would be my main idea.
- Low entropy = high distinction, differentiation with environment - basis for life.
- Low entropy: increasing ambiguity combined with increasing distinctness of choice - clarity of choice increases.
Partitioning
- Patricia Palacios talk. The Gibbs interpretation carves up the domain, like Riemannian integration; the Boltzmann interpretation carves up the range, like Lebesgue integration. Gibbs is human (how, what) - how we think and how the physical quantities that we measure make sense - whereas Boltzmann is divine (why, whether) - how nature actually is. Relate to the Yoneda lemma.
- Charlotte Werndl and Roman Frigg. When do Gibbsian Phase Averages and Boltzmannian Equilibrium Values Agree?
- Roman Frigg, Charlotte Werndl. Statistical Mechanics: A Tale of Two Theories. There are two theoretical approaches in statistical mechanics, one associated with Boltzmann and the other with Gibbs. The theoretical apparatus of the two approaches offer distinct descriptions of the same physical system with no obvious way to translate the concepts of one formalism into those of the other. This raises the question of the status of one approach vis-à-vis the other. We answer this question by arguing that the Boltzmannian approach is a fundamental theory while Gibbsian statistical mechanics (GSM) is an effective theory, and we describe circumstances under which Gibbsian calculations coincide with the Boltzmannian results. We then point out that regarding GSM as an effective theory has important repercussions for a number of projects, in particular attempts to turn GSM into a nonequilibrium theory.
- The distinctions of entropy recall the divisions of everything.
- Everything has the same temperature - that is the lowest entropy. But if we reduce the scale, then the concept of temperature changes.
- Local property: the totality of energy does not change, only its expression changes. Global property: entropy increases, the energy in the system evens out. Global and local phenomena are linked by a coordinate system.
- The role of the coordinate system - who decides the particular coordinate system used? - because whoever decides can scramble and unscramble the "phase space" at will.
- Coincidence has to do with the relationship between subsystems and systems. In physical modeling it's crucial that we be able to talk about subsystems. But how do those subsystems come back together?
- The same becomes different. To become different requires a coordinate system. Where does the coordinate system come from?
- Rearrangements (pertvarkymai): the gap between the supersystem and the subsystem.
Energy
- Penrose is critical of the kind of open/closed system distinction that I made as regards an external energy source. He notes that the earth radiates back into space the same amount of energy as it receives from the sun. The key point is that the energy coming in from the sun is qualitatively different: it arrives as fewer photons of higher energy. The outgoing photons are greater in number and lower in energy. The entropy is lower when there are fewer photons. (See the sketch at the end of this list.)
- Structure sheds energy.
- I have a bucket of ash in my room which I accidentally knocked over. So that created a mess. That helped me realize what it means that from the point of view of the law of physics, it would be possible for all of the interactions to be reversed so that the ash climbed back into the bucket. It means that there is heat energy - kinetic energy of particles - such that if the momentum was all reversed, then those particles would all coincidentally work together and impart their energy to push all of those specks of ash and knock the bucket vertical with all the ash inside, and knock my foot, too. Such a coincidence is possible but it would be amazing.
- John Harland: Energy and entropy should never have been in the same units.
- John Harland: Masses far apart - you can change very little energy.
- John Harland: Entropy is not fundamental. It is a convenient dynamical assumption. A shortcut principle for predicting the final state. Whether a certain arrangement of mass...
- John Harland: Energy is conserved.
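A back-of-the-envelope version of Penrose's photon argument above (the temperatures are rough, assumed values, and typical photon energy is taken to scale with the source temperature): for a fixed energy budget, the photon count scales as {$1/T$}, so many more low-energy photons leave than high-energy photons arrive.

```python
# Typical photon energy scales with the source temperature (E ~ k_B * T),
# so for the same total energy the number of photons scales as 1/T.
T_sun = 5800.0    # K: solar surface (rough value, assumed for this sketch)
T_earth = 290.0   # K: Earth's thermal emission (rough value)

print(T_sun / T_earth)  # ~20 outgoing photons per incoming photon
# Fewer photons carrying the same energy means lower entropy: sunlight is the
# low-entropy resource, even though the energy in and out balances.
```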
Thermodynamics
- What is pressure, and how is it affected by a single particle?
Subsystems
- Cognitively, our emotional lives are driven by expectations, especially the temporal boundary between expecting and learning an outcome, and the spatial boundary between self and world.
- Entropy is important in discussing the ambiguity of open systems (based on grace) and closed systems (based on justice). Yes, locally, at a certain level, we're fueled by the Sun, and yet again, at bigger and smaller levels things are crumbling all the same. So the ambiguity seems very important.
- Two systems, when put together, act entropically and lead to an entropic state, bad or good; but otherwise there are systems for which there is no pair (love and life).
- John Harland: Certain systems are impenetrable under any algorithm that's truly random.
- Penrose: Space time quantum unfolding. Requires self reflection?
- Penrose: If we run an experiment then... chosen by "experimenter" God... coarse graining as in entropy.
- Collapse of wave function - invert it - closed with no symmetry.
Life
- A particular set of atoms, say, may seem meaninglessly chosen. And yet if we study what happens to those atoms - their flow through the system - then we may nevertheless witness signs of life. So the definition of life - for example, as that which can have "(self)-interest" - is related to entropy. A frog has "self-interest" directly, and a clock (which has a potential owner) has "(self)-interest" on behalf of its owner. Which is to say, life is that which can be helped or hurt. (In Lithuanian, we have the word "nauda" ("what is useful to us"), which suggests that something can be done on our behalf. And I am thinking, you cannot do anything on behalf of something that is not alive, but only for that which is alive - to whatever degree.)
Freedom
- Prayer liberates us from habit - from recurring activity.
- Economy: bubbles increase "useless" freedom.
Control
- Tipping point: how small things exert a large influence - the essence of low entropy - the essence of control theory.
- John Harland: A little bit of energy can make a big difference. How much can you change the long term state? Control theory.
Justice and grace
- Entropy distinguishes the bad kid's point of view (that we live in "justice", a closed system that is zero-sum and can only get worse) and the good kid's point of view (that we live in "grace", an open system that is fed by an external source of love). There is a key ambiguity between these two points of view: Is our system open or closed?
- Justice: We are all separate, independent, fighting for ourselves, disunited. Grace: we are all united, supporting one another.
Global and local
- Penrose: Counterintuitively, as regards gravitational force, entropy is lower when matter is spread out in space, and entropy is higher as matter comes together in a small area.
- There should be a very conceptual accord between the global, external geometry of the universe (an ever expanding "big bang") and the local, internal geometry of the universe (an ever refining grid as per an ever shrinking Planck's constant). Conceptually, they should be inverses of each other. This would address many issues.
- The universe unfolds in complexity, manifests as space-time both expanding globally and refining locally, with both of equal importance.
- The universe evolves not from an infinitesimal point but rather from a mid-scale unity of the lowest possible entropy (=1). Which is to say, the universe is most deliberate.
- The universe manifests a clear teleology towards a 3-dimensional Euclidean space. Globally it tends towards infinite height, width, length and locally it tends towards a real number continuum. This is why classical physics works. It is based on the ultimate ideal towards which the universe is tending.
- Whereas the universe that we actually experience is but a finite model that is tending towards the ideal. Locally it is limited by Planck's constant, the Heisenberg uncertainty principle, which however becomes ever more refined as the universe unfolds.
- The act of measurement is what drives the unfolding of the universe. This is an act that needs to be defined. But basically it is an intervention which indicates that a particular possibility manifested among a variety. It really should be thought of in terms of a refinement of the local grid of uncertainty. It is not so much that certainty was created but that uncertainty was given structure. Entropy can thus be defined not in terms of particles and their states but rather in terms of the quanta of uncertainty, which keep increasing as the grid grows more refined locally.
- Measurement also shifts from the complex valued "current model" to the real valued teleological "ideal".
- We thus live in a world where there is a lot of instability, logically.
- A lot of quantum structures are internal to particles and as such do not participate in the unfolding of real space. For example, as Penrose describes based on twistors, the electron consists of a zig and a zag, both moving at the speed of light but in opposite directions, coupled by the mass of the electron. Quantum interactions would be rethought in terms of how they give shape to "the edge" of the universe locally and globally.
- The major error in current theories is that they presume an infinite number of states. This is a confusion and conflation of the current-finite-complex state and the ultimate-infinite-real state. The correct theory would have simply a finite series of states of possibly unknown length. The current theories thus yield infinite nonsense which has to be worked around through renormalization.
- Built into the notion of entropy is the idea of a coarse grid, and I imagine, a hierarchy of several grids (perhaps three or four) of increasing refinement. So these grids would have explicit physical meaning.
- The difference between gravitational and quantum perspectives would be fundamental and related to the different ways that they exhibit entropy, where gravity is low entropy when it is spread out, and quantum behavior is vice versa.
- There would be a pairing of global and local phenomena. For example, an electron would pair with a white dwarf (composed of electron-degenerate matter), a neutron with a neutron star, and so on. Which is to say, both phenomena would be considered equally fundamental. A neutron locally has the same complexity, conceptually, as a neutron star globally. They are at opposite ends of the spectrum and remain so as the universe unfolds.
- Physics could be thought of as describing the unfolding of "the edge" of the universe, which happens both locally and globally.
Information theory
- Actual entropy is defined by information theory, by automata theory, and relates to irreversibility.
- Physical entropy is probabilistic entropy. Thus it is only approximate. And in a sense it is not real entropy because every state can, in principle, proceed in the reverse direction.
Notes
- Should distinguish between interpretations of apparent entropy based on probability (as in physics) and actual entropy based on irreversibility (as in automata theory).
- Apparent entropy depends on having a particular partitioning, which depends on the observer. (See the sketch at the end of this list.)
- The observer is defined by interpretation of the choice frameworks.
- Actual entropy does not require any partitioning.
- The twosome is an expression of the nature of entropy, the second law of thermodynamics. The mind shifts from a state of greater ambiguity (where opposites coexist) to a state of lesser ambiguity (where all is the same). Correspondingly, the mind shifts from a state of higher energy to a state of lower energy.
- The onesome defines order, whereas the twosome defines entropy, and so the two are related in that way.
- The reversal of entropy is the reversal of the twosome.
- Entropy is related to information and knowledge. How is it related to the foursome? Study Sean Carroll's four different descriptions for entropy. Can I relate them to the Yoneda lemma? The four kinds of entropy could express the four meanings of equality in X=X, with reference to nothing, something, anything, everything.
- Entropy: the measure of uncertainty before the flip - the least number of yes-and-no questions that you need to ask.
- Information: the knowledge you have to gain after the flip.
- A system has two directions in time: entropy weakly increasing and weakly decreasing. Or the entropy may stay the same. But suppose that any time two systems come together, say a system and a subsystem, they do so in such a way that their entropy directions are compatible. In that way, a global constraint would arise from constraints on the subsystems. Time would be constructed from causal constraints.
- In physics, global tension leads to local release. This is the direction of entropy.
- Carmen Constantin: A topos-theoretic perspective on entropy
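A sketch of the observer-dependence of apparent entropy noted above (the partitions are hypothetical examples): the same sixteen equally likely microstates - four coin flips - yield different coarse-grained entropies under different observers' partitions.

```python
from collections import Counter
from itertools import product
from math import log2

# Sixteen equally likely microstates: all sequences of four coin flips.
microstates = list(product("HT", repeat=4))

def coarse_entropy(label):
    """Shannon entropy (in bits) of the macrostates induced by a partition."""
    counts = Counter(label(m) for m in microstates)
    total = sum(counts.values())
    return -sum(c / total * log2(c / total) for c in counts.values())

print(coarse_entropy(lambda m: m.count("H")))  # partition by number of heads: ~2.03 bits
print(coarse_entropy(lambda m: m[0]))          # partition by the first flip: 1.0 bit
print(coarse_entropy(lambda m: m))             # the finest partition: 4.0 bits
```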
Mindey's ideas on entropy
- More energy is needed for order than disorder. Thus entropy is a force.
- Entropy has an escape velocity whereby information will never be destroyed.
- A hill erodes through time, revealing new layers, memories. Entropy acts like erosion in that it reveals information lying below.
- Life reproduces information and thereby fights entropy.
- Time stops when there is no difference between the world and its model. The flow of time is that difference.
- The flow of time indicates the wish to survive, the wish for everything to be.
- Life is an expression of the wish to spread.
- Increase in nondeliberateness - increase in nondeterminism - manifestation of symmetry
- Think in terms of "common" and "uncommon" states. Uncommon states tend toward common states simply because the latter are more frequent. But nature has forces (like gravity) that tend to create uncommon states. So we need to distinguish between the different forces at play and how they relate to what is common and uncommon.
- Investigations require three kinds of entropy. The investigator sets up the investigation. Then the investigator fences off the investigation - he remains objective, he withdraws, as God does. Then the investigator observes the outcomes.
- The outcomes of an investigation must hold in the particular investigation; in any equivalent investigation; and also in the absence of any investigation.
- The investigator necessarily intervenes in his investigation, and then withdraws. So it is possible and likely that a time will come when all the processes reverse and the phenomena go beyond the bounds of the investigation, affect the investigator, and overtake him.
- Entropy is connected with the investigator's temporary self-limitation (impartiality) and his later (and earlier) involvement.
Second law: A closed subsystem is opened (like a balloon of gas that bursts), and thus we get dispersion.
- How does the foursome express entropy?
- Number of equivalent configurations
- It typically, naturally increases
- Morality - sixsome - align yourself with creation (low entropy) and not destruction (high entropy). Then later - sevensome/eightsome - align yourself with the low entropy point (God).
- Consider the original thermodynamic macroscopic definition of entropy. And consider how it describes the difference between a context (for disequilibrium) and subsequent equilibrium. Thus consider how entropy relates to context, to the relation between a system and a subsystem.
- Shannon entropy: information in terms of measurement - how the local and the global are linked.
- An experiment is highly deliberate - reason why - low entropy - relaxation of particles - local - unique configuration.
- Receiving high frequency light and emitting low frequency light is like a strange clock that slows down time. How is that like the effects of general relativity? And like acceleration?
Understand the principle of maximum entropy in terms of its illustration by E. T. Jaynes using the widget problem.
The maximum entropy principle is similar to the application of quantum symmetries. Given quantum symmetries, we look for an ensemble of random Hamiltonians (matrices) that satisfies them. Similarly, given the symmetries of a probability distribution, or given its conserved quantities, we look for the probability distribution that satisfies them and has maximum entropy.
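The widget problem itself is not reproduced here. Instead, a minimal sketch in Jaynes' spirit (the die and the target mean are assumed for illustration): among all distributions on the faces 1 through 6 with a prescribed mean, the maximum entropy distribution has the exponential form {$p_i \propto e^{-\lambda i}$}, and the Lagrange multiplier {$\lambda$} can be found by bisection.

```python
from math import exp

def maxent_die(target_mean, lo=-10.0, hi=10.0):
    """Maximum entropy distribution on die faces 1..6 with a prescribed mean."""
    def mean(lam):
        weights = [exp(-lam * i) for i in range(1, 7)]
        z = sum(weights)
        return sum(i * w / z for i, w in enumerate(weights, start=1))
    for _ in range(100):  # bisection: mean(lam) decreases as lam increases
        mid = (lo + hi) / 2
        if mean(mid) > target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    weights = [exp(-lam * i) for i in range(1, 7)]
    z = sum(weights)
    return [w / z for w in weights]

print(maxent_die(3.5))  # the unconstrained mean: the uniform distribution, all 1/6
print(maxent_die(4.5))  # tilted exponentially toward the higher faces
```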
{$\ln 2=1-\frac{1}{2}+\frac{1}{3}-\frac{1}{4}+\frac{1}{5}-\frac{1}{6}\dots$} is an important number in information theory because it allows us to consider and calculate binary choices.
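A quick numeric check of this series (the number of terms is arbitrary): the alternating harmonic series does converge to {$\ln 2 \approx 0.6931$}, the number of nats in one binary choice, though very slowly.

```python
from math import log

# Partial sum of 1 - 1/2 + 1/3 - 1/4 + ... up to one hundred thousand terms.
partial = sum((-1) ** (k + 1) / k for k in range(1, 100001))
print(partial, log(2))  # ~0.69314: agreement to about five decimal places
```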
Entropy is discrete, which suggests it is a concept of the conscious, and involves reflection. Whereas in the case of the unconscious, there is no reflection, thus no entropy, thus no symmetry, no possibility of different actions yielding the same state. For the unconscious, we have continuity in evolution, we have rotations.
Entropy
- Entropy can be understood independently of a field or with regard to a field. Without a field, everything being in one place is highly deliberate. But with a gravitational field, it is highly nondeliberate, whereas having everything spread out uniformly would be highly deliberate. Where is this discussed? Is this a fundamental ambiguity? What is the significance of this ambiguity? What does Penrose say about this?
- Why does deletion of information increase entropy as heat? Landauer's principle puts a lower bound of {$kT\ln 2$} on the heat dissipated in erasing one bit. How is this related to the Turing machine rule {$X\rightarrow\epsilon$}?
- Maximum entropy. Distinguish what is known from what is not known and model the latter using nondeliberateness, thus maximum entropy. What is known is understood as deliberate.
- Entropy is the expected value of surprise, that is, the expected surprise. (See the sketch after this list.)
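A minimal sketch of entropy as expected surprise (the distribution is an arbitrary example): the surprise of an outcome is its negative log probability, and entropy is the probability-weighted average of that surprise.

```python
from math import log2

def surprise(p):
    """Surprisal in bits: the rarer the outcome, the greater the surprise."""
    return -log2(p)

distribution = [0.5, 0.25, 0.125, 0.125]  # an arbitrary example
entropy = sum(p * surprise(p) for p in distribution)
print(entropy)  # 1.75 bits: the expected surprise
```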
Entropy = nondeliberateness. How is this related to unconscious (nondeliberate) and conscious (deliberate)?
- Conscious = nondeliberate nondeliberateness. Not nondeliberate.
- Consciousness = nondeliberate nondeliberate nondeliberateness = deliberate. Not not nondeliberate.
- Nondeliberate = not not not nondeliberate.
Maximum entropy distribution = least informative distribution.
Tai-Danae Bradley: Information is on the Boundary
- Shannon entropy: the amount of surprise