What's new in

HYPER-FINITE FACTORS, P-ADIC LENGTH SCALE HYPOTHESIS, AND DARK MATTER HIERARCHY

Note: Newest contributions are at the top!



Year 2017

Hierarchy of Planck constants, space-time surfaces as covering spaces, and adelic physics

From the beginning it was clear that heff/h=n corresponds to the number of sheets for a covering space of some kind. First the covering was assigned with the causal diamonds. Later I assigned it with space-time surfaces, but the details of the covering remained unclear. The final identification emerged only at the beginning of 2017.

Number theoretical universality (NTU) leads to the notion of adelic space-time surface (monadic manifold) involving a discretization in an extension of rationals defining a particular level in the hierarchy of adeles defining an evolutionary hierarchy. The first formulation was proposed here and a more elegant formulation here.

The key constraint is NTU for the adelic space-time containing sheets in the real sector and in various p-adic sectors, which are extensions of p-adic number fields induced by an extension of rationals. The extension can also contain powers of a root of e inducing a finite-dimensional extension of p-adic numbers (e^p is an ordinary p-adic number in Q_p).

One identifies the numbers in the extension of rationals as common to all number fields and demands that the imbedding space has a discretization in an extension of rationals in the sense that the preferred imbedding space coordinates implied by isometries belong to the extension of rationals for the points of the number theoretic discretization. This implies that the versions of isometries with group parameters in the extension of rationals act as discrete versions of symmetries. The correspondence between the real and p-adic variants of the imbedding space is extremely discontinuous for a given adelic imbedding space (there is a hierarchy of them with levels characterized by extensions of rationals). Space-time surfaces typically contain only a rather small set of points in the extension (for instance, x^n+y^n=z^n has no rational solutions for n>2!). Hence one expects that a discretization with a finite cutoff length at the space-time level could be enough for the rather low space-time dimension D=4.

After that one assigns in the real sector an open set to each point of the discretization, and these open sets define a manifold covering. In the p-adic sector one can assign the 8th Cartesian power of ordinary p-adic numbers to each point of the number theoretic discretization. This gives both a discretization and a smooth local manifold structure. What is important is that the Galois group of the extension acts on these discretizations, and from a given discretization one obtains a covering space with the number of sheets equal to a factor of the order of the Galois group, typically the order itself.

heff/h=n was identified from the beginning as the dimension of the poly-sheeted covering assignable to the space-time surface. The number n of sheets would naturally be a factor of the order of the Galois group, implying that heff/h=n is bound to increase during number theoretic evolution so that the algebraic complexity increases. Note that WCW decomposes into sectors corresponding to the extensions of rationals, and the dimension of the extension is bound to increase in the long run by localizations to various sectors in self measurements (see this). The dark matter hierarchy represents number theoretical/adelic physics and therefore now has a rather rigorous mathematical justification. It is however good to recall that the heff/h=n hypothesis emerged from an experimental anomaly: radiation at ELF frequencies had quantal effects on the vertebrate brain, impossible in standard quantum theory since the energies E=hf of the photons are ridiculously small compared to the thermal energy.
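The mismatch can be checked with standard constants. A minimal sketch (the frequency f = 10 Hz and temperature T = 310 K are illustrative choices, not values from the text):

```python
# Compare the energy E = h*f of an ELF photon to the thermal energy k_B*T
# at body temperature, and estimate how large heff/h = n would have to be
# for a dark photon energy E = heff*f to reach the thermal scale.
h = 4.135667696e-15    # Planck constant, eV*s (CODATA)
k_B = 8.617333262e-5   # Boltzmann constant, eV/K (CODATA)

f = 10.0               # Hz, an illustrative ELF/EEG frequency
T = 310.0              # K, body temperature

E_photon = h * f       # ~4e-14 eV: ridiculously small
E_thermal = k_B * T    # ~2.7e-2 eV
n_needed = E_thermal / E_photon
print(E_photon, E_thermal, n_needed)
```

For f = 10 Hz the required n is of order 10^12, which is the kind of value the dark matter hierarchy would have to supply.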

Indeed, since n is a positive integer, evolution is analogous to diffusion on a half-line, and n unavoidably increases in the long run just as a particle diffuses farther away from the origin (by looking at what gradually happens near a paper basket one understands what this means). The increase of n implies the increase of the maximal negentropy and thus of negentropy. Negentropy Maximization Principle (NMP) follows from adelic physics alone and there is no need to postulate it separately. Things get better in the long run although we do not live in the best possible world, as Leibniz, who first proposed the notion of monad, claimed!
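The diffusion analogy can be made concrete with a toy simulation; the model below assumes nothing beyond an unbiased ±1 random walk on the positive integers, with the boundary standing in for the constraint n ≥ 1:

```python
import random

def walk_half_line(steps, n0=1, seed=0):
    # Unbiased +-1 random walk on the positive integers: n cannot drop
    # below 1, so in the long run it drifts away from the boundary.
    rng = random.Random(seed)
    n = n0
    for _ in range(steps):
        n = max(1, n + rng.choice((-1, 1)))
    return n

finals = [walk_half_line(1000, seed=s) for s in range(500)]
mean_n = sum(finals) / len(finals)
print(mean_n)  # well above the starting value n0 = 1
```

No drift is put in by hand: the statistical increase of the mean comes purely from the one-sided boundary, which is the point of the analogy.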

For details see the chapter Quantum criticality and dark matter.



Time crystals, macroscopic quantum coherence, and adelic physics

Time crystals (see this) were proposed by Frank Wilczek in 2012. The idea is that there is a periodic collective motion so that one can see the system as an analog of a 3-D crystal with time appearing as the fourth lattice dimension. One can learn more about real life time crystals here.

The first time crystal was created by Monroe et al (see this) and involved magnetization. By adding a periodic driving force it was possible to generate spin flips inducing a collective spin flip as a kind of domino effect. The surprise was that the period was twice the driving period and small changes of the driving frequency did not affect the period. One had something more than a forced oscillation - a genuine time crystal. The period of the driving force - the Floquet period - was 74-75 μs and the system was measured for N=100 Floquet periods or about 7.4-7.5 milliseconds (1 ms happens to be of the same order of magnitude as the duration of a nerve pulse). I failed to find a comment about the size of the system. With quantum biological intuition I would guess something like the size of a large neuron: about 100 micrometers.

Second law does not favor time crystals. The time in which single particle motions are thermalized is expected to be rather short. In the case of condensed matter systems the time scale would not be much longer than the inverse of a typical atomic transition rate. The rate for the 2P → 1S transition of the hydrogen atom estimated here gives a general idea. The decay rate is proportional to ω^3 d^2, where ω = ΔE/ℏ is the frequency corresponding to the energy difference between the states and d is the dipole moment proportional to α a0, with a0 the Bohr radius and α ∼ 1/137 the fine structure constant. The average lifetime as the inverse of the decay rate would be 1.6 ns and is expected to give a general order of magnitude estimate.
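The 1.6 ns figure follows from the standard QED expression for the 2P → 1S rate, Γ = (2/3)^8 α^5 m_e c^2/ℏ; a quick numerical check:

```python
# Lifetime of the hydrogen 2P state from the standard dipole-transition
# result Gamma = (2/3)^8 * alpha^5 * (m_e c^2 / hbar).
alpha = 1 / 137.035999      # fine structure constant
me_c2_eV = 0.51099895e6     # electron rest energy, eV
hbar_eVs = 6.582119569e-16  # hbar, eV*s

gamma = (2 / 3) ** 8 * alpha ** 5 * (me_c2_eV / hbar_eVs)  # decay rate, 1/s
tau_ns = 1e9 / gamma
print(tau_ns)  # ~1.6 ns, matching the estimate quoted above
```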

The proposal is that the systems in question emerge in non-equilibrium thermodynamics, which indeed predicts a master-slave hierarchy of time and length scales with masters providing the slowly changing background in which slaves are forced to move. I am not enough of a specialist to express any strong opinions about the thermodynamical explanation.

What does TGD say about the situation?

  1. So-called Anderson localization (see this) is believed to accompany time crystals. In the TGD framework this translates to the fusion of the 3-surfaces corresponding to particles to a single large 3-surface consisting of particle 3-surfaces glued together by magnetic flux tubes. One can say that a relative localization of particles occurs and they more or less lose their relative translational degrees of freedom. This effect occurs always when bound states are formed and would happen already for the hydrogen atom.

    TGD vision would actually solve a fundamental problem of QED caused by the assumption that proton and electron behave as independent point-like particles: QED predicts a lot of non-existing quantum states since the Bethe-Salpeter equation assumes degrees of freedom which do not actually exist. Single particle descriptions (Schrödinger equation and Dirac equation), treating proton and electron geometrically as effectively a single particle with reduced mass (rather than independent particles), give an excellent description, whereas QED, which was thought to be something more precise, fails. Quite generally, bound states are not properly understood in QFTs. The color confinement problem is a second example of this: usually it is believed that the failure is solely due to the fact that the color interaction is strong, but the real reason might be much deeper.

  2. In the TGD Universe time crystals would be many-particle systems consisting of a collection of 3-surfaces connected by magnetic flux tubes (a tensor network in terms of condensed matter complexity theory). The magnetic flux tubes would carry dark matter in the TGD sense with heff/h=n increasing the quantal scales - both spatial and temporal - so that one could have time crystals in long scales.

    Biology could provide basic examples. For instance, the EEG resonance frequency could be associated with time crystals assignable to the magnetic body of the brain carrying dark matter with large heff/h=n - so large that the dark photon energy E=heff×f would correspond to an energy above the thermal energy. If bio-photons result from phase transitions heff/h=n→ 1, the energy would be in the visible-UV energy range. These frequencies would in turn drive the visible matter in the brain and force it to oscillate coherently.

  3. The time crystals claimed by Monroe and Lukin to have been created in the laboratory demand a feed of energy (see this) unlike the time crystals proposed by Wilczek. The finding is consistent with the TGD based model. In TGD the generation of a large heff phase demands energy. The reason is that the energies of states increase with heff. For instance, atomic binding energies decrease as 1/heff^2. In quantum biology this requires feeding of metabolic energy. Also now the interpretation would be analogous to this.
  4. The standard physics view would rely on non-equilibrium thermodynamics, whereas the TGD view about time crystals would rely on dark matter and the hierarchy of Planck constants, in turn implied by adelic physics suggested to provide a coherent description fusing real physics as physics of matter and various p-adic physics as physics of cognition.

    Number theoretical universality (NTU) leads to the notion of adelic space-time surface (monadic manifold) involving a discretization in an extension of rationals defining a particular level in the hierarchy of adeles defining an evolutionary hierarchy. heff/h=n has been identified from the beginning as the dimension of the poly-sheeted covering assignable to the space-time surface. The action of the Galois group of the extension indeed gives rise to a covering space. The number n of sheets would be the order of the Galois group, or a factor of it, implying that heff/h=n is bound to increase during evolution so that the complexity increases.


For details see the chapter Quantum criticality and dark matter.



Why metabolism and what happens in bio-catalysis?

TGD view about dark matter also gives a strong grasp of metabolism and bio-catalysis - the key elements of biology.

Why is metabolic energy needed?

The simplest and at the same time most difficult question that an innocent student can ask in biology class is: "Why must we eat?". Or using more physics oriented language: "Why must we get metabolic energy?". The answer of the teacher might be that we do not eat to get energy but to get order. The stuff that we eat contains ordered energy: we eat order. But order in standard physics is lack of entropy, lack of disorder. The student could get nosy and argue that excretion produces the same outcome as eating but is not enough to survive.

We could go to a deeper level and ask why metabolic energy is needed in biochemistry. Suppose we do this in TGD Universe with dark matter identified as phases characterized by heff/h=n.

  1. Why would metabolic energy be needed? The intuitive answer is that evolution requires it and that evolution corresponds to the increase of n=heff/h. To see the answer to the question, notice that the energy scale for the bound states of an atom is proportional to 1/h^2 and for a dark atom to 1/heff^2 ∝ 1/n^2 (do not confuse this n with the integer n labelling the states of the hydrogen atom!).
  2. Dark atoms have smaller binding energies and their creation by a phase transition increasing the value of n demands a feed of energy - metabolic energy! If the metabolic energy feed stops, n is gradually reduced. The system gets tired, loses consciousness, and eventually dies.

    What is remarkable is that the scale of atomic binding energies decreases with n only in dimension D=3. In other dimensions it increases, and in D=4 one cannot even speak of bound states! This can be easily found by a study of the Schrödinger equation for the analog of the hydrogen atom in various dimensions. Life based on metabolism seems to make sense only in spatial dimension D=3. Note however that there are also other quantum states than atomic states, with a different dependence of energy on heff.
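The scaling can be spelled out numerically; a minimal sketch assuming only the stated D=3 behavior (binding energy ∝ 1/heff^2, size ∝ heff^2) with standard hydrogen constants:

```python
E0_eV = 13.6     # ordinary hydrogen ground-state binding energy, eV
a0_nm = 0.0529   # Bohr radius, nm

def dark_binding_energy(n):
    # Binding energy of a dark hydrogen atom with heff = n*h (D=3).
    return E0_eV / n ** 2

def dark_size(n):
    # The size scales in the opposite direction, as n^2.
    return a0_nm * n ** 2

def metabolic_cost(n):
    # Energy needed to lift an ordinary atom (n=1) to the dark state:
    # the difference of binding energies, approaching 13.6 eV for large n.
    return E0_eV - dark_binding_energy(n)

for n in (1, 2, 6, 32):
    print(n, dark_binding_energy(n), dark_size(n), metabolic_cost(n))
```

For n = 32 the size is already ~54 nm, the order of magnitude appearing later in the text.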

Conditions on bio-catalysis

Bio-catalysis is a key mechanism of biology and its extreme efficacy remains to be understood. Enzymes are proteins and ribozymes are RNA sequences acting as biocatalysts.

What does catalysis demand?

  1. Catalyst and reactants must find each other. How this could happen is very difficult to understand in standard biochemistry, in which living matter is seen as a soup of biomolecules. I have already considered the mechanisms making it possible for the reactants to find each other. For instance, in the translation of mRNA to protein, tRNA molecules must find their way to mRNA at the ribosome. The proposal is that reconnection, allowing U-shaped magnetic flux tubes to reconnect to a pair of flux tubes connecting the mRNA and tRNA molecules, together with a reduction of the value of heff=n×h inducing a reduction of the length of the magnetic flux tube, takes care of this step. This applies also to DNA transcription, DNA replication, and bio-chemical reactions in general.
  2. The catalyst must provide energy for the reactants (their number is typically two) to overcome the potential wall making the reaction rate very slow for energies around thermal energy. The TGD based model for the hydrino atom having a larger binding energy than the hydrogen atom, claimed by Randell Mills, suggests a solution. Some hydrogen atom in the catalyst goes from a (dark) hydrogen atom state to a hydrino state (a state with smaller heff/h) and liberates the excess binding energy, kicking either reactant over the potential wall so that the reaction can proceed. After the reaction the catalyst returns to the normal state and absorbs the binding energy.
  3. In the reaction volume the catalyst and reactants must be guided to the correct places. The simplest model of catalysis relies on the lock-and-key mechanism. The generalized Chladni mechanism forcing the reactants to a two-dimensional closed nodal surface is a natural candidate to consider. There are also additional conditions. For instance, the reactants must have the correct orientation, and this could be forced by the interaction with the em field of the ME involved with the Chladni mechanism.
  4. One must also have coherence of chemical reactions, meaning that the reaction can occur in a large volume - say in different cell interiors - simultaneously. Here MB would induce the coherence by using MEs. The Chladni mechanism might explain this if there is interference of forces caused by periodic standing waves themselves represented as pairs of MEs.
Phase transition reducing the value of heff/h=n as a basic step in bio-catalysis

The hydrogen atom allows also large heff/h=n variants with n>6, with the scale of the energy spectrum behaving as (6/n)^2 if n=6 holds true for visible matter. The reduction of n as the flux tube contracts would liberate binding energy, which could be used to promote the catalysis.

The notion of high energy phosphate bond is a somewhat mysterious concept. There are claims that there is no such bond. I have spent a considerable amount of time pondering this problem. Could phosphate contain a (dark) hydrogen atom able to go to a state with a smaller value of heff/h and liberate the excess binding energy? Could the phosphorylation of the acceptor molecule transfer this dark atom associated with the phosphate of ATP to the acceptor molecule? Could the mysterious high energy phosphate bond correspond to the dark atom state? Metabolic energy would be needed to transform ADP to ATP and would generate the dark atom.

Could solar light kick atoms into dark states and in this manner store metabolic energy? Could nutrients carry these dark atoms? Could this energy be liberated as the dark atoms return to ordinary states and be used to drive protons against the potential gradient through ATP synthase, analogous to a turbine of a power plant, transforming ADP to ATP and reproducing the dark atom and thus the "high energy phosphate bond" in ATP? Can one see metabolism as transfer of dark atoms? Could possible negentropic entanglement disappear and emerge again after ADP→ATP?

Here it is essential that the energies of the hydrogen atom depend on hbar_eff = n×hbar as hbar_eff^m, m=-2<0. Hydrogen atoms in dimension D have a Coulomb potential behaving as 1/r^(D-2) from the Gauss law, and the Schrödinger equation predicts for D≠4 that the energies satisfy E_n ∝ (heff/h)^m, m = 2+4/(D-4). For D=4 the formula breaks down since in this case the dependence on hbar is not given by a power law. m is negative only for D=3, where one has m=-2. Hence D=3 would be the unique dimension allowing the hydrino-like states making possible bio-catalysis and life in the proposed scenario.
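The claimed exponent is easy to tabulate; the check below assumes only the formula E ∝ heff^m with m = 2 + 4/(D-4) as stated:

```python
from fractions import Fraction

def heff_exponent(D):
    # Exponent m in E ~ heff^m for a hydrogen-like atom in D spatial
    # dimensions with Coulomb potential ~ 1/r^(D-2) (Gauss law).
    # Dimensional analysis of the Schrodinger equation gives
    # m = 2 + 4/(D-4); at D = 4 the dependence is not a power law.
    if D == 4:
        raise ValueError("no power-law dependence on heff at D = 4")
    return Fraction(2) + Fraction(4, D - 4)

for D in (1, 2, 3, 5, 6):
    print(D, heff_exponent(D))
# Only D = 3 gives a negative exponent (m = -2): binding energies
# decrease with increasing heff only in three spatial dimensions.
```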

It is also essential that the flux tubes are radial flux tubes in the Coulomb field of charged particle. This makes sense in many-sheeted space-time: electrons would be associated with a pair formed by flux tube and 3-D atom so that only part of electric flux would interact with the electron touching both space-time sheets. This would give the analog of Schrödinger equation in Coulomb potential restricted to the interior of the flux tube. The dimensional analysis for the 1-D Schrödinger equation with Coulomb potential would give also in this case 1/n2 dependence. Same applies to states localized to 2-D sheets with charged ion in the center. This kind of states bring in mind Rydberg states of ordinary atom with large value of n.

The condition that the dark binding energy is above the thermal energy gives a condition on the value of heff/h=n as n ≤ 32. The size scale of the largest allowed dark atom would be about 100 nm, 10 times the thickness of the cell membrane.

For details see the chapter Quantum criticality and dark matter.



NMP and self

The preparation of an article about number theoretic aspects of TGD forced me to go through various related ideas and led to a considerable integration of them. In this note, ideas related directly to consciousness and cognition are discussed.

  1. The adelic approach strongly suggests the reduction of NMP to number theoretic physics, somewhat as the second law reduces to probability theory. The dimension of the extension of rationals, characterizing the hierarchy level of physics and defining an observable measured in state function reductions, is a positive integer and can only increase in the statistical sense. Therefore the maximal value of entanglement negentropy increases as new entangling number theoretic degrees of freedom emerge. heff/h=n, identifiable as a factor of the order of the Galois group of the extension, characterizes the number of these degrees of freedom for a given space-time surface as the number of its sheets.
  2. State function reduction has hitherto been assumed to always correspond to a measurement of the density matrix, which can be seen as a reaction of the subsystem to its environment. This makes perfect sense at the space-time level. Higher level measurements occur however at the level of WCW and correspond to a localization to some sector of WCW determining for instance the quantization axes of various quantum numbers. Even the measurement of heff/h=n would measure the dimension of the Galois group and force a localization to an extension with a Galois group of this dimension. These measurements cannot correspond to measurements of the density matrix since different WCW sectors cannot entangle by WCW locality. This finding will be discussed in the following.
Evolution of NMP

The view about Negentropy Maximization Principle (NMP) has co-evolved with the notion of self and I have considered many variants of NMP.

  1. The original formulation of NMP was in positive energy ontology and made the same predictions as standard quantum measurement theory. The new element was that the density matrix of the sub-system defines the fundamental observable and the system goes to its eigenstate in state function reduction. As found, the localizations to WCW sectors define what might be called self-measurements, identifiable as active volitions rather than reactions.
  2. In p-adic physics one can assign to rational and even algebraic entanglement probabilities a number theoretical entanglement negentropy (NEN) satisfying the same basic axioms as the ordinary Shannon entropy but having negative values and therefore having an interpretation as information. The definition of the (real valued) p-adic negentropy reads as Sp = -∑ Pk log(|Pk|p), where |.|p denotes the p-adic norm. The news is that Np = -Sp can be positive and is positive for rational entanglement probabilities. Real entanglement entropy S is always non-negative.

    NMP would force the generation of negentropic entanglement (NE) and stabilize it. NE resources of the Universe - one might call them Akashic records - would steadily increase.

  3. A decisive step of progress was the realization that NTU forces all states in adelic physics to have entanglement coefficients in some extension of rationals inducing a finite-dimensional extension of p-adic numbers. The same entanglement can be characterized by the real entropy S and the p-adic negentropies Np, which can be positive. One can define also the total p-adic negentropy N = ∑p Np over all p and the total negentropy Ntot = N - S.

    For rational entanglement probabilities it is easy to demonstrate that the generalization of the adelic theorem holds true: Ntot = N - S = 0. NMP based on Ntot rather than N would not say anything about rational entanglement. For extensions of rationals it is easy to find that N-S>0 is possible if the entanglement probabilities are of the form Xi/n with |Xi|p=1 and n an integer. Should one identify the total negentropy as the difference Ntot = N - S or as Ntot = N?

    Irrespective of the answer, large p-adic negentropy seems to force large real entropy: this nicely correlates with the paradoxical finding that living systems tend to be entropic although one would expect just the opposite. This relates in a very interesting manner to the work of the biologist Jeremy England. The negentropy would be cognitive negentropy and not visible to ordinary physics.

  4. The latest step in the evolution of ideas about NMP was the question whether NMP follows from number theory alone, just as the second law follows from probability theory! This irritates the theoretician's ego but is a victory for the theory. The dimension n of the extension is a positive integer and cannot but grow in the statistical sense in evolution! One expects that the maximal value of negentropy (defined as N-S) increases with n, so negentropy must increase in the long run.
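The adelic identity invoked above, Ntot = N - S = 0 for rational entanglement probabilities, can be verified directly; a minimal sketch using exact rational arithmetic:

```python
from fractions import Fraction
import math

def padic_norm(q, p):
    # |q|_p = p^(-k), where q = p^k * a/b with a and b coprime to p.
    num, den, k = q.numerator, q.denominator, 0
    while num % p == 0:
        num //= p
        k += 1
    while den % p == 0:
        den //= p
        k -= 1
    return float(p) ** (-k)

def real_entropy(probs):
    # Ordinary Shannon entropy S = -sum P_k log P_k (always non-negative).
    return -sum(float(q) * math.log(float(q)) for q in probs)

def padic_negentropy(probs, p):
    # N_p = -S_p = sum P_k log |P_k|_p; positive when p divides denominators.
    return sum(float(q) * math.log(padic_norm(q, p)) for q in probs)

probs = [Fraction(1, 3), Fraction(2, 3)]
S = real_entropy(probs)
N = sum(padic_negentropy(probs, p) for p in (2, 3, 5, 7))
print(S, N)  # equal: the total p-adic negentropy cancels the real entropy
```

Only the primes dividing the denominators of the probabilities contribute, so the finite sum over p = 2, 3, 5, 7 already gives the full N here.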
Number theoretic entanglement can be stable

Number theoretical Shannon entropy can serve as a measure for the genuine information assignable to a pair of entangled systems. Entanglement with coefficients in the extension is always negentropic if the entanglement negentropy comes from p-adic sectors only. It can be negentropic if the negentropy is defined as the difference of the p-adic negentropy and the real entropy.

The diagonalized density matrix need not belong to the algebraic extension since the probabilities defining its diagonal elements are eigenvalues of the density matrix obtained as roots of an N:th order polynomial, which in the generic case requires an N-dimensional algebraic extension of rationals. One can argue that since diagonalization is not possible, also the state function reduction selecting one of the eigenstates is impossible unless a phase transition increasing the dimension of the algebraic extension used occurs simultaneously. This kind of NE could give rise to cognitive entanglement.

There is also a special kind of NE, which can result if one requires that the density matrix serves as a universal observable in state function reduction. The outcome of the reduction must be an eigen space of the density matrix, which is a projector to this subspace acting as an identity matrix inside it. This kind of NE allows all unitarily related bases as eigenstate bases (the unitary transformations must belong to the algebraic extension). This kind of NE could serve as a correlate for "enlightened" states of consciousness. Schrödinger's cat in this kind of state is stably in a superposition of dead and alive, and a state basis obtained by a unitary rotation from this basis is equally good. One can say that there are no discriminations in this state, and this is what is claimed about "enlightened" states too.

The vision about number theoretical evolution suggests that NMP forces the generation of NE resources as NE assignable to the "passive" boundary of the CD, for which no changes occur during the sequence of state function reductions defining self. It would define the unchanging self as negentropy resources, which could be regarded as a kind of Akashic records. During the next "re-incarnation", after the first reduction to the opposite boundary of the CD, the NE associated with the reduced state would serve as new Akashic records for the time reversed self. If NMP reduces to the statistical increase of heff/h=n, the conscious information content of the Universe increases in the statistical sense. In the best possible world of SNMP it would increase steadily.

Does NMP reduce to number theory?

The heretic question that emerged quite recently is whether NMP is actually needed at all! Is NMP a separate principle, or could NMP be reduced to mere number theory? Consider first the possibility that NMP is not needed at all as a separate principle.

  1. The value of heff/h=n should increase in the evolution by phase transitions increasing the dimension of the extension of rationals. heff/h=n has been identified as the number of sheets of some kind of covering space. The Galois group of the extension acts on number theoretic discretizations of the monadic surface and the orbit defines a covering space. Suppose n is the number of sheets of this covering and thus the dimension of the Galois group for the extension of rationals, or a factor of it.
  2. It has been already noticed that the "big" state function reductions giving rise to death and reincarnation of self could correspond to a measurement of n=heff/h implied by the measurement of the extension of the rationals defining the adeles. The statistical increase of n follows automatically and implies a statistical increase of the maximal entanglement negentropy. Entanglement negentropy increases in the statistical sense.

    The resulting world would not be the best possible one, unlike for a strong form of NMP demanding that negentropy does increase in "big" state function reductions. n can also decrease temporarily, and such reductions seem to be needed. In the TGD inspired model of bio-catalysis the phase transition reducing the value of n for the magnetic flux tubes connecting reacting bio-molecules allows them to find each other in the molecular soup. This would be crucial for understanding processes like DNA replication and transcription.

  3. A state function reduction corresponding to the measurement of the density matrix could occur to an eigenstate/eigenspace of the density matrix only if the corresponding eigenvalue and eigenstate/eigenspace are expressible using numbers in the extension of rationals defining the adele considered. In the generic case these numbers belong to an N-dimensional extension of the original extension. This can make the entanglement stable with respect to measurements of the density matrix.

    A phase transition to an extension of the extension containing these coefficients would be required to make the reduction possible. A step in number theoretic evolution would occur. Also an entanglement of the measured state pairs with those of a measuring system containing the extension of the extension would make the reduction possible. Negentropy could be reduced, but the higher-dimensional extension would provide potential for more negentropic entanglement, and NMP would hold true in the statistical sense.

  4. If one has a higher-dimensional eigen space of the density matrix, the p-adic negentropy is largest for the entire subspace, and the sum of the real and p-adic negentropies vanishes for all of them. For negentropy identified as the total p-adic negentropy, SNMP would select the entire sub-space and NMP would indeed say something explicit about negentropy.
Or is NMP needed as a separate principle?

Hitherto I have postulated NMP as a separate principle. The strong form of NMP (SNMP) states that negentropy does not decrease in "big" state function reductions corresponding to death and re-incarnations of self.

One can however argue that SNMP is not realistic. SNMP would force the Universe to be the best possible one, and this does not seem to be the case. Also ethically responsible free will would be very restricted, since self would be forced always to do the best deed, that is, to maximally increase the negentropy serving as the information resources of the Universe. Giving up separate NMP altogether would allow one to have also "Good" and "Evil".

This forces one to consider what I christened the weak form of NMP (WNMP). Instead of the maximal dimension corresponding to an N-dimensional projector, self can choose also lower-dimensional sub-spaces, and a 1-D sub-space corresponds to the vanishing entanglement and negentropy assumed in standard quantum measurement theory. As a matter of fact, this can also lead to a larger negentropy gain, since negentropy depends strongly on the largest power of p in the dimension of the resulting eigen sub-space of the density matrix. This could apply also to the purely number theoretical reduction of NMP.

WNMP suggests how to understand the notions of Good and Evil. The various choices in the state function reduction would correspond to a Boolean algebra, which suggests an interpretation in terms of what might be called emotional intelligence. It also turns out that one can understand how the p-adic length scale hypothesis - actually its generalization - emerges from WNMP.

  1. One can start from ordinary quantum entanglement. It corresponds to a superposition of pairs of states. One state corresponds to the internal state of the self and the second to a state of the external world or biological body of the self. In negentropic quantum entanglement each is replaced with a pair of sub-spaces of the state spaces of self and external world. The dimension of the sub-space depends on which pair is in question. In state function reduction one of these pairs is selected and a deed is done. How to make some of these deeds good and some bad? Recall that WNMP allows only the possibility to generate NE but does not force it. WNMP would be like God allowing the possibility to do good but not forcing good deeds.

    Self can choose any sub-space of the subspace defined by the k ≤ N-dimensional projector, and a 1-D subspace corresponds to the standard quantum measurement. For k=1 the state function reduction leads to vanishing negentropy and to a separation of self and the target of the action. Negentropy does not increase in this action and self is isolated from the target: a kind of price for sin.

    For the maximal dimension of this sub-space the negentropy gain is maximal. This deed would be good, and by the proposed criterion NE corresponds to conscious experience with positive emotional coloring. Interestingly, there are 2^k-1 possible choices, which is almost the dimension of the Boolean algebra consisting of k independent bits. The excluded option corresponds to the 0-dimensional sub-space - the empty set in the set theoretic realization of Boolean algebra. This could relate directly to the fermionic oscillator operators defining a basis of Boolean algebra - here the Fock vacuum would be the excluded state. The deed in this sense would be a choice of how loving the attention towards the system of the external world is.

  2. A map of the different choices of k-dimensional sub-spaces to k-fermion states is suggestive. The realization of the logic in terms of emotions of different degrees of positivity would be mapped to many-fermion states - perhaps zero energy states with vanishing total fermion number. State function reductions to k-dimensional spaces would be mapped to k-fermion states: quantum jumps to quantum states!

    The problem brings to mind quantum classical correspondence in quantum measurement theory. The direction of the pointer of the measurement apparatus (in a very metaphorical sense) corresponds to the outcome of the state function reduction, which is now a 1-D subspace. For an ordinary measurement the pointer has k positions. Now it must have 2^k-1 positions. To the discrete space of pointer positions one must assign the fermionic Clifford algebra of second quantized fermionic oscillator operators. The hierarchy of Planck constants and dark matter suggest the realization. Replace the pointer with its n-sheeted space-time covering and consider zero energy states made of pairs of k-fermion states at the sheets of the covering. Dark matter would therefore be necessary for cognition. The role of the fermions would be to "mark" k space-time sheets of the covering.
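The 2^k-1 counting above is just the number of non-empty subsets of a k-element basis, each subset spanning a candidate sub-space. A minimal combinatorial illustration (plain Python; the function name is my own):

```python
from itertools import chain, combinations

def subspace_choices(k):
    """All non-empty subsets of a k-element basis; each subset spans a
    candidate sub-space. The empty set (Fock vacuum) is excluded."""
    basis = range(k)
    return list(chain.from_iterable(
        combinations(basis, r) for r in range(1, k + 1)))

print(len(subspace_choices(4)))  # 2**4 - 1 = 15
```

For k=4 this gives 15 choices; the single excluded case is the empty set, matching the excluded Fock vacuum of the fermionic picture.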

The cautious conclusion is that NMP as a separate principle is not necessary: it follows in a statistical sense from the unavoidable increase of n=heff/h identified as the dimension of the extension of rationals defining the adeles, if this extension or at least the dimension of its Galois group is observable.

For details see the chapter Negentropy Maximization Principle or the article Re-examination of the basic notions of TGD inspired theory of consciousness.



WCW and the notion of intentional free will

The preparation of an article about number theoretic aspects of TGD forced me to go through various related ideas and led to a considerable integration of them. In this note the ideas related directly to consciousness and cognition are discussed.

  1. The adelic approach strongly suggests the reduction of NMP to number theoretic physics, somewhat as the second law reduces to probability theory. The dimension of the extension of rationals, which characterizes the hierarchy level of physics and defines an observable measured in state function reductions, is positive and can only increase in a statistical sense. Therefore the maximal value of the entanglement negentropy increases as new entangling number theoretic degrees of freedom emerge. heff/h=n, identifiable as the order of a factor of the Galois group of the extension, characterizes the number of these degrees of freedom for a given space-time surface as the number of its sheets.
  2. State function reduction has hitherto been assumed to correspond always to a measurement of the density matrix, which can be seen as a reaction of a subsystem to its environment. This makes perfect sense at the space-time level. Higher level measurements occur however at the level of WCW and correspond to a localization to some sector of WCW determining for instance the quantization axes of various quantum numbers. Even the measurement of heff/h=n would measure the dimension of the Galois group and force a localization to an extension with a Galois group of this dimension. These measurements cannot correspond to measurements of a density matrix since different WCW sectors cannot entangle by WCW locality. This finding will be discussed in the following.
The notion of self can be seen as a generalization of the poorly defined notion of observer in quantum physics. In the following I take the role of a skeptic trying to be as critical as possible.

The original definition of self was as a subsystem able to remain unentangled under the state function reductions associated with subsequent quantum jumps. The density matrix was assumed to define the universal observable. Note that a density matrix which is a power series of a product of matrices representing commuting observables has in the generic case eigenstates which are simultaneous eigenstates of all the observables. A second aspect of self was assumed to be the integration of subsequent quantum jumps into a coherent whole giving rise to the experienced flow of time.
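The claim that a density matrix built from commuting observables has simultaneous eigenstates can be checked in the simplest 2×2 case. A sketch with exact rational arithmetic (the matrices are my own toy choices, not anything specific to TGD):

```python
from fractions import Fraction

def matmul(A, B):
    """2x2 matrix product with exact rational arithmetic."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def apply(A, v):
    """Apply a 2x2 matrix to a 2-vector."""
    return [sum(A[i][k] * v[k] for k in range(2)) for i in range(2)]

F = Fraction
A = [[F(2), F(1)], [F(1), F(2)]]   # a toy observable
B = matmul(A, A)                   # a power series in A: here simply A^2

# A and B commute, ...
assert matmul(A, B) == matmul(B, A)

# ... and the eigenvectors (1, 1) and (1, -1) of A are eigenvectors of B too
for v, a_eig, b_eig in [([F(1), F(1)], F(3), F(9)),
                        ([F(1), F(-1)], F(1), F(1))]:
    assert apply(A, v) == [a_eig * x for x in v]
    assert apply(B, v) == [b_eig * x for x in v]

print("simultaneous eigenstates confirmed for this example")
```

In the generic (non-degenerate) case this is exactly why measuring such a density matrix measures all the commuting observables at once.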

The precise identification of self allowing one to understand both of these aspects turned out to be a difficult problem. I became aware of the solution of the problem in terms of zero energy ontology (ZEO) only rather recently (2014).

  1. Self corresponds to a sequence of quantum jumps integrating into a single unit as in the original proposal, but these quantum jumps correspond to state function reductions to a fixed boundary of the causal diamond (CD) leaving the corresponding parts of the zero energy states invariant - "small" state function reductions. The parts of the zero energy states at the second boundary of the CD change, and even the position of the tip of the opposite boundary changes: one actually has a wave function over the positions of the second boundary (CD sizes, roughly) and this wave function changes. In positive energy ontology these repeated state function reductions would have no effect on the state (Zeno effect), but in the TGD framework a change occurs at the second boundary and gives rise to the experienced flow of time, its arrow, and self: self is a generalized Zeno effect.
  2. The first quantum jump to the opposite boundary corresponds to the act of "free will" or the birth of a re-incarnated self. Hence the act of "free will" changes the arrow of psychological time at some level of the hierarchy of CDs. The first reduction to the opposite boundary of the CD means the "death" of self and the "re-incarnation" of a time-reversed self at the opposite boundary, at which the temporal distance between the tips of the CD increases in the opposite direction. The sequence of selves and time-reversed selves is analogous to a cosmic expansion for the CD. The repeated birth and death of mental images could correspond to this sequence at the level of sub-selves.
  3. This allows one to understand the relationship between subjective and geometric time and how the arrow and flow of clock time (psychological time) emerge. The average distance between the tips of the CD increases as long as state function reductions occur repeatedly at the fixed boundary: the situation is analogous to diffusion. The localization of the contents of conscious experience to the boundary of the CD gives rise to the illusion that the universe is 3-dimensional. The possibility of memories, made possible by the hierarchy of CDs, demonstrates that this is not the case. Self is simply the sequence of state function reductions at the same boundary of the CD, which remains fixed, and the lifetime of self is the total growth of the average temporal distance between the tips of the CD.
One can identify several rather abstract state function reductions selecting a sector of WCW.
  1. There are quantum measurements inducing localization in the moduli space of CDs with the passive boundary and the states at it fixed. In particular, a localization in the moduli characterizing the Lorentz transform of the upper tip of the CD would be measured. The measured moduli characterize also the analog of a symplectic form in M4 strongly suggested by the twistor lift of TGD - that is, the rest system (time axis) and the spin quantization axes. Of course, also other kinds of reductions are possible.
  2. Also a localization to an extension of rationals defining the adeles should occur. Could the value of n=heff/h be an observable? The value of n for a given space-time surface at the active boundary of the CD could be identified as the order of the smallest Galois group containing all the Galois groups assignable to the 3-surfaces at the boundary. A superposition of space-time surfaces would not be an eigenstate of n at the active boundary unless localization occurs. It is not obvious whether this is consistent with a fixed value of n at the passive boundary.

    The measured value of n could be larger or smaller than the value of n at the passive boundary of the CD, but in a statistical sense n would increase, by analogy with diffusion on the half-line defined by the non-negative integers: the distance from the origin unavoidably increases in a statistical sense. This would imply evolution as an increase of the maximal value of negentropy and the generation of quantum coherence in increasingly longer scales.

  3. A further abstract choice corresponds to the replacement of the roles of the active and passive boundaries of the CD, changing the arrow of clock time and corresponding to the death of self and re-incarnation as a time-reversed self.
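The statistical increase of n in item 2 is the standard behavior of a random walk reflected at the origin: individual steps can decrease n, but the ensemble average drifts away from the origin. A toy simulation (Python; the walk model is my own simplification, not the actual dynamics of extensions):

```python
import random

def average_n(steps, walkers=2000, seed=1):
    """Average final position of a random walk on integers n >= 1.
    Each step moves n up or down with equal probability; moves below
    n = 1 are reflected back."""
    rng = random.Random(seed)
    total = 0
    for _ in range(walkers):
        n = 1
        for _ in range(steps):
            n = max(1, n + rng.choice((-1, 1)))
        total += n
    return total / walkers

# Individual walks can decrease n, but the ensemble average drifts upward.
print(average_n(10) < average_n(100) < average_n(1000))  # True
```

The average grows like the square root of the number of steps, which is the sense in which "n can only increase in a statistical sense" even though single reductions may lower it.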
Can one assume that these measurements reduce to measurements of a density matrix of either entangled system, as assumed in the earlier formulation of NMP, or should one allow both options? This question actually applies to all quantum measurements and leads to fundamental philosophical questions unavoidable in all consciousness theories.
  1. Do all measurements involve entanglement between the moduli or extensions of two CDs reduced in the measurement of the density matrix? Non-diagonal entanglement would allow final states which are not eigenstates of the moduli or of n: this looks strange. This could also lead to an infinite regress since it seems that one must assume an endless hierarchy of entangled CDs so that the reduction sequence would proceed from top to bottom. It looks natural to regard a single CD as a sub-Universe.

    For instance, if a selection of the quantization axes of color hypercharge and isospin (localization in the twistor space of CP2) is involved, one would have an outcome corresponding to a quantum superposition of measurements with different color quantization axes!

    Going philosophical, one can also argue that the measurement of the density matrix is only a reaction to the environment and does not allow intentional free will.

  2. Can one assume that a mere localization in the moduli space or to an extension of rationals (producing an eigenstate of n) takes place for a fixed CD - a kind of self measurement possible even for an unentangled system? If there is entanglement in these degrees of freedom between two systems (say CDs), it would be reduced in these self measurements, but the outcome would not be an eigenstate of the density matrix. An interpretation as a realization of intention would be appropriate.
  3. If one allows both options, the interpretation would be that state function reduction as a measurement of density matrix is only a reaction to environment and self-measurement represents a realization of intention.
  4. Self measurements would occur at a higher level, say as a selection of quantization axes, a localization in the moduli space of CDs, or a selection of the extension of rationals. A possible general rule is that measurements at the space-time level are reactions realized as measurements of the density matrix, whereas a selection of a sector of WCW would be an intentional action. This is because formally the quantum states at the level of WCW, as modes of a classical WCW spinor field, are single particle states. Entanglement between different sectors of WCW is not possible.
  5. If the selections of sectors of WCW at the active boundary of the CD commute with the observables whose eigenstates appear at the passive boundary (briefly, passive observables) - meaning that time reversal commutes with them - they can occur repeatedly during the reduction sequence, and self as a generalized Zeno effect makes sense.

    If the selections of WCW sectors at the active boundary do not commute with the passive observables, then volition as a choice of a sector of WCW must change the arrow of time. Libet's findings show that a conscious choice induces neural activity a fraction of a second before the conscious choice. This would imply the correspondences: "big" measurement changing the arrow of time - self-measurement at the level of WCW - intentional action, and "small" measurement - measurement at the space-time level - reaction.

    Self as a generalized Zeno effect makes sense only if there are active observables commuting with the passive ones. If the passive observables form a maximal set, new active observables commuting with them must emerge. The increase of the size of the extension of rationals might generate them by expanding the state space, so that self would survive only as long as it evolves. Self would die and re-incarnate when it could not generate any new observables commuting with those assignable to the active boundary to be measured. From personal experience I can say that ageing is basically the loss of the ability to make new choices. When all possible choices are made, all observables are measured or self-measured, it is time to start again.

    Otherwise there would be only a single unitary time evolution followed by a reduction to the opposite boundary. This makes sense only if the sequence of "big" reductions for sub-selves can give rise to the time flow experienced by the self: the birth and death of mental images would give rise to the flow of time of the self.

The overall conclusion is that the notion of WCW is necessary to understand intentional free will. One must distinguish between measurements at the WCW level as localizations, which do not involve a measurement of the density matrix, and measurements at the space-time level reducible to measurements of the density matrix (taking the density matrix to be a function of a product of commuting observables, one can measure all these observables simultaneously by measuring the density matrix). WCW localizations correspond to intentional actions - say a decision fixing the quantization axis of spin - and space-time reductions correspond to state function reductions at the level of matter. By reading Krishnamurti I learned that eastern philosophies make a sharp distinction between behavior as mere reactivity and behavior as intentional action which is not a reaction. Furthermore, death and reincarnation happen when self has made all choices.

For details see the chapter Negentropy Maximization Principle or the article Re-examination of the basic notions of TGD inspired theory of consciousness.



Anomalies of water as evidence for dark matter in TGD sense

The motivation for this brief comment came from a popular article telling that a new phase of water has been discovered in the temperature range 50-60 °C (see this). Also Gerald Pollack (see this) has introduced what he calls the fourth phase of water. For instance, in this phase water consists of hexagonal layers with effective H1.5O stoichiometry, and the phase has a high negative charge. This phase plays a key role in TGD based quantum biology. These two fourth phases of water could relate to each other if there exists a deeper mechanism explaining both these phases and the various anomalies of water.

Martin Chaplin (see this) has an extensive web page about the various properties of water. The physics of water is full of anomalous features and therefore the page is a treasure trove for anyone ready to give up the reductionistic dogma. The site discusses the structure, thermodynamics, and chemistry of water. Even academically dangerous topics such as water memory and homeopathy are discussed.

One learns from this site that the physics of water involves numerous anomalies (see this). The structural, dynamic, and thermodynamic anomalies form a nested structure in the density-temperature plane. For liquid water at the atmospheric pressure of 1 bar the anomalies appear in the temperature interval 0-100 °C.

Hydrogen bonding creating cohesion between water molecules distinguishes water from other substances. Hydrogen bonds induce the clustering of water molecules in liquid water. Hydrogen bonding is also highly relevant for the phase diagram of H2O coding for the various thermodynamical properties of water (see this). In biochemistry hydrogen bonding is involved with hydration. Bio-molecules - say amino-acids - are classified into hydrophobic, hydrophilic, and amphiphilic ones, and this characterization determines to a high extent the behavior of the molecule in a liquid water environment. Protein folding represents one example of this.

Anomalies are often thought to reduce to hydrogen bonding. Whether this is the case is not obvious to me, and this is why I find water such a fascinating substance.

TGD indeed suggests that water decomposes into ordinary water and dark water consisting of phases with effective Planck constant heff=n×h residing at magnetic flux tubes. Hydrogen bonds would be associated with short and rigid flux tubes, but for larger values of n the flux tubes would be longer by a factor n and have a string tension behaving as 1/n, so that they would be softer and could be loopy. The portion of water molecules connected by flux tubes carrying dark matter could be identified as dark water and the rest would be ordinary water. This model allows one to understand various anomalies. The anomalies are largest at the physiological temperature of 37 °C, which conforms with the vision about the role of dark matter and dark water in living matter, since the fraction of dark water would be highest at this temperature. The anomalies discussed are the density anomalies, the anomalies of specific heat and compressibility, and the Mpemba effect. I discussed these anomalies already a decade ago. The recent view about dark matter however allows much more detailed modelling.

For details see the chapter Dark Nuclear Physics and Condensed Matter or the article The anomalies of water as evidence for the existence of dark matter in TGD sense.



About number theoretic aspects of NMP

There is something in NMP that I still do not understand: every time I begin to explain what NMP is I have this unpleasant gut feeling. I have the habit of making a fresh start every time rather than pretending that everything is crystal clear. I have indeed considered very many variants of NMP. In the following I will consider two variants of NMP. The second variant reduces to pure number theory in the adelic framework inspired by the number theoretic vision. It is certainly the simplest one since it says nothing explicit about negentropy. It says essentially the same as the "strong form of NMP" when the reduction occurs to an eigen-space of the density matrix.

I will not consider zero energy ontology (ZEO) related aspects or the aspects related to the hierarchy of subsystems and selves, since I dare to regard these as "engineering" aspects.

What should NMP say?


  1. NMP takes in some sense the role of God, and the basic question is whether we live in the best possible world or not. Theologians ask why God allows sin. I ask whether NMP demands an increase of negentropy always or whether it also allows a reduction of negentropy - and why. Could NMP lead to an increase of negentropy only in a statistical sense - evolution? Could it only give the potential for gaining a larger negentropy?

    These questions have turned out to be highly non-trivial. My personal experience is that we do not live in the best possible world, and this experience plus simplicity motivates the proposal to be discussed.

  2. Is NMP a separate principle or could NMP be reduced to mere number theory? For the latter option state function reduction would occur to an eigenstate/eigenspace of the density matrix only if the corresponding eigenvalue and eigenstate/eigenspace are expressible using numbers in the extension of rationals defining the adele considered. A phase transition to an extension of the extension containing these numbers would be required to make the reduction possible. A step in number theoretic evolution would occur. Also an entanglement of the measured state pairs with those of a measuring system in an extension containing the extension of the extension would make the reduction possible. Negentropy would be reduced, but the higher-D extension would provide the potential for more negentropic entanglement. I will consider this option in the following.
  3. If one has a higher-D eigenspace of the density matrix, the p-adic negentropy is largest for the entire subspace, and the sum of the real and p-adic negentropies vanishes for all of them. For negentropy identified as the total p-adic negentropy the strong form of NMP would select the entire sub-space, and NMP would indeed say something explicit about negentropy.

The notion of entanglement negentropy

  1. Number theoretic universality demands that the density matrix and entanglement coefficients are numbers in an algebraic extension of rationals, possibly extended by adding a root of e. The induced p-adic extensions are finite-D, and one obtains the adele assigned to the extension of rationals. Real physics is replaced by adelic physics.
  2. The same entanglement coefficients in an extension of rationals can be seen as numbers in both the real and the various p-adic sectors. In the real sector one can define the real entropy and in the various p-adic sectors the p-adic negentropies (real valued).
  3. Question: should one define the total entanglement negentropy as
    1. the sum of the p-adic negentropies, or
    2. the difference of the sum of the p-adic negentropies and the real entropy? For rational entanglement probabilities the real entropy equals the sum of the p-adic negentropies, so that the total negentropy would vanish. For extensions this negentropy would be positive under natural additional conditions, as shown earlier.
    Both options can be considered.
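The statement in option 2 - that for rational entanglement probabilities the real entropy equals the sum of the p-adic negentropies - follows from the adelic product formula Π_v |x|_v = 1. A numerical check under one common convention, N_p = Σ_i p_i log|p_i|_p (the function names are mine):

```python
from fractions import Fraction
import math

def padic_valuation(x, p):
    """v_p(x) for a Fraction x: exponent of p, negative if p divides
    the denominator. The p-adic norm is |x|_p = p**(-v_p(x))."""
    v, n, d = 0, x.numerator, x.denominator
    while n % p == 0:
        n //= p
        v += 1
    while d % p == 0:
        d //= p
        v -= 1
    return v

def real_entropy(probs):
    """Shannon entropy S = -sum_i p_i log p_i (natural log)."""
    return -sum(float(q) * math.log(float(q)) for q in probs)

def padic_negentropy(probs, p):
    """N_p = sum_i p_i log|p_i|_p = -sum_i p_i v_p(p_i) log p."""
    return -sum(float(q) * padic_valuation(q, p) for q in probs) * math.log(p)

probs = [Fraction(1, 2), Fraction(1, 3), Fraction(1, 6)]
total_padic = sum(padic_negentropy(probs, p) for p in (2, 3, 5))
print(math.isclose(total_padic, real_entropy(probs)))  # True
```

Only the primes dividing the denominators contribute (p=5 gives zero here), so for rational probabilities option 2's total negentropy indeed vanishes identically.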

State function reduction as universal measurement interaction between any two systems

  1. The basic vision is that state function reductions occur all the time for all kinds of matter and involve a measurement of the density matrix ρ characterizing the entanglement of the system with its environment, leading to a sub-space whose states have the same eigenvalue of the density matrix. What this measurement really is, is not at all clear.
  2. The measurement of the density matrix means diagonalization of the density matrix and a selection of an eigenstate or eigenspace. Diagonalization is possible without going outside the extension only if the entanglement probabilities and the coefficients of the states belong to the original extension defining the adele. This need not be the case!

    More precisely, the eigenvalues of the density matrix as roots of an N:th order polynomial with coefficients in the extension in general belong to an N-D extension of the extension. The same holds for the coefficients of the eigenstates in the original basis. Consider as an example the eigenvalues and eigenstates of a rational valued N×N entanglement matrix: they are roots of a polynomial of degree N and in general algebraic numbers.

    Question: Is state function reduction number theoretically forbidden in the generic case? Could entanglement be stable purely number theoretically? Could NMP reduce to just this number theoretic principle saying nothing explicit about negentropy? Could a phase transition increasing the dimension of the extension but keeping the entanglement coefficients unaffected make the reduction possible? Could entanglement with an external system in a higher-D extension - an intelligent observer - make the reduction possible?

  3. There is a further delicacy involved. The eigen-space of the density matrix can be N-dimensional if the density matrix has an N-fold degenerate eigenvalue with all N entanglement probabilities identical. For a unitary entanglement matrix the density matrix is indeed proportional to the N×N unit matrix. This kind of NE is stable also algebraically if the coefficients of the eigenstates do not belong to the extension. If they do belong to it, the question is whether NMP allows a reduction to a subspace of the eigenspace or whether only the entire subspace is allowed.

    The total negentropy identified as the sum of the real and p-adic negentropies would vanish for any eigenspace and would not distinguish between the sub-spaces. The identification of negentropy as the p-adic negentropy would distinguish between the sub-spaces, and NMP in its strong form would not allow a reduction to sub-spaces. The number theoretic NMP would thus also say something about negentropy.

    I have also considered the possibility of a weak form of NMP. Any subspace could be selected and negentropy would be reduced. The worst thing to do in this case would be the selection of a 1-D subspace: entanglement would be totally lost and the system would be totally isolated from the rest of the world. I have proposed that this possibility corresponds to the fact that we do not seem to live in the best possible world.
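The number theoretic obstruction of item 2 above - eigenvalues of a rational density matrix generically falling outside the rationals - can be made concrete with a 2×2 example (the specific matrix is my own illustration):

```python
from fractions import Fraction
import math

F = Fraction

# A symmetric density matrix with rational entries and trace 1
rho = [[F(3, 5), F(1, 5)],
       [F(1, 5), F(2, 5)]]

# Eigenvalues of [[a, b], [b, c]] are (a + c)/2 +- sqrt(((a - c)/2)**2 + b**2)
a, b, c = rho[0][0], rho[0][1], rho[1][1]
disc = ((a - c) / 2) ** 2 + b ** 2

def is_square_in_Q(q):
    """Is the non-negative rational q the square of a rational?"""
    n, d = q.numerator, q.denominator
    return math.isqrt(n) ** 2 == n and math.isqrt(d) ** 2 == d

print(disc)                  # 1/20
print(is_square_in_Q(disc))  # False
```

Here the eigenvalues 1/2 ± sqrt(5)/10 lie in Q(sqrt(5)), not in Q: diagonalization forces a step to an extension of the original number field, which is exactly the claimed mechanism of number theoretic evolution.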

NMP as a purely number theoretic constraint?

Let us consider the possibility that NMP reduces to the number theoretic condition tending to stabilize generic entanglement.

  1. Density matrix characterizing entanglement with the environment is a universal observable. Reduction can occur to an eigenspace of the density matrix. For rational entanglement probabilities the total negentropy would vanish so that NMP formulated in terms of negentropy cannot say anything about the situation. This suggests that NMP quite generally does not directly refer to negentropy.
  2. The condition that eigenstates and eigenvalues are in the extension of rationals defining the adelic physics poses a restriction. The reduction could occur only if these numbers are in the original extension. Also rational entanglement would be stable in the generic case and a phase transition to higher algebraic extension is required for state function reduction to occur. Standard quantum measurement theory would be obtained when the coefficients of eigenstates and entanglement probabilities are in the original extension.
  3. If this is not the case, a phase transition to an extension of extension containing the N-D extension of it could save the situation. This would be a step in number theoretic evolution. Reduction would lead to a reduction of negentropy but would give potential for gaining a larger entanglement negentropy. Evolution would proceed through catastrophes giving potential for more negentropic entanglement! This seems to be the case!

    Alternatively, the state pairs of the system and its complement could be entangled with an observer in an extension of rationals containing the needed N-D extension of the extension, and a state function reduction possible for the observer would induce the reduction in the original system. This would mean fusion with a self at a higher level of the evolutionary hierarchy - a kind of enlightenment. This would give an active role to the intelligent observer (intelligence characterized by the dimension of the extension of rationals). The intelligent observer would reduce the negentropy, and thus NMP would not hold true universally.

    Since a higher-D extension allows a higher negentropy and in the generic case NE is stable, one might hope that NMP holds true statistically (for rationals the total negentropy as the sum of the real and the total p-adic negentropies vanishes).

    The Universe would evolve rather than being a paradise: the number theoretic NMP would allow a temporary reduction of negentropy but would provide the potential for a larger negentropy, and the increase of negentropy in a statistical sense is highly suggestive. To me this option looks like the simplest and most realistic one.

  4. If negentropy is identified as the total p-adic negentropy rather than the sum of the real and p-adic negentropies, the strong form of NMP says something explicit about negentropy: the reduction would take place to the entire subspace having the largest p-adic negentropy.

For background see the chapter Negentropy Maximization Principle or the article About number theoretic aspects of NMP.


