What's new in
HYPER-FINITE FACTORS, P-ADIC LENGTH SCALE HYPOTHESIS, AND DARK MATTER HIERARCHY
Note: Newest contributions are at the top!
During the last decade I have proposed several examples of systems to which I have assigned a non-standard value of Planck constant heff = n× h. If the hypothesis about the connection with criticality is correct, they should exhibit criticality, and if the heff = hgr hypothesis is true, also phase separation. Also the proposed mechanisms for generating dark matter should involve the generation of criticality.
In particle physics there are some possible applications for the new view about dark matter.
By its nature condensed matter physics provides a rich repertoire of critical phenomena.
Biology is full of critical systems, and criticality makes living matter highly sensitive to external perturbations, gives maximal richness of structure, and makes it quantum coherent in macroscopic scales. Therefore it is not difficult to invent examples. The basic problem is whether the criticality is associated only with the transitions between different systems or with the systems themselves.
If one wants the label of crackpot, it is enough to study critical phenomena. Those who try to replicate (or usually, to non-replicate) the claimed findings fail (or rather manage) easily, since criticality implies careful tuning of the external parameters to demonstrate the phenomenon. Therefore the tragedy of the fringe physicist is to become a victim of the phenomenon that he is studying.
Proposed mechanisms for generating large heff phase
I have proposed several mechanisms, which might generate large heff phase, and an interesting question is whether these mechanisms generate criticality.
Concerning the understanding of criticality one can proceed purely mathematically. Consider first 2-dimensional systems and 4-D conformal invariance of Yang-Mills theories.
Basic building bricks of TGD vision
The details of this generalization are not understood but the building bricks have been identified.
Hierarchy of criticalities and hierarchy breakings of conformal invariance
The TGD picture about quantum criticality connects it to the failure of classical non-determinism for Kähler action defining the space-time dynamics. A connection with the hierarchy of Planck constants and therefore dark matter in TGD sense emerges: the number n of conformal equivalence classes for space-time surfaces with fixed ends at the boundaries of causal diamond corresponds to the integer n appearing in the definition of Planck constant heff = n× h.
A more detailed description of the breaking of conformal invariance is as follows. The statement that the sub-algebra Vn of the full conformal algebra annihilates physical states means that the generators L_{kn}, k>0, with n>0 fixed, annihilate physical states. The generators L_{-kn}, k>0, create zero norm states. The Virasoro generators can of course be replaced with generators of a Kac-Moody algebra and even with those of the symplectic algebra defined above.
Since the action of the generators L_m on the algebra spanned by the generators L_{n+m}, m>0, does not lead out from this algebra (an ideal is in question), one can pose the stronger condition that all generators with conformal weight k ≥ n annihilate the physical states, so that the space of physical states would be generated by the generators L_k, 0<k<n. A similar picture would hold also for Kac-Moody algebras and for the symplectic algebra of δM^4_+ × CP2, with the light-like radial coordinate of δM^4_+ taking the role of z. Since conformal charge comes in n-multiples of hbar, one could say that one has heff = n× h.
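The two closure properties invoked above, that weights divisible by n span a sub-algebra and that weights k ≥ n form an ideal in the algebra of non-negative weights, are easy to verify numerically. A minimal sketch, assuming only the Witt bracket [L_m, L_k] = (m-k) L_{m+k} with the central term dropped:

```python
# Sketch: closure checks behind the heff = n*h picture, assuming the
# Witt algebra bracket [L_m, L_k] = (m - k) L_{m+k} (central term ignored).

def bracket_weight(m, k):
    """Conformal weight of [L_m, L_k] = (m - k) L_{m+k}."""
    return m + k

n = 5  # the integer labelling the sub-algebra

# 1) Weights divisible by n close under the bracket: a sub-algebra.
sub = [k * n for k in range(-4, 5)]
assert all(bracket_weight(a, b) % n == 0 for a in sub for b in sub)

# 2) Weights k >= n form an ideal in the algebra of non-negative weights:
#    acting with any L_m, m >= 0, does not lead out of the ideal.
assert all(bracket_weight(m, k) >= n
           for m in range(0, 4 * n) for k in range(n, 4 * n))

print("closure checks passed for n =", n)
```

This only verifies the bookkeeping of weights, of course; the physical content is in which generators annihilate states.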
The breaking of conformal invariance would transform finite number of gauge degrees to discrete physical degrees of freedom at criticality. The long range fluctuations associated with criticality are potentially present as gauge degrees of freedom, and at criticality the breaking of conformal invariance takes place and these gauge degrees of freedom are transformed to genuine degrees of freedom inducing the long range correlations at criticality.
Changes of symmetry have been assigned with criticality since Landau. Could one say that the conformal sub-algebra defining the genuine conformal symmetries changes at criticality, and that this makes the gauge degrees of freedom visible at criticality?
Emergence of covering spaces associated with the hierarchy of Planck constants
Another picture about the hierarchy of Planck constants is based on an n-fold covering space bringing in n discrete degrees of freedom. How does this picture relate to the breaking of conformal symmetry? The idea is simple.
One goes to the n-fold covering space by replacing the z coordinate by w = z^{1/n}. With respect to the new variable w one has just the ordinary conformal algebra with integer conformal weights, but in an n-fold singular covering of the complex plane or sphere. The singularity of the generators explains why L_k(w), k<n, do not annihilate physical states anymore. The sub-algebra would consist of the non-singular generators and would act as symmetries, and also the stronger condition that L_k, k ≥ n, annihilates the physical states could be satisfied. Classically this would mean that the corresponding classical Noether charges for Kähler action are non-vanishing.
Another manner to look at the same situation is to use the z coordinate. Now the conformal weight is fractionized to integer multiples of 1/n, and since the generators with fractional conformal weight are singular at the origin, one cannot assume that they annihilate the physical states: fractional conformal invariance is broken. Quantally the above conditions on physical states would be satisfied. The sphere - perhaps the sphere assigned with the light-cone boundary or the geodesic sphere of CP2 - would be effectively replaced with its n-fold covering space, and due to conformal invariance one would have n additional discrete degrees of freedom.
These discrete degrees of freedom would define an n-dimensional Hilbert space spanned by the n fractional conformal generators. One can also second quantize by assigning oscillator operators to these discrete degrees of freedom. In this picture the effective quantization of Planck constant would result from the condition that the conformal weights of the physical states are integers.
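The n-fold covering and the fractionization of weights can be illustrated with a few lines of code (an illustration only, not TGD formalism): tracking w = z^{1/n} continuously as z winds once around the origin shows that one z-loop permutes the branches, and only every n:th fractional weight k/n is an integer.

```python
import cmath
import math

n = 3
theta0 = 0.4

# Track w = z^(1/n) continuously as z = exp(i*(theta0 + t)) winds once around
# the origin: one closed z-loop multiplies w by exp(2*pi*i/n), permuting branches.
w_start = cmath.exp(1j * theta0 / n)
w_end = cmath.exp(1j * (theta0 + 2 * math.pi) / n)
monodromy = w_end / w_start
assert abs(monodromy - cmath.exp(2j * math.pi / n)) < 1e-12

# In the z coordinate the weights come as multiples of 1/n; only the integer
# ones survive as weights of physical states in the picture above.
weights = [k / n for k in range(2 * n)]
print("fractional weights:", weights)
print("integer weights:", [w for w in weights if w == int(w)])
```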
Negentropic entanglement and hierarchy of Planck constants
Also a connection with negentropic entanglement emerges: the associated density matrix, which is proportional to a unit matrix or is a direct sum of matrices proportional to unit matrices of various dimensions, is natural in the dark matter phase. Negentropic entanglement would occur most naturally in the new discrete degrees of freedom. In the special 2-particle case negentropic entanglement corresponds to the unitary entanglement encountered in quantum computation: large heff of course makes possible long-lived entanglement, and its negentropic character implies that Negentropy Maximization Principle favors its generation. An interesting hypothesis to be killed is that the p-adic prime characterizing the space-time sheet divides n.
Closed magnetic flux tubes (also flux sheets) carrying dark matter are in a crucial role in the TGD view about criticality. Dark matter with large Planck constant would be created at criticality and it would generate the long range correlations.
TGD allows the possibility that the magnetic flux quanta associated with criticality carry monopole flux. In Maxwellian electrodynamics this is not possible. These flux tubes are associated with elementary particles: in this case they have open string like portions at parallel space-time sheets connected at their ends by wormhole contacts to form a closed two-sheeted loop. Since the magnetic monopole flux is conserved along the flux tube, one has full reason to wonder whether these closed magnetic flux tubes can be created from vacuum.
One can imagine two manners to create flux loops: in a continuous energy conserving manner classically, or by a quantum jump in which the quantum sub-Universe associated with a given causal diamond (CD) is re-created (recall that causal diamonds define the observable Universes and have a finite size as intersections of future and past directed light-cones).
Consider for simplicity flux tubes which are circular. How can the flux tubes be generated?
This posting, as also the consideration of the idea that criticality could involve a phase transition transforming ordinary matter to dark matter, was inspired by a link sent by Ulla. The link was to a popular article telling about mysterious looking action at a distance between liquid containers.
For several years it has been known that superfluid helium in reservoirs next to each other, with a distance of a few micrometers, acts collectively, even when the channels connecting them are so thin and long that a substantial flow of matter between them is not possible. The article mentions a theoretical model developed by a team of scientists including those from the Institute of Physical Chemistry of the Polish Academy of Sciences in Warsaw (IPCPACW). According to the article the model reveals that the phenomenon is much more general than previously thought and could take place also in systems which are usually regarded as classical (what this actually means in the quantum world is not quite clear!). The reading of the abstract of the article shows that only Monte Carlo studies are done, so that "predicts" is perhaps more appropriate than "reveals".
According to the article, the first report about "action at a distance" between superfluid reservoirs was published in 2010 in Nature Physics. The team from the University of Buffalo and the State University of New York created an array of tens of millions of cubical reservoirs containing liquid helium on a silicon plate. The centres of the reservoirs had a distance of 6 μm and the reservoirs had an edge length of 2 μm, so that the width of the horizontal gap between the reservoirs was 4 μm. The reservoirs were covered with another silicon plate with a very thin gap above the reservoirs allowing to fill them with liquid helium. The thickness of this vertical gap was d = 32 nm - in TGD language this is d = 3.2 L(151), where the p-adic length scale L(151) = 10 nm defines the thickness of the cell membrane. The gap was so thin that it did not allow a significant flow of liquid helium between neighboring reservoirs.
Remark: To be precise, L(151) should be called the Compton length of the electron if it would correspond to the Gaussian Mersenne M_{G,k} = (1+i)^k - 1, k = 151, and is Le(151) = 5^{1/2} × L(151), where L(151) would be the genuine p-adic length scale. For brevity I often call Le just the p-adic length scale and drop the subscript "e".
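As a small numerical check of the scales quoted above, one can assume the standard p-adic scaling L(k) = 2^{(k-151)/2} L(151) with L(151) = 10 nm; the function L below encodes this assumption:

```python
# Sketch: numerical check of the scales in the text, assuming the p-adic
# length scale scaling L(k) = 2**((k - 151)/2) * L(151) with L(151) = 10 nm.
L151 = 10e-9  # meters, stated thickness scale of the cell membrane

def L(k):
    """p-adic length scale relative to L(151) (sqrt(2) scaling per unit of k)."""
    return L151 * 2 ** ((k - 151) / 2)

d = 32e-9  # vertical gap thickness in the helium experiment, 32 nm
assert abs(d / L(151) - 3.2) < 1e-12  # d = 3.2 * L(151), as stated

Le151 = 5 ** 0.5 * L151  # the "electronic" scale Le(151) = 5^(1/2) * L(151)
print(f"L(151) = {L(151) * 1e9:.1f} nm, Le(151) = {Le151 * 1e9:.2f} nm")
```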
The expectation was that different reservoirs would behave like independent systems without interactions. In particular, the specific heat of the whole system would be sum over the specific heats of individual systems, which were identical. This was not the case. An excess of specific heat was observed in the system. The super-fluid helium was acting as a physical whole.
The natural explanation would be in terms of the superfluid character of the systems. Still, the absence of a direct contact - say thin "threads" connecting the reservoirs - makes one wonder whether the situation can be understood in the framework of conventional quantum physics.
In co-operation with Prof. Douglas Abraham from Oxford University, Dr. Maciolek from IPCPACW has developed a theory to explain the observations. The new theory predicts that the effect of "action at a distance" does not require quantum physics and can also occur in classical one-component fluids, as well as in their mixtures. The article says that this theory is confirmed by computer simulations carried out by Oleg Vasilyev from the Max-Planck-Institut für Intelligente Systeme. I would be here a little bit skeptical: experiments confirm, computer simulations only allow to calculate!
The theory makes certain predictions.
Quantum criticality is one of the corner stone assumptions of TGD. The value of the Kähler coupling strength fixes quantum TGD and is analogous to a critical temperature. Hence the TGD Universe would be quantum critical. What this means is however far from obvious, and I have pondered the notion repeatedly.
Mathematical approach to criticality
One can proceed purely mathematically. In the 2-dimensional case the behavior of the system at criticality is universal, and the dependence of various parameters on temperature and possible other critical parameters can be expressed in terms of critical exponents, predicted in the case of effectively 2-dimensional systems by conformal field theory discovered by Russian theoreticians. This is one of the few really significant steps in theoretical physics during the last forty years. For instance, string models apply the notions and formalism of conformal field theories. Conformal field theories are known and classified.
In the TGD framework, the notion of conformal invariance generalizes to the 4-D context. This is partly due to holography following from the fact that pairs of 3-surfaces at the ends of causal diamonds define the basic dynamical objects. A second reason is effective 2-dimensionality, allowing to say that the basic dynamical objects are 2-D partonic 2-surfaces and string world sheets. Whether these objects are dual is still an open question. A third reason is that the 3-D boundary of the 4-D light-cone (and of CD) is effectively 2-D and gives rise to a generalization of 2-D conformal invariance. Also the light-like orbits of partonic 2-surfaces allow extended conformal invariance, and string world sheets possess the counterpart of the ordinary conformal invariance.
The TGD picture about quantum criticality connects it to the failure of classical non-determinism for Kähler action defining the space-time dynamics. A connection with the hierarchy of Planck constants and therefore dark matter in the TGD sense emerges: the number n of conformal equivalence classes for space-time surfaces with fixed ends at the boundaries of causal diamond corresponds to the integer n appearing in the definition of Planck constant heff = n× h. Also a connection with negentropic entanglement emerges: the associated density matrix, which is proportional to a unit matrix or is a direct sum of matrices proportional to unit matrices of various dimensions, is natural in the dark matter phase. In the special 2-particle case negentropic entanglement corresponds to the unitary entanglement encountered in quantum computation.
Phenomenological approach to criticality
These statements do not have any obvious content for an experimentalist. One should have also a more concrete view about criticality. A theoretician would call this phenomenology. Phase transitions and criticality are an essential part of being alive. Criticality means high sensitivity to signals and makes sensory perception possible. Criticality implies also long range correlations making us coherent units. The long range correlations between people who have never seen each other, like most of us, make society possible, and demonstrate that criticality appears also at the collective levels of life and consciousness: usually biologists dismiss this. Self-organization takes place by phase transitions and criticality with long range correlations. Also synergy is made possible by criticality. Criticality appears only in a very narrow range of control parameters and is therefore difficult to produce: critical systems tend to fall off from criticality. A good example is our society, which is all the time at the verge of some kind of catastrophe.
For a physicist - at least for me - the correlation between the behaviors of him and his cat looks like a miracle! One can indeed build refined and highly predictive conformal field theory models, but they do not tell what the microscopic mechanisms behind criticality are. What are the space-time correlates for criticality and long range correlations? Something must quite concretely connect the sub-systems, bind them to a single coherent unit at criticality. Magnetic flux tubes are of course the TGD based answer! But this is not enough. The long range correlations must be quantal, and this requires that Planck constant is large: heff = n× h! Dark matter! The emergence of the dark matter phase makes the system critical! The TGD Universe is critical at the fundamental level, and this implies that dark matter is present at all length scales.
Long range interactions certainly define a basic characteristic of criticality. How do they emerge? heff and the p-adic length scale hypothesis allow to understand this. Weak bosons are effectively massless below the weak boson Compton length - about 10^{-17} meters. When heff scales up by n, this Compton length scales up by n also. Weak interactions become long ranged below a much longer length scale, say even the cellular scale, and among other things explain the chiral selection of biomolecules. A similar argument can be carried out for gluons: dark (or p-adically scaled down) quarks and gluons would also appear in living matter.
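The scaling argument can be made quantitative with standard inputs; in the sketch below the W mass value is a PDG-level number and the cellular target scale of 10^{-5} m is my illustrative choice:

```python
# Sketch: the W boson Compton length and its heff = n*h scaling.
hbar_c_MeV_fm = 197.327   # hbar*c in MeV*fm
m_W_MeV = 80377.0         # W boson mass in MeV (~80.4 GeV)

lambda_W = hbar_c_MeV_fm / m_W_MeV * 1e-15  # Compton length in meters
assert 1e-18 < lambda_W < 1e-17             # of order 10^-17 m, as in the text

# With heff = n*h the Compton length scales by n; reaching a cellular scale
# of ~1e-5 m (illustrative target) would require n of order 10^12.
n_cell = 1e-5 / lambda_W
print(f"lambda_W = {lambda_W:.2e} m, n for cellular scale ~ {n_cell:.1e}")
```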
Phase separation is a key feature of criticality. How does this separation take place? Is there a universal mechanism, as suggested by the fact that at criticality everything is universal? The answer relies again on many-sheeted space-time, the heff hierarchy, and the notion of gravitational Planck constant hgr introduced originally by Nottale. The additional hypothesis is heff = hgr = GMm/v0.
This hypothesis brings in the gravitational interaction: the gravitational Planck constant is assigned with gravitation mediated by magnetic flux tubes connecting the two dark systems. The hypothesis predicts that heff is proportional to the particle mass. This means that each particle type is at its own dark flux tube/quantum, nicely separated from the others. This would explain the phase separation at criticality even if the phase transformed after criticality to the ordinary heff = h phase. Pollack's exclusion zones (EZs) show the effect too: impurities in the EZ get out of it. The heff = hgr hypothesis implies that the scaled up Compton length becomes GM/v0 and does not depend on the particle mass at all: an ideal outcome for collective quantum coherence. In living matter, with dynamics characterized by phase transitions, this phase separation of different biologically important molecules would be in a crucial role. The cell would not be anymore a random soup of a huge number of different biomolecules but a nicely arranged archive.
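The mass-independence of the scaled-up Compton length is easy to verify numerically. In the sketch below the choices of M (Earth mass) and v0 are purely illustrative assumptions:

```python
# Sketch: with hgr = G*M*m/v0 the dark Compton length hgr/(m*c) = G*M/(v0*c)
# is independent of the particle mass m. M and v0 are illustrative inputs.
G = 6.674e-11    # m^3 kg^-1 s^-2
c = 2.998e8      # m/s
M = 5.972e24     # kg, Earth mass (illustrative choice for the dark mass M)
v0 = 1.0e5       # m/s, illustrative value of the velocity parameter

m_e = 9.109e-31  # electron mass, kg
m_p = 1.673e-27  # proton mass, kg

def dark_compton_length(m):
    hgr = G * M * m / v0   # gravitational Planck constant for particle mass m
    return hgr / (m * c)   # the particle mass m cancels out

assert abs(dark_compton_length(m_e) - dark_compton_length(m_p)) < 1e-9
print(f"G*M/(v0*c) = {dark_compton_length(m_e):.3e} m for every particle")
```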
A critical reader - and even me after 9 years of work! - can of course ask what the mass M appearing in the formula for hgr really is. The logical answer is that it is the portion of matter that is dark: to this the dark particles couple. In Nottale's original model and in the TGD generalization of this model M corresponds to the entire mass of, say, Sun. This makes sense only if the approximate Bohr orbits in the solar system reflect the situation when most of the matter in the solar system was dark. Nowadays this is not the case anymore. For Earth the portion of dark matter should be something like 4 × 10-, as becomes clear by just looking at the values of the energies associated with dark cyclotron photons and requiring that they are in the range of biophoton energies (dark photons transforming to ordinary photons would produce biophotons). Without this assumption the range of biophoton energies would be above 40 keV.
Besides dark matter, also p-adically scaled up variants of weak interaction physics are possible: now the weak bosons would be light but not massless above the Compton length, which would be scaled up. In the TGD based model of living matter both dark matter and p-adically scaled up variants of particles appear, and both are crucial for understanding metabolism. Both kinds of phases could appear universally in critical systems. Dark matter would be a critical phenomenon and would appear also in thermodynamical phase transitions, not only in quantum phase transitions. Also so called free energy phenomena, cold fusion, remote mental interactions, etc. are critical phenomena and therefore very difficult to replicate unless one knows this, so that it is very easy to label the researchers of these phenomena crackpots. The researchers in these fields could be seen as victims of the phenomenon they are studying! Life of course is also a critical phenomenon, but even the vulgar skeptics are living and conscious beings and usually do not try to deny this!
The life of a theoretician trying to be worth his salt is full of worrying: it is always necessary to make internal consistency checks. One of the worries is whether the hypothesis heff = n× h = hgr = GMm/v0 is really consistent with TGD inspired quantum biology, or has wishful thinking made its way into the arguments? More precisely, does the nominal value Bend = 0.2 × 10^{-4} Tesla of the "endogenous" magnetic field suggested by the effects of ELF em fields on brain give the electron cyclotron energy E = heff × eBend/(2π m) in the few eV range for the value of n in question?
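The worry can be quantified directly. The sketch below computes the ordinary electron cyclotron energy in Bend and the value of n needed to lift it to ~2 eV; the 2 eV visible-range target is my illustrative choice:

```python
import math

# Sketch: electron cyclotron energy in Bend = 0.2 * 10^-4 Tesla and the
# heff/h = n needed to lift it to ~2 eV (2 eV is an illustrative target).
e = 1.602e-19    # C
m_e = 9.109e-31  # kg
h = 6.626e-34    # J*s
B_end = 0.2e-4   # Tesla, nominal "endogenous" field

f_c = e * B_end / (2 * math.pi * m_e)  # cyclotron frequency, Hz
E_ord_eV = h * f_c / e                 # ordinary-h cyclotron energy in eV
n_needed = 2.0 / E_ord_eV              # n lifting the energy to ~2 eV

print(f"f_c = {f_c:.2e} Hz, E(h) = {E_ord_eV:.2e} eV, n ~ {n_needed:.1e}")
assert 1e8 < n_needed < 1e10
```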
First some background.
Does hgr=heff hypothesis predict that the energy range of dark photons is that of biophotons?
Consider now the question whether the predicted value of n is consistent with the assumption that dark cyclotron photons have energies in the visible and UV range.
How to modify the hgr= heff hypothesis?
It seems that one should modify the hypothesis hgr= heff somehow.
To summarize, the hypothesis hgr = heff predicts a universal dark cyclotron photon spectrum in the bio-photon range only if the dark magnetic flux tubes couple biomatter to the dark part of Earth, which should carry a portion of order 2 × 10^{-4} of the Earth's mass. This means a correction to the earlier picture, which however does not change the overall picture in any manner. The fact that one now has a precise quantitative estimate for the fraction of dark matter makes it easier to tolerate the feeling of embarrassment due to sloppy estimates.
Superfluids apparently defy the gravitational force, as the so-called fountain effect demonstrates. In an arrangement involving a vessel of superfluid inside another one, such that the levels of superfluid are different in the two vessels, the superfluid flows spontaneously along the walls of the vessels as a superfluid film. The flow is from the vessel in which the level of the superfluid is higher, until the heights are equal or all fluid has left the other container. For illustrations see the pictures of the article "Why does superfluid helium leak out of an open container?".
What is strange is that all the fluid flows from the vessel to the other one if the height of the vessel is high enough. According to the prevailing wisdom, superfluid actually consists of ordinary fluid and genuine superfluid. The fluid flows from the vessel as genuine superfluid, so that the process must involve a phase transition transforming the ordinary fluid component present in the fluid to superfluid, keeping the superfluid fraction constant. A further strange feature is that the superfluid flows as a film covering the inner (and also the outer) surface of the entire container, so that a return flow is not possible. This suggests an interpretation as a macroscopic quantum phenomenon.
According to the article of Golovko, the existing wisdom about the flow is that it corresponds to wetting. This would however predict that the phenomenon takes place also above the critical point (λ point) of the ordinary fluid. This is not the case. Secondly, the force responsible for sucking the superfluid from the container would act only at the boundary of the film. As the film covers both the interior and exterior walls of the container, the boundary vanishes, and therefore also the force, so that the flow of the superfluid to the other container should stop. The amount of superfluid leaving the container should be small and equal to the amount of superfluid in the film: this is not the case. Hence the conventional explanation does not seem to work.
TGD inspired model for the fountain effect
What could be the TGD explanation for fountain effect?
A simple model for the situation would be based on Schrödinger equation at the flux quantum which is locally a thin hollow cylinder turning around at the top of the wall of the container.
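As a numerical caricature only (the actual model lives on the flux quantum, not in a 1-D linear potential): for a particle of helium-4 mass in the potential V = mgz the ground state extends over the "quantum bouncer" length l = (hbar^2/2m^2 g)^{1/3}, which grows like n^{2/3} when hbar is replaced by n × hbar, suggesting how a large heff could delocalize the film over the wall height.

```python
# Caricature: ground state extent of a helium-4 atom in V = m*g*z,
# l = (hbar^2 / (2 m^2 g))^(1/3), and its growth under hbar -> n*hbar.
hbar = 1.055e-34  # J*s
m_He = 6.646e-27  # kg, helium-4 atom
g = 9.81          # m/s^2

def bouncer_length(n):
    """'Quantum bouncer' length scale with Planck constant scaled by n."""
    return ((n * hbar) ** 2 / (2 * m_He ** 2 * g)) ** (1 / 3)

l_ordinary = bouncer_length(1)
l_dark = bouncer_length(1e10)  # illustrative large heff/h
assert l_dark > l_ordinary     # larger heff delocalizes the state up the wall
print(f"l(n=1) = {l_ordinary:.2e} m, l(n=1e10) = {l_dark:.2e} m")
```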
What about Sun?
Just for interest one can also look at what one obtains in the case of the Sun: this also leads to a guess for a general formula for the parameter v0.
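For orientation, the solar numbers look as follows; v0 ≈ 145 km/s is Nottale's value for the inner planets, taken here as an input assumption:

```python
# Sketch: hgr/hbar = G*M*m/(v0*hbar) for an electron in the solar field.
G = 6.674e-11     # m^3 kg^-1 s^-2
M_sun = 1.989e30  # kg
m_e = 9.109e-31   # kg, electron
v0 = 1.45e5       # m/s, Nottale's value for the inner planets (assumption)
hbar = 1.055e-34  # J*s

n = G * M_sun * m_e / (v0 * hbar)  # hgr/hbar, a huge integer in this picture
print(f"hgr/hbar ~ {n:.1e} for the electron")
assert 1e18 < n < 1e20
```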