Thursday, January 29, 2009

AWT and definition of intelligence

By AWT, a correct - i.e. physically relevant - definition of intelligence is rather important, as it can give us a clue about the direction of the psychological arrow of time.

From a certain perspective, every free particle appears to be quite an intelligent "creature", because it can unmistakably find the path of optimal potential gradient even inside a high-dimensional field where the interactions of many particles overlap mutually. Whereas a single particle is rather "silly" and can follow just a narrow density gradient, complex multidimensional fluctuations of Aether can follow complex gradients and can even, to a certain extent, avoid wrong paths or obstacles. They're "farseeing" and "intelligent". Note that the traveling of a particle along a density gradient leads to its gradual dissolution and "death": the same forces which keep the particle in motion lead to its gradual disintegration.
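The gradient-following behavior described above can be sketched as plain gradient descent on a potential field. The quadratic "bowl" potential, the step size, and the starting point below are illustrative assumptions, not part of AWT itself - the sketch only shows how a particle settling into the optimal gradient path looks as an algorithm.

```python
# Minimal sketch: a "particle" following the local potential gradient downhill.
# The bowl potential and the step size are illustrative assumptions.

def grad(potential, x, y, h=1e-6):
    """Numerical gradient of a 2D potential via central differences."""
    dx = (potential(x + h, y) - potential(x - h, y)) / (2 * h)
    dy = (potential(x, y + h) - potential(x, y - h)) / (2 * h)
    return dx, dy

def follow_gradient(potential, x, y, step=0.1, iters=200):
    """Move the particle against the gradient until it settles in a minimum."""
    for _ in range(iters):
        gx, gy = grad(potential, x, y)
        x, y = x - step * gx, y - step * gy
    return x, y

bowl = lambda x, y: (x - 1.0) ** 2 + (y + 2.0) ** 2  # single minimum at (1, -2)
x_min, y_min = follow_gradient(bowl, 5.0, 5.0)
print(round(x_min, 3), round(y_min, 3))  # settles near (1, -2)
```

A single gradient follower like this is exactly the "silly" particle of the text: it finds the nearest minimum reliably, but has no memory and no way to avoid obstacles or local traps.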

The ability of people to make correct decisions in such a fuzzy environment is usually connected with social intelligence. We can say the motion of a particle is fully driven by its "intuition": it can react fast in many time dimensions symmetrically (congruently), whereas its ability to interact with the future (i.e. the ability to make predictions) still remains very low, in accordance with the low (but nonzero) memory capacity of a single gradient particle. Nested clusters of many particles are the more clever, the more hidden dimensions they are formed by. The electrochemical waves of neural activity should form a highly nested system of energy density fluctuations.

Nevertheless, if we consider intelligence as "an ability to obtain new abilities", then the learning ability and memory capacity of single-level density fluctuations still remain very low. Every particle has a surface gradient from the perspective of a single level of particle fluctuations, so it has a memory (compacted space-time dimensions) as well. Therefore, for a single object we can postulate the number of nested dimensions inside the object as a general criterion of intelligence. The highly compactified character of the neural network enables people to handle deep levels of mutual implications, i.e. manifolds of causal space defined by implication tensors of high order. Such a definition remains symmetrical, i.e. invariant both to intuitive behavior driven by parallel logic and to conscious behavior driven by sequential logic.

Every highly condensed system becomes chaotic, because the intelligent activities of individual particles are temporary and compensate each other mutually. In this way, the behavior of human civilization doesn't differ very much from the behavior of a dense gas, as we can see from the history of wars and economic crises, for instance. The ability of people to drive the evolution of their own society is still quite limited in general. We can consider such an ability a criterion of social self-awareness. The process of phase transition corresponds to the learning phase of a multi-particle system.

An interesting point is that individual members of such systems may not be aware of the incoming phase transition, because their space-time expands (the environment becomes denser) together with these intelligent artifacts. At a certain moment the environment becomes more conscious (i.e. negentropic) than the particle system formed by it, and a phase transition occurs. The well-known superfluidity and superconductivity phenomena, followed by the formation of a boson condensate, can serve as a physical analogy of the formation of a sectarian community separated from the needs/feedback of the rest of society. Members of such a community can be characterized internally by their high level of censorship (a total reflection phenomenon with respect to information spreading) and, from the outside perspective, by the superfluous homogeneity of their distribution of individual stances, followed by the rigidity and fragility of their opinions (i.e. by the duality of odd and even derivatives in space and time).

AWT explains how even subtle forces of interest between individuals crowded around common targets gradually cumulate into the emergence of irrational behavior. Because such an environment becomes denser, space-time dilation occurs there and everything appears OK from the intrinsic perspective. As a result, nobody in the sectarian community will realize that he has just lost control over the situation.

For example, the people preparing the LHC experiments cannot be accused of evil motives - they just want to do some interesting measurements at the LHC, finish their dissertations, make some money in an attractive job, raise children, learn French, and so on… Just innocent wishes all the time, am I right? But as a whole, their community has omitted serious precautionary principles in the hope that a successful end justifies the means.

The particle model explains how even subtle forces of interest between individuals crowded around common targets gradually cumulate into the emergence of irrational behavior. For example, nobody in this community has taken care of the difference between charged and neutral black holes in their ability to swallow surrounding matter. As a result, no member of such a community realizes the consequences of his behavior until the very end.

And this is quite silly and unconscious behavior, indeed.

AWT and LHC safety risk

The LHC "black hole" issue, disputed (1, 2, 3) and recently reopened (1, 2, 3), is a manifestation of the previously disputed fact that every closed community undeniably becomes sectarian and separated from the needs of the rest of society, like a singularity by a total reflection mechanism. The ignorance of fundamental ideas (Heim theory) or discoveries (cold fusion, surface superconductivity, "antigravity") in favor of risky and expensive LHC experiments illustrates the increasing gap between the priorities of the physics community and the interests of the rest of society.

The power of human inquisitiveness is the problem here: as we know from history, scientists as a whole never care about morality, just about technical difficulties. If they can do something, then they will do it - sooner or later, undeniably. No matter whether it's a nuclear weapon, a genetically engineered virus and/or a collider. What makes trouble at the moment is that the results of such experiments can threaten the whole civilization. We should know about this danger of human nature and we should be prepared to suffer the consequences. Max Tegmark's "quantum suicide" experiment doesn't say how large a portion of the original system can survive the experiment.

So, what's the problem with the planned LHC experiments? Up to this day, no relevant analysis evaluating all possible risks and their error bars is publicly available. The existing safety analyses and reports (1, 2) are very rough and superficial, as they don't consider important risk factors and scenarios, like the formation of charged black holes or the surface tension phenomena of dense particle clusters. There's an obstinate tendency to start the LHC experiments without such analysis and to demonstrate the first successful results even without a thorough testing phase. Because the load of the accelerator was impatiently increased over 80% of nominal capacity during the first days, a substantial portion of the cooling system crashed due to a massive spill (100 tons) of expensive helium, and the monitoring systems of the whole LHC are undergoing extensive upgrade and replacement to avoid avalanche propagation of the same problem over the whole accelerator tube in the future.

Up to these days, the public has no relevant and transparent data about the probability of supercritical black hole formation during the expected period of the LHC lifetime and about the main factors which can increase the total risk above an acceptable level, in particular the risks associated with:

  1. The extreme asymmetry of head-to-head collisions, during which zero-momentum/speed black holes can be formed, so they would have a lot of time to interact with Earth compared to natural protons from cosmic rays. This collision geometry has no counterpart in nature, as it's a product of long-term human evolution, not natural processes.

  2. The avalanche-like character of multi-particle collisions. When some piece of matter appears in the accelerator line, the whole content of the LHC will feed it with new matter incoming from both directions at nearly luminal speed, i.e. in a much faster way compared to the collisions of natural cosmic rays appearing in the stratosphere.

  3. The proximity of a dense environment. Compared to stratospheric collisions of gamma rays, the metastable products of LHC collisions can be trapped by the gravitational field of Earth and interact with it in a long-term fashion. Some models consider that a black hole could move in the Earth's core for years without notice, thus changing the Earth into a time bomb for future generations.

  4. The formation of charged and magnetic black holes. As we know from theory, real black holes should always exhibit nonzero charge and magnetic field as the result of their fast surface rotation. While the force constant of the electromagnetic force is about 10^39 times stronger than that of the gravitational interaction (and the force constant of the nuclear force is even much higher), the omission of such a possibility from the safety analysis is just an illustration of the deep incompetence of high energy physics, and it looks rather like intention than mere omission. It's not so surprising, as every introduction of such a risk into the safety analysis would increase the LHC risk estimations by many orders of magnitude, making them unfeasible in the eyes of society.

  5. The formation of dense clusters of quite common neutral particles, which are stable well outside the LHC energy range (presumably neutrons). This risk is especially relevant for the ALICE experiment, consisting of head-to-head collisions of heavy atomic nuclei, during which a large number of free neutrons can be released in the form of so-called neutron fluid. The signs of tetraneutron existence apparently support this hypothesis. The neutron fluid would stabilize neutrons against decay due to its strong surface tension, in a way analogous to the neutrons inside neutron stars. The risk of neutron fluid formation is connected to a possible tendency to expel protons from atomic nuclei in contact with the neutron fluid, thus changing them into droplets of another neutron fluid by an avalanche-like mechanism, which was originally proposed for the strangelet risk of the LHC.

  6. The surface tension effects of large dense particle clusters, like the various gluonium and quarkonium states, which CAN stabilize even unstable forms of matter, like neutral mesons and other hadrons, up to levels where they can interact with ordinary matter by the mechanism described above under formation of other dense particle clusters, so-called strangelets (a sort of tiny quark stars, originally proposed by Ed Witten). The evidence of these states was confirmed recently for tetra- and pentaquark exotic states. By AWT, the surface tension phenomena are related to the dark matter and supersymmetry effects observed unexpectedly at Fermilab (formation of dimuon states well outside of the collider pipe), as we can explain later. If this connection is confirmed, we aren't expected to worry about strangelet formation anymore - simply because we have observed it already!
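As a quick order-of-magnitude check of the 10^39 figure quoted in point 4, the ratio of the electrostatic to the gravitational attraction between a proton and an electron can be computed directly from CODATA constants (the ratio is distance-independent, since both forces fall off as 1/r²):

```python
# Order-of-magnitude check of the electromagnetic/gravitational force ratio
# for a proton-electron pair (CODATA constants).
k_e = 8.9875517923e9   # Coulomb constant, N*m^2/C^2
G   = 6.67430e-11      # gravitational constant, N*m^2/kg^2
e   = 1.602176634e-19  # elementary charge, C
m_p = 1.67262192e-27   # proton mass, kg
m_e = 9.1093837e-31    # electron mass, kg

# F_coulomb / F_gravity = k_e*e^2 / (G*m_p*m_e); the 1/r^2 factors cancel.
ratio = (k_e * e**2) / (G * m_p * m_e)
print(f"{ratio:.2e}")  # about 2.3e39
```

For two protons the same ratio comes out near 10^36; the 10^39 figure in the text corresponds to the proton-electron pair.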

Compared to black hole formation, the risks of strangelets and neutron fluid aren't connected to the collapse of Earth into a gravitational singularity, but to the release of a vast amount of energy (comparable to that of thermonuclear fusion), during which most matter would be vaporized and expelled into cosmic space by the pressure of a giant flash of accretion radiation.

As I explained already, cosmic ray arguments aren't very relevant to the highly asymmetric LHC collision geometry, so there is no point in repeating them again and again. This geometry - not the energy scale - is what makes the LHC collisions so unique and orthogonal to extrapolations based on highly symmetrical thermodynamics. It's a product of very rare human evolution. The whole of AWT is just about the probability of various symmetries.

So we are required to reconsider the LHC experiments in a much deeper, publicly available and peer-reviewed safety analysis. We should simply apply the scientific method even to the safety analysis of scientific experiments - no less, no more. In my opinion, these objections are trivial and mostly evident - but no safety analysis has considered them so far, for an apparent reason: not to threaten the launch of the LHC. So now we can just ask: who is responsible for this situation and for the lack of persons responsible for a relevant safety analysis of the LHC project, of 7 billion € in total cost?

Safety is the main concern of the LHC experiments. You can be perfectly sure the LHC experiments are safe because of many theories. After all, the main purpose of these experiments is to verify these theories.

Isn't the only purpose of the LHC to verify its own safety at the very end? Is it really enough for everybody?

Tuesday, January 27, 2009

AWT and Bohmian mechanics

This post is a reaction to recent comments by L. Motl (1, 2, reactions) concerning the Bohm interpretation of quantum mechanics (QM), the concept of Louis de Broglie's pilot wave in particular (implicate/explicate order is disputed here). Bohm's holistic approach (he was a proponent of Marxist ideas) enabled him to see the general consequences of this concept much deeper than de Broglie, with his aristocratic origin, did. It's not surprising that Bohm's interpretation has a firm place in the AWT interpretations of various concepts, the causal topology of implications and the famous double-slit experiment in particular. After all, we have a mechanical analogy of the double-slit experiment (DSE) presented already (videos), therefore it's evident that QM can be interpreted by classical wave mechanics without problem.

Single-particle interference observed for macroscopic objects

AWT considers the pilot wave an analogy of the Kelvin waves formed during object motion through a particle environment. The original AWT explanation of the double-slit experiment is that every fast-moving particle creates undulations of the vacuum foam around it, in the same way as a fish swimming beneath the water surface, in analogy to the de Broglie wave.

These undulations are oriented perpendicular to the direction of particle motion, and they can interfere with both slits whenever the particle passes through one of them. The Aether foam temporarily becomes denser under shaking, thus mimicking the mass/energy equivalence of relativity and the probability density function of quantum mechanics at the same moment. The constructive interference makes flabelliform paths of denser vacuum foam, which the particle wave preferably follows, being focused by the denser environment, thus creating interference patterns at the target.
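Whatever the mechanism behind the wave, the resulting pattern on the target follows the classical two-slit superposition formula I(θ) ∝ cos²(πd·sinθ/λ) for two narrow slits. A minimal sketch (the slit separation and wavelength below are arbitrary illustrative values):

```python
import math

# Classical two-slit interference: for two narrow slits separated by d,
# the far-field intensity is I(theta) = cos^2(pi * d * sin(theta) / lam).
d = 10e-6    # slit separation, m (illustrative)
lam = 500e-9 # wavelength, m (illustrative, green light)

def intensity(theta):
    """Normalized screen intensity at diffraction angle theta."""
    phase = math.pi * d * math.sin(theta) / lam
    return math.cos(phase) ** 2

# Bright fringes where d*sin(theta) = m*lam; dark fringes halfway between.
bright = intensity(math.asin(1.0 * lam / d))  # first-order maximum
dark = intensity(math.asin(0.5 * lam / d))    # first minimum
print(round(bright, 6), round(dark, 6))
```

The same cos² fringe pattern arises regardless of whether the wave is interpreted as a quantum amplitude, a pilot wave, or an undulation of the vacuum foam - which is exactly why the interpretations are hard to distinguish at the screen.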

By AWT, the de Broglie wave, or even the quantum wave itself, is a real physical artifact. The fact that they cannot be observed directly by using a light wave follows from Bose statistics: the surface waves penetrate each other mutually, so they cannot be observed mutually. But by Hardy's theorem, a weak (gravitational or photon coupling) measurement of object location without violating the uncertainty principle is possible. What we can observe is just the gravitational lensing effect of the density gradients (as described by the probability function) induced by these waves in the vacuum foam through the thickening effect during shaking.

Another question is whether the pilot wave concept supplies a deeper insight, or even other testable predictions, than for example the time-dependent Schrödinger equation does. In my opinion it doesn't, or it's even a subset of the information contained in the classical QM formalism. This doesn't mean that in certain situations the pilot wave formalism cannot supply a useful shortcut for a formal solution (in the same way as, for example, Bohr's atom model) - whereas in other cases it can become more difficult to apply than other interpretations.

Monday, January 26, 2009

AWT and definition of observable reality

When comparing contemporary physical theories, a natural question emerges immediately: if AWT is proclaimed to be more general than, for example, the various quantum field or quantum gravity theories, shouldn't it lead to even more solutions than these theories can supply? And if vagueness is the main objection against these theories, why should we care about AWT, after all?

The truth is, AWT can lead to a virtually infinite number of solutions, because even in a quite limited particle system the number of possible states increases extremely fast. But AWT introduces a gradient-driven concept of reality, which is probability driven. Many results of particle-particle collisions simply aren't probable, because they're too rare. Therefore we can see only density gradients inside a dense particle system, not the particles or intermediate states as such. The concept of gradient-driven reality is apparently anthropocentric, but it can be derived from the AWT concept independently, because only artifacts which were created by the long-term evolution of a high number of mutations, i.e. by causal time events, can interact with reality in a gradient-driven way.

The probability-based approach built on particle statistics brings a rather strict restriction on the number of possible solutions of every fuzzy theory. String theorists are aware of this opportunity, so they're trying to apply a statistical approach to the landscape of string theory predictions as well. But because the number of predictions of string theory (~10^500) roughly corresponds to the number of particle states inside the observable portion of the Universe, such an approach is phenomenologically identical to AWT if we simply omit the whole intermediate step related to the tedious string theory formalism (which serves only as a random number generator) - and if we apply Boltzmann statistics to these states directly.

In this way, AWT wins over formal theories in simplicity (i.e. by the Occam's razor criterion), just because it introduces a gradient-driven definition of observable reality into physics, thus reducing the number of possible observable states in it: every object can be observed if and only if it contains some space-time gradient from a sufficiently general perspective. For example, the (movement of) density gradients inside condensing supercritical vapor can be observed, while the molecules (and their motion) themselves cannot. The single Aether concept, i.e. the material conditional (antecedent), is sufficient for such a decision if we apply an observability criterion (consequent), thus introducing the basic implication vector on which AWT is based: if the Universe is formed by a chaotic/particle environment, then every fluctuation evolved/emerged in it via a (number of) causal events would see only the (same number of) causal gradients of it (... and we can predict the appearance of this observable reality in a unique way). In this way, we can always see exactly that part of the Universe which has served for our evolution (space-time emergence), and the observable scope of reality expands gradually. This is the way Bohm's implicate/explicate order may be understood in the context of AWT, because the implication vector defines a time arrow of causal space-time curvature and its subsequent compactification.

The testability of the AWT intrinsic perspective is provided by a nonscalar implication vector, which is based on a nonsingular (zero or infinite) order of the axiomatic tensor. Outside of this perspective, AWT remains inherently a tautology, which is given by the fact that no assumption can consider itself - or, less generally, that no object of observation can serve both as the means and as the subject of the same observation at the same point of time and space. The Aether concept itself remains a tautology, as it cannot be proven by observation and causal logic without violating this logic in a less or more distant perspective, in the same way as the God concept.

It can be demonstrated easily that many conceptual problems of contemporary science simply follow from the fact that scientists have no clue what is observable and what is not, because of the lack of a relevant definition of observable reality. In this way, many possible combinations would simply disappear from the testable predictions if we applied gradient-driven statistics, or the Lagrange/Hamilton mechanics based on it. In particular, the misinterpretation of the results of the M-M experiment just follows from the fact that scientists didn't realize that the motion of an environment isn't observable by the waves of that environment. The refusal of de Broglie/Bohmian mechanics is a misunderstanding of the same category: scientists didn't realize that the de Broglie wave cannot be observed (so easily) by a light wave, being a wave of the same environment, so the lack of experimental evidence of the de Broglie wave cannot serve as evidence against Bohmian mechanics.

AWT, emergence and Hardy's paradox

Recently, fundamental experimental evidence of Hardy's paradox was given, which basically means quantum mechanics isn't a purely statistics-based theory following Bell inequalities anymore. The non-formal understanding of this paradox is easy: if every combination of mutually commutable quantities cannot be measured with certainty, how can we be sure about it? Could some combination exist which violates such uncertainty? In this way, the uncertainty principle of quantum mechanics violates itself in the background, thus enabling so-called "weak" measurements.

This was demonstrated recently for the case of entangled photon pairs - it can serve as evidence that even photons have a distinct "shape", which is a manifestation of the rest mass of the photon. This is because the explicit formulation of quantum mechanics neglects the gravity phenomena and the rest mass concept in the background: by the Schrödinger equation, every particle should gradually dissolve into the whole Universe - which violates everyday observations, indeed. Such behavior is effectively prohibited by the acceleration following from the omnidirectional Universe expansion, i.e. the gravity potential, so that every locatable particle has a nonzero surface curvature and is conditionally stable at the human scale. From the nested character of Aether fluctuations it follows that more than a single level of "weak" measurement should be achievable here. After all, the fact that we can interact with other people and objects without complete entanglement can serve as evidence that "weak" observation is very common at the human scale.
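The "dissolving" of a free particle under the Schrödinger equation mentioned above is quantitative: a free Gaussian wave packet of initial width σ₀ spreads as σ(t) = σ₀·√(1 + (ħt/2mσ₀²)²), a standard textbook result. A sketch with illustrative electron-scale numbers:

```python
import math

HBAR = 1.054571817e-34  # reduced Planck constant, J*s
M_E = 9.1093837e-31     # electron mass, kg

def packet_width(sigma0, t, m=M_E):
    """Width of a free Gaussian wave packet after time t (standard QM result)."""
    return sigma0 * math.sqrt(1.0 + (HBAR * t / (2.0 * m * sigma0**2)) ** 2)

sigma0 = 1e-10  # initial width ~ atomic scale, m
# After one second of free evolution, an electron packet initially confined
# to atomic size has spread to macroscopic dimensions.
w = packet_width(sigma0, 1.0)
print(f"{w:.1e} m")
```

The spread comes out on the order of hundreds of kilometres after a single second, which is the sense in which the free Schrödinger evolution, taken alone, contradicts the everyday stability of localized objects.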

By AWT, every strictly causal theory violates itself in a less or more distant perspective due to emergence phenomena. While the classical formulation of general relativity remains seemingly self-consistent (being strictly based on a single causality arrow), a deeper analysis reveals that the derivation of the Einstein field equations neglects the stress-energy tensor contribution (Yilmaz, Heim, Bekenstein and others), which is the result of mass-energy equivalence. This approach makes relativity an implicit and infinitely fractal theory, in the same way as quantum mechanics (which is its AdS/CFT dual theory). For example, gravitational lensing, the multiple event horizons of charged black holes and/or dark matter phenomena can serve as evidence of the spontaneous symmetry breaking of time arrows and a manifestation of quantum uncertainty and supersymmetry in relativity. This uncertainty leads to a landscape of many solutions for every quantum field or quantum gravity theory based on a combination of mutually inconsistent (i.e. different) postulates.

Such behavior follows Gödel's incompleteness theorems, by which the formal proof of rules valid for sufficiently large sets of natural numbers becomes more difficult than these rules themselves - thus remaining unresolvable by their very nature. This is a consequence of emergence, which introduces a principal dispersion into the observation of large causal objects and/or phenomena; it cannot be avoided, or such artifacts wouldn't be observable anymore. In this way, every strictly formal (i.e. sequential-logic-based) proof of a natural law becomes violated in a less or more distant perspective, and it follows the "More is Different" theorem. AWT demonstrates that this emergence is accompanied by causal (i.e. transversal-wave-based) energy spreading through a large system of scale-invariant symmetry fluctuations (unparticles), which behave like soap foam with respect to light spreading and enable observing the universe (and all objects inside it) both from the extrinsic and from the intrinsic perspective simultaneously. The mutual interference of these two perspectives leads to the quantization of observable reality, which is intrinsically chaotic, extrinsically causal by its very nature.

In this connection it's useful (...and sometimes entertaining) to follow the deductions of formally thinking theorists, like Lubos Motl, whose strictly formal thinking undeniably leads him into deep contradiction/confrontation with common sense, and occasionally with the whole rest of the world. It may appear somewhat paradoxical that just a fanatic proponent of string theory - which has introduced the duality concept into physics - has such a deep problem with dual/plural thinking. This paradox is still logical, though, if we realize how complex string theory is and how strictly formal the thinking it requires for its comprehension.

In this way, the "emergence group" of dense Aether theory makes understanding observable reality a quite transparent and easy task at a sufficiently general level. It still doesn't mean there isn't a lotta things to understand at the deeper levels, dedicated to the individual formal theories.

Thursday, January 15, 2009

AWT, theories and Gödel's incompleteness theorems

By AWT, scientific (i.e. causal-logic-based) theories are simply density fluctuations of the scale-invariant Aether environment, like any others. Human understanding is energy-density driven, and theories accelerate the speed of energy/information propagation through the environment (a human society) in the same way as asymmetric density fluctuations (gradients) accelerate the asymmetric spreading of energy in transversal waves through a particle environment.

Being physical artifacts, even seemingly abstract theories have an independent, tangible impact on observable reality. For example, the aerial view below illustrates the appearance of two neighboring countries (Austria and the former Czechoslovakia), which differ just by their theories of social arrangement, not by natural conditions. The appearance of the landscape in the country applying a socially oriented theory leading to less diversity is apparently less divergent as well. It still doesn't mean the more divergent theory is necessarily better, though, because it's suited just for a richer and more divergent environment - but this is another story.

Because the scope of density fluctuations inside a nested field of density fluctuations is always limited, the scope of theories must remain limited as well. This is because every theory is based on at least a single causal/logical connection between two or more axioms/postulates/assumptions, i.e. an implication tensor defining the cardinality and compactness/consistency of the formal logic system built upon the implication. But the consistency of two different postulates can never be confirmed with certainty - or we could replace them by a single one, and we would never have an implication between them anymore, but a tautology. In this way, the scope of every logic is limited, because it remains based on intrinsically inconsistent axioms - or we couldn't have any logic at all. In particular, at the moment a TOE defines a time arrow, it becomes tautological, because the validity of every implication depends on the time arrow vector of the antecedent and the consequent. Such a conclusion leads us to the understanding that every Theory Of Everything (a TOE based on no assumptions) is necessarily tautological by its very nature, in the same way as the dual concept of God - and as such not very useful in a causal perspective for the rest of society.

Gödel's incompleteness theorems (GITs) show that, for any sufficiently complex mathematical system, one of the following two statements is true. Either
  1. There are true statements, expressible within the mathematical system, that cannot be proven from the axioms of that mathematical system. Or:
  2. There are false statements, expressible within the mathematical system, that can be proven from the axioms of that mathematical system.
Most mathematicians lean towards (1), because (2) would basically imply that formal math is BS (causal bifurcations related to imaginary numbers or division by zero are particularly good for it). But (1) is just a limitation upon what can be proven by mathematics: there are true statements, which you can perfectly describe in mathematical terms, which cannot be proven by mathematics.

The whole of the GITs is about this dilemma, but the AWT explanation appears more intuitive and general. The GITs were derived for the theory of the natural number set based on the axioms of Peano arithmetic, which is supposedly the best-defined human theory (of countable units) so far. The existence of other theories is based on fuzzier logic, including the definition of theory itself. We can still consider AWT more general than any number theory, because the (natural) number concept is based on countable units, i.e. singular zero-dimensional particles colliding mutually in an infinitely dimensional space, whereas differential calculus is based upon the concept of observable reality driven by Aether density gradients.

Without the particle concept, the number concept is unthinkable - unless we accept that we are composed of pure numbers - which doesn't appear very probable, because number theory is a product of human evolution and as such is much younger than the Universe, not vice versa. In this way, AWT works even in the case of singular geometries and fuzzy algebras.

Donald Rumsfeld: "As we know, there are known knowns; there are things we know we know. We also know there are known unknowns; that is to say we know there are some things we do not know. But there are also unknown unknowns - the ones we don't know we don't know."

Saturday, January 3, 2009

AWT and human scale

By AWT, the Universe appears to be formed by an infinitely nested field of density fluctuations of Aether. The human brain is one such fluctuation; due to its large time scale it can interact with/observe a huge portion of space-time, both into the past of the Universe expansion (the cosmic scale) and into its future (the Planck scale). Because of the symmetry of mutual interaction, the human scale appears exactly at the middle of the observable space-time scale. The human scale is defined by the average size of neurons inside the human brain (the lowest entropy observable inside our Universe generation) or by the wavelength of the cosmic microwave background radiation (CMB) (about 1.7 cm), which is apparently chaotic (the highest entropy density observable). Under favorable conditions, the violation of Lorentz symmetry can be observed by the naked eye, as Brownian motion at the Planck scale or as gravitational lensing at the cosmic scale - due to CPT symmetry violation, the Planck scale appears closer to the human scale than the cosmic one.

The 1.7 cm wavelength is invariant with respect to the AdS-CFT duality, because it corresponds to the wavelength at which the character of energy spreading changes from longitudinal waves to transversal ones. From the AWT perspective, the CMB corresponds to the capillary waves at the water surface, which spread along it at the lowest speed at a wavelength of 1.73 cm from the extrinsic perspective, enabling interaction with as large a space-time as possible and allowing the most advanced evolution of matter inside it. Classical quantum mechanics cannot handle gravity (phenomena) at all, and quantum noise blurs into CMB noise above the human scale, in the same way as relativity is limited in its predictions by CMB noise (the GZK limit, the CMB Doppler anisotropy, etc.).
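The 1.73 cm figure is the standard result for capillary-gravity waves on clean water: the phase speed c(λ) = √(gλ/2π + 2πσ/ρλ) has its minimum at λ = 2π√(σ/ρg), with c_min = (4gσ/ρ)^¼ ≈ 23 cm/s. A quick check, using water's surface tension and density at about 20 °C:

```python
import math

# Capillary-gravity wave phase speed on deep water:
#   c(lam) = sqrt(g*lam/(2*pi) + 2*pi*sigma/(rho*lam))
# Its minimum lies at lam_min = 2*pi*sqrt(sigma/(rho*g)).
g = 9.81        # gravitational acceleration, m/s^2
sigma = 0.0728  # surface tension of water at ~20 C, N/m
rho = 998.0     # density of water, kg/m^3

lam_min = 2 * math.pi * math.sqrt(sigma / (rho * g))  # slowest wavelength
c_min = (4 * g * sigma / rho) ** 0.25                 # minimum phase speed
print(f"lambda_min = {lam_min*100:.2f} cm, c_min = {c_min*100:.1f} cm/s")
```

The computation confirms the 1.7 cm value quoted in the text as the wavelength of the slowest surface waves; that this water-surface number coincides with any cosmological scale is, of course, the AWT analogy itself.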

From the cosmological perspective, the wavelength of the CMB (1.7 cm) corresponds to the outer diameter of the Universe, or to the wavelength of the Hawking radiation of a tiny black hole whose lifespan corresponds to the age of our Universe generation (13.7 Gyr) - so we can say the CMB is the Hawking radiation of the black hole we are living in, i.e. the red-shifted radiation of the most distant quasars. The foamy character of energy spreading makes it possible to see the event horizons of our Universe both from inside and from outside via the CMB radiation (i.e. the event horizons of the most distant quasars observable). Longer waves (gravitational waves) and shorter waves (gamma radiation) are of limited scope compared to the CMB due to dispersion (the GZK limit), in analogy to capillary waves spreading at a water surface (compare the celerity curve below). The energy density of 3D space-time (roughly given by the third power of the Planck constant, i.e. 10E+96 J/m3) corresponds to the mass density of the black hole which is forming it.

From AWT it follows that every Aether fluctuation with a diameter below 1.7 cm will dissolve into photons and neutrinos, while larger objects will collapse into heavier objects and evaporate in the same way. Black holes with a diameter below 1.7 cm can evaporate via Hawking radiation within the observable Universe's lifespan, while larger ones will evaporate via accretion radiation - so we can say that such objects are the most stable objects inside the observable Universe generation, and that accretion radiation is AdS/CFT dual to Hawking's (massive objects below the 1.7 cm scale falling into the event horizon would appear like tiny quantum fluctuations from a distant perspective outside the black hole, due to the immense space-time compactification around it). The density of the largest black holes existing inside the observable Universe (with 10^9 times the mass of the Sun and an event horizon radius of roughly 10^9 km) should be comparable with the human scale (1 kg per dm3).
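The density claim can be checked with a back-of-the-envelope Schwarzschild estimate: since the mean density inside the horizon scales as 1/M², sufficiently massive holes do reach everyday densities. A minimal sketch for the 10^9 solar mass case mentioned above (constants rounded; the hole is treated naively as a uniform sphere of radius r_s):

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
M_sun = 1.989e30   # solar mass, kg

M = 1e9 * M_sun                      # a 10^9 solar-mass black hole
r_s = 2 * G * M / c**2               # Schwarzschild radius, m
rho = M / (4/3 * math.pi * r_s**3)   # mean density inside the horizon

print(f"r_s = {r_s / 1000:.3e} km")  # ~ 3e9 km
print(f"rho = {rho:.1f} kg/m^3")     # ~ 18 kg/m^3
```

This naive estimate gives a radius of about 3×10^9 km and a mean density of roughly 18 kg/m³, i.e. within a couple of orders of magnitude of the human-scale density quoted above; a hole of roughly 10^8 solar masses would match the density of water exactly.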

Curvature instability is scale invariant. During the Big Bang event all particles were formed by supersymmetric gravitons, whose average size corresponded to the wavelength of the CMB photons. During the Universe's evolution the larger gravitons condensed into the particles and objects of observable matter, while the smaller fluctuations evaporated into antiparticles of matter, which were dispersed by their repulsive gravity into the clouds of dark matter surrounding the objects of normal matter. The same criterion can be applied to the formation of planets and planetoids, or even to the predator-prey relationships of the biosphere. Only pieces larger than some 1.7 cm can serve as nuclei for accretion and subsequent gravitational growth; otherwise they become dispersed by the radiation pressure of the CMB photons. The smaller pieces of matter tend instead to condense as a whole into large clusters (large meaning > 1.7 cm).

From AWT it follows that the size of photons is given by the interference of the light wave with the graviton background at the Planck length scale, which forms the quantum foam background of the Universe. From the interference condition it follows that the size of the wave packet equals the wavelength exactly at the 1.7 cm scale, which effectively means that microwave photons serve both as particles and as waves, i.e. in the same way as the gravitons of the previous generation of the Universe, expanded during inflation, or as the graviton waves of a future generation of the Universe before its gravitational collapse. Photons of larger wavelength cannot exist, because they tend to condense spontaneously with the smaller ones into solitons of negative rest mass (axions, or the so-called tachyon condensate).

Even tiny droplets and bubbles in mixtures tend to shrink and evaporate below the 1.7 cm scale, while larger droplets and bubbles expand and fragment. The least stable droplets of 1.7 cm diameter (liposomes) could have started the evolution of life at the shallow places of the ancient oceans (i.e. inside a multiphase environment of the largest possible complexity). Their repeated breakdown by surf waves enabled them to compete for the collection and/or (later) production of surfactants, which enabled them to remain as stable as possible. The whole evolutionary process lasted the whole age of the Universe, because AWT makes no conceptual difference between the evolution of inorganic matter and that of organic life. Therefore it is nothing very strange that the quantum nature and size of the neural standing waves correspond to the size of the Universe scope perceivable just by these waves (i.e. the quantum gravity standing wave forming the observable Universe generation). The increasing density of the Universe resulting from the vacuum foam collapse corresponds to the expansion of the scope of human consciousness, capable of comprehending an increasing space-time portion of the Aether chaos complexity over time.

The anthropocentric question of whether the 1.7 cm distance scale was adjusted by evolution, or whether it just enables the best visibility of the Universe, remains a tautology in Aether theory, because from AWT it follows that every object which is the product of a more or less long-term evolution tends to remain adapted to its environment, and vice versa. The scope of the observable Universe always depends on the entropy density of the observer (i.e. on the number of time events/mutations involved) - primitive organisms see their Universe as smaller, more intelligent ones as correspondingly larger.

Lord Byron: "Truth is always strange — stranger than fiction."

Friday, January 2, 2009

Motivations of Aether Wave Theory

AWT isn't based on any mysticism at all - on the contrary. AWT is based on the Boltzmann gas model, the basic system used for the definition of thermodynamic energy. Furthermore, this model isn't ad hoc at all. It's based on the understanding that from a sufficiently distant perspective every object appears like a pin-point particle, so every complex interaction in such a system can be modeled by a system of colliding particles. For example, people are complex objects, but if we observed them from a sufficient altitude, they would appear and behave like a chaotic 2D gas composed of colliding particles. It's a natural reduction of virtually every physical system.

Despite its conceptual simplicity, this system becomes irreducibly complex with increasing particle density, because it forms fractally nested density fluctuations composed of density fluctuations. Such behavior can be both simulated by computers and modeled by dense gas condensation (the supercritical fluid in the picture at right), and the resulting complexity is limited only by the available computational power. This means the AWT principle makes it possible to model systems of arbitrary complexity just by recursive application of a trivial mechanism. If nothing else, we should consider this model because of its simplicity, and because nobody has yet proposed it for modeling observable reality.
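The claim that even a trivial particle system develops density fluctuations spontaneously can be illustrated with a toy model (a hypothetical minimal sketch, not the ab-initio simulation referred to above): scatter non-interacting gas particles uniformly into a grid of cells and measure how the cell occupancies fluctuate. The relative spread follows the Poisson expectation of roughly 1/sqrt(mean occupancy):

```python
import random
import statistics

random.seed(42)
N = 100_000    # number of gas particles
cells = 100    # cells per side of a 100x100 grid

# Scatter particles uniformly and count how many land in each cell.
counts = [[0] * cells for _ in range(cells)]
for _ in range(N):
    x = random.randrange(cells)  # equivalent to binning a uniform position
    y = random.randrange(cells)
    counts[x][y] += 1

flat = [c for row in counts for c in row]
mean = statistics.fmean(flat)            # expected N / cells^2 = 10
spread = statistics.pstdev(flat) / mean  # relative density fluctuation

print(f"mean occupancy       = {mean:.2f}")
print(f"relative fluctuation = {spread:.3f}")  # ~ 1/sqrt(10) = 0.32
```

Adding attractive interactions on top of this uncorrelated baseline is what makes the fluctuations grow and nest, as in the supercritical fluid example.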

The main reason for the reintroduction of Aether theory into mainstream physics is a better, more consistent and more universal understanding of the fundamental connections of reality. Most of these motivations were never presented by mainstream physics, and at the same time they form the theorems, i.e. the testable predictions, of AWT, because they can be derived from ab-initio simulation of the nested density fluctuations of a Boltzmann particle gas. The list below will be extended with new ideas occasionally.
  1. Explanation of energy spreading by light
    The spreading of inertial energy requires an inertial environment. We cannot use the energy concept for the spreading of light waves while ignoring the mass concept, the mass-energy equivalence in particular.
  2. Explanation of wave character of light.
    Only a system of mutually colliding particles can spread energy in waves; the vacuum shouldn't be any exception.
  3. Explanation of finite frequency of light.
    Only a system of nonzero mass density can spread waves of finite frequency, as follows from the wave equation.
  4. Explanation of high light energy density/frequency achievable.
    Classical models of the luminiferous Aether were based on a sparse gas model of Aether, which cannot spread waves of energy density corresponding to gamma or cosmic radiation frequencies.
  5. Explanation of light speed invariance.
    Light speed invariance is a consequence of the Aether concept and of the fact that the speed of light is the fastest energy spreading observable (if we neglect gravitational waves, which are too faint to be observed), so we can use only light for the observation of reality, of light speed/spreading in particular.
  6. Explanation of absence of reference frame for light spreading in vacuum.
    If we use light for the observation of light spreading in the luminiferous Aether, its motion/reference frame can never be observed locally just by using light waves, because no object can serve as both the subject and the means of observation at the same moment.
  7. Explanation/prediction of transversal character of light waves.
    In a particle environment, only transverse waves can remain independent of the environment's reference frame, in the same way as the motion of capillary waves at a water surface.
  8. Explanation/prediction of foamy structure of vacuum.
    Only a foam structure composed of "strings" and "(mem)branes" can spread energy in transverse waves through a bulk particle environment (string and brane theories) and/or provide the properties of an elastic fluid composed of "spin loop" vortices (LQG theory).
  9. Explanation/prediction of two vector character of transversal light waves.
    Only a nested foam structure can promote light spreading in two mutually perpendicular vectors of electric and magnetic intensity (1, 2). The formation of nested density fluctuations can be observed experimentally during the condensation of a supercritical fluid (1).
  10. Explanation/prediction of uncertainty principle.
    The transverse character of surface waves is always violated on behalf of the underwater waves. Inside an inhomogeneous particle system, energy always spreads in both transverse and longitudinal waves, which violates the predictability/determinism of energy spreading and introduces indeterminism into the phenomena mediated/observed by using it.
  11. Explanation/prediction of particle/wave duality.
    Every isolated energy wave (a soliton) temporarily increases the Aether foam density, in the same way as soap foam becomes denser during shaking due to spontaneous symmetry breaking. As a result, every soliton spreads as a more or less pronounced gradient/blob of Aether density, and it bounces from the internal walls of the surface gradient of such a blob like a standing wave packet, i.e. a particle (1).
  12. Explanation/prediction of virtual particles.
    The concept of virtual particles, which appear and disappear temporarily in the vacuum, is typical behavior of the density fluctuations inside every gas or fluid, and physics knows no other way in which such behavior could be realized.
  13. Explanation/definition of time dimension and space-time concept.
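Items 2 and 3 above lean on a standard result: a chain of massive particles coupled elastically is the discrete model behind the wave equation, and it propagates disturbances at a finite speed set by its mass density. A minimal sketch with hypothetical unit masses and unit spring constants:

```python
# 1D chain of unit masses coupled by unit springs:
#   u_i'' = u_{i+1} - 2*u_i + u_{i-1}   (the discrete wave equation)
# A displacement held at the left end launches a front that travels
# roughly one site per unit time; the far end stays at rest until it arrives.
n = 200
u = [0.0] * n  # displacements
v = [0.0] * n  # velocities
u[0] = 1.0     # left end suddenly displaced and held
dt = 0.1

def step(u, v, dt):
    # Symplectic Euler: acceleration from neighbors, then velocity, then position.
    a = [0.0] * len(u)
    for i in range(1, len(u) - 1):
        a[i] = u[i + 1] - 2 * u[i] + u[i - 1]
    for i in range(len(u)):
        v[i] += a[i] * dt
        u[i] += v[i] * dt

for _ in range(1000):  # integrate up to t = 100
    step(u, v, dt)

# Locate the rightmost noticeably disturbed site (roughly site 100-115 at t=100).
front = max(i for i in range(n) if abs(u[i]) > 1e-3)
print(f"front position at t=100: site {front}")
```

A chain with zero mass density would transmit the displacement instantly, which is the point of item 3: finite frequency and finite propagation speed require nonzero inertia in the medium.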

"..People have often tried to figure out ways of getting these new concepts. Some people work on the idea of the axiomatic formulation of the present quantum mechanics. I don't think that will help at all. If you imagine people having worked on the axiomatic formulation of the Bohr orbit theory, they would never have been led to Heisenberg's quantum mechanics. They would never have thought of non-commutative multiplication as one of their axioms which could be challenged. In the same way, any future development must involve changing something which people have never challenged up to the present, and which will not be shown up by an axiomatic formulation..."

Paul A. M. Dirac, "Development of the Physicist's Conception of Nature", in The Physicist's Conception of Nature, ed. Jagdish Mehra, D. Reidel, 1973, pp. 1-14.