Saturday, 28 February 2009

Splendors and Miseries of Conservatism

This post is a reaction to news reports that fourteen-year-old Jonathan Krohn has formulated a manifesto of conservatism in his book Define Conservatism. The common criticism of conservatism targets the vagueness of this philosophy, particularly in relation to liberalism.

Click to view Jonathan's introduction

According to this page (which is somewhat biased against liberalism), conservatism is defined by the following six rather opaque principles:

  1. Belief in natural law.
    Here we often meet a biased stance, because proponents of conservatism tend to neglect exactly those laws which support the synergies and evolutionary advantages of collectivism and strong central government.
  2. Belief in established institutions.
    Belief in institutions, the government in particular, makes belief in the individual somewhat problematic, because established institutions tend to subdue individual freedom quite often. There is no intersubjectively accepted criterion for how far institutions can affect the life of the individual without violating individual freedom.

  3. Preference for liberty over equality.
    This may sound good, but in principle the freedom of the individual begins exactly where the freedom of others ends, so the maximal freedom of the individual exists only in a completely egalitarian society.

  4. Suspicion of power—and of human nature.
    This is a vague stance as well, because it is precisely the established institutions that disperse real power. Human nature can lead to the misuse of conservative principles in the same way as it leads to the misuse of libertarian ones.

  5. Belief in exceptionalism.
    This belief often manifests as biased meritocratic elitism, which quite often defies individual rights.

  6. Belief in the individual.
    The general reason why individuals organize themselves into "established institutions" is precisely to promote collective opinion (groupthink) over the opinion of individuals.

It's evident that all principles of conservatism are rather weakly defined and supersymmetric by their very nature: abuse of any one of them immediately leads to the violation of other conservative principles. The generally high tendency toward manipulation and hypocrisy in the conservative stance manifests itself in various ways. From AWT it follows that the only relevant stance from a long-term perspective is a strictly balanced one, which considers both the individual and the democratic principles of social arrangement. As the density of society increases, the need for a balanced approach gradually becomes more pronounced, and it converges to a duality of the conservative and liberal approach in a 1:1 ratio. In practical terms, this means that "Tax Freedom Day" ("Cost of Government Day") should converge toward the end of July, for example; a sketch of this arithmetic is given below.
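
As a side note, a minimal sketch of the mapping between an overall tax share and a "Tax Freedom Day" date, under the simplifying assumption (mine, not a claim of the AWT model) that the share of the calendar year worked "for the government" equals the tax share of income; the shares used are purely illustrative:

    from datetime import date, timedelta

    def tax_freedom_day(tax_share, year=2009):
        # Assume the fraction of the year "worked for the government" equals the
        # overall tax share of income, and count that many days from January 1st.
        days_in_year = (date(year + 1, 1, 1) - date(year, 1, 1)).days
        return date(year, 1, 1) + timedelta(days=round(tax_share * days_in_year))

    # Illustrative shares only, not measured data:
    print(tax_freedom_day(0.50))   # an exact 1:1 split of profit and taxes -> 2009-07-02
    print(tax_freedom_day(0.57))   # a somewhat higher share -> late July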

With respect to the above definition, the main principles of conservatism as postulated in Krohn's book appear even more vague and childish, if not manipulative. They imply, in the background, that opponents of conservatism do not believe in "life", "personal responsibility" or "founding principles", which turns them into a sort of naive demagogy. For example, communism relies strongly on personal responsibility and its own founding principles, and it becomes a utopian ideology in the same way as Jonathan Krohn's definition of conservatism. The idealization and ideologization of conservatism can be perceived as a natural defensive reaction to the ongoing financial crisis, which implies a temporary need for public interventions, and it illustrates how deeply the guardians of traditional "conservative values" have become confused by the recent situation.

“...Owners of capital will stimulate the working class to buy more and more of expensive goods, houses and technology, pushing them to take more and more expensive credits, until their debt becomes unbearable. The unpaid debt will lead to bankruptcy of banks, which will have to be nationalized, and the State will have to take the road which will eventually lead to communism...”— Karl Marx, 1867, Das Kapital (a hoax?)

Sunday, 22 February 2009

AWT and gamma ray dispersion

Markarian (Mkn) 421 and Mkn 501 are two relatively nearby (z=0.034) galaxies about 300-500 million light years (roughly 0.5 Gly) away. Mkn 501 is a bright optical object (blazar) in the active galactic nucleus (AGN) of a giant elliptical (young) galaxy, probably containing a black hole binary, and it exhibits the high-energy cutoffs predicted to result from intergalactic annihilation interactions, primarily with infrared photons. In accordance with this, the MAGIC telescope collaboration published an observation of a 30-second delay of a two-minute 10 TeV gamma-ray burst (GRB) from summer 2005 with respect to the low-energy signal, i.e. a high-energy tail, which can be explained by an intergalactic cascade of gamma rays (see "On the origin of highest energy gamma-rays from Mkn 501"). Without the presence of an external gravitational/magnetic field, it could be explained by several models of quantum gravity as a result of the dispersion of gamma rays on the quantum foam structure of the vacuum at the 10^7 eV scale.



In comparison, the GZK limit considers dispersion by CMB photons only at the 10^19 eV scale. This was tested by the recent huge, 16-second burst GRB 080916C from September 2008, observed by the Fermi/GLAST observatory (working at the 300 GeV scale), during which a six-second delay separating the highest-energy emissions from the lowest was observed. The distance was estimated from the wavelength drop-off of the afterglow caused by intervening gas clouds (afterglow photometric redshift) at 12.2 Gly (z=4.35), which places GRB 080916C among the top 5% most distant GRBs and makes it the most energetic GRB known to date.



While these results were immediately interpreted by string theory proponents as a negative result for Lorentz symmetry violation, we should realize that dispersion is an exponential function of wavelength, that the energy scale of the Fermi/GLAST observatory is 30x lower than that of the MAGIC observatory, and that the GRB event was 8x shorter - so the five-times-shorter delay is still comfortably acceptable here. But if we also consider that GRB 080916C was 25x more distant than the Mkn 501 blast, it is apparent that the mechanism of the high-energy tail is still the same, i.e. of intergalactic origin, not a result of the dispersion of gamma-ray photons in extragalactic space in the absence of magnetic and gravitational fields.
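
A back-of-the-envelope sketch of the scaling comparison above; the proportionality assumptions (lag scaling linearly with energy, duration and distance) are my own simplification, and the ratios are simply those quoted in the text:

    # Toy comparison of the MAGIC Mkn 501 delay with the Fermi GRB 080916C delay,
    # assuming (as a gross simplification) that the observable lag scales in direct
    # proportion to photon energy, burst duration and source distance.
    magic_delay_s  = 30.0      # delay reported above for the Mkn 501 flare
    fermi_delay_s  = 6.0       # delay reported above for GRB 080916C
    energy_ratio   = 1 / 30.0  # Fermi/GLAST energy scale ~30x lower than MAGIC's
    duration_ratio = 1 / 8.0   # GRB 080916C was ~8x shorter
    distance_ratio = 25.0      # GRB 080916C was ~25x more distant

    # Expected lag if only the energy and duration scalings applied:
    print(magic_delay_s * energy_ratio * duration_ratio)                    # ~0.13 s
    # Expected lag if the distance contributed linearly as well:
    print(magic_delay_s * energy_ratio * duration_ratio * distance_ratio)   # ~3.1 s
    print(fermi_delay_s)                                                    # observed delay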

By AWT this behavior is a consequence of the nonlinear behavior of the Aether foam (compare some ad hoc QG models), and we can meet the same phenomenon at the water surface, where waves of different frequency move at different speeds (compare the celerity curve), yet a burst propagates over large distances as a self-reinforcing solitary vortex ring or wave packet, i.e. a kink soliton. Typical for a soliton is the quantization and fragmentation of time arrows (analogous to the splitting of the event horizon in the Kerr metric of rotating black holes). When a wave is too big for the depth of the water, it splits into two, one big and one small, which continue independently, i.e. no dispersion occurs here, until the whole pulse propagates as a single wave packet described by the sine-Gordon/Klein-Gordon wave equation (compare the "rubber band" model of the soliton and/or special relativity). All particles, including neural waves, are a sort of hyperdimensional soliton, while those of non-zero rest mass are so-called "breathers".
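
For reference, a minimal statement of the sine-Gordon equation mentioned above, together with its standard kink-soliton solution (textbook forms in units where the wave speed is 1; nothing AWT-specific is assumed here):

$$\phi_{tt} - \phi_{xx} + \sin\phi = 0, \qquad \phi_{\mathrm{kink}}(x,t) = 4\arctan\!\left[\exp\!\left(\frac{x - vt}{\sqrt{1 - v^{2}}}\right)\right],$$

a waveform that keeps its shape while propagating, in contrast to an ordinary dispersive wave packet; the localized, oscillating "breather" solutions mentioned at the end of the paragraph arise as kink-antikink bound states.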


The soliton character of gamma-ray bursts is the reason why we can observe them at all without apparent dispersion even at cosmological distances, thus violating the Greisen-Zatsepin-Kuzmin limit, and it can explain how X-ray or gamma-ray luminosities are related to the peak energy output or to the luminosity at lower frequencies, as an alternative to dual axion models (by AWT, axions or gravitophotons are the vortex analogy of burst solitons). In accordance with this, the apparently weaker burst of the AGN Mkn 501 exhibits more apparent dispersion, assigned to ALP mixing (i.e. axion coupling), than the sharper GRB 080916C, which exhibits the Gaussian distribution typical for solitons. In this way AWT can even explain the violation of Lorentz symmetry violation, i.e. a second-order violation of relativity, which manifests itself only during the spreading of cosmological energy densities over cosmological distances, thus doubly fooling some special relativity and/or string theory fundamentalists.

Saturday, 21 February 2009

AWT and bee colony collapse disorder

This post is dedicated to a hypothetical emergent connection of the mysterious CCD (Colony Collapse Disorder) to the spread of GMO crops (genetically modified organisms). Recently, evidence of horizontal gene transfer (HGT) in the case of GMO corn was confirmed. I am a rather convinced proponent of this connection for the following reasons, some of which haven't been presented yet:
  1. The connection between GMOs and food-induced atopy, allergies and related problems is well known and has, in certain cases, been experimentally demonstrated in short-term laboratory experiments, although final confirmation from long-term experiments is still missing.

  2. The problem of CCD is relatively new; it's apparently related to some recent technology or environmental change which has become widespread only lately. Other man-made effects, like global warming, can play a negative role here too, as they can increase the winter survival of natural bee parasites, like Varroa destructor mites. It's not directly related to parasites specific to domestic colonies, though, since the wild bumblebee population suffers the same problem (it's just less apparent, for obvious reasons) - so the common cause must lie elsewhere.

  3. The symptoms of CCD are nonspecific; they manifest like an immunodeficiency syndrome, which can show itself through various symptoms (in the case of AIDS the cause of death is often a quite harmless infection). The CCD agent is apparently hidden and nonspecific as well, as it manifests through various causes in different locations. Yet the instinctive reaction of the bees is always the same: they tend to leave the beehive and escape from the contaminated area.

  4. CCD is linked to the new bat "white nose syndrome" (WNS), which is probably a manifestation of food allergy (long-term rhinitis) as well. Many bat species are significant pollinators (like bees), and they have been disappearing recently too.


  5. The introduction of GMOs into Great Britain in 1998 caused a statistically significant step in the generally increasing rate of food allergy, because in this particular case GMO imports were enabled on the food market stepwise, by a legal act, rather than gradually.



  6. The Cry1A/Cry1Ab proteins are always produced in mixtures in GMOs; some of the active proteins (Cry1B, Cry3Bb1, Cry9c, EPSPS) in GMO strains aren't completely specific to Lepidoptera, they can affect Hymenoptera (i.e. ants and bees), beetles and other higher organisms as well, and various neglected toxicity synergies can exist here.

  7. With respect to GMO-induced allergies, the active Cry1A/Cry1Ab proteins in GMO food are very similar to the bacterial toxins against which the immune systems of living organisms (including humans and bees) are programmed. Bacillus thuringiensis is very close to the anthrax pathogen (Bacillus anthracis) both morphologically and genetically, so an especially high risk of cross-reactive immune responses (atopy, i.e. allergies) exists here.



  8. The correct function of the immune system is based on the correct distinction between healthy and pathogenic proteins. The higher the number of foreign proteins it is forced to distinguish, the higher the risk of an improper response.


    The increasing rate of food allergy is a civilization problem, related to the increasing diversity of the various components of food (proteins in particular). The first documented case is so-called lactose intolerance, because the milk diet was introduced into populations rather recently. China is particularly notable as a place of poor tolerance, whereas in Mongolia and the Asian steppes horse milk is drunk regularly. Even eating tropical fruits and ocean fish can increase allergy levels if we're not adapted to them (typically for inland inhabitants with traditionally low food diversity).

  9. A low but permanent allergen concentration in food (GMO-contaminated pollen in the case of bees) has a sensitizing effect even for normal food components, because the correct function of the biochemical pathways of the adaptive immune system depends on a temporary decrease of allergen concentration after the reaction with a specific antibody. If such a decrease doesn't occur (for example because humans or bees are in permanent contact with GMO-contaminated food every day), the immune system keeps producing further antibodies, until their number and concentration become so high that they can initiate an accidental violent atopic reaction ("allergy") even on contact with proteins and various pollutants which would be considered harmless in a common situation. After that, a positive feedback is established with the formation of a permanent allergic reaction (chronic rhinitis or urticaria, for example).
An analogous problem can exist in connection with the prolonged eating of meat from cloned animals, whose cells are always genetically older than those of a normal young organism, so an accumulation of prion metabolites, apoptosis products, immunosuppressive agents and mutagenic telomere changes in chromosomes may occur here. With respect to its cellular metabolism, the meat of young clones corresponds to the meat of very old organisms, so they're less resistant to various infections, aging diseases and "sudden death syndrome". As a result, unhealthily higher concentrations of antibiotics, histamine and various antibodies should be expected in the meat of cloned animals.

Every organism has been programmed for a certain lifetime and food supply during its millions of years of evolution, and a fast violation of such a natural equilibrium can always have adverse effects on that equilibrium. The effects may appear negligible for humans, but they can lead to deadly synergies in the case of wild organisms (like bees, bumblebees, house sparrows or bats), which are exposed to the evolutionary pressure of their environment. From the AWT perspective these synergies can be considered a manifestation of emergent phenomena, as expressed by the famous proverb: "A hundred times nothing killed the donkey".

Such causes and connections are systematically underestimated by mainstream science, which tends to deal with apparent phenomena driven by single causal reasons and a single time arrow - not with holistic, parallelistic fuzzy logic based on many weak influences and various synergies interacting in hidden dimensions. From this perspective, the ignorance of the multiparticle Aether concept is just a pronounced/condensed case of this apparent bias in scientific methodology.

Albert Einstein: “If the bee disappears from the surface of the earth, man would have no more than four years to live. No more bees, no more pollination … no more men!”

Friday, 20 February 2009

AWT and socioeconomic structures

This post is a reaction to the defensive stance of many objectivists, monetarists in particular, who are getting into conceptual trouble with the contemporary financial crisis. The Aether model of particle density fluctuations enables us to understand the mutual relation of socioeconomic structures, in particular the dual character of communism and free market society. By AWT, a free market society or "laissez-faire" capitalism, as proposed by some subjectivists, corresponds to a primitive communist society, which behaves like a thin, sparse gas with a high degree of symmetry (i.e. a space-time with a low number of spatial dimensions in favor of temporal ones). Particles move here rather freely and collide mutually via short-distance interactions following a simple and transparent set of individual, probability-driven rules.

When the population density of such a particle system increases, an emergent spontaneous symmetry breaking occurs, and the energy/value spreading (i.e. the economy) of this system becomes driven by more or less apparent global macroscopic structures (i.e. the (mem)branes of foam), which gradually acquire a nested hierarchical structure. When the system becomes sufficiently dense, the scope of the foamy structures determines the motion of all particles - the participants of this system - in a totalitarian way.

When the social pressure inside such a system exceeds a certain limit, a new emergent structure of individual density fluctuations appears inside the dense system, serving as the promoter of a subsequent phase transition, the so-called social revolution, which often has an avalanche-like character. And the whole social cycle repeats again, until society exhausts its resources. After that the social transition goes the opposite way, back into a free market or primitive communist economy.

While the analogy between social evolution and phase transitions is quite apparent, it is precisely the common ignorance of the Aether model that has prevented scientists from modeling the evolution of society in an easy, general and transparent way. This is the reason why the Aether concept has been ignored by both the left and the right camp ideologies: no ideology can like the balanced, duality-based approach of AWT. It's a pity, as it would help us understand the physical mechanism of social crises, wars and financial instabilities better. For example, AWT explains the origin of the macroeconomic cycle by the limited speed of information spreading through society, and it explains wars as a consequence of a social gradient in space-time, not just in space: the situation when some country becomes richer or poorer fast is much more dangerous than a stabilized neighbourhood of poor and rich countries. As a result, sociology and economics are still at a very rudimentary stage of a general understanding of human society, despite the high number of more or less formal models which try to describe their subject locally.

From the above model it follows that AWT considers communism a dual case of free market society, because the utopian character of both social systems depends on an altruistic belief in the human ability to cooperate fairly without any tendency to abuse the system. This means there's nothing wrong with communism or a "laissez-faire" free market economy, apart from one subtle but crucial problem: both theories ignore social interactions and the "selfish" tendency of people to follow the shortest path down the energy density gradient due to the limited speed of information. As a result, both socioeconomic structures require the support of high-level social structures, i.e. government and its interventions, to remain free and stable.

By AWT the free market is a boson state of society, based on a utopian belief in the human ability to cooperate fairly without a tendency to abuse the system, in the same way as communism. It's a dual ideology to communism. Paradoxically, both can survive only under the adequate care of a government - the stronger, the more superfluid the social relationships are expected to be, so that the ratio of profit to taxes converges to 1:1 with the increasing density of money exchange and the Laffer curve becomes symmetric. Because people are simply undisciplined and self-seeking individuals - that's all. Therefore the tax freedom day describes well the overall state of macrosocioeconomic evolution.

There is still a subtle spontaneous CP symmetry violation here, as the free market exhibits a rudimentary set of principal microeconomic rules (the principle of supply and demand equilibrium in particular), while communism is only subjectively driven, which makes the former more viable for evolutionary reasons. The free market operates with current prices only; it doesn't reflect the past or the future by itself - so it's fundamentally unstable and it oscillates between financial boom and crisis. While these oscillations may appear fatal to some, they can drive the evolution of the social system forward, as long as the external resources of society aren't depleted.

On the other hand, communism dissipates its resources through short-distance interactions, so it behaves well under low energy density conditions, being more primitive. A centrally controlled system handles long-term problems better, like global warming or environmental protection, because a free market economy cannot evaluate the price of extinct species well until they become significant for the economy (as a source of a cancer cure, for example, or of various biotechnologies). For this reason, the evolution of society converges to a mixed arrangement, not to pure communism or a pure free market economy, despite the wishful thinking and occasional propaganda of their respective opponents.

Now we could ask: didn't the free market exist well before the very first government appeared? Yes, it did - but the energy density of such free market exchange was very low compared to the present state, so its behavior was analogous to primitive capitalism. Now civilization has collapsed into a rather dense state of money and energy exchange, so free market relationships can persist only in very short-distance interactions inside families and close sectarian groups.

The contemporary financial crisis clearly illustrates that maintaining the stock exchange and the highly derivative financial products of an advanced economy requires a strong patrolling role of the state as well. And whoever hasn't realized it is simply a dangerous idealist. Most of the political troubles of modern civilization were caused precisely by ideologies supported by various idealists, proponents of both free market society and communism. Shooting the fanatics is probably not a really good way to expand tolerance, but we do need to keep pushing.

Wednesday, 11 February 2009

AWT and Zeta function

From the formal math perspective, the Riemann zeta function describes the probability density of the prime number distribution in the complex plane, i.e. as expressed by Ulam's spiral.
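
For reference, the standard definition of the zeta function and its Euler product over the primes, which is the formal link between $\zeta(s)$ and the distribution of prime numbers (textbook identities, not an AWT result):

$$\zeta(s) = \sum_{n=1}^{\infty} \frac{1}{n^{s}} = \prod_{p\ \mathrm{prime}} \frac{1}{1 - p^{-s}}, \qquad \mathrm{Re}(s) > 1.$$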



From the AWT perspective the Riemann zeta function is significant for two dual reasons. First, Ulam's spiral is closely related to the Fibonacci spiral, which corresponds to the patterns of closest packing of colliding particles and to the golden ratio. Plant leaves and the septa of snail/sea shells minimize the volume/surface ratio in the same way as a growing sphere formed by mutually colliding particles in a diffusive (random walk) process. A small sketch of the Ulam spiral construction follows below.
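
A minimal sketch of the Ulam spiral construction mentioned above, assuming only the usual textbook recipe: lay the integers along a square spiral and mark the primes, which makes their diagonal clustering visible (toy code for illustration, not part of any cited analysis):

    # Lay the integers 1..n*n on a square (Ulam) spiral and mark the primes.
    def is_prime(k):
        if k < 2:
            return False
        d = 2
        while d * d <= k:
            if k % d == 0:
                return False
            d += 1
        return True

    def ulam_spiral(n):
        grid = [[' '] * n for _ in range(n)]
        x = y = n // 2                       # start in the centre of the grid
        dx, dy, step, value = 1, 0, 1, 1
        grid[y][x] = '*' if is_prime(value) else '.'
        while value < n * n:
            for _ in range(2):               # each arm length is used twice
                for _ in range(step):
                    x, y, value = x + dx, y + dy, value + 1
                    if 0 <= x < n and 0 <= y < n:
                        grid[y][x] = '*' if is_prime(value) else '.'
                dx, dy = -dy, dx             # turn by 90 degrees
            step += 1
        return '\n'.join(''.join(row) for row in grid)

    print(ulam_spiral(21))                   # primes ('*') tend to align on diagonals

Even at this small size, some diagonal alignment of the '*' marks should already be noticeable in the printed grid.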

Because numbers in a vector space can be considered a formal representation of colliding countable particles in N-dimensional space, we can see how closely number theory is related to particle physics, and vice versa.

The zeta function is therefore important in the description of supergravity (SUGRA), which is the general interaction between density gradients, formed by their shielding effect in N-dimensional space in analogy to the Fatio de Duillier - Le Sage theory of gravity. Common gravity and the Casimir force are projections of supergravity into 3D space from this perspective. The contributions of supergravity in higher dimensions converge to zero in accordance with the surface/volume dependence of the N-dimensional hypersphere (i.e. its hypersurface area), and the high-dimensional residuals promote the complexity of observable reality.
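
For reference, the standard formula for the surface area of a unit hypersphere in n dimensions, whose value peaks around n = 7 and then falls toward zero as n grows, which is the kind of dimensional suppression the paragraph above appeals to (a textbook identity, not an AWT derivation):

$$S_{n-1}(1) = \frac{2\pi^{n/2}}{\Gamma(n/2)} \;\longrightarrow\; 0 \quad \text{as } n \to \infty,$$

because the Gamma function in the denominator eventually outgrows any power of $\pi$.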


The consequence is that inside a dense particle field nested density fluctuations (clusters) are formed, but the number of cluster generations is strictly limited by the geometry of their mutual interactions. After some three generations a more or less chaotic particle superfluid is restored, as can be demonstrated by computer simulations. This behavior is related to many natural phenomena, from hydraulic sterilization through shaped-charge theory and the viscoelastic properties of complex polymers (the thixotropy of ketchup or Thinking Putty plasticine) and of certain alloys (bismuth) under pressure, to Bose condensates and the theory of high-temperature superconductivity. In this way AWT explains the existence of three particle generations and the presence of a superfluid condensate forming the vacuum inside black holes.

Tuesday, 10 February 2009

Danger of positive approach

A few weeks ago, Lubos Motl won the 2008 Weblog Award prize for Best European (Non-UK) Blog, which may come as a surprise to some, proponents of dual theories in particular. Since AWT is rather invariant/symmetric with respect to the string/LQG duality, we can attempt an independent analysis of this result. The positive thing may be that Lubos is a compatriot of mine; we were even both born in the same city - which is not so difficult, after all, as the Czech Republic is a really tiny country. The Czech Republic is the birthplace of many brilliant and intelligent people, and of beautiful women as well, which is partly due to its location in central Europe at the crossroads of many trade routes, along which the mixing of various peoples could occur.



On the other hand, I'm not very sure whether Lubos is a typical representative of Czech science, let alone of Czech society - which is traditionally rather balanced in its opinions, if not opportunistic, due to its sensitive geopolitical role as a small border country between the zones of interest of the East and West European blocs. Buffer countries often play the role of branes, which leads to the fragmentation of state boundaries in this area. Aether foam gets denser at places of density gradient due to their potential energy content, where two dual space-time branes/gradients mutually intersect/interfere.



While the Weblog Award is a rather representative competition, it's still based on the results of a surprisingly limited number of votes, because first place in the Best European Blog category was a matter of just some 700 votes. Which isn't really very much in the world of anonymous proxies, whereas Google handles a lot more than 1000 queries per second (about 25 queries per second per server). Anyway, Motl's prize is well deserved for his frenetic activity, and it's even logical to a certain extent, because his posts are often quite entertaining and informative, and the Reference Frame blog is one of the few I visit regularly. Because modern people are basically consumers, Motl's graphomania plays well to their needs, since the average visitor can always find something new on his blog every day.

If so, where's the problem?

Even if we ignore the eccentric and subjectively ugly design (typical for Motl's sites) and the sometimes unstable behavior of the scripts on his site (the purpose of which is at times to prevent Motl's opponents from visiting and posting), we shouldn't neglect the fact that the popularity of this blog is partially based on strongly biased opinions and ad hominem attacks, followed by personally motivated censorship of discussions, which manifests itself in the sectarian character of the people who are allowed to post there (similia similibus curantur). For these reasons, Motl is often perceived as a controversial person in the blogosphere. Personally, I do not believe that most of the people who voted in the Weblog Award poll failed to realize the autistic and asocial character of the "humble correspondent's" blog - the problem is, the voting system didn't enable them to express that opinion. Negative votes simply don't count here.

This is a general property of contemporary voting systems, which allow only positive votes, and which leads to a high degree of populism on the side of politicians, and to ignorance and a lack of interest in the negative aspects of politics on the side of the public. Even morally controversial politicians may become successful in this system if they're sufficiently active in other areas, in the self-promotion of a personality cult in particular. I believe this MAY be one of the reasons for society's problems with its own political representation: voters simply have no veto privilege - they can be only partly responsible. In natural evolution such an unbalanced approach to the fitness function would suffer consequences, because it violates the equilibrium of supply and demand. A minimal sketch of such a two-sided tally is given below.
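
A minimal sketch, under my own assumptions only, of how a tally that also counts negative ("veto") votes would differ from a purely positive one; the candidate names and vote counts below are invented for illustration:

    # Hypothetical ballots: each voter casts +1 (support) or -1 (veto) for a candidate.
    ballots = {
        "candidate_a": [+1] * 700 + [-1] * 650,   # popular, but also widely vetoed
        "candidate_b": [+1] * 400 + [-1] * 50,    # less popular, rarely vetoed
    }

    def positive_only(votes):
        # Contemporary systems: only the supporting votes are counted.
        return sum(v for v in votes if v > 0)

    def net_score(votes):
        # A dual/symmetric system: vetoes are subtracted from the support.
        return sum(votes)

    for name, votes in ballots.items():
        print(name, positive_only(votes), net_score(votes))
    # The positive-only ranking favors candidate_a; the net score reverses the order.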

As I'm not an expert in the social sciences, I don't know whether such an approach has been proposed or even tested in history, or what reasons have led people to consider only the positive voting approach in anonymous elections. Maybe it could have adverse effects and lead to an undesirable level of opportunism among politicians, I don't know. But as I've seen in many cases, the most trivial ideas have often been ignored for a long time just because of their simplicity or generally low payoff, which can manifest only under high civilization density. Maybe it could even have saved Germany from Nazism in the mid-1930s, when it was rather inclined toward Hitler's populism. If so, maybe the time for a more dualistic/symmetric voting system has just come.

Monday, 9 February 2009

Consistency problem of string theory

Understanding why formal theories like string theory cannot lead to some particular solution is quite easy in AWT, if we use the water surface model as an illustration of Lorentz invariance. Beneath the water surface, a surface wave cannot spread in a causal way. With respect to such wave spreading, the underwater appears like a void, empty space, while such an environment definitely exists from the more general perspective of some faster reference interaction. For example, the motion of surface waves can be followed and observed easily by using underwater sound waves, i.e. by using sonar, because sound waves spread roughly a thousand times faster through the water than the surface waves do.
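
A rough numerical sketch of that speed comparison, assuming the textbook deep-water dispersion relation for surface gravity waves and a nominal sound speed in water of about 1500 m/s; the sample wavelengths are arbitrary:

    import math

    G = 9.81                 # gravitational acceleration, m/s^2
    SOUND_IN_WATER = 1500.0  # nominal speed of sound in water, m/s

    def surface_wave_speed(wavelength_m):
        # Phase speed of a deep-water gravity wave: c = sqrt(g * wavelength / (2 * pi)).
        return math.sqrt(G * wavelength_m / (2 * math.pi))

    for wavelength in (1.0, 10.0, 100.0):
        c = surface_wave_speed(wavelength)
        print(wavelength, round(c, 2), round(SOUND_IN_WATER / c))
    # Metre-scale ripples give a ratio of roughly a thousand; long ocean swell gives less.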

As we can see, the fact that we cannot observe the motion/reference frame of an environment by its own waves doesn't mean this environment cannot exist from a more distant (nonlocal) perspective. The explanation of the wave character of light and of many of its properties requires us to believe in a hypothetical environment for light wave spreading, although such an environment cannot be detected by using light directly.

The existence of such an environment is related to the existence of so-called hidden dimensions of space-time. As a real-life example, surface waves spread along the two-dimensional density gradient forming the water surface, while the underwater is a three-dimensional environment. AWT explains the existence of the surface gradient by its compactification.



String theory also considers the hidden dimensions of space-time to be somehow compactified. While this assumption is consistent with the Aether concept, I have never found an explanation of that claim in the string theory literature, let alone an illustration of it. It's evident that string theorists somehow guessed it, or unconsciously borrowed this explanation from the particle environment concept while ignoring the rest of the connections. The punishment was inevitable.

The existence of the Aether corresponds to the existence of hidden dimensions for surface waves, because the underwater space exhibits an additional dimension with respect to the surface. Therefore every theory (like string theory) which postulates the existence of such additional dimensions postulates the existence of some hidden environment as well - despite the fact that some proponents of these theories apparently don't realize it. Hidden dimensions for energy spreading through the vacuum are equivalent to the underwater dimensions for waves at the water surface.

From AWT it follows that Lorentz invariance is the result of a strictly local perspective; every nonlocal perspective would violate Lorentz invariance, because such a violation is exactly what the existence of hidden dimensions means. In the real-life example, surface waves are dragged by the underwater whenever we can consider the existence of such underwater, which then becomes a reference frame.
Such a conclusion makes string theory deeply inconsistent conceptually. It tries to prove the existence of hidden dimensions on the background of Lorentz invariance, which they violate. This is a simple consequence of the fact that string theory proponents didn't understand the subject of their own research, trying to replace understanding with a formal regression of reality based on formal postulates and ideas collected blindly from other theories.

No wonder the resulting theory has no meaningful solution, because it's based on assumptions which mutually contradict each other from the very beginning for most nonlocal perspectives (if not all). Instead, it leads to a huge landscape of a nearly infinite number of solutions, thus serving as an ineffective and quite costly random number generator. String theorists can only hope that for some limited volume and nontrivial space-time topology the effects of Lorentz invariance will compensate for the effects of hidden dimensions - but this is not exactly the approach we would expect from a professedly strictly rigorous physical theory.


Despite this, many string theorists are apparently quite proud of their formal approach, tirelessly filling publications with various complex equations. I can tell you, today it's nearly impossible to publish a string theory article in the peer-reviewed press without some formal equations. But laymen should take warning from the fact that we never meet with any graphical representation or simulation of their results, for obvious reasons - simply because such simulations can never exist! Their equations have never been solved explicitly, nor plotted in their rigorous formal state. Believe it or not - even after forty years of intensive development nobody has seen a single example of a string modeled by string theory - only some pathetic hand-drawn illustrations copied from the first textbooks. And we are talking just about numerical models now, not about testable predictions relevant for physics. Yet string theorists have somehow managed their situation for the whole of forty long years like alchemists of the medieval era, promising the Philosopher's stone (Lapis philosophorum) to the rest of society.

Isn't it amazing? I can assure you, this is the true story of contemporary physics.

Unfortunately, as deeper analysis reveals, other formal theories like LQG suffer from the same conceptual problems, just in a less apparent way - as we may illustrate later. The frontier status of string theory only makes its internal inconsistency more apparent, that's all. This is partially because string theory is based on special relativity, which is easier to comprehend than the more advanced postulates of general relativity used in other quantum gravity theories.

The optimist sees the doughnut; the pessimist sees the hole in the doughnut. Popper's methodology is apparently based on the pessimist approach - it requires us to doubt rather than to believe, and to see the holes in every theory first. Beauty is always somewhat impractical and of violated symmetry - this is what makes it attractive. From a single postulate we cannot construct a vector of logical implication. A theory based on a fully consistent postulate set would undeniably become a tautology: we could derive each postulate from another, and thus effectively reduce their count to a single one. Therefore no formal math-based theory can be completely self-consistent and, as such, more correct than observable reality.

Some say string theory is a beautiful and elegant theory. In my opinion, its product of complexity and predictability is suboptimal, as we already have more powerful formal theories, like the ingenious Heim theory, which handles the concept of hidden dimensions just as well, if not better. This doesn't change the fact that every theory brings new perspectives into our understanding of reality, and string theory is no exception. Anyway, from AWT it follows that every theory which expects Lorentz invariance and hidden dimensions at the same time remains deeply inconsistent, simply because hidden dimensions manifest themselves precisely through Lorentz invariance violation.

Are we getting dumber in our Universe?

This post is a reaction to a recent report about a new study by James Flynn, according to which teenagers in Britain have lower IQ scores than their counterparts did a generation ago (via the Reference Frame blog, where the dual opinion of Lubos Motl is presented); you can also consider another study. It brought a few other recent reports to mind: the information about the evaporation of the kilogram prototype, the dilatation of the iridium meter prototype, Pamela Gay's report about the violation of standard candle supernovae, and many other speculations, too.

How MAY all these apparently different areas be related to each other? In the concept of omnidirectional space-time expansion a strange thing appears: such an expansion accelerates gradually, until light is no longer able to follow it. We can consider the remote boundary of the Universe, or a black hole event horizon, as the place where light effectively freezes in the gradually expanding universe.

By AWT we can imagine the consequences of the Universe's expansion more easily from the extrinsic perspective dual to space-time expansion: every piece of matter appears here as a more or less dense blob of Aether density, i.e. as a piece of pre-expanded space-time from the classical intrinsic perspective. The space-time expansion can be interpreted as a compactification of the vacuum foam as well. Then the difference between the density of matter and the density of the vacuum would decrease during the space-time expansion, and the observable matter would dissolve into the vacuum, because the surface tension forces keeping a particle together would no longer be strong enough to hold the particle of matter together.



Such an insight has a number of testable implications. Every material object will expand in gradually expanding space-time the more, the more compacted space-time it contains, i.e. the denser it is. The speed of light, as measured by standing photon waves in solid objects, would decrease compared to values measured by standing waves in vacuum. This can be interpreted as a slowing of time. And every material object would dissolve into the energy of the vacuum, i.e. into radiation, in a more or less distant perspective: the faster, the heavier it is - which is commonly observed in the case of massive stars. In this connection we can consider a nonzero rate of proton decay, as predicted by some theories.

As a result, observable matter would become lighter compared to matter in the distant past, and it would expand as a consequence of a gradual decrease of the gravitational constant. The Expanding Earth theory may come to mind immediately, but this concept can also be verified by a look into the distant past. The frequency of the supernova explosions considered as standard candles would increase gradually, while their luminosity would decrease, because it is proportional to the total mass converted into radiation in each explosion. This dependence can explain the dark energy phenomenon naturally, because the Hubble constant measurement depends on the measurement of the relative brightness of standard candle supernova explosions.



The above phenomena can be related to the IQ decline problem of the English population. Maybe IQ didn't decrease, but was transformed into higher-dimensional, short-range interactions, corresponding to the increased mass/energy density of civilization. Inside a dense gas the interaction range of a particle decreases too: the particle interactions are much faster, but more transient as well. The social and family relations of the members of society would likewise become weaker and more promiscuous.

Fast reactions and social IQ become more important for the survival of each individual than in a sparser society. For example, most teenagers can chat and send SMS much faster than I can. But their knowledge is both more superficial and more specialized. Our civilization is changing into a single large nest of giant ants.



After all, we can observe it even in theoretical physics. Many contemporary physicists are brilliant specialists - but their knowledge of the various subtleties of Victorian physics is very schematic and superficial. They're separated from reality in the same way as from the rest of society. Without mathematical support they cannot estimate or predict (nearly) anything correctly. The ability for non-formal thinking is sadly lacking. Which is sad sometimes, as they believe their level of specialization qualifies them to advise people in other areas (like politics, the economy, global warming and so on), while, for obvious reasons, exactly the opposite is often true.

In my opinion, this specialization is not a result of laziness or dumbness, but rather a sort of adaptation to an increasing volume of information. Nearly nobody requires handwriting or mental arithmetic like taking square roots today, let alone solving algebraic equations by hand. After all, these tasks can be handled easily by handheld computers and software tools, so I don't see a large problem here - we simply cannot handle everything. This situation increases the importance of simple general ideas, from which many subtleties can be derived easily on demand. The importance of overcomplicated theories will decline in the same way as the significance of command-line and assembly languages decreased with the arrival of modern high-level object-oriented languages and GUI-based operating systems.

So experts have developed various IQ tests whose purpose is to measure IQ in a general way, but their metrics didn't change with time as fast as the preferences of society. Therefore we are facing the same effects inside our society as inside our Universe - just in a perspective a few dimensions closer to the human brain's perspective. From the internal perspective we can see how our IQ grows, while it still collapses from the extrinsic perspective.

Sunday, 8 February 2009

AWT and string theory

This post is a reaction to the defense of string theory by Lubos Motl. It should be pointed out that such a Don Quixote stance is not typical even within the limited group of string theorists, as the most intelligent and productive ones - like Ed Witten - are already combining various concepts borrowed from other theories rather freely and without false sentiment toward any particular theory. Simply because they're both technically and mentally capable of doing so. By AWT the optimal strategy in understanding reality is always a balance of the formal and non-formal approach, and it is very pluralistic by its very nature.

Which is definitely not the strictly black-and-white view of people like Lubos Motl, who didn't even understand that the xkcd cartoon advocated in his essay was meant as an apparent critique of his own defensive approach, often based on an arrogant mixture of attempts at biased demagogy and personal attacks. His belief in the self-evidence of the string theory approach is simply so undeniable that it cannot be shaken by any logical argument, which should be punished without mercy due to the "antiwhore" strategy. It's apparent we are facing a very biased and secular (local) stance here - a sort of mental singularity.



This singular approach is a crystalline example of the typical immature/expert stance, which makes Mr. Motl very useful for illustrating various boundary aspects of the unparticle environment, the duality and supersymmetry principles in particular. From this principle it follows that every revolution devours its own children in a more or less distant perspective, and the string theory revolution is no exception. Lubos's stance is analogous to the role of an observer who falls into a black hole while applying a local Lorentz symmetry perspective in the belief that it's just the space-time that is deforming during this, not the path of light. In this way he is allowed to keep his belief even though he is already spinning together with the light in the dense vacuum around the black hole.



In real life such an observational perspective would vaporize into accretion radiation together with its proponent, because the laws of emergent geometry are undeniable: every isolated stance violates itself or the stance of other experts in a more or less distant perspective, or it couldn't be more local than any other stance. In this way the stances of many experts always tend to cancel each other out, like the interactions of many isolated particles; only their common points (ironically just the most superficial ones) can accumulate in the emergence of new ideas.

So, what is really the problem with string theory? Well, basically none - with the one exception that string theory is (not) a TOE in the same way as any other strictly explicit and formal theory. In AWT the string concept has a strong and fundamental resemblance to the fluctuations of a dense particle gas. In this way AWT explains what these strings and branes are and how they can emerge in our perspective. The stringy appearance of Aether density fluctuations is just a product of a distant spatial perspective, which blurs all higher-dimensional geometries into quantum chaos, so that only 1D strings or 2D branes remain. We can observe the same effect inside a dense supercritical fluid, which appears to be full of strings. In this way string theorists guessed the geometry of the fundamental pieces of observable reality without having a single idea about its emergent nature, required by the more or less hidden causality of things.



No wonder the idea of the string was revealed precisely during the study of interactions in a very dense nuclear fluid (quark-gluon plasma), where the stringy character of both the particle fluctuations and their interactions is most apparent. The strong point of string theory is that it revealed its highly dimensional character quite soon. The weak point of string theory was that it didn't recognize their emergent and implicit nature, by which each string consists of many daughter strings, which appear like pinpoint particles from a more distant perspective.



By AWT we aren't required to bother with the formal details of any formal theory, because such a theory is always defined by its postulate set, and its formal predictions can be estimated in a logical way. Another problem of string theory is that it was never strong in the definition of its postulate set - it's rather a vague cluster of more or less mutually consistent sub-theories, weakly related by a common methodology. Because it served as a grant and money source for many brilliant mathematicians, for many years nobody cared whether these methodologies were truly mutually consistent. Nevertheless, some common aspects can still be traced here. For example, because string theory is professedly based on the Lorentz invariance postulate of special relativity, it can never predict Lorentz invariance violation in a strictly rigorous way. For this reason the obstinate tendency of Lubos Motl to defend the concept of Lorentz invariance becomes much more readable: every apparent violation of Lorentz symmetry would violate the string theory concept as well.


But string theory isn't just about Lorentz symmetry; it's a matter of quantum mechanics too, being in fact one of dozens of quantum field theories - no less, no more. By AWT, quantum mechanics is dual to relativity theory in the Lorentz invariance postulate, which is strictly based on the radiative time arrow, so quantum mechanics violates it by introducing many time arrows in the concept of quantum uncertainty. If we accept that special relativity is a less general approach than general relativity, then string theory, based on the combination of quantum mechanics and special relativity, is apparently a less general approach than, for example, a quantum gravity approach based on the combination of quantum mechanics and general relativity. The more general position of general relativity over special relativity remains questionable, though. We should rather talk about a more or less distant perspective than in terms of globality and locality here, because these could induce the false idea that a more global theory is always better. Which is not generally true: a more general theory is more separated from directly testable reality as well. And this relation has become another problem of string theory.



While hidden variables theory was experimentally refuted by the violation of Bell inequalities, it's quite strange that the very same quantum theorists are promoting a theory based on parameters in hidden dimensions, because this approach is exactly what a theory of hidden parameters means.

The relative success of string theory in the mid-1980s was a product of good viral marketing, due to the presence of some attractive figures in it (Edward Witten, Brian Greene), rather than of testability and predictability, because every theory based on a mutually inconsistent postulate set becomes poorly conditioned and will diverge into a tautology singularity or into a landscape of infinitely many alternative solutions. This problem is common to every quantum field theory, quantum gravity in particular, and string theory faces it in the same way as loop quantum gravity does, through the introduction of ad hoc lower-dimensional artifacts like strings, branes, quantum loops and spin networks. Since string theory was the first theory which used such an approach, it plays the ungrateful role of a pathfinder, who is predestined to be overcome by more general approaches sooner or later.



The dual approach of LQG - which is about twenty years younger than string theory, by the way - is slightly more distant (but still not general) from this point of view. LQG is based on the extrinsic perspective, while string theory remains intrinsic, describing only the properties of condensed space-time artifacts (i.e. particles), whereas LQG handles even the structure of the vacuum and can predict Lorentz symmetry violation. But LQG remains more attached to a limited, ad hoc number of dimensions of our space-time in its formalism. As we can see, the strong point of every theory becomes the weakest point of other, parallel theories in the same area. Both string theory and LQG appear as forks of quantum gravity theory, and from this perspective AWT can be considered both a zero-dimensional string field theory and an infinitely dimensional loop quantum gravity theory.

Physicists soon recognized the redundancy of low-dimensional geometric constraints in gauge group theories (no matter whether they're called strings, branes, spin loops or foam), and they moved to more or less non-compact reformulations of quantum field theory based on various SO subgroups of the exceptional Lie groups, which are of a pronouncedly emergent character. This trend culminated in Garrett Lisi's proposal of his E8 TOE at the end of 2007. AWT just takes this approach another step further by the logical elimination of the ad hoc gauge group concept from field theory, thus making the description of reality fully driven by pure emergence.

Thursday, 5 February 2009

AWT and evolution of life

By AWT, life is a highly organized form of the existence of matter, whose properties and abilities are determined by an extremely high degree of nested condensation from the space-time perspective. Therefore life formation always occurs near a phase interface, where the highest density of space-time gradients can occur due to the mutual interference of the energy waves constituting both phases. The highest concentration of gradients promotes the evolution of maximal complexity, so we can expect life formation exactly in the middle of the dimensional scale of the Universe, on the high-dimensional fractal coasts of lakes on islands of ancient oceans, covering the surfaces of planets inside galaxies forming the fractal surface of black holes, where the solid, liquid and gaseous phases can meet together.

Because life is a space-time artifact, not just a spatial one, a high density of temporal events, i.e. mutations, is required to enable the gradual evolution of complexity. This requires an environment capable of periodic changes and enabling the dissipation of energy in each step. Periodic and tidal waves of the ancient oceans can provide such dissipation, because they're paced slowly enough to enable natural selection. The Earth's rotation and the inclination of its rotational axis, together with the presence of a sufficiently massive Sun and Moon, provide another level of periodicity due to tidal forces, thus increasing the randomness of the evolutionary process.

By AWT, the evolution of life follows Oparin's old coacervate theory. Coacervates are tiny oily droplets which precipitate spontaneously from saturated solutions of various organic compounds, racemic mixtures of amino acids and sugars in particular. Under high concentration and some shaking, so-called reverse micelles or even double-layered liposomes can be formed. Such liposomes can behave like the walking droplets described recently:

We can imagine that such droplets precipitated from the waves of ancient lakes at places where organic compounds were pre-concentrated by wind and solar radiation, and that they were thrown onto the coastal surface, covered by various surfactants. The droplets were attracted to these, so they started to climb around the coast, collecting the materials in their cells. In this way the most successful droplets became so large that they fragmented into smaller ones under the impact of the next breaker wave, and the whole process repeated many times. Blastulation can be considered a rudiment of this process today.


During this the less successful ("low fitness") droplets disappeared gradually in favor of the better ones, which had collected the proper surfactants into their liposome bodies. Later the competition led to a preference for droplets which were not only able to collect surfactants, but even to collect the chemicals able to synthesize them inside their cells. These droplets then became able to digest food, so they became hunters of less successful droplets, not just passive collectors of matter from outside. Of course, such competition accelerated the evolution a great deal.

And this saga continues till now...

Note that in this early stage of the evolution of life, inheritance was provided by a completely physical mechanism, simply by the division of cells together with their interior and surface membranes. By AWT the evolution of life exactly follows the evolution of inorganic matter on more nested dimensional scales, i.e. no ribonucleic acids, chromosomes or other contemporary subtleties were required here. We can assume this mechanism could be reproduced in vitro under proper conditions without a problem. Recently, living examples of walking droplets were found: single-celled giant amoebas of very ancient origin.



From AWT it follows that such amoebas were the first unicellular organisms, in the same way as sponges of foamy structure can be considered the first multicellular animals. After all, the tissue of higher organisms is a rudiment of foam with flat surfaces as well. The smaller structures (structures below the human scale of about 1.7 m) have concave structures (organelles), while the larger ones tend to become convex (trees, fungi), because they're kept together by surface tension forces. Therefore the first organisms were relatively large from the very beginning, because electromagnetic interaction by itself doesn't provide the necessary level of complexity and inheritance at the molecular level.

Concerning the creationist approach to life formation, the "intelligent constructor" idea is dual to the Aether concept and can be replaced by it easily. From a remote space-time perspective every gradualistic evolution becomes a discontinuous, stepwise artifact, in the same way as the event horizon of a black hole observed from a large distance. Every logical explanation concentrates non-causal assumptions in its background, so it becomes a sort of religion. Belief doesn't differ too much from adherence to causal logic, because both approaches tend toward tautology by the gradual elimination of postulates.

Deism can be understood as a religious approach to the Occam's razor criterion, whereas AWT is driven by causal logic. For a deeper understanding of the God concept we should understand the creator paradigm better. Currently it seems it's just our civilization which created the black hole where we are living now. Maybe the moment of the final understanding of God becomes the very end of civilization at the same moment; maybe quantum uncertainty will protect us from such a destiny. Should we kill people like Zephir well before they can bring the apple of ultimate understanding of reality? Or can just these people save us from the destiny of the quantum suicide experiment?

AWT, Genesis and the Cambrian explosion

By AWT, elementary particles are small living creatures which follow the energy density gradients (food) of their environment. Bosons are males, whereas fermions are females. Like other living organisms, they have genetic information encoded in the helical structure of the density gradients inside their bodies; they consist of foamy tissue composed of bilayers with different surface tension and superhydrophobic behavior; and they're tactile and sensitive to heat and mechanical stimulation like other animals.



In general, the she-fermions are the more communicative particles, usually rather attractive and having mass (some can become quite corpulent). They love company and, most of all, they prefer to exchange food & energy with bosons.

Bosons, by contrast, are mobile, unstable and volatile particles. They usually bounce from one she-fermion to another at high speed. Whenever a boson obtains sufficient energy (fitness), it succeeds in mating and is allowed to exchange its information with a fermion. After such collisions new small particles can emerge, which carry the structure and property signatures of both parents at the same time.



From this point of view it seems that atomic nuclei or black holes are nested, closely packed globular colonies of these creatures, similar to the "globe animalcule" (Volvox globator) chlorophytes. This alga can serve as a brane model of strings, being formed by a 2D foam.

By Genesis the formation of life occurred in six steps, non-uniformly distributed on the space-time scale but equidistantly separated on the entropy density scale ("days"). The first stage was the formation of space and time ("heavens and the earth") inside a graviton condensate ("darkness over the deep and God's breath (Aether) hovering over the waters" (waves?)). Gravitons are ambivalent particles, serving both as bosons and as fermions due to supersymmetry.

During the Big Bang event ("let there be light") a phase transition of space-time occurred, followed by the separation of the first generation of bosons, i.e. photons ("God separated the light from the darkness"), in the process of so-called inflation, which resulted in the condensation of a black hole dome forming the observable generation of the Universe ("let there be a dome in the midst of the waters"), i.e. the vacuum in particular ("God called the dome Sky").

By AWT the Cambrian explosion was the result of an analogous phase transition, a condensation of genes following from fast cooling. Before the Cambrian explosion, which began around 530 million years ago, the Earth had passed through the so-called "Snowball Earth" episode, i.e. the Cryogenian period of strong cooling, in the same way the Universe did during inflation. During this episode the existing oceans were covered by a thick layer of ice. This shock change of climate was followed by a massive extinction, during which the remaining organisms were forced to increase the speed of their evolution and to exchange genes even in diaspora. The diaspora led to the evolution of sexual reproduction, which is an effective (and quite pleasant) method of increasing the speed of gene mixing.

The speed of evolution and mutation must always remain balanced in accordance with life conditions. Prokaryotes still rely on horizontal gene transfer, simply because they can divide fast. Sexual reproduction is too mutagenic and energetically expensive for tiny organisms with a fast-paced life cycle (protozoa), so they use it only under unfavorable conditions.

Large organisms can reproduce sexually, but sometimes tend toward parthenogenesis under good life conditions: for example, sharks live in very stable conditions, so they don't evolve fast and don't require mutations; hence they're cancer resistant and the hammerhead shark can reproduce asexually. Endometriosis and/or male-associated infertility can be understood as an attempt at an evolutionary adaptation of the human organism to wealthy life conditions, where sexual reproduction leads to unnecessarily high mutagenicity. Good social conditions lead to a unisex lifestyle, and the male population will decline gradually, in analogy to a mixture of particles which undergoes the gradual evaporation of smaller particles in favor of large ones with lower social tension.

Monday, February 2, 2009

AWT and plicate topology

By AWT every causal (logical) theory becomes part of physical reality in the same way as the physical artifacts which are dual to it. We tend to consider only reproducible, repeating events/artifacts as real, thus fulfilling the theories and their causal implications. A random fluctuation of Aether density isn't real for us until we recognize it as an electron, photon, etc., i.e. until we assign it to some conceptual group. Only because consciousness forms the environment for these waves do we tend to consider ideas non-material artifacts, in the same way as the Aether, which is the dual approach to consciousness from this perspective.

In this way, observable reality forms a brane manifold between our ideas and our observations (perceptions) mediated by the mind, and it has both the objective and the subjective character of belief. The symmetry of this duality is broken toward the multiplicity of the causal approach and intersubjective opinion due to the emergence paradigm (more is really more).

Physical theories are ideas formalized by a group of nested logical implications which are mutually connected by the correspondence principle. Each implication is defined by its causal time arrow, which defines the causality. The time arrow is defined by the root system of higher-order tensors describing the gradient of space-time compactification, which can furthermore be interpreted as a rotation by the Lorentz/Poincaré group in causal space. The implication tensor defines a time arrow of causal space-time curvature and its subsequent compactification. Therefore the antecedent/consequent components of every implication define the time arrow of a theory, thus forming the manifolds of causal space and the conceptual basis of every theory.
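
For reference (a standard formula, quoted here only to fix the notation), the Lorentz boost that the Lorentz/Poincaré group language above alludes to acts on the time and space coordinates as

$$t' = \gamma\left(t - \frac{vx}{c^2}\right), \qquad x' = \gamma\,(x - vt), \qquad \gamma = \frac{1}{\sqrt{1 - v^2/c^2}},$$

and the Poincaré group extends these boosts and rotations by space-time translations.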

At a less abstract level, ideas/concepts are low-energy nested density gradients ("strings" or "membranes") of compacted space-time formed by gradients of electrochemical activity inside the human brain. By holographic theory we can consider them a supersymmetric, low-energy-density projection of observable reality into our consciousness. Every idea is represented by a dense cluster of standing waves of electrochemical activity inside our brain, which can become shared and entangled between the brains of many members of human society. The process of understanding/sharing such ideas corresponds to the collapse of their wave functions: as a result, these ideas aren't chaotic and invariant for us anymore; they become a component of a more general order, characterized by a higher level of ideas.
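
As a minimal reminder of what a standing wave is (standard wave mechanics, added only as an illustration), two counter-propagating waves of equal amplitude superpose into a stationary pattern:

$$A\cos(kx - \omega t) + A\cos(kx + \omega t) = 2A\cos(kx)\cos(\omega t),$$

i.e. a pattern which oscillates in time while its nodes stay fixed in space - the picture the paragraph above borrows for clusters of electrochemical activity.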

This concept was first presented by Bohm in the form of his implicate/explicate order and was later extrapolated in the "Holographic Brain Theory" of Karl Pribram and in the quantum consciousness theories of David Chalmers and Sir Roger Penrose. All these concepts have a robust physical meaning in the context of Heim's theory and AWT.

Sunday, February 1, 2009

Lorentz symmetry and String theory

This post is a polemic against Motl's somewhat nervous defense of Lorentz symmetry (LS), quoted in italics. I hope it may be interesting for someone. By AWT the confrontation of ideas in dialectic discussion is the driving tensor of new ideas: full agreement cannot serve as both the subject and the object of further thinking and extrapolation.

Moshe Rozali wrote a very sane text about the importance of LS for the search for the fundamental laws of Nature: The Universe is probably not a quantum computer. I agree with every word he wrote. He says that many people who are following the physics blogosphere want to believe that their area of expertise is actually sufficient to find a theory of everything.

..in the same way as string theorists and many others.. By AWT whatever theory of your personal preference can become a TOE, if you make it infinitely implicit, i.e. if you compose it from as small a number of postulates as possible. Complex theories mixed from a high number of postulates, like string theory, would of course be strongly handicapped in such a contest.

So Seth Lloyd of the quantum computing fame wants to believe that the world is a quantum computer. Robert Laughlin wants to imagine that quantum gravity is an example of the fractional quantum Hall effect. Other people have their own areas of expertise, too. Peter Woit wants to believe that a theory of everything can be found by mudslinging and defamations while Lee Smolin wants to believe that the same theory can be found by selling caricatures of octopi to the media (following some subtle and not so subtle defamations, too).

..and string theorists believe in vibrating strings. So what? Live and let live. The world of coexisting theories illustrates the space-time world, being a low-energy-density projection of it into causal space.

Moshe Rozali correctly tells them that if they are going to ignore the Lorentz symmetry, a basic rule underlying special relativity, they are almost guaranteed to fail. Lorentz symmetry is experimentally established and even if it didn't hold quite accurately, it holds so precisely that a good theory must surely explain why it seems to work so extremely well in the real world.

Lorentz symmetry is violated heavily by quantum mechanics; to be more specific, quantum mechanics is simply based on the dual approach. By AWT even gravitational lensing is a quantum mechanical phenomenon rather than a relativity phenomenon. To defend Lorentz symmetry you're simply required to fight against quantum mechanics, and vice versa.

It still doesn't mean the Universe computes something for somebody.


Moreover, the state-of-the-art theories of the world are so constrained - i.e. so predictive - exactly because they are required to satisfy the Lorentz symmetry.

Quantum mechanics is based on zero or infinitely many radiative time arrows. It is indifferent to LS (and to the other postulates of relativity, which are based on radiative time arrow causality), while it still remains predictive. Aether theory is indifferent to both, while it still remains predictive. In fact, precisely because LS and quantum mechanics are apparently mutually inconsistent, the question arises why not to start once again from the very beginning.

Because of this symmetry, quantum field theories only admit a few marginal or relevant deformations. If you assume that they make sense up to extremely high energy scales, you may accurately predict all of their low-energy physics as long as you know a few important parameters. Such a "complete knowledge" of physics in terms of a few parameters would be impossible in non-relativistic theories.

The same is true for relativistic theories. The emergence concept is still required to seamlessly connect both these branches of physics.
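
For context on the quoted phrase "marginal or relevant deformations" (standard power counting, not my own claim): in d = 4 spacetime dimensions an operator $\mathcal{O}$ of mass dimension $\Delta$ added to the Lagrangian carries a coupling of mass dimension $4 - \Delta$,

$$\mathcal{L} \;\to\; \mathcal{L} + g\,\mathcal{O}, \qquad [g] = 4 - \Delta,$$

so the deformation is relevant for $\Delta < 4$, marginal for $\Delta = 4$ and irrelevant for $\Delta > 4$; only the first two kinds matter at low energies, which is why such theories are predictive with just a few parameters.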

String theory is even more constrained than quantum field theory: it has no adjustable dimensionless non-dynamical parameters whatsoever. In some sense, you may view string theory as a tool to generate privileged quantum field theories with some massless spectrum and infinitely many very special, selected massive fields with completely calculable interactions. So all the Lorentz constraints that apply to quantum field theory can do the analogous job in string theory, too.

String theory is like every other quantum field theory on this point. It's true that much of the formalism was developed under the umbrella of string theory, because string theory has good marketing, the best experts and some nice faces in front of it. But these approaches can be used in many other theories, and the best string theorists, like Ed Witten, are doing so without any frustration.

However, in string theory, the character of LS is even more direct. The very short distance physics of string theory is pretty much guaranteed to respect the LS. Whenever you look at regions that are much smaller than all the curvature radii of a D+1-dimensional spacetime manifold, the dynamics of a closed string reduces to a collection of D+1 free scalars on the worldsheet which manifestly preserves the Lorentz symmetry. And one can show that the interactions respect it, too.

String theory is based on a combination of quantum mechanics and special relativity. From this point of view it is apparently less general than any theory based on a combination of quantum mechanics and general relativity, like LQG. It's just one of the evolutionary steps of physics, no less, no more. It has opened many research perspectives, while quantum gravity has opened others.
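
For reference to the quoted "free scalars on the worldsheet" (textbook material, up to sign and normalization conventions), in conformal gauge the closed-string action reduces to

$$S = \frac{1}{4\pi\alpha'} \int d^2\sigma\; \partial_a X^{\mu}\,\partial^a X^{\nu}\,\eta_{\mu\nu},$$

which is manifestly invariant under target-space Lorentz transformations $X^\mu \to \Lambda^\mu{}_{\nu} X^\nu$.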

Open strings may violate the LS spontaneously, for a nonzero B-field or a magnetic field on the brane, and one can enumerate a couple of related ways to spontaneously break the Lorentz symmetry with the presence of branes and their worldvolume fields. But none of these pictures ever hides the fact that the fundamental theory behind all these possibilities is Lorentz-invariant.

This is just one of many possible perspectives. Others may see an infinitely fractal Universe built from quantum mechanical units or even particle units. But fractal geometrodynamics, as expressed by double relativity based on the Poincaré, Cartan and de Sitter groups, is still in the game as well.

There's a lot of confusion in the public about the fate of the LS in general relativity. Be sure that the LS is incorporated into the very heart of general relativity. General relativity generalizes special relativity; it doesn't deny it. General relativity can be defined as any collection of physical laws that respect the rules of special relativity (including Lorentz invariance) in small enough regions of spacetime - regions that can, however, be connected into a curved manifold. All breaking of LS in general relativity can always be viewed as a spontaneous breaking by long-distance effects and configurations.

Every generalization is predestined to violate its roots sooner or later, more or less. My personal understanding is that general relativity has nothing to do with LS at all, being even much more general than many relativists (especially those special ones) may be willing to admit. Anyway, general relativity has nothing to do with string theory, which doesn't use the postulates of general relativity at all; that belongs to the realm of quantum gravity.
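
For the record, the textbook meaning of the quoted statement that general relativity respects special relativity "in small enough regions" is the existence of Riemann normal coordinates around any point p, in which

$$g_{\mu\nu}(x) = \eta_{\mu\nu} - \tfrac{1}{3} R_{\mu\alpha\nu\beta}(p)\, x^{\alpha} x^{\beta} + \mathcal{O}(x^3),$$

so deviations from the flat Minkowski metric appear only at second order in the distance from p and are controlled by the curvature.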

In fact, even in spacetimes with a lot of curved regions - such as spacetimes with many neutron stars or even black holes - one can use the tools of special relativity in many contexts: either in very small regions that are much smaller than all the curvature radii, or in regions that are much larger than stars and black holes. In the latter description, the stars and black holes may be viewed as local point masses or tiny disturbances that follow the laws of relativistic mechanics at much longer distances, anyway.

That's perfectly right. And large systems of such particles follow quantum or Newtonian mechanics at yet other distances, and so on.

So if someone completely neglects Lorentz invariance, the player that became so essential in 1905, he shouldn't be surprised if theoretical physicists simply ignore him or her. It is not necessary for a theory to be Lorentz-invariant from the very beginning. But a theory only starts to be interesting as a realistic theory of our world after one proves that Lorentz invariance holds exactly (or almost exactly).

It was Einstein himself who, in 1917, completely omitted Lorentz invariance from his further thoughts. Just because string theory has chosen Lorentz invariance as one of its postulates doesn't mean this approach is the only universal approach to physics. Even Einstein recognized that - so why not some string theorists?

I am personally convinced that theories that try to break Lorentz invariance by small effects are not well-motivated. But even if I insist on the things that have been established only, the "at least almost accurate" Lorentz symmetry that has been demonstrated is an extremely powerful constraint on any theory. If you invent a random theory for which no reason why it should be Lorentz-invariant is known, it is extremely likely that the LS doesn't work at all and the theory is therefore ruled out.

The small breaking of Lorentz invariance can be observed as quantum chaos. It's not a consequence of violating LS, but rather of applying it along many concurrent time arrows. Because every particle itself is Lorentz invariant, the mutual interaction of many particles brings a causal uncertainty into the global view. A theory based on such small effects is Kostelecký's theory, for example.

There are actually approaches to string theory that are not manifestly Lorentz-invariant. For example, the BFSS matrix model, or M(atrix) theory, is a 0+1-dimensional quantum field theory - a U(N) gauge theory with 16 supercharges. You can also say that it is a quantum mechanical model with many degrees of freedom organized into large Hermitean matrices. It resembles non-relativistic quantum mechanics, with some extra indices and a quartic potential.

Every theory should be defined by its postulate tensor; string theory is no exception. No theory which is based on Lorentz symmetry can derive the violation of this symmetry in a rigorous way.
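
For reference on the quoted BFSS model (schematically, bosonic part only, up to normalization conventions), the matrix quantum mechanics in question is

$$L = \frac{1}{2g^2}\,\mathrm{Tr}\!\left( D_t X^i\, D_t X^i + \tfrac{1}{2}\,[X^i, X^j][X^i, X^j] \right) + \text{fermionic terms},$$

where the $X^i$ ($i = 1,\dots,9$) are $N \times N$ Hermitian matrices and $D_t = \partial_t - i[A_t,\,\cdot\,]$; the commutator-squared term gives a non-negative potential because $[X^i, X^j]$ is anti-Hermitian.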

There is no a priori reason to think that such a seemingly non-relativistic theory - whose symmetry actually includes the Galilean symmetry known from non-relativistic physics - should be Lorentz-invariant. Except that one can defend and "effectively prove" this relativistic symmetry by arguments based on string dualities. Although it can't be completely obvious from the very beginning, the original BFSS matrix model describes a relativistic 11-dimensional spacetime of M-theory. But the relevance of the matrix model for M-theory only began to be studied seriously when arguments were found that these two theories were actually equivalent. You simply can't expect your non-relativistic model to be equally interesting for physicists if you don't have any evidence that your model respects Lorentz invariance - or if it even seems very likely that it cannot respect it. Physicists would be foolish to treat your theory on par with QED or the BFSS matrix model because it seems excessively likely that your theory can't agree with some of the basic properties of the spacetime we know.

This is not true. In AWT the LS is provided by the fact that no object can serve both as the subject and as the means of observation at the same place and time (a singular case of observation, based on a zero-degree causal tensor). Therefore the Aether concept cannot violate Lorentz symmetry locally, by its very definition.

Emergence and the role of Lorentz symmetry in the grand scheme of things.

That's right, but emergence has no relevant explanation in physics without the Aether concept, and string theory doesn't provide one. And they're both theorems of AWT. The Aether concept neither uses nor requires any other ad hoc concepts. While emergence is required for the explanation of both relativity and quantum mechanics, I believe we can safely avoid LS in the future, in the same way prof. Einstein did.

The comments above should be completely uncontroversial. But let me add a few more speculations. Because space is emergent in string theory, the LS - a symmetry linking space and time - has to be emergent, too. This symmetry of special relativity is telling us that things can't move faster than light in the newly emergent geometry. What is this constraint good for? Is Nature trying to tell us something deeper than that?

The claim "space is emergent in string theory" simply mean, space is composed of many tiny strings. If you cannot realize it, then you simply don't know, what the emergence is based on. The Nature is just trying to tell us, it doesn't matter, which concept you're use in large quantity, it always loses its conceptual subtleties and becomes a pin-point singularity, i.e. "particle" from sufficiently distant space time perspective. This is what the Aether approach is based on: on particle abstract. The symmetry you're disputing just illustrates, the LS has its principal limits in anti deSitter space. From perspective of observer sitting inside of dense fluctuation of Aether the energy will spread outside of black hole by superluminal speed without problem.

Well, I am confident that special relativity is important for life as we know it because motion is very helpful for animals and the equivalence of all inertial frames is the simplest (and maybe the only plausible) method for Nature to guarantee that the very motion won't kill the animals. Imagine that you would feel any motion - you would probably vomit all the time and die almost instantly. ;-)

Stop trolling. Special relativity is important only for the life of (special) relativists and some fundamentalist string theorists. Some people can become quite naturalistic when defending their pet theories...;-)

The Lorentz symmetry and the Galilean symmetry were the two most obvious realizations of the equivalence of all inertial frames that Nature could choose from, and She chose the LS because it treats space and time more democratically than the Galilean symmetry. (I could probably construct more robust anthropic arguments even though they would probably not be based on the motion of animals only - simply because the low value of "v/c" for animals indicates that the finiteness of "c" is not necessary for life itself.)

Nature didn't choose the LS; the Prussian Academy under Planck's leadership chose it as its paradigm to avoid the influence of Poincaré's Sorbonne. That is a difference...;-)
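
For completeness (standard formulas, added only for reference), the two candidate realizations of the equivalence of inertial frames mentioned in the quote are

$$\text{Galilean:}\;\; x' = x - vt,\;\; t' = t \qquad\qquad \text{Lorentz:}\;\; x' = \gamma(x - vt),\;\; t' = \gamma\left(t - \frac{vx}{c^2}\right),$$

and the two coincide in the limit $v \ll c$.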

But in the previous two paragraphs, we were talking about the 3+1 large dimensions of spacetime only. String theory has additional dimensions that can emerge in various ways and that are dual to each other - and the LS applies to all these dimensions as long as they become larger than the curvature (and compactification) radii. In some sense, that's quite shocking.

Emergence isn't a miracle; it has a very simple reason in AWT. Some physicists are apparently becoming cocooned creationists, because they tend to use concepts without firm reasoning behind them. This is a consequence of a more or less hidden belief in reality, not of understanding reality through logical implications based on analogies.

The conclusion is that LS violation isn't supposed to be weak at all. If we consider that particles of matter are all formed by the same vacuum as the rest of cosmic space, then LS violation is responsible for the refractive index of black holes, of elementary particles, of everything. If LS were completely universal, we wouldn't see anything of the Universe - simply because there would be nothing to deflect the path of light.

We can describe this misunderstanding with the proverb "the darkest place is under the candlestick". Many scientists spend money and their lives on an obstinate search for LS violation - while they're virtually sitting on it all the time. This just illustrates why it is so important to understand a subject at the nonformal, conceptual level. It can save money for all of us.

Every quantum mechanical phenomenon is, from this perspective, just a manifestation of a nearly singular Lorentz symmetry violation. Not to mention weaker effects like the CMB, gravitational lensing, photon-photon interactions and pair formation, the GZK limit, dark matter… Virtually, if we can observe at least something, then LS is violated there. We can see just this portion of curved space-time, because the places where LS remains valid are transparent for us by definition.

The same, merely dual, problem exists with the quest for hidden dimensions. Because scientists refuse the Aether concept, we are forced to pay them for the development of alternative models and for proposals of experiments which could confirm the presence of hidden dimensions, although every case of quantum chaos or complex long-distance interaction demonstrates them clearly. Such ignorance may appear funny, but it's an ineffective and expensive game for the rest of society, because these scientists could be involved in more useful things.

To be sarcastic regarding string theory, I'd say it tries to describe, by using LS, exactly the part of the Universe which violates it most pronouncedly. But this paradox is logical, because we can never use the same aspect of reality both as the object of observation/description and as the means of observation/description. We can see that the same logic which introduces the Aether can be used even for Lorentz symmetry at another level of reasoning. Theoretical description is dual to experimental observation in this sense. Reality is partly real, partly a consequence of theories, and observable reality forms the boundary between both approaches.

Anyway, quantum gravity suffers from the same conceptual problem, being dependent on the equivalence principle instead of LS. It just means it becomes wrong/singular in a different part of conformal space-time: it can describe the LS violation of free space, assuming a "stringy structure" for it, while it misses the complex multidimensional structure of particles.

Since string theory depends on LS, it cannot predict LS violation phenomena in a rigorous way, because it doesn't care about vacuum structure (with the exception of string field theory and some other boundary approaches). But it can describe well the complex structure of particles as such. These two nice theories are in fact AdS/CFT dual, being separated by one derivative of the Aether gradient in their descriptions (they are mutually orthogonal to each other via the Lorentz symmetry group).