This monograph grew out of a number of lectures by the author at Imperial College in London and publications in various physics journals. It is intended as a challenging alternative to the various string models for a unified theory of elementary particles, considered by some of the most outstanding physicists our best hope for a "final theory," while rejected by other equally well-recognized experts. The author of this monograph received his Ph.D. under Heisenberg, and in conversations he was privileged to have with Heisenberg, this great scientist always expressed the opinion that it makes no sense to think of objects as composed of some constituents once one goes beyond the hadronic energy scale. His argument was that the concept "to be composed of" becomes meaningless above this scale, because the sum of the masses of the fragments into which the object is supposedly broken up is larger than the mass of the object. Accordingly, it would be wrong to think of elementary particles as composed of smaller particles, in the way we are permitted to think of molecules as composed of atoms, atoms of electrons and nuclei, and nuclei of protons and neutrons.

However, not only has Heisenberg's view been contradicted by the discovery of the subnuclear quark structure, but, more importantly, by the indication that the strong, the weak and the electromagnetic force become equal at an energy of ~10^16 GeV. Because this energy is (on a logarithmic scale) not too far from the Planck energy of ~10^19 GeV, where the gravitational force becomes strong, it supports Einstein's conjecture that all forces of nature are rooted in gravity. Einstein may therefore be right, but not necessarily in his belief that this unification would have to be found in the geometry of space and time. But how can Heisenberg's argument against the concept of compositeness beyond the hadronic energy scale be reconciled with a unification energy of ~10^16 GeV, if not ~10^19 GeV?
I believe the answer is that in nature there are negative besides positive masses, with the negative masses hidden from observation below the Planck energy scale. One may then conjecture that elementary particles are composed of very large positive and negative masses, possibly as large as the Planck mass. This would be somewhat analogous to the situation in condensed matter physics with regard to the existence of positive and negative electric charges. There the positive and negative charges, obtained by separating all the positive from the negative charges in a solid piece of matter, would be enormous in comparison to the charge a solid piece of matter can actually carry before it disintegrates. The hypothetical existence of negative masses may also solve the problem of the vanishing cosmological constant. It is equivalent to a mass-neutral vacuum, which is analogous to the charge-neutral state of condensed matter. Just as small departures from charge neutrality can occur in condensed matter physics, so small departures from the mass neutrality of the vacuum would occur, felt through the presence of elementary particles with a mass small compared to the Planck mass. However, the existence of negative masses seems to be possible only if Lorentz invariance is given up as the fundamental kinematic symmetry of nature, because a relativistic theory would make a vacuum composed of positive and negative masses unstable. Only in a nonrelativistic theory, where the Hamilton operator commutes with the particle number operator, can the number of positive and negative mass particles be conserved. It is for this reason that the hypothetical existence of negative masses implies the Galilei group, with an absolute space and time, as the fundamental kinematic symmetry of nature, with the presumed unification of all forces at the Planck scale suggesting that the Galilei group is dynamically broken into the Lorentz group below this scale.
Disregarding translations, the Lorentz group - unlike the Galilei group - is noncompact, and this noncompactness is the cause of all the divergences in relativistic quantum field theories, divergences which are absent if the Galilei group is the more fundamental symmetry.

Einstein believed that the laws of nature should be unified by a noneuclidean geometry of space and time, in analogy to his gravitational field theory. Heisenberg, in contrast, believed that the unification could be reached by a group sufficiently large to accommodate all the forces and symmetries found in the spectrum of elementary particles. Taken to the extreme, these attempts have led to string theories with very large groups, to be formulated in a higher-dimensional space, not the four space-time dimensions of physics. In the opinion of the author it is more plausible that nature works like a computer with finite-size elements in three-dimensional space and one-dimensional time. If one accepts the Copenhagen interpretation and the von Neumann theorem against hidden variables, quantum mechanics - unlike Newton's mechanics - cannot work like a computer. But this becomes possible if both quantum mechanics and special relativity are no more than asymptotic theories, approximately valid for energies small compared to the Planck energy. The assumption that nature works like a computer with finite-size elements can be seen as the atomistic hypothesis of matter, space and time. It is a return to Newtonian mechanics, but with the important modification of admitting negative besides positive masses.

At the turn of the 19th century Kelvin proclaimed his famous clouds of physics: the failure of the Michelson-Morley experiment to detect an aether wind, and the violation of the classical mechanical equipartition theorem in statistical thermodynamics. The removal of these clouds led to the two great breakthroughs of 20th century physics: the theory of relativity and quantum mechanics. At the turn of the 20th century there are again clouds which so far have withstood all attempts at their removal. These clouds are:

1. The riddle of quantum gravity.

2. The (observationally established) vanishing (resp. very small) cosmological constant.

3. The superluminal quantum correlations (experimentally verified over distances of more than 10 meters).

None of the theories proposed so far has removed these clouds, yet all of them postulate the theory of relativity and quantum mechanics as ultimate gospel truths. Departing from these postulates, I show that both special relativity and quantum mechanics can be derived from a few conceptually very simple assumptions at the Planck scale. I believe that only such a radical approach can bring us closer to the removal of these clouds.

I strongly believe in the reductionist goal to deduce the laws of physics, if not from a world formula, then from a few fundamental principles. Such a theory, be it some string theory or the alternative theory offered here, would have to deduce the mathematical structure of the Lorentz group and of quantum mechanics, the properties of all elementary particles, as well as the laws of celestial mechanics. As emphasized by von Weizsäcker (reference [25], p. 192), we are here not permitted to be modest.

With the Planck length as the smallest length, the theory can be formulated in a finitistic, i.e., non-Archimedean way, in the sense of Archimedes' conjecture that the transcendental number π can be reached as the circumference of a polygon in the limit where it has an infinite number of sides, which obviously is not possible if there is a smallest length.
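Archimedes' construction can be made explicit in a few lines. The sketch below is purely illustrative and not part of the original argument: it doubles the number of sides of a regular polygon inscribed in a unit circle (using a numerically stable form of the side-doubling recursion) and stops once a side would fall below an arbitrarily chosen smallest length, so that π is approached from below but never reached.

```python
import math

def pi_by_polygons(min_side: float) -> tuple[int, float]:
    """Approximate pi by the half-perimeter of a regular polygon
    inscribed in a unit circle, doubling the number of sides until
    a side would fall below min_side (a hypothetical smallest length)."""
    n, s = 6, 1.0  # start with a hexagon: 6 sides of length 1
    while True:
        # stable form of the doubling rule s_2n = sqrt(2 - sqrt(4 - s^2))
        s_next = s / math.sqrt(2.0 + math.sqrt(4.0 - s * s))
        if s_next < min_side:
            break  # a smallest length halts the limiting process
        n, s = 2 * n, s_next
    return n, n * s / 2.0  # half the perimeter approximates pi

sides, approx = pi_by_polygons(min_side=1.0e-6)
# approx lies below pi and, with a nonzero min_side, never reaches it
```

With a smaller cutoff the approximation improves, but for any nonzero cutoff the limit π stays out of reach, which is the finitistic point being made.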

Following the completion of this book my colleague Professor H.-J. Treder brought to my attention some little-known but highly interesting papers by Mathisson [90] and von Weyssenhoff [91], which shed light on Einstein's hope to derive quantum mechanics from his matter-free vacuum gravitational field equations of general relativity. Einstein's idea was that the motion of a singularity in the matter-free gravitational field equations might reproduce the motion of a particle in quantum mechanics, but a paper towards this goal which he had published with Grommer [92] led only to the classical Newtonian motion. However, it was shown by Mathisson that the motion of a more general multipole singularity had a striking resemblance to the motion of a particle in quantum mechanics. Following up on Mathisson's work, it was shown by von Weyssenhoff that Mathisson's multipoles were actually mass multipoles involving both positive and negative masses. That Einstein and Grommer considered only unipole singularities (that is, mass monopoles) explains why they failed, but it also demonstrates the importance negative masses might have in unlocking the mysteries of the universe.

I would like to add a few remarks explaining how and why I got interested in this problem. There were three things which motivated me. First, the quantum mechanical zero-point energy of the vacuum, reintroducing a kind of aether into physics which I had thought was permanently expurgated by Einstein. Second, my getting acquainted with the dynamic pre-Einstein theory of relativity of Lorentz and Poincaré. And third, the apparent gross violation of Einstein's no-faster-than-light velocity postulate of special relativity in the quantum correlation experiments of the Einstein-Podolsky-Rosen type. With regard to the zero-point energy I was struck by the fact that it is Lorentz invariant, even though it was derived solely from the laws of quantum mechanics, not relativity. Could this perhaps signify a deeper relationship between quantum mechanics and relativity, I repeatedly asked myself. The zero-point energy had to be Lorentz invariant to complete Lorentz's derivation of the Lorentz contraction, and hence the entire body of special relativity, because Lorentz had not taken into account the "repulsive" quantum force caused by the zero-point energy, without which matter would not be stable. However, since the zero-point energy is divergent, it had to be cut off at some length, most likely the Planck length. Such a cut-off would destroy Lorentz invariance in approaching this length, defining a distinguished reference system in which the zero-point energy is at rest. The Planck length, of course, corresponds to the incredible energy of ~10^19 GeV, far beyond the reach of any particle accelerator. But there remained the faster-than-light quantum correlations, and it occurred to me that they could be a manifestation of unknown physics at the Planck length. I thus came to the conviction that what is called the collapse of the wave function, demonstrated in the quantum correlation experiments to go with superluminal speed, had to involve the Planck mass. (This conviction I share with Penrose, whose ideas, however, are very different.)

The assumption of negative besides positive masses is consistent with the assumption that the fundamental group of nature is SU2, and that nature works like a computer with a binary number system. The fact that SU2 is locally isomorphic to SO3, the rotation group in three-dimensional space, makes it understandable why space is three-dimensional.
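The relation between SU2 and SO3 can be made concrete. The following sketch (illustrative only, assuming the standard Pauli-matrix representation) computes, for a matrix U in SU2, the rotation R in SO3 defined by U(x·σ)U† = (Rx)·σ, and checks that U and -U give the same rotation, i.e., that the covering map is two-to-one (a local, not global, isomorphism).

```python
import numpy as np

# Pauli matrices: a basis of the traceless Hermitian 2x2 matrices
sigma = [np.array([[0, 1], [1, 0]], dtype=complex),
         np.array([[0, -1j], [1j, 0]], dtype=complex),
         np.array([[1, 0], [0, -1]], dtype=complex)]

def su2_to_so3(U):
    """Rotation R defined by U (x.sigma) U^dagger = (R x).sigma."""
    R = np.empty((3, 3))
    for j in range(3):
        M = U @ sigma[j] @ U.conj().T
        for i in range(3):
            # project M back onto the Pauli basis: R_ij = Tr(sigma_i M)/2
            R[i, j] = 0.5 * np.trace(sigma[i] @ M).real
    return R

theta = 0.7  # an arbitrary rotation angle about the z-axis
U = np.cos(theta / 2) * np.eye(2) - 1j * np.sin(theta / 2) * sigma[2]
R = su2_to_so3(U)

# U and -U map to the same rotation: the covering is two-to-one
assert np.allclose(su2_to_so3(-U), R)
```

The resulting R is an orthogonal matrix with determinant +1, here a rotation by theta about the z-axis, while U and -U are distinct elements of SU2.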

I have chosen the name Planck Aether because it was Planck who in 1911 [96] recognized that quantum theory leads to a vacuum filled with a zero-point energy.

Finally, my special thanks go to Prof. Dr. A. Klemm, editor of the Zeitschrift für Naturforschung, to Prof. Dr. David Finkelstein, editor of the International Journal of Theoretical Physics, and to the late Prof. Dr. R. Raczka, editor of the Acta Physica Polonica, for their courage in giving, in their journals, a forum to the kind of iconoclastic ideas I strongly believe the scientific community should be confronted with.

F. Winterberg, Reno, Nevada, USA 2002