Unified Field Mechanics, the topic of the 9th international symposium honoring noted French mathematical physicist Jean-Pierre Vigier, cannot be dismissed as highly speculative, as a myopic critic might surmise. The proceedings of the 8th Vigier Symposium, "The Physics of Reality", should in fact be regarded as a companion volume for its dramatic development of theoretical field mechanics in additional dimensionality. Many still consider the Planck-scale zero-point-field stochastic quantum foam to be the 'basement of reality'. This can be considered true only under the limitations of the Copenhagen interpretation of quantum theory. As we enter the next regime of Unified Field Mechanics, we now know that the energy-dependent Einstein–Minkowski manifold called spacetime has a finite radius beyond which a large-scale multiverse beckons. So far a battery of 14 experiments has been designed to falsify the model. When the first is successfully performed, a revolution in natural science will occur! This volume strengthens and expands the theoretical and experimental basis for that imminent new age.
https://doi.org/10.1142/9789814719063_fmatter
https://doi.org/10.1142/9789814719063_0001
Beginning with an elementary, oscillatory discrete dynamical system associated with the square root of minus one, we study the foundations of both mathematics and physics. Position and momentum do not commute in our discrete physics. Their commutator is related to the diffusion constant for a Brownian process and to the Heisenberg commutator in quantum mechanics. We take John Wheeler's idea of It from Bit as an essential clue and we rework the structure of that bit to a logical particle that is its own anti-particle, a logical Majorana particle. This is our key example of the amphibian nature of mathematics and the external world. We show how the dynamical system for the square root of minus one is essentially the dynamics of a distinction whose self-reference leads to both the fusion algebra and the operator algebra for the Majorana fermion. In the course of this, we develop an iterant algebra that supports all of matrix algebra, and we end the essay with a discussion of the Dirac equation based on these principles.
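The kernel of this construction can be sketched in a few lines of Python. The matrix representation and names below are an illustration of mine, following Kauffman's published iterant papers, not code from the chapter: the period-two waveform [+1, -1] together with the shift that swaps its two phases yields a square root of minus one, and the two generators square to one and anticommute, which is precisely the Majorana operator algebra named in the title.

```python
def mat_mul(a, b):
    """Product of two 2x2 matrices given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# The iterant view of sqrt(-1): an alternating waveform plus a time shift.
e = [[1, 0], [0, -1]]    # encodes the period-two sequence [+1, -1]
eta = [[0, 1], [1, 0]]   # shift operator: swaps the two phases

i = mat_mul(e, eta)      # the iterant [+1, -1] combined with the shift

# i plays the role of sqrt(-1): i * i is minus the identity matrix.
print(mat_mul(i, i))     # [[-1, 0], [0, -1]]
```

Note that e and eta each square to the identity and anticommute, the defining relations of a pair of Majorana operators; i arises as their product.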
https://doi.org/10.1142/9789814719063_0002
A fundamental problem in the theory of computing concerns whether descriptions of systems at all times remain tractable, that is, whether the complexity that inevitably results can be reduced to a polynomial form (P) or whether some problems lead to a non-polynomial (NP) exponential growth in complexity. Here, we propose that the universal computational rewrite system that can be shown to be ultimately responsible for the development of mathematics, physics, chemistry, biology and even human consciousness is so structured that Nature remains P at any scale and so will be computationally tractable.
https://doi.org/10.1142/9789814719063_0003
We re-examine the “Regge-Tolman paradox” with reference to some recent experimental results. It is straightforward to find a formula for the velocity v of the moving system required to produce causality violation. This formula typically yields a velocity very close to the speed of light (for instance, v/c > 0.97 for X-shaped microwaves), which raises some doubts about the real physical observability of the violations. We then compute the velocity requirement after introducing a delay between the reception of the primary signal and the emission of the secondary. It turns out that, in principle, for any delay it is possible to find moving observers able to produce active causal violation. This is mathematically due to the singularity of the Lorentz transformations for β → 1. For a realistic delay due to the propagation of a luminal precursor, we find that causality violations in the reported experiments are even more unlikely (v/c > 0.989), and even in the hypothesis that the superluminal propagation velocity goes to infinity, the velocity requirement is bounded by v/c > 0.62. We also prove that if two macroscopic bodies exchange energy and momentum through superluminal signals, then the swap of signal source and target is incompatible with the Lorentz transformations; therefore it is not possible to distinguish between source and target, even with reference to a definite reference frame.
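For orientation, the bare delay-free threshold underlying figures like these is the textbook Tolman condition: a signal propagating at speed u > c in one frame arrives before it is emitted for observers moving at v > c²/u. The sketch below is mine, not the paper's formula (which additionally accounts for the delay and the precursor), and the value u/c = 1.03 is an illustrative assumption chosen to show how a slightly superluminal signal already demands v/c near 0.97.

```python
def min_frame_speed_for_reversal(u_over_c):
    """Textbook Tolman condition: minimum v/c of an observer for whom a
    signal of speed u > c travels backwards in time, i.e. v/c = 1/(u/c)."""
    if u_over_c <= 1.0:
        raise ValueError("signal must be superluminal (u/c > 1)")
    return 1.0 / u_over_c

# A signal only 3% faster than light requires observers at about 0.97 c.
print(min_frame_speed_for_reversal(1.03))
```

As u/c grows this threshold falls toward zero, which is why the paper's lower bound of v/c > 0.62 even for infinite signal speed, once a delay is included, genuinely strengthens the delay-free condition.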
https://doi.org/10.1142/9789814719063_0004
Dimensionality has been a much discussed subject since Minkowski formalized special relativity by extending 3D space to 4D space-time. However, there has never been any consensus on the number of dimensions that nature requires and there has been no explanation of why dimensions are needed at all. It is proposed here that dimensions originate in the theory of numbers, that extending the number of dimensions beyond the 3 required by Euclidean space necessarily requires a fundamental change in the meaning of the concept, and that, although various algebraic techniques allow such extension of dimensionality, the structures required always ensure that the number of dimensions and their fundamental characteristics remain ambiguous, leaving the final question unanswerable.
https://doi.org/10.1142/9789814719063_0005
This paper will first survey the hyperincursive and incursive algorithms used to discretize the classical harmonic oscillator. These algorithms show stable orbits with conservation of energy. This paper will then apply these hyperincursive and incursive algorithms to the quantum harmonic oscillator. The hyperincursive quantum harmonic oscillator is separable into two incursive quantum harmonic oscillators. Numerical simulations confirm the stability of these hyperincursive and incursive algorithms.
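The stabilizing idea can be illustrated with a minimal one-dimensional sketch (an illustration of mine; Dubois' actual incursive algorithms may differ in detail). The "incursive" step computes the new velocity from the position value just produced within the same time step, that is, from a future state; this keeps the discrete orbit and energy bounded, whereas the fully explicit update lets the energy grow.

```python
def explicit_energy(omega=1.0, dt=0.01, steps=10_000):
    """Explicit Euler: both updates use the old state. Energy grows."""
    x, v = 1.0, 0.0
    for _ in range(steps):
        x, v = x + dt * v, v - dt * omega**2 * x   # old x throughout
    return 0.5 * v**2 + 0.5 * omega**2 * x**2

def incursive_energy(omega=1.0, dt=0.01, steps=10_000):
    """Incursive-style step: the velocity update uses the just-computed x."""
    x, v = 1.0, 0.0
    for _ in range(steps):
        x = x + dt * v
        v = v - dt * omega**2 * x                  # new x: incursive
    return 0.5 * v**2 + 0.5 * omega**2 * x**2
```

Starting from energy 0.5, the explicit scheme drifts far above that value after 10,000 steps, while the incursive scheme stays within about one percent of it.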
https://doi.org/10.1142/9789814719063_0006
In this paper we develop solutions to the Dirac equation in complex relativistic Minkowski 8-space and nilpotent space-antispace quaternionic form and discuss various implications and applications. We note unique solutions in complex spin space. Tachyonic/tardyonic signaling also appears to arise from this formalism, as well as extensions to micro- and macro-nonlocality. Complex 8-space Dirac equation solutions exhibit additional nonlinear terms which may yield extension of the formalism from Special to General Relativity.
https://doi.org/10.1142/9789814719063_0007
Modeling the ‘creation/emergence’ of matter from spacetime is as old as modern cosmology itself, and not without controversy within each model, such as the Static, Steady-State, Big Bang or Multiverse Continuous-State models. In this paper we present only a brief, primitive introduction to a new form of ‘Exciplex-Zitterbewegung’ dual space-antispace vacuum particle creation applicable especially to Big Bang alternatives, which are well known but ignored; Hubble discovered ‘redshift’, not a Doppler expansion of the universe, although the latter remains the currently popular interpretation. Holographic Anthropic Multiverse cosmology provides viable alternatives to all seemingly sacrosanct pillars of the Big Bang. A model for Multiverse Space-Antispace Dual Calabi-Yau ‘Exciplex-Zitterbewegung’ particle creation has only become possible by incorporating the additional degrees of freedom that a complex-dimensional extended Yang-Mills Kaluza-Klein correspondence provides.
https://doi.org/10.1142/9789814719063_0008
Though we often refer to 3-D vector space as constructed from points, there is no mechanism from within its definition for doing this. In particular, space, on its own, cannot accommodate the singularities that we call fundamental particles. This requires a commutative combination of space as we know it with another 3-D vector space, which is dual to the first (in a physical sense). The combination of the two spaces generates a nilpotent quantum mechanics/quantum field theory, which incorporates exact supersymmetry and ultimately removes the anomalies due to self-interaction. Among the many natural consequences of the dual space formalism are half-integral spin for fermions, zitterbewegung, Berry phase and a zero norm Berwald-Moor metric for fermionic states.
https://doi.org/10.1142/9789814719063_0009
This paper will examine a physical principle that has been used in making valid predictions and generalizes established conservation laws. In a previous paper it was shown how Rowlands' zero-totality condition could be viewed as a generalization of Newton's third law of motion. In this paper it will be argued that Rowlands' Duality Principle is a generalization of Noether's Theorem and that the two principles taken together are truly foundational principles that have tamed Metaphysics.
https://doi.org/10.1142/9789814719063_0010
Science has advanced dramatically in recent years and with the proliferation of academic journals the amount of material to be read and absorbed in any field has grown to the point where few, if any, can keep up with it all. In all of this, the rise of the power of mathematics to influence much thought in the sciences - both physical and other - has proceeded rapidly, no doubt in part due to so many being wary of mathematics and mathematicians. Here the problem is assessed and the true toll of mathematics in the formulation of scientific theories is defined.
https://doi.org/10.1142/9789814719063_0011
Vibrating media offer an important testing ground for reconciling conflicts between General Relativity, Quantum Mechanics and other branches of physics. For sources like a Weber bar, the standard covariant formalism for elastic bodies can be applied. The vibrating string, however, is a source of gravitational waves which requires novel computational techniques, based on the explicit construction of a conserved and renormalized energy-momentum tensor. Renormalization (in a classical sense) is necessary to take into account the effect of external constraints, which affect the emission considerably. Our computation also relaxes the usual simplifying assumptions, such as the far-field approximation, spherical or plane wave symmetry, the TT gauge and the absence of internal interference. In a further step towards unification, the method is then adapted to give the radiation field of a transverse Alfvén wave in a rarefied astrophysical plasma, where the tension is produced by an external static magnetic field.
https://doi.org/10.1142/9789814719063_0012
A charged photon and its light-speed helical trajectory form a surprising new solution to the relativistic electron's energy-momentum equation. This charged photon model is a new model for the electron, and quantitatively resembles the light-speed electron described by Dirac. The charged photon model of the electron is found to quantitatively predict the relativistic de Broglie wavelength of the free electron. This suggests a new interpretation of quantum mechanics, where the electron is seen as a charged photon, and the quantum mechanical wave equations for the electron are generated by the waves produced by the circulating charged photon that composes the electron.
https://doi.org/10.1142/9789814719063_0013
Physicists' understanding of relativity, and the way it is handled, is to the present day dominated by the interpretation of Albert Einstein, who related relativity to specific properties of space and time. The principal alternative to Einstein's interpretation is based on a concept proposed by Hendrik A. Lorentz, which uses knowledge of classical physics alone to explain relativistic phenomena. In this paper, we will show that, on the one hand, the Lorentz-based interpretation provides a simpler mathematical way of arriving at the known results for both Special and General Relativity; on the other hand, it is able to solve problems which have remained open to this day. Furthermore, a particle model will be presented, based on Lorentzian relativity and the quantum mechanical concept of Louis de Broglie, which explains the origin of mass without the use of the Higgs mechanism. It is based on the finiteness of the speed of light and provides classical results for particle properties which are currently only accessible through quantum mechanics.
https://doi.org/10.1142/9789814719063_0014
A possible cause of the finiteness of the velocity of tangible objects is demonstrated without reference to the provisions of the special theory of relativity. A condition is formulated on the basis of which the assumption that tangible objects can move at any prescribed velocity proves to be self-contradictory whenever the prescribed velocity exceeds a certain value. This condition consists in the presence, within material bodies, of interaction signals and carrier particles that propagate at a velocity greater than any prescribed velocity of the bodies themselves.
https://doi.org/10.1142/9789814719063_0015
Here it is intended to reconsider briefly some of the objections which have arisen over the years to both the Special and General Theories of Relativity before raising the question of whether or not either of these two theories is actually required by modern physics.
https://doi.org/10.1142/9789814719063_0016
Recent experimental results, which are not easy to explain in the light of currently accepted theories, can find an explanation in the framework of a theory of locally deformed space-time. Small cavities inside pressed solids and bubbles inside cavitated liquids are taken as micro-reactors where deformed space-time reactions can take place.
https://doi.org/10.1142/9789814719063_0017
In the earlier article “General Relativity Theory – well proven and also incomplete?” it was argued that general relativity (GRT) makes contradictory predictions about the total energy of a particle resting in a gravitational field, and that this contradiction is resolved by expanding general relativity. General relativity is contradictory in energy questions because, on one side, the total energy of a particle resting in the gravitational field is lower than its rest mass (energy is needed to pull the particle out of the gravitational field), while on the other side it is equal to its rest mass (a consequence of the equivalence principle). In the present article these considerations are generalized to a moving particle. A particle moving in the gravitational field has a total energy less than its rest mass times the relativistic γ-factor, since energy is needed to pull the particle out without changing its velocity. On the other side, the total energy of a moving particle is equal to its rest mass times the relativistic γ-factor (this, too, is a consequence of the equivalence principle). This contradiction is resolved by expanding general relativity in the same manner as above. A final remark: though the author's aim is not to reject general relativity but to expand it, he has been treated as an uncritical anti-relativist since the start of these considerations, now more than 20 years ago.
https://doi.org/10.1142/9789814719063_0018
Currently, the formula E = mc² is often misinterpreted as an unconditional equivalence between inertial mass and every type of energy (i.e., m = E/c²). However, according to Einstein's general relativity, such a claim is incorrect. Clearly, a necessary condition for any type of energy (or a combination of several types of energy) to be equivalent to mass is that the combined energy results in a tensor having a non-zero trace. For instance, the energy of a photon includes the energy of its gravitational wave component, as demanded by E = mc². It is pointed out that the Reissner-Nordström metric illustrates that electromagnetic energy and mass are different in terms of gravity. The misinterpretation of E = mc² as unconditional is responsible for overlooking the repulsive charge-mass interaction. Such a repulsive gravitational interaction implies that the unification of electromagnetism and gravitation is necessary and correct. Moreover, this analysis shows that the interpretations of both Wilczek and 't Hooft, on m = E/c² being unconditionally true, are incorrect.
https://doi.org/10.1142/9789814719063_0019
The density of a body is the physical quantity that expresses the relationship between its mass and its volume. Considered in thermodynamic terms, the density of a body varies with changes in its pressure and temperature. Considered in kinematic terms, according to Galilean-Newtonian mechanics, the density of a body does not change, because the mass and volume of a body do not depend on its velocity. This paper will examine the behavior of the density of a body in kinematic terms according to the theory of special relativity (TSR). According to TSR, the mass and volume of a body vary depending on its velocity, giving rise to the question of how a body's density behaves according to TSR. After a very simple theoretical analysis it is concluded that the density of a body (considered in kinematic terms) is an invariant quantity in relative motion even according to the TSR. The behavior of the density of bodies in relative motion is treated only marginally, or not at all, in the works and texts that address this theory.
https://doi.org/10.1142/9789814719063_0020
Recently we hear more and more physicists saying ‘spacetime is doomed’, ‘spacetime is a mirage’, the ‘end of spacetime’, ‘spacetime is not fundamental but emergent’, etc. “Henceforth space by itself and time by itself are doomed to fade into the mere shadows, and only a union of the two will preserve an independent reality.” – Hermann Minkowski, 1908. We have come full circle from the time of Minkowski's statement to the brink of an imminent new age of discovery. The basis of our understanding of the natural world has evolved in modern times from Newtonian Mechanics to the 2nd regime of Quantum Mechanics, and now to the threshold of a 3rd regime - Unified Field Mechanics (UFM). The Planck-scale stochastic quantum realm can no longer be considered the ‘basement’ or fundamental level of reality. As hard as quantum reality was to imagine, so too is the fact that the quantum domain is a manifold of finite radius, and that the ‘sacrosanct, indelible’ Quantum Uncertainty Principle can now be surmounted. For decades mainstream physicists have been stymied in efforts to reconcile General Relativity with Quantum Mechanics. The stumbling block lies with the two theories' conflicting views of space and time: for quantum theory, space and time offer a fixed backcloth against which particles move; in Einstein's relativities, space and time are not only inextricably linked, but the resultant spacetime is warped by the matter within it. In our nascent UFM paradigm, for arcane reasons, the quantum manifold is not the regime of integration with gravity; it is instead integrated with the domain of the unified field, where the forces of nature are deemed to unify. We give a simple survey of the fundamental premises of UFM and summarize experimental protocols to falsify the model at this stage of the paradigm's development.
https://doi.org/10.1142/9789814719063_0021
The authors propose that the structure of a micro-vacuum plenum acts as a driving mechanism of the universe, determining what we observe as macroscopic topological conditions on cosmogenesis and cosmological evolution. The detailed structure of the vacuum plenum is based on a quantum gravity model. At the micro scale, quantum processes are formulated in terms of non-abelian algebras. Through the pervasive vacuum structure, particle and atomic processes can be reconciled with the larger structures of cosmogenic evolution; that is, matter is continuously created throughout the universe. The constraints of a Schwarzschild-like criterion can also be applied at each point in the evolutionary process, which appears not to be linear in all aspects. The source of new matter is considered to be the activity of, and interaction through, vacuum effects describable in terms of creation and destruction operators in the Feynman graphical techniques. In the early stages of cosmogenic processes, inflationary-like events or multiple big bangs may have occurred, driven by the vacuum plenum; such vacuum effects occur where quantum gravitational processes are much more dominant. Our approach may lead to an understanding of the observed acceleration of distant bright supernovae of high z (over 6). The current Hubble expansion requires a modification of the model of early universe conditions and may be more appropriate for current cosmological considerations of local astrophysical phenomena.
https://doi.org/10.1142/9789814719063_0022
Too many physicists believe the ‘phallacy’ that the quantum is more fundamental than relativity without any valid supporting evidence, so the earliest attempts to unify physics based on the continuity of relativity have been all but abandoned. This belief is probably due to the wealth of pro-quantum propaganda and general ‘phallacies in fysics’ that were spread during the second quarter of the twentieth century, although serious ‘phallacies’ exist throughout physics on both sides of the debate. Yet both approaches are basically flawed because both relativity and the quantum theory are incomplete and grossly misunderstood as they now stand. Had either side of the quantum versus relativity controversy sought common ground between the two worldviews, total unification would have been accomplished long ago. The point is, literally, that the discrete quantum, continuous relativity, basic physical geometry, theoretical mathematics and classical physics all share one common characteristic that has never been fully explored or explained – a paradoxical duality between a dimensionless point (discrete) and an extended length (continuity) in any dimension – and if the problem of unification is approached from an understanding of how this paradox relates to each paradigm, all of physics and indeed all of science could be unified under a single new theoretical paradigm.
https://doi.org/10.1142/9789814719063_0023
It is shown that the pre-metric approach to Maxwell's equations provides an alternative to the traditional Einstein-Maxwell unification problem, namely, that electromagnetism and gravitation are unified in a different way that makes the gravitational field a consequence of the electromagnetic constitutive properties of spacetime, by way of the dispersion law for the propagation of electromagnetic waves.
https://doi.org/10.1142/9789814719063_0024
The principal contradiction that exists between Quantum Mechanics and Relativity Theory constitutes the biggest unresolved enigma in modern science. To date, none of the candidate theory-of-everything (TOE) models has received any satisfactory empirical validation. A new hypothetical model called the ‘Computational Unified Field Theory’ (CUFT) has been developed over the past three years. In this paper it will be shown that the CUFT is capable of resolving the key theoretical inconsistencies between quantum and relativistic models. Additionally, the CUFT fully integrates the four physical parameters of space, time, energy and mass as secondary computational features of a singular Universal Computational Principle (UCP), which produces the entire physical universe as an extremely rapid series of spatially exhaustive ‘Universal Simultaneous Computational Frames’ (USCF) embodied within a novel ‘Universal Computational Formula’ (UCF). An empirical validation of the CUFT as a satisfactory TOE is given based on the recently discovered ‘Proton Radius Puzzle’, which confirms one of the CUFT's ‘differential-critical predictions’ distinguishing it from both quantum and relativistic models.
https://doi.org/10.1142/9789814719063_0025
A discussion of the various aspects and spectrum of Cognitive Science with an attempt of forming a unified field for the variety of arenas.
https://doi.org/10.1142/9789814719063_0026
I use spherical-numbers to model and study interacting wave functions, and recover known physical laws. A wave function interacts with and changes space; the natural forces and quantum properties emerge. The study describes an absolute reality that withstands the tests of relativity. A Bohr-like model of the hydrogen atom dilates the transition frequencies. This alternative approach could provide an ansatz for a unified field theory; however, it has a price: most present-day accepted truths need revision.
https://doi.org/10.1142/9789814719063_0027
Instead of the spacetime postulated in general relativity, this paper postulates a fluid aether that fills the Euclidean three-dimensional space where our universe exists; the postulated aether is formed by energy-like extended objects called sagions, thus avoiding shortcomings inherent to an aether formed by material particles. The sagion aether is described by the standard four-dimensional equation of fluids, and thus obeys the two basic laws of physics: conservation of linear momentum and conservation of total energy. All forces of Nature arise from local imbalances of pressure in the sagionic fluid, imbalances associated with the presence of material bodies — a mechanism implicit in Le Sage's generation of gravitational attraction. By analogy with classical collisions of material particles, an analysis of the sagion-sagion interaction leads to identifying (1) the primigenial breaking of symmetry, and (2) the microscopic mechanism eventually leading to local agglomerations of matter. It is noteworthy that the structure of the tensor equation of general relativity is quite similar to the 4D equation describing macroscopic fluids, and the sagion aether. The connection between aether and electromagnetism (EM) is implicit in Maxwell's equations, and the similarity of EM and gravity was noted by Faraday and Heaviside in the 19th century. It is manifest that the mathematical structure of the spatial part of the classical homogeneous wave equation (CHWE) — deeply connected to EM — resembles the spatial part of Schrödinger's equation, the basis of quantum mechanics (QM). As shown here for the first time, Boscovich's unified force of nature (BUFN) is a particular case of the novel solutions for the CHWE found by the present author in the mid-1990s; this provides a deeper connection between QM and the CHWE that, quite surprisingly, leads to classical quantization, thus explaining the ad hoc quantum rules introduced a hundred years ago by Bohr, Wilson and Sommerfeld.
Gravitation and nuclear structure are connected in the Le Sagian gravitational model recently reported elsewhere by the present writer.
https://doi.org/10.1142/9789814719063_0028
A brief introductory survey of Unified Field Mechanics (UFM) is given from the perspective of a Holographic Anthropic Multiverse cosmology in 12 ‘continuous-state’ dimensions. The paradigm, with many new parameters, is cast in a scale-invariant, conformal, covariant Dirac polarized vacuum utilizing extended HD forms of the de Broglie-Bohm and Cramer interpretations of quantum theory. The model utilizes a unique form of M-Theory based in part on the original hadronic form of string theory, which had a variable string tension, T_S, and included a tachyon. The model is experimentally testable, thus putatively able to demonstrate the existence of large-scale additional dimensionality (LSXD), test for QED-violating tight-bound-state spectral lines in hydrogen ‘below’ the lowest Bohr orbit, and surmount the quantum uncertainty principle utilizing a hyperincursive Sagnac Effect resonance hierarchy.
https://doi.org/10.1142/9789814719063_0029
Recently, several discussions of the possible observability of 4-vector fields have been published in the literature. Furthermore, several authors have recently claimed the existence of a helicity = 0 fundamental field. We re-examine the theory of antisymmetric tensor fields and 4-vector potentials, and study their massless limits. The theoretical motivation for this venture lies in the old papers of Ogievetskiĭ and Polubarinov, Hayashi, and Kalb and Ramond. Ogievetskiĭ and Polubarinov proposed the concept of the notoph, whose helicity properties are complementary to those of the photon. We analyze the quantum field theory taking into account the mass dimensions of the notoph and the photon. It appears possible to describe both photon and notoph degrees of freedom on the basis of the modified Bargmann-Wigner formalism for the symmetric second-rank spinor. Next, we proceed to derive equations for the symmetric tensor of the second rank on the basis of the Bargmann-Wigner formalism in a straightforward way; the symmetric multispinor of the fourth rank is used. Due to serious problems with the interpretation of the results obtained using the standard procedure, we generalize it and obtain the spin-2 relativistic equations, which are consistent with general relativity. Thus, in fact, we deduce the gravitational field equations from relativistic quantum mechanics. The relations of this theory with the scalar-tensor theories of gravitation and f(R) gravity are discussed. Particular attention has been paid to the correct definitions of the energy-momentum tensor and other Noether currents in the electromagnetic theory, the relativistic theory of gravitation, general relativity, and their generalizations. We estimate possible interactions (fermion-notoph, graviton-notoph, photon-notoph) and conclude that they can probably be seen in experiments in the next few years.
https://doi.org/10.1142/9789814719063_0030
Previous work by the author [David Sands 2008 Eur. J. Phys. 29 129 doi:10.1088/0143-0807/29/1/013] compared the Boltzmann entropy, k ln W, with the thermodynamic entropy, C ln T, in the case of a temperature-independent heat capacity, C. The simple form of both entropy expressions allows for a simple expression for W, and a physical interpretation of the accessible states behind this expression was presented. For the case when C varies with temperature, as in a solid below the Debye temperature, it is not so obvious that this interpretation applies. If there is no simple interpretation of the accessible states, it is by no means clear that the Boltzmann entropy is the same as the thermodynamic entropy. In this paper the accessible states of more complex systems are examined. It is shown that the same idea of accessible states can be applied, but that a real difference between the Boltzmann and thermodynamic entropies is implied by this interpretation.
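For the temperature-independent case the abstract starts from, the link between the two expressions is one line of algebra: setting k ln W = C ln T gives W = T^(C/k). A numerical sketch (notation and example values are mine):

```python
import math

k = 1.380649e-23   # Boltzmann constant, J/K

def boltzmann_W(T, C):
    """Accessible-state count implied by equating C*ln(T) with k*ln(W)."""
    return T ** (C / k)

# Example: a system with heat capacity C = 3k has W = T**3 states
# (up to floating-point rounding), so at T = 10 we expect about 1000.
C = 3 * k
W = boltzmann_W(10.0, C)
```

The identity k ln W = C ln T then holds by construction at any temperature, which is the equality the paper scrutinizes once C is allowed to vary with T.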
https://doi.org/10.1142/9789814719063_0031
Our goal is to integrate the objective and subjective aspects of our personal experience into a single complete theory of reality. To further this endeavor we replace elementary particles with elementary events as the building blocks of an event-oriented description of that reality. The simplest event in such a conception is an adaptation of J. A. Wheeler's primitive explanatory-measurement cycle between internal observations experienced by an observer and their assumed physical causes. We will show how internal forces between charge and mass are required to complete the cyclic sequence of activity. This new formulation of internal material is easier to visualize and map to cognitive experiences than current formulations of sub-atomic physics. In our formulation, called Cognitive Action Theory, such internal forces balance the external forces of gravity-inertia and electricity-magnetism. They thereby accommodate outside influences by adjusting the internal structure of the material from which all things are composed. Such accommodation is interpreted as the physical implementation of a model of the external physical world in the brain of a cognitive being, or alternatively as the response mechanism to external influences in the material of inanimate objects. We adopt the de Broglie-Bohm causal interpretation of quantum theory to show that the nature of space in our model is mathematically equivalent to a field of clocks. Within this field small oscillations form de Broglie waves. This interpretation allows us to visualize the underlying structure of empty space as a charge-mass separation field in equilibrium, with objects appearing in space as quantum wave disturbances to that equilibrium occurring inside material. Space is thereby associated with the internal structure of material, and quantum mechanics is shown to be, paraphrasing Heisenberg, the physics of the material that knows the world.
https://doi.org/10.1142/9789814719063_0032
The late Professor Kazuo Kondo (Department of Mathematics, Tokyo University, Japan) left a hitherto unknown a priori particle theory which provides predictions of massive particles which may be detected by the Large Hadron Collider (LHC) and related apparatus. This article briefly introduces Kondo's work and documents the derivation and masses of his expected hyper-mesons, hyper-hadrons, heavy leptons and massive neutrinos. Several particles in these classes may have already been detected.
https://doi.org/10.1142/9789814719063_0033
Non-locality, i.e., some sort of instantaneous interaction or determination of correlations, has come to be identified with the theory of Quantum Mechanics in recent times. Being in direct conflict with the basic principles of Relativity Theory, it poses a challenge. Herein, various critical arguments raised in the past and judged to be particularly incisive are reviewed. These include the identification of an error in the derivation of Bell Inequalities, the observation that Bohm inadvertently selected a non-quantum venue for experimental tests of Bell Inequalities and, finally, an examination of the complexities that have rendered classical simulations of these experiments unsatisfactory.
https://doi.org/10.1142/9789814719063_0034
Werner Heisenberg's well-known requirement that Physical Science ought to occupy itself solely with entities that are both observable and measurable is almost universally accepted. Starting from the above thesis and accepting Albert Einstein's second fundamental hypothesis, as stated in his historical article “On the Electrodynamics of Moving Bodies”, we are led to the conclusion that the kinematics of a material point, as measured and described by a localized real-life Observer, always refers not to its present position but rather to the one it occupied at a previous moment in time, which we call the Conjugate Position, or the Retarded Position according to Richard Feynman. From the experimenter's point of view, only the Conjugate Position is important. Thus, the moving entity is observed and measured at a position different from the one it occupies now, a conclusion eerily evocative of the “shadows” paradigm in Plato's Cave Allegory. This, i.e. the kinematics of the Conjugate Position, is analytically described by the “Theory of Harmonicity of the Field of Light”. Having selected Projective Space as its geometrical space of choice, an important conclusion of this theory is that, for a localized Observer, it is possible for a linearly moving object to appear simultaneously at two different positions and, consequently, in two different states in the Observer's Perceptible Space. This conclusion leads to the formulation of at least two fundamental theorems as well as to a plethora of corollaries, all in accordance with the notions of contemporary Quantum Mechanics. A new form of the Unified Field of Light is presented.
https://doi.org/10.1142/9789814719063_0035
It is desirable to understand the movement of both matter and energy in the universe based upon fundamental principles of space and time. Time dilation and length contraction are features of Special Relativity derived from the observed constancy of the speed of light. Quantum Mechanics asserts that motion in the universe is probabilistic and not deterministic. While the practicality of these dissimilar theories is well established through widespread application, inconsistencies in their marriage persist, marring their utility and preventing their full expression. After identifying an error in perspective, the current theories are tested by modifying logical assumptions to eliminate paradoxical contradictions. Analysis of simultaneous frames of reference leads to a new formulation of space and time that predicts the motion of both kinds of particles. Proper Space is a real, three-dimensional space clocked by proper time that is undergoing a densification at the rate of c. Coordinate transformations to a familiar object space and a mathematical stationary space clarify the counterintuitive aspects of Special Relativity. These symmetries demonstrate that within the local universe stationary observers are a forbidden frame of reference; all is in motion. In lieu of Quantum Mechanics and Uncertainty, the use of the imaginary number i is restricted to the labeling of mass as either material or immaterial. This material phase difference accounts for both the perceived constant velocity of light and its apparent statistical nature. The application of Proper Space Kinematics will advance more accurate representations of microscopic, macroscopic, and cosmological processes and serve as a foundation for further study and reflection, thereafter leading to greater insight.
https://doi.org/10.1142/9789814719063_0036
The following paper attempts to reconstruct the theory of physical operators as presented by the Polish physicist L. Silberstein, and to display the consequences of comparing such a depiction of operators with the depiction introduced into quantum mechanics by M. Born, N. Wiener, and H. Weyl. The article aims to verify and give grounds to the argument presented by M. Jammer that the theory of L. Silberstein is to some extent an anticipation of formal aspects of the operational approach in contemporary quantum mechanics.
https://doi.org/10.1142/9789814719063_0037
We have studied the properties of the Variable Chaplygin gas (VCG) model in the context of its thermodynamical stability with the help of an equation of state. We have found that the VCG satisfies the two basic characteristics of thermodynamic stability. Using the best-fit value of n = −3.4, as previously found by Guo et al., we find that the fluid is thermodynamically stable throughout the evolution. The effective equation of state for the special case n = 0 reduces to the ΛCDM model. For n < 0 it favors phantom-like cosmology, in agreement with the current SNe Ia constraints on the VCG model. The deceleration parameter is also studied in the context of thermodynamics, and the analysis shows that the flip (deceleration to acceleration) occurs for n < 4. Finally, the thermal equation of state is discussed and is found to be an explicit function of temperature only. We also observe that the third law of thermodynamics is satisfied for this model. In conformity with our expectation, we find that for an isentropic system the temperature falls as the volume expands.
https://doi.org/10.1142/9789814719063_0038
Although the general theory of macroscopic quantum entanglement is still in its infancy, consideration of the matter in the framework of action-at-a-distance electrodynamics predicts, for random dissipative processes, the observability of advanced nonlocal correlations (time-reversal causality). These correlations were indeed revealed in our previous experiments with some large-scale heliogeophysical processes as the sources and lab detectors as the probes. Recently a new experiment has been performed on the basis of the Baikal Deep Water Neutrino Observatory. The thick water layer is an excellent shield against any local impacts on the detectors. The first annual series, 2012/2013, demonstrated that detector signals respond to the heliogeophysical (external) processes, with the causal connection of the signals directed downwards: from the Earth's surface to the Baikal floor. But this nonlocal connection proved to be in reverse time. In addition, an advanced nonlocal correlation of the detector signal with a regional source-process, the random component of hydrological activity in the upper layer, was revealed, and the possibility of its forecast based on nonlocal correlations was demonstrated. The strongest macroscopic nonlocal correlations, however, are observed at extremely low frequencies, that is, at periods of several months; therefore the above results should be verified in a longer experiment. We verify them with data from the second annual series, 2013/2014, of the Baikal experiment. All the results have been confirmed, although some quantitative parameters of the correlations and time-reversal causal links turned out differently due to the nonstationarity of the source-processes. A new result is the detection of the advanced response of the nonlocal-correlation detector to an earthquake. This opens up the prospect of earthquake forecasting on a new physical principle, although further confirmation in subsequent events is certainly needed.
The continuation of the Baikal experiment with an expanded program is a pressing task.
https://doi.org/10.1142/9789814719063_0039
Mass is one of the most important concepts in physics, and its real understanding represents the key to the formulation of any consistent physical theory. In past years, a very interesting model of inertial and gravitational mass as the result of the reaction interaction between the charged particles (electrons and quarks) contained in a given body and a suitable “fraction” of QED Zero Point Fields confined within an ideal resonant cavity, associated with the same body, was proposed by Haisch, Rueda and Puthoff. More recently, the author showed that this interpretation is consistent with a picture of mass (both inertial and gravitational) as the seat of ZPF standing waves whose presence reduces the quantum vacuum energy density inside the resonant cavity ideally associated with the body volume. Nevertheless, so far the ultimate physical origin of such a resonant cavity, as well as the mechanism able to “select” the fraction of ZPF electromagnetic modes interacting within it, has remained unrevealed. In this paper, building on the framework of QED coherence in condensed matter, we show that mass can be viewed as the result of a spontaneous superradiant phase transition of the quantum vacuum, giving rise to a more stable, energetically favored, macroscopic quantum state characterized by an ensemble of coherence domains “trapping” the coherent ZPF fluctuations inside a given volume that acts as a resonant cavity. Our model is then able to explain the “natural” emergence of the ideal resonant cavity postulated by Haisch, Rueda and Puthoff, and its defining parameters, as well as the physical mechanism selecting the fraction of ZPF interacting with the body particles. Finally, a generalization of the model to explain the origin of mass of elementary particles is proposed, also suggesting a new understanding of the Compton frequency and the de Broglie wavelength.
Our results indicate that both inertia and matter could truly originate from the coherent interaction between quantum matter-wave and radiation fields condensed from the quantum vacuum, and also give novel and interesting insights into fundamental physical questions such as the structure of elementary particles and the stability of matter.
https://doi.org/10.1142/9789814719063_0040
The possible unification of electromagnetism and gravity is one of the greatest challenges in Physics. According to the so-called “Zero-Point Field Inertia Hypothesis”, inertia and gravity could be interpreted, through a semi-classical approach, as the electromagnetic reaction force to the interaction between the charged elementary particles contained in a body and the quantum vacuum's fluctuating electromagnetic modes interacting with them. In a recent paper this author, sharing this idea as a starting point but moving within the framework of QFT, proposed a novel model in which inertia emerges from a superradiant phase transition of the quantum vacuum due to the coherent interaction between matter-wave and em field quanta. In both approaches a resonant-type mechanism is involved in describing the dynamic interaction between a body and the ZPF in which it is “immersed”. So it is expected that if a change in the related resonance frequency is induced by modifying the boundary conditions, for example through the introduction of a strong electromagnetic field of suitable frequency, the inertial and gravitational mass associated with that body will also be modified. In this paper we have shown, building on previous results and starting from the assumption that not only inertia but also the gravitational constant G could truly be a function of quantum vacuum energy density, that the application of an electromagnetic field is able to modify the ZPF energy density and, consequently, the value of G in the region of space containing a particle or body. This result suggests a novel interpretation of the coupling between the electromagnetic and gravitational interactions, ruled by the dynamical features of ZPF energy. Apart from its theoretical consequences, this model could also propose new paths towards so-called ZPF-induced gravitation, with very interesting applications to advanced technology.
https://doi.org/10.1142/9789814719063_0041
Gravitation is still the least understood of the fundamental forces of Nature. The ultimate physical origin of its ruling constant G could give key insights towards this understanding. According to Einstein's Theory of General Relativity, a massive body determines a gravitational potential that alters the speed of light, the clock rate and the particle size as a function of the distance from its own center. On the other hand, it has been shown that the presence of mass determines a modification of the Zero-Point Field (ZPF) energy density within its volume and in the space surrounding it. All these considerations strongly suggest that the constant G could also be expressed as a function of a quantum vacuum energy density that somehow depends on the distance from the mass whose presence modifies the ZPF energy structure. In this paper, starting from a constitutive medium-based picture of space, a model is formulated of the gravitational constant G as a function of Planck's time and of the quantum vacuum energy density, in turn depending on the radial distance from the center of the mass originating the gravitational field, supposed spherically symmetric. According to this model, in which gravity arises from unbalanced physical vacuum pressure, the gravitational “constant” G is not truly unchanging but varies slightly as a function of the distance from the mass source of the gravitational potential itself. An approximate analytical form of this dependence is discussed. The proposed model, apart from potentially having deep theoretical consequences for the commonly accepted picture of physical reality (from cosmology to matter stability), could also give the theoretical basis for unthinkable applications related, for example, to the field of gravity control and space propulsion.
https://doi.org/10.1142/9789814719063_0042
Sonoluminescence, or its more frequently studied version known as Single Bubble Sonoluminescence, consisting of the emission of light by a bubble collapsing in water under ultrasound, represents one of the most challenging and interesting phenomena in theoretical physics. In fact, despite its relatively easy reproducibility in a simple laboratory, its understanding within the commonly accepted picture of condensed matter has so far remained unsatisfactory. On the other hand, the possibility of controlling the physical process involved in sonoluminescence, which represents a sort of nuclear fusion on a small scale, could open unthinkable prospects of free energy production from water. Different explanations have been proposed over the past years relating, in various ways, the photoemission to electromagnetic Zero Point Field energy dynamics, by considering the bubble surface as a Casimir force boundary. More recently a model invoking Cherenkov radiation emission from superluminal photons generated in the quantum vacuum has been successfully proposed. In this paper it is shown that the same results can be explained more generally, and obtained quantitatively, within a QED coherent dynamics of the quantum vacuum, according to which the electromagnetic energy of the emitted photons would be related to the latent heat involved in the phase transition from water vapor to the liquid phase during the bubble collapse. The proposed approach could also suggest an explanation of a possible mechanism for the generation of the faster-than-light (FTL) photons required to start Cherenkov radiation, as well as possible applications to energy production from the quantum vacuum.
https://doi.org/10.1142/9789814719063_0043
In the theories of hidden variables in physics (the Bohr–Schrödinger theories) and their developments, boundaries seem more and more fuzzy at physical scales. Some other new theories give both time and space as much fuzziness. The classical theory (the Copenhagen school), and also Heisenberg and Louis de Broglie, give us the idea of dual wave and particle parts in the way we observe. The Pondicherry interpretation recently developed by Cramer et al. extends this duality to the time part. According to Cramer, there could be a little more to this duality: retarded or advanced waves of time, which have been confirmed and admitted as possible solutions of Maxwell's equations. We develop here a possible pattern that could match the sequence between space and both the retarded and advanced time waves in the “Cramer handshake”: in the locality of the present, when the observation is made, everything becomes local.
https://doi.org/10.1142/9789814719063_0044
Current theories about the Universe based on an FLRW model conclude that it is composed of ~4% normal matter, ~28% dark matter and ~68% Dark Energy, which is responsible for the well-established accelerated expansion: this model works extremely well. As the Universe expands, the density of normal and dark matter decreases while the proportion of Dark Energy increases. This model assumes that the amount of dark matter, whose nature at present is totally unknown, has remained constant. This is a natural assumption if dark matter is a particle of some kind – WIMP, sterile neutrino, lightest supersymmetric particle or axion, etc. – that must have emerged from the early high-temperature phase of the Big Bang. This paper proposes that dark matter is not a particle such as these but a vacuum effect, and that the proportion of dark matter in the Universe is actually increasing with time. The idea that led to this suggestion was that a quantum process (possibly the Higgs mechanism) might operate in the nilpotent vacuum that Rowlands postulates is a dual space to the real space where Standard Model fundamental fermions (and we) reside. This could produce a vacuum quantum state that has mass, which interacts gravitationally, and such states would be ‘dark matter’. It is proposed that the rate of production of dark matter by this process might depend on local circumstances, such as the density of dark matter and/or normal matter. This proposal makes the testable prediction that the ratio of baryonic to dark matter varies with redshift, and offers an explanation, within the framework of Rowlands' ideas, of the coincidence problem – why has cosmic acceleration started in the recent epoch, at redshift z ~ 0.55, when the Dark Energy density first became equal to the matter density? This process also offers a potential solution to the ‘missing baryon’ problem.
https://doi.org/10.1142/9789814719063_0045
We utilize the deuterium-hydrogen abundances and their role in setting limits on the mass and other conditions of cosmogenesis and cosmological evolution. We calculate the dependence of a set of physical variables such as density, temperature, mass-energy, entropy and other physical parameters through the evolution of the universe under the Schwarzschild condition, as a function of time from early to present epochs. Reconciliation with the 3°K background and the missing mass is made. We first examine the Schwarzschild condition; second, the geometrical constraints of a multidimensional Cartesian space on closed cosmologies; and third, we consider the cosmogenesis and evolution of the universe in a multidimensional Cartesian space obeying the Schwarzschild condition. Implications of this model for matter creation are drawn. We also examine experimental evidence for closed versus open cosmologies, such as x-ray detection of the “missing mass” density. The interstellar deuterium abundance, along with the value of the Hubble constant, sets a general criterion on the value of the curvature constant, k. Once the value of the Hubble constant H is determined, the deuterium abundance sets stringent restrictions on the value of the curvature constant k, and a detailed discussion is presented. The experimental evidence for the determination of H and the primary set of coupled equations to determine the D abundance are given. The value of k for an open, closed, or flat universe is discussed in terms of the D abundance, which affects the interpretation of the Schwarzschild, black hole universe. We determine cosmological solutions to Einstein's field equations obeying the Schwarzschild condition. With this model, we can form a reconciliation of the black hole, from galactic to cosmological scale. Continuous creation occurs at the dynamic black hole plasma field. We term this new model the multiple big bang or “little whimper” model.
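The Schwarzschild condition invoked above rests on the coincidence that the Schwarzschild radius of the observable universe's mass is of the order of the Hubble radius. A minimal numerical sketch (the mass value ~10^53 kg is an assumed order of magnitude for illustration, not a figure from the paper):

```python
G = 6.674e-11   # Newtonian gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8     # speed of light, m/s

def schwarzschild_radius(M):
    """R_s = 2GM/c^2 for a mass M in kilograms."""
    return 2 * G * M / c**2

M_universe = 1e53  # kg, assumed order-of-magnitude mass of the observable universe
R_s = schwarzschild_radius(M_universe)
# R_s comes out at ~1.5e26 m, the order of the Hubble radius --
# the coincidence underlying the "Schwarzschild condition" above.
assert 1e26 < R_s < 2e26
```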
https://doi.org/10.1142/9789814719063_0046
A review of the literature on the Lyman alpha forest gives direct evidence on the dynamics of the universe. In an expanding universe one would expect the average temperature of the universe to fall as it expands, but a review of the Doppler parameters of the hydrogen clouds in quasar spectra shows that, on the contrary, they are increasing in temperature (or at least becoming increasingly disturbed) as the universe ages. Additionally, the evidence is that hydrogen clouds are, on average, evenly spaced up to a redshift of one, if not beyond. These results beg the question: how can the hydrogen clouds have differing redshifts, and hence widely differing ‘velocities’, whilst, on average, remaining equally spaced? Especially since this range of redshifts includes the supernovae data used to show ‘acceleration’ and so-called ‘time dilation.’ Taking these results in isolation implies that the universe has been static for at least the last billion years or so, and therefore a new model of redshift is needed to explain redshifts in a static universe. The model proposed here is that, in a static universe, photons of light from distant galaxies are absorbed and re-emitted by electrons in the plasma of intergalactic space, and on each interaction the electron recoils. Energy is lost to the recoiling electron (the New Tired Light theory) and thus the re-emitted photon has less energy, a reduced frequency and therefore an increased wavelength: it has been redshifted. The Hubble relationship becomes: photons of light from a galaxy twice as far away make twice as many interactions with the electrons in the plasma of IG space, lose twice as much energy and undergo twice the redshift. A relationship between redshift and distance is found and, using published values of collision cross-sections and the number density of electrons in IG space, a value for the Hubble constant is derived which is in good agreement with measured values.
Assuming that the energy transferred to the recoiling electron is emitted as secondary radiation, the wavelength is calculated and found to be consistent with the wavelengths of the CMB. A further test is proposed whereby a high-powered laser could be fired through sparse cold plasma and the theory's predicted increase in emission of microwave radiation at a particular frequency determined.
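The linear distance-redshift relation described above (“twice as far, twice the interactions, twice the redshift”) can be sketched numerically. All parameter values below are illustrative assumptions (the Thomson cross-section and a hypothetical energy-loss fraction), not the paper's fitted numbers:

```python
import math

# Illustrative parameters -- assumptions for this sketch only.
n_e = 0.5            # electron number density of IG plasma, m^-3 (assumed)
sigma = 6.652e-29    # photon-electron cross-section, m^2 (Thomson value, assumed)
delta = 1.0e-4       # hypothetical fractional energy loss per interaction

def redshift(d):
    """Each of the N = n_e*sigma*d interactions removes a fraction delta of
    the photon's energy, so 1+z grows exponentially with path length."""
    n_collisions = n_e * sigma * d
    return math.expm1(delta * n_collisions)

d = 1.0e24  # metres (assumed distance)
z1, z2 = redshift(d), redshift(2 * d)
# In the small-z regime, doubling the path doubles the redshift:
assert abs(z2 / z1 - 2.0) < 1e-3
```

The effective “Hubble constant” of such a model is delta * n_e * sigma * c; whether that matches the measured value depends entirely on the fitted parameters, which this sketch does not reproduce.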
https://doi.org/10.1142/9789814719063_0047
If one is willing to consider the current cosmic microwave background temperature as a quantum gravitational effect of the evolving primordial cosmic black hole (a universe that constitutes dynamic space-time and exhibits quantum behavior), then the general theory of relativity and quantum mechanics can automatically be combined into a ‘scale independent’ true unified model of quantum gravity. By considering the ‘Planck mass’ as the initial mass of the baby Hubble volume, past and current physical and thermal parameters of the cosmic black hole can be understood. The current rate of cosmic black hole expansion is being stopped by the microscopic quantum mechanical lengths. In this new direction the authors identify 5 important quantum mechanical methods for understanding the current cosmic deceleration. To understand the ground reality of the current cosmic rate of expansion, the sensitivity and accuracy of current methods of estimating the magnitudes of the current CMBR temperature and current Hubble constant must be improved, and alternative methods must be developed. If it is true that a galaxy constitutes so many stars, each star constitutes so many hydrogen atoms, and light comes from the excited electrons of galactic hydrogen atoms, then considering redshift as an index of the ‘whole galaxy’ receding may not be reasonable. During cosmic evolution, at any time in the past, the photon energy emitted in the hydrogen atom was always inversely proportional to the CMBR temperature. Thus past light emitted from an older galaxy's excited hydrogen atoms will show a redshift with reference to current laboratory data. As cosmic time passes, in the future, the absolute rate of cosmic expansion can be understood by observing the rate of increase in the magnitude of the photon energy emitted from the laboratory hydrogen atom. Aged supernovae dimming may be due to the effect of the high cosmic background temperature.
New mathematical methods and techniques, computer simulations, and advanced engineering skills seem essential in this direction.
https://doi.org/10.1142/9789814719063_0048
This report summarizes ongoing research and development since our 2012 foundation paper, including the emergent effects of a deterministic mechanism for fermion interactions: (1) the coherence of black holes and particles using a quantum chaotic model; (2) wide-scale (anti)matter prevalence from exclusion and weak interaction during the fermion reconstitution process; and (3) red-shift due to variations of vacuum energy density. We provide a context for Standard Model fields, and show how gravitation can be accountably unified in the same mechanism, but not as a unified field.
https://doi.org/10.1142/9789814719063_0049
Starting from the simple premises of one size of fundamental building block, two types of energy and only three dimensions, it is shown that there can be no multiverses outside our universe, that some black holes are observable failed inflation events within our universe and that there can be only one underlying set of the laws of physics. These laws will be the same everywhere and fail nowhere. Composites formed from the building blocks during different inflation events can produce different sizes of fermions, nucleons and atoms, but a type of universe with symmetries similar to ours is the inevitable outcome of a successful inflation event. The building blocks provide the base for matter, anti-matter and dark matter in the same composite forms and show how the existence or otherwise of dark energy can be observed. Also explained are why only positive masses are observed, why some particle configurations and orbits are stable and what the terms ‘energy’ and ‘inertia’ really describe.
https://doi.org/10.1142/9789814719063_0050
In the light of the theoretical and experimental developments in the neutrino sector and their importance, we study its connection with new physics above the electroweak scale M_EW ~ 10^2 GeV. In particular, by considering neutrino oscillations with the possible effective mass, we investigate, according to the experimental data, the underlying GUT scale M_GUT ~ 10^15 GeV.
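One standard way to connect a light effective neutrino mass to the GUT scale is a seesaw-type estimate, m_ν ~ M_EW²/M_GUT. Whether this is the mechanism intended in the paper is an assumption of this sketch, but the arithmetic linking the two quoted scales is simple:

```python
M_EW = 1e2    # electroweak scale, GeV (as quoted above)
M_GUT = 1e15  # GUT scale, GeV (as quoted above)

# Seesaw-type estimate (assumed mechanism, for illustration only):
m_nu_GeV = M_EW**2 / M_GUT   # effective neutrino mass, GeV
m_nu_eV = m_nu_GeV * 1e9     # convert GeV -> eV

# ~0.01 eV, the order of magnitude suggested by oscillation data.
assert abs(m_nu_eV - 0.01) < 1e-12
```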
https://doi.org/10.1142/9789814719063_0051
The official sciences, especially all natural sciences, respect in their researches the principle of methodic naturalism, i.e., they consider all phenomena as entirely natural and therefore never adduce or cite supernatural entities and forces in their scientific explanations. The purpose of this paper is to show that Modern Science has its own self-existent, self-acting, and self-sufficient Natural All-in Being or Omni-Being, i.e., the entire Nature as a Whole, that justifies the scientific methodic naturalism. Since this Natural All-in Being is one and only, It should be considered as the scientifically justified Natural Absolute of Science and should be called, in my opinion, the Universal Cosmic Absolute of Modern Science. It will also be shown that the Universal Cosmic Absolute is ontologically enormously stratified and is, in its ultimate, i.e. most fundamental, stratum trans-reistic and trans-personal. This means that in its basic stratum It is neither a Thing nor a Person, although It contains in Itself all things and persons, along with all other sentient and conscious individuals as well. At the turn of the 20th century science began to look for a theory of everything, a final theory, a master theory. In my opinion the natural Universal Cosmic Absolute will constitute in such a theory the radical, all-penetrating Ultimate Basic Reality and will substitute, step by step, for the traditional supernatural personal Absolute.
https://doi.org/10.1142/9789814719063_0052
Quanta of action (Planck's, Stoney's, Kittel's, etc.) are related to the so-called units determined by universal constants: c, the speed of light in vacuum; G, the Newtonian gravitational constant; and the respective constant connected with the respective interaction. If we introduce Λ-units determined by c, G and Λ, we also obtain the Λ-mega quantum of action. It will be shown that this quantum of action can be disclosed in the Lagrangian used to express the stationary action in General Relativity as applied in cosmology, in which the cosmological constant Λ appears. It will be discussed whether this mega quantum is connected with the causally bounded zones in our universe, i.e. with the Hubble spheres. If we introduce Kittel's gravitational units determined by c, G and M_G, where M_G means the gravitational mass (of ordinary plus dark matter) embedded and causally bounded in a Hubble sphere, then we also obtain a Kittel mega quantum of action. The meaning of both mega quanta will be discussed.
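Dimensionally, the only combination of c, G and Λ carrying units of action is c³/(GΛ). Taking this as the Λ-mega quantum is an assumption of this sketch (as is the value Λ ≈ 1.1×10⁻⁵² m⁻²), but its order of magnitude is then straightforward to estimate:

```python
c = 2.998e8        # speed of light, m/s
G = 6.674e-11      # Newtonian gravitational constant, m^3 kg^-1 s^-2
Lam = 1.1e-52      # cosmological constant, m^-2 (assumed observational value)
hbar = 1.055e-34   # reduced Planck constant, J s

# c^3/(G*Lambda): (m^3/s^3) * (kg s^2/m^3) * m^2 = kg m^2/s = J s (action).
S_lambda = c**3 / (G * Lam)

assert 1e87 < S_lambda < 1e88    # a "mega quantum" of order 10^87 J s
assert S_lambda / hbar > 1e120   # vastly larger than Planck's quantum of action
```

The enormous ratio to ħ is what makes the term “mega quantum” apt: it is a quantum of action on the scale of a Hubble sphere rather than of a microscopic process.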
https://doi.org/10.1142/9789814719063_0053
I trace the historical and scientific origin of Continuum Theory, from its observationally enforced beginning in 1959, in never-to-be-repeated military circumstances, and follow this with a discussion of some of its more recent developments. The presence of this and of several other CT-related contributions to this symposium volume on Unified Field Mechanics can be justified by the view that CT, as currently developing, could, in a very real sense, be given the alternative name ‘Unified Aether Mechanics’. The substitution of ‘field’ by ‘aether’ reflects Newton's 1692 thesis that ‘fields’ cannot exist per se, a view that persisted for over 200 years: they must have an agent or medium within which they exist and are communicated between objects. Hence the term ‘aether mechanics’ would be appropriate. A principal aim in ‘unification’, moreover, has always been the unification of gravitation into the family of forces. Einstein's response was the meanderings of space-time. CT achieves its unification into the electromagnetic family by its implementation of the aether of Maxwell's equations, with insightful results, apparently regardless of scale. Particle-tied in nature, the existence of such an aether was effectively demonstrated experimentally by the Michelson–Morley finding of 1887.
https://doi.org/10.1142/9789814719063_0054
The formula for the periastron advance, and in particular the perihelion advance of Mercury, is neither original nor unique to GR. It was first derived, and shown to work for the Mercury example, by Paul Gerber in 1898 and in his follow-up paper of 1902 (republished 1917), on the basis of gravitational communication at velocity c. This physical view differs importantly from that espoused in General Relativity, so its unacknowledged incorporation by Einstein into his GR equations of 1915 constitutes an omission for GR, questioning its vaunted status as a physically consistent body of theory. However, Einstein himself said GR was ‘just a convenient stopping place.’ Continuum Theory (CT), on the other hand, finds that gravitation is one of the electromagnetic family of forces, with the expectation that its communication is at velocity c or a simple multiple thereof.
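The formula at issue, identical in form in Gerber's derivation and in GR, gives an advance per orbit of Δφ = 24π³a²/(c²T²(1−e²)). A quick numerical check for Mercury reproduces the well-known figure; the orbital elements used are standard values, not taken from the chapter.

```python
import math

# Gerber's 1898 perihelion-advance formula, identical in form to the GR
# result: dphi = 24 pi^3 a^2 / (c^2 T^2 (1 - e^2)) radians per orbit.
c = 2.998e8    # speed of light, m/s
a = 5.791e10   # Mercury semi-major axis, m
T = 7.6005e6   # Mercury orbital period, s (~87.97 days)
e = 0.2056     # Mercury orbital eccentricity

dphi = 24 * math.pi**3 * a**2 / (c**2 * T**2 * (1 - e**2))  # rad per orbit

# Accumulate over a century and convert radians to arcseconds.
orbits_per_century = 100 * 365.25 * 86400 / T
arcsec = dphi * orbits_per_century * (180 / math.pi) * 3600
print(f"perihelion advance: {arcsec:.1f} arcsec/century")  # close to the observed ~43"
```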
https://doi.org/10.1142/9789814719063_0055
My development of Continuum Theory rests importantly on two mathematical treatments and calculations which I wrote in 1994 and which were published in 1998 as Appendices A and B to my PIRT V paper presented in London in 1996. In view of their continuing scientific relevance, this contribution to the V9 conference proceedings is a republication of those Appendices, subject to minimal re-editing. Appendix B, presented first, tackles our 1959 finding that the daylight sky brightness distribution at high altitude shows the presence of an additional contribution whose intensity and distribution, on careful analysis, I identified as having come from a deflection scattering mechanism due to transmission by an (atmospheric) ‘particle-tied aether’. Appendix A shows that redshift is one of the consequences of such transmission. The parameters involved are then used to analyse the 1968 radio ground-wave caesium clock redshift observations of Sadeh et al. and to extrapolate them to the intergalactic transmission paths pertinent to the cosmic redshift as a transmission effect, not a velocity. It finds this to be a reasonable evaluation within observational uncertainties, notably those of density and degree of ionization. In that case, there being no Big Bang, the temperature is precisely known from the CMBR, identified as synchrotron-type radiation from the randomly moving aether along the path, slightly elevated where the path has traversed a heat-generating cluster.
https://doi.org/10.1142/9789814719063_0056
Electron-positron pairs and proton-antiproton pairs often appear in high energy experiments, but the antimatter components, positrons and antiprotons, are extremely scarce in the wider environment. Why is this so? Positrons are electrically positive objects, but antiprotons, built up of three quarks, are electrically negative. So what part, if any, does polarity play in the evidently low durability of antimatter? Vorticity is displayed at every scale in the Universe, from spiral galaxies downward, yet the linear relative motions that would result from a big bang do not generate vorticity unless viscosity is present, so how does, or did, it arise? Answers to both questions are adduced within the basic framework of Continuum Theory (CT). This opens up an explanation of why atomic matter is so uniform throughout the Universe.
https://doi.org/10.1142/9789814719063_0057
Although CT is based at the sub-particle aether scale, I have shown that most of its implications require large-distance accumulation to become observable. It is suggested that the phenomenon which constrains the temperature limit to which electrical superconductivity is maintained may offer us the chance of testing CT at the laboratory scale, perhaps with a physically startling result.
https://doi.org/10.1142/9789814719063_0058
A spherically rotating model of the electron shows great promise for explaining many phenomena but has received little attention. It can be a purely classical theory. Others have shown that it is consistent with the SU(2) covering group, provides an explanation for the Compton and de Broglie waves, and obeys the Lorentz interpretation of relativity. Further progress can be made by carefully defining the aether that is rotating and by describing the behavior of ensembles of spherically rotating particles. In particular, if reasonable assumptions are made about the angular momentum and magnetic dipoles of astronomical bodies, spherical rotation provides a mechanism for the General Allais Effect. The purpose of this paper is to convince seekers of unification to look at a different approach.
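The consistency with the SU(2) covering group mentioned above rests on the standard double-cover property: a 2π spatial rotation corresponds to −1 in SU(2), and only a 4π rotation returns the identity, which is what makes "spherical rotation" spinor-like. The snippet below is a generic illustration of that property, not the author's own construction.

```python
import cmath
import math

def su2_z(theta):
    """SU(2) element for rotation by angle theta about the z-axis:
    U = diag(exp(-i theta/2), exp(+i theta/2))."""
    return [[cmath.exp(-1j * theta / 2), 0],
            [0, cmath.exp(1j * theta / 2)]]

# A full 2*pi rotation flips the spinor's sign (U = -I);
# a 4*pi rotation restores the identity (U = +I).
U_2pi = su2_z(2 * math.pi)
U_4pi = su2_z(4 * math.pi)
print(round(U_2pi[0][0].real), round(U_4pi[0][0].real))  # -1 1
```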
https://doi.org/10.1142/9789814719063_0059
Algebraic and geometric representations of the genetic code are used to show their functions in coding for amino acids. The algebra is a 64-part vector quaternion combination, and the geometry is based on the structure of the regular icosidodecahedron. An almost perfect pattern suggests that this is a biologically significant way of representing the genetic code.
https://doi.org/10.1142/9789814719063_0060
The relationship that exists between interactions, complexity and consciousness (ICC) can be applied to degrees of consciousness at smaller scales, e.g. atoms, and to degrees of consciousness at larger scales, e.g. the human brain. In this work I propose that the interactions obey the triadic dimensional vortical paradigm (TDVP), in which time, space, and consciousness are tethered from the origin point. As these interactions increase they ascend in complexity at different scales, and as the interaction-complexity value increases it gives rise to higher degrees of consciousness. These higher degrees of consciousness can be fed back through the original ICC-TDVP process to give rise to still higher degrees of consciousness, and this can be repeated an unlimited number of times in space and time. In this proposal the evolution of degrees of consciousness can be seen as a detailed self-replicating pattern which may be nearly the same at different emerging scales. Eventually, reset points or universal rewrite events occur, seen possibly at the level of DNA consciousness, cellular consciousness, and human consciousness. This theoretical work focuses on the development of a paradigm that interprets consciousness as both a fundamental and an emergent property.
https://doi.org/10.1142/9789814719063_0061
A complete theoretical model of how consciousness arises in neural nets can be developed on a mixed quantum/classical basis. Mind and consciousness are multi-leveled scalar and vector electromagnetic complexity patterns, respectively, which emerge within all living organisms through the process of evolution. Like life, the mind and consciousness patterns extend throughout living organisms (bodies), but the neural nets and higher level groupings that distinguish higher levels of consciousness exist only in the brain, so mind and consciousness have traditionally been associated with the brain alone. A close study of neurons and neural nets in the brain shows that the microtubules within axons are classical bio-magnetic inductors that emit and absorb electromagnetic pulses from each other. These pulses establish interference patterns that influence the quantized vector potential patterns of interstitial water molecules within the neurons, and create the coherence within neurons and neural nets that scientists normally associate with more complex memories, thought processes and streams of thought. Memory storage and recall are guided by the microtubules, and the actual memory patterns are stored as magnetic vector potential complexity patterns in the points of space at the quantum level occupied by the water molecules. This model also accounts for the plasticity of the brain and implies that mind and consciousness, like life itself, are the result of evolutionary processes. However, once consciousness has evolved by normal bottom-up genetic processes, it can evolve independently of an organism's birth genetics and thus force a new type of top-down evolution on living organisms and species as a whole, which can be explained by expanding the laws of thermodynamics to include orderly systems.
https://doi.org/10.1142/9789814719063_0062
A new mathematical formula referred to as the Planckian distribution equation (PDE) has been found to fit long-tailed histograms generated in various fields of study, ranging from atomic physics to single-molecule enzymology, cell biology, brain neurobiology, glottometrics, econophysics, and cosmology. PDE can be derived from a Gaussian-like equation (GLE) by non-linearly transforming its variable, x, while keeping the y coordinate constant. Assuming that GLE represents a random distribution (due to its symmetry), it is possible to define the binary logarithm of the ratio between the areas under the curves of PDE and GLE as a measure of the non-randomness (or order) underlying the biophysicochemical processes generating long-tailed histograms that fit PDE. This new function has been named the Planckian information, IP, which (i) may be a new measure of order that can be applied widely to both the natural and human sciences and (ii) can serve as the opposite of the Boltzmann-Gibbs entropy, S, which is a measure of disorder. The possible rationales for the universality of PDE may include (i) the universality of the wave-particle duality embedded in PDE, (ii) the selection of subsets of random processes (thereby breaking the symmetry of GLE) as the basic mechanism for generating order, organization, and function, and (iii) the quantity-quality complementarity as the connection between PDE and Peircean semiotics.
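The Planckian-information construction described above can be sketched numerically: compute the area under a Planck-like long-tailed curve and under a symmetric Gaussian-like reference curve, and take the binary log of their ratio. The functional forms and all parameter values below are generic stand-ins for illustration, not fitted values from any histogram discussed in the chapter.

```python
import math

# Illustrative sketch of the Planckian information I_P = log2(AUC_PDE / AUC_GLE).
# Curve shapes and parameters are assumptions for demonstration only.

def pde(x, a=1.0, A=1.0, B=0.5, b=3.0):
    """A Planck-like long-tailed curve in the role of the PDE."""
    u = A * x + B
    return a / u**5 / (math.exp(b / u) - 1)

def gle(x, height=0.1, mu=1.0, sigma=0.5):
    """A Gaussian-like equation (symmetric reference curve)."""
    return height * math.exp(-((x - mu) ** 2) / (2 * sigma**2))

# Areas under the curves via a simple Riemann sum over x in (0, 20].
dx = 0.01
xs = [i * dx for i in range(1, 2001)]
auc_pde = sum(pde(x) for x in xs) * dx
auc_gle = sum(gle(x) for x in xs) * dx

# Planckian information: binary logarithm of the area ratio.
I_P = math.log2(auc_pde / auc_gle)
print(f"AUC(PDE) = {auc_pde:.4f}, AUC(GLE) = {auc_gle:.4f}, I_P = {I_P:.3f}")
```

With fitted rather than illustrative parameters, I_P is intended as a per-histogram measure, so its value depends entirely on the data being fitted.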
https://doi.org/10.1142/9789814719063_0063
The experience of ‘nowness’, also known as the present moment, is examined from the perspectives of neurobiology and neurophenomenology. A comparison of these perspectives suggests that once our sensory experiences and motor activities are encoded by the brain, our intended actions can be modified in the next nowness moment by a process, probably heuristic, that is based on the current contents of conscious experience. Modifications can be made to the extent permitted by time available, as specified by the duration of the nowness moment. In this way our actions are available for change, at least to some degree, at all times.
https://doi.org/10.1142/9789814719063_bmatter
The following sections are included: