We could be on the threshold of a scientific revolution. Quantum mechanics is based on unique, finite, and discrete events. General relativity assumes a continuous, curved space-time. Reconciling the two remains the most fundamental unsolved scientific problem left over from the last century. The papers of H Pierre Noyes collected in this volume reflect one attempt to achieve that unification by replacing the continuum with the bit-string events of computer science. Three principles are used: physics can determine whether two quantities are the same or different; measurement can tell something from nothing; this structure (modeled by binary addition and multiplication) can leave a historical record consisting of a growing universe of bit-strings. This book is specifically addressed to those interested in the foundations of particle physics, relativity, quantum mechanics, physical cosmology and the philosophy of science.
https://doi.org/10.1142/9789812810090_fmatter
https://doi.org/10.1142/9789812810090_0001
Major scientific revolutions are rarely, if ever, started deliberately. They can be “in the air” for a long time before the first recognizable paradigm appears. I have believed for some time that this is the kind of intellectual environment in which we now find ourselves. For a while I even thought that some of the work collected here might, in fact, provide a few of the first steps toward a new natural philosophy alternative to the prevailing world view. Since this has not come to pass, I offer these papers to the community of scientists in the hope that some younger physicist may find clues in them that help him or her to succeed where I have proved wanting…
https://doi.org/10.1142/9789812810090_0002
https://doi.org/10.1142/9789812810090_0003
The combinatorial hierarchy model for basic particle processes is based on elementary entities; any representation they may have is discrete and two-valued. We call them Schnurs to suggest their most fundamental aspect as concatenating strings. Consider a definite small number of them. Consider an elementary creation act as a result of which two different Schnurs generate a new Schnur which is again different. We speak of this process as a “discrimination”. By this process and by this process alone can the complexity of the universe be explored. By concatenations of this process we create more complex entities which are themselves Schnurs at a new level of complexity. Everything plays a dual role in which something comes in from the outside to interact, and also serves as a synopsis or concatenation of such a process. We thus incorporate the observation metaphysic at the start, rejecting Bohr's reduction to the haptic language of common sense and classical physics. Since discriminations occur sequentially, our model is consistent with a “fixed past-uncertain future” philosophy of physics. We demonstrate that this model generates four hierarchical levels of rapidly increasing complexity. Concrete interpretation of the four levels of the hierarchy (with cardinals 3, 7, 127, 2^127 − 1 ≈ 10^38) associates the three levels which map up and down with the three absolute conservation laws (charge, baryon number, lepton number) and the spin dichotomy. The first level represents +, −, and ± unit charge. The second has the quantum numbers of a baryon–antibaryon pair and associated charged meson (e.g., the nucleon–antinucleon combinations together with π+, π0, π−). The third level associates this pair, now including four spin states as well as four charge states, with a neutral lepton–antilepton pair (e ē or ν ν̄), each pair in four spin states (total, 64 states): three charged spinless, three charged spin-1, and a neutral spin-1 meson (15 states), and a neutral vector boson associated with the leptons; this gives 3 + 15 + 3 × 15 = 63 possible boson states, so a total correct count of 63 + 64 = 127 states. Something like SU(2) × SU(3) and other indications of quark quantum numbers can occur as substructures at the fourth (unstable) level. Breaking into the (Bose) hierarchy by structures with the quantum numbers of a fermion, if this is an electron, allows us to understand Parker-Rhodes' calculation of m_p/m_e = 1836.1515 in terms of our interpretation of the hierarchy. A slight extension gives us the usual static approximation to the binding energy of the hydrogen atom, α²m_ec². We also show that the cosmological implications of the theory are in accord with current experience. We conclude that we have made a promising beginning in the physical interpretation of a theory which could eventually encompass all branches of physics.
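To make the counting in this abstract easy to reproduce, here is a minimal Python sketch (an illustration, not taken from the original papers) that models a discrimination as bitwise XOR on bit-strings and generates the level cardinals 3, 7, 127, 2^127 − 1 quoted above, along with the 63 + 64 = 127 state count at the third level.

    # Minimal sketch (not from the original papers): "discrimination" modeled as
    # bitwise XOR, and the hierarchy cardinals generated by the recursion
    # M(n+1) = 2**M(n) - 1 starting from M(1) = 3.

    def discriminate(a: int, b: int) -> int:
        """Discriminate two bit-strings (represented as Python ints): bitwise XOR."""
        return a ^ b

    def hierarchy_cardinals(levels: int = 4) -> list[int]:
        cards = [3]                                  # level 1: +, -, and +/- unit charge
        for _ in range(levels - 1):
            cards.append(2 ** cards[-1] - 1)
        return cards

    cards = hierarchy_cardinals()
    assert cards == [3, 7, 127, 2**127 - 1]          # level cardinals quoted above
    cum = [sum(cards[:k + 1]) for k in range(4)]
    assert cum[:3] == [3, 10, 137] and cum[-1] == 2**127 + 136   # cumulative cardinals

    # two distinct strings discriminate to a third string, distinct from both
    print(bin(discriminate(0b0110, 0b0101)))         # 0b11

    # the level-3 count quoted above: 3 + 15 + 3*15 = 63 boson states plus the
    # 64 fermion-pair spin/charge states gives 127
    assert 3 + 15 + 3 * 15 + 64 == 127
    print("level cardinals:", cards[:3], "and 2**127 - 1 = %.1e" % cards[3])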
https://doi.org/10.1142/9789812810090_0004
We construct the particulate states of quantum physics using a recursive computer program that incorporates non-determinism by means of locally arbitrary choices. Quantum numbers and coupling constants arise from the construction via the unique 4-level combinatorial hierarchy. The construction defines indivisible quantum events with the requisite supraluminal correlations, yet does not allow supraluminal communication. Measurement criteria incorporate c, ħ and m_p or (not “and”) G, connected to laboratory events via finite particle number scattering theory and the counter paradigm. The resulting theory is discrete throughout, contains no infinities, and, as far as we have developed it, is in agreement with quantum mechanical and cosmological fact.
https://doi.org/10.1142/9789812810090_0005
Starting from the principles of finiteness, discreteness, finite computability and absolute nonuniqueness, we develop the ordering operator calculus, a strictly constructive mathematical system having the empirical properties required by quantum mechanical and special relativistic phenomena. We show how to construct discrete distance functions, and both rectangular and spherical coordinate systems (with a discrete version of “π”). The richest discrete space constructible without a preferred axis and preserving translational and rotational invariance is shown to be a discrete 3–space with the usual symmetries. We introduce a local ordering parameter with local (proper) time-like properties and universal ordering parameters with global (cosmological) time-like properties. Constructed “attribute velocities” connect ensembles with attributes that are invariant as the appropriate time-like parameter increases. For each such attribute, we show how to construct attribute velocities which must satisfy the “relativistic Doppler shift” and the “relativistic velocity composition law,” as well as the Lorentz transformations. By construction, these velocities have finite maximum and minimum values. In the space of all attributes, the minimum of these maximum velocities will predominate in all multiple attribute computations, and hence can be identified as a fundamental limiting velocity. General commutation relations are constructed which under the physical interpretation are shown to reduce to the usual quantum mechanical commutation relations.
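The claim that constructed attribute velocities must satisfy the relativistic velocity composition law can be illustrated with ordinary rational arithmetic. In the sketch below, the integer counts n, m and the definition v = (n − m)/(n + m) are illustrative assumptions in the spirit of the counter constructions used elsewhere in this volume, not the paper's own construction; the point is only that such ratios compose exactly according to the relativistic law, with the counts (equivalently, squared Doppler factors n/m) multiplying.

    # Illustrative only: rational "attribute velocities" built from two positive
    # integer counts n, m as v = (n - m)/(n + m), a choice assumed here for
    # illustration.  Composing the counts multiplicatively reproduces the
    # relativistic velocity composition law exactly (units with c = 1).
    from fractions import Fraction
    from itertools import product

    def velocity(n: int, m: int) -> Fraction:
        return Fraction(n - m, n + m)          # always strictly between -1 and +1

    def compose(v1: Fraction, v2: Fraction) -> Fraction:
        return (v1 + v2) / (1 + v1 * v2)       # relativistic composition law

    for (n1, m1), (n2, m2) in product([(5, 3), (7, 2), (9, 9)], repeat=2):
        v12 = compose(velocity(n1, m1), velocity(n2, m2))
        assert v12 == velocity(n1 * n2, m1 * m2)   # Doppler factors multiply
    print("rational count velocities compose exactly under the relativistic law")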
https://doi.org/10.1142/9789812810090_0006
We base our theory of physics and cosmology on the five principles of finiteness, discreteness, finite computability, absolute nonuniqueness, and strict construction. Our modeling methodology starts from the current practice of physics, constructs a self-consistent representation based on the ordering operator calculus and provides rules of correspondence that allow us to test the theory by experiment. We use PROGRAM UNIVERSE to construct a growing collection of bit strings whose initial portions (labels) provide the quantum numbers that are conserved in the events defined by the construction. The labels are followed by content strings, which are used to construct event-based finite and discrete coordinates. On general grounds such a theory has a limiting velocity, and positions and velocities do not commute. We therefore reconcile quantum mechanics with relativity at an appropriately fundamental stage in the construction. We show that 1) events in different coordinate systems are connected by the appropriate finite and discrete version of the Lorentz transformation, 2) three-momentum is conserved in events, and 3) this conservation law is the same as the requirement that different paths can “interfere” only when they differ by an integral number of de Broglie wavelengths.
The labels are organized into the four levels of the combinatorial hierarchy characterized by the cumulative cardinals 3, 10, 137, 2^127 + 136 ≃ 1.7 × 10^38. We justify the identification of the last two cardinals as a first approximation to ħc/e² and ħc/Gm_p², respectively. We show that the quantum numbers associated with the first three levels can be rigorously identified with the quantum numbers of the first generation of the standard model of quarks and leptons, with color confinement and a first approximation to weak-electromagnetic unification. Our cosmology provides an event horizon, a zero-velocity frame for the background radiation, a fireball time of about 3.5 × 10^6 years, about the right amount of visible matter, and 12.7 times as much dark matter. A preliminary calculation of the fine structure spectrum of hydrogen gives the Sommerfeld formula and a correction to our first approximation for the fine structure constant, which leads to 1/α = 137.035 967 4…. We can now justify the earlier results m_p/m_e = 1836.151 497… and m_π/m_e ≃ 274. Our estimate of the weak angle is sin²θ_Weak = 1/4, and we obtain a corresponding estimate of the Fermi constant. Our finite particle number relativistic scattering theory should allow us to systematically extend these results. Ceteris paribus, caveat lector.
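PROGRAM UNIVERSE is described here only through its inputs and outputs. The Python sketch below is a simplified reconstruction of a PICK/TICK loop of this kind (the seed strings, the stopping rule, and the treatment of repeated strings are assumptions made for the illustration): PICK two strings and discriminate them; if the result is null, TICK, lengthening every string by one arbitrary bit; otherwise adjoin the novel string to the growing universe.

    # Simplified PROGRAM UNIVERSE-style loop; seeding, stopping, and the handling
    # of repeated strings are assumptions for this sketch, not the published code.
    import random

    def tick(universe: list[int], length: int) -> int:
        """TICK: append one arbitrary (here random) bit to every string."""
        for i in range(len(universe)):
            universe[i] = (universe[i] << 1) | random.getrandbits(1)
        return length + 1

    def program_universe(steps: int) -> tuple[list[int], int]:
        length, universe = 2, [0b01, 0b10]           # two distinct seed strings (assumed)
        for _ in range(steps):
            a = random.choice(universe)              # PICK, with replacement
            b = random.choice(universe)
            d = a ^ b                                # discriminate
            if d == 0:
                length = tick(universe, length)      # null result: all strings grow
            elif d not in universe:
                universe.append(d)                   # adjoin the novel string
        return universe, length

    universe, n_bits = program_universe(steps=500)
    print(f"{len(universe)} strings of {n_bits} bits after 500 steps")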
https://doi.org/10.1142/9789812810090_0007
Using our discrete relativistic combinatorial bit-string theory of physics in the context of the hydrogen spectrum, we calculate our first two approximations for the fine-structure constant as α^(1) = 1/137 and α^(2) = [1 − 1/(30 × 127)]/137 = 1/137.035 967 4…; we can then derive the Sommerfeld formula.
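As a purely arithmetical check, evaluating the stated formula in Python reproduces the quoted digits:

    # Arithmetic check of the two approximations quoted in this abstract.
    alpha_1 = 1 / 137
    alpha_2 = (1 - 1 / (30 * 127)) / 137
    print(1 / alpha_1)            # 137.0
    print(f"{1 / alpha_2:.7f}")   # 137.0359674, matching the quoted value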
https://doi.org/10.1142/9789812810090_0008
Zurek and Thorne [1] have shown that the number of bits of information lost in forming a rotating, charged black hole is equal to the area of the event horizon in Planck areas, i.e., the Bekenstein number [2]. Wheeler [3] has suggested that this could be a significant clue in the search through the foundations of physics for links between information theory and quantum mechanics. In this comment we show that if one accepts the conservation of baryon number, as attested by the experimentally unchallenged stability of the proton, one can argue that the proton is a stable, charged, rotating black hole with baryon number +1, charge +e, angular momentum ½ħ, and Bekenstein number N = ħc/Gm_p² ≃ 1.7 × 10^38…
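An order-of-magnitude check of the quoted number, using rounded CODATA-style constants (the numerical inputs are not part of the original comment):

    # Evaluate the dimensionless combination hbar*c / (G * m_p**2) quoted above.
    hbar = 1.054571817e-34   # J s
    c    = 2.99792458e8      # m / s
    G    = 6.67430e-11       # m^3 / (kg s^2)
    m_p  = 1.67262192e-27    # kg

    N = hbar * c / (G * m_p**2)
    print(f"N = {N:.2e}")    # ~1.69e+38, i.e. about 1.7 x 10^38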
https://doi.org/10.1142/9789812810090_0009
The masses, coupling constants and cosmological parameters obtained using our discrete and combinatorial physics based on discrimination between bit-strings indicate that we can achieve the unification of quantum mechanics with relativity which had become the goal of twentieth century physics. To broaden our case we show that limitations on measurement of the position and velocity of an individual massive particle observed in a colliding beam scattering experiment imply real, rational commutation relations between position and velocity. Prior to this limit being pushed down to quantum effects, the lower bound is set by the available technology, but is otherwise scale invariant. Replacing force by force per unit mass and force per unit charge allows us to take over the Feynman-Dyson proof of the Maxwell Equations and extend it to weak gravity. The crossing symmetry of the individual scattering processes when one or more particles are replaced by anti-particles predicts both Coulomb attraction (for charged particles) and a Newtonian repulsion between any particle and its anti-particle. Previous quantum results remain intact, and predict the expected relativistic fine structure and spin dependencies. Experimental confirmation of this anti-gravity prediction would inaugurate the physics of the twenty-first century.
https://doi.org/10.1142/9789812810090_0010
Consider a proton moving in empty space past a positively charged earth whose charge is such that the Coulomb repulsion on the proton just balances the Newtonian attraction. In the absence of further information, we expect it to move past the earth without being deflected. The CPT theorem asserts that an anti-proton must move past a negatively charged anti-earth in precisely the same way, so gives us no new information. Conventional theories predict that an anti-proton moving past a positively charged earth at distance r would experience an acceleration toward the earth of twice the amount that a neutral object of mass m_p would experience. However, if we are correct in asserting that “crossing symmetry” requires all “forces” (i.e. accelerations per unit inertial mass) on a particle to reverse when that particle is replaced by its antiparticle, then the fact that we can use electromagnetic forces to balance a particle in a gravitational field, combined with crossing symmetry, predicts that the electromagnetic fields would have to be reversed in order to balance an anti-particle in the same configuration…
https://doi.org/10.1142/9789812810090_0011
It is claimed here that by 1936 Bridgman had developed severe criticisms of orthodox quantum mechanics from the point of view of his operational philosophy. We try to meet these criticisms by a radical analysis of the measurement of the finite and discrete length and time intervals between particulate events. We show that a scale invariant counter paradigm based on two arbitrary finite and discrete measurement intervals Δℓ, Δt offers an alternative starting point for constructing relativistic particle mechanics. Using the scale invariant definitions…
https://doi.org/10.1142/9789812810090_0012
Freeman Dyson has recently focused attention on a remarkable derivation of electromagnetism from apparently quantum mechanical assumptions about the motion of a single particle. We present a new version of the Feynman–Dyson derivation in a discrete context. In the course of our derivation, we have uncovered a useful and elegant reformulation of the calculus of finite differences.
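The reformulation itself is not quoted in this abstract; for orientation only, the sketch below recalls the ordinary forward-difference operator and its shifted Leibniz rule, i.e. the standard calculus of finite differences that the paper recasts.

    # Standard calculus of finite differences, shown only to fix notation; the
    # paper's own reformulation is not reproduced here.
    def delta(f, h=1):
        """Forward difference operator: (delta f)(t) = f(t + h) - f(t)."""
        return lambda t: f(t + h) - f(t)

    f = lambda t: t * t
    g = lambda t: 3 * t + 1

    # shifted Leibniz rule: delta(f*g)(t) = delta(f)(t) * g(t+h) + f(t) * delta(g)(t)
    for t in range(5):
        lhs = delta(lambda s: f(s) * g(s))(t)
        rhs = delta(f)(t) * g(t + 1) + f(t) * delta(g)(t)
        assert lhs == rhs
    print("forward differences obey the shifted Leibniz rule")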
https://doi.org/10.1142/9789812810090_0013
We rewrite the 1 + 1 Dirac equation in light cone coordinates in two significant forms, and solve them exactly using the classical calculus of finite differences. The complex form yields “Feynman's checkerboard”: a weighted sum over lattice paths. The rational, real form can also be interpreted in terms of bit-strings.
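A brute-force toy version of the checkerboard sum (the lattice size, mass and step below are arbitrary choices, and the paper's light-cone and bit-string forms are not reproduced) weights each zig-zag lattice path by a factor of i·m·ε per corner and sums over all paths between fixed endpoints:

    # Toy "Feynman checkerboard" path sum: the particle moves one site left or
    # right per time step; each path is weighted by (i*m*eps)**(number of corners).
    from itertools import product

    def checkerboard_amplitude(steps: int, displacement: int, m: float, eps: float) -> complex:
        total = 0j
        for moves in product((+1, -1), repeat=steps):        # all zig-zag paths
            if sum(moves) != displacement:
                continue                                     # wrong endpoint
            corners = sum(moves[i] != moves[i + 1] for i in range(steps - 1))
            total += (1j * m * eps) ** corners
        return total

    print(checkerboard_amplitude(steps=8, displacement=2, m=1.0, eps=0.1))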
https://doi.org/10.1142/9789812810090_0014
We note that if hadrons are gravitationally stabilized “black holes”, as discrete physics suggests, it is possible that PARTONS, and in particular quarks, could be modeled as tachyons, i.e. particles having v² > c², without conflict with the observational fact that neither quarks nor tachyons have appeared as “free particles”. Some consequences of this model are explored.
https://doi.org/10.1142/9789812810090_0015
This paper starts with a personal memoir of how some significant ideas arose and events took place during the period from 1972, when I first encountered Ted Bastin, to 1979, when I proposed the foundation of ANPA. I then discuss program universe, the fine structure paper and its rejection, the quantitative results up to ANPA 17, and take a new look at the handy-dandy formula. Following this historical material is a first pass at establishing new foundations for bit-string physics. An abstract model for a laboratory notebook and an historical record are developed, culminating in the bit-string representation. I set up a tic-toc laboratory with two synchronized clocks and show how this can be used to analyze arbitrary incoming data. This allows me to discuss (briefly) finite and discrete Lorentz transformations, commutation relations, and scattering theory. Earlier work on conservation laws in 3- and 4-events and the free space Dirac and Maxwell equations is cited. The paper concludes with a discussion of the quantum gravity problem from our point of view and speculations about how a bit-string theory of strong, electromagnetic, weak and gravitational unification could take shape.
https://doi.org/10.1142/9789812810090_0016
We shall argue in this paper that a central piece of modern physics does not really belong to physics at all but to elementary probability theory. Given a joint probability distribution J on a set of random variables containing x and y, define a link between x and y to be the condition x = y on J. Define the state D of a link x = y as the joint probability distribution matrix on x and y without the link. The two core laws of quantum mechanics are the Born probability rule and the unitary dynamical law, whose best known form is Schroedinger's equation. Von Neumann formulated these two laws in the language of Hilbert space as prob(P) = trace(PD) and D'T = TD respectively, where P is a projection, D and D' are (von Neumann) density matrices, and T is a unitary transformation. We'll see that if we regard link states as density matrices, the algebraic forms of these two core laws are completely general theorems about links. When we extend probability theory by allowing cases to count negatively, we find that the Hilbert space framework of quantum mechanics proper emerges from the assumption that all D's are symmetrical in rows and columns. On the other hand, Markovian systems emerge when we assume that one of every linked variable pair has a uniform probability distribution. By representing quantum and Markovian structure in this way, we see clearly both how they differ and also how they can coexist in natural harmony with each other, as they must in quantum measurement, which we will examine. Looking beyond quantum mechanics, we see how both structures have their special places in a much larger continuum of formal systems that we have yet to look for in nature.
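A toy numerical check of the classical core of this claim, under one reading of the definitions above (the 4 × 4 example, the normalization of D on the link event, and the diagonal form of the projection P are assumptions made for the illustration):

    # D is taken to be the joint probability matrix of x and y before the link,
    # renormalized on the link event x = y; a classical "projection" P is a
    # diagonal 0/1 matrix selecting a set of values of x.
    import numpy as np

    rng = np.random.default_rng(0)
    J = rng.random((4, 4))
    J /= J.sum()                       # joint distribution of x, y (no link yet)

    D = J / np.trace(J)                # link state, normalized so trace(D) = 1
    P = np.diag([1, 0, 1, 0])          # projection onto x in {0, 2}

    lhs = np.trace(P @ D)                        # trace(PD) form of the Born rule
    diag = np.diag(J)
    rhs = diag[[0, 2]].sum() / diag.sum()        # elementary conditional probability
    assert np.isclose(lhs, rhs)
    print(f"prob(P | x = y) = trace(P D) = {lhs:.4f}")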
https://doi.org/10.1142/9789812810090_0017
Using a simple combinatorial algorithm for generating finite and discrete events as our numerical cosmology, we predict that the baryon/photon ratio at the time of nucleogenesis is η = 1/256^4, that Ω_DM/Ω_B = 12.7, and (for a cosmological constant of Ω_Λ = 0.6 ± 0.1 predicted on general grounds by E. D. Jones) that 0.325 > Ω_M > 0.183. The limits are set not by our theory but by the empirical bounds on the renormalized Hubble constant of 0.6 < h_0 < 0.8. If we impose the additional empirical bound of t_0 < 14 Gyr, the predicted upper bound on Ω_M falls to 0.26. The predictions of Ω_M and Ω_Λ were in excellent agreement with Glanz' analysis in 1998, and are still in excellent agreement with Lineweaver's recent analysis despite the reduction of observational uncertainty by close to an order of magnitude.
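For convenience, the closed-form numbers quoted here evaluate as follows (plain arithmetic, no model input):

    # Evaluate the quoted closed-form predictions.
    eta = 1 / 256**4
    print(f"baryon/photon ratio eta = 1/256**4 = {eta:.2e}")      # ~2.33e-10

    ratio = 12.7                                                  # Omega_DM / Omega_B
    print(f"dark matter as a fraction of all matter: {ratio / (1 + ratio):.1%}")  # ~92.7%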
https://doi.org/10.1142/9789812810090_bmatter