A careful study of the classical/quantum connection with the aid of coherent states offers new insights into various technical problems. This analysis includes both canonical and the closely related affine quantization procedures. The new tools are applied to several examples including: (1) A quantum formulation that is invariant under arbitrary classical canonical transformations of coordinates; (2) A toy model whose positive-energy solutions all have singularities, which are removed at the classical level when the correct quantum corrections are applied; (3) A fairly simple model field theory with non-trivial classical behavior that, when conventionally quantized, becomes trivial, but nevertheless finds a proper solution using the enhanced procedures; (4) A class of scalar field theories whose equally non-trivial classical behavior likewise trivializes under conventional quantization but which find proper solutions using the enhanced procedures; (5) A viable formulation of the kinematics of quantum gravity that respects the strict positivity of the spatial metric in both its classical and quantum versions; and (6) A proposal for a non-trivial quantization that is ripe for study by Monte Carlo computational methods. All of these examples use fairly general arguments that can be understood by a broad audience.
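For orientation, the canonical and affine coherent states underlying such analyses are commonly written as follows (our notation and conventions; the abstract itself does not fix them):

$$|p,q\rangle = e^{-iq\hat P/\hbar}\, e^{ip\hat Q/\hbar}\,|0\rangle, \qquad |p,q\rangle_{\mathrm{aff}} = e^{ip\hat Q/\hbar}\, e^{-i(\ln q)\hat D/\hbar}\,|\beta\rangle, \quad q>0,$$

where $\hat D = (\hat P\hat Q + \hat Q\hat P)/2$ is the dilation operator and $|0\rangle$, $|\beta\rangle$ are suitable fiducial vectors.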
I give a brief introduction to many worlds or "no wave function collapse" quantum mechanics, suitable for non-specialists. I then discuss the origin of probability in such formulations, distinguishing between objective and subjective notions of probability.
In our recently proposed quantum theory of gravity, the universe is made of "atoms" of space-time-matter (STM). Planck scale foam is composed of STM atoms with the Planck length as their associated Compton wavelength. The quantum dispersion and accompanying spontaneous localization of these STM atoms amount to a cancellation of the enormous curvature on the Planck length scale. However, an effective dark energy term arises in the Einstein equations, of the order required by current observations on cosmological scales. This happens if we propose an extremely light particle having a mass of about $10^{-33}\,\mathrm{eV}/c^2$, forty-two orders of magnitude lighter than the proton. The holographic principle suggests there are about $10^{122}$ such particles in the observed universe. Their net effect on space-time geometry is equivalent to dark energy, this being a low energy quantum gravitational phenomenon. In this sense, the observed dark energy constitutes evidence for quantum gravity. We then invoke Dirac's large number hypothesis to also propose a dark matter candidate having a mass halfway (on the logarithmic scale) between the proton and the dark energy particle, i.e. about $10^{-12}\,\mathrm{eV}/c^2$.
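A quick consistency check of the quoted mass scales, taking $m_p c^2 \approx 10^{9}\,\mathrm{eV}$ for the proton:

$$\frac{m_p}{m_{\mathrm{DE}}} \approx \frac{10^{9}\,\mathrm{eV}/c^2}{10^{-33}\,\mathrm{eV}/c^2} = 10^{42}, \qquad m_{\mathrm{DM}} \approx \sqrt{m_p\, m_{\mathrm{DE}}} \approx \sqrt{10^{9} \times 10^{-33}}\ \mathrm{eV}/c^2 = 10^{-12}\,\mathrm{eV}/c^2,$$

consistent with the stated forty-two orders of magnitude and the logarithmic-halfway dark matter mass.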
We have recently proposed a Lagrangian in trace dynamics to describe a possible unification of gravity, Yang–Mills fields, and fermions, at the Planck scale. This Lagrangian for the unified entity — called the aikyon — is invariant under global unitary transformations, and as a result possesses a novel conserved charge, known as the Adler–Millard charge. In this paper, we derive an eigenvalue equation, analogous to the time-independent Schrödinger equation, for the Hamiltonian of the theory. We show that in the emergent quantum theory, the energy eigenvalues of the aikyon are characterized in terms of a fundamental frequency times Planck’s constant. The eigenvalues of this equation can, in principle, determine the values of the parameters of the standard model. We also report a ground state, in this theory of spontaneous quantum gravity, which could characterize a non-singular initial epoch in quantum cosmology.
We show that quantum nonequilibrium (or deviations from the Born rule) can propagate nonlocally across space. Such phenomena are allowed in the de Broglie–Bohm pilot-wave formulation of quantum mechanics. We show that an entangled state can act as a channel whereby quantum nonequilibrium can be transferred nonlocally from one region to another without any classical interaction. This suggests a novel mechanism whereby information can escape from behind the classical event horizon of an evaporating black hole.
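In standard pilot-wave notation (ours, not the abstract's), quantum nonequilibrium means the ensemble distribution of configurations departs from the Born rule,

$$\rho(x,t) \neq |\psi(x,t)|^2,$$

whereas the equilibrium distribution $\rho = |\psi|^2$, once attained, is preserved in time by the de Broglie guidance equation (equivariance).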
In this paper we consider theories in which reality is described by some underlying variables, λ. Each value these variables can take represents an ontic state (a particular state of reality). The preparation of a quantum state corresponds to a distribution over the ontic states, λ. If we make three basic assumptions, we can show that the distributions over ontic states corresponding to distinct pure states are nonoverlapping. This means that we can deduce the quantum state from a knowledge of the ontic state. Hence, if these assumptions are correct, we can claim that the quantum state is a real thing (it is written into the underlying variables that describe reality). The key assumption we use in this proof is ontic indifference — that quantum transformations that do not affect a given pure quantum state can be implemented in such a way that they do not affect the ontic states in the support of that state. In fact this assumption is violated in the Spekkens toy model (which captures many aspects of quantum theory and in which different pure states of the model have overlapping distributions over ontic states). This paper proves that ontic indifference must be violated in any model reproducing quantum theory in which the quantum state is not a real thing. The argument presented in this paper is different from that given in a recent paper by Pusey, Barrett and Rudolph. It uses a different key assumption and it pertains to a single copy of the system in question.
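The claim admits a compact statement in standard ontic-models notation (notation ours, not the paper's): if preparing the pure state $|\psi\rangle$ yields the distribution $\mu_\psi(\lambda)$, then for distinct pure states $\psi \neq \phi$,

$$\mu_\psi(\lambda)\,\mu_\phi(\lambda) = 0 \quad \text{for almost all } \lambda,$$

so each ontic state $\lambda$ lies in the support of at most one pure state, and the map $\lambda \mapsto \psi$ is well defined.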
It is well known that classical systems governed by ODEs or PDEs can have extremely complex emergent properties. Many researchers have asked: is it possible that the statistical correlations which emerge over time in classical systems would allow effects as complex as those generated by quantum field theory (QFT)? For example, could parallel computation based on classical statistical correlations in systems based on continuous variables, distributed over space, possibly be as powerful as quantum computing based on entanglement? This paper proves that the answer to this question is essentially "yes," with certain caveats.
More precisely, the paper shows that the statistics of many classical ODE and PDE systems obey dynamics remarkably similar to the Heisenberg dynamics of the corresponding quantum field theory (QFT). It supports Einstein's conjecture that much of quantum mechanics may be derived as a statistical formalism describing the dynamics of classical systems.
Predictions of QFT result from combining quantum dynamics with quantum measurement rules. Bell's Theorem experiments which rule out "classical field theory" may therefore be interpreted as ruling out classical assumptions about measurement which were not part of the PDE. If quantum measurement rules can be derived as a consequence of quantum dynamics and gross thermodynamics, they should apply to a PDE model of reality just as much as they apply to a QFT model. This implies: (1) the real advantage of "quantum computing" lies in the exploitation of quantum measurement effects, which may have possibilities well beyond today's early efforts; (2) Lagrangian PDE models assuming the existence of objective reality should be reconsidered as a "theory of everything." This paper will review the underlying mathematics, prove the basic points, and suggest how a PDE-based approach might someday allow a finite, consistent unified field theory far simpler than superstring theory, the only known alternative to date.
Depending on the outcome of the triphoton experiment now underway, it is possible that the new local realistic Markov Random Field (MRF) models will be the only models now available to correctly predict both that experiment and Bell's theorem experiments. The MRF models represent the experiments as graphs of discrete events over space-time. This paper extends the MRF approach to continuous time by defining a new class of realistic model, the stochastic path model, and showing how it can be applied to ideal Polaroid-type polarizers in such experiments. The final section discusses possibilities for future research, ranging from uses in other experiments or novel quantum communication systems, to extensions involving stochastic paths in the space of functions over continuous space. As part of this, it derives a new Boltzmann-like density operator over Fock space, which predicts the emergent statistical equilibria of nonlinear Hamiltonian field theories, building on our previous work extending the Glauber–Sudarshan P mapping from the case of classical systems described by a complex state variable α to the case of classical continuous fields. This extension may explain the stochastic aspects of quantum theory as the emergent outcome of nonlinear PDEs in a time-symmetric universe.
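For reference, the single-mode Glauber–Sudarshan P mapping being generalized here expresses a density operator as a diagonal mixture of coherent states $|\alpha\rangle$ (this is the standard form; the field-theoretic extension is the paper's contribution):

$$\hat\rho = \int P(\alpha)\, |\alpha\rangle\langle\alpha|\, d^2\alpha .$$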
We argue in a model-independent way that the Hilbert space of quantum gravity is locally finite-dimensional. In other words, the density operator describing the state corresponding to a small region of space, when such a notion makes sense, is defined on a finite-dimensional factor of a larger Hilbert space. Because quantum gravity potentially describes superpositions of different geometries, it is crucial that we associate Hilbert-space factors with spatial regions only on individual decohered branches of the universal wave function. We discuss some implications of this claim, including the fact that quantum field theory cannot be a fundamental description of nature.
It is believed that classical behavior emerges in a quantum system due to decoherence. It has also been proposed that gravity can be a source of this decoherence. We examine this in detail by studying a number of quantum systems, including ultrarelativistic and nonrelativistic particles, at low and high temperatures in an expanding universe, and show that this proposal is valid for a large class of quantum systems.
There must exist a reformulation of quantum field theory which does not refer to classical time. We propose a pre-quantum, pre-spacetime theory, which is a matrix-valued Lagrangian dynamics for gravity, Yang–Mills fields, and fermions. The definition of spin in this theory leads us to an eight-dimensional octonionic spacetime. The algebra of the octonions reveals the standard model; model parameters are determined by roots of the cubic characteristic equation of the exceptional Jordan algebra. We derive the asymptotic low-energy value 1/137 of the fine structure constant, and predict the existence of universally interacting spin one Lorentz bosons, which replace the hypothesised graviton. Gravity is not to be quantized, but is an emergent four-dimensional classical phenomenon, precipitated by the spontaneous localisation of highly entangled fermions.
We introduce new mathematical aspects of the Bell states using matrix factorizations, non-noetherian singularities, and noncommutative blowups. A matrix factorization of a polynomial $p$ consists of two matrices $\phi_1, \phi_2$ such that $\phi_1\phi_2 = \phi_2\phi_1 = p\,\mathrm{id}$. Using this notion, we show how the Bell states emerge from the separable product of two mixtures, by defining pure states over complex matrices rather than just the complex numbers. We then show in an idealized algebraic setting that pure states are supported on non-noetherian singularities. Moreover, we find that the collapse of a Bell state is intimately related to the representation theory of the noncommutative blowup along its singular support. This presents an exchange in geometry: the nonlocal commutative spacetime of the entangled state emerges from an underlying local noncommutative spacetime.
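A minimal worked instance of this definition (ours, not drawn from the paper): over $\mathbb{C}[x,y]$, the polynomial $p = x^2 + y^2$ admits the matrix factorization

$$\phi_1 = \begin{pmatrix} x & -y \\ y & x \end{pmatrix}, \qquad \phi_2 = \begin{pmatrix} x & y \\ -y & x \end{pmatrix}, \qquad \phi_1\phi_2 = \phi_2\phi_1 = (x^2+y^2)\,\mathrm{id},$$

where the off-diagonal entries cancel because $x$ and $y$ commute.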
We discuss the recently observed “loophole free” violation of Bell’s inequalities in the framework of a physically realist view of quantum mechanics (QM), which requires that physical properties are attributed jointly to a system, and to the context in which it is embedded. This approach is clearly different from classical realism, but it does define a meaningful “quantum realism” from a general philosophical point of view. Consistently with Bell test experiments, this quantum realism embeds some form of non-locality, but does not contain any action at a distance, in agreement with QM.
Understanding temporal processes and their correlations in time is of paramount importance for the development of near-term technologies that operate under realistic conditions. Capturing the complete multi-time statistics that define a stochastic process lies at the heart of any proper treatment of memory effects. This is well understood in classical theory, where a hierarchy of joint probability distributions completely characterizes the process at hand. However, attempting to generalize this notion to quantum mechanics is problematic: observing realizations of a quantum process necessarily disturbs the state of the system, breaking an implicit, and crucial, assumption in the classical setting. This issue can be overcome by separating the experimental interventions from the underlying process, enabling an unambiguous description of the process itself and accounting for all possible multi-time correlations for any choice of interrogating instruments.
In this paper, using a novel framework for the characterization of quantum stochastic processes, we first solve the long-standing question of unambiguously describing the memory length of a quantum process. This is achieved by constructing a quantum Markov order condition, which naturally generalizes its classical counterpart for the quantification of finite-length memory effects. As measurements are inherently invasive in quantum mechanics, one has no choice but to define Markov order with respect to the interrogating instruments that are used to probe the process at hand: different memory effects are exhibited depending on how one addresses the system, in contrast to the standard classical setting. We then fully characterize the structural constraints imposed on quantum processes with finite Markov order, shedding light on a variety of memory effects that can arise through various examples. Finally, we introduce an instrument-specific notion of memory strength that allows for a meaningful quantification of the temporal correlations between the history and the future of a process for a given choice of experimental intervention.
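For comparison, the classical Markov order condition that the quantum construction generalizes reads (standard definition; notation ours): a process has Markov order $\ell$ when

$$P(x_k \mid x_{k-1}, x_{k-2}, \ldots, x_1) = P(x_k \mid x_{k-1}, \ldots, x_{k-\ell}),$$

i.e. conditioning on the most recent $\ell$ outcomes screens off the remaining history.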
These findings are directly relevant to both characterizing and exploiting memory effects that persist for a finite duration. In particular, immediate applications range from developing efficient compression and recovery schemes for the description of quantum processes with memory to designing coherent control protocols that efficiently perform information-theoretic tasks, amongst a plethora of others.
In the last five years of his life Itamar Pitowsky developed the idea that the formal structure of quantum theory should be thought of as a Bayesian probability theory adapted to the empirical situation that Nature’s events just so happen to conform to a non-Boolean algebra. QBism too takes a Bayesian stance on the probabilities of quantum theory, but its probabilities are the personal degrees of belief a sufficiently-schooled agent holds for the consequences of her actions on the external world. Thus QBism has two levels of the personal where the Pitowskyan view has one. The differences go further. Most important for the technical side of both views is the quantum mechanical Born Rule, but in the Pitowskyan development it is a theorem, not a postulate, arising in the way of Gleason from the primary empirical assumption of a non-Boolean algebra. QBism on the other hand strives to develop a way to think of the Born Rule in a pre-algebraic setting, so that it itself may be taken as the primary empirical statement of the theory. In other words, the hope in QBism is that, suitably understood, the Born Rule is quantum theory’s most fundamental postulate, with the Hilbert space formalism (along with its perceived connection to a non-Boolean event structure) arising only secondarily. This paper will avail itself of Pitowsky’s program, along with its extensions in the work of Jeffrey Bub and William Demopoulos, to better explicate QBism’s aims and goals.
For the last hundred years, philosophers and physicists alike have been obsessed with the quest of comprehending the mysteries of quantum mechanics. I’ll argue that those who still pursue an understanding of quantum riddles are burdened with a classical view of reality and fail to truly embrace the fundamental quantum aspects of nature.
To date, quantum mechanics has proven to be our most successful theoretical model. However, it is still surrounded by a “mysterious halo”, which can be summarized in a simple but challenging question: Why are quantum phenomena not understood under the same logic as classical ones? Although this is an open question (probably without an answer), from a pragmatist's point of view there is still room enough to further explore the quantum world, marveling at new physical insights along the way. We just need to look back at the historical evolution of the quantum theory and thoroughly reconsider three key issues: (1) how this theory has developed since its early stages at a conceptual level, (2) what kind of experiments can be performed at present in a laboratory, and (3) what nonstandard conceptual models are available to extract some extra information. This contribution is aimed at providing some answers (and, perhaps, also raising some issues) to these questions through one such model, namely Bohmian mechanics, a hydrodynamic formulation of the quantum theory, which is currently trying to open new pathways of understanding. Specifically, the Chapter constitutes a brief and personal overview of the historical and contextual evolution of this quantum formulation, its physical meaning and interest (leaving aside metaphysical issues), and how it may help to overcome some preconceived paradoxical aspects of the quantum theory.