In a quantum measurement setting, it is known that environment-induced decoherence theory describes the emergence of effectively classical features of the composite of the quantum system and the measuring apparatus when the apparatus is allowed to interact with the environment. In [E. A. Galapon, Europhys. Lett. 113, 60007 (2016)], a measurement model is found to have the feature of inducing exact decoherence at a finite time via one internal degree of freedom of the apparatus, provided that the apparatus is decomposed into a pointer and an inaccessible probe, with the pointer and the probe prepared in momentum-limited initial states. However, an issue can be raised against the model: while the factorization method applied there to the time-evolution operator is formally correct, it is not completely rigorous, owing to unstated conditions on the validity of the factorization in the Hilbert space of the model. Furthermore, no examples were given there of implementing the measurement scheme in specific quantum systems. The goal of this paper is to re-examine the model and confirm its features independently by solving the von Neumann equation for the joint state of the composite system as a function of time. This approach reproduces the joint state obtained in the original work, leading to the same conditions for exact decoherence and orthogonal pointer states when the required initial conditions on the probe and pointer are imposed. We illustrate the exact decoherence process in the measurement of observables of a spin-1/2 particle and of a quantum harmonic oscillator using the model.
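Schematically (with notation assumed here for illustration, not taken verbatim from the paper), exact decoherence means that the reduced state of the system, after tracing out the inaccessible probe,
\[
\rho_S(t) = \sum_{m,n} c_m c_n^* \, \gamma_{mn}(t) \, |m\rangle\langle n| ,
\]
loses all coherences at a finite time $t_d$, i.e. $\gamma_{mn}(t_d) = 0$ for all $m \neq n$, leaving the diagonal mixture $\rho_S(t_d) = \sum_n |c_n|^2 \, |n\rangle\langle n|$ over orthogonal pointer states.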
Research on software measurement can be organized around five key conceptual and methodological issues: how to apply measurement theory to software, how to frame software metrics, how to develop metrics, how to collect core measures, and how to analyze measures. The subject is of special concern for industry, which is interested in improving practices, mainly in developing countries, where the software industry represents an opportunity for growth and usually receives institutional support for meeting international quality standards. Academics likewise need to understand and develop more effective methods for managing the software process and assessing the success of products and services, as a result of a heightened awareness of the urgency of aligning business processes and information systems. This paper presents the fundamentals of measurement in software engineering and discusses current issues and foreseeable trends for the subject. A literature review was performed within major academic publications of the last decade, and the findings suggest a noticeable shift of measurement interests towards managing the software process as a whole, without losing sight of the customary focus on hard issues such as algorithm efficiency and worker productivity.
A classical, deterministic, reversible dynamical system that reproduces the Einstein–Podolsky–Rosen (EPR) correlations, in full respect of causality and locality and without the introduction of any ad hoc selection procedure, was constructed in Ref. 4.
In this paper we prove that the above-mentioned model is unique (see Theorem 3.1), in the sense that any local, causal probability measure that reproduces the EPR correlations must coincide, under natural and generic assumptions, with the one constructed in Ref. 4.
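For reference, the EPR (spin-singlet) correlations that any such model must reproduce are, for measurement directions $\mathbf{a}$ and $\mathbf{b}$,
\[
E(\mathbf{a}, \mathbf{b}) = \left\langle (\boldsymbol{\sigma} \cdot \mathbf{a}) \otimes (\boldsymbol{\sigma} \cdot \mathbf{b}) \right\rangle_{\text{singlet}} = -\,\mathbf{a} \cdot \mathbf{b} = -\cos\theta ,
\]
where $\theta$ is the angle between the two directions; this is the standard quantum-mechanical prediction, stated here for context rather than quoted from Ref. 4.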
In this work, we have studied the post-measurement dynamics of several electrons in a double quantum dot system using a new method of calculation. We have solved the effective-mass nonlinear Schrödinger equation for two electron wave packets in a double quantum well, taking into account the different numbers of projected electrons in each quantum well after an observation. We find the existence of a critical density that strongly suppresses the tunneling dynamics between the two quantum wells. In addition, we have shown the possibility of a new kind of electromagnetic radiation emerging from a semiconductor device after a quantum observation.
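The kind of calculation described can be illustrated with a minimal one-dimensional split-step Fourier sketch. The Gross–Pitaevskii-type nonlinearity g|ψ|², the double-well geometry, and all parameter values below are illustrative assumptions, not the effective-mass model or parameters of the cited work.

```python
import numpy as np

# Minimal 1D split-step Fourier solver for a nonlinear Schrodinger
# equation  i*hbar*dpsi/dt = [-hbar^2/(2 m*) d^2/dx^2 + V(x)
# + g*|psi|^2] psi  in a double-well potential (arbitrary units).

hbar = 1.0
m_eff = 1.0          # effective mass (assumed)
g = 5.0              # nonlinear coupling (assumed)
N = 512              # grid points
L = 40.0             # box length
dx = L / N
x = (np.arange(N) - N // 2) * dx
k = 2 * np.pi * np.fft.fftfreq(N, d=dx)

# Double-well potential: two Gaussian wells separated by a barrier.
V = -8.0 * (np.exp(-(x - 5.0) ** 2) + np.exp(-(x + 5.0) ** 2))

# Initial state: wave packet localized in the left well.
psi = np.exp(-(x + 5.0) ** 2).astype(complex)
psi /= np.sqrt(np.sum(np.abs(psi) ** 2) * dx)

dt = 0.001
kinetic = np.exp(-1j * hbar * k ** 2 * dt / (2 * m_eff))

for _ in range(5000):
    # Strang splitting: half-step with potential + nonlinearity in
    # position space, full kinetic step in momentum space, half-step.
    psi *= np.exp(-1j * (V + g * np.abs(psi) ** 2) * dt / (2 * hbar))
    psi = np.fft.ifft(kinetic * np.fft.fft(psi))
    psi *= np.exp(-1j * (V + g * np.abs(psi) ** 2) * dt / (2 * hbar))

# Probability remaining in the left well: a proxy for how a growing
# density (larger g) suppresses inter-well tunneling.
left = np.sum(np.abs(psi[x < 0]) ** 2) * dx
print(f"P(left well) after evolution: {left:.3f}")
```

Rerunning the sketch with increasing g shows the left-well probability staying closer to 1, which is the qualitative self-trapping effect the abstract's "critical density" refers to.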
The creation of our universe is explored in many philosophies, natural sciences, religions, ideologies, traditions, and other disciplines. Currently, natural science cannot answer this question at the most fundamental level. In this work, based on the ancient Chinese Tao wisdom about creation, we propose the Law of Tao Yin–Yang Creation. This law states that everything is created from emptiness through yin–yang interaction. Yin and yang are the two basic elements that make up everything; they are opposite, relative, co-created, inseparable, and co-dependent. The Law of Tao Yin–Yang Creation gives us a deeper insight into space and time. We propose that space and time are two basic measurements we conduct: time relates to the measurement of movement and change, while space relates to the measurement of stillness and solidity. Space and time are a yin–yang pair. The interaction of two fundamental yin–yang pairs, the space–time pair and the inclusion–exclusion duality pair, creates our universe. We demonstrate that from this insight one can derive string theory, superstring or M-theory, and the universal wave function interpretation of string theory. We suggest that the Law of Tao Yin–Yang Creation describes the exact process by which “it from bit” unfolds, and that it could be the fundamental principle leading to the grand unification theory and the theory of everything.
The purpose of this paper is to explore how weak scales, defined in the representational theory of measurement as nominal or ordinal scales, can be calibrated. Prior to this exploration, the definition of the calibration process is analysed and then extended to a more general form compatible with weak scales. The possibility of applying such a process to nominal scales and ordinal scales is presented. Then the calibration of metrical scales is proposed. Finally, an application to the calibration of a colour space is presented.
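One way to picture calibration on a weak scale is as a check that an instrument's readings preserve the known ordering of a set of reference standards. The toy sketch below illustrates that idea only; the item names, readings, and the monotonicity criterion are illustrative assumptions, not the extended calibration process defined in the paper.

```python
# Toy illustration of checking an ordinal scale against reference
# standards: the instrument under test is "calibrated" in the weak
# sense that its readings must preserve the known ordering of the
# reference items.  Data and names are hypothetical.

# Reference items with a known (true) ordering, e.g. hardness standards.
reference_order = ["talc", "gypsum", "calcite", "fluorite", "apatite"]

# Readings produced by the instrument under test (ordinal codes).
readings = {"talc": 1, "gypsum": 2, "calcite": 2, "fluorite": 4, "apatite": 5}

def preserves_order(order, scale):
    """Check weak monotonicity: a later reference item must never
    receive a strictly smaller reading than an earlier one."""
    values = [scale[item] for item in order]
    return all(a <= b for a, b in zip(values, values[1:]))

print(preserves_order(reference_order, readings))  # True: admissible
```

A nominal scale would need a weaker check still (readings need only partition the references consistently), which is one reason the calibration definition has to be generalized before it applies to weak scales.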