Using a Monte Carlo model of biological evolution, it is found that populations can switch between two different strategies of genome evolution: Darwinian purifying selection and complementation of haplotypes. The first is exploited in large panmictic populations, the second in small, highly inbred populations. The choice depends on the crossover frequency. There is a power-law relation between the critical value of the crossover frequency and the size of the panmictic population. Under constant inbreeding this critical value of crossover does not depend on the population size and has the character of a phase transition. Close to this value, sympatric speciation is observed.
The standard Penna ageing model with sexual reproduction is extended by adding additional bit-strings for love: marriage happens only if the male love strings are sufficiently different from the female ones. We simulate the level of required difference at which the population dies out.
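A minimal sketch of the marriage criterion described above, not the authors' code: it assumes 32-bit love strings and a hypothetical Hamming-distance threshold, and estimates the fraction of random pairs that would be allowed to marry.

```python
# Toy sketch (assumptions: 32-bit love strings, hypothetical threshold of 12).
import random

STRING_LENGTH = 32          # assumed length of a love bit-string
REQUIRED_DIFFERENCE = 12    # hypothetical minimum Hamming distance

def random_love_string(length=STRING_LENGTH):
    return random.getrandbits(length)

def hamming_distance(a, b):
    # Number of bit positions in which the two strings differ.
    return bin(a ^ b).count("1")

def marriage_allowed(male_string, female_string, threshold=REQUIRED_DIFFERENCE):
    # Marriage happens only if the love strings differ in at least `threshold` bits.
    return hamming_distance(male_string, female_string) >= threshold

# Estimate the fraction of random pairs that may marry under this rule.
pairs = [(random_love_string(), random_love_string()) for _ in range(10_000)]
fraction = sum(marriage_allowed(m, f) for m, f in pairs) / len(pairs)
print(f"fraction of admissible pairs: {fraction:.3f}")
```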
This paper determines the conditions required for fiscal contractions to be expansionary when government spending is financed by money seigniorage. It shows that the expansionary effects of permanent fiscal contractions depend on the initial rates of inflation prevailing in the economy and that the set of initial inflation rates under which fiscal contractions are expansionary is affected by the degree of substitutability between public and private consumption. Relative to the benchmark case in which public and private consumption are independent, introducing complementarity between them gives rise to a weaker set of conditions for fiscal contractions to be expansionary. This implies that economies that print money to finance government spending that is complementary with private consumption are more likely to experience expansions if they pursue fiscal contractions.
In this paper, we study several NCP-functions for the nonlinear complementarity problem (NCP) which are based on the generalized Fischer–Burmeister function, ϕ_p(a, b) = ||(a, b)||_p − (a + b). It is well known that the NCP can be reformulated as an equivalent unconstrained minimization problem by means of merit functions involving NCP-functions. Thus, we aim to investigate some important properties of these NCP-functions that will be used in solving and analyzing the reformulation of the NCP.
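A short numerical sketch of the generalized Fischer–Burmeister function given in the abstract and the merit function it induces; the mapping F(x) = Mx + q and all numbers below are hypothetical illustration data, not from the paper.

```python
# Generalized Fischer-Burmeister NCP-function and its merit function,
# evaluated on a toy linear complementarity mapping F(x) = M x + q.
import numpy as np

def phi_p(a, b, p=3.0):
    # phi_p(a, b) = ||(a, b)||_p - (a + b); it vanishes exactly when
    # a >= 0, b >= 0 and a*b = 0 (the complementarity condition).
    return (abs(a)**p + abs(b)**p)**(1.0 / p) - (a + b)

def merit(x, F, p=3.0):
    # Unconstrained reformulation: Psi_p(x) = 1/2 * sum_i phi_p(x_i, F_i(x))^2,
    # so x solves the NCP exactly when Psi_p(x) = 0.
    Fx = F(x)
    return 0.5 * np.sum(phi_p(x, Fx, p) ** 2)

# Hypothetical data: F(x) = M x + q with a positive definite M.
M = np.array([[2.0, 1.0], [1.0, 3.0]])
q = np.array([-1.0, 2.0])
F = lambda x: M @ x + q

x = np.array([0.5, 0.0])
print("phi_p componentwise:", phi_p(x, F(x)))
print("merit value:", merit(x, F))
```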
In the last decade, direct-detection Dark Matter (DM) experiments have enormously increased their sensitivity, and ton-scale setups have been proposed, especially using germanium and xenon targets with double readout and background-discrimination capabilities. In light of this situation, we study the prospects for determining the parameters of Weakly Interacting Massive Particle (WIMP) DM (mass, spin-dependent (SD) and spin-independent (SI) cross-sections off nucleons) by combining the results of such experiments in the case of a hypothetical detection. In general, the degeneracy between the SD and SI components of the scattering cross-section can only be removed using targets with different sensitivities to these components. Scintillating bolometers, with particle discrimination capability, very good energy resolution and threshold, and a wide choice of target materials, are an excellent tool for a multitarget complementary DM search. We investigate how the simultaneous use of scintillating targets with different SD-SI sensitivities and/or light isotopes (as in the case of CaF2 and NaI) significantly improves the determination of the WIMP parameters. In order to make the analysis more realistic we include the effect of uncertainties in the halo model and in the spin-dependent nuclear structure functions, as well as the effect of a thermal quenching factor different from 1.
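A schematic illustration, not the paper's analysis, of why targets with different SD-SI sensitivities break the cross-section degeneracy: treating each target's expected rate as roughly linear in the two cross-sections, a single target pins down only one linear combination, while two targets with different coefficient ratios make the system invertible. All coefficients and "true" values below are made up.

```python
# R_t = a_t * sigma_SI + b_t * sigma_SD, with hypothetical coefficients a_t, b_t
# standing in for nuclear-structure and exposure factors.
import numpy as np

coeffs = {
    "Ge":   (50.0, 0.5),   # strongly SI-dominated (illustrative numbers)
    "CaF2": (10.0, 8.0),   # sizeable SD sensitivity through 19F (illustrative)
}

true_si, true_sd = 2.0, 1.5          # arbitrary "true" cross-sections
rates = {t: a * true_si + b * true_sd for t, (a, b) in coeffs.items()}

# One target fixes only one linear combination of (sigma_SI, sigma_SD);
# with two targets the 2x2 system is invertible and both are recovered.
A = np.array([coeffs["Ge"], coeffs["CaF2"]])
r = np.array([rates["Ge"], rates["CaF2"]])
print(np.linalg.solve(A, r))   # -> [2.0, 1.5]
```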
A philosophically consistent axiomatic approach to classical and quantum mechanics is given. The approach realizes a strong formal implementation of Bohr's correspondence principle. In all instances, classical and quantum concepts are fully parallel: the same general theory has a classical realization and a quantum realization. Extending the "probability via expectation" approach of Whittle to noncommuting quantities, this paper defines quantities, ensembles, and experiments as mathematical concepts and shows how to model complementarity, uncertainty, probability, nonlocality and dynamics in these terms. The approach carries no connotation of unlimited repeatability; hence it can be applied to unique systems such as the universe. Consistent experiments provide an elegant solution to the reality problem, confirming the orthodox Copenhagen interpretation's insistence that there is nothing but ensembles, while avoiding its elusive reality picture. The weak law of large numbers explains the emergence of classical properties for macroscopic systems.
Complementarity relations in composite bipartite quantum systems of arbitrary dimensions are proposed. Genuine bipartite quantum properties, whose information content is quantified by the generalized concurrence, mutually exclude the single-partite properties of the subsystems. The single-partite properties are determined by generalized predictabilities and visibilities, i.e. by the traditional concepts of wave-particle duality, and define two complementary realities. These properties, combined together, are complementary to the generalized I-concurrence, which quantifies genuine quantum correlations in n ⊗ m-dimensional bipartite systems. An important feature of the proposed quantitative complementarity relation is its upper bound, which is given by a dimension-dependent factor that exceeds unity in the case of qudits with dimension d > 2. In systems with d > 2 the information content can exceed one bit of information.
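A minimal numerical check of the qubit instance of such a relation, assuming a pure two-qubit state: single-qubit predictability P and visibility V combine with the concurrence C of the pair so that P^2 + V^2 + C^2 = 1. This is a standard special case, not the paper's general n ⊗ m result.

```python
# Check P^2 + V^2 + C^2 = 1 for a random pure two-qubit state.
import numpy as np

rng = np.random.default_rng(0)
psi = rng.normal(size=4) + 1j * rng.normal(size=4)
psi /= np.linalg.norm(psi)

rho = np.outer(psi, psi.conj())                              # 4x4 density matrix
rho_A = rho.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)      # partial trace over qubit B

P = abs(rho_A[0, 0] - rho_A[1, 1]).real                      # predictability (population difference)
V = 2 * abs(rho_A[0, 1])                                     # visibility (coherence)
C = np.sqrt(max(0.0, 2 * (1 - np.trace(rho_A @ rho_A).real)))  # concurrence of a pure state

print(P**2 + V**2 + C**2)   # -> 1.0 up to floating point
```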
A comprehensive theory of phase for finite-dimensional quantum systems is developed. The only physical requirement imposed is that phase be complementary to amplitude. This complementarity is implemented by resorting to the notion of mutually unbiased bases. For a d-dimensional system, where d is a power of a prime, we explicitly construct d + 1 classes of maximally commuting operators, each consisting of d − 1 operators. One of these classes consists of diagonal operators that represent amplitudes; under the finite Fourier transform, operators in this class are mapped to off-diagonal operators that can appropriately be interpreted as phases. The relevant example of a system of qubits is examined in detail.
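A small illustration of the amplitude-phase pairing via the finite Fourier transform, here for the prime dimension d = 5 (my own minimal example, not the paper's full construction of d + 1 classes): the transform maps the diagonal generalized Pauli operator Z to the cyclic shift X, and the two eigenbases are mutually unbiased.

```python
import numpy as np

d = 5
omega = np.exp(2j * np.pi / d)

Z = np.diag(omega ** np.arange(d))                               # diagonal "amplitude" operator
X = np.roll(np.eye(d), 1, axis=0)                                # cyclic shift |k> -> |k+1 mod d>
F = omega ** np.outer(np.arange(d), np.arange(d)) / np.sqrt(d)   # finite Fourier transform

# With this convention the Fourier transform conjugates Z into X.
print(np.allclose(F.conj().T @ Z @ F, X))

# Mutual unbiasedness: the Fourier basis (columns of F) and the computational
# (amplitude) basis have all overlaps of squared modulus 1/d.
print(np.allclose(np.abs(F) ** 2, 1.0 / d))
```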
We develop a unified, information theoretic interpretation of the number-phase complementarity that is applicable both to finite-dimensional (atomic) and infinite-dimensional (oscillator) systems, with number treated as a discrete Hermitian observable and phase as a continuous positive operator valued measure (POVM). The relevant uncertainty principle is obtained as a lower bound on entropy excess, X, the difference between the entropy of one variable, typically the number, and the knowledge of its complementary variable, typically the phase, where knowledge of a variable is defined as its relative entropy with respect to the uniform distribution. In the case of finite-dimensional systems, a weighting of phase knowledge by a factor μ (> 1) is necessary in order to make the bound tight, essentially on account of the POVM nature of phase as defined here. Numerical and analytical evidence suggests that μ tends to 1 as the system dimension becomes infinite. We study the effect of non-dissipative and dissipative noise on these complementary variables for an oscillator as well as atomic systems.
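A numerical sketch of the entropy-excess quantity described above, using my own discretization rather than the paper's code: X = H(number) − K(phase), with the phase distribution taken from the canonical phase POVM and "knowledge" K defined as the relative entropy of that distribution with respect to the uniform one on [0, 2π).

```python
import numpy as np

d = 8
rng = np.random.default_rng(1)
c = rng.normal(size=d) + 1j * rng.normal(size=d)
c /= np.linalg.norm(c)                                 # amplitudes in the number basis

# Number entropy (natural logarithm for simplicity).
p_n = np.abs(c) ** 2
H_number = -np.sum(p_n * np.log(p_n + 1e-300))

# Canonical phase distribution p(theta) = |sum_n c_n e^{-i n theta}|^2 / (2 pi),
# evaluated on a grid fine enough for the Riemann sums below.
theta = np.linspace(0.0, 2 * np.pi, 4096, endpoint=False)
amp = np.exp(-1j * np.outer(theta, np.arange(d))) @ c
p_theta = np.abs(amp) ** 2 / (2 * np.pi)
dtheta = theta[1] - theta[0]

# Knowledge = relative entropy with respect to the uniform density 1/(2 pi).
K_phase = np.sum(p_theta * np.log(p_theta * 2 * np.pi + 1e-300)) * dtheta

print("entropy excess X =", H_number - K_phase)
```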
This essay contends that in quantum gravity, some spatial regions do not admit a unitary Hilbert space. Because the gravitational path integral spontaneously breaks CPT symmetry, “states” with negative probability can be identified on either side of trapped surfaces. I argue that these negative norm states are tolerable, by analogy to quantum mechanics. This viewpoint suggests a resolution of the firewall paradox, similar to black hole complementarity. Implications for cosmology are briefly discussed.
The history of complementary observables and mutually unbiased bases is briefly reviewed. A characterization is given in terms of the conditional entropy of subalgebras due to Connes and Størmer. The extension of complementarity to noncommutative subalgebras is considered as well. Possible complementary decompositions of a four-level quantum system are described, and a characterization of the Bell basis is obtained: the MASA generated by the Bell basis is complementary to both M2(ℂ) tensor factors.
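In concrete terms, complementarity of the Bell-basis MASA to the factor M2(ℂ) ⊗ 1 amounts to every Bell projection having a maximally mixed partial trace; the check below is standard linear algebra, not code from the paper.

```python
import numpy as np

s = 1 / np.sqrt(2)
bell = [
    s * np.array([1, 0, 0, 1]),    # (|00> + |11>)/sqrt(2)
    s * np.array([1, 0, 0, -1]),   # (|00> - |11>)/sqrt(2)
    s * np.array([0, 1, 1, 0]),    # (|01> + |10>)/sqrt(2)
    s * np.array([0, 1, -1, 0]),   # (|01> - |10>)/sqrt(2)
]

for psi in bell:
    rho = np.outer(psi, psi.conj())
    rho_A = rho.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)   # trace out second qubit
    rho_B = rho.reshape(2, 2, 2, 2).trace(axis1=0, axis2=2)   # trace out first qubit
    assert np.allclose(rho_A, np.eye(2) / 2) and np.allclose(rho_B, np.eye(2) / 2)

print("all Bell marginals are maximally mixed")
```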
The main non-classical feature of quantum key distribution (QKD) is a trade-off relation that limits the information an eavesdropper can gain against the quality of the transmission line between the authorized users. In particular, perfect cloning is impossible because of this trade-off, while optimal imperfect cloning saturates the trade-off relation. We investigate this trade-off relation by numerical methods in the case of optimal cloning and find that it reveals a subtle interplay between fidelity and entanglement.
We design a double-slit experimental set-up, realizable at least in principle, for ascertaining three noncommuting quantum observables according to a method alternative to that of the "mean king" problem. Then, we exploit this experimental scheme to identify a physical situation where the known notions of quantum interferometry do not directly apply. A generalization of such notions is proposed that turns out to satisfy the authentic meaning of the interferometric quantum complementarity.
An analytic proof is given which shows that it is impossible to extend any triple of mutually unbiased (MU) product bases in dimension six by a single MU vector. Furthermore, the 16 states obtained by removing two orthogonal states from any MU product triple cannot figure in a (hypothetical) complete set of seven MU bases. These results follow from exploiting the structure of MU product bases in a novel fashion, and they are among the strongest ones obtained for MU bases in dimension six without recourse to computer algebra.
Complementarity is one of the central mysteries of quantum mechanics, dramatically illustrated by the wave-particle duality in Young's double-slit experiment, and famously regarded by Feynman as "impossible, absolutely impossible to describe classically, [and] which has in it the heart of quantum mechanics" (emphasis original).1 The overarching goal of this thesis is to demonstrate that complementarity is also at the heart of quantum information theory, that it allows us to make (some) sense of just what information "quantum information" refers to, and that it is useful in understanding and constructing quantum information processing protocols.
In 1935, Albert Einstein and two colleagues, Boris Podolsky and Nathan Rosen (EPR), developed a thought experiment to demonstrate what they felt was a lack of completeness in quantum mechanics (QM). EPR also postulated the existence of a more fundamental theory in which the physical reality of any system would be completely described by the variables/states of that theory. Such variables are commonly called hidden variables, and the theory is called a hidden-variable theory (HVT). In 1964, John Bell proposed an empirically verifiable criterion to test for the existence of these HVTs. He derived an inequality that must be satisfied by any theory fulfilling the conditions of locality and reality. He also showed that QM, as it violates this inequality, is incompatible with any local-realistic theory. Later it was shown that Bell's inequality (BI) can be derived from different sets of assumptions and that it also finds applications in useful information-theoretic protocols. In this review, we will discuss various foundational as well as information-theoretic implications of BI. We will also discuss some restricted features of quantum nonlocality and elaborate on the role of the uncertainty and complementarity principles in explaining them.
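As a standard textbook illustration (not specific to this review), the CHSH form of Bell's inequality can be checked directly: for the singlet state the correlation at analyzer angles a and b is E(a, b) = −cos(a − b), and the CHSH combination reaches 2√2, above the local hidden-variable bound of 2.

```python
import numpy as np

def E(a, b):
    # Singlet-state correlation for spin measurements along angles a and b.
    return -np.cos(a - b)

a, a_prime = 0.0, np.pi / 2
b, b_prime = np.pi / 4, 3 * np.pi / 4

S = E(a, b) - E(a, b_prime) + E(a_prime, b) + E(a_prime, b_prime)
print(abs(S), "vs classical bound 2")   # -> 2.828... > 2
```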
The two-slit experiment with quantum particles provides many insights into the behavior of quantum mechanics, including Bohr’s complementarity principle. Here, we analyze Einstein’s recoiling slit version of the experiment and show how the inevitable entanglement between the particle and the recoiling slit as a which-way detector is responsible for complementarity. We derive the Englert–Greenberger–Yasin duality from this entanglement, which can also be thought of as a consequence of sum-uncertainty relations between certain complementary observables of the recoiling slit. Thus, entanglement is an integral part of the which-way detection process, and so is uncertainty, though in a completely different way from that envisaged by Bohr and Einstein.
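A minimal sketch of the Englert–Greenberger–Yasin duality for a symmetric two-path state entangled with a which-way detector, assuming equal path amplitudes and pure detector states (a simplified setting, not the recoiling-slit analysis itself): with |ψ⟩ = (|path 1⟩|d1⟩ + |path 2⟩|d2⟩)/√2, the visibility is V = |⟨d1|d2⟩|, the distinguishability is D = √(1 − |⟨d1|d2⟩|²), and D² + V² = 1.

```python
import numpy as np

for theta in np.linspace(0, np.pi / 2, 5):
    d1 = np.array([1.0, 0.0])
    d2 = np.array([np.cos(theta), np.sin(theta)])   # detector state rotated by theta
    overlap = abs(d1.conj() @ d2)

    V = overlap                      # interference fringe visibility
    D = np.sqrt(1 - overlap ** 2)    # which-way distinguishability
    print(f"theta={theta:.2f}  V={V:.3f}  D={D:.3f}  D^2+V^2={D**2 + V**2:.3f}")
```

Mixed detector states or unequal path amplitudes give D² + V² ≤ 1 rather than equality.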
In this work, we study wave-particle duality and the Aharonov–Bohm effect in a Mach–Zehnder interferometer placed in a magnetic field that is in a superposition of two opposite directions. In addition, a "classical" experimental scheme is proposed that mimics such an interferometer.
The open innovation model is the best choice for firms that cannot afford R&D costs but intend to continue playing the innovation game. This model offers any firm the possibility of having companies spread worldwide, across all research fields, as R&D partners. However, possible partnerships can be restricted by the manager's personal network (know-who). Patent documents can be a source of rich information about technical development and innovation from a huge number of firms. Searching through all these daily created documents is a cumbersome task that technology managers cannot afford. This paper aims to introduce an automated model to seek matching firms' R&D using data mining techniques applied to a patent-document database. The methodology covers the search for patent documents from possible partners and the treatment of these data through the association technique among IPC fields. An evaluation system was implemented and a sample experiment was carried out. The results are patterns of technological knowledge interdependence that can be used to evaluate four possible types of partnership.
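A schematic sketch of one way such association mining over IPC fields could look; the IPC codes, firm portfolios, and support measure below are hypothetical illustrations, not the paper's system.

```python
# Co-occurrence associations between IPC codes in two firms' patent portfolios,
# used as a crude proxy for technological-knowledge interdependence.
from collections import Counter
from itertools import combinations

# Each patent is reduced to its set of IPC subclass codes (toy data).
firm_a_patents = [{"G06F", "H04L"}, {"G06F", "G06N"}, {"H04L", "H04W"}]
firm_b_patents = [{"G06N", "G06F"}, {"G06N", "H04L"}, {"G06N", "G16H"}]

def ipc_pair_support(patents):
    # Support of each unordered IPC pair: fraction of patents containing both codes.
    counts = Counter()
    for codes in patents:
        counts.update(combinations(sorted(codes), 2))
    return {pair: n / len(patents) for pair, n in counts.items()}

support_a = ipc_pair_support(firm_a_patents)
support_b = ipc_pair_support(firm_b_patents)

# IPC pairs strongly supported in both portfolios hint at overlapping or
# complementary technical knowledge, i.e. partnership potential.
shared = {p: (support_a[p], support_b[p]) for p in support_a.keys() & support_b.keys()}
print(shared)
```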
In the field of product development, there is an increased focus on product complementarity. This study explores how the elements of multi-homing complementarity in the platform ecosystem (complementarity type, degree of platform complementarity, and degree of complementarity with other products) affect perceived product value. The data comprise 248 samples of smart-home multi-homing products, collected from the Amazon EC platform. The results indicate that prioritizing the degree of platform complementarity can increase the number of reviews, while prioritizing the degree of complementarity with other products can reduce that number. Focusing on unique complementarity can have a negative impact on ratings. Regarding the interaction term, the interaction of unique complementarity and the degree of complementarity with other products increases the number of reviews. Thus, unique complementarity should be included in the product-development process to ensure product quality and enhance sales.