The quantum computer is supposed to process information by applying unitary transformations to the 2^N complex amplitudes defining the state of N qubits. Since a useful machine needs N ~ 10^3 or more, the number of continuous parameters describing the state of a quantum computer at any given moment is at least 2^1000 ~ 10^300, which is much greater than the number of protons in the Universe. Nevertheless, theorists believe that the feasibility of large-scale quantum computing has been proved via the “threshold theorem”. As with any theorem, the proof is based on a number of assumptions taken as axioms. In the physical world, however, none of these assumptions can be fulfilled exactly; each can only be approached with some limited precision. So, the rather meaningless “error per qubit per gate” threshold must be supplemented by a list of the precisions with which all assumptions behind the threshold theorem should hold. No such list exists yet. The theory also appears to ignore the undesired free evolution of the quantum computer caused by the energy differences of the quantum states entering any given superposition. Another important point is that the hypothetical quantum computer will be a system of 10^3–10^6 qubits PLUS an extremely complex and monstrously sophisticated classical apparatus. This huge and strongly nonlinear system will generally exhibit instabilities and chaotic behavior.
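To make the exponential scaling concrete, here is a minimal illustrative sketch (not part of the original text; the helper name amplitude_count is hypothetical) that counts the complex amplitudes needed to describe the state of N qubits:

```python
# Minimal sketch: the state of N qubits is described by 2**N complex
# amplitudes, so the number of continuous parameters grows
# exponentially with N. Illustrative only.

def amplitude_count(n_qubits: int) -> int:
    """Number of complex amplitudes in the state vector of n_qubits qubits."""
    return 2 ** n_qubits

for n in (10, 50, 1000):
    digits = len(str(amplitude_count(n)))  # decimal digits of 2**N
    print(f"N = {n:4d} qubits -> 2**{n} amplitudes (~10**{digits - 1})")
```

For N = 1000 this prints roughly 10^301 amplitudes, the ~10^300 figure quoted above.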