# Quantum-bit definition

The smallest unit of information in a quantum computer, existing in a superposition of two states (1 and 0) and settling on one state or the other only when a measurement of the state is made to retrieve the output of the computation.
noun
The smallest unit of information in a quantum computer. Unlike bits in classical systems, which are in one of two possible states labelled 1 and 0, a quantum bit exists in a superposition of these two states, settling on one or the other only when a measurement of the state is made.
See qubit.
(physics) A normalized linear combination of an "up" state and a "down" state for, say, the spin of an electron. (The "up" and "down" states can be measured along some particular direction, for example by applying a magnetic field in that direction. The electron then responds non-classically in one of two ways:

1. It emits no photon, in which case it has collapsed into the "up" state.
2. It emits a photon of a fixed energy (always the same), in which case it has collapsed into the "down" state.

Afterwards, the "north" pole of the electron's magnetic moment points towards the "south" pole of the magnetic field, so the electron ends up in the "up" state regardless of which state it collapsed into. Before measurement, the coefficient of the "up" state times its complex conjugate gives the probability of collapsing into the "up" state during measurement, and the coefficient of the "down" state times its complex conjugate gives the probability of collapsing into the "down" state. The two probabilities add up to one, i.e., the linear combination is normalized.)
noun
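The probability rule in the physics sense above (each coefficient times its complex conjugate, with the two probabilities summing to one) can be sketched in a few lines of Python. This is an illustrative sketch only: the function names and the choice of an equal superposition are assumptions for the example, not part of any standard API.

```python
import random

# A qubit state is modeled as a normalized pair of complex amplitudes (a, b):
# a multiplies the "up" state, b the "down" state. Names are illustrative.
def measure_probabilities(a, b):
    """Return (P_up, P_down): each coefficient times its complex conjugate."""
    p_up = (a * a.conjugate()).real
    p_down = (b * b.conjugate()).real
    return p_up, p_down

def measure(a, b):
    """Simulate one measurement: collapse to "up" with probability P_up."""
    return "up" if random.random() < (a * a.conjugate()).real else "down"

# Equal superposition: a = b = 1/sqrt(2), so each outcome has probability 1/2.
a = b = complex(2 ** -0.5, 0)
p_up, p_down = measure_probabilities(a, b)
print(p_up, p_down)
# The two probabilities sum to one, i.e. the state is normalized.
print(abs(p_up + p_down - 1) < 1e-12)
```

Repeated calls to `measure(a, b)` with these amplitudes return "up" roughly half the time, matching the pre-measurement probabilities described above.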