Measurement and Superposition

To observe or measure something, you have to interact with the system you’re observing. Just looking at an object through a microscope, for example, requires that the object be bombarded with electromagnetic radiation (light); otherwise you won’t be able to see it. For everyday objects that interaction is utterly negligible, but if you want to measure something extremely tiny, like an atom or subatomic particle, even the simple act of shining light on the object will significantly disturb the very thing you’re trying to measure. In fact, any type of measurement at the subatomic scale significantly alters the object being measured. This fact, together with the strange wave-like nature of fundamental particles, limits what we can measure or observe at the subatomic scale.

[Photo: Werner Heisenberg (1901–1976), photographed in 1926, around the time he was developing the theory. Though many scientists were involved in the development of quantum theory, Heisenberg’s momentous contributions earned him the 1932 Nobel Prize in Physics “for the creation of quantum mechanics.” Credit: Friedrich Hund, CC BY 3.0, via Wikimedia Commons.]

One of the most important limitations on measurement was first recognized by German physicist Werner Heisenberg, a leading pioneer of quantum mechanics. (He was also a Christian, by the way; see this brief essay for some insights into Heisenberg’s personal faith.) Heisenberg realized that the more precisely the position of a particle is measured, the less precisely its momentum at that time can be known, and vice versa. This is called Heisenberg’s uncertainty principle.

More precisely, Heisenberg’s uncertainty principle says that the standard deviation of the position measurement (σₓ) times the standard deviation of the momentum measurement (σₚ) must be greater than or equal to a certain constant, namely half the reduced Planck constant:

σₓσₚ ≥ ħ/2
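To get a feel for the numbers, here is a minimal Python sketch (an illustration added here, not part of the original text) that plugs standard values of ħ and the electron mass into the inequality. It asks: if an electron’s position is pinned down to about the size of an atom, how uncertain must its momentum, and hence its velocity, be?

```python
# Minimum momentum (and velocity) uncertainty for an electron confined
# to an atom-sized region, from sigma_x * sigma_p >= hbar / 2.
hbar = 1.054571817e-34   # reduced Planck constant, J*s
m_e = 9.1093837015e-31   # electron mass, kg

sigma_x = 1e-10                 # position known to ~1 angstrom (about an atom's width), m
sigma_p = hbar / (2 * sigma_x)  # minimum momentum uncertainty, kg*m/s
sigma_v = sigma_p / m_e         # corresponding velocity uncertainty, m/s

print(f"sigma_p >= {sigma_p:.2e} kg*m/s")
print(f"sigma_v >= {sigma_v:.2e} m/s")  # roughly 5.8e5 m/s, i.e. ~580 km/s
```

A velocity uncertainty of hundreds of kilometers per second is anything but negligible, which is why the fuzziness matters so much at the atomic scale even though we never notice it for everyday objects.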

Heisenberg’s uncertainty principle isn’t merely a limitation on what we can measure. Though Heisenberg initially proposed it as a limitation on measurement, physicists today understand it as a deeper principle about reality itself. The uncertainty principle expresses a fundamental limit on the degree to which particles actually have precise positions and momenta: a particle can’t have a precise position and a precise momentum at the same time. The actual state of the particle is vague or “fuzzy,” so to speak.

According to quantum mechanics, the state of a physical system (e.g., a particle or collection of particles) doesn’t necessarily correspond to any definite values for measurement variables like position or velocity. When a system doesn’t have a definite value for a given measurement variable, the system is said to be in a superposition of the various possible values of that variable. In the double-slit experiment, for example, an electron doesn’t have a definite position as it passes through the slits: it is in a superposition of both slit locations.
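To make the idea of a superposition a bit more concrete, here is a toy Python sketch (a deliberately simplified model, not the full wavefunction treatment): the electron’s state is a vector of complex amplitudes over the two slit locations, and the probability of finding it at either one is the squared magnitude of the corresponding amplitude (the Born rule).

```python
import numpy as np

# Toy model of a superposition: the electron's state as a vector of
# complex amplitudes over two possible positions ("left slit", "right
# slit"). Born rule: probability of each outcome = |amplitude|^2.
state = np.array([1, 1j]) / np.sqrt(2)  # equal-weight superposition

probabilities = np.abs(state) ** 2      # -> [0.5, 0.5]
for slit, p in zip(["left slit", "right slit"], probabilities):
    print(f"{slit}: probability {p:.2f}")
# Neither outcome has probability 1: the state simply doesn't assign
# the electron a definite position.
```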

In fact, according to quantum mechanics, electrons and other subatomic particles never have exact positions at all. They are always in superpositions, but the degree of “fuzziness” can vary. A particle’s position can be smeared out, so to speak, over larger or smaller regions of space. Similarly, a subatomic particle never has an exact momentum. It is always in a superposition of many possible momenta, and this superposition may be spread over a larger or smaller range.
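The underlying mathematical reason for this trade-off is that a particle’s momentum distribution is the Fourier transform of its position wavefunction, and squeezing a function makes its transform spread out. The sketch below (again my own illustration, in natural units where ħ = 1) builds Gaussian wave packets of several widths and checks numerically that the product of the two spreads never drops below ħ/2:

```python
import numpy as np

# Natural units (hbar = 1): the momentum distribution is the Fourier
# transform of the position wavefunction, so squeezing one spread
# stretches the other.
N, L = 4096, 400.0
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
dx = x[1] - x[0]
k = 2 * np.pi * np.fft.fftfreq(N, d=dx)  # momentum grid (p = hbar*k = k here)
dk = 2 * np.pi / (N * dx)                # momentum grid spacing

for sx in (0.5, 2.0, 8.0):               # several position spreads to compare
    psi = np.exp(-x**2 / (4 * sx**2))    # Gaussian packet with sigma_x = sx
    pdf_x = np.abs(psi)**2
    pdf_x /= pdf_x.sum() * dx            # normalized position distribution

    pdf_k = np.abs(np.fft.fft(psi))**2
    pdf_k /= pdf_k.sum() * dk            # normalized momentum distribution

    sigma_x = np.sqrt(np.sum(x**2 * pdf_x) * dx)
    sigma_p = np.sqrt(np.sum(k**2 * pdf_k) * dk)
    print(f"sigma_x = {sigma_x:.3f}  sigma_p = {sigma_p:.3f}  "
          f"product = {sigma_x * sigma_p:.3f}")  # stays at ~0.5 = hbar/2
```

Gaussian packets happen to saturate the bound exactly, which is why the product sits right at ħ/2; other wavefunction shapes give a larger product, never a smaller one.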

Measuring the position of a particle forces its superposition to “collapse” to a relatively small region of space: the more precisely you measure a particle’s position, the less fuzzy its position becomes. But there is a trade-off. Measuring a particle’s position forces its momentum to become fuzzier, and measuring its momentum forces its position to become fuzzier; the more precisely you measure the one, the fuzzier the other becomes. Position and momentum are said to be complementary variables, meaning their degrees of superposition are inversely correlated: the greater the superposition of one, the smaller the superposition of the other. (The fuzzier one is, the less fuzzy the other is.) Another example of complementary variables involves the polarization of light, as we’ll see later.
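Finally, here is a rough sketch of the collapse trade-off itself (same natural units and grid as above, and a deliberately simplified model of measurement): a position measurement is treated as filtering the wavefunction through a narrow Gaussian window around the measured spot, and the momentum spread is compared before and after.

```python
import numpy as np

# Simplified "collapse" model (natural units, hbar = 1): a position
# measurement filters the wavefunction through a narrow Gaussian
# window around the measured location, shrinking the position spread
# and thereby inflating the momentum spread.
N, L = 4096, 400.0
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
dx = x[1] - x[0]
k = 2 * np.pi * np.fft.fftfreq(N, d=dx)
dk = 2 * np.pi / (N * dx)

def sigma_p(psi):
    """Momentum standard deviation of a position-space wavefunction."""
    pdf = np.abs(np.fft.fft(psi))**2
    pdf /= pdf.sum() * dk
    return np.sqrt(np.sum(k**2 * pdf) * dk)

broad = np.exp(-x**2 / (4 * 10.0**2))                      # before: sigma_x = 10
collapsed = broad * np.exp(-(x - 3.0)**2 / (4 * 0.5**2))   # after: sigma_x ~ 0.5

print(f"sigma_p before measurement: {sigma_p(broad):.3f}")      # ~0.05
print(f"sigma_p after measurement:  {sigma_p(collapsed):.3f}")  # ~1.0 (fuzzier)
```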