
Mathematical Formalism and Postulates of Quantum Mechanics
The mathematical formulation of quantum mechanics is the body of mathematical formalisms which permits a rigorous description of quantum mechanics. It is distinguished from mathematical formalisms for theories developed prior to the early 1900s by the use of abstract mathematical structures, such as infinite-dimensional Hilbert spaces and operators on these spaces. Many of these structures were drawn from functional analysis, a research area within pure mathematics that developed in parallel with, and was influenced by, the needs of quantum mechanics. In brief, values of physical observables such as energy and momentum were no longer considered as values of functions on phase space, but as eigenvalues of linear operators.

This formulation of quantum mechanics continues to be used today. At the heart of the description are ideas of quantum state and quantum observable which, for systems of atomic scale, are radically different from those used in previous models of physical reality. While the mathematics permits calculation of many quantities that can be measured experimentally, there is a definite theoretical limit to values that can be simultaneously measured. This limitation was first elucidated by Heisenberg through a thought experiment, and is represented mathematically in the new formalism by the non-commutativity of quantum observables.

Prior to the emergence of quantum mechanics as a separate theory, the mathematics used in physics consisted mainly of differential geometry and partial differential equations; probability theory was used in statistical mechanics. Geometric intuition clearly played a strong role in the first two and, accordingly, theories of relativity were formulated entirely in terms of geometric concepts. The phenomenology of quantum physics arose roughly between 1895 and 1915, and for the 10 to 15 years before the emergence of quantum theory (around 1925) physicists continued to think of quantum theory within the confines of what is now called classical physics, and in particular within the same mathematical structures. The most sophisticated example of this is the Sommerfeld-Wilson-Ishiwara quantization rule, which was formulated entirely on the classical phase space.

The Wavefunction
A wave function is a mathematical tool used in quantum mechanics to describe any physical system. It is a function from the space of possible states of the system into the complex numbers. The laws of quantum mechanics (i.e. the Schrödinger equation) describe how the wave function evolves over time. The values of the wave function are probability amplitudes: complex numbers whose squared absolute values give the probability distribution of the system over the possible states.
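As a toy numerical illustration of this rule, the following Python sketch treats a three-state system; the amplitudes are arbitrary illustrative values, not drawn from any physical model:

```python
# Toy example: a system with three possible basis states.  The wave
# function assigns a complex probability amplitude to each state;
# the amplitudes below are illustrative, not physical.
amplitudes = [0.6 + 0.0j, 0.0 + 0.8j, 0.0 + 0.0j]

# Born rule: the probability of each outcome is |amplitude|^2.
probabilities = [abs(a) ** 2 for a in amplitudes]

# A valid wave function is normalized: the probabilities sum to 1.
total = sum(probabilities)
```

Here the probabilities come out to 0.36, 0.64, and 0, and their sum is 1, as required of a normalized state.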

For example, in an atom with a single electron, such as hydrogen or ionized helium, the wave function of the electron provides a complete description of how the electron behaves. It can be decomposed into a series of atomic orbitals which form a basis for the possible wave functions. For atoms with more than one electron (or any system with multiple particles), the underlying space is the possible configurations of all the electrons and the wave function describes the probabilities of those configurations.

Observables and Operators
In mathematics, an operator is a function that acts on (or modifies) another function. Often, an "operator" is a function that acts on functions to produce other functions (the sense in which Oliver Heaviside used the term).

In physics, particularly in quantum physics, a system observable is a property of the system state that can be determined by some sequence of physical operations. For example, these operations might involve submitting the system to various electromagnetic fields and eventually reading a value off some gauge. In systems governed by classical mechanics, any experimentally observable value can be shown to be given by a real-valued function on the set of all possible system states. In quantum physics, on the other hand, the relation between system state and the value of an observable is more subtle, requiring some basic linear algebra to explain. In the mathematical formulation of quantum mechanics, states are given by non-zero vectors in a Hilbert space V (where two vectors are considered to specify the same state if, and only if, they are scalar multiples of each other) and observables are given by self-adjoint operators on V. However, as indicated below, not every self-adjoint operator corresponds to a physically meaningful observable. For the case of a system of particles, the space V consists of functions called wave functions.
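The linear-algebraic picture can be made concrete in a small Python sketch; the 2x2 matrix below (a Pauli matrix) is merely an illustrative observable on a two-dimensional state space:

```python
import numpy as np

# A minimal sketch: an observable on a two-dimensional Hilbert space,
# represented by a self-adjoint (Hermitian) 2x2 matrix.  The matrix
# chosen here (the Pauli x matrix) is purely illustrative.
A = np.array([[0, 1],
              [1, 0]], dtype=complex)

# Self-adjointness: A equals its conjugate transpose.
assert np.allclose(A, A.conj().T)

# The eigenvalues -- the possible measurement outcomes -- are real,
# and the eigenvectors represent states with a definite outcome.
eigenvalues, eigenvectors = np.linalg.eigh(A)
print(eigenvalues)  # [-1.  1.]
```

The guarantee that a self-adjoint operator has only real eigenvalues is what makes such operators suitable representatives of measurable quantities.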

In quantum mechanics, measurement of observables exhibits some seemingly mysterious phenomena. This often leads to many misconceptions about the nature of quantum mechanics itself. The facts of the matter, however, are far more ordinary. Specifically, if a system is in a state described by a wave function, the measurement process affects the state in a non-deterministic, but statistically predictable way. In particular, after a measurement is applied, the state description by a single wave function may be destroyed, being replaced by a statistical ensemble of wave functions. The irreversible nature of measurement operations in quantum physics is sometimes referred to as the measurement problem and is described mathematically by quantum operations. By the structure of quantum operations, this description is mathematically equivalent to that offered by relative state interpretation where the original system is regarded as a subsystem of a larger system and the state of the original system is given by the partial trace of the state of the larger system.

Physically meaningful observables must also satisfy transformation laws which relate observations performed by different observers in different frames of reference. These transformation laws are automorphisms of the state space, that is bijective transformations which preserve some mathematical property. In the case of quantum mechanics, the requisite automorphisms are unitary (or antiunitary) linear transformations of the Hilbert space V. Under Galilean relativity or special relativity, the mathematics of frames of reference is particularly simple, and in fact restricts considerably the set of physically meaningful observables.

Postulates of Quantum Mechanics
The following summary of the mathematical framework of quantum mechanics can be partly traced back to von Neumann's postulates.
 * Each physical system is associated with a (topologically) separable complex Hilbert space H with inner product $$\langle\phi\mid\psi\rangle$$. Rays (one-dimensional subspaces) in H are associated with states of the system. In other words, physical states can be identified with equivalence classes of vectors of length 1 in H, where two vectors represent the same state if they differ only by a phase factor. Separability is a mathematically convenient hypothesis, with the physical interpretation that countably many observations are enough to uniquely determine the state.
 * The Hilbert space of a composite system is the Hilbert space tensor product of the state spaces associated with the component systems. For a non-relativistic system consisting of a finite number of distinguishable particles, the component systems are the individual particles.
 * Physical symmetries act on the Hilbert space of quantum states unitarily or antiunitarily (supersymmetry is another matter entirely).
 * Physical observables are represented by densely-defined self-adjoint operators on H.
 * The expected value (in the sense of probability theory) of the observable A for the system in state represented by the unit vector $$\left|\psi\right\rangle\in H$$ is
 * $$\langle\psi\mid A\mid\psi\rangle$$
 * By spectral theory, we can associate a probability measure to the values of A in any state ψ. We can also show that the possible values of the observable A in any state must belong to the spectrum of A. In the special case where A has only a discrete spectrum, the possible outcomes of measuring A are its eigenvalues.


 * More generally, a state can be represented by a so-called density operator, which is a trace class, nonnegative self-adjoint operator $$\rho$$ normalized to be of trace 1. The expected value of A in the state $$\rho$$ is
 * $$ \operatorname{tr}(A\rho)$$
 * If $$\rho_\psi$$ is the orthogonal projector onto the one-dimensional subspace of H spanned by $$\left|\psi\right\rangle$$, then
 * $$ \operatorname{tr}(A\rho_\psi)=\left\langle\psi\mid A\mid\psi\right\rangle$$
 * Density operators are those that are in the closure of the convex hull of the one-dimensional orthogonal projectors. Conversely, one-dimensional orthogonal projectors are extreme points of the set of density operators. Physicists also call one-dimensional orthogonal projectors pure states and other density operators mixed states.
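The relations above can be checked numerically; in the Python sketch below the observable and states are arbitrary illustrative choices on a two-dimensional Hilbert space:

```python
import numpy as np

# Illustrative observable and state on a two-dimensional Hilbert space.
A = np.array([[1, 0], [0, -1]], dtype=complex)        # an observable
psi = np.array([1, 1], dtype=complex) / np.sqrt(2)    # a unit vector

# Pure state: rho is the orthogonal projector onto the span of psi.
rho_pure = np.outer(psi, psi.conj())
assert np.isclose(np.trace(rho_pure).real, 1.0)       # trace 1

# tr(A rho) reproduces <psi|A|psi>.
expect_trace = np.trace(A @ rho_pure).real
expect_inner = (psi.conj() @ A @ psi).real
assert np.isclose(expect_trace, expect_inner)

# Mixed state: a convex combination of projectors is still nonnegative
# with trace 1, but it is no longer a projector (rho^2 != rho).
phi = np.array([1, 0], dtype=complex)
rho_mixed = 0.5 * rho_pure + 0.5 * np.outer(phi, phi.conj())
assert np.isclose(np.trace(rho_mixed).real, 1.0)
assert not np.allclose(rho_mixed @ rho_mixed, rho_mixed)
```

The last assertion reflects the extreme-point statement: a density operator is a one-dimensional projector (a pure state) exactly when it satisfies ρ² = ρ.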

The Uncertainty Principle
The Uncertainty Principle is a mathematical principle which follows from the definitions of the position and momentum operators (namely, the lack of commutativity between them) and which governs the behavior of matter at atomic and subatomic scales.

The Uncertainty Principle was developed as an answer to the question: how does one measure the location of an electron around a nucleus if an electron is a wave? As quantum mechanics matured, the principle came to be seen as a relation between the classical and quantum descriptions of a system in wave mechanics.

In 1927, working at Niels Bohr's institute in Copenhagen, Werner Heisenberg formulated the principle of uncertainty, thereby laying a foundation of what became known as the Copenhagen interpretation of quantum mechanics. Heisenberg had been studying the papers of Paul Dirac and Pascual Jordan, and discovered a problem with the measurement of basic variables in the equations. His analysis showed that uncertainties, or imprecisions, always turned up if one tried to measure the position and the momentum of a particle at the same time. Heisenberg concluded that these uncertainties were not the fault of the experimenter, but fundamental in nature: inherent mathematical properties of the operators of quantum mechanics, arising from the definitions of those operators.

The term "Copenhagen interpretation of quantum mechanics" was often used interchangeably with Heisenberg's Uncertainty Principle, particularly by detractors who believed in determinism and saw the common features of the Bohr-Heisenberg theories as a threat. Within the widely but not universally accepted Copenhagen interpretation (it was not accepted by Einstein, nor by other physicists such as Alfred Landé), the uncertainty principle is taken to mean that on an elementary level the physical universe does not exist in a deterministic form, but rather as a collection of probabilities, or potentials. For example, the pattern (probability distribution) produced by millions of photons passing through a diffraction slit can be calculated using quantum mechanics, but the exact path of each photon cannot be predicted by any known method. The Copenhagen interpretation holds that it cannot be predicted by any method, not even with theoretically infinitely precise measurements.

If one goes even further to the direct interpretation that classical physics and ordinary language are only approximations to a completely quantum reality, then the probabilities are assigned to these approximations and are no longer fundamental. The equations of quantum mechanics themselves specify the progression of the quantum state of any isolated system uniquely.

Uncertainty Principle Versus Observer Effect
The uncertainty principle in quantum mechanics is sometimes erroneously explained by claiming that the measurement of position necessarily disturbs a particle's momentum, and vice versa, i.e., that the uncertainty principle is a manifestation of the observer effect. Indeed, Heisenberg himself may have initially offered explanations which suggested this view. Prior to the more modern understanding, a measurement was often visualized as a physical disturbance inflicted directly on the measured system, sometimes illustrated by the thought experiment known as Heisenberg's microscope. For instance, when measuring the position of an electron, one imagines shining a light on it, thus disturbing the electron and producing the quantum mechanical uncertainties in its position.

Equating the uncertainty principle with the observer effect mischaracterizes the way measurement in quantum mechanics is understood. The uncertainty principle is an inequality governing the statistical spread of the wave function, whereas the observer effect produces a systematic error. Because the uncertainty principle is a statistical property of the wave function, it appears most readily in a sequence of many measurements.

Consider a hypothetical experiment in which a physicist prepares an ensemble of 2N particles in the same way (so that each is in precisely the same initial quantum state). Suppose further that the physicist is using perfect measuring equipment and that N is sufficiently large for the net result to be statistically significant. For the first N particles of this ensemble, the position would be measured and recorded, giving a probability distribution for position. For the remaining N particles, momentum would be measured, giving a probability distribution for momentum. Finally, the product of the standard deviations would be computed, giving a value of at least $$\hbar/2$$. If the position and momentum had instead been measured in succession on the same particle, the results of the second measurement would not reflect the original state; that is a correct application of the observer effect. But in this experiment no such claim is made: the physicist never attempts to measure the position and momentum of a single particle, but measures each on a different set of N particles prepared in the same initial state, so one measurement cannot affect the other. Moreover, although each measurement collapses the quantum state of the particle, the probability distribution resulting from these measurements correctly reflects the quantum state as it existed before the measurement. Consequently, the uncertainty principle should be regarded as an intrinsic statistical smearing, not a limitation of the measuring equipment.
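The two-ensemble experiment can be mimicked numerically. The sketch below (with hbar = 1) assumes the textbook result that a Gaussian wave packet of position spread sigma has momentum spread hbar/(2*sigma), and draws independent position and momentum samples accordingly:

```python
import numpy as np

# Sketch of the two-ensemble experiment, with hbar = 1.  Real measurements
# are replaced by sampling from the position and momentum distributions of
# a Gaussian wave packet; the momentum spread hbar/(2*sigma) is a standard
# result assumed here, not derived.
rng = np.random.default_rng(0)
hbar, sigma, N = 1.0, 0.7, 200_000

positions = rng.normal(0.0, sigma, N)               # first N particles
momenta = rng.normal(0.0, hbar / (2 * sigma), N)    # remaining N particles

# The Gaussian packet saturates the bound, so the product of the sample
# standard deviations should come out close to hbar/2.
product = positions.std() * momenta.std()
```

No sample is used for both measurements, mirroring the point in the text that one measurement cannot affect the other; the product still obeys the bound because it is built into the state itself.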

In any case, it is now understood that the uncertainties in the system exist prior to and independent of the measurement, and the uncertainty principle is therefore independent of the observer effect.

Consequences of quantum symmetry
Remarkably, there exists a realm of physics for which mathematical assertions of simple symmetries in real objects cease to be approximations. That is the domain of quantum physics, which for the most part is the physics of very small, very simple objects such as electrons, protons, light, and atoms.

Unlike everyday objects, objects such as electrons have very limited numbers of configurations, called states, in which they can exist. This means that when symmetry operations such as exchanging the positions of components are applied to them, the resulting new configurations often cannot be distinguished from the originals no matter how diligent an observer is. Consequently, for sufficiently small and simple objects the generic mathematical symmetry assertion F(x) = x ceases to be approximate, and instead becomes an experimentally precise and accurate description of the situation in the real world.

While it makes sense that symmetries could become exact when applied to very simple objects, the immediate intuition is that such a detail should not affect the physics of such objects in any significant way. This is in part because it is very difficult to view the concept of exact similarity as physically meaningful. Our mental picture of such situations is invariably the same one we use for large objects: We picture objects or configurations that are very, very similar, but for which if we could "look closer" we would still be able to tell the difference.

However, the assumption that exact symmetries in very small objects should not make any difference in their physics was discovered in the early 1900s to be spectacularly incorrect. The situation was succinctly summarized by Richard Feynman in the direct transcripts of his Feynman Lectures on Physics, Volume III, Section 3.4, Identical particles. (Unfortunately, the quote was edited out of the printed version of the same lecture.)


 * "... if there is a physical situation in which it is impossible to tell which way it happened, it always interferes; it never fails."

The word "interferes" in this context is a quick way of saying that such objects fall under the rules of quantum mechanics, in which they behave more like waves that interfere than like everyday large objects.

In short, when an object becomes so simple that a symmetry assertion of the form F(x) = x becomes an exact statement of experimentally verifiable sameness, x ceases to follow the rules of classical physics and must instead be modeled using the more complex—and often far less intuitive—rules of quantum physics.

This transition also provides an important insight into why the mathematics of symmetry is so deeply intertwined with that of quantum mechanics. When physical systems make the transition from symmetries that are approximate to ones that are exact, the mathematical expressions of those symmetries cease to be approximations and become precise definitions of the underlying nature of the objects. From that point on, the correlation of such objects to their mathematical descriptions becomes so close that it is difficult to separate the two.

The Correspondence Principle
In physics, the correspondence principle is a quantitative tool of the old quantum theory, explicitly formulated by Niels Bohr in 1923. It states that the behavior of quantum mechanical systems reproduces classical physics in the limit of large quantum numbers.

Expectation Values
Quantum physics shows an inherent statistical behavior: the measured outcome of an experiment will generally not be the same if the experiment is repeated several times. Only the statistical mean of the measured values, averaged over a large number of runs of the experiment, is a repeatable quantity. Quantum theory does not, in fact, predict the result of individual measurements, but only their statistical mean. This predicted mean value is called the expectation value.

While the computation of the mean value of experimental results is very much the same as in classical statistics, its mathematical representation in the formalism of quantum theory differs significantly from classical measure theory.

Formalism in quantum mechanics
In quantum theory, an experimental setup is described by the observable $$A$$ to be measured, and the state $$\sigma$$ of the system. The expectation value of $$A$$ in the state $$\sigma$$ is denoted as $$\langle A \rangle_\sigma$$.

Mathematically, $$A$$ is a self-adjoint operator on a Hilbert space. In the most commonly used case in quantum mechanics, $$\sigma$$ is a pure state, described by a normalized vector $$\psi$$ in the Hilbert space. The expectation value of $$A$$ in the state $$\psi$$ is defined as

(1)     $$ \langle A \rangle_\psi = \langle \psi | A \psi \rangle $$.

If dynamics is considered, either the vector $$\psi$$ or the operator $$A$$ is taken to be time-dependent, depending on whether the Schrödinger picture or Heisenberg picture is used. The time-dependence of the expectation value does not depend on this choice, however.
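The picture-independence of expectation values can be verified directly; in the two-level Python sketch below (with hbar = 1) the Hamiltonian, observable, and initial state are illustrative choices:

```python
import numpy as np

# The same expectation value computed in the Schrodinger picture (the
# state evolves) and the Heisenberg picture (the operator evolves).
# H, A and the initial state are illustrative choices; hbar = 1.
H = np.array([[0, 1], [1, 0]], dtype=complex)    # Hamiltonian
A = np.array([[1, 0], [0, -1]], dtype=complex)   # observable
psi = np.array([1, 0], dtype=complex)

t = 0.8
U = np.cos(t) * np.eye(2) - 1j * np.sin(t) * H   # exp(-iHt), since H @ H = I

# Schrodinger picture: evolve the state, keep the operator fixed.
schrodinger = ((U @ psi).conj() @ A @ (U @ psi)).real
# Heisenberg picture: evolve the operator, keep the state fixed.
heisenberg = (psi.conj() @ (U.conj().T @ A @ U) @ psi).real
assert np.isclose(schrodinger, heisenberg)
```

Both computations regroup the same product ⟨ψ|U†AU|ψ⟩, which is why the time-dependence of the expectation value cannot depend on the choice of picture.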

If $$A$$ has a complete set of eigenvectors $$\phi_j$$, with eigenvalues $$a_j$$, then (1) can be expressed as

(2)     $$ \langle A \rangle_\psi = \sum_j a_j |\langle \psi | \phi_j \rangle|^2 $$.

This expression is similar to the arithmetic mean, and illustrates the physical meaning of the mathematical formalism: the eigenvalues $$a_j$$ are the possible outcomes of the experiment, and the corresponding coefficient $$|\langle \psi | \phi_j \rangle|^2$$ is the probability that this outcome will occur; it is often called the transition probability.
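Formulas (1) and (2) can be compared numerically; the sketch below uses a randomly generated 3x3 self-adjoint matrix as an illustrative observable:

```python
import numpy as np

# A randomly generated self-adjoint 3x3 matrix stands in for the observable.
rng = np.random.default_rng(1)
M = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
A = (M + M.conj().T) / 2                     # self-adjoint by construction

psi = rng.normal(size=3) + 1j * rng.normal(size=3)
psi = psi / np.linalg.norm(psi)              # normalized state vector

# (1): the expectation value as a direct inner product.
direct = (psi.conj() @ A @ psi).real

# (2): eigenvalues weighted by the transition probabilities.
a, phi = np.linalg.eigh(A)                   # columns of phi are eigenvectors
weights = np.abs(phi.conj().T @ psi) ** 2    # |<phi_j | psi>|^2
spectral = (a * weights).sum()

assert np.isclose(weights.sum(), 1.0)        # probabilities sum to one
assert np.isclose(direct, spectral)
```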

A particularly simple case arises when $$A$$ is a projection, and thus has only the eigenvalues 0 and 1. This physically corresponds to a "yes-no" type of experiment. In this case, the expectation value is the probability that the experiment results in "1", and it can be computed as

(3)     $$ \langle A \rangle_\psi = \| A \psi \|^2$$.
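A minimal numerical check of (3), with an illustrative projection onto one coordinate axis:

```python
import numpy as np

# A projection P (eigenvalues 0 and 1) models a "yes-no" experiment;
# the state and the projected subspace chosen here are illustrative.
psi = np.array([0.6, 0.8], dtype=complex)       # a normalized state
P = np.array([[1, 0], [0, 0]], dtype=complex)   # projector onto the first axis

prob_yes = (psi.conj() @ P @ psi).real          # expectation value of P
assert np.isclose(prob_yes, np.linalg.norm(P @ psi) ** 2)   # formula (3)
```

For this state the probability of the answer "1" is 0.36, the squared amplitude along the projected axis.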

Quantum theory also employs operators with a non-discrete spectrum, such as the position operator $$Q$$ in quantum mechanics. This operator has no eigenvalues, but a purely continuous spectrum. In this case, the vector $$\psi$$ can be written as a complex-valued function $$\psi(x)$$ on the spectrum of $$Q$$ (usually the real line). For the expectation value of the position operator, one then has the formula

(4)     $$ \langle Q \rangle_\psi = \int \, x \, |\psi(x)|^2 \, dx$$.

A similar formula holds for the momentum operator $$P$$ in systems where it has a continuous spectrum.
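Formula (4) can be evaluated numerically on a grid; the Gaussian wave function below, centered at an arbitrary point x0, is an illustrative choice:

```python
import numpy as np

# A Gaussian wave function centered at an illustrative point x0; by
# symmetry the expectation value of Q should come out equal to x0.
x0, sigma = 1.5, 0.4
x = np.linspace(-10.0, 10.0, 20001)
dx = x[1] - x[0]
psi = (2 * np.pi * sigma**2) ** -0.25 * np.exp(-((x - x0) ** 2) / (4 * sigma**2))

density = np.abs(psi) ** 2                      # |psi(x)|^2
assert np.isclose((density * dx).sum(), 1.0)    # normalization check
mean_x = (x * density * dx).sum()               # formula (4) on a grid
```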

All the above formulae are valid for pure states $$\sigma$$ only. Mixed states, which are prominent in thermodynamics, are also important; these are described by a positive trace-class operator $$\rho = \sum_i \rho_i | \psi_i \rangle \langle \psi_i |$$, the statistical operator or density matrix. The expectation value can then be obtained as

(5)     $$ \langle A \rangle_\rho = \operatorname{tr} (\rho A) =  \sum_i \rho_i \langle \psi_i | A \psi_i \rangle = \sum_i \rho_i \langle A \rangle_{\psi_i} $$.
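Formula (5) can be verified in a small sketch; the weights and observable below are illustrative:

```python
import numpy as np

# A mixed state built from two orthonormal vectors with illustrative
# weights 0.75 and 0.25; A is an arbitrary self-adjoint observable.
A = np.array([[2, 1], [1, 0]], dtype=complex)
states = [np.array([1, 0], dtype=complex), np.array([0, 1], dtype=complex)]
weights = [0.75, 0.25]

# rho = sum_i rho_i |psi_i><psi_i|
rho = sum(w * np.outer(p, p.conj()) for w, p in zip(weights, states))

lhs = np.trace(rho @ A).real                                     # tr(rho A)
rhs = sum(w * (p.conj() @ A @ p).real for w, p in zip(weights, states))
assert np.isclose(lhs, rhs)                                      # formula (5)
```

The expectation value in the mixture is simply the weighted average of the pure-state expectation values, as (5) asserts.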

The Ehrenfest Theorem
The Ehrenfest theorem, named after Paul Ehrenfest, relates the time derivative of the expectation value for a quantum mechanical operator to the commutator of that operator with the Hamiltonian of the system. It is


 * $$\frac{d}{dt}\langle A\rangle = \frac{1}{i\hbar}\langle [A,H] \rangle + \left\langle \frac{\partial A}{\partial t}\right\rangle $$

where A is some QM operator and $$\langle A\rangle$$ is its expectation value. Ehrenfest's theorem fits neatly into the Heisenberg picture of quantum mechanics.

Ehrenfest's theorem is closely related to Liouville's theorem from Hamiltonian mechanics, which involves the Poisson bracket instead of a commutator. In fact, it is a general rule of thumb that a theorem in quantum mechanics which contains a commutator can be turned into a theorem in classical mechanics by changing the commutator into a Poisson bracket and multiplying by $$i\hbar$$.

The theorem can be shown to follow from the Lindblad equation, a master equation for the time evolution of a mixed state.

Derivation
Suppose some system is presently in a quantum state $$\Phi$$. The instantaneous time derivative of the expectation value of A is, by definition,


 * $$ \frac{d}{dt}\langle A\rangle = \frac{d}{dt}\int \Phi^* A \Phi~d^3x = \int \left( \frac{\partial \Phi^*}{\partial t} \right) A\Phi~d^3x + \int \Phi^* \left( \frac{\partial A}{\partial t}\right) \Phi~d^3x +\int \Phi^* A \left( \frac{\partial \Phi}{\partial t} \right) ~d^3x $$


 * $$ = \int \left( \frac{\partial \Phi^*}{\partial t} \right) A\Phi~d^3x + \left\langle \frac{\partial A}{\partial t}\right\rangle + \int \Phi^* A \left( \frac{\partial \Phi}{\partial t} \right) ~d^3x, $$

where we are integrating over all space. Often (but not always) the operator A is time independent, so that its derivative is zero and we can ignore the middle term. If we apply the Schrödinger equation, we find that


 * $$\frac{\partial \Phi}{\partial t} = \frac{1}{i\hbar}H\Phi$$

and


 * $$\frac{\partial \Phi^*}{\partial t} = \frac{-1}{i\hbar}\Phi^*H^* = \frac{-1}{i\hbar}\Phi^*H.$$

Notice that $$H=H^*$$ because the Hamiltonian is Hermitian. Substituting this into the above equation, we have


 * $$\frac{d}{dt}\langle A\rangle = \frac{1}{i\hbar}\int \Phi^* (AH-HA) \Phi~d^3x + \left\langle \frac{\partial A}{\partial t}\right\rangle = \frac{1}{i\hbar}\langle [A,H]\rangle + \left\langle \frac{\partial A}{\partial t}\right\rangle.$$
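Ehrenfest's theorem (for a time-independent A, so the ∂A/∂t term drops out) can be checked numerically on a two-level system; the Hamiltonian and observable below are illustrative Pauli matrices, with hbar = 1:

```python
import numpy as np

# Two-level check of Ehrenfest's theorem with hbar = 1 and a
# time-independent observable, so d<A>/dt = <[A, H]> / i.
H = np.array([[0, 1], [1, 0]], dtype=complex)    # Hamiltonian (Pauli x)
A = np.array([[1, 0], [0, -1]], dtype=complex)   # observable  (Pauli z)
psi0 = np.array([1, 0], dtype=complex)

def U(t):
    # exp(-iHt), valid because H @ H = I for this particular H
    return np.cos(t) * np.eye(2) - 1j * np.sin(t) * H

def expect(op, t):
    psi = U(t) @ psi0
    return (psi.conj() @ op @ psi).real

# Left side: finite-difference derivative of <A> at time t.
t, eps = 0.3, 1e-6
numeric = (expect(A, t + eps) - expect(A, t - eps)) / (2 * eps)

# Right side: <[A, H]> / (i hbar) in the state at time t.
psi_t = U(t) @ psi0
commutator = A @ H - H @ A
theorem = (psi_t.conj() @ commutator @ psi_t / 1j).real
assert np.isclose(numeric, theorem, atol=1e-5)
```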

Wave Packets
In physics, a wave packet is an envelope or packet containing an arbitrary number of wave forms. In quantum mechanics the wave packet is ascribed a special significance: it is interpreted to be a "probability wave" describing the probability that a particle or particles in a particular state will be measured to have a given position and momentum.

By applying the Schrödinger equation in quantum mechanics it is possible to deduce the time evolution of a system, similar to the process of the Hamiltonian formalism in classical mechanics. The wave packet is a mathematical solution to the Schrödinger equation. The square of the absolute value of the wave packet, integrated over a region, gives the probability of finding the particle in that region.

In the coordinate representation of the wave (such as the Cartesian coordinate system) the position of the wave is given by the position of the packet. Moreover, the narrower the wave packet, and therefore the better defined the position of the wave packet, the larger the uncertainty in the momentum of the wave. This trade-off is known as the Heisenberg uncertainty principle.
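The trade-off can be demonstrated with a discrete Fourier transform: a Gaussian packet that is narrower in position comes out broader in wavenumber. The grid sizes in the sketch below are arbitrary choices:

```python
import numpy as np

# Two Gaussian packets, narrow and wide in position, transformed to
# wavenumber space with an FFT; grid parameters are arbitrary choices.
def wavenumber_spread(sigma_x, n=4096, L=80.0):
    x = np.linspace(-L / 2, L / 2, n, endpoint=False)
    psi = np.exp(-x**2 / (4 * sigma_x**2))         # unnormalized packet
    phi = np.fft.fft(psi)                          # wavenumber-space amplitude
    k = 2 * np.pi * np.fft.fftfreq(n, d=L / n)     # matching k grid
    w = np.abs(phi) ** 2
    w = w / w.sum()                                # probability weights
    return np.sqrt((w * k**2).sum())               # spread about k = 0

narrow = wavenumber_spread(0.5)   # well-localized packet
wide = wavenumber_spread(2.0)     # spread-out packet
assert narrow > wide              # narrower in x => broader in k
```

For a Gaussian the spreads come out close to 1/(2 sigma_x), the minimum-uncertainty value.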
