Dot product



The dot product, also known as the "inner product" or, less commonly, the "scalar product", is a number associated with a pair of vectors. It figures prominently in many problems in physics, and variants of it appear in an enormous number of mathematical areas.

Geometric Definition
It is defined geometrically as the product of the lengths of the two vectors times the cosine of the angle between them. This definition is used in 2 dimensions (plane geometry) and in 3 dimensions (solid geometry and physics). The dot product is written with a raised dot between the vectors,
 * $$\vec{A} \cdot \vec{B} = AB\cos\theta$$,

where $$\theta$$ is the angle between the two vectors. Following a common convention, the magnitudes have been written as the variable without the vector symbol: $$A\equiv ||\vec A||$$, and $$B\equiv ||\vec B||$$.

A few properties are immediately apparent.
 * It is symmetric: $$\vec{A} \cdot \vec{B} = \vec{B} \cdot \vec{A}\,$$.
 * It is linear in each argument: $$(\lambda\vec{A}) \cdot \vec{B} = \vec{A} \cdot (\lambda\vec{B}) = \lambda (\vec{A} \cdot \vec{B})\,$$.
 * If the two vectors are orthogonal (perpendicular) to each other, their dot product is zero. This is because the cosine of 90° is zero.
 * The length of a vector is the square root of the dot product of the vector with itself. This is because the angle is 0, so the cosine is 1.
 * $$\|\vec{A}\| = \sqrt{\vec{A} \cdot \vec{A}}\,$$
 * (The double vertical bars shown above denote the norm, which can be thought of as a generalization of the absolute value of a number. It is used in many places in mathematics; for vectors, it is the length.)
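These properties can be checked numerically. The sketch below uses a hypothetical helper, `dot_geometric`, that implements the geometric definition directly from two lengths and an angle; it illustrates the orthogonality and length properties:

```python
import math

def dot_geometric(len_a, len_b, theta):
    """Geometric definition: product of the lengths times cos(angle between)."""
    return len_a * len_b * math.cos(theta)

# Orthogonal vectors: cos(90 degrees) is zero, so the dot product vanishes
# (up to floating-point roundoff).
print(abs(dot_geometric(3.0, 4.0, math.pi / 2)) < 1e-12)  # True

# A vector with itself: the angle is 0 and cos(0) = 1, so A.A = |A|^2.
print(dot_geometric(5.0, 5.0, 0.0))  # 25.0
```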

Algebraic Definition
It turns out that, given the components of vectors in a Cartesian (Euclidean) vector space, there is an extremely simple way to calculate the dot product. It is the sum of the pairwise products of the components of the vectors:
 * $$\vec{A} \cdot \vec{B} = A_xB_x + A_yB_y + A_zB_z\,$$.

or, depending on what notation you prefer:
 * $$\vec{A} \cdot \vec{B} = A_1B_1 + A_2B_2 + A_3B_3 = \sum_{i=1}^3 A_iB_i\,$$.

This definition is also used in 2 dimensions, with only two terms in the sum.


 * It needs to be emphasized that this formula only works if the components of the vectors are given in Cartesian coordinates. In other coordinate systems, it may be necessary to use other, somewhat more complicated formulas.
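The algebraic definition translates directly into code. The following is a minimal sketch (function and variable names are illustrative), assuming Cartesian components:

```python
def dot(a, b):
    # Sum of the pairwise products of Cartesian components.
    # Works unchanged for 2 or 3 (or any number of) components.
    return sum(x * y for x, y in zip(a, b))

print(dot((1, 2, 3), (4, -5, 6)))  # 1*4 + 2*(-5) + 3*6 = 12
print(dot((1, 2), (3, 4)))         # two dimensions: 1*3 + 2*4 = 11
```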

The equivalence of this algebraic definition and the earlier geometric one is extremely important. Among many other things, it lets us calculate the angle between two vectors, given their components. Calculate the lengths of the vectors as the square roots of the dot products of each vector with itself, calculate the dot product of the two vectors, divide it by the product of the lengths, and take the inverse cosine:
 * $$\theta = \cos^{-1} \left(\frac{A_xB_x + A_yB_y + A_zB_z}{\|A\| \|B\|}\right)\,$$.
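This recipe can be sketched directly in code (helper names are illustrative, and Cartesian components are assumed):

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def norm(a):
    # Length of a vector: square root of its dot product with itself.
    return math.sqrt(dot(a, a))

def angle_between(a, b):
    # Divide the dot product by the product of the lengths,
    # then take the inverse cosine.
    return math.acos(dot(a, b) / (norm(a) * norm(b)))

# (1, 0, 0) and (1, 1, 0) meet at 45 degrees.
print(math.degrees(angle_between((1, 0, 0), (1, 1, 0))))
```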

Using the algebraic definition, one can see that the dot product is additive in each argument:
 * $$\vec{A} \cdot (\vec{B} + \vec{C}) = \vec{A} \cdot \vec{B} + \vec{A} \cdot \vec{C}\,$$.
 * $$(\vec{A} + \vec{B}) \cdot \vec{C} = \vec{A} \cdot \vec{C} + \vec{B} \cdot \vec{C}\,$$.

This additivity is not so easy to see from the geometric definition.
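Additivity is easy to confirm numerically from the algebraic form. A small sketch, with illustrative integer vectors so the comparison is exact:

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def add(a, b):
    return tuple(x + y for x, y in zip(a, b))

A, B, C = (1, 2, 3), (4, 5, 6), (-2, 0, 7)
# A.(B + C) == A.B + A.C, exactly, for integer components.
print(dot(A, add(B, C)) == dot(A, B) + dot(A, C))  # True
```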

Proof of Equivalence of the Two Definitions
If one is good at geometric visualization in 3 dimensions, and has a good understanding of the Euclidean rotation group, one can rotate the Cartesian coordinate system around until the two vectors lie in the x-y plane, and the first vector lies along the x axis.
 * This requires knowing that any rotation around a coordinate axis is a transformation by an orthogonal matrix, that the sum of the pairwise products of components is unchanged by an orthogonal transformation, and that the two vectors can be rotated into the required position by a combination of rotations around coordinate axes.
 * Once this is done, it becomes a problem in 2 dimensions.

$$B_x\,$$ is the projection of $$\vec{B}\,$$ onto the x axis, which is also the length of $$\vec{B}\,$$ times the cosine of $$\theta\,$$.
 * $$B_x = \|B\| \cos\theta\,$$
 * $$A_x = \|A\|\,$$
 * $$A_y = 0\,$$
 * $$A_xB_x + A_yB_y = A_xB_x = \|A\| \|B\| \cos\theta\,$$
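This rotation argument can be checked numerically in 2 dimensions. The sketch below (vectors are illustrative) rotates the plane so that $$\vec{A}$$ lies along the x axis, then confirms that the component formula is unchanged by the rotation and reduces to $$A_xB_x$$:

```python
import math

def rotate2d(v, phi):
    # Rotation by angle phi: an orthogonal transformation of the plane.
    c, s = math.cos(phi), math.sin(phi)
    return (c * v[0] - s * v[1], s * v[0] + c * v[1])

def dot2(a, b):
    return a[0] * b[0] + a[1] * b[1]

A, B = (3.0, 4.0), (-1.0, 2.0)
phi = -math.atan2(A[1], A[0])          # rotate A onto the x axis
A2, B2 = rotate2d(A, phi), rotate2d(B, phi)

print(abs(A2[1]) < 1e-12)                         # A now lies on the x axis
print(abs(dot2(A, B) - dot2(A2, B2)) < 1e-12)     # dot product is unchanged
print(abs(dot2(A2, B2) - A2[0] * B2[0]) < 1e-12)  # only A_x * B_x survives
```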

Another way to prove this is to prove it first for the norm of a vector, and then use the law of cosines.

We first need to show that the geometric length of $$\vec{A}\,$$ is $$\sqrt{A_x^2 + A_y^2 + A_z^2}\,$$.

Whatever coordinate system we are in, the vector $$\vec{A}\,$$ runs from one corner to the opposite corner of a rectangular box with sides $$A_x\,$$, $$A_y\,$$, and $$A_z\,$$.

Two applications of Pythagoras' theorem suffice to prove the required result. The length of a vector is the square root of the sum of the squares of its components.
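The two Pythagoras steps can be spelled out numerically (the component values here are illustrative):

```python
import math

# Vector A = (2, 3, 6); its length should be sqrt(4 + 9 + 36) = 7.
Ax, Ay, Az = 2.0, 3.0, 6.0

# First application: diagonal of the base of the box, in the x-y plane.
diag_xy = math.sqrt(Ax**2 + Ay**2)
# Second application: diagonal of the box itself.
length = math.sqrt(diag_xy**2 + Az**2)

print(abs(length - math.sqrt(Ax**2 + Ay**2 + Az**2)) < 1e-12)  # True
print(abs(length - 7.0) < 1e-12)                               # True
```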

Next, whatever coordinate system we are in, form the plane containing $$\vec{A}\,$$ and $$\vec{B}\,$$, and use plane geometry. Let $$\vec{C} = \vec{A} - \vec{B}\,$$. That is, $$\vec{C}\,$$ runs from the tip of $$\vec{B}\,$$ to the tip of $$\vec{A}\,$$. Form the triangle with those vectors as sides.

By the law of cosines for a triangle:
 * $$\|C\|^2 = \|A\|^2 + \|B\|^2 - 2 \|A\| \|B\| \cos\theta\,$$

Since we have established that the algebraic definition works for calculating norms:
 * $$C_x^2 + C_y^2 + C_z^2 = A_x^2 + A_y^2 + A_z^2 + B_x^2 + B_y^2 + B_z^2 - 2 \|A\| \|B\| \cos\theta\,$$

Since $$\vec{C} = \vec{A} - \vec{B}\,$$:
 * $$A_x^2 + B_x^2 - 2 A_xB_x + A_y^2 + B_y^2 - 2 A_yB_y + A_z^2 + B_z^2 - 2 A_zB_z = A_x^2 + A_y^2 + A_z^2 + B_x^2 + B_y^2 + B_z^2 - 2 \|A\| \|B\| \cos\theta\,$$

so
 * $$- 2 A_xB_x - 2 A_yB_y - 2 A_zB_z = - 2 \|A\| \|B\| \cos\theta\,$$

so
 * $$A_xB_x + A_yB_y + A_zB_z = \|A\| \|B\| \cos\theta\,$$
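The whole chain can be verified numerically: compute $$\theta$$ from the algebraic formula, then confirm that the law of cosines holds for $$\vec{C} = \vec{A} - \vec{B}$$. A sketch with illustrative vectors:

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def norm(a):
    return math.sqrt(dot(a, a))

A = (1.0, 2.0, 2.0)                     # |A| = 3
B = (2.0, 0.0, 0.0)                     # |B| = 2
C = tuple(x - y for x, y in zip(A, B))  # C = A - B

# Angle from the algebraic definition.
theta = math.acos(dot(A, B) / (norm(A) * norm(B)))

# Law of cosines: |C|^2 = |A|^2 + |B|^2 - 2 |A| |B| cos(theta)
lhs = norm(C) ** 2
rhs = norm(A) ** 2 + norm(B) ** 2 - 2 * norm(A) * norm(B) * math.cos(theta)
print(abs(lhs - rhs) < 1e-9)  # True
```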