Vector spaces

Motivation
If the reader is familiar with analytic geometry, she will probably know that points in the plane can be identified by ordered tuples $$\begin{align}(x, y)\end{align}$$ where each entry is a number denoting the distance of the point from the origin in a certain direction. We call $$\begin{align}x\end{align}$$ and $$\begin{align}y\end{align}$$ the coordinates of the point in the plane, and they are often real numbers.

Although these ordered tuples are useful for describing the plane, it would seem that they lack some of the desirable behaviour of real numbers. Consider the equation $$\begin{align}2 \cdot (42.125 - x)=98.543\end{align}$$; we know that it defines a unique number $$\begin{align}x\end{align}$$, and we can find that number by noting that the equation is equivalent to $$\begin{align}\frac{84.25 - 98.543}{2} = x\end{align}$$. Can we do the same for ordered tuples? Given an equation like $$\begin{align}4 \cdot ((x,y) + (13,12.5))=(1,4)\end{align}$$, can we identify $$\begin{align}(x,y)\end{align}$$ ? Does the equation even make sense?

As you may have guessed, this equation does make sense, and yes, we can solve it, but first we must make clear what it means to add two ordered tuples, and what it means to multiply them by a number.

Definitions
A vector space $$\begin{align}V\end{align}$$ over a field $$\begin{align}F\end{align}$$ is a set of objects equipped with two binary operations

$$ \begin{align} + &: V \times V \rightarrow V\\ \cdot &: F \times V \rightarrow V \end{align} $$

typically called vector addition and scalar multiplication respectively, which satisfy the axioms below. Note that we call elements of $$\begin{align}V\end{align}$$ vectors and elements of $$\begin{align}F\end{align}$$ scalars.


 * 1) There is a vector $$\vec{z} \in V$$ such that for any $$\vec{x} \in V$$ we have $$\begin{align}\vec{z} + \vec{x} = \vec{x}\end{align}$$ (existence of additive identity).
 * 2) For any $$\vec{x}, \vec{y} \in V$$ we have $$\begin{align}\vec{x} + \vec{y} = \vec{y} + \vec{x}\end{align}$$ (commutativity of vector addition).
 * 3) For any $$\vec{w}, \vec{x}, \vec{y} \in V$$ we have $$\begin{align}\vec{w} + (\vec{x} + \vec{y}) = (\vec{w} + \vec{x}) + \vec{y}\end{align}$$ (associativity of vector addition).
 * 4) For any $$\vec{x} \in V$$ there is a vector $$\vec{y} \in V$$ such that $$\begin{align}\vec{x} + \vec{y} = \vec{z}\end{align}$$ where $$\vec{z}$$ is the additive identity mentioned above (existence of additive inverse).


 * 5) There is a scalar $$a \in F$$ such that for any $$\vec{x} \in V$$ we have $$a \cdot \vec{x} = \vec{x}$$ (existence of multiplicative identity).
 * 6) For any vectors $$\vec{x}, \vec{y} \in V$$ and scalar $$a \in F$$ we have $$a \cdot (\vec{x} + \vec{y}) = a \cdot \vec{x} + a \cdot \vec{y}$$ (distributivity of scalar multiplication over vector addition).
 * 7) For any vector $$\vec{x} \in V$$ and scalars $$a, b \in F$$ we have $$(a +_F b) \cdot \vec{x} = a \cdot \vec{x} + b \cdot \vec{x}$$ (distributivity of scalar multiplication over field addition). Note that $$\begin{align}+_F\end{align}$$ is addition in the field.
 * 8) For any vector $$\vec{x} \in V$$ and scalars $$a, b \in F$$ we have $$a \cdot (b \cdot \vec{x}) = (a \cdot_F b) \cdot \vec{x}$$ (compatibility of field multiplication with scalar multiplication). Note that $$\cdot_F$$ is multiplication in the field.

Exercise: Compare these axioms with those for other algebraic structures like groups, rings, and fields. Note the similarities and differences. In what sense is a vector space like a group? How is a vector space like a field? Can you think of a structure that is both a vector space and a field?

Exercise: Although not stated explicitly, these axioms imply that $$\begin{align}0_F \cdot \vec{x} = \vec{0}\end{align}$$ for any vector $$ \vec{x} \in V $$ where $$\begin{align}0_F\end{align}$$ is the additive identity in the field. Prove this. Hint: use axioms 4 and 7.

Examples

 * Ordered tuples like those mentioned in the beginning of this lesson, with appropriate definitions of vector addition and scalar multiplication, can be made into elements of a vector space. Let $$\begin{align}\vec{x} = (x_1,x_2), \vec{y} = (y_1,y_2)\end{align}$$ and define:

$$\begin{align} \vec{x} + \vec{y} = (x_1 +_F y_1, x_2 +_F y_2)\end{align}$$

$$\begin{align} a \cdot \vec{x} = (a \cdot_F x_1, a \cdot_F x_2)\end{align}$$

where $$a \in \mathbb{R}$$ (but any field would work). This gives us definitions of vector addition and scalar multiplication in terms of field addition and field multiplication, concepts with which we are quite comfortable. Checking these definitions against the axioms should convince you that the set of ordered tuples under these operations is a vector space over $$\mathbb{R}$$.
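With these componentwise operations in hand, the motivating equation from the start of the lesson can be solved coordinate by coordinate, just like a real-number equation. A minimal sketch (the helper names `vec_add` and `scal_mul` are illustrative, not standard):

```python
def vec_add(u, v):
    """Componentwise vector addition on ordered pairs."""
    return (u[0] + v[0], u[1] + v[1])

def scal_mul(a, v):
    """Componentwise scalar multiplication."""
    return (a * v[0], a * v[1])

# Solve 4 * ((x, y) + (13, 12.5)) = (1, 4):
# (x, y) = (1/4) * (1, 4) + (-1) * (13, 12.5)
solution = vec_add(scal_mul(1 / 4, (1, 4)), scal_mul(-1, (13, 12.5)))
print(solution)  # (-12.75, -11.5)

# Check: substituting the solution back recovers (1, 4).
check = scal_mul(4, vec_add(solution, (13, 12.5)))
print(check)  # (1.0, 4.0)
```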

Now consider these definitions:

$$\begin{align} \vec{x} + \vec{y} = (y_1 \cdot_F x_1, y_2 \cdot_F x_2)\end{align}$$

$$\begin{align} a \cdot \vec{x} = ({x_1}^a, {x_2}^a)\end{align}$$

Remarkably, this too defines a vector space over $$\mathbb{R}$$, provided we restrict the vectors to tuples with positive entries (so that $${x_1}^a$$ and $${x_2}^a$$ are always defined and every vector has an additive inverse); the additive identity here is $$(1, 1)$$. Although this vector space is built from ordered tuples like the previous one, and is defined over the same field, the two are different as vector spaces.
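A sketch checking a few of the axioms numerically for this exotic space, assuming vectors are pairs of positive reals, "addition" is componentwise multiplication, and "scalar multiplication" is componentwise exponentiation (helper names `add` and `smul` are illustrative):

```python
import math

def add(u, v):
    """'Vector addition': componentwise multiplication."""
    return (u[0] * v[0], u[1] * v[1])

def smul(a, v):
    """'Scalar multiplication': componentwise exponentiation."""
    return (v[0] ** a, v[1] ** a)

x = (2.0, 3.0)
a, b = 1.5, -2.0

# Axiom 1: the additive identity is (1, 1), since (1, 1) "+" x = x.
assert add((1.0, 1.0), x) == x

# Axiom 7: (a + b) . x = a . x "+" b . x.
lhs, rhs = smul(a + b, x), add(smul(a, x), smul(b, x))
assert all(math.isclose(l, r) for l, r in zip(lhs, rhs))

# Axiom 8: a . (b . x) = (a * b) . x.
lhs, rhs = smul(a, smul(b, x)), smul(a * b, x)
assert all(math.isclose(l, r) for l, r in zip(lhs, rhs))
print("exotic-space checks passed")
```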


 * Any field is a vector space over itself. To see this, take the elements of the field to be the vectors, then check this against the axioms.


 * The complex numbers are a vector space over the real numbers.
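The last example can be checked directly: Python's built-in complex numbers, added together as vectors and multiplied by real scalars, satisfy the axioms (a sketch verifying a few of them on sample values):

```python
z, w = complex(1, 2), complex(-3, 0.5)
a, b = 2.0, -1.5

assert z + w == w + z                    # axiom 2: commutativity
assert a * (z + w) == a * z + a * w      # axiom 6: distributivity over vector addition
assert (a + b) * z == a * z + b * z      # axiom 7: distributivity over field addition
assert a * (b * z) == (a * b) * z        # axiom 8: compatibility
print("complex-over-reals checks passed")
```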

Subspaces
A non-empty subset $$W$$ of $$V$$ is called a subspace of $$V$$ if it is closed under the two operations:

$$\begin{align}\forall \vec{x}, \vec{y} \in W,\ \vec{x} + \vec{y} \in W\end{align}$$

$$\begin{align}\forall (a, \vec{x}) \in F \times W,\ a \cdot \vec{x} \in W\end{align}$$

One can easily verify that such a set is itself a vector space over $$F$$, since the axioms already hold in $$V$$.
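As a concrete illustration, assume the subset $$W = \{(t, 2t) : t \in \mathbb{R}\}$$, a line through the origin in the plane; a sketch checking that it is closed under both operations (the membership test `in_W` is an assumed helper):

```python
def in_W(v, tol=1e-9):
    """Membership test for the line y = 2x (up to floating-point tolerance)."""
    return abs(v[1] - 2 * v[0]) < tol

u, v, a = (1.0, 2.0), (-3.0, -6.0), 4.5
assert in_W(u) and in_W(v)
assert in_W((u[0] + v[0], u[1] + v[1]))   # closed under vector addition
assert in_W((a * u[0], a * u[1]))         # closed under scalar multiplication

# A subset that misses the origin, e.g. the line y = 2x + 1, is not a
# subspace: it fails the membership test for 0 * u = (0, 0) shifted off it.
assert not in_W((0.0, 1.0))
print("subspace checks passed")
```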

Dependence Relations and Linear Independence
A set $$(e_n)_{n \in J \subseteq \mathbb{N}}$$ of elements of $$V$$ is said to be linearly independent (or "free") if every linear combination of those vectors with coefficients not all zero (and only finitely many nonzero) is different from the zero vector. In other words:

$$\begin{align}\forall (x_1, x_2, \dots) \in F^J,\ (\exists n \in J,\ x_n \neq 0) \Longrightarrow x_1 \cdot e_1 + x_2 \cdot e_2 + \dots \neq \vec{0}\end{align}$$

or, in a completely equivalent manner (the contrapositive):

$$\begin{align}\forall (x_1, x_2, \dots) \in F^J,\ x_1 \cdot e_1 + x_2 \cdot e_2 + \dots = \vec{0} \Longrightarrow (\forall n \in J,\ x_n = 0)\end{align}$$

Otherwise, the set is said to be linearly dependent.
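For two ordered pairs in the plane, independence can be tested concretely: $$x_1 \cdot e_1 + x_2 \cdot e_2 = \vec{0}$$ has only the trivial solution exactly when the $$2 \times 2$$ determinant of the pair is nonzero. A sketch (the helper `independent_2d` is illustrative):

```python
def independent_2d(e1, e2, tol=1e-12):
    """True when e1, e2 in R^2 are linearly independent,
    i.e. when det [e1 e2] is nonzero (up to tolerance)."""
    det = e1[0] * e2[1] - e1[1] * e2[0]
    return abs(det) > tol

assert independent_2d((1.0, 2.0), (3.0, 4.0))      # a free family
assert not independent_2d((1.0, 2.0), (2.0, 4.0))  # e2 = 2 * e1: dependent
print("independence checks passed")
```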