Working with Vectors: Operations
In this chapter we focus on the basic operations you can perform with vectors in the context of linear algebra. We will work mainly with vectors written in component form, such as
$$
\mathbf{v} = \begin{bmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{bmatrix},
$$
and think of them as arrows or as ordered lists of numbers, depending on what is most convenient.
We will cover:
- Addition and subtraction of vectors
- Scalar multiplication
- Linear combinations
- Dot product (scalar product)
- Length (magnitude) and normalization
- Angle between vectors and orthogonality
- Projection of one vector onto another
Throughout, assume all vectors have the same number of components when we combine them; otherwise the operations are not defined.
Vector Addition and Subtraction
Vector addition combines two vectors componentwise.
If
$$
\mathbf{u} = \begin{bmatrix} u_1 \\ u_2 \\ \vdots \\ u_n \end{bmatrix},
\quad
\mathbf{v} = \begin{bmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{bmatrix},
$$
then their sum is
$$
\mathbf{u} + \mathbf{v}
=
\begin{bmatrix}
u_1 + v_1 \\
u_2 + v_2 \\
\vdots \\
u_n + v_n
\end{bmatrix}.
$$
Vector subtraction is defined similarly, component by component:
$$
\mathbf{u} - \mathbf{v}
=
\begin{bmatrix}
u_1 - v_1 \\
u_2 - v_2 \\
\vdots \\
u_n - v_n
\end{bmatrix}.
$$
Geometrically (in 2D or 3D), addition corresponds to placing the tail of one vector at the head of the other and drawing the resulting arrow from the free tail to the free head. Subtraction $\mathbf{u} - \mathbf{v}$ can be viewed as adding $\mathbf{u}$ and $-\mathbf{v}$, where $-\mathbf{v}$ has the same length as $\mathbf{v}$ but the opposite direction.
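To make the componentwise definitions concrete, here is a minimal Python sketch (the names `add_vectors` and `subtract_vectors` are illustrative, not from any library):

```python
# Componentwise vector addition and subtraction, written out explicitly.

def add_vectors(u, v):
    assert len(u) == len(v), "vectors must have the same number of components"
    return [ui + vi for ui, vi in zip(u, v)]

def subtract_vectors(u, v):
    assert len(u) == len(v), "vectors must have the same number of components"
    return [ui - vi for ui, vi in zip(u, v)]

u = [1, 2, 3]
v = [4, 0, -1]
print(add_vectors(u, v))       # [5, 2, 2]
print(subtract_vectors(u, v))  # [-3, 2, 4]
```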
Some basic algebraic properties (for all vectors $\mathbf{u},\mathbf{v},\mathbf{w}$ of the same size):
- Commutativity:
$$
\mathbf{u} + \mathbf{v} = \mathbf{v} + \mathbf{u}.
$$
- Associativity:
$$
(\mathbf{u} + \mathbf{v}) + \mathbf{w} = \mathbf{u} + (\mathbf{v} + \mathbf{w}).
$$
- Additive identity: there is a zero vector
$$
\mathbf{0} = \begin{bmatrix} 0 \\ \vdots \\ 0 \end{bmatrix}
$$
such that $\mathbf{u} + \mathbf{0} = \mathbf{u}$.
- Additive inverse: every vector $\mathbf{u}$ has an opposite $-\mathbf{u}$ with
$$
\mathbf{u} + (-\mathbf{u}) = \mathbf{0}.
$$
These properties are part of what makes the set of all $n$‑component vectors into a vector space, a concept explored more fully elsewhere.
Scalar Multiplication
A scalar is just a number (in this course, usually a real number). Scalar multiplication stretches, shrinks, or reverses a vector.
Given a scalar $c$ and a vector
$$
\mathbf{v} = \begin{bmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{bmatrix},
$$
the product $c\mathbf{v}$ is defined by multiplying each component by $c$:
$$
c\mathbf{v} =
\begin{bmatrix}
c v_1 \\
c v_2 \\
\vdots \\
c v_n
\end{bmatrix}.
$$
Typical cases:
- If $c > 0$, the direction of $\mathbf{v}$ stays the same; only the length changes.
- If $c = 0$, the result is the zero vector.
- If $c < 0$, the direction is reversed and the length is scaled by $|c|$.
Scalar multiplication interacts with addition in predictable ways. For all scalars $c,d$ and vectors $\mathbf{u},\mathbf{v}$:
- Distributive over vector addition:
$$
c(\mathbf{u} + \mathbf{v}) = c\mathbf{u} + c\mathbf{v}.
$$
- Distributive over scalar addition:
$$
(c + d)\mathbf{v} = c\mathbf{v} + d\mathbf{v}.
$$
- Associativity of scalar multiplication:
$$
c(d\mathbf{v}) = (cd)\mathbf{v}.
$$
- Multiplication by 1:
$$
1\mathbf{v} = \mathbf{v}.
$$
These rules are essential when simplifying expressions involving vectors.
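As a quick check of the componentwise definition, here is a short Python sketch (`scale` is an illustrative name):

```python
# Scalar multiplication, componentwise: each component is multiplied by c.

def scale(c, v):
    return [c * vi for vi in v]

v = [2, -1, 4]
print(scale(3, v))    # [6, -3, 12]: same direction, three times as long
print(scale(0, v))    # [0, 0, 0]:   the zero vector
print(scale(-1, v))   # [-2, 1, -4]: reversed direction, same length
```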
Linear Combinations of Vectors
Many ideas in linear algebra are built on linear combinations.
Given vectors $\mathbf{v}_1,\mathbf{v}_2,\dots,\mathbf{v}_k$ and scalars $c_1,c_2,\dots,c_k$, a linear combination is any vector of the form
$$
c_1 \mathbf{v}_1 + c_2 \mathbf{v}_2 + \cdots + c_k \mathbf{v}_k.
$$
This uses only two basic operations:
- Scalar multiplication (multiply each vector by a scalar)
- Vector addition (add the results)
Examples of linear combinations include:
- $3\mathbf{v}_1 - 2\mathbf{v}_2$ (here $-2$ is the scalar for $\mathbf{v}_2$)
- $\mathbf{0}$, which is $0\mathbf{v}_1 + 0\mathbf{v}_2 + \cdots + 0\mathbf{v}_k$
Concepts such as span, linear independence, and bases are all phrased in terms of linear combinations, and are treated in other chapters. Here, the important skill is being able to compute and simplify linear combinations correctly.
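The following Python sketch builds a linear combination out of exactly these two operations (`linear_combination` is an illustrative name, not a library function):

```python
# Compute c1*v1 + c2*v2 + ... + ck*vk by scaling each vector and accumulating.

def linear_combination(scalars, vectors):
    n = len(vectors[0])
    result = [0] * n                  # start from the zero vector
    for c, v in zip(scalars, vectors):
        result = [r + c * vi for r, vi in zip(result, v)]
    return result

v1 = [1, 0]
v2 = [0, 1]
print(linear_combination([3, -2], [v1, v2]))  # [3, -2], i.e. 3*v1 - 2*v2
```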
Dot Product (Scalar Product)
The dot product takes two vectors of the same size and produces a single number (a scalar).
For
$$
\mathbf{u} = \begin{bmatrix} u_1 \\ u_2 \\ \vdots \\ u_n \end{bmatrix},
\quad
\mathbf{v} = \begin{bmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{bmatrix},
$$
the dot product $\mathbf{u} \cdot \mathbf{v}$ is
$$
\mathbf{u} \cdot \mathbf{v} = u_1 v_1 + u_2 v_2 + \cdots + u_n v_n.
$$
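A minimal Python sketch of this sum of componentwise products (`dot` is an illustrative name; `numpy.dot` computes the same thing):

```python
# The dot product: multiply matching components, then sum.

def dot(u, v):
    assert len(u) == len(v), "vectors must have the same number of components"
    return sum(ui * vi for ui, vi in zip(u, v))

u = [1, 2, 3]
v = [4, -5, 6]
print(dot(u, v))  # 1*4 + 2*(-5) + 3*6 = 12
```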
Some key algebraic properties (for vectors of the same size and scalars $c$):
- Commutativity:
$$
\mathbf{u} \cdot \mathbf{v} = \mathbf{v} \cdot \mathbf{u}.
$$
- Distributivity over addition:
$$
\mathbf{u} \cdot (\mathbf{v} + \mathbf{w})
= \mathbf{u} \cdot \mathbf{v} + \mathbf{u} \cdot \mathbf{w}.
$$
- Compatibility with scalar multiplication:
$$
(c\mathbf{u}) \cdot \mathbf{v} = c(\mathbf{u} \cdot \mathbf{v})
= \mathbf{u} \cdot (c\mathbf{v}).
$$
- Dot product with itself (nonnegative):
$$
\mathbf{v} \cdot \mathbf{v} \ge 0,
$$
and $\mathbf{v} \cdot \mathbf{v} = 0$ if and only if $\mathbf{v} = \mathbf{0}$.
The dot product has a geometric interpretation related to length and angle, which will be used below.
Length (Magnitude) and Normalization
The length (or magnitude, or norm) of a vector $\mathbf{v}$, written $\|\mathbf{v}\|$, is defined using the dot product:
$$
\|\mathbf{v}\| = \sqrt{\mathbf{v} \cdot \mathbf{v}}.
$$
If $\mathbf{v} = \begin{bmatrix} v_1 \\ \vdots \\ v_n \end{bmatrix}$, then
$$
\|\mathbf{v}\| = \sqrt{v_1^2 + v_2^2 + \cdots + v_n^2}.
$$
This generalizes the Pythagorean theorem to higher dimensions.
A unit vector is a vector of length $1$. To turn a nonzero vector $\mathbf{v}$ into a unit vector pointing in the same direction, divide by its length. This process is called normalization:
$$
\text{If } \mathbf{v} \ne \mathbf{0}, \quad
\widehat{\mathbf{v}} = \frac{\mathbf{v}}{\|\mathbf{v}\|}
$$
is the normalized (unit) vector in the direction of $\mathbf{v}$.
Note that $\mathbf{0}$ cannot be normalized, since its length is $0$ and division by $0$ is undefined.
Scalar multiplication and length interact in a simple way. For any scalar $c$ and vector $\mathbf{v}$,
$$
\|c\mathbf{v}\| = |c|\,\|\mathbf{v}\|.
$$
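The following Python sketch computes lengths and normalizes a vector under the componentwise conventions above (`norm` and `normalize` are illustrative names):

```python
import math

# Length via the dot product, and normalization of a nonzero vector.

def norm(v):
    return math.sqrt(sum(vi * vi for vi in v))

def normalize(v):
    length = norm(v)
    if length == 0:
        raise ValueError("the zero vector cannot be normalized")
    return [vi / length for vi in v]

v = [3, 4]
print(norm(v))                     # 5.0, by the Pythagorean theorem
print(normalize(v))                # [0.6, 0.8], a unit vector, same direction
print(norm([2 * vi for vi in v]))  # 10.0, illustrating ||c v|| = |c| ||v||
```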
Angle Between Vectors and Orthogonality
The dot product can also be expressed in terms of the lengths of the vectors and the angle between them.
For nonzero vectors $\mathbf{u}$ and $\mathbf{v}$, let $\theta$ be the angle between them (with $0 \le \theta \le \pi$). Then
$$
\mathbf{u} \cdot \mathbf{v}
= \|\mathbf{u}\|\,\|\mathbf{v}\| \cos \theta.
$$
Rearranging gives a formula for the cosine of the angle:
$$
\cos \theta
=
\frac{\mathbf{u} \cdot \mathbf{v}}
{\|\mathbf{u}\|\,\|\mathbf{v}\|},
$$
provided neither vector is the zero vector.
This is how the dot product connects algebra (components) with geometry (angles).
Two vectors are called orthogonal if the angle between them is $90^\circ$ (or $\pi/2$ radians). From the formula above, this is equivalent to
$$
\mathbf{u} \cdot \mathbf{v} = 0.
$$
Thus, orthogonality in vector spaces is expressed purely by the dot product being zero (by this convention, the zero vector counts as orthogonal to every vector, even though no angle is defined for it). In many applications, “perpendicular” and “orthogonal” are used interchangeably.
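A short Python sketch of the angle formula, reusing the `dot` and `norm` helpers from the earlier sketches (`angle_between` is an illustrative name; the clamp guards against floating-point round-off pushing the cosine slightly outside $[-1, 1]$):

```python
import math

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def norm(v):
    return math.sqrt(dot(v, v))

def angle_between(u, v):
    c = dot(u, v) / (norm(u) * norm(v))   # cos(theta), for nonzero u and v
    c = max(-1.0, min(1.0, c))            # clamp against round-off error
    return math.acos(c)                   # radians, between 0 and pi

u = [1, 0]
v = [1, 1]
print(math.degrees(angle_between(u, v)))  # 45.0 (up to round-off)
print(dot([1, 2], [-2, 1]))               # 0, so these two are orthogonal
```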
Projection of One Vector onto Another
The projection of one vector onto another captures the idea of “shadow” or “component” of a vector in the direction of another vector.
Given a nonzero vector $\mathbf{u}$ and any vector $\mathbf{v}$, the projection of $\mathbf{v}$ onto $\mathbf{u}$ is denoted $\operatorname{proj}_{\mathbf{u}} \mathbf{v}$ and is defined as
$$
\operatorname{proj}_{\mathbf{u}} \mathbf{v}
=
\left( \frac{\mathbf{v} \cdot \mathbf{u}}{\mathbf{u} \cdot \mathbf{u}} \right) \mathbf{u}.
$$
This uses several operations already introduced:
- Dot product: $\mathbf{v} \cdot \mathbf{u}$ and $\mathbf{u} \cdot \mathbf{u}$
- Scalar multiplication: multiplying $\mathbf{u}$ by the scalar
$$
\frac{\mathbf{v} \cdot \mathbf{u}}{\mathbf{u} \cdot \mathbf{u}}.
$$
Geometric interpretation:
- $\operatorname{proj}_{\mathbf{u}} \mathbf{v}$ is the vector lying on the line spanned by $\mathbf{u}$ that is “closest” to $\mathbf{v}$.
- The difference
$$
\mathbf{v} - \operatorname{proj}_{\mathbf{u}} \mathbf{v}
$$
is orthogonal to $\mathbf{u}$.
This decomposition,
$$
\mathbf{v}
=
\operatorname{proj}_{\mathbf{u}} \mathbf{v}
+
\left(\mathbf{v} - \operatorname{proj}_{\mathbf{u}} \mathbf{v}\right),
$$
splits $\mathbf{v}$ into a component parallel to $\mathbf{u}$ and a component orthogonal to $\mathbf{u}$, and is used heavily in later topics (such as least squares and orthogonalization).
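A minimal Python sketch of the projection and the orthogonal remainder, reusing the illustrative `dot` helper:

```python
# Projection of v onto a nonzero u, and the remainder orthogonal to u.

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def project(v, u):
    c = dot(v, u) / dot(u, u)        # the scalar (v . u) / (u . u)
    return [c * ui for ui in u]      # scale u by that scalar

v = [3, 4]
u = [1, 0]
p = project(v, u)
r = [vi - pi for vi, pi in zip(v, p)]  # v minus its projection onto u
print(p)          # [3.0, 0.0], the component of v along u
print(r)          # [0.0, 4.0], the component orthogonal to u
print(dot(r, u))  # 0.0, confirming the orthogonality claim above
```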
Summary of Vector Operations
Here is a concise list of the main operations covered:
- Addition:
$$
(\mathbf{u} + \mathbf{v})_i = u_i + v_i.
$$
- Subtraction:
$$
(\mathbf{u} - \mathbf{v})_i = u_i - v_i.
$$
- Scalar multiplication:
$$
(c\mathbf{v})_i = c v_i.
$$
- Linear combination:
$$
c_1\mathbf{v}_1 + \cdots + c_k\mathbf{v}_k.
$$
- Dot product:
$$
\mathbf{u} \cdot \mathbf{v} = \sum_{i=1}^n u_i v_i.
$$
- Length:
$$
\|\mathbf{v}\| = \sqrt{\mathbf{v} \cdot \mathbf{v}}.
$$
- Normalization (for $\mathbf{v} \ne \mathbf{0}$):
$$
\widehat{\mathbf{v}} = \dfrac{\mathbf{v}}{\|\mathbf{v}\|}.
$$
- Angle (for nonzero $\mathbf{u},\mathbf{v}$):
$$
\cos \theta
=
\dfrac{\mathbf{u} \cdot \mathbf{v}}
{\|\mathbf{u}\|\,\|\mathbf{v}\|}.
$$
- Orthogonality: $\mathbf{u} \cdot \mathbf{v} = 0$.
- Projection of $\mathbf{v}$ onto nonzero $\mathbf{u}$:
$$
\operatorname{proj}_{\mathbf{u}} \mathbf{v}
=
\left( \frac{\mathbf{v} \cdot \mathbf{u}}{\mathbf{u} \cdot \mathbf{u}} \right) \mathbf{u}.
$$
Being fluent with these operations is essential for all later work in linear algebra, including matrix operations, linear transformations, and more advanced geometric and analytic applications.