
12.1.1 Vector operations

Working with Vectors: Operations

In this chapter we focus on the basic operations you can perform with vectors in the context of linear algebra. We will work mainly with vectors written in component form, such as
$$
\mathbf{v} = \begin{bmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{bmatrix},
$$
and think of them as arrows or as ordered lists of numbers, depending on what is most convenient.

We will cover:

- vector addition and subtraction,
- scalar multiplication,
- linear combinations,
- the dot product,
- length (magnitude) and normalization,
- angles between vectors and orthogonality,
- projection of one vector onto another.

Throughout, assume all vectors have the same number of components when we combine them; otherwise the operations are not defined.

Vector Addition and Subtraction

Vector addition combines two vectors componentwise.

If
$$
\mathbf{u} = \begin{bmatrix} u_1 \\ u_2 \\ \vdots \\ u_n \end{bmatrix},
\quad
\mathbf{v} = \begin{bmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{bmatrix},
$$
then their sum is
$$
\mathbf{u} + \mathbf{v}
=
\begin{bmatrix}
u_1 + v_1 \\
u_2 + v_2 \\
\vdots \\
u_n + v_n
\end{bmatrix}.
$$

Vector subtraction is defined similarly, component by component:
$$
\mathbf{u} - \mathbf{v}
=
\begin{bmatrix}
u_1 - v_1 \\
u_2 - v_2 \\
\vdots \\
u_n - v_n
\end{bmatrix}.
$$

Geometrically (in 2D or 3D), addition corresponds to placing the tail of one vector at the head of the other and drawing the resulting arrow from the free tail to the free head. Subtraction $\mathbf{u} - \mathbf{v}$ can be viewed as adding $\mathbf{u}$ and $-\mathbf{v}$, where $-\mathbf{v}$ has the same length as $\mathbf{v}$ but the opposite direction.
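The componentwise definitions above can be checked numerically. Here is a minimal sketch using NumPy (the library and the example values are illustrative, not part of the text):

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, -1.0, 0.5])

# Componentwise addition and subtraction
s = u + v   # [5.0, 1.0, 3.5]
d = u - v   # [-3.0, 3.0, 2.5]
```

NumPy applies `+` and `-` elementwise, which matches the componentwise definition exactly.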

Some basic algebraic properties (for all vectors $\mathbf{u},\mathbf{v},\mathbf{w}$ of the same size):

- Commutativity: $\mathbf{u} + \mathbf{v} = \mathbf{v} + \mathbf{u}$.
- Associativity: $(\mathbf{u} + \mathbf{v}) + \mathbf{w} = \mathbf{u} + (\mathbf{v} + \mathbf{w})$.
- Zero vector: $\mathbf{v} + \mathbf{0} = \mathbf{v}$.
- Additive inverse: $\mathbf{v} + (-\mathbf{v}) = \mathbf{0}$.

These properties are part of what makes the set of all $n$‑component vectors into a vector space, a concept explored more fully elsewhere.

Scalar Multiplication

A scalar is just a number (in this course, usually a real number). Scalar multiplication stretches, shrinks, or reverses a vector.

Given a scalar $c$ and a vector
$$
\mathbf{v} = \begin{bmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{bmatrix},
$$
the product $c\mathbf{v}$ is defined by multiplying each component by $c$:
$$
c\mathbf{v} =
\begin{bmatrix}
c v_1 \\
c v_2 \\
\vdots \\
c v_n
\end{bmatrix}.
$$

Typical cases:

- $c > 1$: stretches $\mathbf{v}$ (same direction, longer).
- $0 < c < 1$: shrinks $\mathbf{v}$ (same direction, shorter).
- $c < 0$: reverses the direction of $\mathbf{v}$ and scales its length by $|c|$.
- $c = 0$: gives the zero vector $\mathbf{0}$.

Scalar multiplication interacts with addition in predictable ways. For all scalars $c,d$ and vectors $\mathbf{u},\mathbf{v}$:

- $c(\mathbf{u} + \mathbf{v}) = c\mathbf{u} + c\mathbf{v}$ (distributivity over vector addition),
- $(c + d)\mathbf{v} = c\mathbf{v} + d\mathbf{v}$ (distributivity over scalar addition),
- $(cd)\mathbf{v} = c(d\mathbf{v})$ (compatibility of scalar multiplication),
- $1\,\mathbf{v} = \mathbf{v}$ (identity).

These rules are essential when simplifying expressions involving vectors.
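The stretch, shrink, and reversal cases can be illustrated numerically. A minimal NumPy sketch (example values are illustrative):

```python
import numpy as np

v = np.array([2.0, -4.0, 6.0])

stretched = 2 * v      # c > 1: each component doubled
shrunk = 0.5 * v       # 0 < c < 1: each component halved
reversed_v = -1 * v    # c < 0: direction reversed
```

Multiplying a NumPy array by a scalar multiplies every component, exactly as in the componentwise definition of $c\mathbf{v}$.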

Linear Combinations of Vectors

Many ideas in linear algebra are built on linear combinations.

Given vectors $\mathbf{v}_1,\mathbf{v}_2,\dots,\mathbf{v}_k$ and scalars $c_1,c_2,\dots,c_k$, a linear combination is any vector of the form
$$
c_1 \mathbf{v}_1 + c_2 \mathbf{v}_2 + \cdots + c_k \mathbf{v}_k.
$$

This uses only two basic operations:

- scalar multiplication (forming each $c_i \mathbf{v}_i$), and
- vector addition (summing the results).

Examples of linear combinations include:

- $2\mathbf{v}_1 + 3\mathbf{v}_2$, a weighted sum of two vectors;
- $\mathbf{u} - \mathbf{v} = 1\,\mathbf{u} + (-1)\,\mathbf{v}$, so subtraction is itself a linear combination;
- a single scaled vector $c\mathbf{v}$ (the case $k = 1$);
- the zero vector, obtained by choosing every coefficient $c_i = 0$.

Concepts such as span, linear independence, and bases are all phrased in terms of linear combinations, and are treated in other chapters. Here, the important skill is being able to compute and simplify linear combinations correctly.
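Computing a linear combination is just scaling followed by adding. A short NumPy sketch with the standard basis vectors of the plane (illustrative values):

```python
import numpy as np

v1 = np.array([1.0, 0.0])
v2 = np.array([0.0, 1.0])

# The linear combination 3*v1 + 2*v2
w = 3 * v1 + 2 * v2    # [3.0, 2.0]
```

With these two basis vectors, the coefficients of the combination reappear directly as the components of the result.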

Dot Product (Scalar Product)

The dot product takes two vectors of the same size and produces a single number (a scalar).

For
$$
\mathbf{u} = \begin{bmatrix} u_1 \\ u_2 \\ \vdots \\ u_n \end{bmatrix},
\quad
\mathbf{v} = \begin{bmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{bmatrix},
$$
the dot product $\mathbf{u} \cdot \mathbf{v}$ is
$$
\mathbf{u} \cdot \mathbf{v} = u_1 v_1 + u_2 v_2 + \cdots + u_n v_n.
$$

Some key algebraic properties (for vectors $\mathbf{u},\mathbf{v},\mathbf{w}$ of the same size and scalars $c$):

- Symmetry: $\mathbf{u} \cdot \mathbf{v} = \mathbf{v} \cdot \mathbf{u}$.
- Distributivity: $\mathbf{u} \cdot (\mathbf{v} + \mathbf{w}) = \mathbf{u} \cdot \mathbf{v} + \mathbf{u} \cdot \mathbf{w}$.
- Homogeneity: $(c\mathbf{u}) \cdot \mathbf{v} = c\,(\mathbf{u} \cdot \mathbf{v})$.
- Positivity: $\mathbf{v} \cdot \mathbf{v} \ge 0$, with equality exactly when $\mathbf{v} = \mathbf{0}$.

The dot product has a geometric interpretation related to length and angle, which will be used below.
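The sum-of-products formula is easy to verify by hand and by machine. A minimal NumPy sketch (example values are illustrative):

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, -1.0, 2.0])

# u . v = 1*4 + 2*(-1) + 3*2 = 8
dp = np.dot(u, v)
```

`np.dot` implements exactly the componentwise sum $u_1 v_1 + \cdots + u_n v_n$ for one-dimensional arrays.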

Length (Magnitude) and Normalization

The length (or magnitude, or norm) of a vector $\mathbf{v}$, written $\|\mathbf{v}\|$, is defined using the dot product:
$$
\|\mathbf{v}\| = \sqrt{\mathbf{v} \cdot \mathbf{v}}.
$$
If $\mathbf{v} = \begin{bmatrix} v_1 \\ \vdots \\ v_n \end{bmatrix}$, then
$$
\|\mathbf{v}\| = \sqrt{v_1^2 + v_2^2 + \cdots + v_n^2}.
$$

This generalizes the Pythagorean theorem to higher dimensions.

A unit vector is a vector of length $1$. To turn a nonzero vector $\mathbf{v}$ into a unit vector pointing in the same direction, divide by its length. This process is called normalization:
$$
\text{If } \mathbf{v} \ne \mathbf{0}, \quad
\widehat{\mathbf{v}} = \frac{\mathbf{v}}{\|\mathbf{v}\|}
$$
is the normalized (unit) vector in the direction of $\mathbf{v}$.

Note that $\mathbf{0}$ cannot be normalized, since its length is $0$ and division by $0$ is undefined.

Scalar multiplication and length interact in a simple way. For any scalar $c$ and vector $\mathbf{v}$,
$$
\|c\mathbf{v}\| = |c|\,\|\mathbf{v}\|.
$$
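The norm and normalization formulas can be sketched as follows (the classic 3-4-5 example; NumPy usage is illustrative):

```python
import numpy as np

v = np.array([3.0, 4.0])

length = np.linalg.norm(v)   # sqrt(3^2 + 4^2) = 5
v_hat = v / length           # unit vector [0.6, 0.8]
```

Dividing by the length leaves the direction unchanged and makes the resulting vector have length exactly $1$, which is the defining property of normalization.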

Angle Between Vectors and Orthogonality

The dot product can also be expressed in terms of the lengths of the vectors and the angle between them.

For nonzero vectors $\mathbf{u}$ and $\mathbf{v}$, let $\theta$ be the angle between them (with $0 \le \theta \le \pi$). Then
$$
\mathbf{u} \cdot \mathbf{v}
= \|\mathbf{u}\|\,\|\mathbf{v}\| \cos \theta.
$$

Rearranging gives a formula for the cosine of the angle:
$$
\cos \theta
=
\frac{\mathbf{u} \cdot \mathbf{v}}
{\|\mathbf{u}\|\,\|\mathbf{v}\|},
$$
provided neither vector is the zero vector.

This is how the dot product connects algebra (components) with geometry (angles).

Two vectors are called orthogonal if the angle between them is $90^\circ$ (or $\pi/2$ radians). From the formula above, for nonzero vectors this is equivalent to
$$
\mathbf{u} \cdot \mathbf{v} = 0.
$$

Thus, orthogonality in vector spaces is expressed purely by the dot product being zero. In many applications, “perpendicular” and “orthogonal” are used interchangeably.
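Both the angle formula and the orthogonality test are direct to compute. A minimal NumPy sketch (example vectors are illustrative):

```python
import numpy as np

u = np.array([1.0, 0.0])
v = np.array([1.0, 1.0])

# cos(theta) = (u . v) / (|u| |v|); here 1 / sqrt(2), so theta = pi/4
cos_theta = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
theta = np.arccos(cos_theta)

# Orthogonality: the dot product is zero
w = np.array([0.0, 1.0])
dp_uw = np.dot(u, w)
```

`np.arccos` returns an angle in $[0, \pi]$, matching the convention $0 \le \theta \le \pi$ used above.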

Projection of One Vector onto Another

The projection of one vector onto another captures the idea of “shadow” or “component” of a vector in the direction of another vector.

Given a nonzero vector $\mathbf{u}$ and any vector $\mathbf{v}$, the projection of $\mathbf{v}$ onto $\mathbf{u}$ is denoted $\operatorname{proj}_{\mathbf{u}} \mathbf{v}$ and is defined as
$$
\operatorname{proj}_{\mathbf{u}} \mathbf{v}
=
\left( \frac{\mathbf{v} \cdot \mathbf{u}}{\mathbf{u} \cdot \mathbf{u}} \right) \mathbf{u}.
$$

This uses several operations already introduced:

- the dot product $\mathbf{v} \cdot \mathbf{u}$, which measures how much $\mathbf{v}$ points along $\mathbf{u}$;
- the dot product $\mathbf{u} \cdot \mathbf{u} = \|\mathbf{u}\|^2$, which compensates for the length of $\mathbf{u}$;
- scalar multiplication of $\mathbf{u}$ by the resulting ratio.

Geometric interpretation:

- $\operatorname{proj}_{\mathbf{u}} \mathbf{v}$ lies along the line through $\mathbf{u}$; it is the "shadow" of $\mathbf{v}$ on that line.
- Its length is $\|\mathbf{v}\|\,|\cos\theta|$, where $\theta$ is the angle between $\mathbf{u}$ and $\mathbf{v}$.
- The remainder $\mathbf{v} - \operatorname{proj}_{\mathbf{u}} \mathbf{v}$ is orthogonal to $\mathbf{u}$.

This decomposition,
$$
\mathbf{v}
=
\operatorname{proj}_{\mathbf{u}} \mathbf{v}
+
\left(\mathbf{v} - \operatorname{proj}_{\mathbf{u}} \mathbf{v}\right),
$$
splits $\mathbf{v}$ into a component parallel to $\mathbf{u}$ and a component orthogonal to $\mathbf{u}$, and is used heavily in later topics (such as least squares and orthogonalization).
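The projection formula and the parallel/orthogonal decomposition can be sketched numerically (example vectors are illustrative):

```python
import numpy as np

u = np.array([1.0, 0.0])   # nonzero vector to project onto
v = np.array([3.0, 4.0])

# proj_u(v) = ((v . u) / (u . u)) * u
proj = (np.dot(v, u) / np.dot(u, u)) * u   # [3.0, 0.0]

# The remainder is orthogonal to u, so v = proj + perp
perp = v - proj                             # [0.0, 4.0]
```

Checking that `perp` has zero dot product with `u` confirms the decomposition into parallel and orthogonal components.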

Summary of Vector Operations

Here is a concise list of the main operations covered:

- Addition and subtraction: componentwise, $(\mathbf{u} \pm \mathbf{v})_i = u_i \pm v_i$.
- Scalar multiplication: $(c\mathbf{v})_i = c\,v_i$.
- Linear combinations: $c_1 \mathbf{v}_1 + c_2 \mathbf{v}_2 + \cdots + c_k \mathbf{v}_k$.
- Dot product: $\mathbf{u} \cdot \mathbf{v} = u_1 v_1 + \cdots + u_n v_n$.
- Length and normalization: $\|\mathbf{v}\| = \sqrt{\mathbf{v} \cdot \mathbf{v}}$, and $\widehat{\mathbf{v}} = \mathbf{v}/\|\mathbf{v}\|$ for $\mathbf{v} \ne \mathbf{0}$.
- Angle and orthogonality: $\cos\theta = \frac{\mathbf{u} \cdot \mathbf{v}}{\|\mathbf{u}\|\,\|\mathbf{v}\|}$; orthogonality means $\mathbf{u} \cdot \mathbf{v} = 0$.
- Projection: $\operatorname{proj}_{\mathbf{u}} \mathbf{v} = \left(\frac{\mathbf{v} \cdot \mathbf{u}}{\mathbf{u} \cdot \mathbf{u}}\right) \mathbf{u}$.

Being fluent with these operations is essential for all later work in linear algebra, including matrix operations, linear transformations, and more advanced geometric and analytic applications.
