Understanding Eigenvalues and Eigenvectors
In this chapter, we focus on square matrices and the special vectors they act on in a very simple way: by stretching or shrinking without changing direction.
Throughout, we assume you already know basic matrix operations from the earlier linear algebra chapters: matrix addition, scalar multiplication, and matrix–vector multiplication.
The Eigenvalue Equation
Let $A$ be an $n \times n$ matrix (a square matrix), and let $x$ be a nonzero vector in $\mathbb{R}^n$ (or $\mathbb{C}^n$).
An eigenvector of $A$ is a nonzero vector $x$ such that multiplying by $A$ does not change its direction, only its length and possibly its sign. This is expressed by the equation
$$
A x = \lambda x,
$$
where $\lambda$ is a scalar (a number). The scalar $\lambda$ is called the eigenvalue corresponding to the eigenvector $x$.
Key points:
- $A$ must be square.
- $x \neq 0$. The zero vector is never considered an eigenvector.
- The eigenvalue $\lambda$ can be zero or nonzero, real or complex (depending on the entries of $A$).
The equation $A x = \lambda x$ says: when $A$ acts on $x$, the result is just a scaled version of $x$.
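To make the equation concrete, here is a minimal NumPy sketch that checks whether a given vector is an eigenvector; the matrix and vector are illustrative choices (the same matrix reappears in the worked example later in this chapter):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
v = np.array([1.0, 1.0])

# If v is an eigenvector, A @ v is a scalar multiple of v.
Av = A @ v
print(Av)                      # [3. 3.]
print(np.allclose(Av, 3 * v))  # True: A v = 3 v, so lambda = 3
```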
Turning $A x = \lambda x$ into a Solvable Equation
To find eigenvalues and eigenvectors, we rearrange
$$
A x = \lambda x
$$
to put everything on one side. Subtract $\lambda x$:
$$
A x - \lambda x = 0.
$$
Factor $x$:
$$
A x - \lambda I x = (A - \lambda I)x = 0,
$$
where $I$ is the identity matrix of the same size as $A$.
So eigenvectors and eigenvalues satisfy
$$
(A - \lambda I)x = 0.
$$
This is a homogeneous system of linear equations. For a nonzero solution $x$ to exist, the matrix $A - \lambda I$ must be singular (non-invertible). In terms of determinants, that means
$$
\det(A - \lambda I) = 0.
$$
This condition is the key to finding eigenvalues.
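As a quick numerical illustration (reusing the matrix from the sketch above), $\det(A - \lambda I)$ vanishes exactly at the eigenvalues:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

# det(A - lam*I) is zero exactly when lam is an eigenvalue of A.
for lam in [1.0, 2.0, 3.0]:
    print(lam, np.linalg.det(A - lam * np.eye(2)))
# 1.0 -> 2.0  (nonzero: A - I is invertible, so 1 is not an eigenvalue)
# 2.0 -> 0.0  (singular: 2 is an eigenvalue)
# 3.0 -> 0.0  (singular: 3 is an eigenvalue)
```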
The Characteristic Polynomial and Eigenvalues
The expression $\det(A - \lambda I)$ is a polynomial in $\lambda$ (for an $n \times n$ matrix, it has degree $n$). It is called the characteristic polynomial of $A$.
To find eigenvalues:
- Form $A - \lambda I$.
- Compute the determinant $\det(A - \lambda I)$.
- Set it equal to zero:
$$
\det(A - \lambda I) = 0.
$$
- Solve this polynomial equation for $\lambda$.
Each solution $\lambda$ of this equation is an eigenvalue of $A$.
For a $2 \times 2$ matrix
$$
A = \begin{bmatrix}
a & b \\
c & d
\end{bmatrix},
$$
we can compute the characteristic polynomial explicitly:
$$
\det(A - \lambda I)
= \det\begin{bmatrix}
a - \lambda & b \\
c & d - \lambda
\end{bmatrix}
= (a - \lambda)(d - \lambda) - bc.
$$
Expanding, this becomes
$$
\lambda^2 - (a + d)\lambda + (ad - bc).
$$
So for $2 \times 2$ matrices the eigenvalues are the roots of a quadratic.
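In code, this amounts to handing the coefficients $1$, $-(a + d)$, $ad - bc$ to a polynomial root finder. A short sketch, with the entries of $A$ chosen to match the worked example below, compared against a library eigenvalue routine:

```python
import numpy as np

a, b, c, d = 2.0, 1.0, 0.0, 3.0       # entries of a 2x2 matrix A
A = np.array([[a, b],
              [c, d]])

# Roots of lam^2 - (a + d)*lam + (a*d - b*c) = 0 ...
coeffs = [1.0, -(a + d), a * d - b * c]
print(np.sort(np.roots(coeffs)))      # [2. 3.]

# ... agree with the library's eigenvalue computation.
print(np.sort(np.linalg.eigvals(A)))  # [2. 3.]
```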
Two important quantities often appear here:
- The trace of $A$ (sum of diagonal entries) is $a + d$. For a $2 \times 2$ matrix, it is the sum of the eigenvalues (counting multiplicity).
- The determinant of $A$ is $ad - bc$. For a $2 \times 2$ matrix, it is the product of the eigenvalues (again, counting multiplicity).
These relationships generalize to higher dimensions, but we will not develop that fully here.
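Although we do not prove the general statement here, it is easy to check numerically; the $3 \times 3$ matrix below is an arbitrary illustrative choice:

```python
import numpy as np

A = np.array([[1.0, 2.0, 0.0],
              [0.0, 3.0, 1.0],
              [2.0, 0.0, 1.0]])
w = np.linalg.eigvals(A)  # all eigenvalues, counted with multiplicity

# Trace = sum of eigenvalues; determinant = product of eigenvalues.
# (The sum and product are real even if individual eigenvalues are complex.)
print(np.isclose(np.trace(A), np.sum(w).real))        # True
print(np.isclose(np.linalg.det(A), np.prod(w).real))  # True
```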
Finding Eigenvectors for a Given Eigenvalue
Once an eigenvalue $\lambda$ is known, its eigenvectors satisfy
$$
(A - \lambda I)x = 0.
$$
To find eigenvectors corresponding to $\lambda$:
- Form the matrix $A - \lambda I$.
- Solve the homogeneous system $(A - \lambda I)x = 0$.
- Any nonzero solution $x$ is an eigenvector for the eigenvalue $\lambda$.
Since $(A - \lambda I)$ is singular, there are infinitely many solutions forming a subspace of $\mathbb{R}^n$ (or $\mathbb{C}^n$). This subspace is called the eigenspace associated with $\lambda$.
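In code, solving the homogeneous system amounts to computing a null space. A minimal sketch using SciPy's null_space routine, assuming the eigenvalue is already known:

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
lam = 3.0  # an eigenvalue of A, found beforehand

# The eigenvectors for lam are the nonzero vectors in the null space
# of (A - lam*I); null_space returns an orthonormal basis for it.
basis = null_space(A - lam * np.eye(2))
print(basis)  # one column, proportional to [1, 1]
```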
Eigenspaces
For an eigenvalue $\lambda$ of $A$, the eigenspace corresponding to $\lambda$ is defined as
$$
E_\lambda = \{ x \neq 0 : A x = \lambda x \} \cup \{0\}.
$$
Equivalently,
$$
E_\lambda = \{ x : (A - \lambda I)x = 0 \}.
$$
So $E_\lambda$ is the null space (set of all solutions) of the matrix $A - \lambda I$. Every nonzero vector in $E_\lambda$ is an eigenvector for $\lambda$.
The eigenspace is always a subspace: it contains the zero vector and is closed under both addition and scalar multiplication.
The dimension of $E_\lambda$ is called the geometric multiplicity of the eigenvalue $\lambda$.
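Numerically, the geometric multiplicity is just the number of basis vectors returned for the null space. A small sketch, using the extreme example $A = 2I$, where every nonzero vector is an eigenvector:

```python
import numpy as np
from scipy.linalg import null_space

A = 2.0 * np.eye(2)  # A = 2I: A x = 2 x for every vector x
E = null_space(A - 2.0 * np.eye(2))

print(E.shape[1])  # 2: E_2 is all of R^2, geometric multiplicity 2
```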
Example: A $2 \times 2$ Matrix
Consider
$$
A = \begin{bmatrix}
2 & 1 \\
0 & 3
\end{bmatrix}.
$$
Step 1: Find eigenvalues.
Compute
$$
\det(A - \lambda I)
= \det\begin{bmatrix}
2 - \lambda & 1 \\
0 & 3 - \lambda
\end{bmatrix}
= (2 - \lambda)(3 - \lambda) - 0
= (2 - \lambda)(3 - \lambda).
$$
Set equal to zero:
$$
(2 - \lambda)(3 - \lambda) = 0.
$$
The eigenvalues are
$$
\lambda_1 = 2, \quad \lambda_2 = 3.
$$
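As a sanity check, a library eigenvalue routine recovers the same two values:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
print(np.sort(np.linalg.eigvals(A)))  # [2. 3.]
```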
Step 2: Find eigenvectors for $\lambda_1 = 2$.
We solve $(A - 2I)x = 0$:
$$
A - 2I = \begin{bmatrix}
2 - 2 & 1 \\
0 & 3 - 2
\end{bmatrix}
= \begin{bmatrix}
0 & 1 \\
0 & 1
\end{bmatrix}.
$$
Let $x = \begin{bmatrix} x_1 \\ x_2 \end{bmatrix}$. Then
$$
\begin{bmatrix}
0 & 1 \\
0 & 1
\end{bmatrix}
\begin{bmatrix}
x_1 \\
x_2
\end{bmatrix}
=
\begin{bmatrix}
x_2 \\
x_2
\end{bmatrix}
=
\begin{bmatrix}
0 \\
0
\end{bmatrix},
$$
so $x_2 = 0$, and $x_1$ is free.
All eigenvectors for $\lambda = 2$ have the form
$$
x = \begin{bmatrix}
x_1 \\
0
\end{bmatrix}
= x_1 \begin{bmatrix}
1 \\
0
\end{bmatrix}, \quad x_1 \neq 0.
$$
A simple eigenvector is
$$
v_1 = \begin{bmatrix} 1 \\ 0 \end{bmatrix}.
$$
Step 3: Find eigenvectors for $\lambda_2 = 3$.
We solve $(A - 3I)x = 0$:
$$
A - 3I = \begin{bmatrix}
2 - 3 & 1 \\
0 & 3 - 3
\end{bmatrix}
= \begin{bmatrix}
-1 & 1 \\
0 & 0
\end{bmatrix}.
$$
Then
$$
\begin{bmatrix}
-1 & 1 \\
0 & 0
\end{bmatrix}
\begin{bmatrix}
x_1 \\
x_2
\end{bmatrix}
=
\begin{bmatrix}
- x_1 + x_2 \\
0
\end{bmatrix}
=
\begin{bmatrix}
0 \\
0
\end{bmatrix}.
$$
So $-x_1 + x_2 = 0$, which means $x_2 = x_1$.
All eigenvectors for $\lambda = 3$ have the form
$$
x = \begin{bmatrix}
t \\
t
\end{bmatrix}
= t \begin{bmatrix}
1 \\
1
\end{bmatrix}, \quad t \neq 0.
$$
A simple eigenvector is
$$
v_2 = \begin{bmatrix} 1 \\ 1 \end{bmatrix}.
$$
We can verify:
$$
A v_1 = \begin{bmatrix}
2 & 1 \\
0 & 3
\end{bmatrix}
\begin{bmatrix}
1 \\
0
\end{bmatrix}
=
\begin{bmatrix}
2 \\
0
\end{bmatrix}
= 2 \begin{bmatrix} 1 \\ 0 \end{bmatrix} = 2 v_1,
$$
confirming the eigenvalue $\lambda_1 = 2$.
And
$$
A v_2 = \begin{bmatrix}
2 & 1 \\
0 & 3
\end{bmatrix}
\begin{bmatrix}
1 \\
1
\end{bmatrix}
=
\begin{bmatrix}
3 \\
3
\end{bmatrix}
= 3 \begin{bmatrix} 1 \\ 1 \end{bmatrix} = 3 v_2,
$$
confirming the eigenvalue $\lambda_2 = 3$.
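The same verification can be done numerically. Note that np.linalg.eig normalizes eigenvectors to unit length, so the columns it returns are scalar multiples of $v_1$ and $v_2$, not the exact vectors found by hand:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
w, V = np.linalg.eig(A)  # eigenvalues in w, eigenvectors as columns of V

for i in range(2):
    # Each column satisfies A @ V[:, i] = w[i] * V[:, i].
    print(w[i], np.allclose(A @ V[:, i], w[i] * V[:, i]))
# 2.0 True  (column proportional to v1 = [1, 0])
# 3.0 True  (column proportional to v2 = [1, 1])
```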
Geometric Picture (in Low Dimensions)
While matrices and vectors can live in any dimension, it is helpful to visualize eigenvectors for $2 \times 2$ and $3 \times 3$ matrices.
Consider a linear transformation $T$ represented by a $2 \times 2$ matrix $A$. When $A$ acts on a vector, it can:
- Rotate the vector.
- Stretch or shrink it.
- Reflect it.
Most vectors are both rotated and stretched. But eigenvectors are special: they are not rotated, only stretched (and maybe reversed, which is a stretch by a negative factor).
For example:
- If $\lambda > 1$, eigenvectors in that direction are stretched.
- If $0 < \lambda < 1$, eigenvectors are shrunk towards zero.
- If $\lambda < 0$, eigenvectors are stretched and reversed in direction.
- If $\lambda = 0$, eigenvectors are mapped to the zero vector.
In two dimensions, each distinct real eigenvalue (with at least one eigenvector) corresponds to a line through the origin consisting of eigenvectors (plus the zero vector).
In higher dimensions, each eigenspace is a subspace passing through the origin, possibly with dimension greater than $1$.
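This picture is easy to test numerically: applying $A$ to a generic vector changes its direction, while applying it to an eigenvector does not. In the sketch below, the generic vector is an arbitrary choice and the eigenvector comes from the worked example above:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

def direction(v):
    return v / np.linalg.norm(v)  # unit vector pointing along v

generic = np.array([1.0, 2.0])  # not an eigenvector of A
eigvec = np.array([1.0, 1.0])   # eigenvector for lambda = 3

print(direction(generic), direction(A @ generic))  # directions differ
print(direction(eigvec), direction(A @ eigvec))    # directions agree
```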
Repeated Eigenvalues and Multiplicities
An eigenvalue can appear more than once as a root of the characteristic polynomial. There are two related notions of multiplicity:
- The algebraic multiplicity of an eigenvalue is how many times it appears as a root of the characteristic polynomial.
- The geometric multiplicity of an eigenvalue is the dimension of its eigenspace.
For every eigenvalue $\lambda$,
- The geometric multiplicity is at least $1$ (if $\lambda$ is an eigenvalue at all).
- The geometric multiplicity is at most the algebraic multiplicity.
These multiplicities play a role in whether a matrix can be diagonalized, which will be taken up in the next chapter.
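A standard example of the gap between the two multiplicities is the shear matrix below; this is a sketch of the computation, reusing null_space from SciPy:

```python
import numpy as np
from scipy.linalg import null_space

# Characteristic polynomial: (2 - lam)^2, so lam = 2 has
# algebraic multiplicity 2.
A = np.array([[2.0, 1.0],
              [0.0, 2.0]])

E = null_space(A - 2.0 * np.eye(2))
print(E.shape[1])  # 1: the geometric multiplicity is only 1
```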
Real and Complex Eigenvalues
If all entries of $A$ are real numbers:
- Eigenvalues can still be complex (involving $i = \sqrt{-1}$).
- Complex eigenvalues always come in conjugate pairs (if $a + bi$ is an eigenvalue with real $a, b$, then $a - bi$ is also an eigenvalue).
For many applications, we may restrict attention to cases where all eigenvalues are real. However, for a complete theory, complex eigenvalues and eigenvectors are important, especially in higher-level topics.
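The standard illustration is a quarter-turn rotation matrix: its entries are real, but no direction in the plane is preserved by a rotation, so its eigenvalues form a complex conjugate pair:

```python
import numpy as np

R = np.array([[0.0, -1.0],
              [1.0,  0.0]])  # rotation by 90 degrees

print(np.linalg.eigvals(R))  # [0.+1.j 0.-1.j]: the conjugate pair i, -i
```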
Summary of the Basic Procedure
For a square matrix $A$:
- Find eigenvalues by solving
$$
\det(A - \lambda I) = 0.
$$
- For each eigenvalue $\lambda$, find the corresponding eigenspace by solving
$$
(A - \lambda I)x = 0.
$$
Any nonzero solution $x$ is an eigenvector associated with $\lambda$.
- All solutions to $(A - \lambda I)x = 0$ (including $0$) together form the eigenspace $E_\lambda$.
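The whole procedure fits in a few lines of code. The sketch below (the helper name eigen_summary is our own; it assumes numerically clean eigenvalues, since np.unique is used naively to merge repeated roots) mirrors the two steps above:

```python
import numpy as np
from scipy.linalg import null_space

def eigen_summary(A):
    """Print each eigenvalue of A with a basis of its eigenspace."""
    n = A.shape[0]
    for lam in np.unique(np.linalg.eigvals(A)):  # roots of det(A - lam*I)
        basis = null_space(A - lam * np.eye(n))  # solve (A - lam*I)x = 0
        print(f"lambda = {lam}: eigenspace basis =\n{basis}")

eigen_summary(np.array([[2.0, 1.0],
                        [0.0, 3.0]]))
```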
These concepts will be used in the next chapter on diagonalization, where we will see how eigenvalues and eigenvectors can simplify the action of a linear transformation by expressing it in a particularly simple form in a suitable basis.