Matrices are powerful tools for organizing and working with numbers. In this chapter, we focus on what you can do with matrices: how to add, subtract, and multiply them, how to scale them by a number, and how to work with the identity matrix and the transpose.
You should already be familiar with what a matrix is and basic vector operations from earlier chapters. Here, we build on that.
Matrix Addition and Subtraction
Matrix addition and subtraction are only defined when the matrices have the same size (same number of rows and same number of columns).
Let $A$ and $B$ be two $m \times n$ matrices:
$$
A = (a_{ij}), \quad B = (b_{ij})
$$
where $a_{ij}$ and $b_{ij}$ are the entries in row $i$, column $j$.
Definition of Matrix Addition
The sum $C = A + B$ is the $m \times n$ matrix defined by
$$
c_{ij} = a_{ij} + b_{ij}
$$
for every row $i$ and column $j$.
That is, you add corresponding entries.
Example:
$$
A = \begin{bmatrix}
1 & 3 \\
2 & 4
\end{bmatrix},
\quad
B = \begin{bmatrix}
5 & -1 \\
0 & 2
\end{bmatrix}
$$
Then
$$
A + B = \begin{bmatrix}
1+5 & 3+(-1) \\
2+0 & 4+2
\end{bmatrix}
=
\begin{bmatrix}
6 & 2 \\
2 & 6
\end{bmatrix}.
$$
Definition of Matrix Subtraction
Subtraction works the same way, entry-by-entry:
$$
C = A - B \quad \text{means} \quad c_{ij} = a_{ij} - b_{ij}.
$$
Using the same $A$ and $B$:
$$
A - B =
\begin{bmatrix}
1-5 & 3-(-1) \\
2-0 & 4-2
\end{bmatrix}
=
\begin{bmatrix}
-4 & 4 \\
2 & 2
\end{bmatrix}.
$$
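To make the entry-by-entry rule concrete, here is a minimal Python sketch (plain nested lists, no libraries; the helper names are just for illustration) that reproduces the two examples above.

```python
def matrix_add(A, B):
    # Entry-by-entry sum: c_ij = a_ij + b_ij (A and B must be the same size).
    return [[A[i][j] + B[i][j] for j in range(len(A[0]))]
            for i in range(len(A))]

def matrix_sub(A, B):
    # Entry-by-entry difference: c_ij = a_ij - b_ij.
    return [[A[i][j] - B[i][j] for j in range(len(A[0]))]
            for i in range(len(A))]

A = [[1, 3], [2, 4]]
B = [[5, -1], [0, 2]]

print(matrix_add(A, B))  # [[6, 2], [2, 6]]
print(matrix_sub(A, B))  # [[-4, 4], [2, 2]]
```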
If $A$ and $B$ do not have the same dimensions, $A + B$ and $A - B$ are not defined.
Basic Properties of Addition and Subtraction
For any $m \times n$ matrices $A, B, C$:
- Commutative law of addition:
$$
A + B = B + A.
$$
- Associative law of addition:
$$
(A + B) + C = A + (B + C).
$$
- Additive identity: There is an $m \times n$ zero matrix, usually written $0$, with all entries equal to 0, such that
$$
A + 0 = 0 + A = A.
$$
- Additive inverse: For every matrix $A$, the matrix $-A$ (obtained by negating every entry) satisfies
$$
A + (-A) = 0.
$$
Subtraction is equivalent to adding the additive inverse:
$$
A - B = A + (-B).
$$
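As a quick numerical check of this identity, the sketch below (plain Python, reusing the example matrices from above) negates every entry of $B$ and confirms that $A + (-B)$ agrees with entry-by-entry subtraction.

```python
A = [[1, 3], [2, 4]]
B = [[5, -1], [0, 2]]

# -B: negate every entry of B.
neg_B = [[-b for b in row] for row in B]

# A + (-B), computed entry by entry.
A_plus_negB = [[A[i][j] + neg_B[i][j] for j in range(2)] for i in range(2)]

# A - B, computed entry by entry.
A_minus_B = [[A[i][j] - B[i][j] for j in range(2)] for i in range(2)]

print(A_plus_negB == A_minus_B)  # True: A - B = A + (-B)
```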
Scalar Multiplication
You can multiply a matrix by a scalar (a single number). This scales all entries of the matrix.
Let $k$ be a scalar (real or complex number) and $A = (a_{ij})$ an $m \times n$ matrix. The product $kA$ is the matrix
$$
kA = (ka_{ij}),
$$
i.e. each entry is multiplied by $k$.
Example:
$$
A = \begin{bmatrix}
2 & -1 \\
0 & 3
\end{bmatrix},\quad
k = 4
$$
Then
$$
4A = \begin{bmatrix}
4\cdot 2 & 4\cdot (-1) \\
4\cdot 0 & 4\cdot 3
\end{bmatrix}
=
\begin{bmatrix}
8 & -4 \\
0 & 12
\end{bmatrix}.
$$
Note that $(-1)A = -A$.
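Here is a short sketch of scalar multiplication in plain Python (nested lists, nothing assumed beyond the example above; the helper name is for illustration only): every entry is multiplied by the scalar $k$.

```python
def scalar_mult(k, A):
    # Multiply every entry of A by the scalar k.
    return [[k * a for a in row] for row in A]

A = [[2, -1], [0, 3]]

print(scalar_mult(4, A))   # [[8, -4], [0, 12]]
print(scalar_mult(-1, A))  # [[-2, 1], [0, -3]], i.e. -A
```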
Basic Properties of Scalar Multiplication
Let $A$ and $B$ be $m \times n$ matrices, and let $k$ and $\ell$ be scalars:
- Distributive over matrix addition:
$$
k(A + B) = kA + kB.
$$
- Distributive over scalar addition:
$$
(k + \ell)A = kA + \ell A.
$$
- Associativity with scalars:
$$
k(\ell A) = (k\ell)A.
$$
- Multiplying by 1 and 0:
$$
1A = A,\quad 0A = 0.
$$
These rules mirror the familiar properties of numbers, applied entry-by-entry.
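These rules are easy to spot-check numerically. The sketch below (plain Python, with small example matrices and scalars of my own choosing) verifies the two distributive laws for one particular case.

```python
def scalar_mult(k, A):
    # Multiply every entry of A by k.
    return [[k * a for a in row] for row in A]

def matrix_add(A, B):
    # Entry-by-entry sum of two same-sized matrices.
    return [[A[i][j] + B[i][j] for j in range(len(A[0]))]
            for i in range(len(A))]

A = [[1, 2], [3, 4]]
B = [[0, -1], [5, 2]]
k, l = 3, -2

# k(A + B) = kA + kB
print(scalar_mult(k, matrix_add(A, B)) ==
      matrix_add(scalar_mult(k, A), scalar_mult(k, B)))  # True

# (k + l)A = kA + lA
print(scalar_mult(k + l, A) ==
      matrix_add(scalar_mult(k, A), scalar_mult(l, A)))  # True
```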
Matrix Multiplication
Matrix multiplication is more subtle than addition. It is not done entry-by-entry. Instead, each entry of the product involves a row of the first matrix and a column of the second matrix, connected by a dot-product-like rule.
When Is Matrix Multiplication Defined?
If $A$ is an $m \times n$ matrix and $B$ is an $n \times p$ matrix, then their product $AB$ is defined and will be an $m \times p$ matrix.
The inner dimensions (the $n$ from $m \times n$ and the $n$ from $n \times p$) must match. If not, $AB$ is not defined.
- $A$ has $m$ rows and $n$ columns.
- $B$ has $n$ rows and $p$ columns.
- $AB$ has $m$ rows and $p$ columns.
How to Compute a Matrix Product
Let
$$
A = (a_{ij}) \quad \text{(size } m\times n\text{)}, \quad
B = (b_{jk}) \quad \text{(size } n\times p\text{)}.
$$
The product $C = AB$ is the $m \times p$ matrix with entries
$$
c_{ik} = \sum_{j=1}^{n} a_{ij} b_{jk}.
$$
This means:
- To find entry $c_{ik}$:
  - Take row $i$ of $A$.
  - Take column $k$ of $B$.
  - Multiply corresponding entries and add them up.
Example (multiplying a $2\times 3$ matrix by a $3\times 2$ matrix):
$$
A =
\begin{bmatrix}
1 & 2 & 3 \\
4 & 5 & 6
\end{bmatrix},
\quad
B =
\begin{bmatrix}
1 & 0 \\
-1 & 2 \\
0 & 3
\end{bmatrix}.
$$
Here, $A$ is $2\times 3$, $B$ is $3\times 2$, so $AB$ will be $2\times 2$.
Compute each entry:
- Row 1 of $A$ is $(1, 2, 3)$.
- Row 2 of $A$ is $(4, 5, 6)$.
- Column 1 of $B$ is $(1, -1, 0)$.
- Column 2 of $B$ is $(0, 2, 3)$.
Now:
- $c_{11}$ (row 1, column 1):
$$
c_{11} = 1\cdot 1 + 2\cdot (-1) + 3\cdot 0 = 1 - 2 + 0 = -1.
$$
- $c_{12}$ (row 1, column 2):
$$
c_{12} = 1\cdot 0 + 2\cdot 2 + 3\cdot 3 = 0 + 4 + 9 = 13.
$$
- $c_{21}$ (row 2, column 1):
$$
c_{21} = 4\cdot 1 + 5\cdot (-1) + 6\cdot 0 = 4 - 5 + 0 = -1.
$$
- $c_{22}$ (row 2, column 2):
$$
c_{22} = 4\cdot 0 + 5\cdot 2 + 6\cdot 3 = 0 + 10 + 18 = 28.
$$
So
$$
AB =
\begin{bmatrix}
-1 & 13 \\
-1 & 28
\end{bmatrix}.
$$
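The sum formula $c_{ik} = \sum_j a_{ij} b_{jk}$ translates directly into three nested loops. The sketch below (plain Python, no libraries; the function name is just for illustration) reproduces the $2\times 2$ result of this example.

```python
def matrix_mult(A, B):
    # A is m x n, B is n x p; the result C is m x p.
    m, n, p = len(A), len(B), len(B[0])
    C = [[0] * p for _ in range(m)]
    for i in range(m):
        for k in range(p):
            for j in range(n):
                # c_ik accumulates a_ij * b_jk over j = 1, ..., n.
                C[i][k] += A[i][j] * B[j][k]
    return C

A = [[1, 2, 3],
     [4, 5, 6]]
B = [[1, 0],
     [-1, 2],
     [0, 3]]

print(matrix_mult(A, B))  # [[-1, 13], [-1, 28]]
```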
If you compute $BA$ instead (swapping the order), you get a $3\times 3$ matrix, which is not even the same size as $AB$. In general, the order of multiplication matters.
Matrix–Vector Multiplication
A useful special case: multiplying a matrix by a column vector.
If $A$ is an $m \times n$ matrix and $\mathbf{x}$ is an $n \times 1$ column vector, then $A\mathbf{x}$ is an $m \times 1$ vector. You compute each entry as a dot product of a row of $A$ with the vector $\mathbf{x}$.
Example:
$$
A = \begin{bmatrix}
2 & 1 \\
-1 & 3 \\
4 & 0
\end{bmatrix},
\quad
\mathbf{x} =
\begin{bmatrix}
1 \\ 2
\end{bmatrix}.
$$
Then
$$
A\mathbf{x} =
\begin{bmatrix}
2\cdot 1 + 1\cdot 2 \\
-1\cdot 1 + 3\cdot 2 \\
4\cdot 1 + 0\cdot 2
\end{bmatrix}
=
\begin{bmatrix}
4 \\ 5 \\ 4
\end{bmatrix}.
$$
This is a key operation in linear algebra, often interpreted as applying a linear transformation to a vector.
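Matrix–vector multiplication is the same row-times-column rule with $p = 1$. A minimal sketch (plain Python, with the vector stored as a flat list) reproduces the example above.

```python
def matvec(A, x):
    # Each output entry is the dot product of a row of A with x.
    return [sum(A[i][j] * x[j] for j in range(len(x)))
            for i in range(len(A))]

A = [[2, 1],
     [-1, 3],
     [4, 0]]
x = [1, 2]

print(matvec(A, x))  # [4, 5, 4]
```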
Properties of Matrix Multiplication
Let the matrices be of sizes where all products below make sense.
- Associative:
$$
(AB)C = A(BC).
$$
- Distributive over addition:
$$
A(B + C) = AB + AC,\quad
(A + B)C = AC + BC.
$$
- Scalar multiplication:
$$
(kA)B = k(AB) = A(kB).
$$
- Not commutative in general:
$$
AB \neq BA \ \text{in general.}
$$
In fact, for some pairs $A,B$, $AB$ is defined but $BA$ is not even defined (because of mismatched dimensions).
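Even when both products are defined, for instance two square matrices of the same size, $AB$ and $BA$ usually differ. The sketch below (plain Python, with matrices of my own choosing) shows one such pair.

```python
def matrix_mult(A, B):
    # c_ik = sum over j of a_ij * b_jk.
    return [[sum(A[i][j] * B[j][k] for j in range(len(B)))
             for k in range(len(B[0]))]
            for i in range(len(A))]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 0]]

print(matrix_mult(A, B))  # [[2, 1], [4, 3]]
print(matrix_mult(B, A))  # [[3, 4], [1, 2]], not the same: AB != BA
```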
Multiplying by Zero and Identity Matrices
- If $0$ is a zero matrix of suitable size, then
$$
A0 = 0, \quad 0A = 0.
$$
- For square matrices, there is a special identity matrix $I$ (detailed below) such that
$$
AI = IA = A.
$$
Identity Matrix
For each positive integer $n$, the identity matrix of size $n \times n$, denoted $I_n$ or simply $I$, is the square matrix with 1s on the main diagonal and 0s elsewhere.
Example:
$$
I_2 = \begin{bmatrix}
1 & 0 \\
0 & 1
\end{bmatrix},
\quad
I_3 = \begin{bmatrix}
1 & 0 & 0 \\
0 & 1 & 0 \\
0 & 0 & 1
\end{bmatrix}.
$$
The identity matrix acts like the number 1 in matrix multiplication.
If $A$ is an $n \times n$ matrix, then
$$
AI_n = I_nA = A.
$$
More generally, if $A$ is $m \times n$, then:
- $I_m A = A$ (left-multiplying by the $m\times m$ identity),
- $A I_n = A$ (right-multiplying by the $n\times n$ identity).
You can see $I$ as the matrix that leaves every vector unchanged:
$$
I \mathbf{x} = \mathbf{x}
$$
for any vector $\mathbf{x}$ of compatible size.
The identity matrix will be crucial when we discuss inverses and solving systems of linear equations.
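A quick sketch (plain Python, reusing a simple multiplication helper for illustration) builds $I_n$ from the definition and checks that left- and right-multiplying a rectangular matrix by the appropriately sized identity leaves it unchanged.

```python
def identity(n):
    # 1s on the main diagonal, 0s elsewhere.
    return [[1 if i == j else 0 for j in range(n)] for i in range(n)]

def matrix_mult(A, B):
    # c_ik = sum over j of a_ij * b_jk.
    return [[sum(A[i][j] * B[j][k] for j in range(len(B)))
             for k in range(len(B[0]))]
            for i in range(len(A))]

A = [[1, 4, 7],
     [2, 5, 8]]  # a 2 x 3 matrix

print(matrix_mult(identity(2), A) == A)  # True: I_2 A = A
print(matrix_mult(A, identity(3)) == A)  # True: A I_3 = A
```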
Transpose of a Matrix
The transpose of a matrix is obtained by turning its rows into columns (or equivalently, its columns into rows).
If $A$ is an $m \times n$ matrix, its transpose is written $A^T$ (or sometimes $A'$) and is an $n \times m$ matrix defined by
$$
(A^T)_{ij} = a_{ji}.
$$
In words, the entry in row $i$, column $j$ of $A^T$ is the entry in row $j$, column $i$ of $A$.
Example:
$$
A =
\begin{bmatrix}
1 & 4 & 7 \\
2 & 5 & 8
\end{bmatrix}
\quad (2\times 3),
$$
then
$$
A^T =
\begin{bmatrix}
1 & 2 \\
4 & 5 \\
7 & 8
\end{bmatrix}
\quad (3\times 2).
$$
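In code, the transpose just swaps the two indices. The short sketch below (plain Python) reproduces this example.

```python
def transpose(A):
    # (A^T)_{ij} = a_{ji}: rows become columns.
    return [[A[i][j] for i in range(len(A))] for j in range(len(A[0]))]

A = [[1, 4, 7],
     [2, 5, 8]]

print(transpose(A))  # [[1, 2], [4, 5], [7, 8]]
```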
Properties of the Transpose
Suppose $A$ and $B$ are matrices of sizes where the following operations make sense, and $k$ is a scalar:
- Double transpose:
$$
(A^T)^T = A.
$$
- Transpose of a sum:
$$
(A + B)^T = A^T + B^T.
$$
- Transpose and scalars:
$$
(kA)^T = kA^T.
$$
- Transpose of a product:
$$
(AB)^T = B^T A^T.
$$
Notice the order reverses when taking the transpose of a product.
For example, if
$$
A = \begin{bmatrix}
1 & 2 \\
3 & 4
\end{bmatrix},
\quad
B = \begin{bmatrix}
0 & 1 \\
-1 & 0
\end{bmatrix},
$$
then
$$
AB = \begin{bmatrix}
(1\cdot 0 + 2\cdot (-1)) & (1\cdot 1 + 2\cdot 0) \\
(3\cdot 0 + 4\cdot (-1)) & (3\cdot 1 + 4\cdot 0)
\end{bmatrix}
=
\begin{bmatrix}
-2 & 1 \\
-4 & 3
\end{bmatrix},
$$
so
$$
(AB)^T = \begin{bmatrix}
-2 & -4 \\
1 & 3
\end{bmatrix}.
$$
On the other hand,
$$
B^T = \begin{bmatrix}
0 & -1 \\
1 & 0
\end{bmatrix},
$$
and
$$
B^T A^T =
\begin{bmatrix}
0 & -1 \\
1 & 0
\end{bmatrix}
\begin{bmatrix}
1 & 3 \\
2 & 4
\end{bmatrix}
=
\begin{bmatrix}
(0\cdot 1 + (-1)\cdot 2) & (0\cdot 3 + (-1)\cdot 4) \\
(1\cdot 1 + 0\cdot 2) & (1\cdot 3 + 0\cdot 4)
\end{bmatrix}
=
\begin{bmatrix}
-2 & -4 \\
1 & 3
\end{bmatrix},
$$
which matches $(AB)^T$.
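The same check can be done mechanically. The sketch below (plain Python, reusing the example matrices and the illustrative helpers from earlier) confirms that $(AB)^T$ and $B^T A^T$ agree.

```python
def transpose(A):
    # (A^T)_{ij} = a_{ji}.
    return [[A[i][j] for i in range(len(A))] for j in range(len(A[0]))]

def matrix_mult(A, B):
    # c_ik = sum over j of a_ij * b_jk.
    return [[sum(A[i][j] * B[j][k] for j in range(len(B)))
             for k in range(len(B[0]))]
            for i in range(len(A))]

A = [[1, 2], [3, 4]]
B = [[0, 1], [-1, 0]]

lhs = transpose(matrix_mult(A, B))             # (AB)^T
rhs = matrix_mult(transpose(B), transpose(A))  # B^T A^T

print(lhs)         # [[-2, -4], [1, 3]]
print(lhs == rhs)  # True
```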
These properties are widely used in linear algebra, especially when working with dot products, symmetric matrices, and orthogonality.
Summary of Matrix Operations
- Addition/Subtraction: Only for same-sized matrices; add or subtract entry-by-entry.
- Scalar Multiplication: Multiply every entry by the scalar.
- Matrix Multiplication:
- Defined when inner dimensions match: $(m\times n)(n\times p) = m\times p$.
- Each entry of the product is a sum of products of row entries of the first and column entries of the second.
- Associative and distributive, but generally not commutative.
- Identity Matrix: Square matrix with 1s on the main diagonal, 0s elsewhere; behaves like 1 for multiplication.
- Transpose: Swap rows and columns; $(AB)^T = B^T A^T$ and $(A^T)^T = A$.
These operations are the basic algebra of matrices and are the foundation for later topics such as determinants, inverses, eigenvalues, and systems of linear equations.