Linear algebra is the study of linear relationships: relationships where changes happen at a constant rate and can be described using straight lines, matrices, and vectors. It provides a language and toolkit for dealing with many quantities at once, all interacting in a structured way.
At first glance, linear algebra can feel abstract, but many familiar ideas—like solving systems of linear equations—fit naturally inside it. This chapter gives a gentle, big-picture view of linear algebra as a subject, leaving detailed techniques for the later sections of this part of the course.
What “linear” means
A relationship is called linear when it satisfies two key properties:
- If you double the input, the effect on the output doubles (scaling).
- If you add two inputs, the effect on the output is the sum of the individual effects (additivity).
In one variable, a typical example is an equation like
$$
y = 3x
$$
When $x$ increases by 1, $y$ always increases by 3. Graphically, this is a straight line through the origin. In higher dimensions, “linear” still means “scale and add behave predictably,” but the objects and transformations are more general.
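The two properties above can be checked numerically for $y = 3x$. The following sketch (plain Python; the function name `f` is just an illustration) verifies scaling and additivity, and confirms the constant rate of change:

```python
# A minimal numeric check of the two linearity properties for f(x) = 3x.

def f(x):
    return 3 * x

# Scaling: doubling the input doubles the output.
assert f(2 * 5) == 2 * f(5)

# Additivity: the output of a sum is the sum of the outputs.
assert f(2 + 7) == f(2) + f(7)

# The constant rate of change: increasing x by 1 increases y by 3.
print(f(1) - f(0))  # prints 3
```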
Linear algebra takes this idea and applies it to:
- Collections of numbers (vectors),
- Rules for transforming these collections (linear transformations),
- Tables of numbers that represent those rules (matrices),
- Systems of linear equations that describe constraints or balances.
Objects studied in linear algebra
Later sections will formalize these objects carefully, but here is an overview of what they are and why they matter.
Vectors: quantities with direction and magnitude
A vector is a list of numbers that often represents a quantity with both size and direction. For example:
- In the plane: a vector could be $(2, -1)$.
- In 3D space: a vector could be $(1, 0, 3)$.
- In more abstract settings: a vector might list values of several variables, or record features of a data point.
Vectors are added and scaled in a way that preserves linearity. For example, if $u$ and $v$ are vectors, and $c$ is a number (a scalar), then:
- $u + v$ is another vector,
- $c u$ is another vector.
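These two operations can be sketched in a few lines of plain Python, with vectors stored as lists of numbers (the helper names `add` and `scale` are illustrative only):

```python
# A small sketch of vector addition and scalar multiplication
# for vectors represented as lists of numbers.

def add(u, v):
    """Add two vectors componentwise."""
    return [a + b for a, b in zip(u, v)]

def scale(c, u):
    """Multiply every component of u by the scalar c."""
    return [c * a for a in u]

u = [2, -1]
v = [1, 0]

print(add(u, v))    # [3, -1]
print(scale(3, u))  # [6, -3]
```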
The chapter Vectors and Matrices will develop these ideas and operations in detail.
Matrices: arrays that represent linear rules
A matrix is a rectangular array of numbers. For example:
$$
A = \begin{bmatrix}
1 & 2 \\
3 & 4
\end{bmatrix}
$$
Matrices can represent many things, but in linear algebra they most often represent linear transformations—rules that take a vector as input and produce another vector as output.
For instance, you can “multiply” a matrix $A$ by a vector $x$ to get a new vector $Ax$. This combination encodes several linear equations at once. The later chapter Matrix operations will show how matrix multiplication is defined and why it is set up that way.
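As a preview of how such a product works, here is a sketch in plain Python using the matrix $A$ shown above: each entry of $Ax$ is formed by pairing a row of $A$ with $x$ (the helper name `matvec` is illustrative only):

```python
# A sketch of the matrix-vector product Ax: each entry of the result
# pairs one row of A with the vector x.

def matvec(A, x):
    """Multiply matrix A (stored as a list of rows) by vector x."""
    return [sum(a_ij * x_j for a_ij, x_j in zip(row, x)) for row in A]

A = [[1, 2],
     [3, 4]]
x = [5, 6]

# Row 1: 1*5 + 2*6 = 17; row 2: 3*5 + 4*6 = 39.
print(matvec(A, x))  # [17, 39]
```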
Systems of linear equations
A system of linear equations is a collection of equations where each equation is linear in the unknowns. For example:
$$
\begin{aligned}
2x + 3y &= 5 \\
-x + 4y &= 1
\end{aligned}
$$
Linear algebra provides a unified way to:
- Write such systems using matrices and vectors,
- Analyze whether solutions exist,
- Describe all solutions when they do exist,
- Solve them efficiently.
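As a small preview, the two-equation system above can be solved by elimination in a few lines of plain Python, using exact fractions so the answer comes out cleanly (the variable names are illustrative only):

```python
# A sketch of solving the system 2x + 3y = 5, -x + 4y = 1 by elimination,
# using exact rational arithmetic.
from fractions import Fraction

# Augmented rows [a, b, c] representing a*x + b*y = c.
r1 = [Fraction(2), Fraction(3), Fraction(5)]
r2 = [Fraction(-1), Fraction(4), Fraction(1)]

# Eliminate x from the second equation: replace r2 with r2 + (1/2) * r1.
m = -r2[0] / r1[0]
r2 = [b + m * a for a, b in zip(r1, r2)]

# Back-substitute: solve for y, then for x.
y = r2[2] / r2[1]
x = (r1[2] - r1[1] * y) / r1[0]

print(x, y)  # 17/11 7/11
```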
The chapter Systems of Linear Equations will treat these methods in detail, including Gaussian elimination.
Transformations and structure
Behind these objects is a unifying theme: structure. Linear algebra looks at:
- How vectors can be combined through addition and scaling,
- How matrices act on vectors,
- How these actions preserve or change certain geometric or algebraic properties.
This structural viewpoint leads naturally to more advanced ideas such as determinants and eigenvalues, which get their own chapters later.
Geometric intuition
Although linear algebra is often written in symbols, it has a strong geometric side.
- In 2D and 3D, vectors can be drawn as arrows.
- Adding vectors corresponds to moving along one arrow then the other.
- Scalar multiplication stretches or shrinks arrows, possibly reversing direction.
Matrices can be thought of as machines that transform space:
- Some matrices stretch space in certain directions.
- Some rotate or reflect the plane.
- Some “squash” space into a lower dimension, as when projecting a 3D object onto a 2D screen.
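One of these machines can be sketched concretely: the standard rotation matrix (a fact from plane geometry, not yet derived in this course) applied to a vector in plain Python. Rotating $(1, 0)$ by 90 degrees should give $(0, 1)$:

```python
# A sketch of a matrix acting as a geometric "machine": a 90-degree
# rotation of the plane, applied to the vector (1, 0).
import math

theta = math.pi / 2  # 90 degrees, in radians
R = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]

v = [1, 0]
Rv = [sum(r * x for r, x in zip(row, v)) for row in R]

# Floating-point rounding leaves tiny errors, so round for display.
print([round(c, 10) for c in Rv])  # [0.0, 1.0]
```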
Later topics like Determinants and Eigenvalues and Eigenvectors capture these geometric behaviors numerically and conceptually.
Why linear algebra is important
Linear algebra is central to many areas of mathematics and applications. A few examples:
- Science and engineering: modeling forces, currents, vibrations, and many physical systems leads naturally to linear equations and matrices.
- Computer graphics: rotations, scalings, and projections used to draw 3D scenes are matrix transformations of vectors.
- Data science and statistics: data sets are frequently stored as matrices, and many methods (like regression, principal component analysis) rely heavily on linear algebra.
- Economics: models of input–output relationships between different sectors of an economy are often written as linear systems.
- Differential equations: linear algebra helps describe and solve systems of differential equations, especially in higher dimensions.
Linear algebra gives concise, powerful ways to express such problems and to analyze their structure.
Structure of this part of the course
This Linear Algebra section of the course is organized into several chapters, each building on the previous ones:
- Vectors and Matrices introduces vectors, matrices, and their basic operations.
- Systems of Linear Equations explains how to use matrices and row operations (Gaussian elimination) to solve many linear equations at once.
- Determinants explores a special number associated with a square matrix that encodes information about invertibility and geometric scaling.
- Eigenvalues and Eigenvectors examines special directions in which a linear transformation acts by simple stretching, and shows how to use them for simplification and diagonalization.
In this introductory chapter, the main goal is orientation: to see linear algebra as the study of linear structures—vectors, matrices, and the transformations between them—and to understand that the later chapters will gradually make these ideas precise and usable.