Combinatorics is the study of how to count systematically. Instead of listing all possibilities by hand, combinatorics gives general principles and formulas that let you count quickly and reliably, even when the numbers involved are huge.
In this chapter, the focus is on:
- Understanding what it means to “count” in mathematics.
- Learning basic counting principles.
- Seeing how these principles prepare you for more specific topics like permutations and combinations, which have their own chapters.
You do not need any probability or advanced algebra here; just careful thinking and some basic arithmetic.
What “Counting” Means in Combinatorics
In everyday life, counting often means going 1, 2, 3, 4, … to find how many objects there are.
In combinatorics, “counting” usually means:
- Given some rules or conditions,
- How many different objects, arrangements, or choices satisfy those rules?
Some typical questions:
- How many 3-digit codes can be formed using the digits 0–9?
- How many different ways can students be lined up?
- How many different pizzas can you order if you choose from a list of toppings?
We are not usually interested in listing all possibilities, only in finding the total number.
A key part of combinatorics is deciding what counts as “different”:
- “AB” and “BA” might be considered different if order matters (e.g., two seats in a row).
- They might be considered the same if order does not matter (e.g., a team made of Alice and Bob).
This distinction—whether order matters—is central and will be explored further in the later chapters on permutations and combinations.
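The order-matters distinction can be seen concretely with Python's standard `itertools` module, which provides both ordered and unordered selections. This is just an illustrative sketch with three hypothetical people:

```python
from itertools import combinations, permutations

people = ["Alice", "Bob", "Carol"]

# Order matters: ("Alice", "Bob") and ("Bob", "Alice") are different seatings.
ordered = list(permutations(people, 2))

# Order does not matter: the team {Alice, Bob} is one outcome, however listed.
unordered = list(combinations(people, 2))

print(len(ordered))    # 6 ordered pairs
print(len(unordered))  # 3 unordered pairs
```

The same three people give six outcomes when order matters but only three when it does not, which previews the difference between permutations and combinations.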
Counting Problems as Structured Choices
Combinatorial problems are often easier if you see them as a sequence of choices.
A typical problem:
- You are choosing a password.
- Step 1: Choose the first character.
- Step 2: Choose the second character.
- …
- Each step has some number of allowed options.
If you can identify:
- How many choices you have at each step,
- Whether choices overlap or depend on each other,
then you can often apply simple counting rules.
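One way to make "a sequence of choices" concrete is to enumerate every outcome with one loop per step. As a minimal sketch, assume a hypothetical 3-symbol alphabet and 2-character passwords:

```python
alphabet = ["a", "b", "c"]  # hypothetical alphabet: 3 allowed characters

passwords = []
for first in alphabet:        # step 1: choose the first character
    for second in alphabet:   # step 2: choose the second character
        passwords.append(first + second)

print(len(passwords))  # 9 = 3 choices for step 1 * 3 choices for step 2
```

Listing outcomes like this quickly becomes impractical as the numbers grow, which is exactly why the counting rules below matter: they give the total without the enumeration.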
The main basic tools are:
- The addition principle (sometimes called the rule of sum).
- The multiplication principle (sometimes called the rule of product).
These two principles are the backbone of elementary combinatorics.
The Addition Principle
The addition principle applies when you must choose exactly one of several alternatives, and the alternatives do not overlap.
Informally:
- If one task can be done in $m$ ways,
- and a different, non-overlapping task can be done in $n$ ways,
- and you must pick one task or the other,
- then there are $m + n$ possible choices.
More formally:
If you have two sets $A$ and $B$ of possible outcomes with no elements in common (they are disjoint), then
$$
|A \cup B| = |A| + |B|.
$$
In words: the number of elements in “$A$ or $B$” is the sum of the numbers in $A$ and in $B$, provided that no element belongs to both.
This idea applies beyond two sets. For pairwise disjoint sets $A_1, A_2, \dots, A_k$:
$$
\left|\bigcup_{i=1}^k A_i\right| = \sum_{i=1}^k |A_i|.
$$
You use the addition principle whenever a problem breaks into cases that:
- Cannot happen at the same time, and
- Cover all possibilities you care about.
If there is overlap, a simple $m+n$ risks double-counting. Handling overlap (for example, with inclusion–exclusion) is a more advanced topic and not the focus of this introductory chapter.
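The addition principle can be checked directly with sets. In this sketch (with a hypothetical dessert menu), the `isdisjoint` check makes the no-overlap requirement explicit:

```python
# Two disjoint sets of outcomes: pick a cake OR a pie, never both at once.
cakes = {"chocolate", "carrot", "lemon"}        # |A| = 3 ways
pies = {"apple", "cherry", "pecan", "pumpkin"}  # |B| = 4 ways

# The addition principle requires the alternatives to be disjoint.
assert cakes.isdisjoint(pies)

# |A ∪ B| = |A| + |B| for disjoint sets.
print(len(cakes | pies))  # 7 = 3 + 4
```

If the two sets shared an element, `len(cakes | pies)` would be smaller than the sum of the sizes, which is exactly the double-counting the disjointness condition rules out.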
The Multiplication Principle
The multiplication principle applies to sequences of choices where each choice has a certain number of options, and each full outcome is formed by making all of the choices.
Informally:
- If you have $m$ ways to make the first choice,
- and for each first choice you have $n$ ways to make the second choice,
- then there are $m \times n$ possible pairs of choices.
In general, if a process involves $k$ steps, and:
- Step 1 can be done in $n_1$ ways,
- Step 2 can be done in $n_2$ ways (the specific options may depend on the choice made at step 1, but their number is always $n_2$),
- …
- Step $k$ can be done in $n_k$ ways,
then the total number of distinct outcomes is:
$$
n_1 \cdot n_2 \cdots n_k.
$$
You can think of this as counting the number of elements in a Cartesian product $A_1 \times A_2 \times \dots \times A_k$, where $|A_i| = n_i$:
$$
|A_1 \times A_2 \times \dots \times A_k| = n_1 n_2 \cdots n_k.
$$
You use the multiplication principle whenever an outcome is formed by a sequence of choices in which the number of options at each step does not depend on the earlier choices (even if the specific options do).
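The Cartesian-product view corresponds directly to `itertools.product` in Python's standard library. This sketch counts the 3-digit codes from the earlier question, where each of the three positions allows any of the digits 0–9:

```python
from itertools import product

digits = "0123456789"  # 10 options per position

# Each code is an element of digits x digits x digits.
codes = list(product(digits, repeat=3))

print(len(codes))  # 1000 = 10 * 10 * 10
```

Note that `product` generates all $10^3$ tuples; the multiplication principle lets you get the count $1000$ without generating anything.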
Combining Addition and Multiplication Principles
Many realistic counting problems require using both principles, often in multiple stages.
General patterns:
- Break a problem into cases (use addition).
- Within each case, analyze steps or positions (use multiplication).
For example, you might count possible objects of “type 1” and of “type 2” separately using the multiplication principle, then add these counts.
The ability to switch between these viewpoints—cases versus steps—is central to combinatorial thinking.
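The two-stage pattern can be sketched in code. Assume a hypothetical two-format label scheme: a label is either a letter followed by a digit, or a digit followed by a letter. Multiplication counts each case; addition combines the disjoint cases:

```python
from itertools import product

letters = "ABC"  # hypothetical: 3 allowed letters
digits = "12"    # hypothetical: 2 allowed digits

# Case 1: letter then digit -> 3 * 2 outcomes (multiplication within the case).
case1 = {l + d for l, d in product(letters, digits)}

# Case 2: digit then letter -> 2 * 3 outcomes.
case2 = {d + l for d, l in product(digits, letters)}

# The cases are disjoint (a label's first character reveals its case),
# so the addition principle applies to the case counts.
assert case1.isdisjoint(case2)
print(len(case1 | case2))  # 12 = 3*2 + 2*3
```

The structure of the computation mirrors the analysis: products within cases, a sum across cases.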
Counting with Restrictions
Real combinatorics problems almost always involve restrictions:
- Some choices may be forbidden.
- Some outcomes may be considered the same.
- There may be requirements like “at least one”, “exactly one”, “no repetitions”, and so on.
A common strategy is:
- Count without restrictions (often easier).
- Subtract forbidden cases (or split into allowed cases and use addition).
- Adjust the count according to the rules.
For example:
- “At least one” type of condition is often handled by counting “all” and then subtracting the “none” case.
- “Exactly one” or “at most one” is often handled by breaking into cases with the addition principle.
Thinking clearly about restrictions is essential preparation for the specific topics of permutations and combinations, where restrictions like “no repetition” and “order doesn’t matter” are central.
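The "count all, subtract none" strategy can be verified by brute force on a small example. This sketch counts 3-character strings over the two-symbol alphabet {a, b} that contain at least one "a":

```python
from itertools import product

# All 3-character strings over {"a", "b"}: 2 * 2 * 2 = 8 by multiplication.
strings = ["".join(s) for s in product("ab", repeat=3)]

# "At least one 'a'" = all strings minus the "no 'a' at all" case.
with_a = [s for s in strings if "a" in s]
without_a = [s for s in strings if "a" not in s]

print(len(strings))    # 8
print(len(without_a))  # 1 (only "bbb")
print(len(with_a))     # 7 = 8 - 1
```

Counting the complement ("none") is a single easy case here, whereas counting "at least one" directly would mean splitting into "exactly one", "exactly two", and "exactly three" and adding.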
Factorials as a Counting Tool
A basic object that appears repeatedly in combinatorics is the factorial.
For a positive integer $n$, the factorial of $n$, written $n!$, is defined by
$$
n! = n \cdot (n-1) \cdot (n-2) \cdots 2 \cdot 1,
$$
with the special definition
$$
0! = 1.
$$
Factorials naturally arise when counting arrangements of distinct objects. For example, the number of ways to order $n$ distinct items in a row is $n!$.
Understanding factorials is important because they appear in the formulas for permutations and combinations in the next chapters.
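Both the factorial and the arrangement count it describes are available in Python's standard library, so the claim "the number of orderings of $n$ distinct items is $n!$" can be checked directly:

```python
from itertools import permutations
from math import factorial

items = ["red", "green", "blue", "yellow"]  # 4 distinct items

# Count the orderings directly, then compare with 4!.
orderings = list(permutations(items))

print(len(orderings))  # 24
print(factorial(4))    # 24
print(factorial(0))    # 1, by the special definition 0! = 1
```

Enumerating all orderings stops being feasible quickly (already $10! = 3{,}628{,}800$), which is why the closed-form $n!$ is the tool you actually use.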
Combinatorial Thinking and Problem Strategy
Beyond specific formulas, combinatorics develops a way of thinking:
- Identify precisely what counts as a single outcome.
- Decide whether order matters.
- View outcomes as:
- A sequence of independent choices (for multiplication), or
- A collection of mutually exclusive cases (for addition).
- Watch out for:
- Double-counting the same outcome in more than one way.
- Forgetting special cases.
- Hidden restrictions (like “no repetition”).
As problems grow more complex, these habits prepare you for more specialized tools:
- Permutations: counting ordered arrangements.
- Combinations: counting unordered selections.
- More advanced methods (not detailed here), such as:
- The inclusion–exclusion principle.
- The pigeonhole principle.
- Recurrence relations and generating functions.
Each of these is built on the basic ideas introduced in this chapter: counting carefully with clear rules, and using the addition and multiplication principles as fundamental tools.