In this chapter we move from the intuitive idea of a limit (introduced earlier in the Limits and Continuity section) to a precise, formal definition. The formal definition is sometimes called the ε–δ definition (read “epsilon–delta definition”). It is the foundation on which all of differential calculus rests.
We will:
- State the ε–δ definition of a limit.
- Unpack what the symbols mean in plain language.
- Work through several examples of proving limits using the ε–δ definition.
- Mention related formal definitions: one-sided limits and infinite limits.
- Briefly connect formal limits to continuity and derivatives (without re-explaining those topics).
The ε–δ Definition of a Limit
Let $f$ be a function defined on some interval around $a$ (but not necessarily at $a$ itself).
We say
$$\lim_{x \to a} f(x) = L$$
if and only if the following is true:
For every real number $\varepsilon > 0$ there exists a real number $\delta > 0$ such that
whenever $0 < |x - a| < \delta$, we have $|f(x) - L| < \varepsilon$.
Symbolically:
$$\forall \varepsilon > 0 \ \exists \delta > 0 \ \text{such that} \ 0 < |x - a| < \delta \implies |f(x) - L| < \varepsilon.$$
This is the formal or rigorous definition of a limit at a finite point with a finite value.
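The definition can be explored numerically. The sketch below (in Python; the helper `check_epsilon_delta` and its sampling scheme are our own illustrative choices, not part of the definition) samples finitely many points in the punctured δ-neighborhood of $a$, so it can expose a bad choice of δ but can never substitute for a proof.

```python
# Heuristic numerical spot-check of the epsilon-delta condition.
# It samples finitely many x values, so it can refute a bad delta
# but can never substitute for an actual proof.

def check_epsilon_delta(f, a, L, eps, delta, samples=1000):
    """True if |f(x) - L| < eps for every sampled x with 0 < |x - a| < delta."""
    for i in range(1, samples + 1):
        offset = delta * i / (samples + 1)  # offsets strictly between 0 and delta
        for x in (a - offset, a + offset):
            if abs(f(x) - L) >= eps:
                return False
    return True

# For f(x) = 3x + 1 near a = 2, delta = eps / 3 should pass (see Example 1 below):
print(check_epsilon_delta(lambda x: 3 * x + 1, 2, 7, 0.01, 0.01 / 3))  # prints True
```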
Interpreting ε and δ
- $\varepsilon$ (epsilon) measures how close we want $f(x)$ to be to $L$.
- $\delta$ (delta) measures how close we must take $x$ to $a$ to guarantee that closeness of $f(x)$ to $L$.
The structure of the statement:
- Someone challenges you: “Make $f(x)$ within $\varepsilon$ of $L$!”
- You respond: “Fine; if you choose any $\varepsilon > 0$, I can find a $\delta > 0$ so that
whenever $x$ is within $\delta$ of $a$ (but not equal to $a$), then $f(x)$ is within
$\varepsilon$ of $L$.”
To prove a limit using this definition, you must show how to choose such a $\delta$ in terms of $\varepsilon$.
Basic ε–δ Proof Strategy
To prove $\lim_{x \to a} f(x) = L$ using ε–δ:
- Start from what you must prove.
You want: if $0 < |x - a| < \delta$, then $|f(x) - L| < \varepsilon$.
- Algebraically manipulate $|f(x) - L|$.
Try to bound $|f(x) - L|$ by an expression involving $|x - a|$.
- Relate $|f(x) - L|$ to $|x - a|$.
Aim for something like
$$|f(x) - L| \le C |x - a|$$
for some constant $C$, at least when $x$ is close to $a$.
- Solve for a condition on $|x - a|$.
From $|f(x) - L| < \varepsilon$ and your bound, obtain something like
$$|x - a| < \frac{\varepsilon}{C}.$$
- Define $\delta$.
Set
$$\delta = \frac{\varepsilon}{C}$$
(or possibly the minimum of several expressions, to satisfy multiple conditions).
- Write the formal argument.
Begin: “Let $\varepsilon > 0$ be given. Choose $\delta = \dots$” and then show that the implication works.
Example 1: A Simple Linear Function
Prove using ε–δ that
$$\lim_{x \to 2} (3x + 1) = 7.$$
We want: for every $\varepsilon > 0$ there is a $\delta > 0$ such that
$$0 < |x - 2| < \delta \implies |(3x + 1) - 7| < \varepsilon.$$
Informal work (to find δ):
Compute:
$$|(3x + 1) - 7| = |3x - 6| = 3|x - 2|.$$
To make this less than $\varepsilon$, it suffices to have
$$3|x - 2| < \varepsilon \quad \Rightarrow \quad |x - 2| < \frac{\varepsilon}{3}.$$
So we plan to choose
$$\delta = \frac{\varepsilon}{3}.$$
Formal proof:
Let $\varepsilon > 0$ be given. Define
$$\delta = \frac{\varepsilon}{3}.$$
Now suppose $0 < |x - 2| < \delta$. Then
$$
| (3x + 1) - 7 |
= |3x - 6|
= 3|x - 2|
< 3\delta
= 3 \cdot \frac{\varepsilon}{3}
= \varepsilon.
$$
Thus, for every $\varepsilon > 0$ we have found a $\delta > 0$ such that
$0 < |x - 2| < \delta$ implies $|(3x + 1) - 7| < \varepsilon$.
Therefore, by the ε–δ definition,
$$\lim_{x \to 2} (3x + 1) = 7.$$
Note the pattern: linear functions are especially easy because $|f(x) - L|$ is exactly a constant multiple of $|x - a|$.
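Because the bound here is exact ($|f(x) - 7| = 3|x - 2|$), the choice $\delta = \varepsilon/3$ can be spot-checked numerically for any tolerance. A small Python sanity check (finite sampling only, not a proof):

```python
# For f(x) = 3x + 1 at a = 2, the error is exactly 3|x - 2|,
# so delta = eps / 3 works for every eps > 0.

f = lambda x: 3 * x + 1
a, L = 2, 7

for eps in (1.0, 0.1, 1e-4):
    delta = eps / 3
    # probe points just inside the punctured delta-neighborhood of a
    for x in (a - 0.999 * delta, a + 0.999 * delta):
        assert abs(f(x) - L) < eps
print("delta = eps / 3 passed all sampled checks")
```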
Example 2: A Quadratic Function
Prove that
$$\lim_{x \to 1} x^2 = 1.$$
We must show: for every $\varepsilon > 0$ there exists $\delta > 0$ such that
$$0 < |x - 1| < \delta \implies |x^2 - 1| < \varepsilon.$$
Informal work (to discover δ):
Factor:
$$|x^2 - 1| = |x - 1||x + 1|.$$
We want $|x - 1||x + 1| < \varepsilon$.
This is trickier because of the extra factor $|x + 1|$. A common technique is:
- Restrict attention to $x$ sufficiently close to $1$ so that we can bound $|x + 1|$.
- Use that bound to control the product.
For example, suppose we insist that
$$|x - 1| < 1.$$
Then $x$ is between $0$ and $2$, so $x + 1$ is between $1$ and $3$, giving
$$|x + 1| \le 3.$$
Thus, if $|x - 1| < 1$ then
$$|x^2 - 1| = |x - 1||x + 1| \le 3|x - 1|.$$
To make $3|x - 1| < \varepsilon$, it suffices that
$$|x - 1| < \frac{\varepsilon}{3}.$$
But we also needed $|x - 1| < 1$ to bound $|x + 1|$. So we choose
$$\delta = \min\left(1, \frac{\varepsilon}{3}\right).$$
Formal proof:
Let $\varepsilon > 0$ be given. Define
$$\delta = \min\left(1, \frac{\varepsilon}{3}\right).$$
Assume $0 < |x - 1| < \delta$. Then, in particular, $|x - 1| < 1$, so $0 < x < 2$, and hence
$$1 < x + 1 < 3 \quad \Rightarrow \quad |x + 1| \le 3.$$
Now
$$
| x^2 - 1 |
= |x - 1||x + 1|
\le 3|x - 1|
< 3\delta
\le 3 \cdot \frac{\varepsilon}{3}
= \varepsilon.
$$
Therefore, the ε–δ condition holds, and
$$\lim_{x \to 1} x^2 = 1.$$
This example illustrates a common move: first bound $x$ itself (by forcing $|x - a|$ to be small), then use that bound to control more complicated expressions like $|x + 1|$ or $|x^2 + x + 1|$.
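The two-part choice of δ can likewise be sanity-checked numerically. The sketch below (a finite sample, so corroboration rather than proof) probes points just inside the punctured neighborhood for several tolerances:

```python
# Spot-check of delta = min(1, eps / 3) for lim_{x -> 1} x^2 = 1.
# Finite sampling can only corroborate the proof above, not replace it.

def delta_for(eps):
    return min(1.0, eps / 3)

for eps in (2.0, 0.5, 1e-3):
    delta = delta_for(eps)
    # probe points just inside the punctured delta-neighborhood of a = 1
    for x in (1 - 0.999 * delta, 1 + 0.999 * delta):
        assert abs(x**2 - 1) < eps
print("delta = min(1, eps / 3) passed all sampled checks")
```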
Example 3: A Root Function
Prove that
$$\lim_{x \to 4} \sqrt{x} = 2.$$
We must show: for every $\varepsilon > 0$ there exists $\delta > 0$ such that
$$0 < |x - 4| < \delta \implies |\sqrt{x} - 2| < \varepsilon.$$
Informal work:
We start with
$$|\sqrt{x} - 2|.$$
Multiply numerator and denominator by the conjugate:
$$
| \sqrt{x} - 2 |
= \left|\frac{(\sqrt{x} - 2)(\sqrt{x} + 2)}{\sqrt{x} + 2}\right|
= \left|\frac{x - 4}{\sqrt{x} + 2}\right|
= \frac{|x - 4|}{|\sqrt{x} + 2|}.
$$
We want to make this less than $\varepsilon$. That is,
$$\frac{|x - 4|}{|\sqrt{x} + 2|} < \varepsilon.$$
So it would be enough to have both:
1. $|\sqrt{x} + 2|$ bounded below by some positive constant $m$, and
2. $|x - 4| < m\varepsilon$.
For (1), note that $\sqrt{x}$ is close to $2$ when $x$ is close to $4$. For example, if we require
$$|x - 4| < 1,$$
then $3 < x < 5$, so $\sqrt{x}$ lies between $\sqrt{3}$ and $\sqrt{5}$. Both are greater than $1$, so
$$\sqrt{x} + 2 > 3 \quad \Rightarrow \quad |\sqrt{x} + 2| \ge 3.$$
So we have
$$|\sqrt{x} - 2| = \frac{|x - 4|}{|\sqrt{x} + 2|} \le \frac{|x - 4|}{3}.$$
To ensure $\frac{|x - 4|}{3} < \varepsilon$, it suffices that
$$|x - 4| < 3\varepsilon.$$
We also need $|x - 4| < 1$ to secure the lower bound on $\sqrt{x} + 2$. So we can choose
$$\delta = \min(1, 3\varepsilon).$$
Formal proof:
Let $\varepsilon > 0$ be given. Define
$$\delta = \min(1, 3\varepsilon).$$
Suppose $0 < |x - 4| < \delta$. Then $|x - 4| < 1$, so $3 < x < 5$. Thus $\sqrt{x} > \sqrt{3} > 1$, hence
$$\sqrt{x} + 2 > 3 \quad \Rightarrow \quad |\sqrt{x} + 2| \ge 3.$$
Now compute:
$$
| \sqrt{x} - 2 |
= \frac{|x - 4|}{|\sqrt{x} + 2|}
\le \frac{|x - 4|}{3}
< \frac{\delta}{3}
\le \frac{3\varepsilon}{3}
= \varepsilon.
$$
Therefore, for every $\varepsilon > 0$ there exists such a $\delta > 0$, so
$$\lim_{x \to 4} \sqrt{x} = 2.$$
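The same kind of numerical sanity check applies here (finite sampling; illustrative only, not a substitute for the proof):

```python
import math

# Spot-check of delta = min(1, 3 * eps) for lim_{x -> 4} sqrt(x) = 2.

for eps in (1.0, 0.1, 1e-4):
    delta = min(1.0, 3 * eps)
    # probe points just inside the punctured delta-neighborhood of a = 4
    for x in (4 - 0.999 * delta, 4 + 0.999 * delta):
        assert abs(math.sqrt(x) - 2) < eps
print("delta = min(1, 3 * eps) passed all sampled checks")
```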
When a Limit Does Not Exist (Formal View)
The statement “$\lim_{x \to a} f(x)$ exists” means there is some real number $L$ such that the ε–δ condition holds.
Saying the limit does not exist means:
There is no real number $L$ such that
for every $\varepsilon > 0$ there exists $\delta > 0$ with
$0 < |x - a| < \delta \implies |f(x) - L| < \varepsilon$.
Formally, the negation of the limit definition is:
For every $L \in \mathbb{R}$ there exists $\varepsilon > 0$ such that
for every $\delta > 0$ there exists $x$ with $0 < |x - a| < \delta$ and $|f(x) - L| \ge \varepsilon$.
This is often difficult to use directly, but it underlies rigorous non-existence proofs.
Typical reasons a limit fails to exist at $a$:
- The two one-sided limits are different.
- The function oscillates too wildly near $a$ (for example, $f(x) = \sin(1/x)$ as $x \to 0$).
- The function grows without bound (infinite limit; see below for separate formalization).
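The oscillation case can be made concrete: for $f(x) = \sin(1/x)$, the points $x_n = 1/(\pi/2 + 2\pi n)$ and $y_n = 1/(3\pi/2 + 2\pi n)$ both approach $0$, yet $f(x_n) = 1$ and $f(y_n) = -1$, so no single $L$ can be within $\varepsilon = \tfrac12$ of both. A short Python illustration (numerical, up to floating-point error):

```python
import math

# sin(1/x) attains both 1 and -1 in every punctured neighborhood of 0,
# so no candidate L satisfies the definition with eps = 1/2.

f = lambda x: math.sin(1 / x)

for n in (10, 100, 1000):
    x_n = 1 / (math.pi / 2 + 2 * math.pi * n)      # f(x_n) should be 1
    y_n = 1 / (3 * math.pi / 2 + 2 * math.pi * n)  # f(y_n) should be -1
    assert abs(x_n) < 1 / n and abs(y_n) < 1 / n   # both approach 0
    assert abs(f(x_n) - 1) < 1e-9
    assert abs(f(y_n) + 1) < 1e-9
print("sin(1/x) hits both 1 and -1 arbitrarily close to 0")
```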
One-Sided Limits: Formal Definitions
A left-hand limit at $a$:
$$\lim_{x \to a^-} f(x) = L$$
means: for every $\varepsilon > 0$ there exists $\delta > 0$ such that
$$0 < a - x < \delta \implies |f(x) - L| < \varepsilon.$$
Equivalently, $0 < |x - a| < \delta$ but with $x < a$.
A right-hand limit at $a$:
$$\lim_{x \to a^+} f(x) = L$$
means: for every $\varepsilon > 0$ there exists $\delta > 0$ such that
$$0 < x - a < \delta \implies |f(x) - L| < \varepsilon.$$
Equivalently, $0 < |x - a| < \delta$ but with $x > a$.
These definitions parallel the two-sided case, but restrict $x$ to one side of $a$.
A standard result (stated without proof here) is:
- The (two-sided) limit $\lim_{x \to a} f(x)$ exists and equals $L$
if and only if both one-sided limits exist and are equal to $L$:
$$
\lim_{x \to a^-} f(x) = L \quad \text{and} \quad \lim_{x \to a^+} f(x) = L.
$$
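A standard illustration of this result is the sign function $f(x) = |x|/x$ (for $x \ne 0$): its one-sided limits at $0$ are $-1$ and $+1$, which differ, so the two-sided limit does not exist. A quick numerical look:

```python
# f(x) = |x| / x equals +1 for x > 0 and -1 for x < 0 (undefined at 0,
# which the limit definition never probes), so the one-sided limits at 0
# are +1 and -1, and the two-sided limit does not exist.

def f(x):
    return abs(x) / x

right = [f(10.0 ** -k) for k in range(1, 8)]    # approach 0 from the right
left = [f(-(10.0 ** -k)) for k in range(1, 8)]  # approach 0 from the left
assert all(v == 1.0 for v in right)
assert all(v == -1.0 for v in left)
print("right-hand values:", set(right), "left-hand values:", set(left))
```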
Infinite Limits and Limits at Infinity (Formal Versions)
The ε–δ definition handles finite limits at finite points. We now state analogous formal definitions for:
- Limits where the function grows without bound (infinite limits).
- Limits where $x$ goes to infinity (limits at infinity).
These will be used more extensively in later calculus work.
Infinite Limits at a Finite Point
We write
$$\lim_{x \to a} f(x) = +\infty$$
to mean:
For every $M > 0$ there exists $\delta > 0$ such that
$0 < |x - a| < \delta \implies f(x) > M.$
Similarly,
$$\lim_{x \to a} f(x) = -\infty$$
means:
For every $N < 0$ there exists $\delta > 0$ such that
$0 < |x - a| < \delta \implies f(x) < N.$
Note the similarity to the ε–δ definition, but with $\varepsilon$ replaced by large positive bounds $M$ or large negative bounds $N$.
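For instance, for $\lim_{x \to 0} 1/x^2 = +\infty$, given $M > 0$ one can take $\delta = 1/\sqrt{M}$: then $0 < |x| < \delta$ gives $x^2 < 1/M$, hence $1/x^2 > M$. A quick numerical check of this choice (finite sampling; illustrative only):

```python
import math

# Check delta = 1 / sqrt(M) for lim_{x -> 0} 1/x^2 = +infinity.

f = lambda x: 1 / x**2

for M in (10.0, 1e4, 1e8):
    delta = 1 / math.sqrt(M)
    # probe points just inside the punctured delta-neighborhood of 0
    for x in (-0.999 * delta, 0.999 * delta):
        assert f(x) > M
print("delta = 1/sqrt(M) passed all sampled checks")
```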
Limits at Infinity
We write
$$\lim_{x \to \infty} f(x) = L$$
to mean:
For every $\varepsilon > 0$ there exists $K > 0$ such that
$x > K \implies |f(x) - L| < \varepsilon.$
So, if we go far enough to the right on the $x$-axis (beyond $K$), $f(x)$ stays within $\varepsilon$ of $L$.
Similarly,
$$\lim_{x \to -\infty} f(x) = L$$
means:
For every $\varepsilon > 0$ there exists $K < 0$ such that
$x < K \implies |f(x) - L| < \varepsilon.$
We also have infinite limits at infinity:
- $\lim_{x \to \infty} f(x) = +\infty$ means:
for every $M > 0$ there exists $K > 0$ such that $x > K \implies f(x) > M$.
- $\lim_{x \to \infty} f(x) = -\infty$ means:
for every $N < 0$ there exists $K > 0$ such that $x > K \implies f(x) < N$.
These formal definitions are parallel to the original ε–δ definition but adapted to different kinds of limiting behavior.
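For example, for $\lim_{x \to \infty} 1/x = 0$, given $\varepsilon > 0$ the threshold $K = 1/\varepsilon$ works, since $x > 1/\varepsilon > 0$ implies $|1/x - 0| = 1/x < \varepsilon$. A numerical check of this threshold (finite sampling; illustrative only):

```python
# Check K = 1 / eps for lim_{x -> infinity} 1/x = 0.

f = lambda x: 1 / x

for eps in (0.1, 1e-3, 1e-6):
    K = 1 / eps
    # probe points beyond the threshold K
    for x in (1.001 * K, 10 * K, 1000 * K):
        assert abs(f(x) - 0) < eps
print("K = 1/eps passed all sampled checks")
```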
Limit Laws in the Formal Setting (Statement Only)
Using the ε–δ definition, one can prove various limit laws rigorously:
If $\lim_{x \to a} f(x) = L$ and $\lim_{x \to a} g(x) = M$, then (under suitable conditions):
- $\lim_{x \to a} (f(x) + g(x)) = L + M$,
- $\lim_{x \to a} (f(x)g(x)) = LM$,
- $\lim_{x \to a} \big(c f(x)\big) = cL$ for constant $c$,
- $\lim_{x \to a} \dfrac{f(x)}{g(x)} = \dfrac{L}{M}$ if $M \ne 0$,
and so on.
The proofs of these laws all proceed by starting from the ε–δ definition and constructing suitable $\delta$’s from the given information. In practice, these laws let you avoid doing ε–δ proofs from scratch for every new function.
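As a taste of how such proofs go, here is a sketch of the sum law, using the standard “ε/2 trick.” Given $\varepsilon > 0$, the hypotheses provide $\delta_1 > 0$ and $\delta_2 > 0$ such that
$$0 < |x - a| < \delta_1 \implies |f(x) - L| < \frac{\varepsilon}{2} \quad \text{and} \quad 0 < |x - a| < \delta_2 \implies |g(x) - M| < \frac{\varepsilon}{2}.$$
Choosing $\delta = \min(\delta_1, \delta_2)$, for $0 < |x - a| < \delta$ the triangle inequality gives
$$|(f(x) + g(x)) - (L + M)| \le |f(x) - L| + |g(x) - M| < \frac{\varepsilon}{2} + \frac{\varepsilon}{2} = \varepsilon.$$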
Connection to Continuity and Derivatives
The formal definition of limit underpins other key definitions:
- A function is continuous at $a$ precisely when
$$\lim_{x \to a} f(x) = f(a).$$
This uses the same ε–δ idea, replacing $L$ with $f(a)$.
- The derivative of $f$ at $a$ is defined as a limit:
$$f'(a) = \lim_{h \to 0} \frac{f(a + h) - f(a)}{h},$$
whose existence is understood via the ε–δ definition.
Thus, everything in differential calculus—continuity, derivatives, rules of differentiation—ultimately rests on the formal notion of limit described in this chapter.
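As a concrete instance, for $f(x) = x^2$ at $a = 3$ the difference quotient simplifies to $6 + h$, so it approaches $f'(3) = 6$, and in fact $\delta = \varepsilon$ works in the underlying limit definition. A short numerical illustration:

```python
# For f(x) = x^2 at a = 3, (f(3 + h) - f(3)) / h = 6 + h, which tends
# to f'(3) = 6 as h -> 0 at exactly the rate |h|.

f = lambda x: x**2

def diff_quotient(a, h):
    return (f(a + h) - f(a)) / h

for h in (0.1, -0.01, 1e-5):
    # error equals |h| up to floating-point roundoff
    assert abs(diff_quotient(3, h) - 6) <= abs(h) + 1e-9
print("difference quotients approach 6 as h -> 0")
```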