Uniform Convergence
The problems with pointwise convergence
Definition. If $X$ is a metric
space, and $f_n:X\to \R$ ($n\in\N$) is a sequence of functions, then
$f_n$ converges pointwise to a function $f:X\to\R$ if for every $x\in X$ one has
$\lim_{n\to\infty} f_n(x) = f(x)$.
For example, the sequence of functions $f_n(x) = x^n/n$ converges pointwise
to zero on the interval $X=[-1,1]$, because for each $x\in [-1,1]$ one has
$|x^n/n|\leq 1/n$, and thus
\[
\lim_{n\to\infty} \frac{x^n}{n} = 0.
\]
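As a quick sanity check, the bound $|x^n/n|\leq 1/n$ can be verified numerically on a grid (a minimal sketch using NumPy; the grid size and the values of $n$ are arbitrary choices):

```python
# Numerical check of the bound |x^n/n| <= 1/n on [-1, 1].
import numpy as np

x = np.linspace(-1.0, 1.0, 1001)
for n in [1, 5, 50, 500]:
    sup = np.max(np.abs(x**n / n))     # largest value of |x^n/n| on the grid
    print(n, sup)                      # at most 1/n, attained at x = +-1
```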
The limit of a pointwise convergent sequence of continuous functions does not have to be continuous.
For example, consider $X=[0,1]$ and $f_n(x) = x^n$. Then
\[
\lim_{n\to\infty} f_n(x) = f(x) =
\begin{cases} 0 & (0\leq x\lt 1) \\ 1 & (x=1), \end{cases}
\]
so each $f_n$ is continuous while the limit $f$ is discontinuous at $x=1$.
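One can watch this discontinuity form numerically: for any fixed $x\lt 1$ the powers $x^n$ collapse to $0$, while $x=1$ stays at $1$ (a small sketch; the sample points are arbitrary choices):

```python
# For fixed x < 1 the sequence x^n tends to 0; at x = 1 it is constantly 1.
for x in [0.0, 0.5, 0.9, 0.99, 1.0]:
    print(x, [x**n for n in (10, 100, 1000)])
```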
The derivatives of a pointwise convergent sequence
of functions do not have to converge. For example, take $X=\R$ and $f_n(x) = \frac1n
\sin(n^2x)$. In this case
\[
\lim_{n\to\infty} f_n(x) = 0,
\]
so the pointwise limit function is $f(x) = 0$: the sequence of functions
converges to $0$. What about the derivatives of the sequence? These are
given by
\[
f_n'(x) = n\cos (n^2x),
\]
and for most $x\in\R$ the sequence $n \cos(n^2x)$ is unbounded. The
sequence of derivatives $f_n'(x)$ does not converge pointwise.
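The blow-up of the derivatives is easy to observe at $x=0$, where $f_n(0)=0$ for every $n$ while $f_n'(0)=n$ (a sketch in plain Python):

```python
import math

# f_n(x) = sin(n^2 x)/n tends to 0, but f_n'(x) = n cos(n^2 x) blows up:
# at x = 0 we get f_n(0) = 0 while f_n'(0) = n.
x = 0.0
for n in [1, 10, 100, 1000]:
    fn = math.sin(n**2 * x) / n        # value of f_n at x = 0
    dfn = n * math.cos(n**2 * x)       # value of f_n' at x = 0
    print(n, fn, dfn)
```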
The integrals of a pointwise convergent sequence
of functions do not have to converge.
Consider $X=[0,1]$, $f_n(x) = \frac{2n^2x}{\bigl(1+n^2x^2\bigr)^2}$.
Then
\[
\lim_{n\to\infty} f_n(x) = 0
\]
for all $x\in[0,1]$. But the integrals of $f_n$ over the interval $X$ are
\[
\int_0^1 \frac{2n^2x\,dx}{\bigl(1+n^2x^2\bigr)^2}
\stackrel{u=1+n^2x^2}=
\int_1^{1+n^2} \frac{du}{u^2}
= 1 - \frac1{1+n^2}.
\]
Therefore, even though $\lim_{n\to\infty} f_n(x) = 0$ for all $x\in [0,1]$,
we have
\[
\lim_{n\to\infty} \int_0^1 f_n(x) dx = 1.
\]
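The computation above can be confirmed with a simple midpoint-rule approximation (a sketch; the step count and the values of $n$ are arbitrary choices):

```python
# f_n(x) = 2 n^2 x / (1 + n^2 x^2)^2 tends to 0 pointwise on [0, 1],
# yet its integral over [0, 1] equals 1 - 1/(1 + n^2), which tends to 1.
def f(n, x):
    return 2 * n**2 * x / (1 + n**2 * x**2) ** 2

def midpoint_integral(n, steps=100000):
    # Midpoint rule on [0, 1] with the given number of steps.
    h = 1.0 / steps
    return h * sum(f(n, (k + 0.5) * h) for k in range(steps))

for n in [1, 5, 20]:
    print(n, midpoint_integral(n), 1 - 1 / (1 + n**2))
```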
Uniform convergence
Definition. Let $X$ be a set and let $(Y,d)$ be a metric space. A sequence of functions
$f_n:X\to Y$ converges uniformly to $f:X\to Y$ if for every $\epsilon\gt0$ there is an
$N_\epsilon\in\N$ such that for all $n\geq N_\epsilon$ and all $x\in X$ one has
$d(f_n(x), f(x))\lt \epsilon$.
Uniform convergence implies pointwise
convergence, but not the other way around.
For example, the sequence $f_n(x) = x^n$ from the previous example converges pointwise on the interval $[0,1]$, but it does not converge uniformly on this interval. To prove this we show that the assumption that $f_n$ converges uniformly leads to a contradiction.
If $f_n(x)$ converges uniformly, then the limit function must
be $f(x) = 0$ for $x\in[0,1)$ and $f(1) = 1$. Uniform
convergence implies that for any $\epsilon\gt0$ there is an
$N_\epsilon\in\N$ such that $|x^n- f(x)|\lt \epsilon$ for all
$n\geq N_\epsilon$ and all $x\in [0,1]$. Assuming this holds,
we are free to choose $\epsilon$; in particular, we can
choose $\epsilon=\frac12$. Then there is an $N\in \N$ such that
for all $n\geq N$ and all $x\in[0,1]$ we have $|x^n-f(x)|\lt \frac 12$. We are
also free to choose $n$ and $x$: let us choose $n=N$, and
$x=\bigl(\frac34\bigr)^{1/N}$. Since $0\leq x\lt 1$ we have $f(x)=0$, and thus
\[
|f_N(x) - f(x)| = x^N - 0 = \frac 34 \gt \frac12,
\]
contradicting our assumption.
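The specific point used in the contradiction can be checked numerically: for every $N$, the number $x=(3/4)^{1/N}$ lies in $[0,1)$ and satisfies $x^N = \frac34 \gt \frac12$ (a small sketch; the values of $N$ are arbitrary choices):

```python
# For each N, x = (3/4)**(1/N) lies in [0, 1) and satisfies x**N = 3/4,
# so sup over [0,1] of |x^N - f(x)| never drops below 1/2.
for N in [1, 10, 100, 10000]:
    x = 0.75 ** (1.0 / N)
    print(N, x, x**N)
```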
The uniform metric
Definition—the set of bounded
functions. If $E$ is a set, then a function $f:E\to\R$ is
bounded if there is an $M\in\R$ such that $|f(x)|\leq M$ for all $x\in E$.
We will write $\cB(E)$ for the set of all bounded functions from
$E$ to $\R$.
Definition—the uniform distance between bounded
functions. The uniform distance between two
bounded functions $f, g\in\cB(E)$ is
\[
\du(f, g) = \sup_{x\in E} |f(x) - g(x)|.
\]
Theorem.  $\du(f, g)$ is a metric on $\cB(E)$.
To prove this we have to check the three axioms of a metric:
- $\du(f,g) = \du(g,f)$
- $\du(f,g)\geq 0$ with $\du(f,g) = 0$ if and only if $f=g$
- $\du(f,g) \leq \du(f,h) + \du(h,g)$
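On an infinite set the supremum cannot be computed exhaustively, but the metric axioms can be illustrated on a finite sample of points (a sketch; the functions and the grid are our own arbitrary choices):

```python
def d_u(f, g, points):
    """Uniform distance sup |f(x) - g(x)|, approximated over a finite sample."""
    return max(abs(f(x) - g(x)) for x in points)

pts = [k / 1000 for k in range(1001)]   # sample of E = [0, 1]
f = lambda x: x * x
g = lambda x: x
h = lambda x: 0.5 * x

print(d_u(f, g, pts))                                      # |x^2 - x| peaks at x = 1/2
print(d_u(f, g, pts) == d_u(g, f, pts))                    # symmetry
print(d_u(f, g, pts) <= d_u(f, h, pts) + d_u(h, g, pts))   # triangle inequality
```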
Example. Let $E = [0,1)$ and
consider the sequence of functions $f_n(x) = x^n$. We know that
$f_n(x)\to0$ pointwise on $[0,1)$.
Question: Does the sequence converge uniformly on $[0,1)$?
Answer:
Since uniform
convergence is equivalent to convergence in the uniform
metric, we can answer this question by computing $\du(f_n, f)$
and checking if $\du(f_n, f)\to0$. We have, by definition
\[
\du(f_n, f) = \sup_{0\leq x\lt 1}|x^n - 0|
=\sup_{0\leq x\lt 1} x^n = 1,
\]
because $x^n\to 1$ as $x\nearrow 1$, so the supremum equals $1$ even though it is not attained on $[0,1)$.
Therefore
\[
\lim_{n\to\infty} \du(x^n, 0) = \lim_{n\to\infty} 1 = 1 \neq 0.
\]
The sequence of functions $x^n$ does not converge
uniformly on the interval $[0,1)$.
Question:  Does the same sequence of functions
converge uniformly on the interval $[0,a]$ if $a$ is some number
with $0\lt a \lt 1$?
Answer:  We again compute the uniform distance between
the functions $f_n(x) = x^n$ and $f(x)=0$, but this time on the
interval $[0,a]$ instead of the interval $[0,1)$ that we used in
the previous example. We have
\[
\du(x^n, 0) = \sup_{0\leq x\leq a} |x^n-0| = \sup_{0\leq x\leq a} x^n = a^n.
\]
Since $0\lt a\lt 1 $ we have
\[
\lim_{n\to\infty} \du(f_n, f) = \lim_{n\to\infty} a^n = 0,
\]
so that the sequence of functions $x^n$ does converge uniformly on
the interval $[0,a]$, for any $a\in(0,1)$.
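Both answers can be checked on a grid: the supremum of $x^n$ over $[0,a]$ is $a^n$, attained at $x=a$, and it dies out when $a\lt1$ (a sketch; $a=0.9$ and the values of $n$ are arbitrary choices):

```python
# sup of x^n over [0, a] equals a^n and tends to 0 when 0 < a < 1.
a = 0.9
pts = [a * k / 1000 for k in range(1001)]   # grid on [0, a]
for n in [1, 10, 100, 500]:
    dist = max(x**n for x in pts)           # d_u(x^n, 0) on the sample
    print(n, dist, a**n)                    # x^n is increasing, so max is at x = a
```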
Three consequences of uniform convergence
In the following theorems $E$ is a metric space.
Theorem. Let $f_n:E\to \R$ be a sequence of
functions. If $f_n$ converges uniformly to $f:E\to \R$, and if each $f_n$ is
continuous, then $f$ is also continuous.
This theorem implies that for any uniformly convergent sequence of continuous functions
$f_n:E\to \R$ and any convergent sequence of points $x_k\in E$ one can “switch
limits,” i.e.
\[
\lim_{n\to\infty} \lim_{k\to\infty} f_n(x_k) =
\lim_{k\to\infty} \lim_{n\to\infty} f_n(x_k) .
\]
Theorem.
If $f_n:[a,b]\to\R$ is a sequence of Riemann integrable functions that
converges uniformly to $f:[a,b]\to\R$, then the limit $f$ is also Riemann
integrable and
\[
\lim_{n\to\infty} \int_a^b f_n(x) dx = \int_a^b f(x) dx.
\]
Since $f(x) = \lim_{n\to\infty} f_n(x)$ we can write the conclusion as
\[
\lim_{n\to\infty} \int_a^b f_n(x) dx = \int_a^b \lim_{n\to\infty} f_n(x) dx.
\]
In other words, this theorem justifies switching limits and integration.
Theorem.
Let $f_n:[a,b]\to\R$ be a sequence of differentiable functions whose
derivatives $f_n'$ are continuous. If $f_n$ converges uniformly to $f$ and
$f_n'$ converges uniformly to $g$, then the limit $f$ is differentiable and
its derivative is $f'=g$.
We can rewrite the conclusion as
\[
\lim_{n\to\infty}\frac{d f_n(x)}{dx} =
\frac{d \lim_{n\to\infty}f_n(x)}{dx}.
\]
This theorem justifies switching of limits and derivatives. Note
that the hypotheses of the theorem require that both the sequence
of functions $f_n$ and the sequence of their derivatives $f_n'$
must converge uniformly.
Uniform convergence of series
Pointwise convergence for series. If
$f_n$ is a sequence of functions defined on some set $E$, then we
can consider the partial sums
\[
s_n(x) = f_1(x) + \cdots + f_n(x) = \sum_{k=1}^n f_k(x).
\]
If these converge as $n\to\infty$, and if this happens for every
$x\in E$, then we say that the series converges
pointwise. The sum of the series is
\[
S(x) = \sum_{k=1}^\infty f_k(x)
\stackrel{\sf def}= \lim_{n\to\infty}\sum_{k=1}^n f_k(x)
= \lim_{n\to\infty} s_n(x).
\]
The sum $\sum_1^\infty f_k(x)$ is defined for each $x\in E$ and so
it is a function on $E$.
Uniform convergence of series. A series
$\sum_{k=1}^\infty f_k(x)$ converges uniformly if the sequence of
partial sums $s_n(x) = \sum_{k=1}^n f_k(x)$ converges uniformly.
The Weierstrass M–test. If $f_n:E\to\R$ is a
sequence of functions for which one has a sequence $M_n$ with
\[
|f_n(x)|\leq M_n \text{ for all }x\in E,
\]
and for which
\[
\sum_{n=1}^\infty M_n \lt \infty,
\]
then the series $\sum_{k=1}^\infty f_k(x)$ converges uniformly.
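As an illustration (our own example, not from the text): the series $\sum_k \sin(kx)/k^2$ has $|f_k(x)|\leq M_k = 1/k^2$ with $\sum M_k\lt\infty$, so the M-test gives uniform convergence; the tail bound $\sum_{k\gt n} 1/k^2 \leq 1/n$ holds uniformly in $x$:

```python
import math

# f_k(x) = sin(k x)/k^2 satisfies |f_k(x)| <= M_k = 1/k^2 and sum M_k < oo,
# so the partial sums are uniformly Cauchy:
# |s_200(x) - s_100(x)| <= sum over k = 101..200 of 1/k^2 <= 1/100, for every x.
def partial_sum(n, x):
    return sum(math.sin(k * x) / k**2 for k in range(1, n + 1))

for x in [0.0, 1.0, 2.5, -7.3]:
    gap = abs(partial_sum(200, x) - partial_sum(100, x))
    print(x, partial_sum(100, x), gap)
```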
If a series converges uniformly, then its terms must tend to zero uniformly. To see
this, let $s_n(x) = f_1(x) + \cdots + f_n(x)$ be the $n$th partial
sum of the series, and let $S(x) = \sum_1^\infty f_n(x)$ be the sum of
the series. Since the series converges uniformly, we have
$\lim_{n\to\infty} \du(s_n, S) = 0$. By the triangle inequality we also
have
$\du(s_n, s_{n-1}) \leq \du(s_{n-1}, S) + \du(S, s_n)$,
so that $\du(s_n, s_{n-1}) \to 0$ as $n\to\infty$.
The uniform distance between the two consecutive partial sums $s_{n-1}$
and $s_n$ is
$\du(s_{n-1}, s_n) = \sup_{x\in E} |s_{n-1}(x) - s_n(x)| = \sup_{x\in E} |f_n(x)|$.
It follows that
$\sup_{x\in E} |f_n(x)| \to 0$ as $n\to\infty$.