The Most Important Proofs in Prelims Analysis


This is a completely objective list. My hope is that if I have these memorised, then I can sort of glue them together to work out other proofs.

Analysis I

Monotone sequence theorem

Scenic viewpoints theorem

Bolzano-Weierstrass theorem

  • Follows directly from the scenic viewpoints theorem and the monotone sequence theorem put together.

Convergent iff Cauchy

  • Sort of fiddly; for the Cauchy implies convergent direction you know the sequence is bounded, so it has a convergent subsequence by Bolzano-Weierstrass. Then you can mess around with the inequality to show the whole sequence converges to the subsequence’s limit.
  • For the convergent implies Cauchy direction, you just do the “plus L, minus L” trick.
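
Written out, the “plus L, minus L” trick is just the triangle inequality (my reconstruction of the key step):

```latex
% Convergent implies Cauchy: suppose x_n -> L. Given eps > 0,
% pick N such that |x_n - L| < eps/2 for all n >= N.
% Then for all m, n >= N:
\begin{aligned}
\vert x_n - x_m \vert
&= \vert (x_n - L) + (L - x_m) \vert \\
&\le \vert x_n - L \vert + \vert x_m - L \vert \\
&< \tfrac \varepsilon 2 + \tfrac \varepsilon 2 = \varepsilon .
\end{aligned}
```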

Limit form of comparison test

  • First prove the normal form of the comparison test, which follows from the partial sums $s _ n = \sum^n _ {k=1} a _ k$ forming a monotone increasing sequence (since $a _ k \ge 0$)
  • Then consider what $\frac{a _ k}{b _ k} \to L$ tells you by taking $\varepsilon = L/2$ and use the regular comparison test for both directions.
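
Concretely, taking $\varepsilon = L/2$ (assuming $L > 0$ and $b _ k > 0$) gives, for all large enough $k$:

```latex
\left\vert \frac{a_k}{b_k} - L \right\vert < \frac L 2
\quad \Longrightarrow \quad
\frac L 2 \, b_k < a_k < \frac{3L}{2} \, b_k ,
```

so each series is dominated by a constant multiple of the other, and the regular comparison test applies in both directions.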

Alternating series test

  • Group terms together in different ways in order to see that both the $s _ {2n}$ and $s _ {2n+1}$ series of partial sums are convergent by the monotone sequence theorem.
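
With the convention $s _ n = \sum^n _ {k=1} (-1)^{k+1} a _ k$ for decreasing $a _ k \ge 0$ (the notes may index differently), the groupings are:

```latex
s_{2n+2} - s_{2n} = a_{2n+1} - a_{2n+2} \ge 0,
\qquad
s_{2n+1} - s_{2n-1} = a_{2n+1} - a_{2n} \le 0,
```

so the even partial sums increase, the odd ones decrease, and $s _ {2n} \le s _ {2n+1}$ bounds each by the other.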

Ratio test

  • Need to handle three cases separately ($0 \le L < 1$, $1 < L < \infty$, and $L = \infty$), but in each we compare with a geometric series. In the case where $0 \le L < 1$, this is $\alpha = \frac{1+L}{2}$ with $\varepsilon = \alpha - L$; in the case where $1 < L < \infty$, it’s also $\alpha = \frac{1+L}{2}$; and in the $L = \infty$ case it’s $\alpha = 2$.
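
In the $0 \le L < 1$ case, for instance, the comparison looks like this (my sketch):

```latex
\left\vert \frac{a_{k+1}}{a_k} \right\vert < L + \varepsilon = \alpha < 1
\quad \text{for all } k \ge N
\quad \Longrightarrow \quad
\vert a_k \vert \le \vert a_N \vert \, \alpha^{k - N},
```

so $\sum \vert a _ k \vert$ converges by comparison with a geometric series of ratio $\alpha$.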

Integral test

  • Show that $\sigma _ n$ is bounded below and decreasing, by adding up all the inequalities like $f(n) \le \int^n _ {n-1} f(x) \text d x \le f(n-1)$ to get $0 \le \sigma _ n \le f(1)$ and then mess with $\sigma _ {n+1} - \sigma _ n$ to show it’s negative.
  • Then the other result follows from the AOL.
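
Writing $\sigma _ n = \sum^n _ {k=1} f(k) - \int^n _ 1 f(x) \text d x$ (assuming that’s the definition used), summing the displayed inequality over $k = 2, \dots, n$ gives:

```latex
0 \le f(n) \le \sigma_n \le f(1),
\qquad
\sigma_{n+1} - \sigma_n = f(n+1) - \int_n^{n+1} f(x)\,\mathrm d x \le 0 .
```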

Series converge only within their radius of convergence

  • The proof has two parts: absolute convergence within the radius of convergence, and then divergence outside it
  • Absolute convergence within the radius of convergence: given $\vert z \vert = S$ less than the radius of convergence $R$, let $\varepsilon = R - S$ to find a $\rho$ with $S < \rho \le R$ where the series converges absolutely. Then use the comparison test.
  • Divergence outside radius of convergence: use contradiction. If the series converged at some $z$ with $\vert z \vert > R$, the terms being summed would be bounded by some $M$, and then you can find some $\rho$ with $R < \rho < \vert z \vert $ where $0 \le \vert c _ k \rho^k \vert = \vert c _ k z^k \vert \left \vert \frac \rho z \right \vert ^k \le M \left \vert \frac \rho z\right \vert ^k$, which is summable as it’s a geometric series with common ratio less than $1$. So the series converges absolutely at $\rho > R$, a contradiction.
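
For the convergence part, the comparison in question is just (assuming absolute convergence at $\rho$):

```latex
\vert z \vert = S < \rho
\quad \Longrightarrow \quad
\vert c_k z^k \vert \le \vert c_k \rho^k \vert ,
```

so $\sum \vert c _ k z^k \vert$ converges by comparison with $\sum \vert c _ k \rho^k \vert$.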

Analysis II

Limit points via sequences

  • [[Notes - Analysis II HT23, Limit points]]
  • To get a sequence from the condition, take the $x _ n$ given by letting $\varepsilon = \frac{1}{n}$.
  • To get the condition from the sequence, let $\varepsilon$ be arbitrary; then since $x _ n \to p$, $\exists N$ such that $0 < \vert x _ N - p \vert < \varepsilon$.
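
Spelled out, the first direction builds the sequence term by term:

```latex
\varepsilon = \tfrac 1 n
\quad \Longrightarrow \quad
\exists \, x_n \ \text{with} \ 0 < \vert x_n - p \vert < \tfrac 1 n ,
```

so $x _ n \to p$ with $x _ n \ne p$ for every $n$.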

Function limits via sequences

AOL for functions

Limits of composition of functions

Boundedness theorem

Intermediate value theorem

Continuous inverse function theorem

Continuity on closed bounded intervals implies uniform continuity

Uniform limit of continuous functions is continuous

Weierstrass’ M-test

Uniform convergence of power series within their radius of convergence

Equivalence of two definitions of differentiation

Fermat’s theorem

Rolle’s theorem

(Generalised) Mean value theorem

Taylor’s theorem

L’Hopital’s Rule

  • [[Notes - Analysis II HT23, L’Hôpital’s Rule]]
  • Proofs in the lecture notes aren’t very detailed
  • $\frac 0 0$ case involves showing that $g(x) = g(x) - g(a)$ is nonzero for $x$ near $a$ (since $g(a) = 0$) and then using Cauchy’s mean value theorem.
  • $\frac \infty \infty$ case involves some very dodgy steps that I can’t really justify, but all stems from the fact that for some reason, you can find a $\delta’ \in (0, \delta)$ such that $\left \vert \frac{f(x) - f(c)}{g(x) - g(c)} - L\right \vert < \varepsilon$. Then some rearranging (which might actually be wrong in the lecture notes, or at least misleading) gives the required result.

Real binomial theorem

  • [[Notes - Analysis II HT23, Binomial theorem]]
  • Show that the quotient $F(x) = \frac{\sum^\infty _ {k=0} {p \choose k} x^k}{(1+x)^p}$ has zero derivative and so this function is constant, using the fact that for both $f(x) = (1+x)^p$ and $g(x) = \sum^\infty _ {k=0} {p \choose k} x^k$, we have $(1+x)f’(x) = pf(x)$.
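
The derivative computation, using both instances of the differential equation (for $\vert x \vert < 1$, where $f(x) \ne 0$):

```latex
F'(x)
= \frac{g'(x) f(x) - g(x) f'(x)}{f(x)^2}
= \frac{1}{f(x)^2}
  \left( \frac{p \, g(x)}{1+x} \, f(x) - g(x) \, \frac{p \, f(x)}{1+x} \right)
= 0 ,
```

so $F$ is constant, and $F(0) = 1$ gives $F \equiv 1$.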

Analysis III

Definition of integrability in terms of epsilon

  • [[Notes - Analysis III TT23, Step functions and basic definitions]]
  • Forward direction follows from the supremum and infimum approximation property
  • Backwards direction follows from considering the supremum and infimum as bounds and then applying this to $I(\phi _ +) - I(\phi _ -) < \varepsilon$ to show that they can be squeezed arbitrarily close together.

If integrable on an interval, integrable on parts of that interval

  • [[Notes - Analysis III TT23, Basic theorems about the integral]]
  • Consider that a majorant for the whole interval is the same as two “sub-majorants” juxtaposed (under some assumptions that we can take without loss of generality)
  • Use the fact that $x + y = x’ + y’$, $x \le x’$ and $y \le y’$ together imply $x = x’$ and $y = y’$, applied to the suprema and infima.

Linearity of integration

Any continuous function is integrable (on a closed interval)

  • [[Notes - Analysis III TT23, Basic theorems about the integral]]
  • Heavily uses the fact that continuity on a closed interval implies uniform continuity.
  • Consider a partition with mesh less than the $\delta$ from uniform continuity, and then the optimal majorants and minorants on each subinterval, to eventually show that $I(\phi _ +) - I(\phi _ -)$ is bounded by $\varepsilon(b - a)$.
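
The key estimate, with $\phi _ \pm$ the optimal majorant and minorant on a partition $a = x _ 0 < \dots < x _ n = b$ of mesh less than $\delta$:

```latex
I(\phi_+) - I(\phi_-)
= \sum_{i=1}^n \Big( \sup_{[x_{i-1},\, x_i]} f - \inf_{[x_{i-1},\, x_i]} f \Big)(x_i - x_{i-1})
\le \varepsilon \sum_{i=1}^n (x_i - x_{i-1})
= \varepsilon (b - a),
```

since uniform continuity keeps the sup and inf within $\varepsilon$ of each other on each subinterval of length less than $\delta$.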

First fundamental theorem of calculus

  • [[Notes - Analysis III TT23, Fundamental theorems of calculus]]
  • Actually can show it’s Lipschitz, you just consider $ \vert F(c+h) - F(c) \vert $ and then you know that this translates to the integral of a bounded function.
  • Showing that if $f$ is continuous then $F$ is differentiable involves considering $ \vert F(c+h) - F(c) - hf(c) \vert $ which translates to $\left \vert \int^{c+h} _ c (f(x) - f(c)) \text d x\right \vert $.
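
The two estimates, assuming $\vert f \vert \le M$ on the interval:

```latex
\vert F(c+h) - F(c) \vert
= \left\vert \int_c^{c+h} f(x)\,\mathrm d x \right\vert
\le M \vert h \vert,
\qquad
\left\vert \frac{F(c+h) - F(c)}{h} - f(c) \right\vert
\le \sup_{\vert x - c \vert \le \vert h \vert} \vert f(x) - f(c) \vert ,
```

and the right-hand side tends to $0$ as $h \to 0$ by continuity of $f$ at $c$.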

Second fundamental theorem of calculus

Integration by substitution

Integration by parts

Integration and uniform limits commute ($\ast\ast$)

  • [[Notes - Analysis III TT23, Integration and limits]]
  • You know there exists some $f _ n$ where $ \vert f _ n - f \vert $ is less than $\varepsilon$ everywhere and you also have majorants and minorants for $f _ n$. So you can define $\hat \phi _ +$ and $\hat\phi _ -$ by adding and subtracting $\varepsilon$ respectively, which will define new majorants and minorants for $f$.
  • Then note $I(\hat \phi _ +) - I(\hat \phi _ -)$ is some constant multiple of $\varepsilon$
  • To show the integrals are actually equal, consider $ \vert \int^b _ a (f _ n - f) \vert $.
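
The new majorant/minorant pair and the resulting bounds (my sketch):

```latex
\hat\phi_- = \phi_- - \varepsilon \le f \le \phi_+ + \varepsilon = \hat\phi_+,
\qquad
I(\hat\phi_+) - I(\hat\phi_-)
= \big( I(\phi_+) - I(\phi_-) \big) + 2 \varepsilon (b - a),
```

and then $\left\vert \int^b _ a (f _ n - f) \right\vert \le \varepsilon(b - a)$ forces the integrals of the $f _ n$ to converge to $\int^b _ a f$.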

Differentiation and limits commute, sometimes ($\ast\ast$)

  • [[Notes - Analysis III TT23, Differentiation and limits]]
  • The limit $f’ _ n \to g$ needs to be uniform and $g$ needs to be bounded because you need to show $g$ is integrable.
  • Define $F(x) = \int^x _ a g$, so that $F’ = g$
  • Consider $\int^x _ a f’ _ n = f _ n(x) - f _ n(a)$ and then take the limit on both sides
  • End up with equality with $F$.
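
Taking limits, following the setup above (writing $f$ for the limit of the $f _ n$):

```latex
\int_a^x f_n' = f_n(x) - f_n(a)
\quad \xrightarrow{\; n \to \infty \;} \quad
\int_a^x g = f(x) - f(a),
```

so $f(x) = F(x) + f(a)$, and hence $f'(x) = F'(x) = g(x)$.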

Differentiation theorem for power series

  • [[Notes - Analysis II HT23, Differentiation theorem]]
  • Relies on two results, that $\sum^\infty _ {i=0} \lambda^i$ and $\sum^\infty _ {i=1} i\lambda^{i-1}$ both converge for $ \vert \lambda \vert <1$
  • Then you use the differentiation and limits commuting result for series, and check the uniform convergence via the M-test.


