Proofs - Probability I MT22
Random sums theorem
Prove the random sums theorem:
Let $X _ 1, X _ 2, \ldots$ be i.i.d. non-negative integer-valued random variables with p.g.f. $G _ X(s)$. Let $N$ be another non-negative integer-valued random variable, independent of $X _ 1, X _ 2, \ldots$ and with p.g.f. $G _ N(s)$.
Then the p.g.f. of $\sum^N _ {i=1} X _ i$ is $G _ N(G _ X(s))$.
Todo (probability, page 41).
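A sketch of one possible argument (conditioning on $N$; the notes may argue differently): by independence of $N$ from the $X _ i$, and then independence of the $X _ i$ themselves,

\[\mathbb E\left[s^{\sum^N _ {i=1} X _ i}\right] = \sum _ {n \ge 0} \mathbb P(N = n) \, \mathbb E\left[s^{X _ 1 + \ldots + X _ n}\right] = \sum _ {n \ge 0} \mathbb P(N = n) \, G _ X(s)^n = G _ N(G _ X(s))\]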
Markov’s inequality
Prove Markov’s inequality:
Let $X$ be a non-negative random variable. Then, for all $t > 0$, $\mathbb{P}(X \ge t) \le \frac{\mathbb{E}[X]}{t}$.
Todo (probability, page 64).
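A sketch of the standard argument: since $X$ is non-negative, $X \ge X \mathbf 1 _ {\{X \ge t\}} \ge t \mathbf 1 _ {\{X \ge t\}}$, so taking expectations,

\[\mathbb E[X] \ge t \, \mathbb E[\mathbf 1 _ {\{X \ge t\}}] = t \, \mathbb P(X \ge t)\]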
Chebyshev’s inequality
Prove Chebyshev’s inequality:
Let $X$ be a random variable with mean $\mu$ and finite variance $\sigma^2$. Then, for all $t > 0$, $\mathbb{P}( \vert X-\mu \vert \ge t) \le \frac{\sigma^2}{t^2}$.
Todo.
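A sketch: apply Markov's inequality to the non-negative random variable $(X - \mu)^2$:

\[\mathbb P(\vert X - \mu \vert \ge t) = \mathbb P((X - \mu)^2 \ge t^2) \le \frac{\mathbb E[(X - \mu)^2]}{t^2} = \frac{\sigma^2}{t^2}\]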
Vandermonde’s identity
Prove Vandermonde’s identity, i.e.
\[{m+n \choose r} = \sum_{i=0}^r {m \choose i} {n \choose r - i}\]
Todo (probability, page 6).
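One possible route (via generating functions; the notes may instead count subsets directly): compare coefficients of $s^r$ on both sides of $(1+s)^{m+n} = (1+s)^m (1+s)^n$, i.e.

\[\sum _ {r=0}^{m+n} {m+n \choose r} s^r = \left( \sum _ {i=0}^{m} {m \choose i} s^i \right) \left( \sum _ {j=0}^{n} {n \choose j} s^j \right)\]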
Partition theorem
Prove the partition theorem:
Suppose $B _ 1, B _ 2, \ldots$ is a partition of $\Omega$ by sets from $\mathcal F$, such that $\mathbb P(B _ i) > 0$ for all $i \ge 1$. Then for any $A \in \mathcal F$,
\[\mathbb P(A) = \sum _ {i \ge 1} \mathbb P(A \vert B _ i) \mathbb P(B _ i)\]
Todo (probability, page 10).
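A sketch: $A = \bigcup _ {i \ge 1} (A \cap B _ i)$ is a disjoint union, so by countable additivity and the definition of conditional probability,

\[\mathbb P(A) = \sum _ {i \ge 1} \mathbb P(A \cap B _ i) = \sum _ {i \ge 1} \mathbb P(A \vert B _ i) \mathbb P(B _ i)\]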
Bayes’ theorem
Prove Bayes’ theorem:
Suppose $B _ 1, B _ 2, \ldots$ is a partition of $\Omega$ by sets from $\mathcal F$, such that $\mathbb P(B _ i) > 0$ for all $i \ge 1$. Then for any $A \in \mathcal F$,
\[\mathbb P(B _ k \vert A) = \frac{\mathbb P (A \vert B _ k) \mathbb P(B _ k)}{\sum _ {i \ge 1} \mathbb P(A \vert B _ i) \mathbb P(B _ i)}\]
Todo (probability, page 11).
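A sketch: apply the definition of conditional probability twice, then the partition theorem in the denominator:

\[\mathbb P(B _ k \vert A) = \frac{\mathbb P(A \cap B _ k)}{\mathbb P(A)} = \frac{\mathbb P(A \vert B _ k) \mathbb P(B _ k)}{\sum _ {i \ge 1} \mathbb P(A \vert B _ i) \mathbb P(B _ i)}\]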
Expectation of a function
Suppose $h : \mathbb R \to \mathbb R$ and $X$ is a random variable. Prove that
\[\mathbb E[h(X)] = \sum_{x \in \text{Im} X} h(x) \mathbb{P}(X = x)\]
Todo (probability, page 19).
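A sketch (assuming absolute convergence, so the sums may be rearranged): group the outcomes by the value $y = h(x)$:

\[\mathbb E[h(X)] = \sum _ y y \, \mathbb P(h(X) = y) = \sum _ y y \sum _ {x : h(x) = y} \mathbb P(X = x) = \sum _ {x \in \text{Im} X} h(x) \mathbb P(X = x)\]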
Properties of expectation
Prove that if $X$ is a discrete random variable and $\mathbb E[X]$ exists, then
- If $X$ is non-negative, then $\mathbb E[X] \ge 0$
- If $a, b \in \mathbb R$ then $\mathbb E[aX + b] = a\mathbb E[X] + b$
Todo (probability, page 20).
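A sketch of the second part (the first follows since every term in the defining sum $\sum _ x x \, \mathbb P(X = x)$ is then non-negative):

\[\mathbb E[aX + b] = \sum _ x (ax + b) \, \mathbb P(X = x) = a \sum _ x x \, \mathbb P(X = x) + b \sum _ x \mathbb P(X = x) = a \mathbb E[X] + b\]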
Partition theorem for expectations
Prove the partition theorem for expectations:
Suppose $B _ 1, B _ 2, \ldots$ is a partition of $\Omega$ such that $\mathbb P(B _ i) > 0$ for all $i \ge 1$. Then for any discrete random variable $X$ whose expectation exists,
\[\mathbb E[X] = \sum _ {i \ge 1} \mathbb E[X \vert B _ i] \mathbb P(B _ i)\]
Todo (probability, page 21).
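A sketch (assuming the double sum may be interchanged): expand each conditional expectation and collect terms using $\sum _ {i \ge 1} \mathbb P(\{X = x\} \cap B _ i) = \mathbb P(X = x)$:

\[\sum _ {i \ge 1} \mathbb E[X \vert B _ i] \mathbb P(B _ i) = \sum _ x x \sum _ {i \ge 1} \mathbb P(\{X = x\} \cap B _ i) = \sum _ x x \, \mathbb P(X = x) = \mathbb E[X]\]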
Linearity of expectation for two variables
Suppose $X$ and $Y$ are discrete random variables whose expectations exist and $a, b \in \mathbb R$ are constants. Prove that
\[\mathbb E[aX + bY] = a \mathbb E[X] + b \mathbb E[Y]\]
Todo (probability, page 24).
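A sketch: expand over the joint p.m.f. and sum out the other variable in each term:

\[\mathbb E[aX + bY] = \sum _ {x, y} (ax + by) \, \mathbb P(X = x, Y = y) = a \sum _ x x \, \mathbb P(X = x) + b \sum _ y y \, \mathbb P(Y = y)\]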
Expectation of independent random variables
Prove that if $X$ and $Y$ are independent discrete random variables whose expectations exist, then
\[\mathbb E[XY] = \mathbb E[X] \mathbb E[Y]\]
Todo (probability, page 25).
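A sketch: by independence the joint p.m.f. factorises, so the double sum splits:

\[\mathbb E[XY] = \sum _ {x, y} xy \, \mathbb P(X = x) \mathbb P(Y = y) = \left( \sum _ x x \, \mathbb P(X = x) \right) \left( \sum _ y y \, \mathbb P(Y = y) \right) = \mathbb E[X] \mathbb E[Y]\]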
Uniqueness theorem for probability generating functions
Prove that if $X$ is a non-negative integer-valued random variable, then the distribution of $X$ is uniquely determined by its probability generating function $G _ X$.
Todo (probability, page 36).
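A sketch: $G _ X(s) = \sum _ {k \ge 0} \mathbb P(X = k) s^k$ is a power series with radius of convergence at least $1$, so its coefficients are recovered by repeated differentiation at $0$:

\[\mathbb P(X = n) = \frac{G _ X^{(n)}(0)}{n!}\]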
Probability generating function of a sum of independent random variables
Prove that if $X$ and $Y$ are independent non-negative integer-valued random variables, then
\[G_{X+Y}(s) = G_X(s) G_Y(s)\]
Todo (probability, page 37).
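A sketch: $s^X$ and $s^Y$ are independent, so the expectation of their product factorises:

\[G _ {X+Y}(s) = \mathbb E[s^{X+Y}] = \mathbb E[s^X s^Y] = \mathbb E[s^X] \mathbb E[s^Y] = G _ X(s) G _ Y(s)\]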
Sum of Bernoulli random variables is a binomial distribution
Suppose that $X _ 1, \ldots, X _ n$ are independent $\text{Ber}(p)$ random variables and let $Y = X _ 1 + \ldots + X _ n$. Prove that $Y \sim \text{Bin}(n, p)$.
Todo (probability, page 37).
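A sketch: each $X _ i$ has p.g.f. $1 - p + ps$, so by the previous result

\[G _ Y(s) = (1 - p + ps)^n = \sum _ {k=0}^n {n \choose k} (ps)^k (1 - p)^{n-k}\]

which is the p.g.f. of $\text{Bin}(n, p)$; conclude by the uniqueness theorem.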
Sum of Poisson random variables is a Poisson distribution
Suppose that $X _ 1, \ldots, X _ n$ are independent $\text{Poi}(\lambda _ i)$ random variables and let $Y = X _ 1 + \ldots + X _ n$. Prove that $Y \sim \text{Poi}(\sum _ i \lambda _ i)$.
Todo (probability, page 38).
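A sketch: each $X _ i$ has p.g.f. $e^{\lambda _ i (s - 1)}$, so

\[G _ Y(s) = \prod _ {i=1}^n e^{\lambda _ i (s - 1)} = e^{\left( \sum _ i \lambda _ i \right)(s - 1)}\]

which is the p.g.f. of $\text{Poi}(\sum _ i \lambda _ i)$; conclude by the uniqueness theorem.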
Sum of a Poisson number of Bernoulli variables is a Poisson variable
Suppose that $X _ 1, \ldots$ are independent and identically distributed $\text{Ber}(p)$ variables and that $N \sim \text{Poi}(\lambda)$, independently of $X _ 1, \ldots$. Prove that $\sum^N _ {i=1} X _ i \sim \text{Poi}(\lambda p)$.
Todo (probability, page 41).
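A sketch: by the random sums theorem with $G _ N(s) = e^{\lambda(s - 1)}$ and $G _ X(s) = 1 - p + ps$,

\[G _ N(G _ X(s)) = e^{\lambda(1 - p + ps - 1)} = e^{\lambda p (s - 1)}\]

which is the p.g.f. of $\text{Poi}(\lambda p)$.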
Probability generating functions of branching processes
Prove that if a branching process has offspring distribution given by probability generating function $G$, then the probability generating function for the number of individuals at generation $n$ is the $n$-fold composition
\[G^n(s) = \underbrace{G(G(\cdots G(s) \cdots))} _ {n \text{ times}}\]
Todo (probability, page 42).
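A sketch: induct on $n$. Each individual in generation $n$ has an i.i.d. number of children with p.g.f. $G$, independent of the size of generation $n$, so by the random sums theorem

\[G^{n+1}(s) = G^n(G(s))\]

with base case $G^1 = G$.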
Expected number of children of a branching process
Let $X _ n$ be the number of children in the $n$-th generation of a branching process and suppose that the mean number of children of a single individual is $\mu$. Prove that
\[\mathbb E[X_n] = \mu^n\]
Todo.
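A sketch (assuming the process starts from a single individual, so $\mathbb E[X _ 0] = 1$): condition on $X _ n$ and use that the mean of a random sum is the product of the means (differentiate $G _ N(G _ X(s))$ at $s = 1$ with the chain rule):

\[\mathbb E[X _ {n+1}] = \mu \, \mathbb E[X _ n]\]

so induction gives $\mathbb E[X _ n] = \mu^n$.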
Properties of cumulative distribution functions
Prove that if $F _ X$ is the cumulative distribution function of $X$ then
- $F _ X$ is non-decreasing
- $\mathbb P(a < X \le b) = F _ X(b) - F _ X(a)$
- As $x \to \infty$, $F _ X(x) \to 1$.
- As $x \to -\infty$, $F _ X(x) \to 0$.
Todo (probability, page 47).
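A sketch of the second bullet (the others use monotonicity and continuity of probability along monotone limits of events): $\{X \le b\}$ is the disjoint union of $\{X \le a\}$ and $\{a < X \le b\}$, so

\[\mathbb P(a < X \le b) = \mathbb P(X \le b) - \mathbb P(X \le a) = F _ X(b) - F _ X(a)\]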
Linearity of expectation for continuous random variables
Suppose that $X$ is a continuous random variable with p.d.f. $f _ X$. Prove that if $a, b \in \mathbb R$, $\mathbb E[aX + b] = a\mathbb E[X] + b$ and $\text{var}(aX + b) = a^2 \text{var}(X)$.
Todo (probability, page 54).
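A sketch of the expectation part (the variance part then follows from $\text{var}(aX + b) = \mathbb E[(aX + b - a\mu - b)^2] = a^2 \, \text{var}(X)$):

\[\mathbb E[aX + b] = \int^\infty _ {-\infty} (ax + b) f _ X(x) \, \text d x = a \int^\infty _ {-\infty} x f _ X(x) \, \text d x + b = a \mathbb E[X] + b\]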
Probability density for a function of a continuous random variable
Suppose that $X$ is a continuous random variable with density $f _ X$ and that $h : \mathbb R \to \mathbb R$ is a strictly increasing differentiable function. Prove that $Y = h(X)$ is a continuous random variable with p.d.f.
\[f_Y(y) = f_X(h^{-1}(y))\frac{\text d}{\text d y} h^{-1}(y)\]
Todo (probability, page 56).
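A sketch: since $h$ is strictly increasing, $\{h(X) \le y\} = \{X \le h^{-1}(y)\}$, so

\[F _ Y(y) = \mathbb P(X \le h^{-1}(y)) = F _ X(h^{-1}(y))\]

and differentiating with the chain rule gives the stated density.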
Probability of jointly continuous random variables
Suppose $X$ and $Y$ are jointly continuous random variables. Prove that
\[\mathbb P(a < X \le b, c < Y \le d) = \int^d_c \int^b_a f_{X,Y}(x, y) \text d x \text d y\]
Todo (probability, page 58).
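One possible route (depending on how joint continuity is defined in the notes): apply inclusion-exclusion to the joint c.d.f.,

\[\mathbb P(a < X \le b, c < Y \le d) = F _ {X,Y}(b, d) - F _ {X,Y}(a, d) - F _ {X,Y}(b, c) + F _ {X,Y}(a, c)\]

and write each term as an integral of $f _ {X,Y}$.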
Separating joint p.d.f.s
Prove that if $X$ and $Y$ are jointly continuous with joint density $f _ {X, Y}$ then $X$ is a continuous random variable with density
\[f_X(x) = \int^\infty_{-\infty} f_{X,Y} (x, y) \text d y\]
and likewise
\[f_Y(y) = \int^\infty_{-\infty} f_{X,Y} (x, y) \text d x\]
Todo (probability, page 59).
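A sketch for $f _ X$ (the case of $f _ Y$ is symmetric): for any $a < b$,

\[\mathbb P(a < X \le b) = \mathbb P(a < X \le b, -\infty < Y < \infty) = \int^b _ a \left( \int^\infty _ {-\infty} f _ {X,Y}(x, y) \, \text d y \right) \text d x\]

so the inner integral serves as a density for $X$.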
Expectation and variance of sample mean
Suppose that $X _ 1, X _ 2, \ldots, X _ n$ form a random sample from a distribution with mean $\mu$ and variance $\sigma^2$, and let $\overline X _ n = \frac 1 n \sum^n _ {i=1} X _ i$ denote the sample mean. Prove that
- $\mathbb E [\overline X _ n] = \mu$
- $\text{var}(\overline X _ n) = \frac {\sigma^2} n$
Todo (probability, page 63).
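A sketch: linearity of expectation gives the mean, and independence lets the variance of the sum split:

\[\mathbb E[\overline X _ n] = \frac 1 n \sum^n _ {i=1} \mathbb E[X _ i] = \mu, \qquad \text{var}(\overline X _ n) = \frac 1 {n^2} \sum^n _ {i=1} \text{var}(X _ i) = \frac{\sigma^2}{n}\]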
Weak law of large numbers
Prove the weak law of large numbers (under the assumption that the variance is finite):
Suppose that $X _ 1, \ldots$ are i.i.d. random variables with mean $\mu$. Then, for all $\varepsilon > 0$,
\[\mathbb P \left( \left \vert \frac 1 n \sum^n _ {i=1} X _ i - \mu \right \vert > \varepsilon \right) \to 0\]
as $n \to \infty$.
Todo (probability, page 65).
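A sketch: apply Chebyshev's inequality to $\overline X _ n = \frac 1 n \sum^n _ {i=1} X _ i$, which has mean $\mu$ and variance $\frac{\sigma^2}{n}$:

\[\mathbb P \left( \left \vert \overline X _ n - \mu \right \vert > \varepsilon \right) \le \frac{\sigma^2}{n \varepsilon^2} \to 0\]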