Probability MT22, Lecture V
Flashcards
What is a useful fact about a family of independent events $A _ 1, A _ 2, \ldots, A _ n$?
If $A _ 1, A _ 2, \ldots, A _ n$ are independent, then replacing any of them by their complements leaves the family independent.
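A quick check of the two-event case (a standard derivation, assuming $A$ and $B$ independent):
\[\mathbb{P}(A^c \cap B) = \mathbb{P}(B) - \mathbb{P}(A \cap B) = \mathbb{P}(B) - \mathbb{P}(A)\mathbb{P}(B) = (1 - \mathbb{P}(A))\mathbb{P}(B) = \mathbb{P}(A^c)\mathbb{P}(B)\]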
What is $\mathbb{P}(X = 1)$ for $X \sim \text{Ber}(p)$?
$p$.
What is $\mathbb{P}(X = 0)$ for $X \sim \text{Ber}(p)$?
$1 - p$.
What is $\mathbb{P}(X = k)$ for $X \sim \text{Bin}(n, p)$?
$\binom{n}{k} p^k (1-p)^{n-k}$ for $k = 0, 1, \ldots, n$.
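As a sanity check (a standard identity, not part of the original card), the binomial theorem confirms these probabilities sum to $1$:
\[\sum _ {k=0}^{n} \binom{n}{k} p^k (1-p)^{n-k} = (p + (1-p))^n = 1\]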
What is $\mathbb{P}(X = k)$ for $X \sim \text{Geom}(p)$?
$(1-p)^{k-1} p$ for $k = 1, 2, 3, \ldots$
When do you use $\mathbb{P}(X = k)$ for $X \sim \text{Geom}(p)$?
When working out the number of Bernoulli trials needed to get the first success.
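A short derivation under this convention (first $k - 1$ independent failures, then a success), together with a geometric-series check that the pmf is properly normalised:
\[\mathbb{P}(X = k) = (1-p)^{k-1} p, \qquad \sum _ {k=1}^{\infty} (1-p)^{k-1} p = \frac{p}{1 - (1-p)} = 1\]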
What is $\mathbb{P}(X = x _ i)$ for $X \sim \text{Unif}(\{x _ 1, x _ 2, \ldots, x _ n\})$?
$\frac{1}{n}$ for each $i = 1, \ldots, n$.
What is $\mathbb{P}(X = k)$ for $X \sim \text{Poisson}(\lambda)$?
$e^{-\lambda} \frac{\lambda^k}{k!}$ for $k = 0, 1, 2, \ldots$
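Likewise (a standard check, not from the card itself), the exponential series shows the Poisson pmf sums to $1$:
\[\sum _ {k=0}^{\infty} e^{-\lambda} \frac{\lambda^k}{k!} = e^{-\lambda} \sum _ {k=0}^{\infty} \frac{\lambda^k}{k!} = e^{-\lambda} e^{\lambda} = 1\]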
What’s the formula for $\mathbb{E}[X]$, the expectation of the discrete random variable $X$?
\[\mathbb{E}[X] = \sum _ {x \in \text{Im}X} x\mathbb{P}(X = x)\]
How can you think about the formula
\[\mathbb{E}[X] = \sum _ {x \in \text{Im}X} x\mathbb{P}(X = x)?\]
As weighting each possible value of $X$ by the probability that $X$ takes it, then summing.
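A quick worked instance (an illustrative example, assuming a fair six-sided die, so $X \sim \text{Unif}(\{1, 2, \ldots, 6\})$):
\[\mathbb{E}[X] = \sum _ {x=1}^{6} x \cdot \frac{1}{6} = \frac{21}{6} = 3.5\]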
Let $h : \mathbb{R} \to \mathbb{R}$ and $Y := h(X)$. What is the formula for $\mathbb{E}[Y]$?
\[\mathbb{E}[Y] = \sum _ {x \in \text{Im}X} h(x)\mathbb{P}(X = x)\]
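For instance (an illustrative case, not from the lecture), taking $h(x) = x^2$ and $X \sim \text{Ber}(p)$:
\[\mathbb{E}[X^2] = 0^2 \cdot (1-p) + 1^2 \cdot p = p\]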
What does $\text{Im} X$ mean when talking about a discrete random variable?
The set of all values that $X$ can take.