Lecture - Probability MT22, V


Flashcards

What is a useful fact about a family of independent events $A _ 1, A _ 2, \ldots, A _ n$?


Replacing any subset of the events by their complements gives a family that is still independent.

What is $\mathbb{P}(X = 1)$ for $X \sim \text{Ber}(p)$?


\[p\]

What is $\mathbb{P}(X = 0)$ for $X \sim \text{Ber}(p)$?


\[1-p\]

What is $\mathbb{P}(X = k)$ for $X \sim \text{Bin}(n, p)$?


\[\binom{n}{k} p^k (1 - p)^{n-k}\]

What is $\mathbb{P}(X = k)$ for $X \sim \text{Geom}(p)$?


\[(1-p)^{k-1}p\]

When do you use $\mathbb{P}(X = k)$ for $X \sim \text{Geom}(p)$?


When working out the number of Bernoulli trials needed for the first success.

What is $\mathbb{P}(X = x _ i)$ for $X \sim \text{Unif}(\{x _ 1, x _ 2, \ldots, x _ n\})$?


\[\frac{1}{n}\]

What is $\mathbb{P}(X = k)$ for $X \sim \text{Poisson}(\lambda)$?


\[\frac{\lambda^k e^{-\lambda}}{k!}\]
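As a sanity check, the three PMFs above can be evaluated directly with Python's standard library; the parameter values below are arbitrary illustrative choices, not from the lecture. Each PMF should sum to (approximately) 1 over its support:

```python
from math import comb, exp, factorial

def binom_pmf(k, n, p):
    # P(X = k) for X ~ Bin(n, p)
    return comb(n, k) * p**k * (1 - p)**(n - k)

def geom_pmf(k, p):
    # P(X = k) for X ~ Geom(p): first success on trial k
    return (1 - p)**(k - 1) * p

def poisson_pmf(k, lam):
    # P(X = k) for X ~ Poisson(lam)
    return lam**k * exp(-lam) / factorial(k)

# Bin(10, 0.3) has finite support {0, ..., 10}; the infinite
# supports of Geom and Poisson are truncated far into the tail.
print(sum(binom_pmf(k, 10, 0.3) for k in range(11)))    # ≈ 1.0
print(sum(geom_pmf(k, 0.3) for k in range(1, 200)))     # ≈ 1.0
print(sum(poisson_pmf(k, 4.5) for k in range(100)))     # ≈ 1.0
```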

What’s the formula for $\mathbb{E}[X]$, the expectation of the discrete random variable $X$?


\[\sum_{x \in \text{Im}X} x\mathbb{P}(X = x)\]

How can you think about the formula

\[\mathbb{E}[X] = \sum _ {x \in \text{Im}X} x\mathbb{P}(X = x)\]

?


Weighting the possible values of $x$ by their probability.
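A small worked example of this weighting (a fair six-sided die, not taken from the lecture): $\text{Im}X = \{1, \ldots, 6\}$ with each value having probability $1/6$, giving $\mathbb{E}[X] = 21/6 = 7/2$.

```python
from fractions import Fraction

# X = fair die roll: Im X = {1, ..., 6}, P(X = x) = 1/6
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

# E[X] = sum over x in Im X of x * P(X = x)
expectation = sum(x * p for x, p in pmf.items())
print(expectation)  # 7/2
```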

Let $h : \mathbb{R} \to \mathbb{R}$ and $Y := h(X)$. What is the formula for $\mathbb{E}[Y]$?


\[\sum_{x \in \text{Im} X} h(x)\mathbb{P}(X = x)\]
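Continuing the die example (again an illustration, not from the lecture), with $h(x) = x^2$ the formula gives $\mathbb{E}[X^2] = \frac{1}{6}\sum_{x=1}^{6} x^2 = 91/6$, without needing the distribution of $Y = X^2$ itself:

```python
from fractions import Fraction

# X = fair die roll, h(x) = x^2; E[h(X)] = sum of h(x) * P(X = x)
pmf = {x: Fraction(1, 6) for x in range(1, 7)}
e_x2 = sum(x**2 * p for x, p in pmf.items())
print(e_x2)  # 91/6

# Note E[X^2] = 91/6 differs from (E[X])^2 = 49/4, as expected.
```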

What does $\text{Im} X$ mean when talking about a discrete random variable?


The set of all values that $X$ can take.
