Lecture - Probability MT22, VII
Flashcards
How can you prove the linearity of expectation $\mathbb{E}[\alpha X + \beta Y] = \alpha\mathbb{E}[X] + \beta\mathbb{E}[Y]$?
Take $h(x, y) = \alpha x + \beta y$ and apply the formula for the expectation of a function of two random variables, $\mathbb{E}[h(X, Y)]$; the resulting sum splits into the two terms.
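A sketch for the discrete case, assuming joint pmf $p_{X,Y}$:
\[\mathbb{E}[\alpha X + \beta Y] = \sum_{x,y} (\alpha x + \beta y)\, p_{X,Y}(x,y) = \alpha \sum_{x,y} x\, p_{X,Y}(x,y) + \beta \sum_{x,y} y\, p_{X,Y}(x,y) = \alpha\mathbb{E}[X] + \beta\mathbb{E}[Y],\]
since summing the joint pmf over $y$ gives the marginal $p_X(x)$, and similarly for $Y$.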
What are the two equivalent definitions of $\text{Var}(X)$?
\[\text{Var}(X) = \mathbb{E}[(X - \mathbb{E}[X])^2] = \mathbb{E}[X^2] - (\mathbb{E}[X])^2\]
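A quick check that the two forms agree, writing $\mu = \mathbb{E}[X]$ (a constant) and using linearity of expectation:
\[\mathbb{E}[(X - \mu)^2] = \mathbb{E}[X^2 - 2\mu X + \mu^2] = \mathbb{E}[X^2] - 2\mu\mathbb{E}[X] + \mu^2 = \mathbb{E}[X^2] - \mu^2.\]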
How can you interpret the variance of a random variable?
The average squared distance from the mean.
What is $\text{Var}(aX + b)$?
\[a^2 \text{Var}(X)\]
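This follows from the first definition of variance: the shift $b$ cancels and the factor $a$ comes out squared,
\[\text{Var}(aX + b) = \mathbb{E}\big[(aX + b - (a\mathbb{E}[X] + b))^2\big] = \mathbb{E}\big[a^2(X - \mathbb{E}[X])^2\big] = a^2\,\text{Var}(X).\]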
Given that $X$ and $Y$ are independent, what is the formula for $\mathbb{E}[XY]$?
\[\mathbb{E}[XY] = \mathbb{E}[X]\mathbb{E}[Y]\]
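A sketch for the discrete case, using that independence factorises the joint pmf as $p_{X,Y}(x,y) = p_X(x)\,p_Y(y)$:
\[\mathbb{E}[XY] = \sum_{x,y} xy\, p_X(x)\, p_Y(y) = \Big(\sum_x x\, p_X(x)\Big)\Big(\sum_y y\, p_Y(y)\Big) = \mathbb{E}[X]\mathbb{E}[Y].\]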
What are the two equivalent definitions for $\text{Cov}(X, Y)$?
\[\text{Cov}(X, Y) = \mathbb{E}[(X - \mathbb{E}[X])(Y-\mathbb{E}[Y])] = \mathbb{E}[XY] - \mathbb{E}[X]\mathbb{E}[Y]\]
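The equivalence follows by expanding the product and applying linearity, with $\mu_X = \mathbb{E}[X]$ and $\mu_Y = \mathbb{E}[Y]$:
\[\mathbb{E}[(X - \mu_X)(Y - \mu_Y)] = \mathbb{E}[XY] - \mu_X\mathbb{E}[Y] - \mu_Y\mathbb{E}[X] + \mu_X\mu_Y = \mathbb{E}[XY] - \mu_X\mu_Y.\]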
Given that $X$ and $Y$ are independent, what is $\text{Cov}(X, Y)$?
\[0\]
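This is immediate from the previous two cards:
\[\text{Cov}(X, Y) = \mathbb{E}[XY] - \mathbb{E}[X]\mathbb{E}[Y] = \mathbb{E}[X]\mathbb{E}[Y] - \mathbb{E}[X]\mathbb{E}[Y] = 0.\]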
If $X$ and $Y$ are independent, then $\text{Cov}(X, Y) = 0$. Is the converse true?
No.
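A standard counterexample (not necessarily the one from the lecture): let $X$ be uniform on $\{-1, 0, 1\}$ and $Y = X^2$. Then
\[\text{Cov}(X, Y) = \mathbb{E}[X^3] - \mathbb{E}[X]\mathbb{E}[X^2] = 0 - 0 \cdot \tfrac{2}{3} = 0,\]
but $X$ and $Y$ are not independent, e.g. $\mathbb{P}(X = 1, Y = 0) = 0 \neq \mathbb{P}(X = 1)\mathbb{P}(Y = 0) = \tfrac{1}{3} \cdot \tfrac{1}{3}$.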
What is $\text{Var}(X+Y)$?
\[\text{Var}(X+Y) = \text{Var}(X) + \text{Var}(Y) + 2\text{Cov}(X, Y)\]
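Sketch: expand the square inside the first definition of variance, writing $\mu_X = \mathbb{E}[X]$ and $\mu_Y = \mathbb{E}[Y]$:
\[\text{Var}(X+Y) = \mathbb{E}\big[((X - \mu_X) + (Y - \mu_Y))^2\big] = \mathbb{E}[(X - \mu_X)^2] + \mathbb{E}[(Y - \mu_Y)^2] + 2\,\mathbb{E}[(X - \mu_X)(Y - \mu_Y)],\]
which is $\text{Var}(X) + \text{Var}(Y) + 2\,\text{Cov}(X, Y)$. In particular, when $X$ and $Y$ are independent the covariance term vanishes and the variances simply add.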
What is $\text{Cov}(X,X)$?
\[\text{Var}(X)\]
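Setting $Y = X$ in the definition of covariance recovers the definition of variance:
\[\text{Cov}(X, X) = \mathbb{E}[(X - \mathbb{E}[X])(X - \mathbb{E}[X])] = \mathbb{E}[(X - \mathbb{E}[X])^2] = \text{Var}(X).\]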