Probability MT22, Expectation


Flashcards

Discrete random variables

@Prove that for a non-negative continuous random variable $Z$,

\[\mathbb{E}[Z] = \int _ 0^\infty \mathbb{P}(Z \ge z) \text{ d}z\]

@todo.
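A sketch of one standard argument (not from the referenced notes), assuming $Z$ has density $f _ Z$ and using Tonelli's theorem to swap the order of integration (valid since the integrand is non-negative):

\[\int _ 0^\infty \mathbb{P}(Z \ge z) \text{ d}z = \int _ 0^\infty \int _ z^\infty f _ Z(t) \text{ d}t \text{ d}z = \int _ 0^\infty \left( \int _ 0^t \text{d}z \right) f _ Z(t) \text{ d}t = \int _ 0^\infty t f _ Z(t) \text{ d}t = \mathbb{E}[Z]\]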

What’s a trick for finding $\mathbb{E}[X^2]$ if you know $\mathbb{E}[X]$ and $\text{var}(X)$?


\[\mathbb{E}[X^2] = \text{var}(X) + \mathbb{E}[X]^2\]
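This is just the definition of variance, rearranged:

\[\text{var}(X) = \mathbb{E}\left[ (X - \mathbb{E}[X])^2 \right] = \mathbb{E}[X^2] - \mathbb{E}[X]^2 \implies \mathbb{E}[X^2] = \text{var}(X) + \mathbb{E}[X]^2\]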

What’s the formula for $\mathbb{E}[X]$, the expectation of the discrete random variable $X$?


\[\sum _ {x \in \text{Im}X} x\mathbb{P}(X = x)\]

How can you think about the formula

\[\mathbb{E}[X] = \sum _ {x \in \text{Im}X} x\mathbb{P}(X = x)\]

?


Weighting the possible values of $X$ by their probability.

Let $h : \mathbb{R} \to \mathbb{R}$ and $Y := h(X)$. What is the formula for $\mathbb{E}[Y]$?


\[\sum _ {x \in \text{Im} X} h(x)\mathbb{P}(X = x)\]

Suppose $h : \mathbb R \to \mathbb R$ and $X$ is a random variable. @Prove that

\[\mathbb E[h(X)] = \sum _ {x \in \text{Im} X} h(x) \mathbb{P}(X = x)\]

@todo (probability, page 19).
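A sketch of one standard argument (not from the referenced notes): partition $\text{Im} X$ according to the value $h$ takes, so that for each $y \in \text{Im}\, h(X)$,

\[\mathbb P(h(X) = y) = \sum _ {x \in \text{Im} X : h(x) = y} \mathbb P(X = x)\]

and then

\[\mathbb E[h(X)] = \sum _ y y \, \mathbb P(h(X) = y) = \sum _ y \sum _ {x : h(x) = y} h(x) \mathbb P(X = x) = \sum _ {x \in \text{Im} X} h(x) \mathbb P(X = x)\]

where the rearrangement is justified when the sum converges absolutely.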

@important~

What’s the formula for $\mathbb{E}[X \vert B]$?


\[\sum _ {x \in \text{Im} X} x\mathbb{P}(X = x \vert B)\]

What is $\mathbb{E}[h(X, Y)]$?


\[\sum _ {x \in \text{Im}X}\sum _ {y \in \text{Im} Y} h(x, y) p _ {X,Y}(x, y)\]

How can you rewrite $\mathbb{E}[aX + bY]$?


\[a \mathbb{E}[X] + b\mathbb{E}[Y]\]

How can you prove the linearity of expectation $\mathbb{E}[\alpha X + \beta Y] = \alpha\mathbb{E}[X] + \beta\mathbb{E}[Y]$?


Take $h(x, y) = \alpha x + \beta y$.

@Prove that if $X$ is a discrete random variable and $\mathbb E[X]$ exists, then

  • If $X$ is non-negative, then $\mathbb E[X] \ge 0$
  • If $a, b \in \mathbb R$ then $\mathbb E[aX + b] = a\mathbb E[X] + b$

@todo (probability, page 20).
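A sketch of one standard argument (not from the referenced notes), working from the definition $\mathbb E[X] = \sum _ {x \in \text{Im} X} x \mathbb P(X = x)$: if $X$ is non-negative then every term has $x \ge 0$ and $\mathbb P(X = x) \ge 0$, so the sum is non-negative. For the second part, apply $\mathbb E[h(X)] = \sum _ x h(x) \mathbb P(X = x)$ with $h(x) = ax + b$:

\[\mathbb E[aX + b] = \sum _ {x \in \text{Im} X} (ax + b) \mathbb P(X = x) = a \sum _ x x \mathbb P(X = x) + b \sum _ x \mathbb P(X = x) = a \mathbb E[X] + b\]

using $\sum _ x \mathbb P(X = x) = 1$.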

@Prove the partition theorem for expectations:

Suppose $B _ 1, B _ 2, \ldots$ is a partition of $\Omega$ such that $\mathbb P(B _ i) > 0$ for all $i \ge 1$. Then for any discrete random variable $X$ whose expectation exists,

\[\mathbb E[X] = \sum _ {i \ge 1} \mathbb E[X \vert B _ i] \mathbb P(B _ i)\]

@todo (probability, page 21).
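A sketch of one standard argument (not from the referenced notes), assuming $X$ is discrete and $\mathbb E[X]$ exists: by the law of total probability, $\mathbb P(X = x) = \sum _ {i \ge 1} \mathbb P(X = x \vert B _ i) \mathbb P(B _ i)$, so

\[\mathbb E[X] = \sum _ x x \sum _ {i \ge 1} \mathbb P(X = x \vert B _ i)\mathbb P(B _ i) = \sum _ {i \ge 1} \mathbb P(B _ i) \sum _ x x \mathbb P(X = x \vert B _ i) = \sum _ {i \ge 1} \mathbb E[X \vert B _ i] \mathbb P(B _ i)\]

where swapping the order of summation is justified by absolute convergence.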

@important~

Suppose $X$ and $Y$ are discrete random variables and $a, b \in \mathbb R$ are constants. @Prove that

\[\mathbb E[aX + bY] = a \mathbb E[X] + b \mathbb E[Y]\]

@todo (probability, page 24).
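A sketch of one standard argument (not from the referenced notes): apply $\mathbb E[h(X, Y)] = \sum _ x \sum _ y h(x, y) p _ {X,Y}(x, y)$ with $h(x, y) = ax + by$, split the sum, and use the marginals $\sum _ y p _ {X,Y}(x, y) = \mathbb P(X = x)$ and $\sum _ x p _ {X,Y}(x, y) = \mathbb P(Y = y)$:

\[\mathbb E[aX + bY] = a \sum _ x x \sum _ y p _ {X,Y}(x, y) + b \sum _ y y \sum _ x p _ {X,Y}(x, y) = a \mathbb E[X] + b \mathbb E[Y]\]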

@important~

@Prove that if $X$ and $Y$ are independent discrete random variables whose expectations exist, then

\[\mathbb E[XY] = \mathbb E[X] \mathbb E[Y]\]

@todo (probability, page 25).
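A sketch of one standard argument (not from the referenced notes): by independence, $p _ {X,Y}(x, y) = \mathbb P(X = x)\mathbb P(Y = y)$, so the double sum factorises:

\[\mathbb E[XY] = \sum _ x \sum _ y xy \, \mathbb P(X = x) \mathbb P(Y = y) = \left( \sum _ x x \mathbb P(X = x) \right)\left( \sum _ y y \mathbb P(Y = y) \right) = \mathbb E[X] \mathbb E[Y]\]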

@important~

Continuous random variables

Given a continuous random variable $X$ with density function $f _ X$, what is the mean or expectation $\mathbb{E}[X]$ of $X$?


\[\mathbb{E}[X] = \int^\infty _ {-\infty} xf _ X(x)\text{d}x\]

For a continuous random variable $X$ with PDF $f _ X$, and $h:\mathbb{R} \to \mathbb{R}$, what is $\mathbb{E}[h(X)]$ (provided the absolute version converges)?


\[\int^\infty _ {-\infty} h(x) f _ X(x)\text{d}x\]

For a continuous random variable $X$ with PDF $f _ X$, and $h:\mathbb{R} \to \mathbb{R}$, what condition guarantees that $\mathbb{E}[h(X)]$ exists?


\[\int^\infty _ {-\infty} \vert h(x) \vert f _ X(x)\text{d}x \text{ converges}\]

If $X \ge 0$ and $X$ is a continuous random variable, then what is an equivalent definition of the expectation $\mathbb{E}[X]$ using $\mathbb P(X > x)$?


\[\int^\infty _ 0 \mathbb P(X > x) \text{d}x\]

Suppose that $X$ is a continuous random variable with p.d.f. $f _ X$. @Prove that if $a, b \in \mathbb R$, $\mathbb E[aX + b] = a\mathbb E[X] + b$ and $\text{var}(aX + b) = a^2 \text{var}(X)$.


@todo (probability, page 54).
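A sketch of one standard argument (not from the referenced notes), using $\mathbb E[h(X)] = \int^\infty _ {-\infty} h(x) f _ X(x)\text{ d}x$ with $h(x) = ax + b$:

\[\mathbb E[aX + b] = \int^\infty _ {-\infty} (ax + b) f _ X(x) \text{ d}x = a \int^\infty _ {-\infty} x f _ X(x) \text{ d}x + b \int^\infty _ {-\infty} f _ X(x) \text{ d}x = a \mathbb E[X] + b\]

since $f _ X$ integrates to $1$. Then

\[\text{var}(aX + b) = \mathbb E\left[ (aX + b - a\mathbb E[X] - b)^2 \right] = a^2 \, \mathbb E\left[ (X - \mathbb E[X])^2 \right] = a^2 \text{var}(X)\]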



