Probability MT22, Random variables
Flashcards
@State the definition of a _ discrete _ random variable, and then the definition of a general random variable.
$X$ is a discrete random variable if
- $\text{Im}(X)$ is a finite or countable subset of $\mathbb R$
- $\forall x \in \text{Im}(X), \{\omega : X(\omega) = x\} \in \mathcal F$
$X$ is a random variable if
- $\forall x \in \mathbb R, \{\omega : X(\omega) \le x\} \in \mathcal F$
If $X _ 1, X _ 2, X _ 3$ are identically distributed and independent, are $X _ 1 + X _ 2 + X _ 3$ and $3X _ 1$ the same?
No. They have the same mean, but not the same distribution: for example $\text{Var}(X _ 1 + X _ 2 + X _ 3) = 3 \text{Var}(X _ 1)$, while $\text{Var}(3X _ 1) = 9 \text{Var}(X _ 1)$.
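One way to see this concretely (a small illustrative script; the fair-die distribution is my choice, not from the notes) is to enumerate both distributions exactly: the means agree, but the variances do not.

```python
from itertools import product

faces = [1, 2, 3, 4, 5, 6]  # a fair die, illustrative choice

# Exact distribution of X1 + X2 + X3 for three independent fair dice.
sums = [a + b + c for a, b, c in product(faces, repeat=3)]
mean_sum = sum(sums) / len(sums)
var_sum = sum((s - mean_sum) ** 2 for s in sums) / len(sums)

# Exact distribution of 3 * X1 for a single fair die.
triples = [3 * a for a in faces]
mean_triple = sum(triples) / len(triples)
var_triple = sum((t - mean_triple) ** 2 for t in triples) / len(triples)

print(mean_sum, var_sum)        # 10.5 8.75
print(mean_triple, var_triple)  # 10.5 26.25
```

Linearity of expectation cannot distinguish the two, but the spread (and hence the distribution) differs.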
For an event $A$, what is the definition of the random variable $\mathbb{1} _ A$?
Discrete random variables
Between which sets is a discrete random variable $X$ a function?
@State the two conditions for function $p _ X(x)$ to be a probability mass function.
- $\forall x: p _ X(x) \ge 0$
- $\sum _ {x\in\text{Im }X} p _ X(x) = 1$
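Both conditions are easy to check numerically for any concrete mass function; here is a sketch using the Binomial$(4, 0.3)$ pmf (an illustrative choice, not from the notes).

```python
from math import comb

n, p = 4, 0.3  # illustrative parameters
p_X = {k: comb(n, k) * p ** k * (1 - p) ** (n - k) for k in range(n + 1)}

nonneg = all(v >= 0 for v in p_X.values())  # condition 1: p_X(x) >= 0
total = sum(p_X.values())                   # condition 2: should equal 1
print(nonneg, total)
```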
What is the first condition for a function $X : \Omega \to \mathbb{R}$ to be a discrete random variable, in English (it’s about images)?
The image of $X$ is a countable set.
What is the second condition for a function $X : \Omega \to \mathbb{R}$ to be a discrete random variable (about what it means for $X = x$)?
Can you expand “the image of a discrete random variable $X$” into notation?
What is the probability mass function of a discrete random variable $X$?
Given a function $p _ X(x)$ that satisfies the conditions for being a probability mass function, how could we define a probability space $(\Omega, \mathcal{F}, \mathbb{P})$ and a random variable $X$ that corresponds to this probability mass function?
- $\Omega = \{x \in \mathbb{R} : p(x) \ne 0\}$
- $\mathcal{F} = \mathcal{P}(\Omega)$
- $\mathbb{P}(S) = \sum _ {\omega \in S} p(\omega)$
- $X(\omega) = \omega$
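The construction above can be sketched directly in code (the specific mass function is a hypothetical example; $\mathcal F = \mathcal P(\Omega)$ is implicit since $\Omega$ is finite).

```python
# A hypothetical mass function on three points.
p = {0: 0.2, 1: 0.5, 2: 0.3}

Omega = {x for x, mass in p.items() if mass != 0}  # sample space

def P(S):
    """Probability measure: sum the masses of the outcomes in S."""
    return sum(p[w] for w in S)

def X(w):
    """The identity random variable X(omega) = omega."""
    return w

prob_total = P(Omega)                              # should be 1
prob_X_eq_1 = P({w for w in Omega if X(w) == 1})   # should be p(1)
print(prob_total, prob_X_eq_1)
```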
Suppose $B$ is an event such that $\mathbb{P}(B) \ne 0$. What is the definition of the “conditional mass function of $X$ given $B$”?
What is $\mathbb{E}[h(X, Y)]$ for discrete random variables $X$ and $Y$?
What does $\text{Im} X$ mean when talking about a discrete random variable?
The set of all values that $X$ can take.
How could you calculate $\mathbb{P}(X = x \vert B)$?
Jointly distributed discrete random variables
What must hold about any joint probability mass function $p _ {X, Y}(x, y)$?
How can you determine the marginal distribution $\mathbb{P}(X = x)$ given the joint distribution $\mathbb{P}(X = x, Y = y)$?
What is true about the joint distribution $p _ {X,Y}(x,y)$ and the marginal distributions $p _ X(x)$ and $p _ Y(y)$ when $X$ and $Y$ are independent?
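Both facts (marginalising by summing, and factorisation under independence) can be checked on a small joint pmf. The numbers below are illustrative, and independence holds by construction because the joint pmf is built as a product of the marginals.

```python
p_X = {0: 0.4, 1: 0.6}    # illustrative marginal for X
p_Y = {0: 0.25, 1: 0.75}  # illustrative marginal for Y

# Joint pmf built as a product, so X and Y are independent by construction.
p_XY = {(x, y): p_X[x] * p_Y[y] for x in p_X for y in p_Y}

# Marginalise: sum the joint pmf over the other variable.
marg_X = {x: sum(p_XY[x, y] for y in p_Y) for x in p_X}
marg_Y = {y: sum(p_XY[x, y] for x in p_X) for y in p_Y}

recovered = all(abs(marg_X[x] - p_X[x]) < 1e-12 for x in p_X)
factorises = all(abs(p_XY[x, y] - marg_X[x] * marg_Y[y]) < 1e-12
                 for (x, y) in p_XY)
print(recovered, factorises)
```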
How could you write the definition of the CDF $F _ X$ in the discrete case?
Continuous random variables
What is the only condition for $X: \Omega \to \mathbb{R}$ to be a random variable?
What is the formula for the cumulative distribution function (CDF) $F _ X : \mathbb{R} \to \mathbb{R}$ of a random variable?
What is true about the monotonicity of the CDF?
It is monotonically increasing.
What is $\lim _ {x \to \infty} F _ X(x)$?
What is $\lim _ {x \to -\infty} F _ X(x)$?
How can you calculate (for $a < b$) $\mathbb{P}(a < X \le b)$ using the CDF $F _ X(x)$?
What’s the very specific inequality for $F _ X(b) - F _ X(a)$?
How is the probability density function related to the cumulative distribution function $F _ X$ for a _ continuous _ random variable?
What is the condition for a random variable to be a continuous random variable?
Its CDF can be written as an integral: $F _ X(x) = \int^x _ {-\infty} f _ X(u) \text d u$ for some non-negative integrable function $f _ X$.
For any continuous random variable, what is $\mathbb{P}(X = x)$?
Why doesn’t the fact that
\[\lim _ {n \to \infty} \mathbb{P}(A _ n) = \mathbb{P}\left(\bigcup^\infty _ {n = 1} A _ n\right)\]
for an increasing family of events contradict the fact that $\mathbb{P}(X = x) = 0$ for a continuous random variable?
Because the above only works for countable unions, and an interval of values of $X$ is an uncountable union of the singleton events $\{X = x\}$.
Given a continuous random variable $X$ with PDF $f _ X(x)$, and $Y = aX + b$, what’s the first step in determining the PDF $f _ Y(y)$?
Rewriting the CDF in terms of $X$ (assuming $a > 0$):
\[F _ Y(y) = F _ X\left(\frac{y-b}{a}\right)\]
You have a continuous random variable $X$ with PDF $f _ X(x)$, and $Y = aX + b$. You’ve written the CDF in terms of $X$ like $F _ Y(y) = F _ X\left(\frac{y - b}{a}\right)$. How do you now determine $f _ Y(y)$?
Differentiate $F _ X\left(\frac{y-b}{a}\right)$ with respect to $y$.
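Carrying out the differentiation with the chain rule (for $a > 0$; if $a < 0$ the same argument gives a factor of $1/|a|$):
\[f _ Y(y) = \frac{\text d}{\text d y} F _ X\left(\frac{y-b}{a}\right) = \frac{1}{a} f _ X\left(\frac{y-b}{a}\right)\]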
@Prove that if $F _ X$ is the cumulative distribution function then
- $F _ X$ is non-decreasing
- $\mathbb P(a < X \le b) = F _ X(b) - F _ X(a)$
- As $x \to \infty$, $F _ X(x) \to 1$.
- As $x \to -\infty$, $F _ X(x) \to 0$.
@todo (probability, page 47).
@important~
Suppose that $X$ is a continuous random variable with density $f _ X$ and that $h : \mathbb R \to \mathbb R$ is a strictly increasing differentiable function. @Prove that $Y = h(X)$ is a continuous random variable with p.d.f.
\[f _ Y(y) = f _ X(h^{-1}(y))\frac{\text d}{\text d y} h^{-1}(y)\]
@todo (probability, page 56).
@important~
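As a worked instance of this formula (the choice $h(x) = e^x$ is illustrative, not from the notes): $h$ is strictly increasing and differentiable with $h^{-1}(y) = \log y$, so for $y > 0$,
\[f _ Y(y) = f _ X(\log y) \frac{\text d}{\text d y} \log y = \frac{1}{y} f _ X(\log y)\]
Taking $X$ standard normal recovers the log-normal density.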
Jointly distributed continuous random variables
For two continuous random variables $X, Y$, what is $F _ {X,Y}(x,y)$ shorthand for?
What form must the joint CDF $F _ {X,Y}(x,y)$ take for $X$ and $Y$ to be jointly continuously distributed with PDF $f _ {X,Y}$?
For a jointly continuously distributed $(X, Y)$, what is $\mathbb{P}(a < X \le b, c < Y \le d)$ in terms of the joint CDF?
For “nice enough” $A \subseteq \mathbb{R}^2$, what is $\mathbb{P}((X,Y) \in A)$?
If $X, Y$ are jointly continuous with joint density function $f _ {X,Y}$, how can you recover $f _ X(x)$?
If $X, Y$ are jointly continuous with joint density function $f _ {X,Y}$, how can you recover $f _ Y(y)$?
What does it mean for continuous random variables $X$ and $Y$ to be independent?
If $f _ {X, Y} (x, y) = f _ X(x) f _ Y(y)$ for all $x$ and $y$.
How can you rewrite $F _ {X,Y}(x,y)$ for independent continuous random variables $X$ and $Y$?
When you have two random variables taking values in $(0, 1)$, what do probabilities correspond to?
Areas.
Suppose $X$ and $Y$ are jointly continuous random variables. @Prove that
\[\mathbb P(a < X \le b, c < Y \le d) = \int^d _ c \int^b _ a f _ {X,Y}(x, y) \text d x \text d y\]
@todo (probability, page 58).
@Prove that if $X$ and $Y$ are jointly continuous with joint density $f _ {X, Y}$ then $X$ is a continuous random variable with density
\[f _ X(x) = \int^\infty _ {-\infty} f _ {X,Y} (x, y) \text d y\]
and likewise
\[f _ Y(y) = \int^\infty _ {-\infty} f _ {X,Y} (x, y) \text d x\]
@todo (probability, page 59).
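As a worked instance of recovering a marginal (the joint density here is an illustrative choice, not from the notes): take $f _ {X,Y}(x, y) = x + y$ on $[0,1]^2$ and $0$ elsewhere. Then for $x \in [0, 1]$,
\[f _ X(x) = \int^1 _ 0 (x + y) \text d y = x + \frac{1}{2}\]
and as a sanity check, $\int^1 _ 0 \left(x + \frac{1}{2}\right) \text d x = 1$.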