AIMA - Quantifying Uncertainty


In which we see how to tame uncertainty with numeric degrees of belief.

Flashcards

Why do we need probability theory?


Because it provides a reasonable framework for dealing with uncertainty.

What is the decision theory “equation”?


\[\text{decision theory} = \text{probability theory} + \text{utility theory}\]

What is the principle of maximum expected utility (MEU)?


An agent is rational if and only if it chooses the action that yields the highest expected utility, averaged over all possible outcomes of the action.
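
As a minimal sketch of the principle (the actions, outcomes, and all numbers here are made up):

```python
# MEU sketch: hypothetical actions, outcomes, and utilities.
actions = {
    "take_umbrella":  {"stay_dry": 0.98, "get_wet": 0.02},  # P(outcome | action)
    "leave_umbrella": {"stay_dry": 0.70, "get_wet": 0.30},
}
utility = {"stay_dry": 10, "get_wet": -50}  # U(outcome)

def expected_utility(action):
    # EU(a) = sum over outcomes of P(outcome | a) * U(outcome)
    return sum(p * utility[o] for o, p in actions[action].items())

# MEU: a rational agent picks the action with the highest expected utility.
best = max(actions, key=expected_utility)
print(best, expected_utility(best))  # take_umbrella 8.8
```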

If logical assertions say whether a possible world exists or not, what do probabilistic assertions say?


How likely a given world is.

What is a prior or unconditional probability?


Degrees of belief in an event without any other information.

What is a posterior or conditional probability?


Degrees of belief in an event with additional information given.

Is

\[P(\text{Total} = 11)\]

a prior or posterior probability?


Prior.

Is

\[P(\text{Total} = 11 \vert \text{Die} _ 1 = 6)\]

a prior or posterior probability?


Posterior.
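
Both dice cards can be checked by enumerating the 36 equally likely worlds; a minimal sketch:

```python
from fractions import Fraction

# All 36 equally likely worlds for two fair dice.
worlds = [(d1, d2) for d1 in range(1, 7) for d2 in range(1, 7)]

# Prior: P(Total = 11), with no other information.
prior = Fraction(sum(d1 + d2 == 11 for d1, d2 in worlds), len(worlds))

# Posterior: P(Total = 11 | Die1 = 6), restricting to worlds where Die1 = 6.
given = [(d1, d2) for d1, d2 in worlds if d1 == 6]
posterior = Fraction(sum(d1 + d2 == 11 for d1, d2 in given), len(given))

print(prior, posterior)  # 1/18 1/6
```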

How do you pronounce

\[P(A \vert B)\]

?


“The probability of $A$ given $B$”

What is the product rule for

\[P(a \land b)\]

?


\[P(a | b)P(b)\]
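
Continuing the dice example above, with $a$ as $\text{Total} = 11$ and $b$ as $\text{Die} _ 1 = 6$:

\[P(\text{Total} = 11 \land \text{Die} _ 1 = 6) = P(\text{Total} = 11 \vert \text{Die} _ 1 = 6) P(\text{Die} _ 1 = 6) = \frac{1}{6} \times \frac{1}{6} = \frac{1}{36}\]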

What is the difference between

\[\pmb{P}(\pmb{x})\]

and

\[P(x)\]

?


The former is a probability distribution: a vector giving the probability of each possible value of $\pmb{x}$. The latter is the probability of the single value $x$ occurring.

What is a joint probability distribution?


A way of describing the probabilities of the different combinations of values of a set of variables.

What is the full joint probability distribution?


A way of describing the probability of every combination of values of all the random variables under consideration.
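
A minimal sketch of one, using the dentist example from AIMA ($\text{Toothache}$, $\text{Catch}$, $\text{Cavity}$); the probabilities are the textbook's:

```python
# Full joint distribution for AIMA's dentist example.
joint = {
    # (toothache, catch, cavity): probability of that world
    (True,  True,  True):  0.108, (True,  False, True):  0.012,
    (False, True,  True):  0.072, (False, False, True):  0.008,
    (True,  True,  False): 0.016, (True,  False, False): 0.064,
    (False, True,  False): 0.144, (False, False, False): 0.576,
}

def p(event):
    """Sum the probabilities of every world in which `event` holds."""
    return sum(pr for world, pr in joint.items() if event(*world))

print(round(p(lambda t, c, cav: cav), 3))        # P(cavity) = 0.2
print(round(p(lambda t, c, cav: cav and t)
            / p(lambda t, c, cav: t), 3))        # P(cavity | toothache) = 0.6
```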

How can you reduce the size of a full joint probability distribution?


Use absolute and conditional independence to factor it down.
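
For a sense of the savings: a full joint over $n$ Boolean variables has $2^n$ entries, whereas $n$ absolutely independent Boolean variables can be described with just $n$ numbers, one per variable.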

What is Bayes’ rule?


\[P(A | B) = \frac{P(B | A)P(A)}{P(B)}\]

What is Bayes’ rule, for determining the probability of a $\text{Cause}$ given its potential $\text{Effect}$?


\[P(\text{Cause} \vert \text{Effect}) = \frac{P(\text{Effect} \vert \text{Cause})P(\text{Cause})}{P(\text{Effect})}\]
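
A worked instance, using AIMA's meningitis example with $m$ for meningitis and $s$ for stiff neck, where $P(s \vert m) = 0.7$, $P(m) = 1/50000$ and $P(s) = 0.01$:

\[P(m \vert s) = \frac{P(s \vert m)P(m)}{P(s)} = \frac{0.7 \times 1/50000}{0.01} = 0.0014\]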

Why is Bayes’ rule sometimes written as

\[\pmb{P}(A \vert B) = \alpha \pmb{P}(B \vert A) \pmb{P}(A)\]

?


Because $\alpha$ is a normalisation factor that makes the values of the probability distribution sum to one.
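
A sketch using the dentist numbers from above, where $P(\text{cavity} \land \text{toothache}) = 0.12$ and $P(\lnot \text{cavity} \land \text{toothache}) = 0.08$; $\alpha$ saves us from having to know $P(\text{toothache})$ directly:

```python
# Normalisation sketch: P(Cavity | toothache) from the unnormalised values,
# without ever computing P(toothache) explicitly.
unnormalised = {True: 0.12, False: 0.08}      # P(Cavity = v ∧ toothache)
alpha = 1 / sum(unnormalised.values())        # 1 / P(toothache) = 5.0
distribution = {v: alpha * p for v, p in unnormalised.items()}
print(distribution)                           # {True: 0.6, False: 0.4}
```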

What is conditional independence of $A$ and $B$ given $C$?


Where $A$ and $B$ become independent of each other once $C$ is observed, for example because $C$ is the direct cause of both of them.

How could you state the conditional independence of $A$ and $B$ given $C$ in probability notation?


\[P(A \land B | C) = P(A | C)P(B | C)\]
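
This can be checked against the dentist numbers above, where $\text{Toothache}$ and $\text{Catch}$ are conditionally independent given $\text{Cavity}$:

\[P(\text{toothache} \land \text{catch} \vert \text{cavity}) = \frac{0.108}{0.2} = 0.54 = 0.6 \times 0.9 = P(\text{toothache} \vert \text{cavity}) P(\text{catch} \vert \text{cavity})\]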

What is a naive Bayes model?


A technique for estimating probabilities in situations where a single cause influences a number of effect variables, all of which are conditionally independent of each other given the cause.

What is an example cause and effect in a naive Bayes model?


  • Cause: The category of a piece of text
  • Effect: Certain keywords appearing in the text

What is the formula for $P(\text{Cause} \vert \pmb{e})$ in a naive Bayes model?


\[\alpha P(\text{Cause}) \prod_{i} P(e_i | \text{Cause})\]
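
As a sketch of the formula applied to the text example above (the categories, keywords, and all probabilities are hypothetical):

```python
import math

# Hypothetical model: Cause = category of a text, effects = keywords.
prior = {"spam": 0.4, "ham": 0.6}                        # P(Cause)
likelihood = {                                           # P(e_i | Cause)
    "spam": {"offer": 0.30, "meeting": 0.01},
    "ham":  {"offer": 0.02, "meeting": 0.20},
}

def posterior(observed_words):
    # P(Cause | e) = alpha * P(Cause) * prod_i P(e_i | Cause)
    scores = {
        cause: prior[cause]
        * math.prod(likelihood[cause][w] for w in observed_words)
        for cause in prior
    }
    alpha = 1 / sum(scores.values())                     # normalisation
    return {cause: alpha * s for cause, s in scores.items()}

print(posterior(["offer"]))  # spam ≈ 0.91, ham ≈ 0.09
```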


