Bayes

April 16, 2025

Abstract
How to apply it

A measure on a set S is a set function \mu from subsets of S to the real numbers that satisfies {\mu(E\cup F) = \mu(E) + \mu(F) - \mu(E\cap F)} for {E,F\subseteq S} and {\mu(\emptyset) = 0}; measures do not count things twice and the measure of nothing is 0.
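The defining identity is easy to check concretely. Below is a minimal sketch using the counting measure \mu(E) = |E| on a small finite set; the particular sets are illustrative choices, not from the text.

```python
# Sketch: the counting measure mu(E) = |E| on subsets of a finite set S
# satisfies mu(E ∪ F) = mu(E) + mu(F) - mu(E ∩ F) and mu(∅) = 0.

def mu(E):
    """Counting measure: the number of elements of E."""
    return len(E)

S = {1, 2, 3, 4, 5}
E = {1, 2, 3}
F = {3, 4}

# The overlap E ∩ F = {3} is not counted twice.
assert mu(E | F) == mu(E) + mu(F) - mu(E & F)
assert mu(set()) == 0
```

When E and F are disjoint the intersection term vanishes, which is the content of the first exercise below.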

Exercise. Show \mu(E\cup F) = \mu(E) + \mu(F) if E\cap F=\emptyset.

A probability measure P on a set S is a positive measure with mass 1 so P(E)\ge0 for E\subseteq S and P(S) = 1.

Subsets of S are events. The conditional probability of an event E given the event F is {P(E|F) = P(E\cap F)/P(F)}. This makes E\mapsto P(E|F) a probability measure on F if {P(F)\not=0}.
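A short sketch of the definition, using the uniform measure on one roll of a fair die; the die example is an assumption for illustration, not from the text.

```python
from fractions import Fraction

# Uniform probability measure on S = {1, ..., 6}: P(E) = |E| / |S|.
S = set(range(1, 7))

def P(E):
    return Fraction(len(E & S), len(S))

def P_given(E, F):
    """Conditional probability P(E|F) = P(E ∩ F) / P(F)."""
    return P(E & F) / P(F)

E = {2, 4, 6}                 # "the roll is even"
F = {4, 5, 6}                 # "the roll is at least 4"
print(P_given(E, F))          # 2/3
```

Note that E\mapsto P_given(E, F) assigns F itself probability 1, as the next exercise asks you to verify.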

Exercise. Show if E,F\subseteq S then {P(E \cup F|F) = P(E|F) + P(F|F) - P(E\cap F|F)}, {P(\emptyset|F) = 0}, and {P(F|F) = 1}.

Hint: Use P(E\cup F) = P(E) + P(F) - P(E\cap F).

Exercise. Show P(E|F) = P(E)P(F|E)/P(F).

Hint: Use P(F|E) = P(F\cap E)/P(E).

This exercise establishes the simplest form of Bayes Theorem. It shows how to update the probability of an event E given the information that F occurred.
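Here is a hedged numerical sketch of that update. The prior P(E) and the conditional probabilities are made-up numbers for illustration; P(F) is computed by total probability, which is not stated in the text but follows from the exercises above.

```python
from fractions import Fraction

# Bayes update: P(E|F) = P(E) P(F|E) / P(F).
# Illustrative numbers (assumed, not from the text).
P_E = Fraction(1, 100)            # prior P(E)
P_F_given_E = Fraction(9, 10)     # P(F|E)
P_F_given_notE = Fraction(5, 100) # P(F|E^c)

# Total probability: P(F) = P(F|E) P(E) + P(F|E^c) P(E^c).
P_F = P_F_given_E * P_E + P_F_given_notE * (1 - P_E)

P_E_given_F = P_E * P_F_given_E / P_F
print(P_E_given_F)                # 2/13
```

Even with P(F|E) = 9/10, the posterior P(E|F) = 2/13 stays small because the prior P(E) is small.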

A random variable is a function X\colon S\to\boldsymbol{R}. Its cumulative distribution function is F(x) = P(X\le x). It determines everything there is to know about X. Two random variables have the same law if they have the same cumulative distribution function.
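As a concrete sketch, the cumulative distribution function of a fair-die roll; the die is an illustrative assumption, not from the text.

```python
from fractions import Fraction

# X is the identity on the sample space S = {1, ..., 6} with the
# uniform measure; its cdf is F(x) = P(X <= x) = |{s in S : s <= x}| / 6.
outcomes = range(1, 7)

def cdf(x):
    return Fraction(sum(1 for s in outcomes if s <= x), 6)

print(cdf(0))    # 0
print(cdf(3))    # 1/2
print(cdf(6.5))  # 1
```

The cdf is a step function: it jumps by P(X = s) at each outcome s and is flat in between.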

Exercise. If \chi\colon\boldsymbol{R}\to\boldsymbol{R} is the identity function and P(\chi\le x) = \int_{-\infty}^x\,dF(y) then \chi and X have the same law.

Some wits call \chi the physicists' random variable. It is a special case of the more general mathematical definition.

The first rule of Probability Club is to specify a sample space and a probability measure on it.

Let S = \{(x_j,y_k)\} be a finite set of ordered pairs with P(\{(x_j, y_k)\}) = p_{jk} where p_{jk}\ge0 and \sum_{j,k} p_{jk} = 1.

Define random variables X, Y\colon S\to\boldsymbol{R} by X(x,y) = x and Y(x,y) = y.

Exercise. Show P(X = x_j) = \sum_k p_{jk} and P(Y = y_k) = \sum_j p_{jk}.

Hint: The set \{X = x_j\} = \cup_k \{(x_j, y_k)\} is a disjoint union.

Exercise. Show P(X = x_j|Y = y_k) = p_{jk}/\sum_i p_{ik}.

Hint: P(X = x_j|Y = y_k) = P(X = x_j, Y = y_k)/P(Y = y_k).
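The marginal and conditional formulas above can be sketched directly. The particular 2×2 joint probabilities below are made-up numbers for illustration.

```python
from fractions import Fraction

# Joint distribution: p[j][k] = P(X = x_j, Y = y_k), with all entries
# nonnegative and summing to 1 (illustrative values, not from the text).
p = [[Fraction(1, 8), Fraction(1, 4)],
     [Fraction(1, 8), Fraction(1, 2)]]

def P_X(j):
    """Marginal P(X = x_j) = sum over k of p_jk."""
    return sum(p[j])

def P_Y(k):
    """Marginal P(Y = y_k) = sum over j of p_jk."""
    return sum(row[k] for row in p)

def P_X_given_Y(j, k):
    """Conditional P(X = x_j | Y = y_k) = p_jk / sum over i of p_ik."""
    return p[j][k] / P_Y(k)

assert sum(P_X(j) for j in range(2)) == 1
print(P_X_given_Y(0, 1))   # (1/4) / (3/4) = 1/3
```

Each conditional column j\mapsto P_X_given_Y(j, k) is itself a probability distribution, summing to 1.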

Using Bayes Theorem, {P(X = x_j|Y = y_k) = P(X = x_j)P(Y = y_k|X = x_j)/P(Y = y_k)}.