
Section 2.3 Probability

This section is coordinated with Freedman, Statistics [1], Chapters 13–15.

Subsection 2.3.1 Box Models

A box model consists of a box that contains tickets, each of which is equally likely to be selected on any single random draw from the box, as in a fair lottery or raffle. In this course, every chance process can be modeled in terms of a sequence of one or more random draws from a box of tickets. The sequence of draws can be conducted in two different ways: with replacement or without replacement. These terms refer to whether or not each ticket drawn from the box is replaced in the box before the next draw; either way, it is assumed that on any given draw, every ticket in the box at that moment has an equal chance to be selected.
The outcome of a sequence of random draws is the list of specific tickets that were drawn. That is, an outcome is a list of the form
\begin{equation*} \text{(1st ticket drawn), (2nd ticket drawn), ... , (last ticket drawn)}. \end{equation*}
The assumption that each ticket is equally likely to be selected on any given random draw implies that every draw sequence outcome is equally likely. A set of outcomes is called an event. We say an event happens or occurs if a sequence of random draws produces an outcome that belongs to that event. The probability that an event occurs is the fraction
\begin{equation*} \frac{\text{number of outcomes in the event}}{\text{total number of possible outcomes}}. \end{equation*}
We usually use capital letters to denote events. We write \(P(A)\) to denote the probability of an event \(A\text{.}\) Here is the basic definition of probability using these symbols.
\begin{equation} P(A) = \frac{\text{number of outcomes in } A}{\text{total number of possible outcomes}}\tag{2.3.1} \end{equation}

Example 2.3.1.

Consider the box with tickets labeled \(x,y,z\) and consider the chance process that selects two tickets, without replacement, from the box. The set of all possible outcomes is
\begin{equation*} \{(x,y), (x,z), (y,x), (y,z), (z,x), (z,y)\} \end{equation*}
where the pair \((y,x)\text{,}\) for example, denotes the outcome where \(y\) is selected on the first draw, and \(x\) is selected on the second draw. The event \(A = \text{ "get an } x \text{"}\) is the set
\begin{equation*} A=\{(x,y), (x,z), (z,x), (y,x)\} \end{equation*}
and the event \(B = \text{ "get an } x \text{ on draw 1"}\) is the set
\begin{equation*} B=\{(x,y), (x,z)\}. \end{equation*}
We have the following probabilities.
\begin{align*} P(A) \amp = P(\text{get an } x) = 4/6 = 2/3\\ P(B) \amp = P(\text{get an } x \text{ on draw 1}) = 2/6 = 1/3 \end{align*}
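For readers who like to check such counts by computer, the outcomes and probabilities above can be reproduced by brute-force enumeration. Here is a short Python sketch (the event definitions mirror the example; the code is an illustration, not part of the text being followed):

```python
from itertools import permutations
from fractions import Fraction

# All outcomes of two draws, without replacement, from the box {x, y, z}
outcomes = list(permutations(["x", "y", "z"], 2))

A = [o for o in outcomes if "x" in o]     # event "get an x"
B = [o for o in outcomes if o[0] == "x"]  # event "get an x on draw 1"

P_A = Fraction(len(A), len(outcomes))     # 4/6 = 2/3
P_B = Fraction(len(B), len(outcomes))     # 2/6 = 1/3
print(P_A, P_B)
```

Using exact fractions rather than floating-point numbers keeps the answers in the same form as the hand computation.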
The point of the rest of this section is that we want to avoid having to explicitly list and count outcomes in order to calculate probabilities; we want to use shortcuts instead. First, we need more vocabulary.

Subsection 2.3.2 Complement, Intersection, Union of Events

The opposite of an event \(U\) (also called the complement of \(U\)) is the event that consists of all the outcomes not in \(U\text{.}\) For two events \(U,V\text{,}\) the event "\(U \AND V\)" (also called the intersection of \(U\) and \(V\)) is the set of all the outcomes shared by both \(U\) and \(V\text{.}\) The event "\(U \OR V\)" (also called the union of \(U\) and \(V\)) is the set of all the outcomes that are in \(U\) or in \(V\) or both. Events \(U,V\) are called mutually exclusive if the event "\(U \AND V\)" is empty.

Example 2.3.2.

Continuing Example 2.3.1, consider these events.
\begin{align*} C \amp = \text{ "get an } x \text{ on draw 2" } = \{(y,x), (z,x)\}\\ D \amp = \text{"get a }y \text{" } = \{(x,y), (z,y), (y,x), (y,z)\} \end{align*}
Then we have the following.
\begin{align*} \text{opposite of } A \amp = \{(y,z), (z,y)\}\\ A \AND D \amp= \{(x,y), (y,x)\}\\ C \OR D \amp = \{(x,y), (z,y), (y,x), (y,z), (z,x)\} \end{align*}
The event \(B \AND C\) is the empty set, so \(B,C\) are mutually exclusive.
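Because events are sets of outcomes, the complement, intersection, and union correspond directly to set operations. The computations in this example can be checked with Python's built-in set operators (again a sketch; the event names match the examples above):

```python
from itertools import permutations

outcomes = set(permutations("xyz", 2))
A = {o for o in outcomes if "x" in o}     # "get an x"
B = {o for o in outcomes if o[0] == "x"}  # "get an x on draw 1"
C = {o for o in outcomes if o[1] == "x"}  # "get an x on draw 2"
D = {o for o in outcomes if "y" in o}     # "get a y"

print(outcomes - A)    # opposite of A
print(A & D)           # A AND D (intersection)
print(C | D)           # C OR D (union)
print(B & C == set())  # True: B, C are mutually exclusive
```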

Subsection 2.3.3 Conditional Probability

Suppose we are about to make a sequence of draws from a box model and we are considering the probability that the outcome will be in some event \(V\text{.}\) Using definition (2.3.1), we have this.
\begin{equation} P(V) = \frac{\text{number of outcomes in } V}{\text{total number of possible outcomes}}\tag{2.3.2} \end{equation}
Suppose now that we are somehow guaranteed that the outcome of our draws will be in a particular event \(U\text{,}\) and that all of the outcomes in \(U\) are equally likely. This effectively reduces the pool of possible outcomes: any outcome that is not in \(U\) should not be counted in the top or the bottom of the fraction on the right side of (2.3.2). We define the conditional probability of \(V\) given \(U\text{,}\) denoted \(P(V|U)\text{,}\) to be
\begin{equation} P(V|U) = \frac{\text{number of outcomes in } U \AND V}{\text{number of outcomes in } U}.\tag{2.3.3} \end{equation}
The conditional probability \(P(V|U)\) reflects our updated assessment of the likelihood that the outcome of our sequence of draws will be in \(V\text{,}\) in the presence of the knowledge that the outcome is in \(U\text{.}\) Dividing the top and bottom of the fraction on the right side of (2.3.3) by the total number of possible outcomes yields the following expression for conditional probability.
\begin{equation} P(V|U) = \frac{P(U \AND V)}{P(U)}\tag{2.3.4} \end{equation}

Example 2.3.3.

Continuing Example 2.3.1 and Example 2.3.2, we have the following.
\begin{align*} P(A|B) \amp = 1\\ P(B|C) \amp = 0\\ P(B|A) \amp = 1/2\\ P(A|D) \amp = 1/2 \end{align*}
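These four conditional probabilities can be verified by applying definition (2.3.3) directly to the outcome sets. A Python sketch (the helper function `cond` is introduced here for illustration only):

```python
from itertools import permutations
from fractions import Fraction

outcomes = set(permutations("xyz", 2))
A = {o for o in outcomes if "x" in o}
B = {o for o in outcomes if o[0] == "x"}
C = {o for o in outcomes if o[1] == "x"}
D = {o for o in outcomes if "y" in o}

def cond(V, U):
    """P(V|U) = (# outcomes in U AND V) / (# outcomes in U), per (2.3.3)."""
    return Fraction(len(U & V), len(U))

print(cond(A, B), cond(B, C), cond(B, A), cond(A, D))
```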

Subsection 2.3.4 Independent and Dependent Events

If \(P(V) = P(V|U)\text{,}\) the events \(U,V\) are called independent. Otherwise, the events are called dependent. Using the word "independent" to describe events \(U,V\) reflects the fact that knowing whether or not an outcome is guaranteed to be in event \(U\) does not affect the way we assess the probability of event \(V\text{.}\)

Subsection 2.3.5 Probability Rules

The following equations are useful when something you want to know can be put in the form of the expression on the left side, and the expression on the right side is easier to compute. Most of the time, these rules save you from having to make explicit lists and counts of outcomes.
Rearranging (2.3.4), we have the following equation, called the multiplication rule.
\begin{equation*} P(U \AND V) = P(U) \cdot P(V|U) \end{equation*}
If \(U,V\) are independent, the multiplication rule simplifies a little.
\begin{equation*} P(U \AND V) = P(U) \cdot P(V) \;\;\;(U,V \text{ independent}) \end{equation*}
If \(U,V\) are mutually exclusive, we have the following equation, called the addition rule for mutually exclusive events.
\begin{equation*} P(U \OR V) = P(U) + P(V) \;\;\;(U,V \text{ mutually exclusive}) \end{equation*}
A special case of the addition rule is when \(V\) is the opposite of \(U\text{.}\) In this case, \(U \OR V\) is the set of all possible outcomes, so \(P(U \OR V) = 1\text{.}\) Rearranging the addition rule terms, we have the opposite rule.
\begin{equation*} P(U) = 1 - P(\text{ opposite of } U) \end{equation*}
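All three rules can be checked numerically against the running example from the earlier subsections. The following Python sketch verifies each rule for particular events (the choice of events is an illustration; the rules hold for any events satisfying the stated conditions):

```python
from itertools import permutations
from fractions import Fraction

outcomes = set(permutations("xyz", 2))
A = {o for o in outcomes if "x" in o}     # "get an x"
B = {o for o in outcomes if o[0] == "x"}  # "get an x on draw 1"
C = {o for o in outcomes if o[1] == "x"}  # "get an x on draw 2"
D = {o for o in outcomes if "y" in o}     # "get a y"

def P(E):
    return Fraction(len(E), len(outcomes))

# Multiplication rule: P(A AND D) = P(A) * P(D|A)
assert P(A & D) == P(A) * Fraction(len(A & D), len(A))

# Addition rule for the mutually exclusive events B, C
assert P(B | C) == P(B) + P(C)

# Opposite rule: P(A) = 1 - P(opposite of A)
assert P(A) == 1 - P(outcomes - A)
print("all three rules check out")
```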

Subsection 2.3.6 The Binomial Formula

Consider a box model in which the tickets are divided into two categories: \(W\) (for winning tickets) and \(L\) (for losing tickets). Let \(p\) be the probability that a single random draw is a winning ticket. In symbols we have the following.
\begin{align*} P(W) \amp = p\\ P(L) \amp = 1 - p \end{align*}
Now consider a sequence of random draws, with replacement. Let \(n\) be the number of draws in the sequence (\(n\) is some positive whole number), and let \(S\) be the event
\begin{equation*} S = \text{"get exactly } k \;\;W's \text{ and } (n-k)\;\; L\text{'s in } n \text{ draws"} \end{equation*}
where \(k\) is some whole number in the range from 0 to \(n\text{.}\) For example, the outcome \(WLWWL\) belongs to \(S\) for \(n=5\) and \(k=3\text{.}\) The binomial formula is
\begin{equation*} P(S) = {n \choose k} p^k (1-p)^{n-k} \end{equation*}
where \({n \choose k}\) is the quantity
\begin{equation*} {n \choose k} = \frac{n!}{k! (n-k)!}. \end{equation*}
The number \({n \choose k}\text{,}\) called a binomial coefficient, is the number of different \(n\)-letter words you can write with \(k\) \(W\)’s and \((n-k)\) \(L\)’s. For example, the number of different 5-letter words you can write with \(k=3\) \(W\)’s and \(n-k=2\) \(L\)’s is
\begin{equation*} {5 \choose 3} = \frac{5!}{3!2!} = 10. \end{equation*}
Indeed, here is the list of the 10 words.
\begin{gather*} WWWLL, WWLWL, WLWWL, LWWWL, WWLLW, \\ WLWLW, LWWLW, WLLWW, LWLWW, LLWWW \end{gather*}
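Both the factorial computation and the word count can be checked in Python, where `math.comb` computes binomial coefficients directly (a sketch for verification, not part of the development):

```python
from math import comb, factorial
from itertools import permutations

# C(5,3) via the factorial formula and via math.comb
assert comb(5, 3) == factorial(5) // (factorial(3) * factorial(2)) == 10

# Generate the distinct 5-letter words with three W's and two L's
words = {"".join(w) for w in permutations("WWWLL")}
print(sorted(words))  # the 10 words listed above
```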
For the box with 1 \(W\) and 3 \(L\)’s (so \(p=P(W)=1/4\)), the chance of getting exactly \(k=3\) \(W\)’s in \(n=5\) draws is \({5 \choose 3} (1/4)^3 (3/4)^2 \approx 8.79\%.\)
Here is the reasoning for why the binomial formula works, in brief: There are \(n \choose k\) outcomes with \(k\) \(W\)’s and \((n-k)\) \(L\)’s. These outcomes are mutually exclusive, so the probability of getting an outcome sequence with exactly \(k\) \(W\)’s in \(n\) draws is the sum of the probabilities of each individual outcome (by the addition rule). Any particular sequence of \(k\) \(W\)’s and \((n-k)\) \(L\)’s has a probability of \(p^k (1-p)^{n-k}\) (by the multiplication rule for independent events). Put all this together and you get the binomial formula.
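The formula can also be confirmed by brute force: enumerate every equally likely draw sequence and count the favorable ones. The Python sketch below does this for the example above, the box with 1 \(W\) and 3 \(L\)'s, \(n=5\), \(k=3\) (the ticket labels are an illustrative encoding of that box):

```python
from itertools import product
from fractions import Fraction
from math import comb

# Box with tickets W, L, L, L, so p = P(W) = 1/4
box = ["W", "L", "L", "L"]
n, k = 5, 3

# All 4^5 equally likely draw sequences, with replacement
sequences = list(product(box, repeat=n))
favorable = sum(1 for s in sequences if s.count("W") == k)
P_S = Fraction(favorable, len(sequences))

# Binomial formula: C(n,k) * p^k * (1-p)^(n-k)
p = Fraction(1, 4)
formula = comb(n, k) * p**k * (1 - p) ** (n - k)

assert P_S == formula
print(float(P_S))  # approximately 0.0879, i.e. about 8.79%
```

Note that `product` treats the three \(L\) tickets as distinct, so all \(4^5 = 1024\) position-level outcomes are generated with equal weight, exactly as the box model requires.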