
Random Variables


An event may be regarded as a function of the outcome of an experiment: based on the outcome, we can say that the event occurred or didn't occur. We will often be interested in specifying richer information about the outcome of an experiment than a simple yes or no. Specifically, we will often want to specify information in the form of a real number.

For example, suppose that you will receive a dollar for each head flipped in our two-fair-flips experiment. Then your payout X might be 0 dollars, 1 dollar, or 2 dollars. Because X represents a value which is random (that is, dependent on the outcome of a random experiment), it is called a random variable. A random variable which takes values in some finite or countably infinite set (such as \{0,1,2\}, in this case) is called a discrete random variable.

Since a random variable associates a real number to each outcome of the experiment, in mathematical terms a random variable is a function from the sample space to \mathbb{R}. Using function notation, the dollar-per-head payout random variable X satisfies

\begin{align*}X((\texttt{T}, \texttt{T})) &= 0, \\ X((\texttt{H}, \texttt{T})) &= 1, \\ X((\texttt{T}, \texttt{H})) &= 1, \text{ and} \\ X((\texttt{H}, \texttt{H})) &= 2.\end{align*}
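This view of a random variable as an ordinary function on the sample space can be sketched in a few lines of Python (a minimal illustration; the names `omega` and `X` are ours, not part of the text):

```python
from itertools import product

# Sample space for two fair coin flips: each outcome is a pair of flips.
omega = list(product("HT", repeat=2))

# The payout random variable X maps each outcome to a real number:
# one dollar per head flipped.
def X(outcome):
    return outcome.count("H")

print({outcome: X(outcome) for outcome in omega})
# e.g. X(("T", "T")) == 0, X(("H", "T")) == 1, X(("H", "H")) == 2
```

Note that `X` itself is deterministic; all of the randomness lives in how an outcome is drawn from `omega`.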

Note that a random variable X, as a function from \Omega to \mathbb{R}, has no randomness of its own: for each outcome \omega, the value X(\omega) is perfectly well defined. The randomness comes entirely from thinking of \omega as being selected randomly from \Omega. For example, the amount X of money you'll take home from tomorrow's poker night is a random quantity, but the function which maps each poker game outcome \omega to your haul X(\omega) is fully specified by the rules of poker.

We can combine random variables using any operations or functions we can use to combine numbers. For example, suppose X_1 is defined to be the number of heads in the first of two coin flips. In other words, we define

\begin{align*}X_1((\texttt{T}, \texttt{T})) &= 0 \\ X_1((\texttt{H}, \texttt{T})) &= 1 \\ X_1((\texttt{T}, \texttt{H})) &= 0 \\ X_1((\texttt{H}, \texttt{H})) &= 1,\end{align*}

and X_2 is defined to be the number of heads in the second flip. Then the random variable X_1 + X_2 maps each \omega \in \Omega to X_1(\omega) + X_2(\omega). This random variable is equal to X, since X(\omega) = X_1(\omega) + X_2(\omega) for every \omega \in \Omega.
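The equality X = X_1 + X_2 can be checked by enumerating the whole sample space, since equality of random variables means equality as functions on \Omega. A brief Python sketch (names are ours):

```python
from itertools import product

# Sample space for two fair coin flips.
omega = list(product("HT", repeat=2))

# X1 counts heads in the first flip; X2 counts heads in the second.
def X1(outcome):
    return 1 if outcome[0] == "H" else 0

def X2(outcome):
    return 1 if outcome[1] == "H" else 0

# X is the total number of heads.
def X(outcome):
    return outcome.count("H")

# X and X1 + X2 agree on every outcome, so they are the same random variable.
assert all(X(w) == X1(w) + X2(w) for w in omega)
```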

Suppose that the random variable X represents a fair die roll and Y is defined to be the remainder when X is divided by 4.

Define a six-element probability space \Omega on which X and Y may be defined, and find \mathbb{P}(X - Y = k) for every integer value of k.

Solution. We set \Omega = \{(1, 1), (2, 2), (3, 3), (4, 0), (5, 1), (6, 2)\}, where each outcome (x, y) records the value x of the die roll together with the remainder y when x is divided by 4, and each of the six outcomes has probability \frac{1}{6}. Defining X and Y to be the first and second coordinates of the outcome, respectively, we see that X - Y = 0 on the first three outcomes and X - Y = 4 on the last three. Therefore, for any integer value k, we have

\begin{align*}\mathbb{P}(X - Y = k) = \begin{cases} \frac{1}{2} & \text{if } k \in \{0, 4\} \\ 0 & \text{otherwise.} \end{cases}\end{align*}
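As a sanity check, the distribution of X - Y can be computed by brute-force enumeration over the six equally likely outcomes. A minimal Python sketch (the names `omega` and `dist` are ours):

```python
from fractions import Fraction

# Outcomes (x, y) with y = x mod 4; each of the six outcomes has probability 1/6.
omega = [(x, x % 4) for x in range(1, 7)]

# Distribution of X - Y, accumulated by enumeration.
dist = {}
for x, y in omega:
    dist[x - y] = dist.get(x - y, 0) + Fraction(1, 6)

print(dist)  # {0: Fraction(1, 2), 4: Fraction(1, 2)}
```

Using `Fraction` keeps the probabilities exact rather than floating-point approximations.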

Consider a sample space \Omega and an event E \subset \Omega. We define the random variable \mathbf{1}_{E} : \Omega \rightarrow \{0,1\} by

\begin{align*}\mathbf{1}_{E} (\omega) = \begin{cases} 1 & \text{if } \omega \in E \\ 0 & \text{otherwise}. \end{cases}\end{align*}

The random variable \mathbf{1}_{E} is called the indicator random variable for E. If F is another event, which of the following random variables are necessarily equal?

\mathbf{1}_{E \cap F} and \mathbf{1}_E \cdot \mathbf{1}_F

\mathbf{1}_{E \cup F} and \mathbf{1}_E + \mathbf{1}_F

\mathbf{1}_{E} and 1 - \mathbf{1}_{E^c}

Solution.
  • Since \mathbf{1}_E \cdot \mathbf{1}_F = 1 if and only if \mathbf{1}_E = 1 and \mathbf{1}_F = 1, we see that \mathbf{1}_{E \cap F} = \mathbf{1}_E \cdot \mathbf{1}_F.

  • Because \mathbf{1}_E + \mathbf{1}_F may be equal to 2 (on the intersection of E and F), we cannot have \mathbf{1}_{E \cup F} = \mathbf{1}_E + \mathbf{1}_F in general.

  • We observe that 1 - \mathbf{1}_{E^c} = \mathbf{1}_E because \mathbf{1}_{E^c} = 0 if and only if \mathbf{1}_E = 1.
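The three observations above can be verified by enumeration on a small sample space. In the Python sketch below, E and F are hypothetical example events (our choices, not from the text): "first flip is heads" and "second flip is heads" in the two-flip experiment.

```python
from itertools import product

# Sample space for two fair coin flips.
omega = list(product("HT", repeat=2))

E = {w for w in omega if w[0] == "H"}   # hypothetical event: first flip heads
F = {w for w in omega if w[1] == "H"}   # hypothetical event: second flip heads

# Indicator random variable of an event: 1 on the event, 0 elsewhere.
def indicator(event):
    return lambda w: 1 if w in event else 0

one_E, one_F = indicator(E), indicator(F)
one_int = indicator(E & F)              # indicator of the intersection
one_union = indicator(E | F)            # indicator of the union
one_comp = indicator(set(omega) - E)    # indicator of the complement of E

# Identities that hold for all outcomes:
assert all(one_int(w) == one_E(w) * one_F(w) for w in omega)
assert all(one_E(w) == 1 - one_comp(w) for w in omega)

# The sum identity fails: on E ∩ F the sum equals 2, not 1.
assert any(one_union(w) != one_E(w) + one_F(w) for w in omega)
```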
