Overview

This course provides an introduction to probability using simulation and mathematical frameworks, with an emphasis on the probability concepts needed for more advanced study in statistical practice.

Topics covered include:

  • probability spaces and random variables;
  • discrete and continuous probability distributions;
  • probability mass, density, and distribution functions;
  • expectations and variance;
  • independence and conditional probability; and
  • the law of large numbers, the central limit theorem, and sampling distributions.

01. Outcomes, Events, and Probability

What is probability? Consider the Monty Hall Dilemma example from the textbook (Dekking et al. 2005, sec. 1.3, p. 4). You are asked the following question: Suppose you’re on a game show, and you’re given the choice of three doors; behind one door is a car; behind the others, goats. You pick a door, and the host, who knows what is behind the doors, opens one of the remaining doors to reveal a goat and offers you the chance to switch. Is it to your advantage to switch your choice?
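A minimal simulation sketch of the dilemma (the seed, number of games, and function names are illustrative, not from the textbook): it compares the win rate when you stay with your first pick against the rate when you switch.

```r
# Simulate the Monty Hall dilemma: estimate the probability of winning
# the car under the "stay" and "switch" strategies.
set.seed(1)
n_games <- 10000

play_once <- function(switch_door) {
  doors <- 1:3
  car   <- sample(doors, 1)                  # door hiding the car
  pick  <- sample(doors, 1)                  # contestant's initial pick
  remaining <- setdiff(doors, c(pick, car))  # doors the host may open
  host  <- if (length(remaining) == 1) remaining else sample(remaining, 1)
  if (switch_door) pick <- setdiff(doors, c(pick, host))
  pick == car
}

mean(replicate(n_games, play_once(switch_door = FALSE)))  # close to 1/3
mean(replicate(n_games, play_once(switch_door = TRUE)))   # close to 2/3
```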

02. Conditional Probability and Independence

Example: Sharing Birthdays Suppose 3 students are randomly selected from a class. Assume they are all born in a non-leap year. What is the probability that no two of them share the same birthday?
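The probability can be computed directly with the multiplication rule and checked by simulation; a short sketch of both is below (the seed and simulation size are illustrative).

```r
# Exact probability that 3 students (non-leap year, 365 equally likely
# birthdays) all have different birthdays.
n <- 3
p_no_match <- prod((365 - 0:(n - 1)) / 365)
p_no_match          # 365/365 * 364/365 * 363/365 ~ 0.9918

# Simulation check
set.seed(1)
mean(replicate(1e5, {
  bdays <- sample(365, n, replace = TRUE)
  length(unique(bdays)) == n
}))
```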

03. Discrete Random Variables

Example: A Two-Dice Game Suppose you play a dice game with your friend. You take turns rolling a pair of fair dice. If the two numbers sum to more than 7, you win $1.
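The rest of the game's payoff rule is not reproduced here, so the sketch below only estimates the probability of a winning roll, first by enumerating all 36 equally likely outcomes and then by simulation (the seed and simulation size are illustrative).

```r
# Probability that the sum of two fair dice exceeds 7.
rolls <- expand.grid(die1 = 1:6, die2 = 1:6)
mean(rolls$die1 + rolls$die2 > 7)   # 15/36 ~ 0.4167

# Simulation check
set.seed(1)
mean(replicate(1e5, sum(sample(6, 2, replace = TRUE)) > 7))
```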

04. Continuous Random Variables

Example: Infinite Wheel

05. Expectation and Variance

Example: Coffee Shop Suppose Michael opens a coffee shop and observes that the daily number of customers follows a particular distribution, shown in Figure 1.

Figure 1: Probability mass function of daily customer counts at Michael’s coffee shop.
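The actual customer-count distribution appears in Figure 1; the sketch below uses a hypothetical PMF (the values are illustrative only, not those in the figure) to show how the expectation and variance are computed from a PMF table.

```r
# Hypothetical PMF for the daily customer count (illustrative values).
x <- c(10, 20, 30, 40)
p <- c(0.2, 0.4, 0.3, 0.1)
stopifnot(isTRUE(all.equal(sum(p), 1)))   # probabilities must sum to 1

ex  <- sum(x * p)            # E[X]
ex2 <- sum(x^2 * p)          # E[X^2]
vx  <- ex2 - ex^2            # Var(X) = E[X^2] - (E[X])^2
c(mean = ex, variance = vx)
```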

06. Variable Transformation

The change-of-variable formula for expectations provides a convenient way to compute the expectation of a transformed random variable. It is a shortcut in the sense that you do not need to derive the full distribution of the transformed random variable first.
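In symbols, for a function \(g\), with \(f_X\) the density of \(X\) in the continuous case, the formula reads:

$$\mathrm{E}\left[g(X)\right]=\sum_{x} g(x)\,P(X=x)\quad\text{(discrete)},\qquad \mathrm{E}\left[g(X)\right]=\int_{-\infty}^{\infty} g(x)\,f_X(x)\,\mathrm{d}x\quad\text{(continuous)}.$$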

07. Joint Distribution

Example: Two Dice, Two Random Variables Adapted from Dekking et al. (2005), Section 9.1. Suppose you roll two six-sided, fair dice. Let \(S\) be the sum of the two rolls and \(M\) be the maximum of the two rolls.
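A quick way to obtain the joint distribution is to enumerate all 36 equally likely outcomes; a sketch of that approach is below, which also recovers the marginal PMFs of \(S\) and \(M\).

```r
# Enumerate all 36 equally likely outcomes of two fair dice and tabulate
# the joint distribution of S = sum and M = maximum.
rolls <- expand.grid(die1 = 1:6, die2 = 1:6)
S <- rolls$die1 + rolls$die2
M <- pmax(rolls$die1, rolls$die2)
joint <- table(S, M) / 36     # joint PMF: P(S = s, M = m)
joint

# Marginals recover the individual distributions
rowSums(joint)   # PMF of S
colSums(joint)   # PMF of M
```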

08. Covariance and Correlation

Example: Radius to Volume Consider a circle with radius \(R\). Suppose \(R\) is a continuous random variable with probability density function \(f_R\): $$f_R(r)=\frac{3}{16}\left(r-2\right)^2\quad\text{for }0\le r\le 4.$$ Suppose we are interested in the expected value of the area \(A\) of the circle.
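One way to compute \(\mathrm{E}[A]=\mathrm{E}[\pi R^2]\) numerically is through the change-of-variable formula with the density given above; the sketch below uses R's `integrate` (the object names are mine; the density is from the example).

```r
# Expected area E[A] = E[pi * R^2] via the change-of-variable formula.
f_R <- function(r) (3 / 16) * (r - 2)^2

# Sanity check: f_R integrates to 1 over [0, 4]
integrate(f_R, 0, 4)$value                          # 1

E_A <- integrate(function(r) pi * r^2 * f_R(r), 0, 4)$value
E_A                                                 # pi * 32/5 ~ 20.1
```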

09. Computation with Random Variables

Example: Bike Packing Planning Michael and William are planning a 7-day bike packing trip. As they pack their bags, they are each trying to decide how many spare tire tubes to take.
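As a purely hypothetical illustration of computing with random variables (the Poisson model and its rate are my assumptions, not part of the course example), the sketch below tabulates the distribution of the combined number of flat tires by convolving two independent PMFs.

```r
# Hypothetical model: X and Y are the numbers of flats Michael and William
# each get over the week, assumed independent Poisson(1). The PMF of the
# total T = X + Y can be tabulated by convolution.
x_vals <- 0:10
px <- dpois(x_vals, lambda = 1)
py <- dpois(x_vals, lambda = 1)

# Convolution: P(T = t) = sum over k of P(X = k) * P(Y = t - k)
pt <- sapply(0:10, function(t) {
  k <- 0:t
  sum(px[k + 1] * py[t - k + 1])
})
round(pt, 4)   # matches dpois(0:10, lambda = 2), since T ~ Poisson(2)
```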

10. Law of Large Numbers

Example: Labour Force Survey (From StatCan website: https://www.statcan.gc.ca/eng/survey/household/3701) The Labour Force Survey (LFS) is a household survey carried out monthly by Statistics Canada. It is the only source of current, monthly estimates of total employment and unemployment… The survey is conducted in 54,000 households across Canada… to determine the characteristics of an entire population by using the answers of a much smaller, randomly chosen sample…
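A minimal sketch of the idea behind such survey estimates, assuming a hypothetical population unemployment rate of 6%: as the sample grows, the Law of Large Numbers says the sample proportion settles near the population value.

```r
# Law of Large Numbers: the running sample proportion of unemployed
# respondents approaches the (hypothetical) population rate of 6%.
set.seed(1)
true_rate <- 0.06
n <- 54000                              # roughly the LFS household sample size
responses <- rbinom(n, size = 1, prob = true_rate)
running_prop <- cumsum(responses) / seq_len(n)
running_prop[c(100, 1000, 10000, n)]    # drifts toward 0.06
```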

11. Central Limit Theorem

Distribution of the Sample Mean Recall that the Law of Large Numbers implies that the sample mean converges to the population mean. This holds regardless of the underlying distribution of the population.
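A short simulation illustrating the Central Limit Theorem (the exponential population and the sample size of 30 are illustrative choices, not from the course notes): standardized sample means from a skewed population behave approximately like a standard normal.

```r
# Central Limit Theorem: standardized sample means from a skewed
# (exponential) population look approximately standard normal.
set.seed(1)
n_rep <- 10000       # number of repeated samples
n     <- 30          # size of each sample
mu    <- 1           # mean of Exp(1)
sigma <- 1           # standard deviation of Exp(1)

z <- replicate(n_rep, (mean(rexp(n, rate = 1)) - mu) / (sigma / sqrt(n)))
mean(z <= 1.96)      # close to pnorm(1.96) ~ 0.975
```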