Today: Law of Large Numbers
Goal: start to understand error as it relates to sample size
Objectives:
- Distribution of the mean
- Chebyshev’s Inequality
- Law of Large Numbers
i.i.d.
Let $X_1, X_2, \ldots, X_n$ be random variables such that:
- each $X_i$ is independent of each other $X_j$
- each $X_i$ has mean $\mu$
- each $X_i$ has variance $\sigma^2$
Let us now seek the distribution of the mean $\bar{X} = \frac{1}{n}\sum_{i=1}^{n} X_i$:
- expected value: $E[\bar{X}] = \mu$
- variance: $\mathrm{Var}(\bar{X}) = \frac{\sigma^2}{n}$
Distribution of Mean
If each $X_i \sim N(\mu, \sigma^2)$, then $\bar{X} \sim N\!\left(\mu, \frac{\sigma^2}{n}\right)$.
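As a quick sanity check (a sketch, not part of the original notes), a short simulation can illustrate that the sample mean has expected value $\mu$ and variance $\sigma^2/n$. Here the $X_i$ are assumed to be $\mathrm{Unif}(0, 1)$, so $\mu = 1/2$ and $\sigma^2 = 1/12$:

```python
import random

# Assumed illustration: simulate many sample means of n i.i.d. Unif(0, 1)
# draws, which have mu = 0.5 and sigma^2 = 1/12.
random.seed(1)
n = 25          # sample size
trials = 20000  # number of simulated sample means

means = [sum(random.random() for _ in range(n)) / n for _ in range(trials)]

mean_of_means = sum(means) / trials
var_of_means = sum((m - mean_of_means) ** 2 for m in means) / trials

print(mean_of_means)  # close to mu = 0.5
print(var_of_means)   # close to sigma^2 / n = (1/12)/25, about 0.0033
```

The simulated variance of the mean shrinks by a factor of $n$ relative to the variance of a single draw, matching the formula above.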
Far from the Mean
Idea: we can get a sense of the probability that the sample mean lands beyond a particular boundary location:
- $\mu$: population average
- $\epsilon$: tolerance
Claim: $P(|\bar{X} - \mu| \geq \epsilon) \leq \frac{\sigma^2}{n\epsilon^2}$
Chebyshev’s Inequality
For a random variable $X$ with mean $\mu$, variance $\sigma^2$, and any tolerance $\epsilon > 0$:
$$P(|X - \mu| \geq \epsilon) \leq \frac{\sigma^2}{\epsilon^2}$$
That is, if we know the variance of a distribution, we can compute an upper bound for the probability of rare events!
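To see the bound in action (an assumed example, not from the notes), we can compare the empirical tail probability against the Chebyshev bound for an $\mathrm{Exponential}(1)$ random variable, which has $\mu = 1$ and $\sigma^2 = 1$:

```python
import random

# Assumed check: empirical P(|X - mu| >= eps) vs. the Chebyshev bound
# sigma^2 / eps^2 for X ~ Exponential(1), where mu = 1 and sigma^2 = 1.
random.seed(2)
trials = 100000
eps = 2.0

samples = [random.expovariate(1.0) for _ in range(trials)]
empirical = sum(abs(x - 1.0) >= eps for x in samples) / trials
bound = 1.0 / eps**2  # sigma^2 / eps^2 = 0.25

print(empirical, "<=", bound)
```

Chebyshev is a worst-case guarantee over all distributions with that variance, so the bound (0.25) is far looser than the actual tail probability (about $e^{-3} \approx 0.05$ here), but it never fails.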
Law of Large Numbers
The Law of Large Numbers combines Chebyshev’s Inequality with the earlier work on the distribution of the mean. Applying Chebyshev to $\bar{X}$, which has mean $\mu$ and variance $\sigma^2/n$:
$$P(|\bar{X} - \mu| \geq \epsilon) \leq \frac{\sigma^2}{n\epsilon^2}$$
Idea: what happens when we observe a lot of data? Taking the limit as $n \to \infty$:
$$\lim_{n \to \infty} P(|\bar{X} - \mu| \geq \epsilon) = 0$$
That is, the probability that the mean of a sample of random variables is ``far’’ from the inherent expected value eventually goes to zero.
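A running-mean simulation (an assumed sketch, not from the notes) makes the convergence visible: as $n$ grows, the mean of $n$ i.i.d. $\mathrm{Unif}(0, 1)$ draws settles toward $\mu = 0.5$:

```python
import random

# Assumed sketch: track the running mean of i.i.d. Unif(0, 1) draws
# at several sample sizes to watch it approach mu = 0.5.
random.seed(3)
total = 0.0
checkpoints = {}
for n in range(1, 100001):
    total += random.random()
    if n in (10, 100, 1000, 10000, 100000):
        checkpoints[n] = total / n

for n, mean in sorted(checkpoints.items()):
    print(n, mean)
```

The early checkpoints can wander noticeably, but the Chebyshev bound $\sigma^2/(n\epsilon^2)$ forces the deviation probability toward zero as $n$ grows.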
Nerdy Example
How many numbers between zero and one do we have to add up to have a sum that is greater than one?
- Let $U_1, U_2, \ldots \sim \mathrm{Unif}(0, 1)$ be i.i.d.
- Let $N$ be the number of $U_i$ added together to get a sum greater than one
- For a conservative estimate, suppose $\sigma^2 = 1$; then Chebyshev’s Inequality gives $P(|\bar{N} - E[N]| \geq \epsilon) \leq \frac{1}{n\epsilon^2}$
How many trials are needed so that the simulations converge to a mean within 0.01 of the true answer with at least 95 percent probability? Setting $\frac{1}{n(0.01)^2} \leq 0.05$ gives $n \geq 200{,}000$ trials.
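The example can be simulated directly (an assumed sketch, not from the notes); the famous answer is $E[N] = e$. The trial count below is the conservative Chebyshev estimate, assuming $\sigma^2 \leq 1$:

```python
import math
import random

# Assumed simulation: count how many Unif(0, 1) draws are needed for the
# running sum to exceed 1; the true expected count is e ≈ 2.71828.
random.seed(4)

def draws_to_exceed_one():
    total, count = 0.0, 0
    while total <= 1.0:
        total += random.random()
        count += 1
    return count

# Conservative Chebyshev estimate: with sigma^2 <= 1, n = 200,000 trials
# put the simulated mean within 0.01 of e with at least 95% probability.
trials = 200000
estimate = sum(draws_to_exceed_one() for _ in range(trials)) / trials
print(estimate, "vs", math.e)
```

Since the true variance of $N$ is well below 1, the actual accuracy is typically better than the Chebyshev guarantee promises.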
Looking Ahead
due Fri., Mar. 24:
- LHW8
no lecture on Mar. 24 or Apr. 3
Exam 2 will be on Mon., Apr. 10