# Law of Large Numbers

The Law of Large Numbers sits alongside the Central Limit Theorem as one of the cornerstone results of probability. The Weak Law of Large Numbers states that the sample average converges in probability to the expected value. There is also a Strong version, which I won’t discuss here. Both versions matter in practice: they imply that larger samples give better estimates of population averages.

Mathematically, the Weak Law states that if $X_{1}, X_{2}, \ldots$ is a sequence of iid random variables, each with finite mean $E[X_{i}] = \mu$, then for any $\epsilon > 0$,

$P\left\{\left|\frac{1}{n}\sum_{i=1}^{n} X_{i} - \mu\right| \geq \epsilon\right\} \longrightarrow 0$ as $n \longrightarrow \infty$
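Before proving this, we can watch it in action. The short simulation below tracks the sample mean of fair die rolls (true mean $\mu = 3.5$) as the sample size grows; the seed and sample sizes are arbitrary illustrative choices, not part of the theorem.

```python
import random

# Watch the sample mean of fair die rolls (true mean mu = 3.5) settle
# toward mu as n grows. Illustrative sketch: the seed and the sample
# sizes are arbitrary choices.
random.seed(0)

mu = 3.5
for n in [10, 1_000, 100_000]:
    sample_mean = sum(random.randint(1, 6) for _ in range(n)) / n
    print(f"n = {n:>6}: |sample mean - mu| = {abs(sample_mean - mu):.4f}")
```

The deviation from $\mu$ shrinks as $n$ grows, exactly as the Weak Law predicts.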

To prove this, we first need two classical inequalities, both named after Russian mathematicians.

Markov’s inequality:
If $X$ is a random variable that takes only non-negative values, then for any $a > 0$,

$P \{X \geq a\} \leq \frac{E[X]}{a}$

Proof: For $a > 0$, let $I = \begin{cases} 1 & \text{if } X \geq a \\ 0 & \text{if } X < a \end{cases}$

and note that since $X \geq 0$, we have $I \leq \frac{X}{a}$: if $X \geq a$ then $\frac{X}{a} \geq 1 = I$, and if $X < a$ then $\frac{X}{a} \geq 0 = I$.

Taking expectations of the previous inequality yields $E[I] \leq \frac{E[X]}{a}$.

Since $E[I] = P \{ X \geq a \}$, we have $P \{X \geq a\} \leq \frac{E[X]}{a}$.
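As a quick numerical sanity check of Markov's inequality, we can compare the bound against an empirical tail probability. The exponential distribution, seed, and threshold $a = 3$ below are illustrative choices of mine, not part of the proof.

```python
import random

# Empirical check of Markov's inequality, P{X >= a} <= E[X] / a, using
# X ~ Exponential(1), which is non-negative with E[X] = 1. The threshold
# a = 3 and the sample size are arbitrary illustrative choices.
random.seed(1)

samples = [random.expovariate(1.0) for _ in range(100_000)]
a = 3.0
empirical = sum(x >= a for x in samples) / len(samples)
bound = 1.0 / a  # E[X] / a with E[X] = 1
print(f"P{{X >= {a}}} ~ {empirical:.4f}  vs  Markov bound {bound:.4f}")
```

Note how loose the bound is here: Markov's inequality trades tightness for generality, since it uses nothing about $X$ beyond its mean and non-negativity.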

Chebyshev’s inequality:
If $X$ is a random variable with finite mean $\mu$ and finite variance $\sigma^{2}$, then for any $k > 0$,

$P\{|X -\mu| \geq k\} \leq \frac{\sigma^{2}}{k^{2}}$

Proof: Since $(X-\mu)^2 \geq 0$, we can apply Markov’s inequality with $a = k^2$ to obtain

$P \{(X-\mu)^{2} \geq k^{2}\} \leq \frac{E[(X-\mu)^{2}]}{k^{2}}$

Since $(X-\mu)^{2} \geq k^{2}$ if and only if $|X-\mu| \geq k$, we have

$P \{|X-\mu| \geq k\} = P \{(X-\mu)^{2} \geq k^{2}\} \leq \frac{E[(X-\mu)^{2}]}{k^{2}} = \frac{\sigma^{2}}{k^{2}}$
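Chebyshev's inequality can be checked numerically in the same way as Markov's. The standard normal distribution and the choice $k = 2$ below are my own illustrative picks.

```python
import random

# Empirical check of Chebyshev's inequality, P{|X - mu| >= k} <= sigma^2/k^2,
# using X ~ Normal(0, 1), so mu = 0 and sigma = 1. The distribution and
# the choice k = 2 are arbitrary illustrations.
random.seed(2)

samples = [random.gauss(0.0, 1.0) for _ in range(100_000)]
k = 2.0
empirical = sum(abs(x) >= k for x in samples) / len(samples)
bound = 1.0 / k**2  # sigma^2 / k^2 with sigma = 1
print(f"P{{|X| >= {k}}} ~ {empirical:.4f}  vs  Chebyshev bound {bound:.4f}")
```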

With the above ingredients we are now ready to provide a proof of the Weak Law of Large Numbers:

Assume, as above, that the $X_{i}$ are iid with finite mean $\mu$ and, additionally, finite variance $\sigma^{2}$. (The Weak Law actually holds with a finite mean alone, but assuming finite variance makes for a short proof.) Then

$E[\frac{1}{n}\sum_{i=1}^{n} X_{i}] = \mu$ and $Var[\frac{1}{n}\sum_{i=1}^{n} X_{i}] = \frac{\sigma^2}{n}$

and it follows from Chebyshev’s inequality that

$P\left\{\left|\frac{1}{n}\sum_{i=1}^{n} X_{i} - \mu\right| \geq \epsilon\right\} \leq \frac{\sigma^2}{n \epsilon^{2}}$

Now, taking the limit as $n \longrightarrow \infty$, the right-hand side tends to $0$, which gives the desired result.
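The final bound $\frac{\sigma^2}{n \epsilon^{2}}$ can also be watched shrinking in a simulation. Below I compare it against the empirical probability that a sample mean of fair die rolls ($\mu = 3.5$, $\sigma^2 = 35/12$) deviates from $\mu$ by at least $\epsilon$; the trial count, $\epsilon$, and sample sizes are illustrative choices of mine.

```python
import random

# Compare the Chebyshev bound sigma^2 / (n * eps^2) from the proof with
# the empirical probability P{|sample mean - mu| >= eps}, for means of n
# fair die rolls (mu = 3.5, sigma^2 = 35/12). The trial count, eps, and
# sample sizes are illustrative choices.
random.seed(3)

mu, var, eps, trials = 3.5, 35 / 12, 0.25, 2_000
empirical, bound = {}, {}
for n in [10, 100, 1000]:
    exceed = 0
    for _ in range(trials):
        mean = sum(random.randint(1, 6) for _ in range(n)) / n
        exceed += abs(mean - mu) >= eps
    empirical[n] = exceed / trials
    bound[n] = var / (n * eps**2)
    print(f"n = {n:>4}: empirical = {empirical[n]:.4f}, bound = {bound[n]:.4f}")
```

Both columns decrease with $n$; the bound is loose (for $n = 10$ it exceeds $1$ and says nothing), but it is enough to force the probability to $0$, which is all the proof needs.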