
Weak Convergence

Also called convergence in distribution.

This is the classic “Law of Rare Events”. If we have a sequence of Binomial distributions where the number of trials $n$ goes to infinity and the probability of success $p_n$ goes to 0 such that $n p_n \to \lambda$, the distribution converges to a Poisson distribution.

Let $X_n \sim \text{Bin}(n, p_n)$ with $p_n = \lambda/n$. The probability mass function (PMF) for a fixed $k$ is:

$$
\begin{aligned}
\mathbb{P}(X_n = k) &= \binom{n}{k} p_n^k (1-p_n)^{n-k} \\
&= \frac{n(n-1)\dots(n-k+1)}{k!} \left(\frac{\lambda}{n}\right)^k \left(1-\frac{\lambda}{n}\right)^{n-k} \\
&= \frac{\lambda^k}{k!} \underbrace{\frac{n(n-1)\dots(n-k+1)}{n^k}}_{\to 1} \underbrace{\left(1-\frac{\lambda}{n}\right)^n}_{\to e^{-\lambda}} \underbrace{\left(1-\frac{\lambda}{n}\right)^{-k}}_{\to 1}
\end{aligned}
$$

Taking the limit as $n \to \infty$:

$$
\lim_{n \to \infty} \mathbb{P}(X_n = k) = \frac{\lambda^k e^{-\lambda}}{k!}
$$

This is the PMF of a Poisson$(\lambda)$ distribution. Thus $X_n \xrightarrow{d} \text{Pois}(\lambda)$.
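The limit above is easy to check numerically. The sketch below (stdlib only; the helper names `binom_pmf` and `pois_pmf` are my own, not from the text) evaluates the Binomial$(n, \lambda/n)$ PMF at a fixed $k$ for increasing $n$ and compares it with the Poisson$(\lambda)$ PMF:

```python
import math

def binom_pmf(n, p, k):
    """P(X = k) for X ~ Bin(n, p)."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def pois_pmf(lam, k):
    """P(X = k) for X ~ Pois(lam)."""
    return lam**k * math.exp(-lam) / math.factorial(k)

lam, k = 3.0, 2
for n in (10, 100, 10_000):
    p_n = lam / n  # n * p_n = lam stays fixed as n grows
    print(n, binom_pmf(n, p_n, k), pois_pmf(lam, k))
```

As $n$ grows the two columns agree to more and more digits, matching the term-by-term limits in the derivation.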

A classic example is the convergence of a scaled Geometric distribution to an Exponential distribution.

Consider a sequence of independent Bernoulli trials with success probability $p$. Let $X_p$ be the number of trials until the first success. This follows a Geometric distribution with parameter $p$:

$$
\begin{aligned}
\mathbb{P}(X_p = n) &= p(1-p)^{n-1} \\
\mathbb{P}(X_p > n) &= (1-p)^n
\end{aligned}
$$

for $n = 1, 2, \dots$. Note that $X_p > n$ means the first $n$ trials were failures.

As $p \to 0$, we analyze the distribution function of $p X_p$, i.e., $\mathbb{P}(p X_p \le x)$. It is easiest to work with the complementary tail probability:

$$
\begin{aligned}
\lim_{p \to 0} \mathbb{P}(p X_p > x) &= \lim_{p \to 0} \mathbb{P}\left(X_p > \frac{x}{p}\right) \\
&= \lim_{p \to 0} (1-p)^{x/p}
\end{aligned}
$$

Recall the limit definition of the exponential function: $\lim_{m \to \infty} \left(1 - \frac{1}{m}\right)^m = e^{-1}$. To apply it, we make the substitution $m = 1/p$; as $p \to 0$, $m \to \infty$.

$$
\lim_{p \to 0} (1-p)^{x/p} = \lim_{m \to \infty} \left(1 - \frac{1}{m}\right)^{mx} = \left[ \lim_{m \to \infty} \left(1 - \frac{1}{m}\right)^m \right]^x = e^{-x}
$$

Thus:

$$
\lim_{p \to 0} \mathbb{P}(p X_p \le x) = 1 - e^{-x} \quad \text{for all } x > 0
$$

This is precisely the CDF of an Exponential distribution with rate parameter $\lambda = 1$:

$$
p X_p \xrightarrow{d} \text{Exp}(1) \quad \text{as } p \to 0
$$
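This limit can also be checked numerically. A small sketch (stdlib only; the helper name `geom_tail` is my own): since $X_p$ is integer-valued, $\mathbb{P}(X_p > x/p) = (1-p)^{\lfloor x/p \rfloor}$, which should approach $e^{-x}$ as $p \to 0$.

```python
import math

def geom_tail(p, n):
    """P(X_p > n) = (1-p)^n for X_p ~ Geometric(p) on {1, 2, ...}."""
    return (1 - p)**n

x = 1.5
for p in (0.1, 0.01, 1e-4):
    # floor(x/p) because X_p only takes integer values
    tail = geom_tail(p, math.floor(x / p))
    print(p, tail, math.exp(-x))
```

The flooring does not affect the limit, since $(1-p)^{\lfloor x/p \rfloor}$ and $(1-p)^{x/p}$ differ by a factor that tends to 1.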

Convergence in Probability Implies Weak Convergence


We know that convergence in probability is stronger than convergence in distribution. Here we provide a proof.

Does $X_n \xrightarrow{d} X \implies X_n \xrightarrow{P} X$? In general, no: convergence in distribution does not require the variables to be close in value (or even defined on the same probability space).
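A standard counterexample illustrates this (the simulation below is my own sketch, not from the text): take $X \sim N(0,1)$ and $X_n = -X$ for every $n$. By symmetry, $X_n$ has the same distribution as $X$, so $X_n \xrightarrow{d} X$ trivially. But $|X_n - X| = 2|X|$, which never shrinks, so $X_n \not\xrightarrow{P} X$.

```python
import random

random.seed(0)

# X ~ N(0,1); X_n = -X has the same distribution by symmetry,
# so X_n -> X in distribution holds for free.
samples = [random.gauss(0.0, 1.0) for _ in range(100_000)]

# But |X_n - X| = 2|X|, so P(|X_n - X| > eps) does not vanish.
# eps and the sample size are illustrative choices.
eps = 0.1
frac = sum(1 for x in samples if 2 * abs(x) > eps) / len(samples)
print(frac)  # ~ P(2|X| > 0.1) ≈ 0.96, the same for every n
```

The estimated probability stays near $\mathbb{P}(2|X| > 0.1) \approx 0.96$ no matter how large $n$ is, so the convergence-in-probability condition fails for every $\varepsilon$ small enough.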

However, if the limit $X$ is a constant $c$, the converse does hold: $X_n \xrightarrow{d} c$ implies $X_n \xrightarrow{P} c$.