Modes of Convergence

We define three standard modes of convergence for random variables here. Convergence in distribution (Weak Convergence) is treated separately due to its depth.

Almost Sure Convergence, written $X_n \xrightarrow{a.s.} X$, has two equivalent formulations:

  • $\mathbb{P}(\{ \omega : X_n(\omega) \to X(\omega) \}) = 1$
  • $\mathbb{P}(\{ \omega : \lim_{n \to \infty} X_n(\omega) \neq X(\omega) \}) = 0$

Essentially, the set of sample paths $\omega$ where the sequence fails to converge has probability zero.
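This can be checked empirically. Below is a minimal Monte Carlo sketch in Python/NumPy; the sequence $X_n(\omega) = \omega^n$ on $\Omega = [0,1]$ is my own illustrative choice (not from the notes above): it converges to $0$ for every $\omega \in [0,1)$ and fails only on the null set $\{1\}$, so $X_n \to 0$ almost surely.

```python
import numpy as np

# Omega = [0, 1] with Lebesgue measure; sample points uniformly.
rng = np.random.default_rng(0)
omegas = rng.uniform(0.0, 1.0, size=100_000)

# Illustrative sequence: X_n(omega) = omega**n. Pointwise it converges
# to 0 except at omega = 1, a set of probability zero.
n = 10_000
values = omegas ** n

# Fraction of sampled paths already within 1e-6 of the limit 0.
fraction_converged = np.mean(values < 1e-6)
print(fraction_converged)  # close to 1
```

For any fixed tolerance, pushing $n$ higher drives this fraction to $1$, mirroring $\mathbb{P}(\{\omega : X_n(\omega) \to X(\omega)\}) = 1$.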

$L^p$ Convergence is also called Mean Convergence. Let’s look at some context to better understand the definition.

  • $X \in L^p$ means $\mathbb{E}[|X|^p] < \infty$.
  • Norm: in a general measure space, $\|f\|_p = \left( \int |f|^p \, d\mu \right)^{1/p}$.
  • In a probability space ($L^p(\Omega, \mathcal{F}, \mathbb{P})$), recall that since expectation is Lebesgue integration with respect to the probability measure, this becomes $\|X\|_p = \left( \mathbb{E}[|X|^p] \right)^{1/p}$.
  • Thus, $X_n \xrightarrow{L^p} X \iff \|X_n - X\|_p \to 0$.
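As a quick numerical illustration of the norm formula (my own example, not from the notes): for $X \sim \mathrm{Uniform}[0,1]$ we have $\mathbb{E}[X^p] = 1/(p+1)$, so $\|X\|_2 = 1/\sqrt{3} \approx 0.577$, which a Monte Carlo estimate recovers.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(0.0, 1.0, size=1_000_000)

p = 2
# ||X||_p = (E[|X|^p])^{1/p}. For X ~ Uniform[0,1], E[X^p] = 1/(p+1),
# so the exact value is ||X||_2 = 1/sqrt(3) ~ 0.577.
norm_est = np.mean(np.abs(x) ** p) ** (1.0 / p)
print(norm_est)  # close to 0.577
```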

Recall that Convergence in Probability, $X_n \xrightarrow{P} X$, means $\mathbb{P}(|X_n - X| > \epsilon) \to 0$ for every $\epsilon > 0$. In general, Convergence in Probability implies neither Almost Sure nor $L^p$ convergence.

The following example shows that convergence in probability does not imply almost sure convergence.

Consider $\Omega = [0,1]$ with Lebesgue measure. Let $X = 0$. Define $X_n$ as indicator functions of intervals that “scan” across $[0,1]$ repeatedly.

  • $X_1 = \mathbb{1}_{[0,1]}$
  • $X_2 = \mathbb{1}_{[0,1/2]}, \quad X_3 = \mathbb{1}_{[1/2,1]}$
  • $X_4 = \mathbb{1}_{[0,1/3]}, \quad X_5 = \mathbb{1}_{[1/3,2/3]}, \quad X_6 = \mathbb{1}_{[2/3,1]}$
  • And so on.

Typewriter Sequence

  • Convergence in Probability: Yes. The support of $X_n$ has measure $1/k$ (where $k$ indexes the group of intervals of width $1/k$), which tends to $0$. Thus $\mathbb{P}(|X_n| > \epsilon) \to 0$.
  • Almost Sure: No. Any point $\omega \in [0,1]$ is covered by the scanning intervals infinitely often as the sequence cycles, so $\lim X_n(\omega)$ does not exist: the sequence takes the values $0$ and $1$ each infinitely often. $$X_n \xrightarrow{P} 0 \quad \text{but} \quad X_n \not\xrightarrow{a.s.} 0$$
  • Note: This sequence does converge in $L^p$, since $\mathbb{E}[|X_n|^p]$ equals the length of the supporting interval, which tends to $0$.
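The typewriter construction can be made concrete. The sketch below (plain Python; the indexing helper `typewriter_interval` is my own) reproduces both observations for the first 100 groups: the support length $1/k$ shrinks, yet a fixed $\omega$ keeps being covered, once per group.

```python
def typewriter_interval(n):
    """Interval [a, b) of the n-th indicator (1-indexed): group k
    contributes the k intervals [(j-1)/k, j/k) for j = 1, ..., k."""
    k = 1
    while n > k:
        n -= k
        k += 1
    return (n - 1) / k, n / k

# Groups 1..100 occupy indices 1..5050 (since 1 + 2 + ... + 100 = 5050).
# Convergence in probability: the support length 1/k shrinks to 0.
lengths = [b - a for a, b in (typewriter_interval(n) for n in range(1, 5051))]
print(lengths[-1])  # group k = 100, length 1/100

# No almost-sure convergence: a fixed omega is hit once per group,
# so X_n(omega) = 1 infinitely often and the limit cannot exist.
omega = 0.3
hits = [n for n in range(1, 5051)
        if typewriter_interval(n)[0] <= omega < typewriter_interval(n)[1]]
print(len(hits))  # one hit per group: 100
```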

The following example shows that convergence in probability does not imply LpL^p convergence.

Consider the same space. Let $X_n$ be tall, thin rectangles that maintain constant area.

$$X_n = n \cdot \mathbb{1}_{[0, 1/n]}$$

Escaping Mass Sequence

  • Convergence in Probability: Yes. The support is $[0, 1/n]$, which has measure $1/n \to 0$, so for any fixed $\epsilon > 0$ (and $n > \epsilon$), $\mathbb{P}(|X_n| > \epsilon) = 1/n \to 0$.
  • $L^p$ Convergence: No. $\mathbb{E}[|X_n - 0|] = \int_0^{1/n} n \, dx = 1 \not\to 0$ (for $p > 1$, $\mathbb{E}[|X_n|^p] = n^{p-1} \to \infty$). $$X_n \xrightarrow{P} 0 \quad \text{but} \quad X_n \not\xrightarrow{L^1} 0$$
  • Note: This sequence does converge almost surely to $0$. For any $\omega > 0$, eventually $1/n < \omega$, so $X_n(\omega) = 0$ for all large $n$. (Convergence fails only at $\omega = 0$, which has probability $0$.)
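The escaping-mass behavior can also be seen by simulation. Below is a Monte Carlo sketch in Python/NumPy (the sampling setup is my own): the probability of the support shrinks like $1/n$, while the $L^1$ mass stays pinned near $1$ because the rectangle's height grows to compensate.

```python
import numpy as np

rng = np.random.default_rng(2)
omegas = rng.uniform(0.0, 1.0, size=1_000_000)

for n in (10, 100, 1000):
    xn = n * (omegas <= 1.0 / n)       # X_n = n * 1_{[0, 1/n]}
    prob = np.mean(np.abs(xn) > 0.5)   # ~ 1/n -> 0: convergence in probability
    mass = np.mean(np.abs(xn))         # ~ 1 for every n: no L^1 convergence
    print(n, prob, mass)
```

The printed `prob` column collapses toward $0$ while the `mass` column stays near $1$ for every $n$: exactly the split between convergence in probability and $L^1$ convergence.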