
Characteristic Function

The characteristic function is a powerful tool in probability theory, often serving as the "probability version" of the Fourier transform. For a real-valued random variable $X$, it is defined as

$$\varphi_X(t) = \mathbb{E}\left[e^{itX}\right], \quad t \in \mathbb{R}$$

Using Euler's formula, we can express this as:

$$\varphi_X(t) = \mathbb{E}[\cos(tX)] + i\,\mathbb{E}[\sin(tX)]$$
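As a quick sanity check, the Euler decomposition can be verified on a sample: estimating $\mathbb{E}[e^{itX}]$ by a sample average gives exactly the same number as estimating the cosine and sine expectations separately. A minimal sketch (the standard normal sample and the value of $t$ are arbitrary choices, not from the text above):

```python
import cmath
import math
import random

random.seed(0)
t = 0.7
xs = [random.gauss(0, 1) for _ in range(10_000)]

# Sample estimate of E[e^{itX}]
lhs = sum(cmath.exp(1j * t * x) for x in xs) / len(xs)

# Sample estimate of E[cos(tX)] + i E[sin(tX)], same sample
rhs = (sum(math.cos(t * x) for x in xs) / len(xs)
       + 1j * sum(math.sin(t * x) for x in xs) / len(xs))

# Euler's formula makes these agree term by term
assert abs(lhs - rhs) < 1e-12
```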

One of the most useful properties of characteristic functions is handling sums of independent variables.

Let $X \sim \text{Poi}(\lambda)$. The probability mass function is:

$$\mathbb{P}(X=k) = e^{-\lambda} \frac{\lambda^k}{k!}, \quad k = 0, 1, \dots$$

We compute its characteristic function:

$$\begin{aligned} \varphi_X(t) &= \mathbb{E}[e^{itX}] = \sum_{k=0}^\infty e^{itk} e^{-\lambda} \frac{\lambda^k}{k!} \\ &= e^{-\lambda} \sum_{k=0}^\infty \frac{(\lambda e^{it})^k}{k!} \\ &= e^{-\lambda} e^{\lambda e^{it}} = e^{\lambda(e^{it} - 1)} \end{aligned}$$
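The series computation above can be checked numerically: summing the truncated series $\sum_k e^{itk}\, e^{-\lambda} \lambda^k / k!$ matches the closed form $e^{\lambda(e^{it}-1)}$. A sketch with arbitrarily chosen $\lambda$ and $t$:

```python
import cmath
import math

lam, t = 3.0, 1.2

# Direct truncated sum of e^{itk} * P(X = k); terms decay factorially,
# so 80 terms is far more than enough for lambda = 3
series = sum(cmath.exp(1j * t * k) * math.exp(-lam) * lam**k / math.factorial(k)
             for k in range(80))

# Closed form derived above
closed = cmath.exp(lam * (cmath.exp(1j * t) - 1))

assert abs(series - closed) < 1e-12
```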

Now let $Y \sim \text{Poi}(\eta)$ be independent of $X$. Using the convolution property:

$$\begin{aligned} \varphi_{X+Y}(t) &= \varphi_X(t)\, \varphi_Y(t) \\ &= e^{\lambda(e^{it} - 1)} e^{\eta(e^{it} - 1)} \\ &= e^{(\lambda + \eta)(e^{it} - 1)} \end{aligned}$$

We observe that this is exactly the characteristic function of a random variable distributed as $\text{Poi}(\lambda + \eta)$. Since a characteristic function uniquely determines the distribution, we conclude $X + Y \sim \text{Poi}(\lambda + \eta)$.
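The same conclusion can be confirmed directly at the level of probability mass functions: convolving the two Poisson PMFs reproduces the $\text{Poi}(\lambda + \eta)$ PMF term by term. A sketch with arbitrarily chosen $\lambda$ and $\eta$:

```python
import math

def poi_pmf(k: int, lam: float) -> float:
    """P(X = k) for X ~ Poi(lam)."""
    return math.exp(-lam) * lam**k / math.factorial(k)

lam, eta = 2.0, 5.0

for k in range(10):
    # P(X + Y = k) by discrete convolution over independent X, Y
    conv = sum(poi_pmf(j, lam) * poi_pmf(k - j, eta) for j in range(k + 1))
    # ...equals the Poi(lam + eta) mass at k
    assert abs(conv - poi_pmf(k, lam + eta)) < 1e-12
```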

Inversion Formula (Formalizing Uniqueness)


Let $X \sim \mathcal{N}(0, 1)$. The density is $f(x) = \frac{1}{\sqrt{2\pi}} e^{-x^2/2}$.

$$\begin{aligned} \varphi(t) &= \int_{-\infty}^\infty e^{itx} \frac{1}{\sqrt{2\pi}} e^{-x^2/2} \, dx \\ &= \frac{1}{\sqrt{2\pi}} \int_{-\infty}^\infty e^{-x^2/2 + itx} \, dx \\ &= \frac{e^{-t^2/2}}{\sqrt{2\pi}} \int_{-\infty}^\infty e^{-(x - it)^2/2} \, dx \end{aligned}$$

By substituting $y = x - it$ and noting that the integral of the Gaussian PDF is 1 (even with a complex shift, which can be rigorously justified using contour integration):

$$\int_{-\infty}^\infty \frac{1}{\sqrt{2\pi}} e^{-(x-it)^2/2} \, dx = 1$$

Thus:

$$\varphi(t) = e^{-t^2/2}$$
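This closed form can be verified by approximating the defining integral numerically. The sketch below uses a midpoint rule on $[-10, 10]$ (an arbitrary truncation; the Gaussian tails beyond that range are negligible):

```python
import cmath
import math

def phi_numeric(t: float, lo: float = -10.0, hi: float = 10.0, n: int = 20_000) -> complex:
    """Midpoint-rule approximation of the integral of e^{itx} * N(0,1) density."""
    h = (hi - lo) / n
    return sum(
        cmath.exp(1j * t * x) * math.exp(-x * x / 2) / math.sqrt(2 * math.pi) * h
        for x in (lo + (i + 0.5) * h for i in range(n))
    )

# Compare against the closed form e^{-t^2/2} at a few points
for t in (0.0, 0.5, 1.5):
    assert abs(phi_numeric(t) - math.exp(-t * t / 2)) < 1e-5
```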

Let $Y \sim \mathcal{N}(\mu, \sigma^2)$. We can write $Y = \mu + \sigma X$ where $X \sim \mathcal{N}(0, 1)$. Using the linear transformation property of characteristic functions, $\varphi_{aX+b}(t) = e^{itb}\varphi_X(at)$:

$$\begin{aligned} \varphi_Y(t) &= e^{it\mu} \varphi_X(\sigma t) \\ &= e^{it\mu} e^{-(\sigma t)^2/2} \\ &= e^{it\mu - \sigma^2 t^2/2} \end{aligned}$$
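A Monte Carlo sketch of this result: the empirical characteristic function of $\mu + \sigma X$ should be close to $e^{it\mu - \sigma^2 t^2/2}$. The values of $\mu$, $\sigma$, and $t$ below are arbitrary illustration choices:

```python
import cmath
import random

random.seed(1)
mu, sigma, t = 1.5, 2.0, 0.4

# Sample Y = mu + sigma * X with X ~ N(0, 1)
xs = [random.gauss(0, 1) for _ in range(200_000)]
emp = sum(cmath.exp(1j * t * (mu + sigma * x)) for x in xs) / len(xs)

# Closed form derived above
theory = cmath.exp(1j * t * mu - sigma**2 * t**2 / 2)

# Monte Carlo error is O(1/sqrt(n)), so a loose tolerance suffices
assert abs(emp - theory) < 0.02
```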

Let $Y \sim \mathcal{N}(\mu_1, \sigma_1^2)$ and $Z \sim \mathcal{N}(\mu_2, \sigma_2^2)$ be independent.

$$\begin{aligned} \varphi_{Y+Z}(t) &= \varphi_Y(t)\, \varphi_Z(t) \\ &= \exp\left(it\mu_1 - \frac{\sigma_1^2 t^2}{2}\right) \exp\left(it\mu_2 - \frac{\sigma_2^2 t^2}{2}\right) \\ &= \exp\left(it(\mu_1 + \mu_2) - \frac{(\sigma_1^2 + \sigma_2^2)t^2}{2}\right) \end{aligned}$$

This is precisely the characteristic function of a Normal distribution with mean $\mu_1 + \mu_2$ and variance $\sigma_1^2 + \sigma_2^2$.

$$\implies Y + Z \sim \mathcal{N}(\mu_1 + \mu_2,\; \sigma_1^2 + \sigma_2^2)$$
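A simulation sketch of this conclusion: sampling $Y + Z$ directly and checking that the sample mean and variance match $\mu_1 + \mu_2$ and $\sigma_1^2 + \sigma_2^2$ (the parameter values are arbitrary illustration choices; note `random.gauss` takes the standard deviation, not the variance):

```python
import random
import statistics

random.seed(2)
mu1, s1 = 1.0, 2.0   # Y ~ N(1, 4)
mu2, s2 = -0.5, 1.5  # Z ~ N(-0.5, 2.25)

# Independent draws of Y and Z, summed
samples = [random.gauss(mu1, s1) + random.gauss(mu2, s2) for _ in range(200_000)]

# Sample mean ~ mu1 + mu2, sample variance ~ s1^2 + s2^2
assert abs(statistics.fmean(samples) - (mu1 + mu2)) < 0.05
assert abs(statistics.variance(samples) - (s1**2 + s2**2)) < 0.15
```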