Cumulative Distribution Functions

Keith A. Lewis

April 25, 2024

Abstract
Facts about cumulative distribution functions

We collect some facts about random variables, their cumulants, and derivatives that are useful for option pricing.

Esscher

Recall the moment generating function of a random variable X is M(s) = E[e^{sX}] and its cumulant is κ(s) = \log M(s). Let M(s, x) = E[1(X\le x) e^{sX}] be the incomplete moment generating function.

The Esscher transform of the random variable X with parameter s has cumulative distribution function F_s(x) = M(s, x)/M(s) = E[1(X\le x) e^{s X - κ(s)}]. Let ε_s(x) = e^{s x - κ(s)} and note E[ε_s(X)] = 1 so F_s is a cdf.

Write E_s for the Esscher transformed measure, so for any reasonable function g we have E_s[g(X)] = E[g(X)ε_s(X)]. Since ∂_s ε_s(x) = ε_s(x)(x - κ'(s)), ∂_s E_s[g(X)] = ∂_s E[g(X) ε_s(X)] = E[g(X) ε_s(X) (X - κ'(s))] = E_s[g(X) (X - κ'(s))]. Define the partial cumulant by κ(s, x) = \log M(s,x), so ∂_s F_s(x) = ∂_s e^{κ(s, x) - κ(s)} = F_s(x) (κ'(s, x) - κ'(s)), where κ'(s,x) = ∂_s κ(s,x).
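The identity ∂_s F_s(x) = F_s(x)(κ'(s, x) - κ'(s)) can be checked numerically for the standard normal, where everything is in closed form: κ(s) = s²/2, F_s(x) = Φ(x - s), and the identity reduces to ∂_s Φ(x - s) = -φ(x - s). A minimal sketch (not part of the text; the function names are mine):

```python
# Check d/ds F_s(x) = F_s(x)(k'(s, x) - k'(s)) for the standard normal,
# where it reduces to d/ds Phi(x - s) = -phi(x - s).
import math

def Phi(x):
    """Standard normal cdf via the error function."""
    return (1 + math.erf(x / math.sqrt(2))) / 2

def phi(x):
    """Standard normal density."""
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def F(s, x):
    """Esscher transformed cdf of the standard normal: Phi(x - s)."""
    return Phi(x - s)

def dF_ds(s, x, h=1e-6):
    """Central difference approximation of d/ds F_s(x)."""
    return (F(s + h, x) - F(s - h, x)) / (2 * h)

s, x = 0.5, 1.0
# kappa'(s) = s and kappa'(s, x) = s - phi(x - s)/Phi(x - s), so
# F_s(x)(kappa'(s, x) - kappa'(s)) = -phi(x - s).
assert abs(dF_ds(s, x) + phi(x - s)) < 1e-8
```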

Distributions

We gather some facts about the distributions and cumulants of particular random variables.

Discrete

A discrete random variable is defined by the values it can take, \{x_j\}, and the probability it takes on those values, P(X = x_j) = p_j, where p_j > 0 and \sum_j p_j = 1. It has cdf F(x) = \sum_j 1(x_j\le x) p_j and density f(x) = \sum_j δ_{x_j}(x) p_j where δ_a is the delta function, or point mass, at a. The Esscher transform of a discrete random variable is discrete and takes the same values with P(X_s = x_j) = ε_s(x_j) p_j.
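A minimal sketch of the discrete case: the transformed probabilities are proportional to e^{s x_j} p_j, normalized by M(s) = E[e^{sX}], so they sum to one and a positive s tilts mass toward larger values. (The values and probabilities below are illustrative.)

```python
# Esscher transform of a discrete random variable: same values,
# probabilities e^{s x_j} p_j / M(s).
import math

def esscher(xs, ps, s):
    """Return the Esscher transformed probabilities for values xs, probs ps."""
    M = sum(p * math.exp(s * x) for x, p in zip(xs, ps))  # M(s) = E[e^{sX}]
    return [p * math.exp(s * x) / M for x, p in zip(xs, ps)]

xs = [-1.0, 0.0, 2.0]
ps = [0.25, 0.5, 0.25]
qs = esscher(xs, ps, 0.3)
assert abs(sum(qs) - 1) < 1e-12  # transformed probabilities sum to one
assert qs[2] > ps[2]             # s > 0 tilts mass toward larger values
```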

Normal

The standard normal random variable X has density φ(x) = \exp(-x^2/2)/\sqrt{2\pi}, -\infty < x < \infty. The cdf can be expressed in terms of error functions as Φ(x) = (1 + \operatorname{erf}(x/\sqrt{2}))/2 = 1 - \operatorname{erfc}(x/\sqrt{2})/2. Since \exp(sx - s^2/2) \exp(-x^2/2) = \exp(-(x - s)^2/2) the Esscher transformed density is φ_s(x) = φ(x - s).

The derivatives of the density are φ^{(n)}(x) = (-1)^nφ(x)H_n(x) where H_n are the (probabilists') Hermite polynomials. They satisfy the recurrence H_0(x) = 1, H_1(x) = x and H_{n+1}(x) = x H_n(x) - n H_{n-1}(x), n\ge 1.
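The recurrence translates directly into code. The sketch below builds H_n iteratively and checks φ'' = φ(x)(x² - 1) against a finite difference of the density:

```python
# Compute phi^(n)(x) = (-1)^n phi(x) H_n(x) via the Hermite recurrence
# H_{n+1}(x) = x H_n(x) - n H_{n-1}(x) and check against a finite difference.
import math

def phi(x):
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def hermite(n, x):
    """H_n(x): H_0 = 1, H_1 = x, H_{n+1} = x H_n - n H_{n-1}."""
    h0, h1 = 1.0, x
    if n == 0:
        return h0
    for k in range(1, n):
        h0, h1 = h1, x * h1 - k * h0
    return h1

def phi_deriv(n, x):
    return (-1) ** n * phi(x) * hermite(n, x)

x, h = 0.7, 1e-4
fd2 = (phi(x + h) - 2 * phi(x) + phi(x - h)) / (h * h)  # numerical phi''
assert abs(phi_deriv(2, x) - fd2) < 1e-6
```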

Poisson

The Poisson distribution with parameter λ has probability mass function P(X = n) = e^{-λ}λ^n/n!, n\ge 0, and moment generating function M(s) = E[e^{s X}] = \exp(λ(e^s - 1)) for all s. Since \exp(sn - λ(e^s - 1)) e^{-λ}λ^n/n! = e^{-λe^s} (λ e^s)^n/n! the Esscher transformed distribution is also Poisson with parameter λe^s.
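The claim can be verified pointwise: tilting the Poisson(λ) mass function by e^{sn - κ(s)} reproduces the Poisson(λe^s) mass function. A short sketch:

```python
# Verify the Esscher transform of Poisson(lambda) with parameter s is
# Poisson(lambda e^s): compare e^{sn - kappa(s)} p_n with the tilted pmf.
import math

def poisson_pmf(lam, n):
    return math.exp(-lam) * lam ** n / math.factorial(n)

lam, s = 2.0, 0.4
kappa = lam * (math.exp(s) - 1)  # cumulant of the Poisson
for n in range(20):
    transformed = math.exp(s * n - kappa) * poisson_pmf(lam, n)
    assert abs(transformed - poisson_pmf(lam * math.exp(s), n)) < 1e-12
```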

Since the Esscher transform is Poisson with parameter μ = λe^s and ∂_μ E[g(X_μ)] = E[g(X_μ + 1) - g(X_μ)], taking a derivative with respect to s we have ∂_s E_s[g(X_λ)] = λ e^s E[g(X_{λ e^s} + 1) - g(X_{λ e^s})]. Taking g = 1(\cdot\le x) and noting 1(k + 1\le x) - 1(k\le x) = -1(k = n) for integer k, where n = \lfloor x\rfloor, gives ∂_s E_s[1(X_λ \le x)] = -λ e^s e^{-λ e^s} (λe^s)^n/n!.

Exponential

The density of an exponential with parameter λ is f(x) = λ\exp(-λ x), x\ge 0. The moment generating function is M(s) = E[\exp(sX)] = \int_0^\infty \exp(sx) λ\exp(-λ x)\,dx = λ/(λ - s), s < λ. The Esscher transformed density is also exponential, with parameter λ - s.
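Both facts are easy to check numerically: the trapezoidal integral of e^{sx}f(x) should match λ/(λ - s), and f(x)e^{sx}/M(s) should equal the exponential density with parameter λ - s. A sketch with illustrative parameter values:

```python
# Check M(s) = lambda/(lambda - s) by numerical integration, and that the
# tilted density f(x) e^{sx}/M(s) is exponential with parameter lambda - s.
import math

lam, s = 1.5, 0.5  # requires s < lam

def f(x, rate=lam):
    return rate * math.exp(-rate * x)

# crude trapezoidal estimate of E[e^{sX}] on [0, 50]; the tail beyond 50
# is negligible since the integrand decays like e^{-(lam - s)x}
n, b = 200000, 50.0
h = b / n
M = sum(math.exp(s * (i * h)) * f(i * h) * (h if 0 < i < n else h / 2)
        for i in range(n + 1))
assert abs(M - lam / (lam - s)) < 1e-6

# tilted density equals the exponential density with parameter lam - s
for x in (0.1, 1.0, 3.0):
    assert abs(f(x) * math.exp(s * x) * (lam - s) / lam - f(x, lam - s)) < 1e-12
```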

Generalized Logistic

A generalized logistic random variable has probability density function f(α,β;x) = c e^{-βx}/(1 + e^{-x})^{α + β}, -\infty < x < \infty, where c = 1/B(α,β) and B(α,β) is the Beta function. If α = 1 and β = 1 this is the standard logistic density. The Esscher transformed density is also generalized logistic, with parameters α + s, β - s.

Using u = 1/(1 + e^{-x}), so e^x = u/(1 - u) and dx = du/(u(1-u)), the moment generating function is \begin{aligned} E[e^{sX}] &= c \int_{-\infty}^\infty e^{sx} e^{-βx}/(1 + e^{-x})^{α + β}\,dx \\ &= c \int_0^1 (u/(1 - u))^{s-β} u^{α + β}\,du/(u(1 - u)) \\ &= c \int_0^1 u^{α + s - 1} (1 - u)^{β - s - 1}\,du \\ &= c B(α + s, β - s)\\ \end{aligned} for -α < s < β, where B(α,β) is the Beta function. Since 1 = cB(α, β), M(s) = B(α + s, β - s)/B(α, β). A similar calculation shows the incomplete moment generating function is M(s,x) = B(α + s, β - s; u)/B(α, β) where u = 1/(1 + e^{-x}), B(α, β; u) is the incomplete Beta function, and I_u(α, β) = B(α, β; u)/B(α, β) is the regularized incomplete Beta function.
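The moment generating function M(s) = B(α + s, β - s)/B(α, β) can be checked by numerically integrating e^{sx}f(x). A sketch, with the complete Beta function computed from log-Gamma and illustrative parameter values:

```python
# Check M(s) = B(alpha + s, beta - s)/B(alpha, beta) for the generalized
# logistic by trapezoidal integration of e^{sx} f(x).
import math

def B(a, b):
    """Complete Beta function via log-Gamma."""
    return math.exp(math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b))

alpha, beta, s = 2.0, 3.0, 0.5  # requires -alpha < s < beta
c = 1 / B(alpha, beta)

def f(x):
    return c * math.exp(-beta * x) / (1 + math.exp(-x)) ** (alpha + beta)

# trapezoidal estimate of E[e^{sX}] on [-40, 40]; tails are negligible
n, a, b = 100000, -40.0, 40.0
h = (b - a) / n
M = sum(math.exp(s * (a + i * h)) * f(a + i * h) * (h if 0 < i < n else h / 2)
        for i in range(n + 1))
assert abs(M - B(alpha + s, beta - s) / B(alpha, beta)) < 1e-6
```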

The Esscher transformed cumulative distribution function is F_s(x) = B(α + s, β - s; u)/B(α + s, β - s) = I_u(α + s, β - s).

Next we show by induction that ∂_x^n e^{-β x}(1 + e^{-x})^{- α - β} = \sum_{k=0}^n A_{n,k} (e^{-x})^{β + k}(1 + e^{-x})^{- α - β - k} for coefficients A_{n,k}, 0\le k\le n, not depending on x. Clearly A_{0,0} = 1. \begin{aligned} ∂_x^n e^{-β x}(1 + e^{-x})^{- α - β} &= ∂_x\sum_{k=0}^{n-1} A_{n-1,k} (e^{-x})^{β + k}(1 + e^{-x})^{- α - β - k} \\ &= \sum_{k=0}^{n-1} A_{n-1,k} \left((β + k)(e^{-x})^{β + k - 1}(-e^{-x})(1 + e^{-x})^{- α - β - k} + (e^{-x})^{β + k}(- α - β - k)(1 + e^{-x})^{- α - β - k - 1} (-e^{-x})\right)\\ &= \sum_{k=0}^{n-1} A_{n-1,k} \left(-(β + k)(e^{-x})^{β + k}(1 + e^{-x})^{- α - β - k} + (α + β + k) (e^{-x})^{β + k + 1}(1 + e^{-x})^{- α - β - k - 1}\right)\\ &= \sum_{k=0}^{n-1} A_{n-1,k} (-(β + k))(e^{-x})^{β + k}(1 + e^{-x})^{- α - β - k} + \sum_{k=1}^n A_{n-1,k-1} (α + β + k - 1) (e^{-x})^{β + k}(1 + e^{-x})^{- α - β - k}\\ &= -β A_{n-1,0} (e^{-x})^β(1 + e^{-x})^{- α - β} + \sum_{k=1}^{n-1} \left(-(β + k)A_{n-1,k} + (α + β + k - 1)A_{n-1,k-1}\right) (e^{-x})^{β + k}(1 + e^{-x})^{- α - β - k} + (α + β + n - 1) A_{n-1, n-1} (e^{-x})^{β + n}(1 + e^{-x})^{- α - β - n}\\ \end{aligned} so A_{n,0} = -β A_{n-1,0}, hence A_{n,0} = (-β)^n; A_{n,k} = -(β + k)A_{n-1,k} + (α + β + k - 1) A_{n-1,k-1}, 0 < k < n; and A_{n,n} = (α + β + n - 1) A_{n-1, n-1}. If we define A_{n,-1} = 0 = A_{n,n+1} then A_{n,k} = -(β + k)A_{n-1,k} + (α + β + k - 1) A_{n-1,k-1} for n > 0, with A_{0,0} = 1.
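The recurrence is straightforward to tabulate, and the resulting derivative formula can be checked against a finite difference of the density kernel. A sketch with illustrative parameters:

```python
# Build A_{n,k} from A_{n,k} = -(beta + k) A_{n-1,k} + (alpha + beta + k - 1)
# A_{n-1,k-1}, A_{0,0} = 1, and check the derivative formula numerically.
import math

alpha, beta = 2.0, 3.0

def coeffs(n):
    """Table A[m][k] for 0 <= k <= m <= n."""
    A = [[1.0]]
    for m in range(1, n + 1):
        prev = A[-1] + [0.0]  # pad so A_{m-1,m} = 0 (and k = 0 drops the k-1 term)
        row = [-(beta + k) * prev[k] +
               (alpha + beta + k - 1) * (prev[k - 1] if k > 0 else 0.0)
               for k in range(m + 1)]
        A.append(row)
    return A

def g(x):
    """The density kernel e^{-beta x} (1 + e^{-x})^{-alpha-beta}."""
    return math.exp(-beta * x) * (1 + math.exp(-x)) ** (-alpha - beta)

def g_deriv(n, x):
    """n-th derivative via the A_{n,k} expansion."""
    A = coeffs(n)[n]
    e = math.exp(-x)
    return sum(A[k] * e ** (beta + k) * (1 + e) ** (-alpha - beta - k)
               for k in range(n + 1))

x, h = 0.3, 1e-4
fd2 = (g(x + h) - 2 * g(x) + g(x - h)) / (h * h)  # numerical g''
assert abs(g_deriv(2, x) - fd2) < 1e-6
```

For α = 2, β = 3 the first row is A_{1,0} = -β = -3 and A_{1,1} = α + β = 5, matching the direct computation g' = -3 e^{-3x}(1+e^{-x})^{-5} + 5 e^{-4x}(1+e^{-x})^{-6}.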

The cumulant of the generalized logistic is κ(s) = \log\left(B(α + s, β - s)/B(α, β)\right) = \log Γ(α + s) - \log Γ(α) + \log Γ(β - s) - \log Γ(β), using the fact B(α,β) = Γ(α)Γ(β)/Γ(α + β).

Recall the digamma function ψ(s) = Γ'(s)/Γ(s) is the derivative of the log of the Gamma function so κ^{(n+1)}(s) = ψ^{(n)}(α + s) - (-1)^n ψ^{(n)}(β - s) for n\ge 0. In particular the mean is κ'(0) = ψ(α) - ψ(β) and variance is κ''(0) = ψ'(α) + ψ'(β).
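A sketch checking the mean ψ(α) - ψ(β) and variance ψ'(α) + ψ'(β) numerically, approximating the digamma and trigamma by central differences of math.lgamma (the step sizes and tolerances are mine); for α = 2, β = 3 the known values ψ(2) = 1 - γ, ψ(3) = 3/2 - γ, ψ'(2) = π²/6 - 1, ψ'(3) = π²/6 - 5/4 give mean -1/2 and variance π²/3 - 9/4:

```python
# Mean psi(alpha) - psi(beta) and variance psi'(alpha) + psi'(beta) of the
# generalized logistic, with digamma/trigamma from differences of lgamma.
import math

def psi(x, h=1e-5):
    """Digamma via central difference of log Gamma."""
    return (math.lgamma(x + h) - math.lgamma(x - h)) / (2 * h)

def psi1(x, h=1e-4):
    """Trigamma via second difference of log Gamma."""
    return (math.lgamma(x + h) - 2 * math.lgamma(x) + math.lgamma(x - h)) / (h * h)

# sanity check: psi(1) = -gamma (Euler-Mascheroni constant)
assert abs(psi(1.0) + 0.5772156649015329) < 1e-8

alpha, beta = 2.0, 3.0
mean = psi(alpha) - psi(beta)          # (1 - g) - (3/2 - g) = -1/2
variance = psi1(alpha) + psi1(beta)    # pi^2/3 - 9/4
assert abs(mean + 0.5) < 1e-8
assert abs(variance - (math.pi ** 2 / 3 - 9 / 4)) < 1e-6
```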

Let the subscripts 1 and 2 denote the partial derivatives with respect to the first and second parameter, respectively. Recall B_1(α,β;u) = B(α,β;u)\log u - (u^α/α^2)\,_3F_2(α, α, 1 - β; α + 1, α + 1; u), so, omitting the parameters α and β, B_1(u)/B(u) - B_1/B = \log u.

Using B(α,β;u) = B(α,β) - B(β,α;1 - u) we have B_2(α,β;u) = -(\log (1 - u) - ψ(β) + ψ(β + α))B(β, α;1 - u), so B_2(u)/B(u) - B_2/B = -\log (1 - u). Since F_s(x) = B(α + s, β - s; u)/B(α + s, β - s) = B(u)/B, \begin{aligned} ∂_s F_s(x) &= \frac{B (B_1(u) - B_2(u)) - B(u)(B_1 - B_2)}{B^2} \\ &= \frac{B(u)}{B}\left[\left(\frac{B_1(u)}{B(u)} - \frac{B_1}{B}\right) - \left(\frac{B_2(u)}{B(u)} - \frac{B_2}{B}\right)\right] \\ &= F_s(x) (\log u + \log (1 - u)). \end{aligned}

Recall B(α,β;u) = \frac{u^α}{α}\,_2F_1(α, 1 - β; α + 1; u), where \,_2F_1(a,b;c;x) = \sum_{n=0}^\infty\frac{(a)_n (b)_n}{(c)_n} x^n/n! is the hypergeometric function.

The derivatives of the hypergeometric function are ∂_x^n\,_2F_1(a, b; c; x) = \frac{(a)_n (b)_n}{(c)_n}\,_2F_1(a + n, b + n; c + n; x).
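Both the incomplete Beta representation and the derivative formula can be checked by summing the series directly. A sketch (parameter values are illustrative; the series form converges for |x| < 1):

```python
# Sum the 2F1 series and check (i) B(a, b; u) = (u^a/a) 2F1(a, 1 - b; a + 1; u)
# against numerical integration, and (ii) the n = 1 derivative formula.
import math

def hyp2f1(a, b, c, x, terms=200):
    """2F1(a, b; c; x) by its power series (|x| < 1)."""
    total, term = 0.0, 1.0
    for n in range(terms):
        total += term
        term *= (a + n) * (b + n) / (c + n) * x / (n + 1)
    return total

# (i) incomplete Beta via 2F1 vs. trapezoidal integral of t^{a-1}(1-t)^{b-1}
a, b, u = 2.0, 3.0, 0.4
inc_beta = u ** a / a * hyp2f1(a, 1 - b, a + 1, u)
n, h = 100000, 0.4 / 100000
quad = sum((i * h) ** (a - 1) * (1 - i * h) ** (b - 1) * (h if 0 < i < n else h / 2)
           for i in range(n + 1))
assert abs(inc_beta - quad) < 1e-8

# (ii) d/dx 2F1(a, b; c; x) = (ab/c) 2F1(a + 1, b + 1; c + 1; x)
a, b, c, x, d = 0.5, 1.5, 2.5, 0.3, 1e-5
fd = (hyp2f1(a, b, c, x + d) - hyp2f1(a, b, c, x - d)) / (2 * d)
assert abs(fd - a * b / c * hyp2f1(a + 1, b + 1, c + 1, x)) < 1e-6
```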

Acknowledgements

The author thanks Peter Carr and Bill Goff for their helpful comments and insightful suggestions. Any errors or omissions are due to the author.