Moment Generating Function

Definition

In probability theory and statistics, a moment generating function (MGF) is a function that uniquely determines the probability distribution of a random variable. Specifically, the MGF of a random variable $X$ is the expected value of the exponential $e^{sX}$, viewed as a function of a real parameter $s$.

Definition 164 (Moment Generating Function)

For any random variable $X$, the moment-generating function (MGF) $M_X(s)$ is

$$M_X(s) = \mathbb{E}\left[e^{sX}\right],$$

and it is well defined if there exists a positive constant $c$ such that $M_X(s)$ is finite for all $s \in \mathbb{R}$ with $|s| \leq c$ (i.e., $s \in [-c, c]$).

In other words, the MGF of $X$ is the expected value of $e^{sX}$.

By the Law of the Unconscious Statistician, together with Property 1 for the discrete case and Property 18 for the continuous case, we can see the following:

For the discrete case, the MGF is

$$M_X(s) = \sum_{x \in \Omega} e^{sx} p_X(x),$$

whereas in the continuous case, the MGF is

$$M_X(s) = \int_{-\infty}^{\infty} e^{sx} f_X(x) \, dx.$$

The continuous case should remind us of the definition of a Laplace transform. For any function f(t), the Laplace transform is

$$\mathcal{L}[f](s) = \int_{-\infty}^{\infty} f(t) e^{-st} \, dt.$$

From this perspective, we can interpret the MGF as the Laplace transform of the PDF, with the argument $s$ regarded as the coordinate in the Laplace domain. If $s = j\omega$, then $M_X(j\omega)$ becomes the Fourier transform of the PDF [Chan, 2021].
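The Laplace-transform connection can be checked symbolically. The sketch below (assuming `sympy` is installed) uses the exponential PDF as a concrete example: with the convention $\mathcal{L}[f](s) = \int f(t) e^{-st} \, dt$, the MGF is the Laplace transform evaluated at $-s$.

```python
import sympy as sp

# Symbolic check of the Laplace-transform interpretation, using the
# exponential PDF as a concrete example.
x, s = sp.symbols('x s', positive=True)
lam = sp.Symbol('lambda', positive=True)

f = lam * sp.exp(-lam * x)  # exponential PDF, supported on [0, oo)

# Laplace transform L[f](s) = integral of f(x) e^{-sx} dx
laplace = sp.integrate(f * sp.exp(-s * x), (x, 0, sp.oo))

# With this sign convention, the MGF is the Laplace transform at -s,
# which recovers lambda / (lambda - s) as in Example 39 below.
mgf = laplace.subs(s, -s)
```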

Example 38 (Moment Generating Function Example 1.)

Consider a random variable $X$ with three states $0, 1, 2$ and with probability masses $\frac{2}{6}, \frac{3}{6}, \frac{1}{6}$ respectively. Find the MGF.


The moment-generating function is

$$M_X(s) = \mathbb{E}\left[e^{sX}\right] = e^{s \cdot 0} \cdot \frac{2}{6} + e^{s \cdot 1} \cdot \frac{3}{6} + e^{s \cdot 2} \cdot \frac{1}{6} = \frac{1}{3} + \frac{e^s}{2} + \frac{e^{2s}}{6}.$$

Question and solution from page 324 of [Chan, 2021].
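As a quick sanity check, the discrete MGF sum can be evaluated numerically and compared against the closed form, here for the three-state distribution of Example 38 (a minimal sketch; the evaluation points are arbitrary):

```python
import math

# Three-state distribution from Example 38 (states 0, 1, 2).
states = [0, 1, 2]
probs = [2/6, 3/6, 1/6]

def mgf(s):
    """Discrete MGF: M_X(s) = sum over x of e^{s x} p_X(x)."""
    return sum(math.exp(s * x) * p for x, p in zip(states, probs))

# Agrees with the closed form 1/3 + e^s/2 + e^{2s}/6 at arbitrary points.
for s in (-1.0, 0.0, 0.5, 2.0):
    closed = 1/3 + math.exp(s)/2 + math.exp(2*s)/6
    assert abs(mgf(s) - closed) < 1e-12
```

Note that $M_X(0)$ evaluates to $1$, as every MGF must (the PMF sums to one).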

Example 39 (Moment Generating Function Example 2.)

Find the MGF for an exponential random variable.


The MGF of an exponential random variable can be found as

$$M_X(s) = \mathbb{E}\left[e^{sX}\right] = \int_{0}^{\infty} e^{sx} \lambda e^{-\lambda x} \, dx = \int_{0}^{\infty} \lambda e^{(s-\lambda)x} \, dx = \frac{\lambda}{\lambda - s}, \quad \text{if } \lambda > s.$$

Question and solution from page 325 of [Chan, 2021].
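The integral in Example 39 can be approximated numerically to confirm the closed form. The sketch below uses a truncated Riemann sum; the values $\lambda = 2$, $s = 0.5$ are arbitrary choices satisfying $\lambda > s$:

```python
import math

lam, s = 2.0, 0.5  # arbitrary sample values with lam > s

def mgf_exponential_numeric(lam, s, upper=60.0, n=600_000):
    """Approximate the integral of e^{sx} lam e^{-lam x} over [0, upper]
    by a left Riemann sum; the tail beyond `upper` is negligible."""
    dx = upper / n
    return sum(math.exp((s - lam) * (i * dx)) * lam * dx for i in range(n))

closed_form = lam / (lam - s)  # valid only when lam > s
assert abs(mgf_exponential_numeric(lam, s) - closed_form) < 1e-3
```

For $\lambda \le s$ the integrand no longer decays, the integral diverges, and the closed form does not apply, which is why the condition $\lambda > s$ appears in the solution.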

Properties

Theorem 57 (Moment Generating Function Properties)

The MGF has the properties that

  • $M_X(0) = 1$

  • $\left.\frac{d^k}{ds^k} M_X(s)\right|_{s=0} = \mathbb{E}\left[X^k\right]$, for any positive integer $k$. In other words, if $s$ is set to $0$, then the $k$-th derivative of the MGF is the $k$-th moment of $X$.

Proof can be found in [Chan, 2021], page 325.

Example 40 (Moments of Bernoulli Random Variable)

Let X be a Bernoulli random variable with parameter p. Find the first two moments using MGF.


The MGF of a Bernoulli random variable is

$$M_X(s) = \mathbb{E}\left[e^{sX}\right] = e^{s \cdot 0} p_X(0) + e^{s \cdot 1} p_X(1) = (1)(1-p) + (e^s)(p) = 1 - p + p e^s.$$

The first and the second moment, using the derivative approach, are

$$\begin{aligned}
\mathbb{E}[X] &= \left.\frac{d}{ds} M_X(s)\right|_{s=0} = \left.\frac{d}{ds}\left(1 - p + p e^s\right)\right|_{s=0} = \left. p e^s \right|_{s=0} = p, \\
\mathbb{E}\left[X^2\right] &= \left.\frac{d^2}{ds^2} M_X(s)\right|_{s=0} = \left.\frac{d^2}{ds^2}\left(1 - p + p e^s\right)\right|_{s=0} = \left. p e^s \right|_{s=0} = p.
\end{aligned}$$

Question and solution from page 326 of [Chan, 2021].
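The derivative computation in Example 40 can be reproduced symbolically. This is a minimal sketch assuming `sympy` is installed:

```python
import sympy as sp

# Differentiate the Bernoulli MGF and evaluate at s = 0.
s, p = sp.symbols('s p')
M = 1 - p + p * sp.exp(s)  # MGF of a Bernoulli(p) random variable

first_moment = sp.diff(M, s).subs(s, 0)      # E[X]
second_moment = sp.diff(M, s, 2).subs(s, 0)  # E[X^2]

# Both moments equal p, matching the hand derivation above.
assert first_moment == p and second_moment == p
```

For a Bernoulli variable $X^k = X$ for every $k \geq 1$, so it is consistent that all moments coincide with $p$.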

The table below lists the MGF of some common discrete and continuous random variables.

| Distribution | PMF/PDF | $\mathbb{E}[X]$ | $\operatorname{Var}[X]$ | $M_X(s)$ |
|---|---|---|---|---|
| Bernoulli | $p_X(1) = p$ and $p_X(0) = 1 - p$ | $p$ | $p(1-p)$ | $1 - p + p e^s$ |
| Binomial | $p_X(k) = \binom{n}{k} p^k (1-p)^{n-k}$ | $np$ | $np(1-p)$ | $(1 - p + p e^s)^n$ |
| Geometric | $p_X(k) = p(1-p)^{k-1}$ | $\frac{1}{p}$ | $\frac{1-p}{p^2}$ | $\frac{p e^s}{1 - (1-p) e^s}$ |
| Poisson | $p_X(k) = \frac{\lambda^k e^{-\lambda}}{k!}$ | $\lambda$ | $\lambda$ | $e^{\lambda(e^s - 1)}$ |
| Gaussian | $f_X(x) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left\{-\frac{(x-\mu)^2}{2\sigma^2}\right\}$ | $\mu$ | $\sigma^2$ | $\exp\left\{\mu s + \frac{\sigma^2 s^2}{2}\right\}$ |
| Exponential | $f_X(x) = \lambda \exp\{-\lambda x\}$ | $\frac{1}{\lambda}$ | $\frac{1}{\lambda^2}$ | $\frac{\lambda}{\lambda - s}$ |
| Uniform | $f_X(x) = \frac{1}{b-a}$ | $\frac{a+b}{2}$ | $\frac{(b-a)^2}{12}$ | $\frac{e^{sb} - e^{sa}}{s(b-a)}$ |
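Table entries of this kind are easy to spot-check by summing the defining series directly. The sketch below checks the Poisson and Geometric rows at arbitrary sample parameter values (the truncation points are chosen so the neglected tails are negligible):

```python
import math

s = 0.3  # arbitrary evaluation point

# Poisson(lam): series sum of e^{s k} p_X(k) vs. exp(lam (e^s - 1)).
lam = 1.5
poisson_series = sum(math.exp(s * k) * lam**k * math.exp(-lam) / math.factorial(k)
                     for k in range(80))
poisson_closed = math.exp(lam * (math.exp(s) - 1))
assert abs(poisson_series - poisson_closed) < 1e-12

# Geometric(p) on k = 1, 2, ...: series converges while (1 - p) e^s < 1.
p = 0.4
geometric_series = sum(p * (1 - p)**(k - 1) * math.exp(s * k)
                       for k in range(1, 500))
geometric_closed = p * math.exp(s) / (1 - (1 - p) * math.exp(s))
assert abs(geometric_series - geometric_closed) < 1e-12
```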

Why is it useful?

Moments

The moment generating function of X provides us with all the moments of X, which is why it is called the moment generating function.

Uniquely determines the distribution

Moreover, the MGF uniquely determines the distribution, provided it exists. This means that if two random variables have the same MGF, they must have the same distribution. Therefore, finding the MGF of a random variable enables us to determine its distribution. This approach is particularly useful when working with sums of multiple independent random variables. The proof can be found here.
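The sum-of-independent-variables use case can be illustrated concretely: the MGF of a sum of independent variables is the product of their MGFs, so a sum of $n$ independent Bernoulli($p$) variables has the Binomial($n, p$) MGF from the table, and by uniqueness the sum is Binomial. A minimal numerical sketch (with arbitrary sample values for $n$, $p$, $s$):

```python
import math
from itertools import product

n, p, s = 3, 0.3, 0.7  # arbitrary sample values

# MGF of the sum of n independent Bernoulli(p) variables, computed
# directly by enumerating all 2^n joint outcomes.
mgf_sum = 0.0
for outcome in product([0, 1], repeat=n):
    prob = math.prod(p if x == 1 else 1 - p for x in outcome)
    mgf_sum += prob * math.exp(s * sum(outcome))

# Binomial(n, p) MGF from the table: (1 - p + p e^s)^n.
mgf_binomial = (1 - p + p * math.exp(s)) ** n
assert abs(mgf_sum - mgf_binomial) < 1e-12
```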

Further Readings

  • Chan, Stanley H. “Chapter 6.1.1. Moment-generating function.” In Introduction to Probability for Data Science. Ann Arbor, Michigan: Michigan Publishing Services, 2021.

  • Pishro-Nik, Hossein. “Chapter 6.1.3. Moment Generating Functions.” In Introduction to Probability, Statistics, and Random Processes. Kappa Research, 2014.