Moment Generating Functions

This section develops and applies some of the properties of the moment-generating function. It turns out, despite its unlikely appearance, to be a very useful tool that can dramatically simplify certain calculations.

The moment-generating function (mgf) of a random variable $X$ is 

$M(t) = E(e^{tX})$

if the expectation is defined. In the discrete case,

$M(t) =\sum_x e^{tx} p(x)$

and in the continuous case,

$M(t) = \int_{-\infty}^{\infty} e^{tx} f(x)\, dx$

The expectation, and hence the moment-generating function, may or may not exist for any particular value of $t$. In the continuous case, the existence of the expectation depends on how rapidly the tails of the density decrease; for example, because the tails of the Cauchy density die down at the rate $x^{-2}$, the expectation is infinite for every $t \neq 0$ and the moment-generating function is undefined. The tails of the normal density die down at the rate $e^{-x^2}$, so the integral converges for all $t$.
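As a quick check of that convergence claim, the defining integral can be evaluated symbolically. The following sketch uses Python with SymPy (my choice of tool, not part of the original text) on the standard normal density:

```python
import sympy as sp

x, t = sp.symbols('x t', real=True)

# Standard normal density: its tails die down like exp(-x**2/2),
# fast enough that E(e^{tX}) converges for every real t.
f = sp.exp(-x**2 / 2) / sp.sqrt(2 * sp.pi)
M = sp.integrate(sp.exp(t * x) * f, (x, -sp.oo, sp.oo))
print(sp.simplify(M))  # exp(t**2/2) -- finite for all t

# For the Cauchy density 1/(pi*(1 + x**2)), the same integral diverges
# whenever t != 0, so its mgf is undefined.
```

The closed form $e^{t^2/2}$ is finite for every $t$, confirming that the normal mgf exists on the whole real line.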


The $r$th moment of a random variable is $E(X^r)$ if the expectation exists. We have already encountered the first and second moments, that is, $E(X)$ and $E(X^2)$. Central moments rather than ordinary moments are often used: the $r$th central moment is $E\{[X - E(X)]^r\}$. The variance is the second central moment and is a measure of dispersion about the mean. The third central moment, called the skewness, is used as a measure of the asymmetry of a density or a frequency function about its mean; if a density is symmetric about its mean, the skewness is zero. (The worked example at the end of this section computes a third central moment by this route.)

As its name implies, the moment-generating function has something to do with moments. To see this, consider the continuous case:

$M(t) = \int_{-\infty}^{\infty} e^{tx} f(x)\, dx$

The derivative of $M(t)$ is

$M'(t) = \frac{\mathrm{d}}{\mathrm{d}t}\int_{-\infty}^{\infty} e^{tx} f(x)\, dx$

It can be shown that differentiation and integration can be interchanged, so that

$M'(t) = \int_{-\infty}^{\infty} x e^{tx} f(x)\, dx$

and setting $t = 0$ gives

$M'(0) = \int_{-\infty}^{\infty} x f(x)\, dx = E(X)$

Differentiating $r$ times and evaluating at $t = 0$, we find

$M^{(r)}(0) = E(X^r)$

It can further be argued that if the moment-generating function exists in an interval containing zero, then so do all the moments. We thus have the following property.

Property B: If the moment-generating function $M(t)$ of $X$ exists in an interval containing zero, then $M^{(r)}(0) = E(X^r)$.


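To see Property B in action, here is a small SymPy sketch (my own illustration) that repeatedly differentiates the standard normal mgf $M(t) = e^{t^2/2}$, found in the earlier sketch, and recovers the moments:

```python
import sympy as sp

t = sp.symbols('t', real=True)

# Standard normal mgf (computed in the earlier sketch): M(t) = exp(t**2/2).
M = sp.exp(t**2 / 2)

# Property B: the r-th derivative of M at t = 0 is the r-th moment E(X^r).
for r in range(1, 5):
    print(r, sp.diff(M, t, r).subs(t, 0))
# 1 0   E(X)   = 0
# 2 1   E(X^2) = 1
# 3 0   E(X^3) = 0
# 4 3   E(X^4) = 3
```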
To find the moments of a random variable from the definition of expectation, we must sum a series or carry out an integration. The utility of Property B is that, if the mgf can be found, the process of integration or summation, which may be difficult, can be replaced by the process of differentiation, which is mechanical. We now illustrate these concepts using some familiar distributions.
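For instance, for the Poisson distribution with parameter $\lambda$, the summation defining the mgf needs to be carried out only once; every moment, including the third central moment promised earlier, then follows by differentiation. A SymPy sketch of this (again my own illustration, not part of the original text):

```python
import sympy as sp

t = sp.symbols('t', real=True)
lam = sp.symbols('lambda', positive=True)
n = sp.symbols('n', integer=True, nonnegative=True)

# Poisson(lambda) mgf: the summation is carried out once ...
M = sp.exp(-lam) * sp.summation((lam * sp.exp(t))**n / sp.factorial(n),
                                (n, 0, sp.oo))
print(sp.simplify(M))                            # exp(lambda*(exp(t) - 1))

# ... after which every moment is mechanical differentiation.
EX  = sp.simplify(sp.diff(M, t, 1).subs(t, 0))   # E(X)   = lambda
EX2 = sp.simplify(sp.diff(M, t, 2).subs(t, 0))   # E(X^2) = lambda**2 + lambda
EX3 = sp.simplify(sp.diff(M, t, 3).subs(t, 0))   # E(X^3) = lambda**3 + 3*lambda**2 + lambda

print(sp.simplify(EX2 - EX**2))                  # variance = lambda

# Third central moment: E[(X - EX)^3] = E(X^3) - 3 E(X) E(X^2) + 2 E(X)^3
print(sp.simplify(EX3 - 3*EX*EX2 + 2*EX**3))     # lambda
```

Each moment comes from differentiating one closed-form expression; no further summation is required. The positive third central moment, $\lambda$, is consistent with the Poisson distribution's long right tail.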



