# Characteristic function (probability theory)

In probability theory, the characteristic function of any random variable completely defines its probability distribution. On the real line it is given by the following formula, where X is any random variable with the distribution in question:

$\varphi_X(t) = \operatorname{E}\left(e^{itX}\right),$

where t is a real number, i is the imaginary unit, and E denotes the expected value.

If FX is the cumulative distribution function, then the characteristic function is given by the Riemann–Stieltjes integral

$\operatorname{E}\left(e^{itX}\right) = \int_{-\infty}^{\infty} e^{itx}\,dF_X(x).$

In cases in which there is a probability density function, fX, this becomes

$\operatorname{E}\left(e^{itX}\right) = \int_{-\infty}^{\infty} e^{itx} f_X(x)\,dx.$

If X is a vector-valued random variable, one takes the argument t to be a vector and tX to be a dot product.
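As a numerical sanity check of the definition above, the following sketch approximates the integral for a standard normal density, whose characteristic function is known in closed form to be $e^{-t^2/2}$. The grid bounds and step count are arbitrary illustrative choices:

```python
# Numerical check of phi_X(t) = E[exp(itX)] for a standard normal density,
# for which the closed form is exp(-t^2/2). Minimal sketch; the integration
# grid (bounds +-10, 20001 points) is an illustrative choice.
import numpy as np

def cf_numeric(t, pdf, lo=-10.0, hi=10.0, n=20001):
    """Approximate E[exp(itX)] = integral of exp(itx) f(x) dx (trapezoid rule)."""
    x = np.linspace(lo, hi, n)
    return np.trapz(np.exp(1j * t * x) * pdf(x), x)

std_normal_pdf = lambda x: np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

for t in (0.0, 0.5, 1.0, 2.0):
    approx = cf_numeric(t, std_normal_pdf)
    exact = np.exp(-t**2 / 2)
    print(t, approx.real, exact)   # imaginary part is ~0 by symmetry
```

The tails beyond ±10 contribute on the order of $e^{-50}$, so the truncation is harmless here.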

Every probability distribution on R or on Rn has a characteristic function, because one is integrating a bounded function over a space whose measure is finite, and for every characteristic function there is exactly one probability distribution.

The core of the Lévy continuity theorem states that a sequence of random variables $(X_n)_{n=1}^\infty$, where each $X_n$ has a characteristic function $\varphi_n$, will converge in distribution towards a random variable $X$,

$X_n \xrightarrow{\mathcal{D}} X \qquad\textrm{as}\qquad n \to \infty$

if

$\varphi_n \quad \xrightarrow{\textrm{pointwise}} \quad \varphi \qquad\textrm{as}\qquad n \to \infty$

and $\varphi(t)$ is continuous at $t=0$, in which case $\varphi$ is the characteristic function of $X$.

The Lévy continuity theorem can be used to prove the weak law of large numbers; see the proof using convergence of characteristic functions.
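The mechanism behind that proof can be illustrated numerically (this is an illustration, not a proof). For i.i.d. Bernoulli(p) variables, the sample mean has characteristic function $\varphi_X(t/n)^n$, which converges pointwise to $e^{itp}$, the characteristic function of the constant p; the value p = 0.3 below is an arbitrary choice:

```python
# Pointwise convergence of the sample mean's characteristic function to
# exp(itp), the CF of the degenerate distribution at p -- the key step in
# the CF proof of the weak law of large numbers. Stdlib-only sketch.
import cmath

p = 0.3
phi = lambda t: (1 - p) + p * cmath.exp(1j * t)   # CF of Bernoulli(p)

t = 1.0
for n in (10, 100, 10_000):
    phi_mean = phi(t / n) ** n          # CF of the mean of n independent copies
    print(n, phi_mean, cmath.exp(1j * t * p))
```

The gap shrinks like O(1/n), consistent with the variance of the sample mean vanishing.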

## The inversion theorem

More than that, there is a bijection between cumulative probability distribution functions and characteristic functions. In other words, two distinct probability distributions never share the same characteristic function.

Given a characteristic function φ, it is possible to reconstruct the corresponding cumulative probability distribution function F:

$F_X(y) - F_X(x) = \lim_{\tau \to +\infty} \frac{1}{2\pi} \int_{-\tau}^{+\tau} \frac{e^{-itx} - e^{-ity}}{it}\, \varphi_X(t)\, dt.$

In general this is an improper integral; the function being integrated may be only conditionally integrable rather than Lebesgue integrable, i.e. the integral of its absolute value may be infinite.
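The inversion formula can be evaluated numerically. A minimal sketch, assuming the characteristic function of a standard normal (for which probabilities are known); the truncation point T and grid size n are illustrative choices:

```python
# Numerical evaluation of the inversion formula F(y) - F(x) for a standard
# normal via its CF exp(-t^2/2). Truncating at +-T is harmless here because
# the CF decays like a Gaussian.
import numpy as np

def inversion(x, y, phi, T=40.0, n=20000):
    """Approximate F(y) - F(x) by the truncated inversion integral
    (trapezoid rule); an even n keeps the removable singularity at
    t = 0 off the grid."""
    t = np.linspace(-T, T, n)
    integrand = (np.exp(-1j * t * x) - np.exp(-1j * t * y)) / (1j * t) * phi(t)
    return (np.trapz(integrand, t) / (2 * np.pi)).real

phi_std_normal = lambda t: np.exp(-t**2 / 2)
# P(-1 < X < 1) for a standard normal is about 0.6827
print(inversion(-1.0, 1.0, phi_std_normal))
```

For heavy-tailed distributions the integrand decays much more slowly and the truncation/oscillation issues mentioned above become visible.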

## Bochner-Khinchin theorem

Main article: Bochner's theorem

An arbitrary function $\varphi$ is a characteristic function corresponding to some probability law $\mu$ if and only if the following three conditions are satisfied:

(1) $\varphi$ is continuous

(2) $\varphi(0) = 1$

(3) $\varphi$ is a positive definite function

## Uses of characteristic functions

Because of the continuity theorem, characteristic functions are used in the most frequently seen proof of the central limit theorem. The main trick involved in making calculations with a characteristic function is recognizing the function as the characteristic function of a particular distribution.

### Basic properties

Characteristic functions are particularly useful for dealing with functions of independent random variables. For example, if X1, X2, ..., Xn is a sequence of independent (and not necessarily identically distributed) random variables, and

$S_n = \sum_{i=1}^n a_i X_i,$

where the ai are constants, then the characteristic function for Sn is given by

$\varphi_{S_n}(t)=\varphi_{X_1}(a_1t)\varphi_{X_2}(a_2t)\cdots\varphi_{X_n}(a_nt).$

In particular, $\varphi_{X+Y}(t) = \varphi_X(t)\varphi_Y(t)$. To see this, write out the definition of characteristic function:

$\varphi_{X+Y}(t)=\operatorname{E}\left(e^{it(X+Y)}\right)=\operatorname{E}\left(e^{itX}e^{itY}\right)=\operatorname{E}\left(e^{itX}\right)\operatorname{E}\left(e^{itY}\right)=\varphi_X(t)\varphi_Y(t)$.

Observe that the independence of X and Y is required to establish the equality of the third and fourth expressions.
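The multiplicativity property can be checked by Monte Carlo. A sketch using two independent Exp(1) variables (an illustrative choice, with known CF $1/(1-it)$); the sample size and evaluation point t are arbitrary:

```python
# Monte Carlo check that phi_{X+Y}(t) = phi_X(t) * phi_Y(t) for independent
# X, Y. Here X, Y ~ Exp(1), whose CF is 1/(1 - it), so the sum's CF should
# be (1 - it)^{-2}.
import numpy as np

rng = np.random.default_rng(0)
X = rng.exponential(1.0, size=200_000)
Y = rng.exponential(1.0, size=200_000)

t = 0.7
emp_sum  = np.mean(np.exp(1j * t * (X + Y)))   # empirical CF of X + Y
emp_prod = np.mean(np.exp(1j * t * X)) * np.mean(np.exp(1j * t * Y))
exact = (1 - 1j * t) ** -2                     # closed form for the sum
print(emp_sum, emp_prod, exact)
```

All three values agree up to Monte Carlo noise of order $1/\sqrt{n}$.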

### Moments

Characteristic functions can also be used to find moments of a random variable. Provided that the nth moment exists, the characteristic function can be differentiated n times and

$\operatorname{E}\left(X^n\right) = i^{-n}\, \varphi_X^{(n)}(0) = i^{-n}\, \left[\frac{d^n}{dt^n} \varphi_X(t)\right]_{t=0}.$
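This moment formula can be verified numerically by finite-differencing a known characteristic function at 0. A sketch for the standard normal, $\varphi(t) = e^{-t^2/2}$, where $E[X]=0$ and $E[X^2]=1$; the step size h is an illustrative choice:

```python
# Moments from derivatives of the CF at 0: E[X^n] = i^{-n} phi^{(n)}(0).
# Central finite differences on phi(t) = exp(-t^2/2) (standard normal).
import numpy as np

phi = lambda t: np.exp(-t**2 / 2)   # CF of the standard normal
h = 1e-4                            # finite-difference step (illustrative)

d1 = (phi(h) - phi(-h)) / (2 * h)               # approximates phi'(0)
d2 = (phi(h) - 2 * phi(0.0) + phi(-h)) / h**2   # approximates phi''(0)

EX  = (d1 / 1j).real       # i^{-1} phi'(0)  -> E[X]   = 0
EX2 = (d2 / 1j**2).real    # i^{-2} phi''(0) -> E[X^2] = 1
print(EX, EX2)
```

For a CF with a nonzero imaginary part the same finite differences apply; only the real parts of the scaled derivatives are taken because moments are real.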

### An example

The gamma distribution with scale parameter θ and shape parameter k has the characteristic function

$(1 - \theta\,i\,t)^{-k}.$

Now suppose that we have

$X \sim \Gamma(k_1,\theta)$ and $Y \sim \Gamma(k_2,\theta)$

with X and Y independent from each other, and we wish to know what the distribution of X + Y is. The characteristic functions are

$\varphi_X(t)=(1 - \theta\,i\,t)^{-k_1},\qquad \varphi_Y(t)=(1 - \theta\,i\,t)^{-k_2},$

which by independence and the basic properties of characteristic functions leads to

$\varphi_{X+Y}(t)=\varphi_X(t)\varphi_Y(t)=(1 - \theta\,i\,t)^{-k_1}(1 - \theta\,i\,t)^{-k_2}=\left(1 - \theta\,i\,t\right)^{-(k_1+k_2)}.$

This is the characteristic function of the gamma distribution with scale parameter θ and shape parameter k1 + k2, and we therefore conclude

$X + Y \sim \Gamma(k_1 + k_2,\theta).$

The result can be extended to n independent gamma-distributed random variables with the same scale parameter, and we get

$\forall i \in \{1,\ldots,n\} : X_i \sim \Gamma(k_i,\theta) \qquad\Rightarrow\qquad \sum_{i=1}^n X_i \sim \Gamma\left(\sum_{i=1}^n k_i,\theta\right)$
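The closure of gamma sums can be spot-checked by simulation. A sketch with arbitrary illustrative parameters, comparing the empirical mean and variance of X + Y with those of $\Gamma(k_1+k_2,\theta)$, namely $(k_1+k_2)\theta$ and $(k_1+k_2)\theta^2$:

```python
# Monte Carlo illustration that Gamma(k1, theta) + Gamma(k2, theta)
# (independent, same scale) is distributed as Gamma(k1 + k2, theta):
# the first two moments of the sum match the target distribution.
import numpy as np

rng = np.random.default_rng(1)
k1, k2, theta = 2.0, 3.5, 1.5
X = rng.gamma(k1, theta, size=500_000)   # numpy's gamma takes (shape, scale)
Y = rng.gamma(k2, theta, size=500_000)
S = X + Y

# Gamma(k1 + k2, theta) has mean (k1 + k2)*theta, variance (k1 + k2)*theta^2
print(S.mean(), (k1 + k2) * theta)
print(S.var(),  (k1 + k2) * theta**2)
```

Matching moments alone does not prove equality of distributions, of course; the characteristic-function argument above is what establishes it.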

## Related concepts

Related concepts include the moment-generating function and the probability-generating function. The characteristic function exists for all probability distributions; the moment-generating function, however, need not exist.

The characteristic function is closely related to the Fourier transform: the characteristic function of a probability density function p(x) is the complex conjugate of the continuous Fourier transform of p(x) (according to the usual convention; see [1]).

$\varphi_X(t) = \langle e^{itX} \rangle = \int_{-\infty}^{\infty} e^{itx}p(x)\,dx = \overline{\left( \int_{-\infty}^{\infty} e^{-itx}p(x)\,dx \right)} = \overline{P(t)},$

where P(t) denotes the continuous Fourier transform of the probability density function p(x). Likewise, p(x) may be recovered from $\varphi_X(t)$ through the inverse Fourier transform:

$p(x) = \frac{1}{2\pi} \int_{-\infty}^{\infty} e^{itx} P(t)\,dt = \frac{1}{2\pi} \int_{-\infty}^{\infty} e^{itx} \overline{\varphi_X(t)}\,dt.$

Indeed, even when the random variable does not have a density, the characteristic function may be seen as the Fourier transform of the measure corresponding to the random variable.
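The conjugation relation $\varphi_X(t) = \overline{P(t)}$ can be checked numerically. A sketch using an Exp(1) density (an asymmetric choice, so the conjugation is actually visible; for a symmetric density both transforms are real and equal); the grid and evaluation point are illustrative:

```python
# Numerical check that the CF equals the complex conjugate of the Fourier
# transform P(t) = integral e^{-itx} p(x) dx, for an Exp(1) density.
# Closed form for comparison: phi(t) = 1 / (1 - it).
import numpy as np

x = np.linspace(0.0, 60.0, 600_001)
p = np.exp(-x)    # Exp(1) density on [0, inf); tail beyond 60 is negligible

t = 1.3
P   = np.trapz(np.exp(-1j * t * x) * p, x)   # Fourier transform (e^{-itx} convention)
phi = np.trapz(np.exp( 1j * t * x) * p, x)   # characteristic function
print(phi, np.conj(P), 1 / (1 - 1j * t))     # all three should agree
```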
