In probability theory and statistics, the **Poisson distribution** is a discrete probability distribution that expresses the probability of a number of events occurring in a fixed period of time if these events occur with a known average rate and independently of the time since the last event. The Poisson distribution can also be used for counts over other specified intervals such as distance, area or volume. A classic example is the probability of a certain number of bombs striking a randomly selected area from a group of equally sized areas. This example was applied to German V-1 flying bombs (a precursor to the guided missile) striking South London during World War II. On paper, South London was divided geographically into 576 areas of 0.25 km² each. Assuming the 535 bombs launched toward South London were targeted at random, the probability of any number of bombs (0 to 535) striking any of the 576 areas can be calculated. For use in the Poisson distribution, the mean λ is the number of bombs divided by the number of equally sized areas.
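The bomb-strike calculation above can be sketched in a few lines of Python; the probability mass function used here is the one stated later in the article, and the numbers 535 and 576 come directly from the example:

```python
import math

def poisson_pmf(k, lam):
    """P(X = k) for a Poisson-distributed X with mean lam."""
    return lam ** k * math.exp(-lam) / math.factorial(k)

# V-1 example: 535 bombs spread at random over 576 equally sized areas,
# so the mean number of hits per area is
lam = 535 / 576  # about 0.93

# Expected number of areas receiving exactly k hits:
for k in range(5):
    print(k, round(576 * poisson_pmf(k, lam), 1))
```

The k = 0 row, for instance, gives the expected number of areas escaping untouched (roughly 227 of the 576).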
The distribution was discovered by Siméon-Denis Poisson (1781–1840) and published, together with his probability theory, in 1838 in his work *Recherches sur la probabilité des jugements en matière criminelle et en matière civile* ("Research on the Probability of Judgments in Criminal and Civil Matters"). The work focused on certain random variables *N* that count, among other things, a number of discrete occurrences (sometimes called "arrivals") that take place during a time interval of given length. If the expected number of occurrences in this interval is λ, then the probability that there are exactly *k* occurrences (*k* being a non-negative integer, *k* = 0, 1, 2, ...) is equal to

    f(k; λ) = λ^k e^{−λ} / k!
where

- *e* is the base of the natural logarithm (*e* = 2.71828...),
- *k* is the number of occurrences of an event, the probability of which is given by the function,
- *k*! is the factorial of *k*,
- λ is a positive real number, equal to the expected number of occurrences during the given interval.

For instance, if the events occur on average every 4 minutes, and you are interested in the number of events occurring in a 10-minute interval, you would use as model a Poisson distribution with λ = 10/4 = 2.5.
As a function of *k*, this is the probability mass function. The Poisson distribution can be derived as a limiting case of the binomial distribution.
The Poisson distribution is sometimes called a Poissonian, analogous to the term Gaussian for a Gauss or normal distribution.
## Poisson noise and characterizing small occurrences
The parameter λ is not only the *mean* number of occurrences but also its variance (see Table). Thus, the number of observed occurrences fluctuates about its mean λ with a standard deviation σ_k = √λ. These fluctuations are denoted as **Poisson noise** or (particularly in electronics) as **shot noise**.
The correlation of the mean and standard deviation in counting independent, discrete occurrences is useful scientifically. By monitoring how the fluctuations vary with the mean signal, one can estimate the contribution of a single occurrence, *even if that contribution is too small to be detected directly*. For example, the charge *e* on an electron can be estimated by correlating the magnitude of an electric current with its shot noise. If *N* electrons pass a point in a given time *t* on average, the mean current is *I* = *eN* / *t*; since the current fluctuations should be of the order σ_I = *e*√*N* / *t* (i.e. the standard deviation of the Poisson process), the charge *e* can be estimated from the ratio *e* = *t*·σ_I² / *I*. An everyday example is the graininess that appears as photographs are enlarged; the graininess is due to Poisson fluctuations in the number of reduced silver grains, not to the individual grains themselves. By correlating the graininess with the degree of enlargement, one can estimate the contribution of an individual grain (which is otherwise too small to be seen unaided). Many other molecular applications of Poisson noise have been developed, e.g., estimating the number density of receptor molecules in a cell membrane.
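The charge-estimation idea can be illustrated with a small simulation. The numerical values, sample count, and the normal approximation to the Poisson draw are arbitrary choices for this sketch, not part of the original example:

```python
import math
import random

random.seed(0)

e_true = 1.6e-19   # charge per electron, in coulombs
rate = 1.0e6       # mean number of electrons per sampling interval
t = 1.0e-3         # sampling interval, in seconds

def poisson_draw(lam):
    """Approximate Poisson sample; a normal approximation is adequate
    for the large means used here."""
    return max(0, round(random.gauss(lam, math.sqrt(lam))))

# Simulate repeated current measurements I = e*N/t with N ~ Poisson(rate).
currents = [e_true * poisson_draw(rate) / t for _ in range(20000)]
mean_I = sum(currents) / len(currents)
var_I = sum((i - mean_I) ** 2 for i in currents) / len(currents)

# Since var(I)/mean(I) = e/t for Poisson counting noise, the charge is
e_est = t * var_I / mean_I
print(e_est)  # recovers a value close to 1.6e-19
```

Note that the individual electrons are never resolved: only the mean and variance of the macroscopic current enter the estimate.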
## Related distributions

- If *X*_{1} ~ Poisson(λ_{1}) and *X*_{2} ~ Poisson(λ_{2}), then the difference *Y* = *X*_{1} − *X*_{2} follows a Skellam distribution.
- If *X*_{1} ~ Poisson(λ_{1}) and *X*_{2} ~ Poisson(λ_{2}) are independent, and *Y* = *X*_{1} + *X*_{2}, then the distribution of *X*_{1} conditional on *Y* = *y* is binomial. Specifically, *X*_{1} | (*Y* = *y*) ~ Binomial(*y*, λ_{1}/(λ_{1} + λ_{2})). More generally, if *X*_{1}, *X*_{2}, ..., *X*_{n} are Poisson random variables with parameters λ_{1}, λ_{2}, ..., λ_{n}, then *X*_{i} | (∑ *X*_{j} = *y*) ~ Binomial(*y*, λ_{i}/∑ λ_{j}).
- The Poisson distribution can be derived as a limiting case of the binomial distribution as the number of trials goes to infinity and the expected number of successes remains fixed. Therefore it can be used as an approximation of the binomial distribution if *n* is sufficiently large and *p* is sufficiently small. A rule of thumb states that the Poisson distribution is a good approximation of the binomial distribution if *n* is at least 20 and *p* is smaller than or equal to 0.05; according to this rule the approximation is excellent if *n* ≥ 100 and *np* ≤ 10.^{[1]}
- For sufficiently large values of λ (say λ > 1000), the normal distribution with mean λ and variance λ is an excellent approximation to the Poisson distribution. If λ is greater than about 10, then the normal distribution is a good approximation if an appropriate continuity correction is performed, i.e., P(*X* ≤ *x*), where (lower-case) *x* is a non-negative integer, is replaced by P(*X* ≤ *x* + 0.5).
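The continuity correction in the last bullet can be checked numerically. This sketch compares the exact Poisson CDF with the corrected normal approximation for an arbitrary choice of λ and *x*:

```python
import math

def poisson_cdf(x, lam):
    """P(X <= x) for X ~ Poisson(lam), by direct summation of the pmf."""
    return sum(lam ** k * math.exp(-lam) / math.factorial(k)
               for k in range(int(x) + 1))

def normal_cdf(z):
    """Standard normal CDF, via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

lam, x = 30.0, 25
exact = poisson_cdf(x, lam)
# Continuity correction: approximate P(X <= x) by P(X <= x + 0.5)
approx = normal_cdf((x + 0.5 - lam) / math.sqrt(lam))
print(exact, approx)  # the two values agree to roughly two decimal places
```

Dropping the `+ 0.5` correction noticeably worsens the agreement at moderate λ.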
## Occurrence

The Poisson distribution arises in connection with Poisson processes. It applies to various phenomena of discrete nature (that is, those that may happen 0, 1, 2, 3, ... times during a given period of time or in a given area) whenever the probability of the phenomenon happening is constant in time or space. Examples of events that may be modelled as a Poisson distribution include:
- The number of cars that pass through a certain point on a road (sufficiently distant from traffic lights) during a given period of time.
- The number of spelling mistakes one makes while typing a single page.
- The number of phone calls at a call center per minute.
- The number of times a web server is accessed per minute.
- The number of roadkill (animals killed) found per unit length of road.
- The number of mutations in a given stretch of DNA after a certain amount of radiation.
- The number of unstable nuclei that decayed within a given period of time in a piece of radioactive substance. The radioactivity of the substance will weaken with time, so the total time interval used in the model should be significantly less than the mean lifetime of the substance.
- The number of pine trees per unit area of mixed forest.
- The number of stars in a given volume of space.
- The number of soldiers killed by horse-kicks each year in each corps in the Prussian cavalry. This example was made famous by a book of Ladislaus Josephovich Bortkiewicz (1868–1931).
- The distribution of visual receptor cells in the retina of the human eye.
- The number of light bulbs that burn out in a certain amount of time.
- The number of viruses that can infect a cell in cell culture.
- The number of hematopoietic stem cells in a sample of unfractionated bone marrow cells.
- The number of inventions of an inventor over their career.
- The number of particles that "scatter" off of a target in a nuclear or high energy physics experiment.
## How does this distribution arise? — The *law of rare events*

In several of the above examples—for example, the number of mutations in a given sequence of DNA—the events being counted are actually the outcomes of discrete trials, and would more precisely be modelled using the binomial distribution. However, the binomial distribution with parameters *n* and λ/*n*, i.e., the probability distribution of the number of successes in *n* trials with probability λ/*n* of success on each trial, approaches the Poisson distribution with expected value λ as *n* approaches infinity. This limit is sometimes known as the **law of rare events**, although this name may be misleading because the events in a Poisson process need not be rare (the number of telephone calls to a busy switchboard in one hour follows a Poisson distribution, but these events would not be considered rare). It provides a means by which to approximate random variables using the Poisson distribution rather than the more cumbersome binomial distribution.
Here are the details. First, recall from calculus that

    lim_{n→∞} (1 − λ/n)^n = e^{−λ}.

Let *p* = λ/*n*. Then we have

    P(X = k) = C(n, k) p^k (1 − p)^{n−k}
             = [n(n−1)⋯(n−k+1) / n^k] · (λ^k / k!) · (1 − λ/n)^n · (1 − λ/n)^{−k}.

As *n* approaches ∞, the first factor approaches 1; the second remains constant since *n* does not appear in it at all; the third approaches *e*^{−λ}; and the fourth approaches 1. Consequently the limit is

    λ^k e^{−λ} / k!.

More generally, whenever a sequence of binomial random variables with parameters *n* and *p*_{n} is such that *n·p*_{n} → λ, the sequence converges in distribution to a Poisson random variable with mean λ (see, e.g., law of rare events).
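The convergence above can be seen numerically; the values of λ and *k* below are arbitrary:

```python
import math

def poisson_pmf(k, lam):
    return lam ** k * math.exp(-lam) / math.factorial(k)

def binomial_pmf(k, n, p):
    return math.comb(n, k) * p ** k * (1.0 - p) ** (n - k)

lam, k = 2.5, 3
# Binomial(n, lam/n) probabilities at k, for growing n:
for n in (10, 100, 1000, 10000):
    print(n, binomial_pmf(k, n, lam / n))
print("limit:", poisson_pmf(k, lam))  # the binomial values approach this
```

Even at n = 100 the binomial probability is already within about one percent of the Poisson limit.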
## Properties

- The mode of a Poisson-distributed random variable with non-integer λ is equal to ⌊λ⌋, which is the largest integer less than or equal to λ. This is also written as floor(λ). When λ is a positive integer, the modes are λ and λ − 1.
- Sums of Poisson-distributed random variables: if *X*_{i} follow a Poisson distribution with parameter λ_{i} and the *X*_{i} are independent, then *Y* = ∑ *X*_{i} also follows a Poisson distribution whose parameter is the sum of the component parameters.
- All of the cumulants of the Poisson distribution are equal to the expected value λ. The *n*th factorial moment of the Poisson distribution is λ^{n}.
- The directed Kullback-Leibler divergence between Poi(λ_{0}) and Poi(λ) is given by

      Δ(λ || λ_{0}) = λ_{0} − λ + λ log(λ / λ_{0}).
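As a numerical sanity check of the closed form for the Kullback-Leibler divergence, it can be compared against a direct summation over the probability mass function (the parameter values 3 and 5 are arbitrary):

```python
import math

def poisson_pmf(k, lam):
    return lam ** k * math.exp(-lam) / math.factorial(k)

def kl_closed_form(lam, lam0):
    """Closed-form directed KL divergence between Poi(lam0) and Poi(lam)."""
    return lam0 - lam + lam * math.log(lam / lam0)

def kl_by_summation(lam, lam0, terms=100):
    """Direct evaluation of sum_k p(k) log(p(k)/q(k)) with p = Poi(lam),
    q = Poi(lam0); 100 terms are ample for small means."""
    total = 0.0
    for k in range(terms):
        p = poisson_pmf(k, lam)
        q = poisson_pmf(k, lam0)
        if p > 0.0:
            total += p * math.log(p / q)
    return total

print(kl_closed_form(3.0, 5.0), kl_by_summation(3.0, 5.0))
```

The two values agree to machine precision, since the truncated tail of a Poisson with small mean is vanishingly light.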
### Generating Poisson-distributed random variables

A simple way to generate random Poisson-distributed numbers is given by Knuth (see References below).
**algorithm** *poisson random number (Knuth)*:

    init:
        Let L ← e^{−λ}, k ← 0 and p ← 1.
    do:
        k ← k + 1.
        Generate uniform random number u and let p ← p × u.
    while p ≥ L.
    return k − 1.

While simple, the complexity is linear in λ. There are many other algorithms to overcome this; some are given in Ahrens & Dieter (see References below).
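A direct Python transcription of Knuth's algorithm (a sketch; as noted above, the runtime grows linearly with λ, so it is only suited to small rates):

```python
import math
import random

def knuth_poisson(lam):
    """Sample one Poisson(lam) variate using Knuth's method:
    multiply uniform variates until the product falls below e^{-lam}."""
    L = math.exp(-lam)
    k, p = 0, 1.0
    while p >= L:       # p starts at 1 >= L, matching the do-while loop
        k += 1
        p *= random.random()
    return k - 1

random.seed(1)
samples = [knuth_poisson(2.5) for _ in range(100000)]
print(sum(samples) / len(samples))  # sample mean, close to lambda = 2.5
```

The method works because the product of k uniforms falls below e^{−λ} exactly when the k-th arrival of a unit-rate Poisson process exceeds time λ.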
## Parameter estimation

### Maximum likelihood

Given a sample of *n* measured values *k*_{i} we wish to estimate the value of the parameter *λ* of the Poisson population from which the sample was drawn. To calculate the maximum likelihood value, we form the log-likelihood function

    L(λ) = log ∏_{i=1}^{n} f(k_i; λ)
         = −nλ + (∑_{i=1}^{n} k_i) log λ − ∑_{i=1}^{n} log(k_i!).

Take the derivative of *L* with respect to *λ* and equate it to zero:

    dL/dλ = −n + (1/λ) ∑_{i=1}^{n} k_i = 0.

Solving for *λ* yields the maximum-likelihood estimate of *λ*:

    λ̂ = (1/n) ∑_{i=1}^{n} k_i.

Since each observation has expectation λ, so does this sample mean. Therefore it is an unbiased estimator of λ. It is also an efficient estimator, i.e. its estimation variance achieves the Cramér-Rao lower bound (CRLB).
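In code, the maximum-likelihood estimate is therefore nothing more than the sample mean of the observed counts (the data below are hypothetical):

```python
data = [2, 3, 1, 0, 4, 2, 3, 1, 2, 2]  # hypothetical observed counts k_i
lam_hat = sum(data) / len(data)        # maximum-likelihood estimate of lambda
print(lam_hat)  # 2.0
```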
### Bayesian inference

In Bayesian inference, the conjugate prior for the rate parameter *λ* of the Poisson distribution is the Gamma distribution. Let

    λ ~ Gamma(α, β)

denote that *λ* is distributed according to the Gamma density *g* parameterized in terms of a shape parameter *α* and an inverse scale parameter *β*:

    g(λ; α, β) = (β^α / Γ(α)) λ^{α−1} e^{−βλ}   for λ > 0.

Then, given the same sample of *n* measured values *k*_{i} as before, and a prior of Gamma(*α*, *β*), the posterior distribution is

    λ ~ Gamma(α + ∑_{i=1}^{n} k_i, β + n).

The posterior mean E[*λ*] = (α + ∑ k_i) / (β + n) approaches the maximum likelihood estimate in the limit as α → 0, β → 0. The posterior predictive distribution of additional data is a Gamma-Poisson (i.e. negative binomial) distribution.
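The conjugate update can be sketched as follows; the prior parameters, true rate, and sample size are arbitrary choices for the illustration:

```python
import math
import random

random.seed(0)

alpha, beta = 2.0, 1.0   # Gamma(alpha, beta) prior on lambda (inverse-scale beta)
true_lam = 4.0

def poisson_draw(lam):
    """Sample Poisson(lam) by inversion of the CDF (fine for small lam)."""
    u, k, p, cdf = random.random(), 0, math.exp(-lam), 0.0
    while True:
        cdf += p
        if u < cdf:
            return k
        k += 1
        p *= lam / k

data = [poisson_draw(true_lam) for _ in range(500)]

# Conjugacy: the posterior is Gamma(alpha + sum(k_i), beta + n).
post_alpha = alpha + sum(data)
post_beta = beta + len(data)
post_mean = post_alpha / post_beta
print(post_mean)  # posterior mean, close to the true rate 4.0
```

With 500 observations the data dominate the prior, so the posterior mean essentially coincides with the sample mean.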
## The "law of small numbers"

The word **law** is sometimes used as a synonym of probability distribution, and *convergence in law* means *convergence in distribution*. Accordingly, the Poisson distribution is sometimes called the **law of small numbers** because it is the probability distribution of the number of occurrences of an event that happens rarely but has very many opportunities to happen. *The Law of Small Numbers* is a book by Ladislaus Bortkiewicz about the Poisson distribution, published in 1898. Some historians of mathematics have argued that the Poisson distribution should have been called the Bortkiewicz distribution.^{[citation needed]}
## See also

- Compound Poisson distribution
- Poisson process
- Poisson regression
- Queueing theory
- Erlang distribution
- Skellam distribution
- Gamma function
- Dobinski's formula
- Schwarz formula
- Robbins lemma
- Empirical Bayes method
## References

1. **^** NIST/SEMATECH, "6.3.3.1. Counts Control Charts", *e-Handbook of Statistical Methods*, <http://www.itl.nist.gov/div898/handbook/pmc/section3/pmc331.htm> [accessed 25 October 2006]

- Donald E. Knuth (1969). *Seminumerical Algorithms*, The Art of Computer Programming, Volume 2. Addison-Wesley.
- Joachim H. Ahrens, Ulrich Dieter (1974). "Computer Methods for Sampling from Gamma, Beta, Poisson and Binomial Distributions". *Computing* **12** (3): 223–246. DOI:10.1007/BF02293108.
- Joachim H. Ahrens, Ulrich Dieter (1982). "Computer Generation of Poisson Deviates". *ACM Transactions on Mathematical Software* **8** (2): 163–179. DOI:10.1145/355993.355997.
## External links