
Logarithmic distribution
Parameters: 0 < p < 1
Support: k ∈ {1, 2, 3, ...}
Probability mass function: f(k) = -1/ln(1 - p) * p^k / k
Mode: 1
Mean: -p / ((1 - p) ln(1 - p))

In probability and statistics, the logarithmic distribution (also known as the logarithmic series distribution) is a discrete probability distribution.

The logarithmic distribution is derived from the Maclaurin series expansion of ln(1 − p), which is

-ln(1 - p) = p + p^2/2 + p^3/3 + ... = sum_{k=1}^∞ p^k / k,

convergent for 0 < p < 1.
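The series identity above can be checked numerically; the following is a quick sketch (the value p = 0.5 and the 200-term truncation are arbitrary choices):

```python
import math

# Check the Maclaurin series -ln(1 - p) = sum_{k>=1} p^k / k numerically.
p = 0.5
series = sum(p**k / k for k in range(1, 201))  # truncated at 200 terms
exact = -math.log(1 - p)
print(abs(series - exact) < 1e-12)  # True: the truncated series matches closely
```

For 0 < p < 1 the terms p^k / k decay geometrically, so the truncation error after 200 terms is far below floating-point precision.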

From this we obtain the identity

sum_{k=1}^∞ (-1 / ln(1 - p)) * p^k / k = 1.

This leads directly to the probability mass function of a Log(p)-distributed random variable:

f(k) = -1 / ln(1 - p) * p^k / k

for k ≥ 1, and where 0 < p < 1. Because of the identity above, the distribution is properly normalized.
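The pmf and its normalization can be sketched in a few lines (the parameter p = 0.3 and the 500-term cutoff are illustrative assumptions):

```python
import math

def logser_pmf(k, p):
    # pmf of the Log(p) distribution: f(k) = -1/ln(1-p) * p**k / k
    return -1.0 / math.log(1.0 - p) * p**k / k

p = 0.3
# Summing far enough that the geometric tail is negligible:
total = sum(logser_pmf(k, p) for k in range(1, 500))
print(round(total, 10))  # 1.0 — the distribution is properly normalized
```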

The cumulative distribution function is

F(k) = 1 + B(p; k + 1, 0) / ln(1 - p)

where B is the incomplete beta function.
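The cdf can be evaluated two equivalent ways: as a partial sum of the pmf, or via the incomplete beta expression, using the series B(p; k + 1, 0) = sum_{j > k} p^j / j (which follows from expanding 1/(1 − t) inside the defining integral). A sketch comparing the two, with p = 0.4 and k = 5 as illustrative choices:

```python
import math

p, k = 0.4, 5

# (1) cdf as a partial sum of the pmf:
cdf_from_pmf = sum(-p**i / (i * math.log(1 - p)) for i in range(1, k + 1))

# (2) cdf via the incomplete beta function, using its series expansion
#     B(p; k+1, 0) = sum_{j > k} p**j / j (truncated tail):
inc_beta = sum(p**j / j for j in range(k + 1, 400))
cdf_from_beta = 1 + inc_beta / math.log(1 - p)

print(abs(cdf_from_pmf - cdf_from_beta) < 1e-12)  # True: both forms agree
```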

A Poisson mixture of Log(p)-distributed random variables has a negative binomial distribution. In other words, if N is a random variable with a Poisson distribution, and X_i, i = 1, 2, 3, ..., is an infinite sequence of independent identically distributed random variables each having a Log(p) distribution, then

X_1 + X_2 + ... + X_N = sum_{i=1}^N X_i

has a negative binomial distribution. In this way, the negative binomial distribution is seen to be a compound Poisson distribution.
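The compound-Poisson construction can be illustrated by simulation. The sketch below draws N ~ Poisson(λ), sums N iid Log(p) variables, and compares the empirical mean against the exact compound mean E[sum] = λ E[X] with E[X] = -p / ((1 - p) ln(1 - p)); the values p = 0.5, λ = 2 and the trial count are illustrative assumptions, not part of the original text:

```python
import math
import random

def sample_logser(p, rng):
    # Inverse-transform sampling from the Log(p) pmf.
    u, k, cum = rng.random(), 1, 0.0
    norm = -1.0 / math.log(1.0 - p)
    while True:
        cum += norm * p**k / k
        if u <= cum:
            return k
        k += 1

def sample_poisson(lam, rng):
    # Knuth's multiplicative method (fine for small lam).
    L, k, prod = math.exp(-lam), 0, 1.0
    while True:
        prod *= rng.random()
        if prod <= L:
            return k
        k += 1

rng = random.Random(42)
p, lam, trials = 0.5, 2.0, 20000
totals = []
for _ in range(trials):
    n = sample_poisson(lam, rng)
    totals.append(sum(sample_logser(p, rng) for _ in range(n)))

empirical = sum(totals) / trials
exact = lam * (-p / ((1 - p) * math.log(1 - p)))
print(round(empirical, 2), round(exact, 2))  # the two means should be close
```

A full check would compare the whole empirical distribution of the totals against the negative binomial pmf; matching the mean is only the first moment of that comparison.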

