# Zeta distribution
[Figure: probability mass function of the zeta distribution, plotted on a log-log scale. The function is defined only at integer values of k; the connecting lines do not indicate continuity. A companion figure shows the cumulative distribution function.]

In probability theory and statistics, the zeta distribution is a discrete probability distribution. If X is a zeta-distributed random variable with parameter s, then the probability that X takes the integer value k is given by the probability mass function

    f_k(s) = k^(−s) / ζ(s),

where ζ(s) is the Riemann zeta function.

It can be shown that these are the only probability distributions for which the multiplicities of distinct prime factors of X are independent random variables.
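A special case of this independence is easy to check numerically: the events "2 divides X" and "3 divides X" (multiplicity at least 1 for each prime) satisfy P(6 | X) = P(2 | X) · P(3 | X), since summing k^(−s) over the multiples of m gives m^(−s)·ζ(s), hence P(m | X) = m^(−s). The helper below is an illustrative sketch using truncated sums:

```python
def prob_divisible(m, s, terms=300_000):
    # P(m divides X) for a zeta-distributed X, via truncated sums;
    # analytically this equals m**(-s), because the sum of k**(-s)
    # over multiples of m is m**(-s) * zeta(s).
    zeta = sum(k ** -s for k in range(1, terms + 1))
    multiples = sum(k ** -s for k in range(m, terms + 1, m))
    return multiples / zeta

# Independence of the prime multiplicities of 2 and 3 implies
# P(6 | X) = P(2 | X) * P(3 | X).
p2, p3, p6 = (prob_divisible(m, 2.0) for m in (2, 3, 6))
```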

The zeta distribution is equivalent to the Zipf distribution for infinite N; indeed, the terms "Zipf distribution" and "zeta distribution" are often used interchangeably.
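The connection can be sketched numerically: the finite Zipf distribution on {1, ..., N} assigns probability proportional to k^(−s), and as N grows its pmf at any fixed k approaches the zeta pmf (the function name here is illustrative):

```python
import math

def zipf_pmf(k, s, N):
    # Finite Zipf distribution on {1, ..., N}: P(X = k) = k^(-s) / H(N, s),
    # where H(N, s) is the generalized harmonic number sum_{i<=N} i^(-s).
    norm = sum(i ** -s for i in range(1, N + 1))
    return k ** -s / norm

# As N -> infinity, H(N, s) -> zeta(s), so zipf_pmf(k, s, N) tends to the
# zeta pmf k^(-s) / zeta(s).  For s = 2 the limit at k = 1 is 6 / pi^2.
limit = 6 / math.pi ** 2
approx = [zipf_pmf(1, 2.0, N) for N in (10, 1_000, 100_000)]
```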

## The case s = 1

ζ(1) is infinite (the series is the harmonic series), so the case s = 1 is not meaningful. However, if A is any set of positive integers that has a density, i.e. if

    lim_{n→∞} N(A, n) / n

exists, where N(A, n) is the number of members of A less than or equal to n, then

    lim_{s→1+} P(X ∈ A)

is equal to that density.
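For instance, take A to be the set of even positive integers, whose density is 1/2. Then P(X ∈ A) = Σ_m (2m)^(−s)/ζ(s) = 2^(−s), which indeed tends to 1/2 as s → 1+. A small numerical sketch (truncated sums; the function name is illustrative):

```python
def prob_in_evens(s, terms=400_000):
    # P(X is even) for a zeta-distributed X, via truncated sums.
    zeta = sum(k ** -s for k in range(1, terms + 1))
    evens = sum(k ** -s for k in range(2, terms + 1, 2))
    return evens / zeta

# Analytically P(X even) = 2**(-s) exactly (the even terms sum to
# 2**(-s) * zeta(s)), so as s -> 1+ the probability tends to 1/2,
# the density of the even numbers.
trend = [2 ** -s for s in (2.0, 1.5, 1.1, 1.01)]
```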

The latter limit can also exist in some cases in which A does not have a density. For example, if A is the set of all positive integers whose first digit is d, then A has no density, but nonetheless the second limit given above exists and is equal to

log10(d + 1) − log10(d),

in accord with Benford's law.
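These limiting first-digit probabilities can be tabulated directly; a quick check confirms that they form a probability distribution:

```python
import math

def leading_digit_prob(d):
    # Limiting probability, as s -> 1+, that a zeta-distributed X has
    # leading digit d: log10(d + 1) - log10(d), i.e. Benford's law.
    return math.log10(d + 1) - math.log10(d)

probs = {d: leading_digit_prob(d) for d in range(1, 10)}
# The nine leading-digit sets partition the positive integers, and the
# telescoping sum log10(2/1) + log10(3/2) + ... + log10(10/9) equals 1.
```

As Benford's law predicts, the leading digit 1 is most likely (about 30% of the time) and 9 is least likely.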

## Other "power-law" distributions

- Cauchy distribution
- Pareto distribution
- Zipf's law
- Zipf-Mandelbrot law
