Bernoulli distribution

[Figures in the original infobox: probability mass function and cumulative distribution function plots]

Parameters: 0 \le p \le 1 (real); q = 1 - p
Support: k \in \{0, 1\}
Probability mass function (pmf): \begin{cases} q & \text{for } k = 0 \\ p & \text{for } k = 1 \end{cases}
Cumulative distribution function (cdf): \begin{cases} 0 & \text{for } k < 0 \\ q & \text{for } 0 \le k < 1 \\ 1 & \text{for } k \ge 1 \end{cases}
Mean: p
Median: N/A
Mode: \begin{cases} 0 & \text{if } q > p \\ 0,\,1 & \text{if } q = p \\ 1 & \text{if } q < p \end{cases}
Variance: pq
Skewness: \frac{q - p}{\sqrt{pq}}
Excess kurtosis: \frac{6p^2 - 6p + 1}{p(1 - p)}
Entropy: -q \ln(q) - p \ln(p)
Moment-generating function (mgf): q + pe^t
Characteristic function: q + pe^{it}
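As a quick numerical check of the table above, the following sketch (not part of the original article; the choice p = 0.3 and the sample size are arbitrary illustration values) estimates the mean, variance and skewness of a Bernoulli variable by simulation and compares them with the closed-form expressions.

import math
import random

# Closed-form quantities from the table above, for an arbitrary p = 0.3.
p = 0.3
q = 1.0 - p
mean = p
variance = p * q
skewness = (q - p) / math.sqrt(p * q)
entropy = -q * math.log(q) - p * math.log(p)

# Monte Carlo estimates from n simulated Bernoulli(p) draws.
n = 200_000
draws = [1 if random.random() < p else 0 for _ in range(n)]
m1 = sum(draws) / n                                 # sample mean
var_hat = sum((x - m1) ** 2 for x in draws) / n     # sample variance
skew_hat = (sum((x - m1) ** 3 for x in draws) / n) / var_hat ** 1.5

print(f"mean      closed-form {mean:.4f}   estimate {m1:.4f}")
print(f"variance  closed-form {variance:.4f}   estimate {var_hat:.4f}")
print(f"skewness  closed-form {skewness:.4f}   estimate {skew_hat:.4f}")
print(f"entropy   closed-form {entropy:.4f} (nats)")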

In probability theory and statistics, the Bernoulli distribution, named after the Swiss scientist Jakob Bernoulli, is a discrete probability distribution which takes value 1 with success probability p and value 0 with failure probability q = 1 − p. So if X is a random variable with this distribution, we have:

\Pr(X = 1) = 1 - \Pr(X = 0) = p.

The probability mass function f of this distribution is

f(k; p) = \begin{cases} p & \text{if } k = 1, \\ 1 - p & \text{if } k = 0, \\ 0 & \text{otherwise.} \end{cases}
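Equivalently, for k in {0, 1} the pmf can be written as a single expression, a form that is often convenient when writing down likelihoods:

f(k; p) = p^k (1 - p)^{1 - k}, \qquad k \in \{0, 1\}.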

The expected value of a Bernoulli random variable X is \mathrm{E}(X) = p, and its variance is

\operatorname{var}(X) = p(1 - p).
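The variance follows in one line from the pmf, because X^2 = X for a 0/1-valued variable:

\operatorname{var}(X) = \mathrm{E}(X^2) - \mathrm{E}(X)^2 = p - p^2 = p(1 - p) = pq.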

The kurtosis goes to infinity for high and low values of p, but for p = 1/2 the Bernoulli distribution has a lower excess kurtosis than any other probability distribution, namely −2.
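As a check, substituting p = 1/2 (so p(1 − p) = 1/4) into the excess-kurtosis expression from the table gives

\frac{6p^2 - 6p + 1}{p(1 - p)} = \frac{\tfrac{3}{2} - 3 + 1}{\tfrac{1}{4}} = \frac{-\tfrac{1}{2}}{\tfrac{1}{4}} = -2.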


The Bernoulli distribution is a member of the exponential family.
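One way to see this is to rewrite the pmf in canonical exponential-family form, with natural parameter \theta = \ln\frac{p}{1-p} (the log-odds):

f(k; p) = \exp\!\left( k \ln\frac{p}{1 - p} + \ln(1 - p) \right), \qquad k \in \{0, 1\},

which recovers p^k (1 - p)^{1 - k} on exponentiating.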


Related distributions

  • If X_1, \dots, X_n are independent, identically distributed random variables, all Bernoulli distributed with success probability p, then Y = \sum_{k=1}^n X_k \sim \mathrm{Binomial}(n, p) (binomial distribution), as illustrated in the sketch below.
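A small simulation illustrates this relationship (a sketch only; the values p = 0.4, n = 10 and the number of trials are arbitrary illustration choices, not from the article):

import math
import random

p, n = 0.4, 10          # arbitrary illustration values
trials = 100_000        # number of simulated sums

# Simulate Y = X_1 + ... + X_n with X_i ~ Bernoulli(p).
counts = [0] * (n + 1)
for _ in range(trials):
    y = sum(1 for _ in range(n) if random.random() < p)
    counts[y] += 1

# Compare the empirical frequencies with the Binomial(n, p) pmf.
for k in range(n + 1):
    binom_pmf = math.comb(n, k) * p**k * (1 - p)**(n - k)
    print(f"k={k:2d}  simulated {counts[k] / trials:.4f}  binomial {binom_pmf:.4f}")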



