In probability theory and statistics, the **Bernoulli distribution**, named after the Swiss scientist Jakob Bernoulli, is a discrete probability distribution which takes value 1 with success probability *p* and value 0 with failure probability *q* = 1 − *p*. So if *X* is a random variable with this distribution, we have:

Pr(*X* = 1) = *p* and Pr(*X* = 0) = *q* = 1 − *p*.
The probability mass function *f* of this distribution is

*f*(*k*; *p*) = *p* if *k* = 1, and *f*(*k*; *p*) = 1 − *p* if *k* = 0,

which can be written compactly as *f*(*k*; *p*) = *p*^*k* (1 − *p*)^(1 − *k*) for *k* ∈ {0, 1}.
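As a sketch, the mass function above can be written directly in Python (the helper name `bernoulli_pmf` is illustrative, not from any particular library):

```python
def bernoulli_pmf(k, p):
    """Probability mass function of a Bernoulli(p) variable.

    Returns p**k * (1 - p)**(1 - k) for k in {0, 1}, and 0 otherwise.
    """
    if k not in (0, 1):
        return 0.0
    return p ** k * (1 - p) ** (1 - k)

print(bernoulli_pmf(1, 0.5))   # 0.5  (a fair coin)
print(bernoulli_pmf(0, 0.25))  # 0.75 (failure probability q = 1 - p)
```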
The expected value of a Bernoulli random variable *X* is E[*X*] = *p*, and its variance is Var[*X*] = *p*(1 − *p*) = *pq*.
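A quick simulation, sketched here with only the Python standard library, agrees with these formulas:

```python
import random

def simulate_bernoulli(p, n, seed=0):
    """Draw n Bernoulli(p) samples and return (sample mean, sample variance)."""
    rng = random.Random(seed)
    samples = [1 if rng.random() < p else 0 for _ in range(n)]
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / n
    return mean, var

mean, var = simulate_bernoulli(p=0.3, n=100_000)
print(mean)  # close to E[X] = p = 0.3
print(var)   # close to Var[X] = p(1 - p) = 0.21
```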
The kurtosis goes to infinity for high and low values of *p*, but for *p* = 1/2 the Bernoulli distribution has a lower excess kurtosis than any other probability distribution, namely −2.
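Using the standard formula for the excess kurtosis of a Bernoulli variable, (1 − 6*pq*)/(*pq*) with *q* = 1 − *p*, a short check confirms the value −2 at *p* = 1/2 and the blow-up near the extremes:

```python
def bernoulli_excess_kurtosis(p):
    """Excess kurtosis of Bernoulli(p): (1 - 6*p*q) / (p*q), with q = 1 - p."""
    q = 1 - p
    return (1 - 6 * p * q) / (p * q)

print(bernoulli_excess_kurtosis(0.5))   # -2.0, the minimum over all distributions
print(bernoulli_excess_kurtosis(0.01))  # large and positive: diverges as p -> 0 or 1
```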
The Bernoulli distribution is a member of the exponential family.
## Related distributions
- If *X*₁, ..., *X*ₙ are independent, identically distributed random variables, all Bernoulli distributed with success probability *p*, then their sum *Y* = *X*₁ + ⋯ + *X*ₙ follows a binomial distribution with parameters *n* and *p*.
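This relation can be checked by simulation; the sketch below (standard library only, names illustrative) compares the empirical frequency of *k* successes in *n* Bernoulli draws against the binomial pmf C(*n*, *k*) *p*^*k* (1 − *p*)^(*n* − *k*):

```python
import random
from math import comb

def sum_of_bernoullis(n, p, rng):
    """Sum of n independent Bernoulli(p) draws: one Binomial(n, p) sample."""
    return sum(1 if rng.random() < p else 0 for _ in range(n))

rng = random.Random(42)
trials = 50_000
n, p = 10, 0.3
samples = [sum_of_bernoullis(n, p, rng) for _ in range(trials)]

k = 3
empirical = samples.count(k) / trials
theoretical = comb(n, k) * p**k * (1 - p) ** (n - k)
print(empirical, theoretical)  # both close to 0.267
```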
## See also