The posterior probability can be calculated by Bayes' theorem from the prior probability and the likelihood function.
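For a discrete hypothesis, this update is a one-line calculation. The sketch below works through Bayes' theorem for a single event; the prevalence and test-accuracy numbers are made up purely for illustration.

```python
# Discrete Bayes update: posterior = likelihood * prior / evidence.
# Hypothetical example: a diagnostic test for a condition with 1% prevalence.
prior = 0.01                # P(H): prior probability of the condition
sensitivity = 0.95          # P(E|H): likelihood of a positive test given H
false_positive = 0.05       # P(E|not H): false-positive rate

# P(E), the normalizing constant, via the law of total probability.
evidence = sensitivity * prior + false_positive * (1 - prior)
posterior = sensitivity * prior / evidence  # P(H|E)
print(round(posterior, 3))
```

Even with a fairly accurate test, the posterior stays modest because the prior is small, which is the point of weighting the likelihood by the prior.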
Similarly, a **posterior probability distribution** is the conditional probability distribution of the uncertain quantity given the data. It can be calculated by multiplying the prior probability distribution by the likelihood function and then dividing by the normalizing constant, as follows:

$$f_{X\mid Y=y}(x) = \frac{f_X(x)\,L_{X\mid Y=y}(x)}{\int_{-\infty}^{\infty} f_X(u)\,L_{X\mid Y=y}(u)\,du}$$
gives the posterior probability density function for a random variable *X* given the data *Y* = *y*, where
- $f_X(x)$ is the prior density of *X*,
- $L_{X\mid Y=y}(x) = f_{Y\mid X=x}(y)$ is the likelihood function as a function of *x*,
- $\int_{-\infty}^{\infty} f_X(u)\,L_{X\mid Y=y}(u)\,du$ is the normalizing constant, and
- $f_{X\mid Y=y}(x)$ is the posterior density of *X*.
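The continuous formula can be checked numerically by discretizing *x* on a grid: multiply the prior density by the likelihood pointwise, then divide by a Riemann-sum approximation of the normalizing integral. The normal prior, the observed value, and the noise scale below are illustrative assumptions, not part of the article.

```python
# Grid approximation of the posterior density: prior * likelihood / constant.
import math

def normal_pdf(x, mean, sd):
    """Density of a Normal(mean, sd) distribution at x."""
    return math.exp(-0.5 * ((x - mean) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

# Assumed setup: prior X ~ Normal(0, 1); observed y = 1.5 with noise sd 0.5,
# so the likelihood L(x) = f_{Y|X=x}(y) is the Normal(x, 0.5) density at y.
dx = 0.001
grid = [i * dx for i in range(-5000, 5001)]
prior = [normal_pdf(x, 0.0, 1.0) for x in grid]
lik = [normal_pdf(1.5, x, 0.5) for x in grid]

# Normalizing constant: Riemann sum of prior(x) * likelihood(x) over the grid.
norm_const = sum(p * l for p, l in zip(prior, lik)) * dx
posterior = [p * l / norm_const for p, l in zip(prior, lik)]

# By construction the posterior density integrates to (approximately) 1.
print(round(sum(posterior) * dx, 6))
```

For this normal-normal pair the posterior is available in closed form (mean $1.5/0.25 \,/\, (1 + 1/0.25) = 1.2$), which makes the grid result easy to sanity-check.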