# Posterior Probability Distribution

## What is Posterior Probability?

Posterior probability is the probability an event will happen after all evidence or background information has been taken into account. It is closely related to prior probability, which is the probability an event will happen before you take any new evidence into account. You can think of posterior probability as an adjustment to prior probability:

Posterior probability = prior probability, updated with new evidence (called the ‘likelihood’)

For example, historical data suggests that around 60% of students who start college will graduate within 6 years. This is the prior probability. However, you think that figure is much lower, so you set out to collect new data. The evidence you collect suggests that the true figure is closer to 50%. This is the posterior probability.
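The update in this example can be sketched with a standard Beta-Binomial model. The prior parameters (a Beta(6, 4) prior, mean 0.6) and the sample (100 students, 50 graduates) are illustrative assumptions, not figures from the text:

```python
# Hypothetical Bayesian update for the graduation-rate example.
# Prior: Beta(6, 4), whose mean is 6 / (6 + 4) = 0.6 (the historical 60%).
# Assumed new data: 50 graduates out of 100 students observed.

def beta_binomial_update(a, b, successes, failures):
    """Return the posterior Beta parameters after observing the data."""
    return a + successes, b + failures

prior_a, prior_b = 6, 4
a_post, b_post = beta_binomial_update(prior_a, prior_b, 50, 50)

posterior_mean = a_post / (a_post + b_post)
print(round(posterior_mean, 3))  # → 0.509
```

The posterior mean lands near 50% because the (assumed) 100 observations carry far more weight than the weak prior.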

## Origin of the Terms

The words posterior and prior come from the Latin terms a posteriori and a priori. The definition of “a priori” is:

“…relating to what can be known through an understanding of how certain things work [i.e., a hypothesis] rather than by observation.” ~ Merriam-Webster

## What is a Posterior Distribution?

The posterior distribution is a way to summarize what we know about uncertain quantities in Bayesian analysis. It combines the prior distribution with the likelihood function, which summarizes the information contained in your observed data (the “new evidence”). In other words, the posterior distribution summarizes what you know after the data have been observed.

Posterior Distribution ∝ Prior Distribution × Likelihood Function (“new evidence”)
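This "prior times likelihood, then normalize" recipe can be shown with a minimal grid approximation. The setup is an assumption for illustration: 7 graduations observed in 10 students, with a flat prior over the graduation rate p:

```python
# Grid approximation of posterior ∝ prior × likelihood.
# Assumed data: 7 successes in 10 trials; flat prior over p.
from math import comb

grid = [i / 100 for i in range(1, 100)]              # candidate values of p
prior = [1.0 for _ in grid]                          # flat (uniform) prior
likelihood = [comb(10, 7) * p**7 * (1 - p)**3 for p in grid]

unnormalized = [pr * lk for pr, lk in zip(prior, likelihood)]
total = sum(unnormalized)
posterior = [u / total for u in unnormalized]        # sums to 1 over the grid

# With a flat prior, the posterior peaks at the observed proportion 7/10:
mode = grid[posterior.index(max(posterior))]
print(mode)  # → 0.7
```

Normalizing by the total is what turns "prior × likelihood" from a proportionality into an actual probability distribution.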

Posterior distributions are vitally important in Bayesian analysis. They are, in many ways, the goal of the analysis and can give you:

• Interval estimates for parameters,
• Point estimates for parameters,
• Predictive inference for future data,
• Probabilistic evaluations for a hypothesis.
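Each item above can be read straight off the posterior. A sketch, reusing the assumed Beta(56, 54) posterior from the graduation example and drawing samples from it with the standard library:

```python
# Point estimates, interval estimates, and hypothesis probabilities from
# posterior draws. Beta(56, 54) is the assumed posterior from the earlier
# illustrative graduation example.
import random

random.seed(0)
draws = sorted(random.betavariate(56, 54) for _ in range(10_000))

point_estimate = sum(draws) / len(draws)             # posterior mean
lo, hi = draws[249], draws[9749]                     # central 95% credible interval
prob_below_60 = sum(d < 0.60 for d in draws) / len(draws)  # P(rate < 0.60 | data)

print(round(point_estimate, 2), round(lo, 2), round(hi, 2))
```

The same draws answer all three questions: the mean gives a point estimate, the 2.5th and 97.5th percentiles give an interval, and the fraction of draws below 0.60 evaluates the hypothesis that the graduation rate is under 60%.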