Abstract

Maximum A Posteriori (MAP) estimation estimates parameters from the data while also incorporating some prior hypothesis.

Given historical data $X$, we want to estimate the parameters $\theta$. For maximum a posteriori (MAP), we already have some pre-existing hypothesis about our probabilities, termed a prior $P(\theta)$, and we use our new data to update that hypothesis and obtain the posterior $P(\theta \mid X)$. In other words, we find the $\theta$ that is most likely given $X$ as well as our prior:

$$
\hat{\theta}_{\text{MAP}} \;=\; \arg\max_{\theta} P(\theta \mid X) \;=\; \arg\max_{\theta} P(X \mid \theta)\,P(\theta).
$$

The prior influences the final probability density, and in practice it biases our model toward a smoother, simpler distribution.
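As a minimal sketch of this idea (not from the original text), the snippet below compares a maximum-likelihood estimate with a MAP estimate for a coin's heads probability by maximizing the log-likelihood, with and without a log-prior, over a grid of candidate values. The data and the `Beta(5, 5)` prior are assumptions chosen purely for illustration.

```python
import numpy as np
from scipy.stats import bernoulli, beta

# Observed coin flips (1 = heads, 0 = tails); illustrative data only.
X = np.array([1, 1, 1, 0, 1, 1, 0, 1])

# Candidate values of theta, the probability of heads.
thetas = np.linspace(0.001, 0.999, 999)

# Log-likelihood of the data under each candidate theta.
log_likelihood = np.array([bernoulli.logpmf(X, t).sum() for t in thetas])

# Log-density of an assumed Beta(5, 5) prior, which favors theta near 0.5.
log_prior = beta.logpdf(thetas, 5, 5)

# MLE maximizes the likelihood alone; MAP also adds the log-prior.
theta_mle = thetas[np.argmax(log_likelihood)]
theta_map = thetas[np.argmax(log_likelihood + log_prior)]

print(f"MLE: {theta_mle:.3f}")  # the raw fraction of heads, 0.750
print(f"MAP: {theta_map:.3f}")  # pulled toward 0.5 by the prior, 0.625
```

With six heads out of eight flips, the MLE is the raw frequency 0.75, while the assumed prior pulls the MAP estimate toward 0.5, illustrating the smoothing effect described above.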

Example

We'll illustrate this concept with a coin-flip example. Let $X$ be a set of coin-flip results, with $h$ heads and $t$ tails, and let $\theta$ be the probability of the coin landing heads. We'll find the $\theta$ that maximizes the probability of $\theta$ given that $X$ occurred. For conjugacy, let the prior $P(\theta)$ follow a family of distributions that, combined with the likelihood, keeps the posterior in the same family. Then, we can maximize the posterior with 🪙 Bayes' Theorem.
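For concreteness (this worked step is an addition, not from the original text), assume the prior is a $\text{Beta}(\alpha, \beta)$ distribution, the standard conjugate choice for a coin-flip likelihood. Bayes' Theorem then gives a posterior proportional to the likelihood times the prior:

$$
P(\theta \mid X) \;\propto\; P(X \mid \theta)\,P(\theta)
\;\propto\; \theta^{h}(1-\theta)^{t}\,\theta^{\alpha-1}(1-\theta)^{\beta-1}
\;=\; \theta^{h+\alpha-1}(1-\theta)^{t+\beta-1}.
$$

Setting the derivative of the log-posterior to zero yields the closed-form MAP estimate

$$
\hat{\theta}_{\text{MAP}} \;=\; \frac{h+\alpha-1}{h+t+\alpha+\beta-2}.
$$

With a uniform $\text{Beta}(1, 1)$ prior this reduces to the maximum-likelihood estimate $h/(h+t)$; larger $\alpha$ and $\beta$ pull the estimate toward the prior's mode, matching the smoothing behavior noted earlier.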