A regression model uses a distribution to model some function from predictors $X$ to outcomes $y$. Here, we'll consider the classical linear regression model, where we assume the data is generated by

$$y = X\beta + \epsilon,$$

where noise $\epsilon \sim \mathcal{N}(0, \sigma^2 I)$.

Our model has the form

$$y \mid \beta, \sigma^2, X \sim \mathcal{N}(X\beta, \sigma^2 I),$$

which looks very much like the 🛎️ Normal Model except with the mean being determined by $X\beta$. Within these predictor variables $X$, we assume there's an intercept (a column constant at $1$).
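
Concretely, the design matrix with the intercept can be assembled like this; a minimal NumPy sketch with made-up predictors:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
x1, x2 = rng.normal(size=n), rng.normal(size=n)  # illustrative raw predictors
X = np.column_stack([np.ones(n), x1, x2])        # first column: intercept, constant at 1
```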

Non-Informative Prior

Our normal model has the improper non-informative prior $p(\mu) \propto 1$ and $p(\sigma^2) \propto (\sigma^2)^{-1}$, so for our normal regression model, we'll use a similar $p(\beta) \propto 1$ and $p(\sigma^2) \propto (\sigma^2)^{-1}$.

The joint posterior is

$$p(\beta, \sigma^2 \mid y) \propto (\sigma^2)^{-n/2 - 1} \exp\!\left(-\frac{1}{2\sigma^2}\,(y - X\beta)^\top (y - X\beta)\right).$$

Again breaking this up, we have the conditional posterior

$$\beta \mid \sigma^2, y \sim \mathcal{N}(\hat{\beta}, \sigma^2 V_\beta),$$

where $\mathcal{N}$ stands for the multivariate normal, $\hat{\beta} = (X^\top X)^{-1} X^\top y$, and $V_\beta = (X^\top X)^{-1}$.

The marginal posterior is

$$\sigma^2 \mid y \sim \text{Inv-}\chi^2\!\left(n - k,\ s^2\right), \qquad s^2 = \frac{1}{n - k}\,(y - X\hat{\beta})^\top (y - X\hat{\beta}),$$

where $k$ is the number of predictors (columns of $X$).

Sampling

To sample estimates for $\beta$ and $\sigma^2$ as well as new predictions $\tilde{y}$ at points $\tilde{X}$, we perform the following (a code sketch follows the list):

  1. Sample $\sigma^2 \sim \text{Inv-}\chi^2(n - k, s^2)$.
  2. Set $\hat{\beta} = (X^\top X)^{-1} X^\top y$ and $V_\beta = (X^\top X)^{-1}$.
  3. Sample $\beta \sim \mathcal{N}(\hat{\beta}, \sigma^2 V_\beta)$.
  4. Sample $\tilde{y} \sim \mathcal{N}(\tilde{X}\beta, \sigma^2 I)$.
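
Here's a minimal NumPy sketch of these four steps; the function name and interface are illustrative, not from any library:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_posterior(X, y, X_new, n_draws=1000):
    n, k = X.shape
    V_beta = np.linalg.inv(X.T @ X)        # V_beta = (X^T X)^{-1}
    beta_hat = V_beta @ X.T @ y            # least squares estimate
    resid = y - X @ beta_hat
    s2 = resid @ resid / (n - k)           # s^2
    draws = []
    for _ in range(n_draws):
        # sigma^2 ~ Inv-chi^2(n - k, s^2), drawn as (n - k) s^2 / chi^2_{n-k}
        sigma2 = (n - k) * s2 / rng.chisquare(n - k)
        # beta | sigma^2, y ~ N(beta_hat, sigma^2 V_beta)
        beta = rng.multivariate_normal(beta_hat, sigma2 * V_beta)
        # y_new | beta, sigma^2 ~ N(X_new beta, sigma^2 I)
        y_new = X_new @ beta + rng.normal(0.0, np.sqrt(sigma2), size=X_new.shape[0])
        draws.append((sigma2, beta, y_new))
    return draws
```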

Conjugate Prior

We'll keep the non-informative prior for $\sigma^2$, but we'll now use a conjugate prior for $\beta$,

$$\beta \mid \sigma^2 \sim \mathcal{N}\!\left(\beta_0,\ \sigma^2 \Lambda_0^{-1}\right),$$

with prior mean $\beta_0$ and prior precision $\Lambda_0$.

The posterior is

$$p(\beta, \sigma^2 \mid y) \propto (\sigma^2)^{-(n + k)/2 - 1} \exp\!\left(-\frac{1}{2\sigma^2}\left[(y - X\beta)^\top (y - X\beta) + (\beta - \beta_0)^\top \Lambda_0\, (\beta - \beta_0)\right]\right).$$

The conditional posterior is

$$\beta \mid \sigma^2, y \sim \mathcal{N}(\hat{\beta}_n, \sigma^2 V_n),$$

where $V_n = (X^\top X + \Lambda_0)^{-1}$ and $\hat{\beta}_n = V_n (X^\top y + \Lambda_0 \beta_0)$,

and the marginal posterior for $\sigma^2$ is the same as above except with our new definition $\hat{\beta}_n$ in place of $\hat{\beta}$.
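
The update itself is small; a sketch, assuming `Lambda0` (prior precision) and `beta0` (prior mean) are given as NumPy arrays:

```python
import numpy as np

def conjugate_update(X, y, Lambda0, beta0):
    # V_n = (X^T X + Lambda_0)^{-1},  beta_n = V_n (X^T y + Lambda_0 beta_0)
    V_n = np.linalg.inv(X.T @ X + Lambda0)
    beta_n = V_n @ (X.T @ y + Lambda0 @ beta0)
    return beta_n, V_n
```

Sampling then proceeds exactly as before, with $\hat{\beta}_n$ and $V_n$ in place of $\hat{\beta}$ and $V_\beta$.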

Connection to Machine Learning

We often want sparsity in our model: as many coefficients as possible should be zero, thereby giving us a subset of important predictors with non-zero coefficients. We can encode this belief into our conjugate prior as

$$\beta_j \sim \mathcal{N}(0, \tau^2),$$

which pulls each $\beta_j$ toward zero.

Without this prior (i.e., using the non-informative prior), the MAP estimate is the least squares estimate. With the prior, maximizing the log posterior is equivalent to minimizing $\|y - X\beta\|_2^2$ plus a penalty term.

  1. With the sparsity Normal prior above, the penalty becomes $\lambda \|\beta\|_2^2$ with $\lambda = \sigma^2 / \tau^2$, which is called ridge regression (sketched in code after this list).
  2. With a Laplace prior $\beta_j \sim \text{Laplace}(0, b)$, we have the penalty $\lambda \|\beta\|_1$, called LASSO regression.
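
To make the ridge case concrete, a sketch of the MAP estimate under the Normal prior; the closed form follows from setting the gradient of the penalized objective to zero (in practice the intercept is usually left unpenalized, which this sketch ignores):

```python
import numpy as np

def ridge_map(X, y, sigma2, tau2):
    # MAP under beta_j ~ N(0, tau^2): argmin ||y - X b||_2^2 + lam ||b||_2^2
    lam = sigma2 / tau2
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)
```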

See ๐Ÿฆ Linear Regression and โšฝ๏ธ Regularization Penalties for more details.

Poisson Model

Regression can also be done with the 🛩️ Poisson Model. We have

$$y_i \sim \text{Poisson}(\lambda_i),$$

where $\lambda_i = e^{x_i^\top \beta}$.

Using a non-informative joint prior $p(\beta) \propto 1$, we have the posterior

$$p(\beta \mid y) \propto \prod_{i=1}^n \lambda_i^{y_i}\, e^{-\lambda_i}, \qquad \lambda_i = e^{x_i^\top \beta}.$$

We can use 🧱 Grid Sampling to get parameter samples $\beta^{(s)}$, which we can then use for prediction.
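
As an illustration for a single predictor plus an intercept, a minimal grid-sampling sketch; the grid bounds and resolution are arbitrary assumptions that should be widened until the posterior mass is clearly contained:

```python
import numpy as np

def poisson_grid_samples(x, y, n_samples=1000, grid=200, lo=-2.0, hi=2.0):
    b0 = np.linspace(lo, hi, grid)           # intercept grid
    b1 = np.linspace(lo, hi, grid)           # slope grid
    B0, B1 = np.meshgrid(b0, b1, indexing="ij")
    eta = B0[..., None] + B1[..., None] * x  # log lambda, shape (grid, grid, n)
    # log p(beta | y) = sum_i [y_i * eta_i - exp(eta_i)] + const
    log_post = (y * eta - np.exp(eta)).sum(axis=-1)
    post = np.exp(log_post - log_post.max()) # subtract max for numerical stability
    post /= post.sum()
    rng = np.random.default_rng(0)
    idx = rng.choice(post.size, size=n_samples, p=post.ravel())
    i, j = np.unravel_index(idx, post.shape)
    return np.column_stack([b0[i], b1[j]])   # rows are samples beta^(s)
```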