Overfitting occurs when a model learns the noise in the training data along with the underlying patterns; in other words, the model's performance becomes overly dependent on the particular training set, resulting in low training error but high test error.
The degree of overfitting depends on model complexity and the amount of regularization. Regularization effectively decreases model complexity, which reduces overfitting but also increases training error. This tension is commonly known as the bias-variance tradeoff.
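The effect of regularization on training error can be seen directly with ridge regression, which penalizes large weights. The following is a minimal numerical sketch using synthetic data and the closed-form ridge solution; the data, polynomial degree, and penalty values are illustrative choices, not part of the text above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: cubic signal plus noise, fit with degree-9 polynomial features.
x = rng.uniform(-1, 1, 40)
y = x**3 + 0.1 * rng.normal(size=x.size)
X = np.vander(x, 10)  # columns x^9, ..., x^0

def ridge_fit(X, y, lam):
    """Closed-form ridge regression: w = (X^T X + lam * I)^-1 X^T y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def train_mse(lam):
    w = ridge_fit(X, y, lam)
    return np.mean((X @ w - y) ** 2)

# As the penalty lam grows, the fit is more constrained,
# so training error is non-decreasing in lam.
errors = [train_mse(lam) for lam in (0.0, 0.1, 1.0, 10.0)]
```

Sweeping the penalty from 0 upward traces out one side of the tradeoff: training error rises monotonically as the model is constrained.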
Bias and Variance
Simple models don't fit the training data well but generalize better; complex models can fit the training data perfectly but generalize poorly. In this sense, simple models have high bias, while complex models have high variance.
For an estimator $\hat{\theta}$ of a quantity $\theta$, the bias is $\mathbb{E}[\hat{\theta}] - \theta$, the systematic error that persists no matter which training sample is drawn, and the variance is $\mathbb{E}[(\hat{\theta} - \mathbb{E}[\hat{\theta}])^2]$, the estimator's sensitivity to the particular training sample.
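The simple-versus-complex contrast can be demonstrated by fitting polynomials of different degrees to the same small training set. This is an illustrative sketch: the sinusoidal target, noise level, sample sizes, and degrees 1 and 15 are all assumptions made for the demo.

```python
import numpy as np

rng = np.random.default_rng(1)

def make_data(n):
    # Assumed target: sin(3x) plus Gaussian noise.
    x = rng.uniform(-1, 1, n)
    y = np.sin(3 * x) + 0.3 * rng.normal(size=n)
    return x, y

x_train, y_train = make_data(20)
x_test, y_test = make_data(200)

def poly_mse(degree):
    # Least-squares polynomial fit on the training set,
    # evaluated on both train and test data.
    coeffs = np.polyfit(x_train, y_train, degree)
    train = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    return train, test

simple_train, simple_test = poly_mse(1)     # high-bias model
complex_train, complex_test = poly_mse(15)  # high-variance model
```

The degree-15 polynomial drives training error far below the linear fit's, yet its test error stays well above its own training error: it has memorized the 20 training points rather than the signal.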
Bias-Variance Decomposition
These two terms are related by the bias-variance decomposition. For a model $\hat{f}$ trained on a random dataset, with targets $y = f(x) + \varepsilon$ where $\varepsilon$ has mean zero and variance $\sigma^2$, the expected squared error at a point $x$ decomposes as

$$\mathbb{E}\big[(y - \hat{f}(x))^2\big] = \big(\mathbb{E}[\hat{f}(x)] - f(x)\big)^2 + \mathrm{Var}\big[\hat{f}(x)\big] + \sigma^2,$$

i.e. squared bias plus variance plus irreducible noise.
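The decomposition can be checked numerically by Monte Carlo: repeatedly draw a training set, fit a model, and predict at a fixed test point. The true function, noise level, test point, and linear model below are all assumptions chosen for the simulation, not prescribed by the text.

```python
import numpy as np

rng = np.random.default_rng(2)

f = lambda x: np.sin(3 * x)  # assumed true function
sigma = 0.3                  # assumed noise standard deviation
x0 = 0.5                     # fixed test point

def fit_predict():
    # Draw a fresh training set and fit a degree-1 polynomial.
    x = rng.uniform(-1, 1, 30)
    y = f(x) + sigma * rng.normal(size=x.size)
    coeffs = np.polyfit(x, y, 1)
    return np.polyval(coeffs, x0)

# Predictions at x0 across many independent training sets.
preds = np.array([fit_predict() for _ in range(5000)])

bias_sq = (preds.mean() - f(x0)) ** 2   # squared bias
variance = preds.var()                  # variance over training sets
noise = sigma**2                        # irreducible noise

# Expected squared error at x0, averaging over training sets
# and over fresh noisy targets y0 = f(x0) + noise.
y0 = f(x0) + sigma * rng.normal(size=preds.size)
mse = np.mean((y0 - preds) ** 2)
# mse should match bias_sq + variance + noise up to Monte Carlo error
```

Since the high-bias linear model is fit to a nonlinear target, the squared-bias term dominates here; swapping in a high-degree polynomial would shift the error budget toward the variance term instead.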