Underfitting
Underfitting occurs when a machine learning model is too simple to capture the underlying pattern in the data, resulting in poor performance on both training and unseen data.
What is Underfitting?
Underfitting means the model fails to learn enough from the training data. It shows high errors during both training and testing because it cannot capture the important trends in the data.
Causes of Underfitting
- Model Too Simple: Using a linear model for data with complex, nonlinear relationships.
- Insufficient Features: Missing important input variables.
- Too Much Regularization: Over-penalizing complexity can oversimplify the model.
- Not Enough Training: Stopping training too early.
Signs of Underfitting
- Low accuracy on both training and validation/test data.
- High bias: the model makes overly strong, simple assumptions about the data.
How to Detect Underfitting
- Compare training and validation errors — both will be high.
- Plot learning curves: with underfitting, both training and validation error stay high from the start instead of decreasing as training progresses.
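As a minimal sketch of the comparison above, the snippet below (using synthetic sine-wave data, an illustrative choice) fits a straight line to nonlinear data and shows that training and validation errors are both high and close together, the signature of underfitting:

```python
import numpy as np

rng = np.random.default_rng(0)

# Nonlinear ground truth: y = sin(2*pi*x) plus a little noise (illustrative data).
x = rng.uniform(0, 1, 200)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.1, 200)

# Simple train/validation split.
x_train, x_val = x[:150], x[150:]
y_train, y_val = y[:150], y[150:]

# Fit a degree-1 polynomial (a straight line) -- too simple for this pattern.
coeffs = np.polyfit(x_train, y_train, deg=1)

def mse(xs, ys):
    return float(np.mean((np.polyval(coeffs, xs) - ys) ** 2))

train_err, val_err = mse(x_train, y_train), mse(x_val, y_val)
print(f"train MSE: {train_err:.3f}, validation MSE: {val_err:.3f}")
# Both errors are high and similar -- the model underfits rather than overfits.
```

If the model were overfitting instead, the training error would be low while the validation error stayed high; here the gap between the two is small.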
Techniques to Fix Underfitting
- Increase Model Complexity: Use more complex algorithms or add polynomial features.
- Add More Features: Use relevant input variables.
- Reduce Regularization: Allow model more flexibility.
- Train Longer: Give the model more time to learn patterns.
- Feature Engineering: Create new features that capture important information.
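Two of the fixes above, adding polynomial features and reducing regularization, can be sketched together with closed-form ridge regression on synthetic data (the cubic trend and penalty strengths are illustrative assumptions, not values from any particular library):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, 100)
y = x ** 3 - x + rng.normal(0, 0.05, 100)  # cubic trend with small noise (illustrative)

# Increase model complexity: degree-5 polynomial features (columns x^0 .. x^5).
X = np.vander(x, 6, increasing=True)

def ridge_mse(lam):
    # Closed-form ridge regression: w = (X^T X + lam * I)^-1 X^T y
    w = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)
    return float(np.mean((X @ w - y) ** 2))

heavy, light = ridge_mse(100.0), ridge_mse(1e-3)
print(f"heavy regularization MSE: {heavy:.4f}, light regularization MSE: {light:.4f}")
# The heavy penalty shrinks the weights toward zero and the model underfits;
# reducing the penalty lets the polynomial features capture the cubic trend.
```

With the heavy penalty the flexible model is forced back toward a trivial one, reproducing the "too much regularization" cause listed earlier; relaxing it restores the fit.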
Example
Fitting a straight line to data that follows a curved pattern will result in underfitting because the model is too simple to capture the curve.
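This example can be reproduced in a few lines with NumPy (the noiseless parabola is an assumed toy dataset chosen to make the effect obvious):

```python
import numpy as np

x = np.linspace(-2, 2, 50)
y = x ** 2  # curved (quadratic) pattern, noiseless for clarity

line = np.polyfit(x, y, deg=1)    # straight line: underfits the curve
curve = np.polyfit(x, y, deg=2)   # quadratic: matches the true pattern

line_err = float(np.mean((np.polyval(line, x) - y) ** 2))
curve_err = float(np.mean((np.polyval(curve, x) - y) ** 2))
print(f"line MSE: {line_err:.3f}, quadratic MSE: {curve_err:.2e}")
# The straight line has a large error even on its own training data,
# while the quadratic model drives the error to essentially zero.
```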