What is overfitting?
Overfitting occurs when the model has high variance: it performs well on the training data but generalizes poorly to the held-out evaluation set.
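As a minimal sketch of that gap (assuming tf.keras and synthetic data, both chosen purely for illustration), the example below trains an oversized network on a small dataset and compares training loss with validation loss; a validation loss well above the training loss is the usual sign of overfitting.

```python
import numpy as np
import tensorflow as tf

# Synthetic data: 200 noisy samples, far fewer than this model's capacity warrants
rng = np.random.default_rng(0)
x = rng.normal(size=(200, 20))
y = (x[:, 0] > 0).astype("float32")

model = tf.keras.Sequential([
    tf.keras.Input(shape=(20,)),
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Hold out 25% of the data for validation and compare the two loss curves
history = model.fit(x, y, validation_split=0.25, epochs=50, verbose=0)
print("final train loss:", history.history["loss"][-1])
print("final val loss:  ", history.history["val_loss"][-1])
```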
How to avoid overfitting?
In deep learning, there are several common ways to avoid overfitting:
- Regularization
- L1, L2
- Dropout
- Cross Validation
Broadly, these techniques work either by making the model forget part of what it has learned (dropout) or by penalizing the learning itself (L1/L2 regularization).
Their drawback is that they add training time, and they are inefficient to apply retroactively once training has already finished.
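As a concrete sketch (assuming tf.keras; the layer sizes and coefficients are illustrative, not prescriptive), L2 regularization and dropout can be added directly in the model definition:

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(20,)),
    # L2 penalty on the weights discourages large, over-specialized parameters
    # (use tf.keras.regularizers.l1 for an L1 penalty instead)
    tf.keras.layers.Dense(
        64, activation="relu",
        kernel_regularizer=tf.keras.regularizers.l2(1e-4),
    ),
    # Dropout randomly zeroes half of the activations each training step,
    # so the network cannot rely on any single co-adapted feature
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```

Cross-validation, by contrast, happens outside the model: the data is split into K folds, and the model is trained and evaluated once per fold to get a more reliable estimate of generalization.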
Early stopping and learning rate optimization
These are more direct ways to stop overfitting. During training, if performance on the held-out validation data stops improving from one epoch to the next, simply stop training (early stopping).
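For example (assuming tf.keras; the monitored metric and `patience` value are illustrative choices), early stopping can be attached as a callback that halts training once the validation loss stops improving:

```python
import tensorflow as tf

# Stop when val_loss has not improved for 3 consecutive epochs,
# and roll back to the best weights seen so far
early_stop = tf.keras.callbacks.EarlyStopping(
    monitor="val_loss",
    patience=3,
    restore_best_weights=True,
)

# Typical usage (x_train, y_train, x_val, y_val are your own data):
# model.fit(x_train, y_train,
#           validation_data=(x_val, y_val),
#           epochs=100,
#           callbacks=[early_stop])
```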
The other option is to reduce the learning rate when progress stalls, which moderates how aggressively the model keeps fitting (and eventually overfitting) the training data.
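Similarly (again assuming tf.keras; `factor` and `patience` are illustrative), the learning rate can be lowered automatically whenever the validation loss plateaus, instead of being adjusted by hand:

```python
import tensorflow as tf

# Halve the learning rate whenever val_loss stalls for 2 epochs,
# but never go below 1e-6
reduce_lr = tf.keras.callbacks.ReduceLROnPlateau(
    monitor="val_loss",
    factor=0.5,
    patience=2,
    min_lr=1e-6,
)

# Typical usage:
# model.fit(x_train, y_train,
#           validation_data=(x_val, y_val),
#           callbacks=[reduce_lr])
```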