Determining the Number of Times to Iterate Learning - Epoch
Much like you gain a deeper understanding of a book by reading it multiple times, a machine learning model benefits from repeated exposure to its training data: it needs to learn from the given data multiple times to make more accurate predictions.
The hyperparameter that determines how many times the model will learn from the entire dataset is the Epoch.
Why Epoch is Important
Learning from the data just once may not be sufficient for the model to grasp adequate patterns.
Hence, it should learn from the data repeatedly to adjust and make increasingly better predictions.
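The idea of repeating passes over the data can be sketched as a plain training loop. This is a minimal illustration, not a real framework API: the tiny dataset, single weight w, and learning rate are assumptions made for the example, and each iteration of the outer loop is one epoch.

```python
# Minimal sketch: the epoch hyperparameter as the outer loop of training.
# We fit a single weight w so that y ≈ w * x (the true relationship is y = 2x).

data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # (x, y) pairs, assumed toy data
w = 0.0          # model parameter, starts untrained
lr = 0.05        # learning rate (illustrative choice)
epochs = 20      # the hyperparameter this lesson is about

for epoch in range(epochs):          # one epoch = one full pass over the data
    total_loss = 0.0
    for x, y in data:
        pred = w * x
        error = pred - y
        total_loss += error ** 2
        w -= lr * 2 * error * x      # gradient step for squared error
    print(f"epoch {epoch + 1}: loss={total_loss:.4f}, w={w:.4f}")
```

With more epochs, w moves closer to the true value of 2, which is exactly the "learning repeatedly to make increasingly better predictions" described above.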
However, an excessive or insufficient number of epochs might hinder effective learning.
When the Epoch is Too Few
The model may not fully learn and might experience Underfitting.
This means it doesn’t learn the data properly, potentially reducing its ability to predict new data accurately.
When the Epoch is Too Many
The model might conform too closely to the training data, resulting in Overfitting.
In this case, it performs well on the training data but might lose accuracy when introduced to new data.
Setting the Epoch
Typically, values between 10 and 100 are used when setting epochs, but you should find the optimal value based on the data's size and complexity.
If the performance improvement plateaus as training progresses, techniques like Early Stopping can help prevent unnecessary training.
In the next lesson, we will explore Overfitting, which can occur when the number of epochs is excessively high.