Exploring Underfitting in Depth
In this lesson, we'll dive deeper into underfitting.
Underfitting occurs when an AI model does not sufficiently learn the patterns of the training data, resulting in poor performance on both the training data and new data.
Understanding Underfitting Figuratively
Imagine a child starting to learn about dinosaurs.
Initially, when they hear the word "Tyrannosaurus," they only associate it with a large animal with big teeth that walks on two legs.
If you show this child several dinosaur pictures and ask, "Pick out the Tyrannosaurus," a child who hasn't learned enough about dinosaurs might not identify it correctly. They may mistake small dinosaurs or four-legged ones for Tyrannosauruses.
This child hasn't learned enough information about Tyrannosaurus to distinguish dinosaurs properly. This state is what we refer to as underfitting.
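The same idea can be shown numerically. In this minimal sketch (with hypothetical data I made up for illustration), we fit a straight line to clearly curved data; like the child with too little knowledge, the model is too simple to capture the pattern, so even its error on the training data stays large:

```python
import numpy as np

# Hypothetical data: a clearly quadratic relationship plus a little noise.
rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 50)
y = x**2 + rng.normal(0, 0.1, size=x.shape)

# A straight line (degree-1 polynomial) cannot capture the curve,
# so even the training error stays large: the model underfits.
coeffs = np.polyfit(x, y, deg=1)
train_mse = np.mean((np.polyval(coeffs, x) - y) ** 2)
print(train_mse)  # large error on the model's own training data
```

High error on the training data itself, not just on new data, is the telltale sign of underfitting.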
Solutions to Underfitting
There are several ways to address underfitting:
1. Increase Model Complexity
Enhance the model's complexity so it can better capture the data's patterns. For example, use a model with more features or a deeper neural network with more layers.
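As a minimal sketch of this idea (using hypothetical quadratic data), we can compare a too-simple model against a slightly more complex one. Raising the polynomial degree plays the role of increasing model complexity here:

```python
import numpy as np

# Hypothetical quadratic data for illustration.
rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 50)
y = x**2 + rng.normal(0, 0.1, size=x.shape)

def train_mse(degree):
    # Fit a polynomial of the given degree and measure training error.
    coeffs = np.polyfit(x, y, degree)
    return np.mean((np.polyval(coeffs, x) - y) ** 2)

simple_mse = train_mse(1)  # too simple: underfits
richer_mse = train_mse(2)  # complex enough to match the data's shape
```

The more complex model's training error drops sharply because it can finally represent the pattern that the simpler model could not.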
2. Adjust Hyperparameters
Adjusting hyperparameters can help resolve underfitting as follows:
Learning Rate
Adjust the learning rate appropriately so that the model can fully learn.
A learning rate that's too high can make the learning unstable, but if it's too low, underfitting may occur.
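A toy gradient-descent sketch illustrates the too-low case. Here the loss is the made-up function f(w) = (w - 3)^2, whose minimum is at w = 3; with a tiny learning rate, the model barely moves in a fixed number of steps and the loss stays high:

```python
# Gradient descent on the toy loss f(w) = (w - 3)**2.
def final_loss(lr, steps=100):
    w = 0.0
    for _ in range(steps):
        grad = 2 * (w - 3)  # derivative of (w - 3)**2
        w -= lr * grad
    return (w - 3) ** 2

reasonable = final_loss(lr=0.1)  # converges close to the minimum
too_low = final_loss(lr=0.0001)  # barely moves in 100 steps: loss stays high
```

With lr=0.0001, each step shrinks the distance to the minimum by only 0.02%, so after 100 steps the model is still nearly where it started — the training budget runs out before the model has learned.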
Batch Size
The batch size must be set appropriately for the model to adequately learn the data patterns.
A batch size that's too small can make the learning unstable, but if it's too large, underfitting may result.
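One way the too-large case can play out: with a fixed number of epochs, a huge batch means very few parameter updates in total. This hypothetical sketch fits the simple rule y = 2x with mini-batch gradient descent:

```python
import numpy as np

# Hypothetical data following y = 2x; we try to learn the slope w.
rng = np.random.default_rng(0)
x = rng.normal(size=256)
y = 2 * x

def train(batch_size, epochs=5, lr=0.1):
    w = 0.0
    for _ in range(epochs):
        for i in range(0, len(x), batch_size):
            xb, yb = x[i:i + batch_size], y[i:i + batch_size]
            grad = np.mean(2 * (w * xb - yb) * xb)  # gradient of batch MSE
            w -= lr * grad
    return (w - 2) ** 2  # squared error relative to the true slope

small_batch = train(batch_size=16)  # many updates per epoch
full_batch = train(batch_size=256)  # only 5 updates in total
```

With the same epoch budget, the full-batch run performs far fewer updates, so it stops while still far from the true slope — one mechanism by which an overly large batch can leave a model undertrained.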
Number of Epochs
Increase the number of epochs so the model can learn effectively.
Too few epochs can lead to underfitting.
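The effect of stopping too early can be seen on the same toy loss f(w) = (w - 3)^2, a made-up example: with too few epochs the loss never gets close to its minimum, while enough epochs let it converge:

```python
# Gradient descent on the toy loss f(w) = (w - 3)**2,
# stopped after too few vs. enough epochs.
def loss_after(epochs, lr=0.05):
    w = 0.0
    for _ in range(epochs):
        w -= lr * 2 * (w - 3)  # one gradient step per epoch
    return (w - 3) ** 2

too_few = loss_after(epochs=3)   # stopped early: still far from the minimum
enough = loss_after(epochs=200)  # has time to converge
```

Stopping after 3 epochs leaves the loss high even though nothing else about the setup changed — training simply ended before the model finished learning.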
We've now discussed the concept of underfitting and some solutions.
In the next lesson, we'll wrap up what we've learned with a simple quiz.