Learning curves of a classification algorithm
I am trying to understand this learning curve of a classification problem, but I am not sure what to infer. I believe that I have overfitting, but I cannot be sure.
- Very low training loss that increases very slightly upon adding training examples.
- Gradually decreasing validation loss (without flattening) upon adding training examples. However, I do not see any gap at the end of the curves, something that is usually found in an overfitting model.
On the other hand, I might have underfitting as:
- The learning curve of an underfit model has a low training loss at the beginning which gradually increases upon adding training examples and then stays flat, indicating that adding more training examples cannot improve the model's performance on unseen data.
- Training loss and validation loss are close to each other at the end.
However, the training error is not too big, something that is usually found in underfitting models.
I am confused. Can you please provide me with some advice?
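For reference, this is roughly how such curves can be generated with scikit-learn's `learning_curve` (a minimal sketch: the synthetic dataset and logistic regression here are placeholders for my actual data and model, and the final train/validation gap is what I am trying to interpret):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import learning_curve

# Placeholder data standing in for the real classification problem
X, y = make_classification(n_samples=500, n_features=20, random_state=0)

# Compute cross-validated train/validation log-loss at increasing train sizes
train_sizes, train_scores, val_scores = learning_curve(
    LogisticRegression(max_iter=1000), X, y,
    train_sizes=np.linspace(0.1, 1.0, 5),
    cv=5, scoring="neg_log_loss", shuffle=True, random_state=0,
)

# scoring is negated, so flip the sign to get actual loss values
train_loss = -train_scores.mean(axis=1)
val_loss = -val_scores.mean(axis=1)

# The gap at the largest train size is the usual overfitting indicator:
# a large positive gap suggests high variance; a small gap with high
# loss on both curves suggests high bias (underfitting)
gap = val_loss[-1] - train_loss[-1]
print("train loss:", train_loss)
print("val loss:  ", val_loss)
print("final gap: ", gap)
```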
Topic: bias-variance, classification, machine-learning
Category: Data Science