How can I deal with this overfitting?
I trained my model for 40 epochs but ended up with the loss curves shown. How can I deal with this problem? I used 30,000 samples for training and 5,000 for testing, with this learning-rate schedule:
from tensorflow import keras

lr_schedule = keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=4e-4,
    decay_steps=50000,
    decay_rate=0.5)
Should I increase the amount of test data, or make changes to the model?
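For context, the schedule above halves the learning rate every 50,000 optimizer steps. Assuming a batch size of 32 (not stated in the question), 40 epochs over 30,000 samples is only about 37,500 steps, so the learning rate never quite halves during training. A minimal sketch of what the schedule computes (continuous decay, i.e. staircase=False, the Keras default):

```python
# Sketch of ExponentialDecay with the parameters from the question:
#   lr(step) = initial_learning_rate * decay_rate ** (step / decay_steps)
def lr_at(step, initial_lr=4e-4, decay_steps=50000, decay_rate=0.5):
    return initial_lr * decay_rate ** (step / decay_steps)

# Assumed batch size of 32 (hypothetical, not given in the question):
steps_per_epoch = 30000 // 32           # ~937 steps per epoch
total_steps = 40 * steps_per_epoch      # ~37,500 steps over 40 epochs

print(lr_at(0))            # 4e-4 at the start of training
print(lr_at(50000))        # 2e-4 after one full decay period
print(lr_at(total_steps))  # still above 2e-4 when training ends
```

So under this assumption the learning rate stays fairly high for the whole run, which is worth knowing when judging the loss curve.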
EDIT
After adding regularization I got the new curve shown, and the loss now starts from a higher value than it did before. Is that normal?
Is this good training or is there still a problem?
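One common reason the loss starts higher after adding regularization: with an L2 (weight decay) penalty, the value Keras reports is the data loss plus the penalty term, so the same weights produce a larger number. A minimal sketch of that arithmetic (all values below are made up for illustration, not taken from the question's model):

```python
# Why the reported loss can start higher after L2 regularization:
# reported_loss = data_loss + l2_factor * sum(w**2 for each weight)
weights = [0.8, -1.2, 0.5, 2.0]   # hypothetical model weights
data_loss = 1.3                   # hypothetical cross-entropy loss
l2_factor = 0.01                  # e.g. kernel_regularizer=l2(0.01)

penalty = l2_factor * sum(w * w for w in weights)
reported_loss = data_loss + penalty

print(round(penalty, 4))        # 0.0633
print(round(reported_loss, 4))  # 1.3633 -- higher than data_loss alone
```

So a higher starting loss by itself is expected; what matters is whether the gap between training and validation loss has narrowed.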
Topic: learning-rate, overfitting, deep-learning
Category: Data Science