How can I deal with this overfitting?

I trained my model for 40 epochs but ended up with the shape shown in the plot. How can I deal with this problem? I used 30,000 samples for training and 5,000 for testing, with this learning-rate schedule:

from tensorflow import keras

# Decay the learning rate smoothly by a factor of 0.5 every 50,000 steps
lr_schedule = keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=4e-4,
    decay_steps=50000,
    decay_rate=0.5)
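
(For context, a schedule like this is passed to the optimizer in place of a fixed learning rate; a minimal sketch, assuming an Adam optimizer and an already-built model, with placeholder names:)

optimizer = keras.optimizers.Adam(learning_rate=lr_schedule)
model.compile(optimizer=optimizer,
              loss="sparse_categorical_crossentropy",  # assumed loss
              metrics=["accuracy"])

Keras evaluates the schedule once per optimizer step, so the rate halves per 50,000 training steps, not per 50,000 epochs.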

Should I increase the amount of test data, or make changes to the model?

EDIT

After I added regularization I got the shape shown, and the loss now starts from a higher value than it did before. Is that normal?

Is this good training or is there still a problem?

Topic: learning-rate, overfitting, deep-learning

Category: Data Science


Here are some proposals. I would need to see the code to be more specific.

Did you shuffle your data before splitting it into training and validation parts?
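
For example, a shuffled split can be done with scikit-learn; a minimal sketch, assuming feature and label arrays X and y (hypothetical names):

from sklearn.model_selection import train_test_split

# shuffle=True (the default) randomizes the sample order before splitting,
# so the training and validation sets come from the same distribution
X_train, X_val, y_train, y_val = train_test_split(
    X, y, test_size=5000, shuffle=True, random_state=42)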

Have you applied any dropout in your model?
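
In Keras, dropout is added as a layer between the existing layers; a minimal sketch with an assumed architecture (your layer sizes and output classes will differ):

from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    layers.Dense(128, activation="relu"),
    layers.Dropout(0.1),  # drops 10% of activations at each training step
    layers.Dense(64, activation="relu"),
    layers.Dropout(0.1),
    layers.Dense(10, activation="softmax"),  # 10 classes assumed
])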

Did you normalize the data?
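
For instance, inputs can be standardized with a Keras Normalization layer fitted on the training data only; a minimal sketch, assuming NumPy arrays X_train and X_val (hypothetical names):

from tensorflow.keras import layers

# adapt() learns the mean and variance of the training set; reusing the
# same layer on the validation set avoids leaking validation statistics
normalizer = layers.Normalization()
normalizer.adapt(X_train)
X_train = normalizer(X_train)
X_val = normalizer(X_val)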

It seems that your model sees quite different sets of data during training and validation; shuffling the samples before splitting could solve your issue. On the other hand, a 10% dropout can often reduce overfitting because it randomly deactivates a fraction of the network's units at each training step, preventing the weights from co-adapting too strongly. A lack of normalization could also confine the neurons to specific ranges of the data and explain the poor results on the validation set.
