Training loss is stuck for the first few epochs but then starts decreasing. What could be the reason?

I am training a model and ran into an odd problem: for the first 4 epochs, the loss did not change at all, but after that it started decreasing. Could this be because of a high learning rate, a local minimum, or something else, such as a regularisation parameter being set too high?

Topic: mlp, deep-learning, neural-network, classification, machine-learning

Category: Data Science


It could be any of those factors. The most reliable way to find the root cause is controlled experimentation: hold everything else constant and change a single factor at a time, then repeat this for each candidate factor (learning rate, initialisation, regularisation strength, and so on) until the flat early epochs disappear. A sketch of such an experiment is given below.
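As one illustration, here is a minimal sketch of that procedure in Keras: a learning-rate sweep in which the data, architecture, and random seed are held fixed so that the learning rate is the only variable between runs. The synthetic dataset, layer sizes, and learning-rate grid are hypothetical stand-ins, not details from the original question.

```python
import numpy as np
from tensorflow import keras

# Hypothetical synthetic binary-classification data, standing in for the real dataset
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20)).astype("float32")
y = (X[:, 0] + X[:, 1] > 0).astype("float32")

def build_model():
    # Same small MLP every run, so the learning rate is the only thing that varies
    return keras.Sequential([
        keras.Input(shape=(20,)),
        keras.layers.Dense(32, activation="relu"),
        keras.layers.Dense(1, activation="sigmoid"),
    ])

for lr in [1e-1, 1e-2, 1e-3, 1e-4]:
    keras.utils.set_random_seed(0)  # identical initial weights for every run
    model = build_model()
    model.compile(optimizer=keras.optimizers.SGD(learning_rate=lr),
                  loss="binary_crossentropy")
    history = model.fit(X, y, epochs=8, batch_size=32, verbose=0)
    print(f"lr={lr:.0e}  per-epoch loss: "
          f"{[round(l, 4) for l in history.history['loss']]}")
```

If the plateau in the early epochs disappears at a smaller learning rate, the rate was likely the culprit; if not, repeating the same loop while varying only the regularisation strength or the weight initialisation isolates the other candidates one at a time.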
