Training loss stuck during the first few epochs but then starts decreasing. What could be the reason?
I am training a model and ran into an odd problem: for the first 4 epochs the loss did not change at all, but after that it started decreasing. Could this be caused by a high learning rate, a local minimum, or something else, such as a regularisation parameter being set too high?
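To illustrate what I am checking, here is a minimal sketch (PyTorch with a toy MLP and synthetic data, not my actual model or dataset; the SGD optimiser and lr=0.1 are just placeholders) of how I log the loss and the total gradient norm each epoch, to see whether the updates are doing anything during the flat epochs:

```python
import torch
import torch.nn as nn

# Toy stand-ins for my real data and model (the actual ones are larger).
X = torch.randn(512, 20)
y = torch.randint(0, 2, (512,))
model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 2))
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)  # lr is illustrative only

for epoch in range(10):
    optimizer.zero_grad()
    loss = criterion(model(X), y)
    loss.backward()
    # Total gradient norm: near-zero would suggest saturated units / vanishing
    # gradients, while very large values would suggest the learning rate is too high.
    grad_norm = torch.norm(
        torch.stack([p.grad.norm() for p in model.parameters() if p.grad is not None])
    )
    optimizer.step()
    print(f"epoch {epoch}: loss={loss.item():.4f} grad_norm={grad_norm.item():.4f}")
```

In my real run the loss printed this way is essentially constant for the first 4 epochs, which is why I suspect either the learning rate or the regularisation settings.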
Topic mlp deep-learning neural-network classification machine-learning
Category Data Science