Learning rate of 0 still changes weights in Keras

I just trained a model with Keras using SGD and was wondering why the epoch-to-epoch change in accuracy and loss doesn't shrink much when I lower the learning rate. So I tested what happens when I set the learning rate to 0, and to my surprise, accuracy and loss still changed from epoch to epoch. I can't find an explanation for that. Does anyone know why this could be happening?

Tags: sgd, learning-rate, keras



If the learning rate is set low, training progresses very slowly because each step makes only a tiny update to the weights. If it is set too high, the loss can oscillate or even diverge. So with a lower learning rate you need a higher number of epochs to reach the same loss.
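To make that concrete, here is a minimal sketch of the plain SGD update rule (no momentum), with a made-up scalar weight and fixed gradient purely for illustration. The step size scales directly with the learning rate, and a rate of 0 produces no update at all:

```python
# Plain SGD update: w_new = w - lr * gradient
def sgd_step(w, g, lr):
    return w - lr * g  # the entire update is scaled by the learning rate

w, g = 1.0, 0.5  # hypothetical weight and gradient
for lr in (0.1, 0.01, 0.0):
    print(f"lr={lr}: {sgd_step(w, g, lr)}")
# lr=0.1:  0.95   (noticeable step)
# lr=0.01: 0.995  (tiny step; needs many more epochs)
# lr=0.0:  1.0    (the optimizer itself changes nothing)
```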

The reason the metrics change when you set the learning rate to 0 is batch normalization. BatchNormalization layers keep moving averages of the activation mean and variance, and these statistics are updated on every forward pass in training mode, independently of the optimizer and its learning rate. So even with a learning rate of 0, the model's behavior, and therefore its loss and accuracy, changes from epoch to epoch. If you have batch normalization in your model, remove it and try again.
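Here is a runnable sketch (assuming TensorFlow 2.x / tf.keras; the layer sizes and random data are arbitrary) that demonstrates this: with the learning rate at 0, the Dense kernel stays fixed after an epoch of training, but the BatchNormalization layer's moving_mean still changes:

```python
import numpy as np
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(4, input_shape=(8,)),
    tf.keras.layers.BatchNormalization(),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
# Learning rate 0: the optimizer applies no updates to trainable weights.
model.compile(optimizer=tf.keras.optimizers.SGD(learning_rate=0.0),
              loss="binary_crossentropy", metrics=["accuracy"])

x = np.random.randn(256, 8).astype("float32")
y = np.random.randint(0, 2, size=(256, 1))

bn = model.layers[1]
kernel_before = model.layers[0].get_weights()[0].copy()
mean_before = bn.moving_mean.numpy().copy()

model.fit(x, y, epochs=1, verbose=0)

# The trainable kernel is untouched, but the BatchNormalization moving
# statistics were updated during the forward passes in training mode.
print("kernel changed:     ",
      not np.allclose(kernel_before, model.layers[0].get_weights()[0]))
print("moving_mean changed:",
      not np.allclose(mean_before, bn.moving_mean.numpy()))
# Expected output: kernel changed: False, moving_mean changed: True
```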
