What issue is there, when training this network with gradient descent?

Suppose we have a fully connected network made of perceptrons with the sign function as the activation unit. What issue arises when trying to train this network with gradient descent?

Topic: perceptron, backpropagation, gradient-descent

Category: Data Science



The activation function is the sign (signum) function, slightly modified. Its derivative is 0 at every point where it is defined, and undefined at the discontinuity at 0.

Hence gradient descent cannot make progress in updating the weights: every gradient propagated backward through a sign unit is zero, so backpropagation fails.
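A minimal sketch of the problem (the function names here are illustrative, not from the question): the derivative of sign(x) is 0 everywhere except at the discontinuity, so by the chain rule the gradient reaching the weights is always zero and the update step never changes them.

```python
def sign(x):
    # Slightly modified sign function: maps 0 to +1 instead of 0
    return 1.0 if x >= 0 else -1.0

def numerical_derivative(f, x, h=1e-6):
    # Central-difference approximation of f'(x)
    return (f(x + h) - f(x - h)) / (2 * h)

# Away from the discontinuity at 0 the derivative is exactly 0,
# so a gradient-descent update  w -= lr * grad  never changes w.
for x in [-2.0, -0.5, 0.5, 3.0]:
    print(x, numerical_derivative(sign, x))  # derivative is 0.0 at each point
```

This is why differentiable activations such as the sigmoid or tanh (which have nonzero derivatives almost everywhere) replaced the sign function once backpropagation became the standard training method.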
