Dropout for convolutional layers in Keras
According to the classical dropout paper
http://www.cs.toronto.edu/~rsalakhu/papers/srivastava14a.pdf
the dropout operation affects not only the training step but also the test step: at test time we need to multiply all of a neuron's outgoing weights by the retain probability p.
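For concreteness, here is a minimal numpy sketch of the scheme as I understand it from the paper (the function names and the retain probability p below are my own illustration, not Keras code):

import numpy as np

rng = np.random.default_rng(0)

def dropout_train(x, p):
    # keep each unit with probability p, drop it otherwise
    mask = rng.binomial(1, p, size=x.shape)
    return x * mask

def dropout_test(x, p):
    # at test time, scale outputs by p instead of sampling a mask
    return x * p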
But in the Keras library I found the following implementation of the dropout operation:
retain_prob = 1. - level  # level is the dropout rate, so this is the keep probability
...
# sample a binary mask: 1 with probability retain_prob, 0 otherwise
random_tensor = rng.binomial(x.shape, p=retain_prob, dtype=x.dtype)
...
x *= random_tensor  # zero out the dropped units
x /= retain_prob    # scale the surviving units by 1 / retain_prob
return x
(see https://github.com/fchollet/keras/blob/master/keras/backend/theano_backend.py)
Why is x divided by retain_prob when it seems it should be multiplied? Or am I just confused, and is multiplying the weights at test time equivalent to dividing the outputs during training?
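Here is a quick numpy sanity check of the equivalence I suspect (a minimal sketch with variable names of my own choosing, not Keras code):

import numpy as np

rng = np.random.default_rng(0)
x = np.ones(1_000_000)
retain_prob = 0.8

# Keras-style scheme: mask and rescale at training time
mask = rng.binomial(1, retain_prob, size=x.shape)
keras_train = x * mask / retain_prob  # expected value ~ x, so no test-time scaling

# paper-style scheme: plain mask at training, multiply by p at test time
paper_train = x * mask                # expected value ~ retain_prob * x
paper_test = x * retain_prob          # matches the training expectation

print(keras_train.mean())  # ~1.0
print(paper_train.mean())  # ~0.8
print(paper_test.mean())   # 0.8

So in expectation the two schemes seem to preserve activations in the same way, just with the scaling moved from test time to training time. Is that the right reading?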
Topic keras dropout theano neural-network machine-learning
Category Data Science