Can I apply dropout in layers other than fully connected layers in a CNN?

I have read and seen that in a CNN we apply a dropout layer between the fully connected layers to reduce overfitting.

Can we also apply a dropout layer between the convolutional layers and the pooling layers? I have not seen models that do this. Will it help with overfitting when applied between these layers, or are there any disadvantages to it?

Topic cnn dropout deep-learning

Category Data Science


In a CNN, each convolutional filter (loosely, each "neuron") produces one feature map. Spatial dropout, the variant normally used for CNNs, operates per filter rather than per individual activation, so dropping a filter means dropping its entire feature map.

Pooling usually operates separately on each feature map, so with spatial dropout it should not make any difference whether you apply the dropout before or after pooling.
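As a minimal sketch of what this looks like in practice (assuming PyTorch and a made-up toy block, not any particular published architecture), `nn.Dropout2d` implements spatial dropout by zeroing whole channels, and it can be placed between the convolution and the pooling layer:

```python
import torch
import torch.nn as nn

# Hypothetical conv block with spatial dropout between conv and pool.
class ConvBlock(nn.Module):
    def __init__(self, in_channels, out_channels, p=0.2):
        super().__init__()
        self.conv = nn.Conv2d(in_channels, out_channels, kernel_size=3, padding=1)
        self.relu = nn.ReLU()
        # nn.Dropout2d zeroes entire feature maps (channels) at random;
        # p is the probability of dropping a channel.
        self.drop = nn.Dropout2d(p=p)
        self.pool = nn.MaxPool2d(kernel_size=2)

    def forward(self, x):
        x = self.relu(self.conv(x))
        x = self.drop(x)       # drop whole feature maps before pooling
        return self.pool(x)

block = ConvBlock(3, 16)
out = block(torch.randn(8, 3, 32, 32))  # -> shape (8, 16, 16, 16)
```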

Yes, it should help prevent overfitting. An added advantage is that you can get uncertainty estimates from the network at essentially no extra cost by keeping dropout active at test time (Monte Carlo dropout).
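A hedged sketch of that last point, assuming the `ConvBlock` from the previous snippet: leave the dropout layers on at inference and average several stochastic forward passes, reading the spread as a rough uncertainty estimate.

```python
import torch
import torch.nn as nn

def mc_dropout_predict(model, x, n_samples=20):
    """Average several stochastic forward passes with dropout left on."""
    model.eval()
    # Re-enable only the dropout layers so other layers (e.g. batch norm)
    # stay in evaluation mode.
    for m in model.modules():
        if isinstance(m, (nn.Dropout, nn.Dropout2d)):
            m.train()
    with torch.no_grad():
        preds = torch.stack([model(x) for _ in range(n_samples)])
    # Mean as the prediction, standard deviation as a rough uncertainty.
    return preds.mean(dim=0), preds.std(dim=0)

# Usage with the ConvBlock sketched above:
# mean, std = mc_dropout_predict(block, torch.randn(8, 3, 32, 32))
```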
