Should I set a higher dropout probability if there is plenty of data?
I have an excessive amount of data relative to the size of the NN I am able to train in a reasonable time.
If I feed all the data into the network, it stops learning at some point and the resulting model shows all the signs of being overfit.
Intuitively, if I increase the dropout probability, the model should learn less aggressively from the data and benefit from having more data fed into it.
Is my logic sound?
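For concreteness, here is a minimal sketch of what I mean by "increasing the dropout probability" (assuming PyTorch; the architecture, layer sizes, and p values are placeholders, not my actual model):

```python
import torch.nn as nn

# Minimal sketch (hypothetical architecture, placeholder sizes): the only
# difference between the two variants below is the dropout probability p.
def make_mlp(dropout_p: float) -> nn.Sequential:
    return nn.Sequential(
        nn.Linear(128, 256),
        nn.ReLU(),
        nn.Dropout(p=dropout_p),   # higher p -> stronger regularization
        nn.Linear(256, 10),
    )

baseline_model = make_mlp(dropout_p=0.2)   # weaker regularization
regularized_model = make_mlp(dropout_p=0.5)  # more aggressive regularization
```

The idea would be to train the higher-p variant on the larger dataset and compare validation curves.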
Topic overfitting dropout regularization deep-learning neural-network
Category Data Science