Implementing Dropout for Recurrent Layers in Keras + Theano
I am looking to implement recurrent dropout (where the recurrent connections between the memory units of a recurrent layer such as an LSTM, GRU, or SimpleRNN are randomly set to 0) in Keras 2.3.1 with the Theano backend on Python 3.6.
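For reference, this is roughly the behaviour I am after, using the standard `recurrent_dropout` argument on Keras recurrent layers (a minimal sketch; the layer sizes and input dimensions are placeholders):

```python
from keras.models import Sequential
from keras.layers import LSTM, Dense

# Placeholder dimensions for illustration only
timesteps, features = 20, 8

model = Sequential([
    # recurrent_dropout drops the recurrent (state-to-state) connections;
    # dropout drops the input connections of the layer
    LSTM(32, dropout=0.2, recurrent_dropout=0.2,
         input_shape=(timesteps, features)),
    Dense(1, activation='sigmoid'),
])
model.compile(loss='binary_crossentropy', optimizer='adam')
```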
As of Keras 2.1.3, dropout for recurrent layers is no longer supported for the Theano backend; the release notes state:
Documenting this change further: the motivation for removing this feature in Theano is simply that although it would be technically possible to make it work, it would be hacky, i.e. it would reduce code readability, code maintainability, and importantly, it would be bug-prone. Since Theano development has been discontinued, we expect increasingly less Keras users to rely on the Theano backend, and thus the trade-off between supporting RNN dropout in Theano and having a nice and bug-free RNN codebase is tipping towards the latter.
Unfortunately, I am unable to use the TensorFlow backend due to server limitations, nor can I roll back to an earlier version of Keras.
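The closest backend-agnostic approximation I can think of is ordinary dropout applied to the inputs and between stacked recurrent layers, sketched below, but this only drops input/output connections rather than the recurrent connections I am asking about (dimensions are again placeholders):

```python
from keras.models import Sequential
from keras.layers import LSTM, Dropout, Dense

timesteps, features = 20, 8  # placeholder dimensions

model = Sequential([
    LSTM(32, return_sequences=True, input_shape=(timesteps, features)),
    Dropout(0.2),  # drops activations passed to the next layer,
                   # not the recurrent state-to-state connections
    LSTM(32),
    Dropout(0.2),
    Dense(1, activation='sigmoid'),
])
model.compile(loss='binary_crossentropy', optimizer='adam')
```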
Any suggestions on how this may be implemented?
Tags: keras, theano, rnn, neural-network
Category: Data Science