How to tune the learning rate with the HParams Dashboard in TensorFlow?
The TensorFlow documentation shows how to tune several hyperparameters with the HParams Dashboard, but not the learning rate. I have searched for how to tune the learning rate this way but could not find much. The only example I found is a question on GitHub, but the approach there does not work for me. Can you please give me some suggestions? Should I use a callback function, should I provide different learning rates through the optimizer (as in the GitHub question), or something else?
Part of my code is below:
import tensorflow as tf
from tensorboard.plugins.hparams import api as hp

HP_NUM_UNITS = hp.HParam('num_units', hp.Discrete([16, 32]))
HP_OPTIMIZER = hp.HParam('optimizer', hp.Discrete(['adam', 'sgd']))
HP_L_RATE = hp.HParam('learning_rate', hp.Discrete([0.0005, 0.001]))

def train_model(hparams):
    model = tf.keras.models.Sequential([
        tf.keras.layers.InputLayer(input_shape=(None, 43), dtype=tf.float64),
        tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(hparams[HP_NUM_UNITS], return_sequences=True)),
        tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(hparams[HP_NUM_UNITS], activation='relu')),
        tf.keras.layers.Dense(1)])
    model.compile(optimizer=hparams[HP_OPTIMIZER], loss='mae', metrics=['accuracy'])
    # This is where I tried to apply the learning rate, but it does not work:
    callback = tf.keras.callbacks.LearningRateScheduler(hparams[HP_L_RATE])
    model.fit(train_dataset, epochs=100, callbacks=[callback])  # , validation_data=test_dataset
    loss, _ = model.evaluate(test_dataset)  # evaluate returns [loss, accuracy]
    return loss
The problem is that I cannot figure out how and where to use hparams[HP_L_RATE] inside train_model(). As you can see, I tried to pass it to a LearningRateScheduler callback, but that does not work.
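For clarity, this sketch is roughly what I have in mind for the second option (building the optimizer object myself with the learning rate from hparams, instead of passing the optimizer name as a string to compile), but I am not sure whether this is the right approach or whether the learning rate will then show up correctly in the HParams dashboard:

# Possible alternative inside train_model(), replacing the compile call above
# (just a sketch of the idea, not tested):
if hparams[HP_OPTIMIZER] == 'adam':
    optimizer = tf.keras.optimizers.Adam(learning_rate=hparams[HP_L_RATE])
else:
    optimizer = tf.keras.optimizers.SGD(learning_rate=hparams[HP_L_RATE])
model.compile(optimizer=optimizer, loss='mae', metrics=['accuracy'])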
Thank you,
Topic hyperparameter-tuning learning-rate tensorflow deep-learning
Category Data Science