Activation Function Hyperparameter Optimisation

If I have a model, say:

  import tensorflow as tf
  from tensorflow.keras import Sequential
  from tensorflow.keras.layers import Dense

  def build_model(self, hp):
      model = Sequential()
      model.add(Dense(hp.Choice('units', [12, 16, 20, 24]),
                      activation=hp.Choice('activation', ['elu',
                          'exponential', 'gelu', 'hard_sigmoid',
                          'linear', 'relu', 'selu', 'sigmoid',
                          'softmax', 'softplus', 'softsign',
                          'swish', 'tanh'])))
      model.add(Dense(4,
                      activation=hp.Choice('activation', ['elu',
                          'exponential', 'gelu', 'hard_sigmoid',
                          'linear', 'relu', 'selu', 'sigmoid',
                          'softmax', 'softplus', 'softsign',
                          'swish', 'tanh'])))
      optimizer = tf.keras.optimizers.SGD(learning_rate=1e-5)
      model.compile(loss='mse', optimizer=optimizer, metrics=['mse'])
      return model

and I want to search the space where the activation function can differ on each layer, I believe that hp.Choice will choose only one activation function for the whole model each time I run a hyperparameter optimisation, since both layers reuse the same 'activation' name.
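
(To illustrate what I mean, here is a minimal sketch of how I understand the shared name behaves, assuming the keras_tuner package; within a single trial, repeated Choice calls with the same name return the same value:)

  import keras_tuner

  hp = keras_tuner.HyperParameters()
  a1 = hp.Choice('activation', ['relu', 'tanh'])
  a2 = hp.Choice('activation', ['relu', 'tanh'])
  # Both calls register/look up the same hyperparameter name, so they
  # return the same value -- the whole model gets one activation.
  assert a1 == a2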

How do I set it up so that the model can choose a potentially different activation for each layer? I currently can't see any documentation about this. I also looked at defining my own new metrics and inserting the activations that way, but it is suggested that adding new metrics for hyperparameters is not advised.
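
Something like the following is what I am imagining; a minimal, untested sketch that assumes giving each layer's hp.Choice a unique name (the names 'activation_1' and 'activation_2' are my own) makes the tuner sample them independently:

  def build_model(self, hp):
      activations = ['elu', 'exponential', 'gelu', 'hard_sigmoid',
                     'linear', 'relu', 'selu', 'sigmoid', 'softmax',
                     'softplus', 'softsign', 'swish', 'tanh']
      model = Sequential()
      # A distinct hyperparameter name per layer, so each layer's
      # activation would be tuned independently (names are hypothetical).
      model.add(Dense(hp.Choice('units', [12, 16, 20, 24]),
                      activation=hp.Choice('activation_1', activations)))
      model.add(Dense(4, activation=hp.Choice('activation_2', activations)))
      model.compile(loss='mse',
                    optimizer=tf.keras.optimizers.SGD(learning_rate=1e-5),
                    metrics=['mse'])
      return model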

Thanks!

Topic hyperparameter-tuning machine-learning-model keras hyperparameter machine-learning
