Multiple activation functions with TensorFlow estimator DNNClassifier

I just want to know whether it is possible to use tf.estimator.DNNClassifier with multiple different activation functions. That is, could I use a DNNClassifier estimator that uses a different activation function for each layer?

For example, if I have a three-layer model, could I use a sigmoid function for the first layer, a ReLU function for the second, and a tanh function for the last one?

If it isn't possible with DNNClassifier, I would like to know how I can do it in an easy way.

Topic tensorflow python machine-learning

Category Data Science


So this is a very old question, but for anyone coming from Google:

According to the documentation provided by TensorFlow, tf.estimator.DNNClassifier has an activation_fn parameter, which is described as:

Activation function applied to each layer. If None, will use tf.nn.relu

Therefore, this model only takes one activation function and uses it on all layers. That said, TensorFlow notes:

Warning: Estimators are not recommended for new code. Estimators run v1.Session-style code which is more difficult to write correctly, and can behave unexpectedly, especially when combined with TF 2 code. Estimators do fall under our compatibility guarantees, but will receive no fixes other than security vulnerabilities. See the migration guide for details.

Therefore, to solve this problem while staying compatible with TensorFlow's recommendations, you can use Keras to create the desired DNN layer by layer, as shown in this example:

import tensorflow as tf

# Keras functional API: each Dense layer accepts its own activation function
inputs = tf.keras.Input(shape=(3,))
x = tf.keras.layers.Dense(4, activation=tf.nn.relu)(inputs)
outputs = tf.keras.layers.Dense(5, activation=tf.nn.softmax)(x)
model = tf.keras.Model(inputs=inputs, outputs=outputs)

The activation function for each layer can be specified when creating that layer. For more on keras.layers, see here.
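To answer the original question directly, here is a sketch of a three-layer model using sigmoid, ReLU, and tanh, one activation per layer. The input shape (3,) and the layer widths (8, 4, 1) are illustrative assumptions, not values from the question:

```python
import tensorflow as tf

# Three Dense layers, each with a different activation function,
# matching the sigmoid -> ReLU -> tanh setup from the question.
# Input shape and layer sizes are arbitrary choices for illustration.
inputs = tf.keras.Input(shape=(3,))
x = tf.keras.layers.Dense(8, activation=tf.nn.sigmoid)(inputs)
x = tf.keras.layers.Dense(4, activation=tf.nn.relu)(x)
outputs = tf.keras.layers.Dense(1, activation=tf.nn.tanh)(x)
model = tf.keras.Model(inputs=inputs, outputs=outputs)

# A batch of two samples produces two scalar outputs
print(model(tf.zeros((2, 3))).shape)
```

Activations can also be passed as strings (e.g. activation="sigmoid"), which Keras resolves to the same functions.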
