So this is a very old question, but for anyone coming from Google:
According to the documentation provided by TensorFlow, tf.estimator.DNNClassifier
has the parameter activation_fn,
which is described as:
Activation function applied to each layer. If None, will use tf.nn.relu
Therefore, this model takes only one activation function and applies it to every layer. That said, TensorFlow notes:
Warning: Estimators are not recommended for new code. Estimators run v1.Session-style code which is more difficult to write correctly, and can behave unexpectedly, especially when combined with TF 2 code. Estimators do fall under our compatibility guarantees, but will receive no fixes other than security vulnerabilities. See the migration guide for details.
Therefore, to solve this problem while following TensorFlow's recommendations, one can use Keras to build the desired DNN layer by layer, as in this example:
import tensorflow as tf

inputs = tf.keras.Input(shape=(3,))                              # 3 input features
x = tf.keras.layers.Dense(4, activation=tf.nn.relu)(inputs)      # hidden layer with ReLU
outputs = tf.keras.layers.Dense(5, activation=tf.nn.softmax)(x)  # output layer with softmax
model = tf.keras.Model(inputs=inputs, outputs=outputs)
Each layer's activation function is specified when that layer is created, so different layers can freely use different activations. For more on keras.layers, see here.
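To confirm such a model actually runs, one can compile it and push a batch through it. A minimal sketch, where the optimizer, loss, and random input data are illustrative choices, not part of the original answer:

```python
import numpy as np
import tensorflow as tf

# Same architecture as above: ReLU hidden layer, softmax output layer.
inputs = tf.keras.Input(shape=(3,))
x = tf.keras.layers.Dense(4, activation=tf.nn.relu)(inputs)
outputs = tf.keras.layers.Dense(5, activation=tf.nn.softmax)(x)
model = tf.keras.Model(inputs=inputs, outputs=outputs)

# Illustrative compile settings for a classification task.
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# Forward pass on random data: 2 samples, 3 features each.
batch = np.random.rand(2, 3).astype("float32")
probs = model.predict(batch)
print(probs.shape)  # one 5-class softmax distribution per sample
```

Because the final layer uses softmax, each row of `probs` is a probability distribution over the 5 output classes.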