How to calculate the memory footprint of a fully connected neural network for each layer?
I'm using MADDPG (an RL algorithm) and trying to find out the memory footprint of each layer in its network. MADDPG: https://github.com/openai/maddpg. The neural network is defined as follows:
import tensorflow as tf
import tensorflow.contrib.layers as layers

def mlp_model(input, num_outputs, scope, reuse=False, num_units=64, rnn_cell=None):
    # This model takes as input an observation and returns values of all actions
    with tf.variable_scope(scope, reuse=reuse):
        out = input
        out = layers.fully_connected(out, num_outputs=num_units, activation_fn=tf.nn.relu)   # hidden layer 1
        out = layers.fully_connected(out, num_outputs=num_units, activation_fn=tf.nn.relu)   # hidden layer 2
        out = layers.fully_connected(out, num_outputs=num_outputs, activation_fn=None)       # linear output layer
        return out
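My understanding is that each fully_connected call creates a weight matrix of shape [in_dim, out_dim] plus a bias vector of length out_dim under the given variable scope, so one way to list what TF 1.x actually allocates might be the sketch below (the scope name argument and the 4-bytes-per-value float32 assumption are mine, not from the MADDPG code):

import numpy as np
import tensorflow as tf

def report_scope_footprint(scope_name):
    # List the trainable variables created under one variable scope
    # and their approximate size on disk/in memory as float32.
    for v in tf.trainable_variables(scope_name):
        n = int(np.prod(v.get_shape().as_list()))
        print("{}: {} parameters, ~{} bytes as float32".format(v.name, n, n * 4))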
I want to calculate the memory footprint of each layer during training.
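Is a back-of-the-envelope estimate along these lines correct? Per layer, parameters = in_dim * out_dim + out_dim, and during training each parameter is typically held several times over (its value, its gradient, and two Adam slots), plus batch_size * out_dim activations cached for backprop. In the sketch below, obs_dim, num_outputs and batch_size are just example numbers, and float32 with the Adam optimizer is assumed:

BYTES = 4  # assuming float32 everywhere

def layer_footprint(in_dim, out_dim, batch_size, adam_slots=2):
    params = in_dim * out_dim + out_dim                   # weights + biases
    param_copies = (1 + 1 + adam_slots) * params * BYTES  # values + gradients + Adam m, v
    activations = batch_size * out_dim * BYTES            # outputs cached for backprop
    return params, param_copies + activations

obs_dim, num_outputs, num_units, batch_size = 16, 5, 64, 1024  # example numbers only
dims = [(obs_dim, num_units), (num_units, num_units), (num_units, num_outputs)]
for i, (d_in, d_out) in enumerate(dims, 1):
    params, approx_bytes = layer_footprint(d_in, d_out, batch_size)
    print("layer {}: {} params, ~{:.1f} KiB during training".format(i, params, approx_bytes / 1024))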
Topic: tensorflow, memory, reinforcement-learning, python
Category: Data Science