GradCAM heatmap all negative
I've fully trained a VGG16 model on my dataset, reaching 97% validation accuracy. I'm using the code from this github: https://github.com/PowerOfCreation/keras-grad-cam but for 2 of my 16 classes I'm getting all-negative gradients, resulting in a heatmap that is entirely null. The other classes generate normal, consistent maps with clear, similar patterns.
The problem clearly occurs here: when every gradient is negative, `np.maximum(cam, 0)` zeroes the whole map, so `np.max(cam)` is 0 and the division produces NaNs.
cam = np.maximum(cam, 0)
heatmap = cam / np.max(cam)
The only workaround I've found so far is rescaling `cam` to (-1, 1). That gives me a map for those 2 classes without significantly altering the maps for the other 14.
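One way to make that rescaling robust is a plain min-max normalization, which maps any `cam` (all-negative or not) into [0, 1] and guards against a flat map. This is a minimal sketch, not the repo's code; the function name and epsilon are my own:

```python
import numpy as np

def normalize_cam(cam, eps=1e-8):
    """Min-max rescale a Grad-CAM map to [0, 1].

    Unlike `np.maximum(cam, 0) / np.max(cam)`, this never divides
    by zero when every value in `cam` is negative.
    """
    cam = cam - cam.min()        # shift so the minimum is 0
    denom = cam.max()
    if denom < eps:              # completely flat map: return all zeros
        return np.zeros_like(cam)
    return cam / denom
```

Note that dropping the ReLU changes the interpretation slightly: regions that argue *against* the class now show up as low (but nonzero) values instead of being clipped to 0.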
Here is my model:
vgg16 = VGG16(include_top=False, weights=None, input_shape=(INPUT_SIZE, INPUT_SIZE, 3))
last_layer = vgg16.output
last_layer = Flatten(name='flatten')(last_layer)
last_layer = Dense(NUM_CLASSES, activation='softmax')(last_layer)
model = Model(inputs=vgg16.input, outputs=last_layer)
I originally had additional dense layers but removed them while trying to solve this; since the last conv layer is the last layer of VGG16, it makes no difference. I've tried running the Grad-CAM both from the original training script and by loading the saved model from '.h5'.
Is this right? Are all the gradients for those classes really negative? Is rescaling an acceptable solution? I don't know where the error comes from; I suspect it's in this area, but I'm not sure how to troubleshoot it:
# conv_output = [l for l in model.layers if l.name is layer_name][0].output  # `is` compares identity, not string equality
conv_output = model.get_layer(layer_name).output
grads = normalize(_compute_gradients(loss, [conv_output])[0])
gradient_function = K.function([model.input], [conv_output, grads])
output, grads_val = gradient_function([image])
output, grads_val = output[0, :], grads_val[0, :, :, :]  # drop the batch dimension
Topic: heatmap, cnn, keras, python
Category: Data Science