Keras/Theano custom loss calculation - working with tensors
I'm struggling to write some tensor manipulation code for a custom loss function I'm using in Keras.
Basically, I'm trying to modify a binary_crossentropy loss by adding a weight that is calculated from a particular feature.
First thing I do is pass in my extra feature data to the custom loss by appending it to the y_true like this:
y_trainModded = numpy.append(y_train, MyExtraData, axis=1)
Which is then passed to the fit function like:
model.fit(X_train, y_trainModded, epochs=2500, .....)
Then extracted to make it usable like this:
def myCustomLoss(data, y_pred):
y_true = data[:,:2]
MyExtraData = data[:,2]
...
...
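For what it's worth, the append-and-slice round trip can be checked in plain numpy before it ever touches the loss. The names below mirror the snippets above, but the shapes are assumptions (y_train as an (N, 2) array, MyExtraData as an (N, 1) column):

```python
import numpy as np

# Assumed shapes: y_train is (N, 2) labels, MyExtraData is an (N, 1) column.
y_train = np.array([[1.0, 0.0],
                    [0.0, 1.0],
                    [1.0, 0.0]])
MyExtraData = np.array([[0.5],
                        [0.2],
                        [0.9]])

# Append the extra column so fit() smuggles it in through y_true.
y_trainModded = np.append(y_train, MyExtraData, axis=1)  # shape (3, 3)

# Inside the loss, slice the pieces back out.
y_true = y_trainModded[:, :2]   # shape (3, 2) -- the real labels
extra = y_trainModded[:, 2]     # shape (3,)   -- note: 1-D, not (3, 1)

assert np.array_equal(y_true, y_train)
assert np.array_equal(extra, MyExtraData.ravel())
```

Note that `data[:, 2]` drops a dimension and gives a rank-1 tensor, which matters later when trying to mask it against a rank-2 y_pred.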
So far, that all works fine. However, I'm stuck on a step where I want to select only the MyExtraData values for the samples where I predicted '1'. Intuitively, this would simply be something like:
ExtraDataWherePredicted1 = MyExtraData[y_pred > 0]
However, we're dealing with tensors, not numpy arrays. I tried casting to numpy arrays using eval(), but that didn't work. I also tried various approaches using keras.backend operations such as:
WherePredicted1 = K.greater(y_pred, 0)
ExtraDataWherePredicted1 = tf.boolean_mask(MyExtraData, WherePredicted1)
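In numpy terms, here is a sketch of what that masking is presumably meant to compute, and where the shapes are likely going wrong: a comparison on an (N, 1) y_pred yields an (N, 1) boolean mask, while the sliced extra data is 1-D, so the mask needs flattening before it can index the data. (The 0.5 threshold below is an assumption for illustration — a sigmoid output is always greater than 0, so `K.greater(y_pred, 0)` would select everything.)

```python
import numpy as np

# Assumed shapes, mirroring the snippets above (hypothetical values).
y_pred = np.array([[0.9], [0.1], [0.7]])   # (3, 1), as Keras outputs it
extra = np.array([0.5, 0.2, 0.9])          # (3,), from data[:, 2]

# Analogue of K.greater(y_pred, 0.5): a (3, 1) boolean mask...
mask = y_pred > 0.5

# ...which must be flattened to rank 1 before it can index the 1-D data.
# In backend code this would be K.flatten(mask) before tf.boolean_mask.
ExtraDataWherePredicted1 = extra[mask.ravel()]

print(ExtraDataWherePredicted1)  # -> [0.5 0.9]
```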
Which I could then use to weight my loss such as:
return K.mean(K.binary_crossentropy(y_true, y_pred), axis=-1) - K.mean(ExtraDataWherePredicted1)
But anything I try throws various errors, and I just can't figure out how to calculate ExtraDataWherePredicted1. I'm also finding it very hard to debug the loss function, because I can't print() anything inside it, so I can't double-check whether the arrays/tensors are what I expect them to be.
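One way around the print() problem is to prototype the whole loss in plain numpy on a tiny batch, check the numbers by hand, and only then port it to backend ops. A minimal sketch, with several assumptions flagged inline (that y_pred is (N, 2) to match the two-column y_true, that column 1 holds the probability of class '1', and a 0.5 threshold plus clipping epsilon chosen for illustration):

```python
import numpy as np

def loss_numpy(data, y_pred, thresh=0.5, eps=1e-7):
    """numpy prototype of the custom loss, for sanity-checking values."""
    y_true = data[:, :2]   # the real labels
    extra = data[:, 2]     # the smuggled-in extra feature, shape (N,)

    # Binary crossentropy, clipped for numerical stability.
    p = np.clip(y_pred, eps, 1 - eps)
    bce = -(y_true * np.log(p) + (1 - y_true) * np.log(1 - p))

    # Mean extra-data value over samples predicted '1'.
    # Assumption: column 1 of y_pred is the probability of class '1'.
    mask = y_pred[:, 1] > thresh
    penalty = extra[mask].mean() if mask.any() else 0.0

    return bce.mean() - penalty

# Tiny hypothetical batch: two samples, labels plus one extra-data column.
data = np.array([[1.0, 0.0, 0.5],
                 [0.0, 1.0, 0.2]])
y_pred = np.array([[0.8, 0.2],
                   [0.3, 0.7]])
print(loss_numpy(data, y_pred))
```

Once the numpy version produces the expected numbers, each numpy call can be swapped for its backend counterpart (np.clip -> K.clip, np.log -> K.log, boolean indexing -> tf.boolean_mask) one at a time, which makes the source of any shape error much easier to isolate.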
Any help would be appreciated!
Topic keras theano deep-learning python machine-learning
Category Data Science