Should the cost function be zero using TensorFlow's sigmoid_cross_entropy_with_logits?
I'm building a CNN for binary classification (1 or 0). For this, I'm using the cost function sigmoid_cross_entropy_with_logits.
But for some reason, the cost computed by this function is never equal to zero, even when the prediction equals the correct value.
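Here's a minimal reproduction of what I'm seeing (assuming TensorFlow 2 with eager execution; the label/logit values are just an example):

```python
import tensorflow as tf

# Label z = 1 and raw logit x = 1 (the network's pre-sigmoid output).
labels = tf.constant([1.0])
logits = tf.constant([1.0])

loss = tf.nn.sigmoid_cross_entropy_with_logits(labels=labels, logits=logits)
print(loss.numpy())  # ~0.3133 -- not zero
```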
I tried plotting the output using the formula on TensorFlow's website: https://www.tensorflow.org/api_docs/python/tf/nn/sigmoid_cross_entropy_with_logits
This formula:
max(x, 0) - x * z + log(1 + exp(-abs(x)))
And by making this plot, I realized that it really isn't zero even when the logit x equals the label z. For example, if z = 0 and x = 0, the result of this function is ~0.693 (i.e., log 2).
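Here's the plain-NumPy check I used to evaluate the formula (xent is just a helper name I made up):

```python
import numpy as np

def xent(x, z):
    # Numerically stable formula from the TensorFlow docs:
    # max(x, 0) - x * z + log(1 + exp(-abs(x)))
    return np.maximum(x, 0) - x * z + np.log1p(np.exp(-np.abs(x)))

print(xent(0.0, 0.0))  # ~0.6931 = log(2)
print(xent(1.0, 1.0))  # ~0.3133, still not zero
```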
This doesn't really make sense to me. Can someone shed some light on why the cost isn't zero when the prediction is correct?
Topic cost-function tensorflow python machine-learning
Category Data Science