Why is my validation loss never increasing?

I am currently training different neural networks for binary image classification. When using logistic regression, my validation loss never increases, even after 5000 epochs. I thought that at some point overfitting would set in and the validation loss would start to increase. Does anybody know why this does not happen?
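A minimal sketch of the kind of setup described above (the data, sizes, and learning rate here are illustrative assumptions, not the asker's actual code), showing how train and validation loss would be tracked per epoch:

import torch
import torch.nn as nn

torch.manual_seed(0)

# Hypothetical stand-in data: 200 train / 100 validation samples, 20 features.
X_train = torch.randn(200, 20)
y_train = torch.randint(0, 2, (200, 1)).float()
X_val = torch.randn(100, 20)
y_val = torch.randint(0, 2, (100, 1)).float()

# Logistic regression = a single linear layer; BCEWithLogitsLoss applies the sigmoid.
model = nn.Linear(20, 1)
loss_fn = nn.BCEWithLogitsLoss()
opt = torch.optim.SGD(model.parameters(), lr=0.1)

for epoch in range(5000):
    opt.zero_grad()
    train_loss = loss_fn(model(X_train), y_train)
    train_loss.backward()
    opt.step()

    # Evaluate on held-out data without tracking gradients.
    with torch.no_grad():
        val_loss = loss_fn(model(X_val), y_val)
    if epoch % 500 == 0:
        print(f"epoch {epoch}: train={train_loss.item():.4f} val={val_loss.item():.4f}")

Note that a plain logistic-regression model like this has very few parameters (here 21), so it may simply lack the capacity to overfit, in which case the validation loss plateaus rather than rises.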
Category: Data Science

Why is cross entropy based on Bernoulli or Multinoulli probability distribution?

When we use logistic regression, we use cross entropy as the loss function. However, based on my understanding and https://machinelearningmastery.com/cross-entropy-for-machine-learning/, cross entropy evaluates whether two or more distributions are similar to each other, and those distributions are assumed to be Bernoulli or Multinoulli. So my question is: why can we always use cross entropy, i.e., a Bernoulli distribution, with logistic regression? Do the true values and the predicted values always follow such a distribution?
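The connection the question asks about can be made explicit with a standard derivation (written here in LaTeX notation, with y the true label and \hat{y} the predicted probability). The Bernoulli likelihood of a single label is

P(y \mid \hat{y}) = \hat{y}^{\,y} (1 - \hat{y})^{1 - y}, \qquad y \in \{0, 1\}

and its negative log-likelihood is exactly the binary cross entropy:

-\log P(y \mid \hat{y}) = -\left[ y \log \hat{y} + (1 - y) \log(1 - \hat{y}) \right]

So minimizing cross entropy is equivalent to maximizing the Bernoulli (or, with multiple classes, Multinoulli/categorical) likelihood of the observed labels: the distributional assumption is about the labels being discrete 0/1 outcomes, not about the features.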
Category: Data Science
