Naive Bayes with expectation maximization vs. logistic regression for binary classification

Assuming I'm dealing with binary classification.

For what kind of data would Naive Bayes trained with expectation maximization give a better solution, and for what kind of data would logistic regression be the better choice?

Topic naive-bayes-classifier logistic-regression classification

Category Data Science


At a very high level:

Naive Bayes is a generative probabilistic model based on Bayes' theorem, and it is scale-invariant: scaling or normalizing the features won't change its predictions. It is a batch-style algorithm in the sense that the model parameters (class priors and per-class feature distributions) are computed directly in closed form from counts and sufficient statistics, without any iterative search such as gradient descent. A single pass over the data is enough.
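To make the "computed directly in closed form" point concrete, here is a minimal sketch of Gaussian Naive Bayes in plain numpy. The function names and the toy two-blob dataset are my own illustration, not from the original answer; fitting is literally one pass computing means, variances, and priors:

```python
import numpy as np

def fit_gaussian_nb(X, y):
    """One pass over the data: estimate class priors and
    per-class feature means/variances. No iteration needed."""
    params = {}
    for c in np.unique(y):
        Xc = X[y == c]
        params[c] = {
            "prior": len(Xc) / len(X),
            "mean": Xc.mean(axis=0),
            "var": Xc.var(axis=0) + 1e-9,  # small smoothing for stability
        }
    return params

def predict_gaussian_nb(params, X):
    """Pick the class with the highest posterior log-probability."""
    classes = sorted(params)
    log_posts = []
    for c in classes:
        p = params[c]
        # log N(x | mean, var) summed over features -- the "naive"
        # conditional-independence assumption.
        ll = -0.5 * (np.log(2 * np.pi * p["var"])
                     + (X - p["mean"]) ** 2 / p["var"]).sum(axis=1)
        log_posts.append(np.log(p["prior"]) + ll)
    return np.array(classes)[np.argmax(log_posts, axis=0)]

# Toy data: two well-separated Gaussian blobs
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(4, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

params = fit_gaussian_nb(X, y)
acc = (predict_gaussian_nb(params, X) == y).mean()
```

Note that multiplying a feature by a constant rescales both its mean and its standard deviation, so the per-class likelihoods shift by the same factor for every class and the predicted labels stay the same, which is the scale-invariance mentioned above.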

Logistic Regression is a discriminative probabilistic model based on the sigmoid (logistic) function. Its parameters have no closed-form solution; they are found by iterative search, such as gradient descent, over the feature space, which means iterating over the data many times. Because of this, scaling and normalizing the data affects how quickly (and whether) the optimization converges, and it also changes the effect of any regularization.
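For contrast, here is a minimal sketch of logistic regression trained by batch gradient descent, again in plain numpy with names of my own choosing. Unlike the Naive Bayes case, the weights are not computed in one pass; the loop below repeatedly revisits the data to reduce the log-loss:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logreg(X, y, lr=0.1, n_iter=500):
    """Iteratively search for weights by gradient descent on the
    log-loss -- there is no closed-form solution, so we make many
    passes over the data. Feature scaling changes the shape of the
    loss surface and hence the convergence speed."""
    Xb = np.hstack([np.ones((len(X), 1)), X])  # prepend a bias column
    w = np.zeros(Xb.shape[1])
    for _ in range(n_iter):
        grad = Xb.T @ (sigmoid(Xb @ w) - y) / len(y)
        w -= lr * grad
    return w

def predict_logreg(w, X):
    Xb = np.hstack([np.ones((len(X), 1)), X])
    return (sigmoid(Xb @ w) >= 0.5).astype(int)

# Same style of toy data: two separable blobs
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(4, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

w = fit_logreg(X, y)
acc = (predict_logreg(w, X) == y).mean()
```

If one feature were measured in units a thousand times larger, the same learning rate would either crawl or diverge along that direction, which is why standardizing features matters here but not for Naive Bayes.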

A good explanation can be found here - https://dataespresso.com/en/2017/10/24/comparison-between-naive-bayes-and-logistic-regression/

Choosing between the two models depends on your dataset, and techniques like cross-validation will tell you which one performs better on your data.
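A quick way to run that comparison, assuming scikit-learn is available (the synthetic dataset and parameters below are illustrative, not from the original answer):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression

# Synthetic binary-classification data just for demonstration
X, y = make_classification(n_samples=500, n_features=10, random_state=0)

# 5-fold cross-validated accuracy for each model
nb_scores = cross_val_score(GaussianNB(), X, y, cv=5)
lr_scores = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5)

print(f"Naive Bayes CV accuracy:         {nb_scores.mean():.3f}")
print(f"Logistic regression CV accuracy: {lr_scores.mean():.3f}")
```

Whichever model scores higher on held-out folds is the better choice for that particular dataset; on a different dataset the ranking can easily flip.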
