Laplacian Smoothing on Class Probability (Naive Bayes)

I am implementing a Naive Bayes classifier in Python from scratch. The instructions I have ask that I incorporate Laplacian smoothing with K=1 when computing the probability that a message belongs to a given class. I have two classes, ham and spam (an e-mail spam filtering problem). So, one example without smoothing would be: if I had X spam messages and Y ham messages:

P(spam) = X / (X + Y) --> probability that a message in my dataset is spam
P(ham)  = Y / (X + Y) --> probability that a message in my dataset is ham
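
For concreteness, here is a minimal sketch of how I currently compute these priors (the names n_spam, n_ham and class_priors are just placeholders for the counts X and Y above):

```python
# Minimal sketch of the unsmoothed class priors described above.
# n_spam and n_ham are placeholder names for the counts X and Y.

def class_priors(n_spam, n_ham):
    """Return (P(spam), P(ham)) without any smoothing."""
    total = n_spam + n_ham
    p_spam = n_spam / total  # X / (X + Y)
    p_ham = n_ham / total    # Y / (X + Y)
    return p_spam, p_ham

print(class_priors(40, 60))  # e.g. 40 spam and 60 ham messages -> (0.4, 0.6)
```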

I don't know how to incorporate Laplacian smoothing into this simple formula.

Topic: probabilistic-programming, naive-bayes-classifier

Category: Data Science
