What hyperparameter values does the MALLET LDA model use by default? Is it true that the symmetric alpha is calculated as 5.0/num_topics?

I am trying to figure out the default $\alpha$ and $\eta$ values used by MALLET's LDA implementation, but there is not much information on this. I did find a couple of answers, with no proper references, saying that the symmetric $\alpha$ is calculated as 5.0/num_topics. Why is that? Why can't I use 1.0/num_topics for the symmetric $\alpha$, as in standard LDA? Can someone please help me understand, and link me to references?
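For concreteness, here is a minimal sketch of the two conventions I am asking about. It assumes (as I understand MALLET's command-line documentation for `--alpha`) that the number passed is a total Dirichlet mass that gets divided evenly over topics; `symmetric_alpha` is a hypothetical helper, not part of any library:

```python
def symmetric_alpha(alpha_sum, num_topics):
    """Per-topic alpha for a symmetric Dirichlet, given the total mass
    spread evenly across all topics."""
    return alpha_sum / num_topics

num_topics = 20

# MALLET-style convention with the documented default of 5.0:
print(symmetric_alpha(5.0, num_topics))  # 0.25

# The 1.0/num_topics convention I was expecting from standard LDA:
print(symmetric_alpha(1.0, num_topics))  # 0.05
```

So the two conventions differ only in the total mass (5.0 vs 1.0); my question is why 5.0 is the default.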

Thanks in advance.

Tags: dirichlet, lda, nlp, python

Category: Data Science
