Are N-gram language models for text generation more efficient than neural network language models?

I recently built a language model using N-grams for text generation, and for a change I started exploring neural networks for the same task. One thing I observed is that the N-gram model's results were better than the LSTM model's, even though both were built on the same corpus.
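For reference, this is roughly the kind of count-based N-gram setup I mean (a minimal sketch, not my exact code; the corpus, function names, and `n=3` choice here are just illustrative):

```python
import random
from collections import defaultdict, Counter

def build_ngram_model(tokens, n=3):
    """Map each (n-1)-token context to a Counter of next-token frequencies."""
    model = defaultdict(Counter)
    for i in range(len(tokens) - n + 1):
        context = tuple(tokens[i:i + n - 1])
        model[context][tokens[i + n - 1]] += 1
    return model

def generate(model, seed, length=20, n=3):
    """Sample tokens from the counts until no continuation exists."""
    out = list(seed)
    for _ in range(length):
        context = tuple(out[-(n - 1):])
        choices = model.get(context)
        if not choices:
            break  # unseen context: a plain N-gram model cannot continue
        next_tokens, weights = zip(*choices.items())
        out.append(random.choices(next_tokens, weights=weights)[0])
    return " ".join(out)

# Toy corpus just for illustration
corpus = "the cat sat on the mat and the cat ran".split()
model = build_ngram_model(corpus, n=3)
print(generate(model, seed=("the", "cat"), n=3))
```

Building this takes one pass over the corpus and generation is just dictionary lookups, which is part of why I'd expect it to be much cheaper than training an LSTM.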

Topic lstm text-generation ngrams rnn nlp

Category Data Science
