Use pretrained word vectors or a custom-trained word2vec?
I'm currently working on a sentiment analysis research project using LSTM networks.
As input, I convert sentences into sequences of vectors using word2vec.
There are also well-known pretrained word vectors available, such as Google's word2vec trained on Google News.
My question is: are there any advantages to using a custom-trained word2vec (trained on a dataset related to our domain, such as user reviews of electronic items) over the pretrained ones?
What's the best option?
1. Use a pretrained word2vec.
2. Train our own word2vec on a dataset related to the domain.
Can anyone help me with this? Thanks.
Topic word lstm word2vec word-embeddings
Category Data Science