Prediction with words that were not in the training set in a CNN with pre-trained word embeddings
In sentence classification using pre-trained embeddings (fastText) in a CNN, how does the CNN predict the category of a sentence when that sentence contains words that were not in the training set?
As I understand it, the trained model contains weights, and these weights are not updated at prediction time, are they? So what happens when the words in the sentence (for which the CNN will predict a category) were not seen during training? I assume they do not have a word vector; only the words found in the training data do.
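To make the scenario concrete, here is a minimal sketch of the kind of pipeline the question assumes: a vocabulary built only from the training sentences, a fixed embedding matrix standing in for pre-trained vectors, and a shared UNK row for out-of-vocabulary words at prediction time. The names (`vocab`, `embedding_matrix`, `sentence_to_vectors`) and the random vectors are purely illustrative, not part of any particular library.

```python
import numpy as np

# Illustrative setup: vocabulary comes from the training sentences only,
# so words seen only at prediction time have no index of their own.
train_sentences = [["the", "movie", "was", "great"],
                   ["terrible", "plot", "and", "acting"]]

# Index 0 is reserved for the UNK/OOV token.
vocab = {"<UNK>": 0}
for sent in train_sentences:
    for word in sent:
        vocab.setdefault(word, len(vocab))

embedding_dim = 4  # tiny for illustration; fastText vectors are typically 300-d
rng = np.random.default_rng(0)

# Stand-in for pre-trained fastText vectors: row i is the vector of the word
# with index i. Row 0 (UNK) is all zeros here.
embedding_matrix = rng.normal(size=(len(vocab), embedding_dim))
embedding_matrix[0] = 0.0

def sentence_to_vectors(sentence):
    """Map each word to its embedding; unseen words fall back to index 0 (UNK)."""
    indices = [vocab.get(word, 0) for word in sentence]
    return embedding_matrix[indices]

# Prediction-time sentence containing words that were not in the training data.
test_sentence = ["the", "cinematography", "was", "superb"]
print(sentence_to_vectors(test_sentence))
# "cinematography" and "superb" both map to the UNK row (zeros) in this sketch,
# unless the pipeline instead queries fastText's subword model for their vectors.
```

In this setup the question boils down to: at prediction time, do the unseen words get a meaningless UNK/zero vector as above, or can the pre-trained fastText model still supply a vector for them?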
Topic deep-learning nlp machine-learning
Category Data Science