GloVe dot product optimized for non-commutative data while the operation itself is commutative

To my current knowledge, GloVe word vectors are optimized so that the dot product satisfies

w_i ⋅ w_j = log(P(i|j))

where the probability is computed from a co-occurrence matrix. However, the dot product is a commutative operation, whilst the log probability isn't. Is this issue addressed in GloVe? Am I missing something?
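
To make the asymmetry concrete, here is a minimal NumPy sketch; the 3x3 co-occurrence matrix is a made-up toy example (not GloVe's actual data), and w_i, w_j are arbitrary random vectors. Even with a symmetric co-occurrence matrix, the conditional probabilities differ because the column marginals differ, while the dot product of any two vectors is symmetric:

    import numpy as np

    # Toy symmetric co-occurrence counts (illustrative values only)
    X = np.array([[0., 4., 1.],
                  [4., 0., 8.],
                  [1., 8., 0.]])

    # Conditional probabilities P(i|j) = X[i, j] / sum_k X[k, j]
    P = X / X.sum(axis=0, keepdims=True)

    # log P(i|j) != log P(j|i) in general: here log(4/12) vs log(4/5)
    print(np.log(P[0, 1]), np.log(P[1, 0]))

    # The dot product, by contrast, is symmetric for any pair of vectors
    rng = np.random.default_rng(0)
    w_i, w_j = rng.normal(size=50), rng.normal(size=50)
    print(np.isclose(w_i @ w_j, w_j @ w_i))  # True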

Tags: mathematics, stanford-nlp, word2vec, word-embeddings, nlp
