One common GloVe equation (see, for example, slide 24 of the public slides for Stanford CS224N: NLP with Deep Learning | Winter 2021 | Lecture 2 - Neural Classifiers) is:
w_i \cdot w_j = \log P(i \mid j)
But in general this can't be right: the left-hand side, being a dot product, is symmetric in i and j, while the right-hand side, being a conditional probability, is not. A small worked example of the mismatch is below.
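To make the asymmetry concrete, here is a minimal sketch using made-up co-occurrence counts (the numbers are purely hypothetical; X_{ij} denotes how often word i appears in the context of word j, and X_j = \sum_k X_{kj} as in the usual GloVe notation):

w_i \cdot w_j = \sum_k w_{ik} w_{jk} = w_j \cdot w_i

but with, say, X_{ij} = X_{ji} = 10, X_j = 100, and X_i = 1000:

P(i \mid j) = \frac{X_{ij}}{X_j} = \frac{10}{100} = 0.1, \qquad P(j \mid i) = \frac{X_{ji}}{X_i} = \frac{10}{1000} = 0.01

so \log P(i \mid j) \neq \log P(j \mid i) even though w_i \cdot w_j = w_j \cdot w_i always holds.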
Can someone indicate the correct statement, or tell me what I'm missing?
Thanks!
- Nick