I'm doing a thesis on sentence embeddings, and the basic approach is this: if you want to encode a sentence using word embeddings, you create an embedding for each word in the sentence and then average the vectors to get the sentence vector. Averaging works surprisingly well; there are articles on why a simple average can preserve the semantic meaning of the whole sentence (it has to do with the high dimensionality of the embeddings). That said, in my tests GloVe performs quite poorly for sentence embeddings; word2vec and FastText work better. If you really want a well-performing model, I'd suggest trying a dedicated sentence embedding method like sent2vec or InferSent. Sent2vec is quite easy to use, just follow the instructions on their GitHub (https://github.com/epfml/sent2vec).
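In case it helps, here's a minimal sketch of the averaging idea in Python with NumPy. The tiny 3-d embedding dict is made up for illustration; in practice you'd look up pretrained word2vec or FastText vectors instead.

```python
import numpy as np

# Toy word-embedding table (illustrative only; real vectors would be
# 100-300 dimensional and come from word2vec / FastText).
embeddings = {
    "the": np.array([0.1, 0.0, 0.2]),
    "cat": np.array([0.5, 0.3, 0.1]),
    "sat": np.array([0.2, 0.4, 0.6]),
}

def sentence_embedding(sentence, embeddings):
    """Average the vectors of all in-vocabulary words in the sentence."""
    vectors = [embeddings[w] for w in sentence.lower().split() if w in embeddings]
    if not vectors:
        return None  # no known words, no embedding
    return np.mean(vectors, axis=0)

vec = sentence_embedding("the cat sat", embeddings)
```

Out-of-vocabulary words are simply skipped here; FastText's subword vectors would handle them more gracefully.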
Cheers