In simple terms, LookupTable maps your vocab indices to tensors via a simple table lookup (no convolution involved). For example, a training phrase "a green frog" will be represented in your data as a set of numbers, say {1, 22, 15}. What LookupTable does is transform each number into an n-dimensional tensor:
1 --> LookupTable --> [44 32 21 43 231]
22 --> LookupTable --> [2 12 94 12 33]
15 --> LookupTable --> [899 22 39 2 0]
This way, each of these tensors becomes a learnable parameter that your neural network updates during backprop. Alternatively, you can skip learning them from scratch and initialize the table with pretrained embeddings like word2vec or GloVe, which already come as learned n-dimensional vectors.
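A minimal sketch of the idea in plain Python (the names `table` and `lookup` are illustrative, not Torch's actual API): each vocabulary index simply selects one row of a weight matrix, and it is those rows that get updated during backprop.

```python
import random

random.seed(0)
vocab_size, dim = 30, 5

# One row per vocabulary index; in a real network these rows are the
# learnable parameters that backprop updates.
table = [[random.gauss(0, 1) for _ in range(dim)] for _ in range(vocab_size)]

def lookup(indices):
    # A lookup table is just row selection: index -> its vector.
    return [table[i] for i in indices]

# "a green frog" encoded as vocab indices {1, 22, 15}
vectors = lookup([1, 22, 15])
```

Using pretrained embeddings would amount to filling `table` with the word2vec or GloVe vectors instead of random values.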