Yes, and this is generally the case with word vectors (word2vec, FastText, GloVe, etc.): their coordinates only have meaning relative to the other vectors that were co-trained into the same model.
So you can't just append some vectors from another model into an existing set and expect the usual similarities/directions to work between those added vectors and the original ones.
There are ways people have improvised to force word vectors into an existing coordinate system. As one example, section 2.2 ("Vocabulary Expansion") of this 2015 paper describes a strategy to learn a projection from one set of vectors to another, using the words they have in common, which then allows moving the unique/extra words of one model into the other's space. Inside Gensim, the `TranslationMatrix` class offers similar functionality, but I'm not sure of its overall utility for such a purpose.
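To make the projection idea concrete, here's a minimal sketch using plain NumPy and least squares. The vectors here are random stand-ins (in practice you'd look them up from two trained models); the technique is just: fit a linear map `W` on the shared vocabulary, then apply `W` to words unique to the source model.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 4
shared_words = ["apple", "banana", "cherry", "date", "elderberry", "fig"]

# Stand-in vectors for the words both models know. X plays the role of
# model A's vectors; Y plays model B's (here simulated as a noisy
# linear transform of X, purely for demonstration).
X = rng.normal(size=(len(shared_words), dim))
true_map = rng.normal(size=(dim, dim))
Y = X @ true_map + 0.01 * rng.normal(size=(len(shared_words), dim))

# Learn a linear projection W that minimizes ||X @ W - Y||^2
# over the shared words.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Project a word unique to model A into model B's coordinate system.
unique_vec = rng.normal(size=(1, dim))  # e.g. a word only model A contains
projected = unique_vec @ W
print(projected.shape)  # (1, 4)
```

The quality of such a projection depends heavily on how many shared words anchor the fit, and it only captures what a linear map can; some published variants add an orthogonality constraint (a Procrustes solution) to preserve distances.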
Still, training one combined word-vector model, on a generous corpus with good example usages of all words of interest, is likely the simplest and most robust approach.
- Gordon