Embedding layer initializer


thomas.al...@gmail.com

Sep 25, 2018, 11:09:34 AM
to Keras-users
What's the difference between initializing a Keras Embedding layer with "embeddings_initializer" versus setting "weights" directly, as in:

# create the embedding matrix (e.g. from pretrained embeddings)
embedding_matrix = ...

# create the embedding layer this way
embedded_sequences = Embedding(num_words,
                               EMBEDDING_DIM,
                               embeddings_initializer=Constant(embedding_matrix),  # <-- option 1
                               input_length=MAX_SEQUENCE_LENGTH,
                               trainable=False,
                               )(sequence_input)

# or this way? 
embedded_sequences = Embedding(num_words,
                               EMBEDDING_DIM,
                               weights=[embedding_matrix],  # <-- option 2
                               input_length=MAX_SEQUENCE_LENGTH,
                               trainable=False,
                               )(sequence_input)

Are they the same? Is one better than the other?

Sergey O.

Sep 25, 2018, 11:59:17 AM
to thomas.al...@gmail.com, Keras-users
The initializer is intended to be used with one of the random initializers.

I suppose you could hack the initializer and give it a set of pre-trained weights...
Assuming it works, it shouldn't be any different from re-assigning the weights after random initialization.
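
One quick way to check this (not from the original thread) is to build the layer both ways and compare the resulting weights with get_weights(). Below is a minimal sketch assuming standalone Keras 2.x and a small dummy embedding_matrix; names like num_words, EMBEDDING_DIM, and MAX_SEQUENCE_LENGTH just mirror the question above.

# Minimal sketch: both approaches should load the same pre-trained matrix.
import numpy as np
from keras.layers import Input, Embedding
from keras.initializers import Constant
from keras.models import Model

# dummy sizes and a dummy "pretrained" matrix for illustration
num_words, EMBEDDING_DIM, MAX_SEQUENCE_LENGTH = 100, 8, 10
embedding_matrix = np.random.rand(num_words, EMBEDDING_DIM)

def build(**embedding_kwargs):
    # helper (hypothetical name) that builds a tiny model around the Embedding layer
    sequence_input = Input(shape=(MAX_SEQUENCE_LENGTH,), dtype='int32')
    x = Embedding(num_words,
                  EMBEDDING_DIM,
                  input_length=MAX_SEQUENCE_LENGTH,
                  trainable=False,
                  **embedding_kwargs)(sequence_input)
    return Model(sequence_input, x)

m1 = build(embeddings_initializer=Constant(embedding_matrix))  # option 1
m2 = build(weights=[embedding_matrix])                         # option 2

# Both embedding layers should hold the same matrix before any training.
w1 = m1.layers[1].get_weights()[0]
w2 = m2.layers[1].get_weights()[0]
print(np.allclose(w1, embedding_matrix), np.allclose(w2, embedding_matrix))

If both prints show True, the two constructions are interchangeable for frozen pretrained embeddings; any difference would only matter if the layer were trainable and the weights were overwritten before training started.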

