Generation of word embedding in tensor2tensor-transformer model
siddhant sharma
Jul 13, 2019, 2:08:38 AM
to tensor...@googlegroups.com
Hi All,
I am new to the tensor2tensor library and am trying to understand the code flow and its essential components.
I am not able to locate the Python file where the word embeddings are created for the first time. Are the word embeddings for all tokens created when the generate_data method is called? Which class or method takes the tokens in a word or sentence and generates the word embeddings?
Any guidance or clue about where and at what stage the word embeddings are first created would help me connect the dots.
Regards,
Siddhant Sharma
Martin Popel
Jul 13, 2019, 1:42:57 PM
to siddhant sharma, tensor2tensor
Hi Siddhant,
by default, word embeddings are treated like all other weights in the Transformer model,
i.e. they are initialized randomly and trained together with the whole network via backpropagation.
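
For illustration, here is a minimal sketch of that pattern in plain TF 1.x. This is not the exact tensor2tensor code (if I remember right, the real variable is created by the symbol modality in tensor2tensor/layers/modalities.py), and the sizes below are made up:

import tensorflow as tf  # TF 1.x, which tensor2tensor builds on

# Hypothetical sizes, just for this sketch.
vocab_size = 32000
hidden_size = 512

# The embedding table is an ordinary trainable variable with a
# random initializer, created when the model graph is built.
embedding_matrix = tf.get_variable(
    "embedding",
    shape=[vocab_size, hidden_size],
    initializer=tf.random_normal_initializer(0.0, hidden_size ** -0.5))

# token_ids: [batch, length] integer subword ids from the text encoder.
token_ids = tf.placeholder(tf.int32, shape=[None, None])

# The lookup maps each id to its vector; during training, gradients
# flow back into embedding_matrix exactly like any other weight.
embeddings = tf.gather(embedding_matrix, token_ids)

So, as far as I can tell, generate_data does not create any embeddings: it only converts text into integer subword ids. The embedding variable appears later, when the model graph is built for the first training step.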