On Wednesday, Jun 14, 2017 at 1:21 PM, <smole...@gmail.com> wrote:
While running the train command, I can see that the network is minimizing two things: train loss and train ppl. What is ppl? Is that perplexity? Isn't train perplexity the same as train loss?
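(For reference, perplexity is conventionally the exponential of the average per-token cross-entropy loss, so the two numbers carry the same information on different scales. A minimal sketch, assuming the logged loss is an average per-token cross-entropy in nats; some frameworks log loss in base 2, in which case the conversion is `2 ** loss` instead:)

```python
import math

def perplexity(avg_loss_nats: float) -> float:
    """Convert an average per-token cross-entropy loss (in nats)
    to perplexity. If the loss were in bits (log base 2), the
    conversion would be 2 ** loss instead of exp(loss)."""
    return math.exp(avg_loss_nats)

# Example: an average loss of 2.0 nats corresponds to ppl ~= 7.39
print(round(perplexity(2.0), 2))
```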
--
You received this message because you are subscribed to the Google Groups "fairseq Users" group.
To view this discussion on the web visit https://groups.google.com/d/msgid/fairseq-users/68724b72-c966-416a-a905-4f9fff9cc40b%40googlegroups.com.