Hi, Evan,
> dense -> relu -> dropout -> (other layers)
>
> or
>
> dense -> dropout -> relu -> (other layers)
Isn't that producing the exact same results? You can think of dropout as a mask over your array. E.g., say your "dense" output is
[1, 2, 3, 4, 5]
then you apply dropout (which zeroes some units and rescales the kept ones by 1/(1 - drop_rate), here 1.5) and it becomes, e.g.,
[1*1.5, 0, 3*1.5, 0, 5*1.5]
Then, if you apply relu, it is still
[1*1.5, 0, 3*1.5, 0, 5*1.5]
If you do it the other way round, dense -> relu -> dropout, you have
relu([1, 2, 3, 4, 5]) -> [1, 2, 3, 4, 5]
dropout([1, 2, 3, 4, 5]) -> [1*1.5, 0, 3*1.5, 0, 5*1.5]
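i.e., the same output either way. The reason it holds in general (not just for this all-positive example) is that relu(0) = 0 and relu(c * a) = c * relu(a) for any non-negative factor c, so the dropout mask and the 1/(1 - drop_rate) rescaling commute with the relu.

Here is a minimal NumPy sketch that makes this concrete (the array values and the drop rate are made up, and the same random mask is reused for both orderings so the outputs are directly comparable):

import numpy as np

rng = np.random.default_rng(0)

x = np.array([1.0, -2.0, 3.0, -4.0, 5.0])     # hypothetical "dense" output
drop_rate = 1.0 / 3.0
keep_mask = rng.random(x.shape) >= drop_rate  # fixed mask, shared by both orderings
scale = 1.0 / (1.0 - drop_rate)               # inverted-dropout rescaling, 1.5 here

def relu(a):
    return np.maximum(a, 0.0)

def dropout(a):
    # zero the dropped units and rescale the kept ones
    return a * keep_mask * scale

print(dropout(relu(x)))   # dense -> relu -> dropout
print(relu(dropout(x)))   # dense -> dropout -> relu
# Both orderings print the same array.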
Best,
Sebastian