Do TimeDistributed layers share weights?

matthew...@gmail.com
Jun 1, 2016, 3:14:44 PM
to Keras-users
This is a pretty simple question, but the documentation is unclear on it. For example, if I add a layer like TimeDistributed(Convolution2D(64, 3, 3), input_shape=(10, 3, 299, 299)), as in the documentation, will the same filters be applied to each frame in my video?
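[To make the question concrete, here is a minimal plain-NumPy sketch of what "the same filters applied to each frame" would mean. The naive `conv2d_valid` helper is hypothetical, written just for this illustration; it is not part of Keras.]

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d_valid(frame, w):
    """Naive single-channel 'valid' convolution of one 2D frame with one filter."""
    kh, kw = w.shape
    out_h = frame.shape[0] - kh + 1
    out_w = frame.shape[1] - kw + 1
    out = np.empty((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            out[i, j] = np.sum(frame[i:i + kh, j:j + kw] * w)
    return out

# 10 "frames" of an 8x8 single-channel video; frames 0 and 5 are identical.
video = rng.standard_normal((10, 8, 8))
video[5] = video[0]

# ONE shared filter, created once -- this is the weight-sharing question.
w = rng.standard_normal((3, 3))

# TimeDistributed-style application: the same w is used at every time step.
outputs = np.stack([conv2d_valid(frame, w) for frame in video])

# Because the weights are shared, identical frames give identical outputs.
assert np.allclose(outputs[0], outputs[5])
print(outputs.shape)  # (10, 6, 6)
```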

Brian McMahan
Jun 2, 2016, 4:52:26 AM
to Keras-users, matthew...@gmail.com
Yes.

It's best to check the code when the documentation isn't explicit about a feature. Specifically, if you look at https://github.com/fchollet/keras/blob/master/keras/layers/wrappers.py#L114 you will see that TimeDistributed has an internal variable, self.layer, and applies that single layer repeatedly over the time dimension. Note that the second case of that call method just reshapes the input to merge the time and sample dimensions, which also means the same layer is applied to all of your samples.