Hey,
I'm currently working on an RNN with LSTM layers. I implemented a simple RNN with a single LSTM layer, adjacent fully connected layers, etc. Training and testing work fine.
However, for testing purposes I would like to vary the batch size directly in Python. After initializing the net and loading the caffemodel file with the trained weights, I reshape the input blobs — 'data', 'clip' (my sequence marker), etc.:
deploy_net.blobs['data'].reshape(batch_size, stream, channels, height, width)
deploy_net.blobs['label'].reshape(batch_size, label_dim)
Afterwards, an error occurs when I reshape the whole net or run the forward pass (in this example, changing the batch size from 100 to 80):
F1010 16:08:49.571087 5596 recurrent_layer.cpp:187] Check failed: T_ == bottom[0]->shape(0) (100 vs. 80) input number of timesteps changed
I assume this is due to Caffe's unrolled implementation of recurrent layers: the number of timesteps T_ is fixed when the layer is set up. Is there a way to set up the LSTM layer anew in Python after changing the batch size of its input blobs?
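One workaround I've seen (a sketch, not a tested recipe) is to avoid reshaping the live net altogether: instead, write out a copy of the deploy prototxt whose input shapes already carry the new batch size, build a fresh Net from it, and copy the trained weights in. The prototxt patching below is plain string processing; the pycaffe calls at the bottom are commented out because they need a Caffe build, and the file names 'deploy.prototxt' / 'weights.caffemodel' are placeholders:

```python
import re

def set_batch_size(prototxt_text, new_bs):
    """Replace the first dim of every input_shape block with new_bs.

    Assumes the deploy prototxt declares its inputs via blocks like
        input_shape { dim: 100 dim: 1 dim: 3 ... }
    where the first dim is the batch/timestep axis.
    """
    def patch(match):
        dims = match.group(1)
        # Replace only the first "dim: N" inside this block.
        patched = re.sub(r'dim:\s*\d+', 'dim: %d' % new_bs, dims, count=1)
        return 'input_shape {%s}' % patched
    return re.sub(r'input_shape\s*\{([^}]*)\}', patch, prototxt_text)

# Usage with pycaffe (commented out -- requires a Caffe build;
# file names are hypothetical):
# import caffe
# with open('deploy.prototxt') as f:
#     patched = set_batch_size(f.read(), 80)
# with open('deploy_bs80.prototxt', 'w') as f:
#     f.write(patched)
# net = caffe.Net('deploy_bs80.prototxt', 'weights.caffemodel', caffe.TEST)
```

Since the recurrent layer's LayerSetUp then runs against the new shapes, T_ is consistent from the start and the check that fails above is never hit.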
Thanks in advance,
Moritz