model.add(LSTM(4, input_dim=look_back))  # 4 LSTM blocks
But how many cells does it have in Keras?
For example, I assume 4 memory blocks and 2 cells. Please see the attached architecture; is this right?
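One way to sanity-check how Keras interprets the first argument (a sketch of my own, not from the Keras docs): in Keras, LSTM(4) means 4 units, where each unit corresponds to one memory cell, and the layer's trainable-parameter count follows directly from that. Each of the four gates (input, forget, candidate, output) has an input kernel, a recurrent kernel, and a bias:

```python
def lstm_param_count(units, input_dim):
    """Trainable parameters of a single Keras LSTM layer.

    Each of the 4 gates owns:
      - an input kernel of shape (input_dim, units),
      - a recurrent kernel of shape (units, units),
      - a bias of shape (units,).
    """
    return 4 * (input_dim * units + units * units + units)

# LSTM(4) fed 1 feature per timestep, as in the snippet above:
print(lstm_param_count(4, 1))  # 96, matching model.summary() for that layer
```

If the layer had "4 blocks with 2 cells each" in the classic Hochreiter/Schmidhuber sense, the count would differ, so comparing this against model.summary() is a quick way to confirm the interpretation.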
In a stacked LSTM:
model.add(LSTM(4, return_sequences=True))
model.add(LSTM(4, return_sequences=False))
In that case, in the attached image, the hidden layer has 4 LSTM blocks (layer 1), and these feed into another LSTM layer with the same 4 LSTM blocks. Am I right?
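To make the stacked snippet concrete, here is a small self-contained sketch (hypothetical helper names; assuming 1 input feature per timestep) of the output shapes and parameter counts of the two layers. The key point is that return_sequences=True makes the first LSTM(4) emit its 4-dimensional output at every timestep, so the second LSTM(4) sees an input dimension of 4:

```python
def stacked_lstm_summary(timesteps, features, units):
    """Per-layer (name, output shape, param count) for two stacked LSTM layers."""
    def params(n_units, in_dim):
        # 4 gates, each with an input kernel, a recurrent kernel, and a bias
        return 4 * (in_dim * n_units + n_units * n_units + n_units)

    return [
        # return_sequences=True: emits the hidden state at every timestep
        ("LSTM(4, return_sequences=True)", (timesteps, units), params(units, features)),
        # return_sequences=False: emits only the final hidden state
        ("LSTM(4, return_sequences=False)", (units,), params(units, units)),
    ]

for name, shape, n_params in stacked_lstm_summary(timesteps=3, features=1, units=4):
    print(name, shape, n_params)
# LSTM(4, return_sequences=True) (3, 4) 96
# LSTM(4, return_sequences=False) (4,) 144
```

So yes, under this reading both layers have 4 units; they differ only in input dimension (1 vs. 4) and in whether they return the full sequence.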
I would also like to know the LSTM architecture of the imdb_lstm example.