LSTM Class - keras.layers.LSTM

Denis Dal Soler

unread,
Nov 13, 2024, 4:34:12 AM11/13/24
to Keras-users
Hi,
can someone please explain (ideally with a diagram) the architecture that this instruction builds?
model.add(LSTM(128, return_sequences=True, input_shape=(look_back, 1)))
In some discussions I found that 128 is the number of LSTM units in the hidden LSTM layer; in others, that 128 is the number of LSTM cells in the layer (and that each cell has look_back LSTM units).
Can someone who knows the Python code behind this instruction (and where can I find it?) tell me what architecture it creates? How many individual LSTM cells (each with a forget gate, input gate, and output gate) are there? 128? 128 × look_back? Some other number?
Thanks,
Denis
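[For reference: in Keras, `LSTM(128)` builds a single recurrent layer with 128 units. There is one cell (with its four gates: forget, input, output, and candidate) whose weights are shared across all `look_back` timesteps, so `look_back` does not multiply the number of weights. A quick way to see this is to compute the layer's trainable-parameter count by hand and compare it with what `model.summary()` reports; the sketch below uses the standard Keras LSTM weight layout.]

```python
# Parameter count of a Keras LSTM(128) layer on input_shape=(look_back, 1).
# Each of the 4 gates has: a kernel (input_dim x units), a recurrent
# kernel (units x units), and a bias (units). look_back does not appear,
# because the same cell weights are reused at every timestep.
units = 128
input_dim = 1
params = 4 * (input_dim * units + units * units + units)
print(params)  # 66560, matching model.summary() for LSTM(128) on 1 feature
```

So the answer to "128 or 128 × look_back?" is neither in terms of separate cells: there are 128 units in one cell, unrolled `look_back` times over the input sequence.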
