update LSTM model for new data


devakar verma

Jan 8, 2018, 6:15:26 AM
to Keras-users
I have an LSTM model that was already created from a one-year dataset. Now I want to update the model on a daily basis with each day's new data. How can I achieve this?

# Let's suppose I have created a model:
from keras.models import Sequential
from keras.layers import LSTM, LeakyReLU, Dense

model = Sequential()
model.add(LSTM(12, batch_input_shape=(1, train_X.shape[1], train_X.shape[2]), return_sequences=True, name='layer-1'))
model.add(LeakyReLU(alpha=.001))
model.add(LSTM(8, return_sequences=True, name='layer-2'))
model.add(LeakyReLU(alpha=.001))
model.add(Dense(1, name='output-layer'))
model.compile(loss='mse', optimizer='adam', metrics=['mse', 'mae', 'mape'])
history = model.fit(train_X, train_y, epochs=100, batch_size=1, validation_data=(test_X, test_y), verbose=2, shuffle=False)
model.save('model-5.h5')

I want to load the saved model and then update it with the subsequent dataset.

Sergii Myskov

Jan 18, 2018, 7:05:03 AM
to Keras-users
If you want to change the last layer, you can build something on top of the existing model. Something like this:

from keras.models import Model, load_model
from keras.layers import Dense, BatchNormalization, Activation

model = load_model('model-5.h5')

# Take the output of the last layer and stack new layers on top of it.
layer = model.layers[-1].output
layer = Dense(100)(layer)
layer = BatchNormalization()(layer)
layer = Activation('relu')(layer)
layer = Dense(1, name='new_out')(layer)

model = Model(inputs=model.inputs, outputs=[layer])
model.compile(...)
model.fit(...)

devakar verma

Jan 23, 2018, 12:31:47 AM
to Keras-users
Hi Sergii,
I don't want to update the model by adding more layers. I just want to update the model when a new dataset arrives (or, you could say, retrain the existing model with the new, forthcoming dataset).

Sergii Myskov

Jan 23, 2018, 3:51:05 PM
to Keras-users
Hi Devakar.

Sorry, I misunderstood your question.

I think the only difference is the sequence length (I hope it is).

If you really want to pass an exact batch size (which is only needed if you want to run a stateful model), you can try this:

from keras.models import Sequential
from keras.layers import LSTM, LeakyReLU, Dense

def get_model(X):
    model = Sequential()
    model.add(LSTM(12, batch_input_shape=(1, X.shape[1], X.shape[2]), return_sequences=True, name='layer-1'))
    model.add(LeakyReLU(alpha=.001))
    model.add(LSTM(8, return_sequences=True, name='layer-2'))
    model.add(LeakyReLU(alpha=.001))
    model.add(Dense(1, name='output-layer'))
    model.compile(loss='mse', optimizer='adam', metrics=['mse', 'mae', 'mape'])
    return model

train_X, train_y = load_data()  # Your year-basis dataset.
model = get_model(train_X)
model.fit(train_X, train_y, epochs=100, batch_size=1, validation_data=(test_X, test_y), verbose=2, shuffle=False)
model.save('model-5.h5')

train_X, train_y = load_data()  # Your daily-basis dataset.
model_daily = get_model(train_X)
model_daily.set_weights(model.get_weights())
# Or alternatively:
# model_daily.load_weights('model-5.h5')
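After copying the weights you would simply keep training the daily model the same way. A minimal sketch (daily_train_X / daily_train_y are hypothetical placeholders for the incoming data, and the epoch count is arbitrary):

model_daily.fit(daily_train_X, daily_train_y, epochs=10, batch_size=1, verbose=2, shuffle=False)
model_daily.save('model-5-updated.h5')  # keep the refreshed weights for the next update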

Or you can even make the batch size and sequence length variable by modifying your initial layer's parameters (use input_shape with None for the timestep dimension instead of a fixed batch_input_shape):

model = Sequential()
model.add(LSTM(12, input_shape=(None, train_X.shape[2]), return_sequences=True, name='layer-1'))
...
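For example, here is a minimal self-contained sketch (the layer sizes match your model, but n_features and the random arrays are made-up stand-ins) showing that one compiled model can then be fit on sequences of any length and any batch size:

import numpy as np
from keras.models import Sequential
from keras.layers import LSTM, LeakyReLU, Dense

n_features = 4  # hypothetical feature count

model = Sequential()
model.add(LSTM(12, input_shape=(None, n_features), return_sequences=True, name='layer-1'))
model.add(LeakyReLU(alpha=.001))
model.add(LSTM(8, return_sequences=True, name='layer-2'))
model.add(LeakyReLU(alpha=.001))
model.add(Dense(1, name='output-layer'))
model.compile(loss='mse', optimizer='adam')

# The timestep dimension is None, so batches of different lengths are accepted:
year_X, year_y = np.random.rand(8, 365, n_features), np.random.rand(8, 365, 1)
day_X, day_y = np.random.rand(1, 24, n_features), np.random.rand(1, 24, 1)

model.fit(year_X, year_y, epochs=1, verbose=0)  # "yearly" sequences
model.fit(day_X, day_y, epochs=1, verbose=0)    # "daily" sequences, same weights carry over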

devakar verma

Jan 25, 2018, 6:32:08 AM
to Keras-users
Can I do something like this:

model = load_model('saved_model.h5')
model.fit(new_train_X, new_train_y, ....)

Just loading the saved model and retraining it with the new, subsequent dataset. Will it be able to start from the previous weights of the saved model?

Sergii Myskov

Jan 25, 2018, 6:57:33 AM
to Keras-users
I guess your new dataset has a different input shape, so that may be an issue in your case (an input data shape mismatch).

Also, in your case the first two data dimensions are the batch size and the sequence length, which do not affect the weight matrices.
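You can check this yourself with a quick sketch (made-up sizes): the LSTM weight shapes depend only on the number of features and units, so weights transfer between models that differ only in those first two dimensions:

from keras.models import Sequential
from keras.layers import LSTM

# Two models that differ only in batch size / sequence length:
m_fixed = Sequential([LSTM(8, batch_input_shape=(1, 365, 3))])
m_free = Sequential([LSTM(8, input_shape=(None, 3))])

print([w.shape for w in m_fixed.get_weights()])  # [(3, 32), (8, 32), (32,)]
print([w.shape for w in m_free.get_weights()])   # identical shapes
m_free.set_weights(m_fixed.get_weights())        # transfers without any reshaping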

So either do not specify an exact batch size and sequence length (use None for them; that is the stateless-model case), or generate exactly the same model with the new input shape and just load the weights from your stored model, as I said in the previous comment.
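If the model was saved with those dimensions left as None (the stateless route), updating then really is just load-and-fit, roughly as you wrote. A minimal sketch (new_train_X / new_train_y stand for the daily data, and the epoch count is arbitrary):

from keras.models import load_model

model = load_model('saved_model.h5')  # restores architecture, weights and optimizer state
model.fit(new_train_X, new_train_y, epochs=10, batch_size=1, verbose=2, shuffle=False)
model.save('saved_model.h5')          # persist the updated weights for the next day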