Using an LSTM to predict a sequence of real numbers


Wuxia Zhao

May 31, 2015, 11:14:19 AM5/31/15
to keras...@googlegroups.com
    I want to do something with sequence prediction/generation. The goal is: given a sequence, predict the sequence that follows it.
For example, given a straight line with slope k, I use the first part of the sequence (such as [0.01, 0.02, 0.03, ..., 0.80]) as the training set, hoping the network can learn the pattern the line follows. After training, I feed the network that same sequence and expect it to generate the continuation, e.g. [0.81, 0.82, ...].
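To make the setup concrete, here is a minimal NumPy sketch of how such training pairs could be formed; the window length of 10 and the helper name `make_windows` are my own assumptions, not from the thread. It slides a fixed-length window over the sequence and uses the value right after each window as the target.

```python
import numpy as np

def make_windows(seq, window=10):
    """Split a 1-D sequence into (input window, next value) pairs."""
    X, y = [], []
    for i in range(len(seq) - window):
        X.append(seq[i:i + window])   # the window itself is the input
        y.append(seq[i + window])     # the very next value is the target
    return np.array(X), np.array(y)

# A slope-0.01 line: [0.01, 0.02, ..., 0.80]
seq = 0.01 * np.arange(1, 81)
X, y = make_windows(seq, window=10)
print(X.shape, y.shape)  # (70, 10) (70,)
```

For an LSTM the input would additionally need a timestep axis, e.g. `X.reshape(len(X), 10, 1)`, so each window is a sequence of 10 one-dimensional steps.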
    I use only one LSTM layer to form a simple network; the code in Keras is shown below:
from keras.models import Sequential
from keras.layers.core import Dense, Activation
from keras.layers.recurrent import LSTM
from keras.optimizers import RMSprop

model = Sequential()
model.add(LSTM(1, 128, init='glorot_normal', truncate_gradient=-1))
model.add(Dense(128, 1))
model.add(Activation('relu'))
rms = RMSprop(clipnorm=5.0)
model.compile(loss='mse', optimizer=rms)
    However, after training for 100 epochs the loss converges to roughly 0.003. I then feed the model one sample and predict one number at a time, iterating 100 times to generate the next 100 numbers. The result is not linear, as shown here:
[Plot omitted: the red line is the ground truth; the blue line is the predictions, which should also be a straight line but is not.]
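The iterative generation step described above can be sketched in plain Python. Here `predict_next` is a stand-in for the actual Keras `model.predict` call (a hypothetical perfect model of a slope-0.01 line, assumed for illustration); the point is the closed-loop structure, where each prediction is fed back in as input for the next step.

```python
def generate(predict_next, seed, steps=100):
    """Roll a one-step predictor forward, feeding each output back in."""
    history = list(seed)
    for _ in range(steps):
        history.append(predict_next(history))  # prediction becomes input
    return history[len(seed):]

# Stand-in for model.predict: a perfect model of a slope-0.01 line.
slope = 0.01
predict_next = lambda h: h[-1] + slope

# Seed with [0.01, ..., 0.80] and generate the next 5 values.
out = generate(predict_next, seed=[0.01 * k for k in range(1, 81)], steps=5)
print([round(v, 2) for v in out])  # [0.81, 0.82, 0.83, 0.84, 0.85]
```

With a real LSTM, any small one-step error compounds around this loop, which is one reason closed-loop generation can drift away from a straight line even when the training loss looks small.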

    I think this linear pattern is so simple that the network doesn't need to be complicated. Is there anything wrong with my model architecture or my training method (the format of the input and output)? Or is my idea itself wrong? If so, what should I do to predict a sequence?

    Thanks!

Wuxia Zhao

May 31, 2015, 11:17:22 AM5/31/15
to keras...@googlegroups.com
PS:
my training dataset consists of lines with the same slope, which I think is the simplest possible case.