How can I output a non-negative prediction in a regression problem?

Chong Wang

Sep 24, 2016, 10:55:25 PM
to Keras-users
I am working on a "many-to-many" regression problem in which the predictions are time durations, so they should be non-negative. Y_train contains no negative values.

My Keras RNN is below. The issue is that once the training error decreases to a certain level, the model sometimes outputs negative predictions, which make no sense in my application.


model = Sequential()
model.add(LSTM(input_shape=(timestep_num, feat_num), output_dim=256, activation='tanh', return_sequences=True))
model.add(LSTM(output_dim=64, activation='tanh', return_sequences=True))
model.add(TimeDistributed(Dense(output_dim=1, activation='linear')))
model.compile(loss='mean_squared_error', optimizer=optimizer, metrics=['mean_squared_error'])


I think I probably need to try other activation functions, because 'linear' does not guarantee non-negative outputs. (I have tried 'relu' in place of 'linear' and 'tanh', but with 'relu' every prediction collapses to zero.)

Any suggestions?
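[Editor's note: one standard alternative for the output activation, not mentioned in the thread, is 'softplus', log(1 + e^x). Unlike 'relu' it is smooth and strictly positive everywhere, so predictions cannot collapse to exactly zero. A minimal numpy sketch of its behavior:]

```python
import numpy as np

def softplus(x):
    # Numerically stable softplus: log(1 + exp(x)) = max(x, 0) + log1p(exp(-|x|)).
    # Strictly positive for every input, unlike relu which is 0 for x <= 0.
    return np.maximum(x, 0.0) + np.log1p(np.exp(-np.abs(x)))

x = np.array([-10.0, -1.0, 0.0, 1.0, 10.0])
y = softplus(x)  # every entry is strictly positive
```

In Keras this would simply be activation='softplus' on the final Dense layer instead of 'linear'.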

anyt...@gmail.com

Nov 6, 2016, 8:28:56 AM
to Keras-users
Hi Chong,

I had a similar problem: some of my LSTM network's outputs were negative, especially when training on data that contained a lot of zeros. I found a way to prevent this behavior without affecting the predictions that were already good.

What I did was use the 'relu' activation function in the LSTM layers and additionally constrain the weights of the dense output layer to be non-negative.

So maybe you could try it like this:
from keras.constraints import nonneg

.....

model = Sequential()
model.add(LSTM(input_shape=(timestep_num, feat_num), output_dim=256, activation='relu', return_sequences=True))
model.add(LSTM(output_dim=64, activation='relu', return_sequences=True))
model.add(TimeDistributed(Dense(output_dim=1, activation='linear', W_constraint=nonneg())))
model.compile(loss='mean_squared_error', optimizer=optimizer, metrics=['mean_squared_error'])


I hope this might help you.
Cheers.