On 30.03.2017 00:59,
kevinand...@gmail.com wrote:
> This is the approach I am going for, but I am unable to access the
> parameters in the loss function. The indicators come from my input
> features, but I cannot see those in the Keras loss function - all I have
> is the y_true and y_pred Tensors to work with.
>
> I would need to somehow access the current input feature vector within
> the loss function - is this possible?
Two ways, actually. One is to stack the extra values as an additional dimension onto the target array you pass as `y` to `fit`, and then slice them back out of the `y_true` parameter inside the loss function. The other is to use a concatenate layer [1] to merge the input layer and your actual output into a single synthetic output, and then split those two components apart again from the `y_pred` argument inside your loss function.
Obviously, the second approach is much easier if you use the functional API.
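As a rough sketch of that second approach (the layer sizes, names and the placeholder loss body below are made up for illustration, not from your actual model), with the Keras 2 functional API it could look something like this:

from keras import backend as K
from keras.layers import Input, Dense, Concatenate
from keras.models import Model

# Hypothetical model: 10 input features, a single regression output.
inputs = Input(shape=(10,))
hidden = Dense(32, activation='relu')(inputs)
prediction = Dense(1)(hidden)

# Synthetic output: the real prediction concatenated with the raw inputs,
# so the loss function can see both through `y_pred`.
synthetic = Concatenate(axis=-1)([prediction, inputs])
model = Model(inputs=inputs, outputs=synthetic)

def loss(y_true, y_pred):
    prediction = y_pred[:, 0:1]  # the actual model output
    features = y_pred[:, 1:]     # the input features smuggled through
    # Placeholder loss: squared error weighted by the first input feature.
    return K.mean(features[:, 0:1] * K.square(y_true - prediction), axis=-1)

model.compile(optimizer='adam', loss=loss)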
Remember that the shape of your network's actual output doesn't have to agree exactly with the shape of the array you pass as the `y` parameter to the `Model.fit` method. As long as your loss function can deal with the mismatch, it will work. For the first case you can do something like the example below; the second case would slice `y_pred` apart in the same way.
def loss(y_true, y_pred):
    # y_pred.shape is (37, 42)
    # y_true.shape is (37, 42, 2)
    mask = y_true[:, :, 1]    # now mask.shape is (37, 42)
    y_true = y_true[:, :, 0]  # and y_true.shape is also (37, 42)
    return (… your actual loss function code goes here …)
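For completeness, the stacked target array for that first case could be prepared before calling `fit` roughly like this (the array names are hypothetical; the (37, 42) shape is just the one from the example above):

import numpy as np

# Hypothetical data: actual targets and a per-element indicator/mask,
# both matching the network's output shape.
targets = np.random.rand(37, 42)
mask = np.random.randint(0, 2, size=(37, 42)).astype('float32')

# Stack along a new last axis so the loss function can slice them apart.
y = np.stack([targets, mask], axis=-1)   # y.shape is (37, 42, 2)

# model.fit(x, y, ...)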
[1] https://keras.io/layers/merge/
--
Tomasz Melcer