I am creating a Python layer, to be used from a prototxt file, for a multi-label sigmoid cross-entropy loss. However, training stops suddenly when Caffe tries to create this particular loss layer. I have already checked that $PYTHONPATH contains the parent directory of the file holding my layer class.
Here is the code of the custom layer:
import caffe
import numpy as np
from scipy import special


class CustomSigmoidCrossEntropyLossLayer(caffe.Layer):

    def setup(self, bottom, top):
        # check for all inputs
        if len(bottom) != 2:
            raise Exception(
                'Need two inputs (scores and labels) to compute sigmoid cross-entropy loss.')

    def reshape(self, bottom, top):
        # check that the score and label dimensions match
        if bottom[0].count != bottom[1].count:
            raise Exception('Inputs must have the same dimension.')
        # the gradient has the same shape as either input
        self.diff = np.zeros_like(bottom[0].data, dtype=np.float32)
        # the layer outputs a single averaged scalar loss
        top[0].reshape(1)

    def forward(self, bottom, top):
        score = bottom[0].data
        label = bottom[1].data
        # numerically stable sigmoid cross-entropy:
        # max(x, 0) - x*y + log(1 + exp(-|x|)), averaged over the batch
        first_term = score * label
        second_term = np.maximum(score, 0)
        third_term = np.log(1 + np.exp(-np.absolute(score)))
        top[0].data[...] = np.sum(second_term - first_term + third_term) / bottom[0].num
        # cache the gradient for the backward pass
        sig = special.expit(score)
        self.diff = sig - label

    def backward(self, top, propagate_down, bottom):
        if propagate_down[0]:
            bottom[0].diff[...] = self.diff / bottom[0].num
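To make sure the math itself is sound, I verified the stable decomposition used in forward() against a naive cross-entropy computation outside Caffe (this is just a standalone sketch with random data, not part of the layer):

```python
import numpy as np
from scipy import special

rng = np.random.default_rng(0)
score = rng.normal(size=(8, 1000)).astype(np.float32)
label = rng.integers(0, 2, size=(8, 1000)).astype(np.float32)

# numerically stable per-element sigmoid cross-entropy,
# same decomposition as the layer's forward():
stable = np.maximum(score, 0) - score * label + np.log1p(np.exp(-np.abs(score)))

# naive reference: -[y*log(sig) + (1-y)*log(1-sig)]
sig = special.expit(score)
naive = -(label * np.log(sig) + (1 - label) * np.log(1 - sig))

print(np.allclose(stable, naive, atol=1e-4))
```

Both forms agree, so the per-element loss computation is not the problem.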
Please also find the logs attached; I am pasting only the last few lines here:
I1205 12:01:17.921049 7947 net.cpp:84] Creating Layer flatten_labels
I1205 12:01:17.921056 7947 net.cpp:406] flatten_labels <- label
I1205 12:01:17.921067 7947 net.cpp:380] flatten_labels -> flattened_label
I1205 12:01:17.921087 7947 net.cpp:122] Setting up flatten_labels
I1205 12:01:17.921094 7947 net.cpp:129] Top shape: 8 1000 (8000)
I1205 12:01:17.921098 7947 net.cpp:137] Memory required for data: 3706681088
I1205 12:01:17.921105 7947 layer_factory.hpp:77] Creating layer loss
Right after this "Creating layer loss" message, the process gets terminated.
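For completeness, the loss layer is declared in the prototxt roughly like this (the module name and the score blob name below are placeholders, not copied from my actual file; "flattened_label" matches the log above):

```
layer {
  name: "loss"
  type: "Python"
  bottom: "flattened_score"   # placeholder for the actual score blob
  bottom: "flattened_label"
  top: "loss"
  python_param {
    module: "custom_sigmoid_cross_entropy_loss_layer"  # .py file name, placeholder
    layer: "CustomSigmoidCrossEntropyLossLayer"
  }
  loss_weight: 1
}
```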