I'm training a net whose last Sigmoid layer is meant to be used only in the TEST phase, together with a SigmoidCrossEntropyLoss layer (which already applies a sigmoid internally). During training, that last Sigmoid layer prints its entire batch output to the console, and I don't understand why it does that.
Here is the relevant piece of the net:
layer {
  name: "sigmoid"
  type: "Sigmoid"
  bottom: "ip2"
  top: "en1neuron"
}
layer {
  name: "accuracy"
  type: "Accuracy"
  bottom: "en1neuron"
  bottom: "label"
  top: "accuracy"
  include {
    phase: TEST
  }
}
layer {
  name: "loss"
  type: "SigmoidCrossEntropyLoss"
  bottom: "ip2"
  bottom: "label"
  top: "loss"
}
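For context, my understanding is that Caffe treats any top blob that no other layer consumes as a network output and logs it at each display interval. Here the only consumer of "en1neuron" is the Accuracy layer, which is restricted to TEST, so during TRAIN the blob is an unconsumed output and gets printed. A minimal sketch of what I assume the fix looks like, restricting the Sigmoid layer itself to the TEST phase:

```
layer {
  name: "sigmoid"
  type: "Sigmoid"
  bottom: "ip2"
  top: "en1neuron"
  # assumption: with this include rule the layer is instantiated
  # only in TEST nets, so no unconsumed top exists during TRAIN
  include {
    phase: TEST
  }
}
```

With this rule in place the TRAIN net contains only "ip2", "label", and "loss", and the per-batch sigmoid output should no longer appear in the console log.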