layer {
    name: "data"
    top:  "data"
    top:  "label"
    type: "ImageData"
    include {
        phase: TEST
    }
    image_data_param {
        source: "/home/esterlein/Orlanet/test_data.txt"
        batch_size: 5
        new_height: 256
        new_width: 256
        shuffle: true
    }
    transform_param {
        crop_size: 227
        mirror: true
    }
}
layer {
    name: "data"
    top:  "data"
    top:  "label"
    type: "Input"
    input_param { shape: { dim: 1 dim: 3 dim: 256 dim: 256 }}
}
layer {
    name: "fc6"
    top:  "fc6"
    type: "InnerProduct"
    bottom: "drop5"
    inner_product_param {
        num_output: 2
        weight_filler {
            type: "xavier"
            std: 0.1
        }
    }
}
layer {
    name: "prob"
    top:  "prob"
    type: "SoftmaxWithLoss"
    bottom: "fc6"
    bottom: "label"
}
layer {
    name: "accuracy"
    top:  "accuracy"
    type: "Accuracy"
    bottom: "fc6"
    bottom: "label"
    include {
        phase: TEST
    }
}
Check failure stack trace:
... 
caffe::SoftmaxWithLossLayer<>::Reshape()
caffe::Net<>::Init() caffe::Net<>::Net()
...
Check failed: outer_num_ * inner_num_ == bottom[1]->count() (1 vs. 196608) Number of labels must match number of predictions; e.g., if softmax axis == 1 and prediction shape is (N, C, H, W), label count (number of labels) must be N*H*W, with integer values in {0, 1, ..., C-1}.
OK, 1x3x256x256 = 196608, but why do I need that many labels?
I have a "labels.txt" file, as in the "classification.cpp" example.
Why is the number of labels not equal to the number of classes?
What should I do with SoftmaxWithLoss and the input dimensions?
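For reference, here is the arithmetic behind the check that fails. This is only a sketch of what the error message describes, with the shapes taken from the network above: with softmax axis == 1 and a prediction blob of shape (N, C, H, W), SoftmaxWithLoss expects one integer label per prediction location, i.e. N*H*W labels in total, not one label per class.

```python
# Sketch of the label-count check from the error message.
# Shapes below are taken from the prototxt and the stack trace above.

# fc6 output: batch size N=1, C=2 classes (num_output: 2), spatial 1x1
pred_shape = (1, 2, 1, 1)
N, C, H, W = pred_shape

# With softmax axis == 1, the layer expects N*H*W integer labels,
# each in {0, ..., C-1}.
expected_label_count = N * H * W
print(expected_label_count)  # 1 label for this net

# The failing case: the "label" bottom was the 1x3x256x256 Input blob,
# so its element count was 1*3*256*256, which triggers the check.
actual_label_count = 1 * 3 * 256 * 256
print(actual_label_count)  # 196608
```

So the mismatch is not "labels vs. classes": the check compares the label blob's element count (196608, because the Input layer's data blob ended up wired in as the label) against the one label per image that the loss expects.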