Hi,
I am trying to train a very simple classification network on 13 classes defined as:
name: "myOwnNet"
layers {
  name: "data"
  type: HDF5_DATA
  top: "data"
  top: "label"
  hdf5_data_param {
    source: "/path/to/input.txt"
    batch_size: 1
  }
  include: { phase: TRAIN }
}
layers {
  name: "data"
  type: HDF5_DATA
  top: "data"
  top: "label"
  hdf5_data_param {
    source: "/path/to/input2.txt"
    batch_size: 1
  }
  include: { phase: TEST }
}
layers {
  name: "fc1"
  type: INNER_PRODUCT
  bottom: "data"
  top: "fc1"
  inner_product_param {
    num_output: 13
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 0
    }
  }
}
layers {
  name: "loss"
  type: SOFTMAX_LOSS
  bottom: "fc1"
  bottom: "label"
  top: "loss"
}
However, when I try to train it, I get the following error:
F0609 14:23:01.101641 12048 softmax_loss_layer.cpp:42] Check failed: outer_num_ * inner_num_ == bottom[1]->count() (1 vs. 13) Number of labels must match number of predictions; e.g., if softmax axis == 1 and prediction shape is (N, C, H, W), label count (number of labels) must be N*H*W, with integer values in {0, 1, ..., C-1}.
It looks like there is a dimension mismatch with my labels, but I cannot figure out where.
I have built the HDF5 file in MATLAB from a label matrix of 13 × 140 (where 140 is my number of samples).
I used code from #1746. Here is the HDF5 content as shown by h5disp in MATLAB:
Group '/'
Dataset 'data'
Size: 100x100x8x140
MaxSize: 100x100x8xInf
Datatype: H5T_IEEE_F64LE (double)
ChunkSize: 100x100x8x10
Filters: deflate(9)
FillValue: 0.000000
Dataset 'label'
Size: 13x140
MaxSize: 13xInf
Datatype: H5T_IEEE_F32LE (single)
ChunkSize: 13x10
Filters: deflate(9)
FillValue: 0.000000
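For clarity, here is a Python/h5py sketch of the same layout (file name hypothetical; I actually built the file in MATLAB). Note that MATLAB stores arrays column-major, so the sizes h5disp reports are the reverse of what row-major readers like Caffe see:

```python
import numpy as np
import h5py

num_samples = 140  # number of samples, matching the h5disp listing

# In C (row-major) order the datasets are (N, C, H, W) and (N, 13);
# MATLAB's h5disp shows these same datasets as 100x100x8x140 and 13x140.
data = np.zeros((num_samples, 8, 100, 100), dtype=np.float64)
label = np.zeros((num_samples, 13), dtype=np.float32)  # 13 label values per sample

with h5py.File("input.h5", "w") as f:  # hypothetical file name
    f.create_dataset("data", data=data, compression="gzip", compression_opts=9)
    f.create_dataset("label", data=label, compression="gzip", compression_opts=9)
```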
Here is what Caffe loads:
I0609 14:23:01.023202 12048 net.cpp:127] Top shape: 1 8 100 100 (80000)
I0609 14:23:01.023232 12048 net.cpp:127] Top shape: 1 13 (13)
Any idea what is wrong?
Thanks