Float multi-label of SoftmaxWithLossLayer

Wu Huan

Oct 7, 2017, 10:36:01 PM
to Caffe Users
Traditionally, the label is a single integer. At runtime it is expanded into a one-hot vector: if the label is 3, the target becomes `(0, 0, 0, 1, 0, 0, 0, 0, 0, 0)`. But now I have a probability vector instead, such as `(0.1, 0.1, 0.01, 0.78, 0.01, 0, 0, 0, 0, 0)`.

Because the labels are float values, I should use HDF5 instead of LMDB. But the `SoftmaxWithLoss` layer only supports a single integer label, not a float vector label.
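Since `HDF5Data` can carry arbitrary float blobs, the label dataset can be written as an N×10 matrix of probabilities. A minimal sketch with `h5py` (the filenames and toy shapes are illustrative, not from the original post):

```python
import numpy as np
import h5py

# Toy data: 5 samples of 784 features, with 10-class soft labels.
n, dim, classes = 5, 784, 10
data = np.random.rand(n, dim).astype(np.float32)

# Soft labels: random probability vectors (each row sums to 1).
label = np.random.rand(n, classes).astype(np.float32)
label /= label.sum(axis=1, keepdims=True)

with h5py.File("train.h5", "w") as f:
    f.create_dataset("data", data=data)    # matches top: "data"
    f.create_dataset("label", data=label)  # matches top: "label"

# h5_list.txt (the source: in hdf5_data_param below) should then
# contain the path to train.h5, one HDF5 file per line.
```

The dataset names must match the `top` blob names of the `HDF5Data` layer exactly.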

**How do I train the model in Caffe?**  

**train_prototxt**

    name: "LeNet"
    layer {
      name: "feature"
      type: "HDF5Data"
      top: "data"
      top: "label"
      include {
        phase: TRAIN
      }
      hdf5_data_param {
        source: "h5_list.txt" 
        batch_size: 1
      }
    }
    layer {
      name: "ip1"
      type: "InnerProduct"
      bottom: "data"
      top: "ip1"
      param {
        lr_mult: 1
      }
      param {
        lr_mult: 2
      }
      inner_product_param {
        num_output: 250
        weight_filler {
          type: "xavier"
        }
        bias_filler {
          type: "constant"
        }
      }
    }
    layer {
      name: "relu1"
      type: "ReLU"
      bottom: "ip1"
      top: "ip1"
    }
    layer {
      name: "ip2"
      type: "InnerProduct"
      bottom: "ip1"
      top: "ip2"
      param {
        lr_mult: 1
      }
      param {
        lr_mult: 2
      }
      inner_product_param {
        num_output: 10
        weight_filler {
          type: "xavier"
        }
        bias_filler {
          type: "constant"
        }
      }
    }
    layer {
      name: "loss"
      type: "SoftmaxWithLoss"
      bottom: "ip2"
      bottom: "label"
      top: "loss"
    }
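Since the stock `SoftmaxWithLoss` layer expects integer labels, one common workaround is to implement the soft-label cross-entropy yourself, e.g. in a Caffe Python layer. This is a numpy sketch of the loss such a layer would compute, not a drop-in Caffe layer:

```python
import numpy as np

def soft_softmax_loss(scores, target):
    """Cross-entropy between softmax(scores) and a soft target distribution.

    scores: (N, C) raw network outputs (here, the "ip2" blob)
    target: (N, C) probability vectors, each row summing to 1
    """
    # Numerically stable log-softmax: subtract the row max first.
    shifted = scores - scores.max(axis=1, keepdims=True)
    log_prob = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    # Cross-entropy against the soft target, averaged over the batch.
    return -(target * log_prob).sum(axis=1).mean()

scores = np.array([[2.0, 1.0, 0.1]])
one_hot = np.array([[0.0, 1.0, 0.0]])   # reduces to ordinary softmax loss
soft = np.array([[0.1, 0.8, 0.1]])      # the float-vector case asked about
```

With a one-hot `target` this reduces exactly to the ordinary softmax loss, and its gradient with respect to `scores` is `softmax(scores) - target`, which is what the layer's backward pass would return.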
