Accuracy of siamese Net does not change


Stefano Lombardi

Mar 8, 2016, 3:40:33 AM3/8/16
to Caffe Users
Hi, I have a siamese network with two twin nets, followed by a sum layer connected to a 2-output inner product and a final softmax layer.

I fixed the weights of the last fully connected layers because I want the network to learn the differences between the pairs of images only through the twin nets. The following are the last layers:

# elementwise difference feat - feat_p (coeffs +1 and -1)
layer {
  name: "diff"
  type: "Eltwise"
  bottom: "feat"
  bottom: "feat_p"
  top: "diff"
  eltwise_param {
    operation: SUM
    coeff: 1
    coeff: -1
  }
}

layer {
  name: "absdiff"
  bottom: "diff"
  top: "absdiff"
  type: "AbsVal"
}


# frozen (lr_mult: 0) score for class 0
layer {
  name: "dist1"
  type: "InnerProduct"
  bottom: "absdiff"
  top: "dist1"
  param {
    name: "exitfeat_w1"
    lr_mult: 0
  }
  param {
    name: "exitfeat_b1"
    lr_mult: 0
  }
  inner_product_param {
    num_output: 1
    weight_filler {
      type: "constant"
      value: -1.0
    }
    bias_filler {
      type: "constant"
      value: 1
    }
  }
}


# frozen (lr_mult: 0) score for class 1
layer {
  name: "dist2"
  type: "InnerProduct"
  bottom: "absdiff"
  top: "dist2"
  param {
    name: "exitfeat_w2"
    lr_mult: 0
  }
  param {
    name: "exitfeat_b2"
    lr_mult: 0
  }
  inner_product_param {
    num_output: 1
    weight_filler {
      type: "constant"
      value: 1.0
    }
    bias_filler {
      type: "constant"
      value: -1
    }
  }
}

layer {
  name: "concat"
  bottom: "dist1"
  bottom: "dist2"
  top: "out_concat"
  type: "Concat"
  concat_param {
    axis: 1
  }
}

layer {
  name: "loss"
  type: "SoftmaxWithLoss"
  bottom: "out_concat"
  bottom: "label"
  top: "loss"
}
layer {
  name: "accuracy"
  type: "Accuracy"
  bottom: "out_concat"
  bottom: "label"
  top: "accuracy"
  include: { phase: TEST }
}



Here feat and feat_p are the last inner product layers of the two twin nets.
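A quick sanity check on what those frozen final layers actually compute may help here. This is a NumPy sketch (not pycaffe, and the toy feature vectors are made up): with all dist1 weights at -1 and bias +1, and all dist2 weights at +1 and bias -1, the two logits collapse to a function of the L1 distance s = sum(|feat - feat_p|), so the softmax decision is a fixed threshold at s = 1.

```python
import numpy as np

def fixed_logits(feat, feat_p):
    # dist1 = (-1)·absdiff + 1 = 1 - s ; dist2 = (+1)·absdiff - 1 = s - 1
    s = np.abs(feat - feat_p).sum()
    return np.array([1.0 - s, s - 1.0])  # [dist1, dist2] after Concat

def predict(feat, feat_p):
    # softmax is monotonic, so argmax over the logits gives the prediction:
    # class 1 iff the L1 distance s exceeds 1
    return int(np.argmax(fixed_logits(feat, feat_p)))

# a pair with small L1 distance (s = 0.2 < 1) falls on the class-0 side
f  = np.array([0.2, 0.1])
fp = np.array([0.3, 0.2])
print(predict(f, fp))  # -> 0
```

If the twin nets never push the L1 distances of the two classes to opposite sides of that fixed s = 1 boundary (e.g. because the feature scale drifts), the loss can still move while accuracy stays pinned near 0.5.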

During the training/testing phase the loss seems to change, but the accuracy does not: it stays constant at 0.5:

I0308 09:35:10.852556 14981 solver.cpp:408]     Test net output #0: accuracy = 0.50005
I0308 09:35:10.852681 14981 solver.cpp:408]     Test net output #1: loss = 1.06415 (* 1 = 1.06415 loss)
I0308 09:35:12.723263 14981 solver.cpp:236] Iteration 500, loss = 0.492998
I0308 09:35:12.723312 14981 solver.cpp:252]     Train net output #0: loss = 0.492998 (* 1 = 0.492998 loss)
I0308 09:35:12.723325 14981 sgd_solver.cpp:106] Iteration 500, lr = 0.00964069
I0308 09:35:33.345005 14981 solver.cpp:236] Iteration 600, loss = 0.58549
I0308 09:35:33.345042 14981 solver.cpp:252]     Train net output #0: loss = 0.58549 (* 1 = 0.58549 loss)
I0308 09:35:33.345052 14981 sgd_solver.cpp:106] Iteration 600, lr = 0.0095724
I0308 09:35:50.429996 14981 solver.cpp:236] Iteration 700, loss = 0.424228
I0308 09:35:50.430145 14981 solver.cpp:252]     Train net output #0: loss = 0.424228 (* 1 = 0.424228 loss)
I0308 09:35:50.430158 14981 sgd_solver.cpp:106] Iteration 700, lr = 0.00950522
I0308 09:36:04.644081 14981 solver.cpp:236] Iteration 800, loss = 0.70144
I0308 09:36:04.644130 14981 solver.cpp:252]     Train net output #0: loss = 0.70144 (* 1 = 0.70144 loss)
I0308 09:36:04.644148 14981 sgd_solver.cpp:106] Iteration 800, lr = 0.00943913
I0308 09:36:26.321671 14981 solver.cpp:236] Iteration 900, loss = 0.617041
I0308 09:36:26.321853 14981 solver.cpp:252]     Train net output #0: loss = 0.617041 (* 1 = 0.617041 loss)
I0308 09:36:26.321877 14981 sgd_solver.cpp:106] Iteration 900, lr = 0.00937411
I0308 09:36:39.365902 14981 solver.cpp:461] Snapshotting to binary proto file siamese_baseline_iter_1000.caffemodel
I0308 09:36:39.516826 14981 sgd_solver.cpp:269] Snapshotting solver state to binary proto file siamese_baseline_iter_1000.solverstate
I0308 09:36:39.528353 14981 solver.cpp:340] Iteration 1000, Testing net (#0)
I0308 09:36:58.519906 14981 solver.cpp:408]     Test net output #0: accuracy = 0.4998
I0308 09:36:58.520026 14981 solver.cpp:408]     Test net output #1: loss = 0.858909 (* 1 = 0.858909 loss)
I0308 09:36:58.551651 14981 solver.cpp:236] Iteration 1000, loss = 0.462376
I0308 09:36:58.551690 14981 solver.cpp:252]     Train net output #0: loss = 0.462376 (* 1 = 0.462376 loss)
I0308 09:36:58.551703 14981 sgd_solver.cpp:106] Iteration 1000, lr = 0.00931012



Any suggestions? Thanks a lot!

levat...@gmail.com

Apr 18, 2016, 7:18:23 AM4/18/16
to Caffe Users
Hi, I want to ask you a question. Following your setup, could I write it like this, with "absdiff" as the bottom? And should the fc layer's num_output perhaps be 2?
layer {
  name: "accuracy"
  type: "Accuracy"
  bottom: "absdiff"
  bottom: "label"
  top: "accuracy"
  include: { phase: TEST }
}

On Tuesday, March 8, 2016 at 4:40:33 PM UTC+8, Stefano Lombardi wrote:

anis....@shopedia.fr

Nov 8, 2016, 5:18:55 AM11/8/16
to Caffe Users
Hi, please, does the label take 1 or 0?