I just ran into another problem. In my net I also train with triplets: I converted the CIFAR-10 dataset into two sets, one containing the anchor images and the other containing the similar/dissimilar images for each anchor, and then converted both to LMDB.
When I start training, the network initialization looks fine, but when it runs the testing iteration before training it hangs, as shown in the log below. How is your net coming along? I suspect something is wrong with my LMDB files, but I just can't find the problem.
I1119 18:29:24.176877 7445 net.cpp:80] Setting up loss_p
I1119 18:29:24.176889 7445 net.cpp:83] Top shape: 1 1 1 1 (1)
I1119 18:29:24.176899 7445 net.cpp:90] Memory required for data: 25171972
I1119 18:29:24.176909 7445 net.cpp:126] loss_p needs backward computation.
I1119 18:29:24.176919 7445 net.cpp:157] This network produces output loss_p
I1119 18:29:24.177197 7445 net.cpp:418] Collecting Learning Rate and Weight Decay.
I1119 18:29:24.177270 7445 net.cpp:168] Network initialization done.
I1119 18:29:24.177276 7445 net.cpp:169] Memory required for data: 25171972
I1119 18:29:24.178000 7445 solver.cpp:53] Solver scaffolding done.
I1119 18:29:24.178010 7445 solver.cpp:173] Solving
I1119 18:29:24.178155 7445 solver.cpp:243] test_net_id: 1
I1119 18:29:24.178163 7445 solver.cpp:303] Iteration 0, Testing net (#0)
I1119 18:29:24.178176 7445 net.cpp:612] Copying source layer cifar10
I1119 18:29:24.178181 7445 net.cpp:613] Ignoring source layer cifar10
I1119 18:29:24.179338 7445 solver.cpp:314] 10
I1119 18:29:24.179355 7445 net.cpp:495] FORWARD
I1119 18:29:24.179361 7445 net.cpp:482] LAYERS: 195
I1119 18:29:24.179368 7445 net.cpp:455] end: 194
E1119 18:29:24.179376 7445 net.cpp:458] Forwarding nuswide
I1119 18:29:24.179574 7445 net.cpp:460] bottomvector:0
I1119 18:29:24.179587 7445 data_layer_tri.cpp:459] prefetch_data_.count:
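For reference, this is roughly how my two data layers are set up (a simplified sketch, not my exact prototxt; the layer names, LMDB paths, and the custom `DATA_TRI` type are placeholders for my own definitions). Since the net hangs inside the data layer's prefetch, I wonder whether the two LMDBs must have exactly the same number of entries and the same batch size, so that anchor i and its sim/dis pair stay aligned:

```
# Anchor images
layers {
  name: "cifar10_anchor"
  type: DATA
  top: "data_anchor"
  top: "label"
  data_param {
    source: "cifar10_anchor_lmdb"   # placeholder path
    backend: LMDB
    batch_size: 64                  # must match the pair layer below
  }
  include { phase: TEST }
}
# Similar/dissimilar images, read by the custom triplet data layer
layers {
  name: "cifar10_pairs"
  type: DATA_TRI                    # placeholder for the custom layer type
  top: "data_sim"
  top: "data_dis"
  data_param {
    source: "cifar10_pairs_lmdb"    # placeholder path
    backend: LMDB
    batch_size: 64                  # mismatched batch sizes or entry
                                    # counts could stall prefetch
  }
  include { phase: TEST }
}
```

If either LMDB has fewer entries than the other, or the batch sizes differ, the prefetch thread could block waiting for data, which might explain why it stops right at `prefetch_data_.count`.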