How to define a layer with the 'include' parameter

王驰航

Jul 7, 2015, 11:43:24 AM
to caffe...@googlegroups.com
In Python we can do something like this:
 
    n = caffe.NetSpec()
    n.data, n.label = L.Data(batch_size=batch_size, backend=P.Data.LMDB, source=lmdb,
                             transform_param=dict(scale=1./255), ntop=2)
    n.conv1 = L.Convolution(n.data, kernel_size=5, num_output=20, weight_filler=dict(type='xavier'))

But how do we define a layer that has the flag include: { phase: TEST }?

I tried something like:

n.accuracy = L.Accuracy(n.ip1, n.label, include=dict(phase=caffe.TEST))

But apparently that doesn't work.

I-Chao Shen

Aug 31, 2015, 10:58:23 AM
to Caffe Users
I have the same issue.
My problem is that I want to specify different data layers for train and test, respectively.
However, when I traced the code in caffe_pb2 I found that 'include' is available, but I could not use it successfully.

KaitBristowe

Aug 31, 2015, 1:48:42 PM
to Caffe Users
I don't think you can include TEST/TRAIN phases. I would just make a train net and a test net, then specify both of them in the solver prototxt:

net: "train net"
test_net: "test_net"
test_interval: X
test_iter: X
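
A sketch of how that workflow might look from Python, in case it helps; the make_net helper, the toy single-layer net, the train_lmdb/test_lmdb sources and the solver values below are illustrative placeholders, not anything from this thread:

import caffe
from caffe import layers as L, params as P
from caffe.proto import caffe_pb2

def make_net(lmdb_path, batch_size, with_accuracy=False):
    # build one phase-specific net; NetSpec writes it out as a prototxt string
    n = caffe.NetSpec()
    n.data, n.label = L.Data(batch_size=batch_size, backend=P.Data.LMDB, source=lmdb_path,
                             transform_param=dict(scale=1./255), ntop=2)
    n.ip1 = L.InnerProduct(n.data, num_output=10, weight_filler=dict(type='xavier'))
    n.loss = L.SoftmaxWithLoss(n.ip1, n.label)
    if with_accuracy:
        n.accuracy = L.Accuracy(n.ip1, n.label)
    return n.to_proto()

with open('train_net.prototxt', 'w') as f:
    f.write(str(make_net('train_lmdb', 64)))
with open('test_net.prototxt', 'w') as f:
    f.write(str(make_net('test_lmdb', 100, with_accuracy=True)))

# solver that points at the two separate nets (train_net/test_net instead of a single net)
s = caffe_pb2.SolverParameter()
s.train_net = 'train_net.prototxt'
s.test_net.append('test_net.prototxt')   # test_net is a repeated field
s.test_interval = 500                    # test every 500 training iterations
s.test_iter.append(100)                  # run 100 test batches per test pass
s.base_lr = 0.01
s.max_iter = 10000
with open('solver.prototxt', 'w') as f:
    f.write(str(s))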

徐珍琦

Sep 15, 2015, 10:56:18 PM
to Caffe Users
I'm facing the same problem. How do you deal with this issue?

On Tuesday, July 7, 2015 at 11:43:24 PM UTC+8, Chihang Wang wrote:

Georg Waltner

Oct 2, 2015, 2:57:27 AM
to Caffe Users
This should work:

n.accuracy = L.Accuracy(n.ip1, n.label, include=[dict(phase=1)])

HTH, Georg.
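
For what it's worth, phase=1 is just the numeric value of caffe.TEST (caffe.TRAIN is 0), so the symbolic constant can be used instead of the literal. A minimal sketch, where the 'train_lmdb' source and the ip1 layer are placeholders along the lines of the first post:

import caffe
from caffe import layers as L, params as P

n = caffe.NetSpec()
n.data, n.label = L.Data(batch_size=64, backend=P.Data.LMDB, source='train_lmdb',
                         transform_param=dict(scale=1./255), ntop=2)
n.ip1 = L.InnerProduct(n.data, num_output=10, weight_filler=dict(type='xavier'))
# 'include' is a repeated NetStateRule field, hence the list of dicts;
# caffe.TRAIN == 0 and caffe.TEST == 1, so phase=1 means the TEST phase
n.accuracy = L.Accuracy(n.ip1, n.label, include=[dict(phase=caffe.TEST)])
print(n.to_proto())  # the Accuracy layer should come out with "include { phase: TEST }"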

Shai Bagon

Mar 8, 2016, 10:17:14 AM
to Caffe Users
How can I do this for an input layer? That is, having a data layer with the SAME top name for both the TRAIN and TEST phases?

Jeremy Rutman

Mar 17, 2016, 12:30:57 PM
to Caffe Users
I'll bump this question, as I have the same issue.
Obviously there will be a conflict if you do

n.data, n.label = L.Data(batch_size=batch_size, backend=P.Data.LMDB, source=test_lmdb,
                             transform_param=dict(scale=1./255), ntop=2,include=[dict(phase=1)])
and then try to define the same variables using a different db with include=[dict(phase=0)].
The mechanism by which pycaffe allows multiple uses of the same key (a repeated field) is a list - for instance, to get
  transform_param {
    scale: 0.00392156862745
    mean_value: 112
    mean_value: 123
    mean_value: 136
  }
we write transform_param=dict(scale=1./255, mean_value=[meanB, meanG, meanR]).
So conceivably, to get multiple layer definitions with the same name, one could analogously try a list:

n.data, n.label = [L.Data(batch_size=batch_size, backend=P.Data.LMDB, source=train_lmdb,
                          transform_param=dict(scale=1./255), ntop=2, include=[dict(phase=0)]),
                   L.Data(batch_size=batch_size, backend=P.Data.LMDB, source=test_lmdb,
                          transform_param=dict(scale=1./255), ntop=2, include=[dict(phase=1)])]


I will try it as soon as I obtain two donuts.

Jeremy Rutman

Mar 17, 2016, 1:06:28 PM
to Caffe Users
On second thought, that will not work, as a, b = f(x) can't be replaced by a, b = [f(x1), f(x2)].
Furthermore, there are no donuts.

Marina V

May 26, 2016, 4:10:20 AM
to Caffe Users


Hey, have you figured this one out?

Jeremy Rutman

Jul 4, 2016, 11:06:32 AM
to Caffe Users
I think I wound up using Lisa's answer, namely separate definitions of the training and test nets.
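
For anyone who still wants a single train_val prototxt with two data layers sharing the same top names, one workaround that seems to work is to build the TEST-phase data layer in its own NetSpec and concatenate the generated prototxt strings. A sketch under that assumption, with hypothetical train_lmdb/test_lmdb paths and a toy single-layer net:

import caffe
from caffe import layers as L, params as P

def data_layer(source, batch_size, phase):
    # a separate NetSpec per phase lets both layers reuse the top names "data"/"label"
    s = caffe.NetSpec()
    s.data, s.label = L.Data(batch_size=batch_size, backend=P.Data.LMDB, source=source,
                             transform_param=dict(scale=1./255), ntop=2,
                             include=[dict(phase=phase)])
    return s

train = data_layer('train_lmdb', 64, caffe.TRAIN)
test = data_layer('test_lmdb', 100, caffe.TEST)

# the rest of the net hangs off the TRAIN spec's tops
train.ip1 = L.InnerProduct(train.data, num_output=10, weight_filler=dict(type='xavier'))
train.loss = L.SoftmaxWithLoss(train.ip1, train.label)
train.accuracy = L.Accuracy(train.ip1, train.label, include=[dict(phase=caffe.TEST)])

# concatenating the two prototxt strings yields one net definition with two data
# layers that share top names but carry different include phases
with open('train_val.prototxt', 'w') as f:
    f.write(str(test.to_proto()) + str(train.to_proto()))

The solver can then reference this single file with net: "train_val.prototxt".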