It reuses conv1 through conv5 and redefines fc6 to fc8 as fc6-conv to fc8-conv, changing their type from "InnerProduct" to "Convolution" (this way you do not need the net_surgery step, which is a bit artificial; after all, "a fully connected layer is just a 1x1 convolution", as Yann LeCun once put it).
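To see why the quote holds, here is a minimal numpy sketch (shapes and weights are made up for illustration) showing that an InnerProduct layer applied to a 1x1 feature map gives the same output as the same weights reshaped into a 1x1 convolution kernel:

```python
import numpy as np

# Hypothetical small dimensions; a real fc6 would be much larger.
c_in, c_out = 8, 5
rng = np.random.default_rng(0)
x = rng.standard_normal(c_in)           # flattened input features
W = rng.standard_normal((c_out, c_in))  # InnerProduct weight matrix

fc_out = W @ x                          # fully connected output

# The same weights viewed as a Convolution with a 1x1 spatial kernel,
# applied to the input viewed as a (c_in, 1, 1) feature map:
x_map = x.reshape(c_in, 1, 1)
K = W.reshape(c_out, c_in, 1, 1)
conv_out = np.einsum('oihw,ihw->o', K, x_map)

print(np.allclose(fc_out, conv_out))    # the two outputs match
```

This is exactly the weight reshaping that net_surgery performs after the fact, and that renaming the layers in the prototxt avoids.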
--
You received this message because you are subscribed to the Google Groups "Caffe Users" group.
# w, h are the reference bounding-box width and height; pd = [x1, y1, x2, y2]
# is the ground-truth box. The return values are (imheight, imwidth) matrices;
# you can get the per-pixel IoU by computing SI / SU.
import numpy as np

def compute_pascal_mapping(pd, w, h, imwidth, imheight):
    # Reference box of size (w, h) centred at every pixel of the image
    boxes = np.mgrid[0:imheight, 0:imwidth]
    bx1 = boxes[1] - w / 2
    by1 = boxes[0] - h / 2
    bx2 = boxes[1] + w / 2
    by2 = boxes[0] + h / 2
    SA = h * w                              # area of the reference box
    SB = (pd[3] - pd[1]) * (pd[2] - pd[0])  # area of the ground-truth box
    min2 = np.minimum(bx2, pd[2])
    max0 = np.maximum(bx1, pd[0])
    min3 = np.minimum(by2, pd[3])
    max1 = np.maximum(by1, pd[1])
    SI = np.maximum(0, min2 - max0) * np.maximum(0, min3 - max1)  # intersection
    SI = SI.astype(float)                   # np.float was removed in NumPy 1.24
    SU = SA + SB - SI                       # union
    return SI, SU, SA, SB
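As a sanity check on the arithmetic, here is the same intersection/union computation worked out by hand for a single pair of boxes (the coordinates are made up for illustration):

```python
# Reference box: 10x10 centred at (10, 10) -> corners [5, 5, 15, 15]
bx1, by1, bx2, by2 = 5, 5, 15, 15
# Hypothetical ground-truth box pd = [x1, y1, x2, y2]
pd = [8, 8, 20, 20]

SA = (bx2 - bx1) * (by2 - by1)          # 10 * 10 = 100
SB = (pd[2] - pd[0]) * (pd[3] - pd[1])  # 12 * 12 = 144
# Overlap is 7 pixels wide (min(15,20) - max(5,8)) and 7 pixels tall
SI = max(0, min(bx2, pd[2]) - max(bx1, pd[0])) * \
     max(0, min(by2, pd[3]) - max(by1, pd[1]))  # 7 * 7 = 49
SU = SA + SB - SI                       # 100 + 144 - 49 = 195

print(SI / SU)                          # IoU = 49/195, about 0.251
```

The matrices returned by compute_pascal_mapping hold exactly these quantities, just evaluated for a reference box centred at every pixel at once via numpy broadcasting.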