How to use MemoryData with Pycaffe


Mark Verleg

May 10, 2015, 9:02:22 PM5/10/15
to caffe...@googlegroups.com
This is based on https://github.com/BVLC/caffe/issues/2246, which turned out not to be a bug, so it's better discussed here.

Basically, the way I thought data was provided to a MemoryData layer (in pycaffe) is apparently not the correct way. I assumed it would work the same as when the input is declared without an explicit data layer, but I can't find out how to do it when the input is a layer...

--- from the issue ---

With code like this:

import caffe
import numpy as np

net = caffe.Net(network_path, caffe.TRAIN)
data = np.array([[sample] for sample in samples[:128]])  # samples are 70x70 ndarrays

And network that starts like:

name: "pymemorydatanet"
layer {
  name: "data"
  type: "MemoryData"
  top: "data"
  top: "label"
  memory_data_param {
    batch_size: 128
    channels: 1
    height: 70
    width: 70
  }
}
layer {
  name: "conv1"
...

I get the error

File "/home/mark/mlip1/caffe/python/caffe/pycaffe.py", line 84, in _Net_forward
  raise Exception('Input blob arguments do not match net inputs.')

I found that this is because list(net._inputs) is [], while I think it should be [0].

Indeed, for a network like

name: "CaffeNet"
input: "data"
input_dim: 1
input_dim: 3
input_dim: 227
input_dim: 227
layer {
  name: "conv1"
...

list(net._inputs) = [0] and things work.
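[Editor's note: for a net whose first layer is MemoryData, pycaffe exposes `Net.set_input_arrays` rather than the `net.forward(data=...)` path, which is why `net._inputs` is empty here. A minimal sketch of the array preparation that binding expects (4-d, float32, C-contiguous; the random samples and the label layout are illustrative, and the caffe calls are commented out since no net is constructed here):]

```python
import numpy as np

# Shape samples into the (N, C, H, W) layout the prototxt declares:
# batch_size=128, channels=1, height=70, width=70.
samples = [np.random.rand(70, 70) for _ in range(128)]  # stand-ins for real data
data = np.ascontiguousarray(
    np.array([[sample] for sample in samples]), dtype=np.float32)

# Labels must also be float32; older pycaffe bindings wanted them 4-d too.
labels = np.zeros((128, 1, 1, 1), dtype=np.float32)

assert data.shape == (128, 1, 70, 70)

# With a net whose first layer is MemoryData (not runnable here):
# net = caffe.Net(network_path, caffe.TRAIN)
# net.set_input_arrays(data, labels)
# net.forward()  # consumes one batch_size-sized slice of the arrays
```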

Thomas

Jul 6, 2015, 5:00:00 PM7/6/15
to caffe...@googlegroups.com
I have the same issue with an HDF5 data layer when I load the network.
Did you find a workaround to load the data in pycaffe?

Mark Verleg

Jul 6, 2015, 5:09:15 PM7/6/15
to caffe...@googlegroups.com
Hello,

Well, technically yes... I wrote Python code to create lmdb databases dynamically, then used the `input_dim` style without a data layer. It worked, but it's obviously ugly, convoluted and slow.
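[Editor's note: the `input_dim` style feeds the net through its input blobs, so `net._inputs` is populated and `forward` accepts keyword arguments. A sketch of that style, with names taken from the CaffeNet snippet above (the caffe calls are commented out since no net is constructed here):]

```python
import numpy as np

# One image shaped to the declared input_dim: 1 x 3 x 227 x 227.
x = np.zeros((1, 3, 227, 227), dtype=np.float32)

# With a net declared via `input: "data"` (not runnable here):
# net = caffe.Net(network_path, caffe.TEST)
# out = net.forward(data=x)          # kwargs are matched against net.inputs
# # or, equivalently, write into the blob and call forward with no args:
# net.blobs['data'].data[...] = x
# out = net.forward()
```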

I stopped using Caffe after that project, so I don't know if there's been any development.

Good luck!

Mark