Hello,
How do you correctly specify a MemoryData layer using the caffe.NetSpec() functionality in python?
I am trying to do a simple regression using pycaffe, with the network defined below. The middle bits are not so important at the moment (they may be right or wrong); it is the MemoryData layers that cause problems.
My goal is to train the network so that its prediction, computed from the data supplied by the first MemoryData layer, matches the target values supplied by the second MemoryData layer.
n = caffe.NetSpec()
n.data = caffe.layers.MemoryData(batch_size=Nbatch, channels=n_statevar, height=1, width=1)
n.target = caffe.layers.MemoryData(batch_size=Nbatch, channels=n_actions, height=1, width=1)
n.data_activator = caffe.layers.TanH(n.data)
n.fc1 = caffe.layers.InnerProduct(n.data_activator, num_output=64, weight_filler=dict(type='xavier'))
n.fc1_activator = caffe.layers.TanH(n.fc1)
n.fc2 = caffe.layers.InnerProduct(n.fc1_activator, num_output=n_actions, weight_filler=dict(type='xavier'))
n.fc2_activator = caffe.layers.TanH(n.fc2)
n.predvalues = caffe.layers.InnerProduct(n.fc2_activator, num_output=n_actions)
n.loss = caffe.layers.EuclideanLoss(n.predvalues, n.target)
The above code writes a prototxt that looks OK, but it must be invalid somehow: when I try to load it into caffe.SGDSolver, it complains about the number of tops from the MemoryData layer:
F0806 10:13:40.205143 9295 layer.hpp:358] Check failed: ExactNumTopBlobs() == top.size() (2 vs. 1) MemoryData Layer produces 2 top blob(s) as output.
I have seen in C++ examples that a "label" blob is associated with the MemoryData layer, which I suppose I could just leave hanging/unused in this case (?). But how do I specify the prototxt correctly from Python? The resulting prototxt has the following layer definition:
layer {
  name: "data"
  type: "MemoryData"
  top: "data"
  memory_data_param {
    batch_size: 100
    channels: 2
    height: 1
    width: 1
  }
}
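For comparison, my understanding from the C++ examples is that a working MemoryData layer should declare two tops, the data plus a label blob, something like this (the "dummy_label" name is just my guess):

```
layer {
  name: "data"
  type: "MemoryData"
  top: "data"
  top: "dummy_label"
  memory_data_param {
    batch_size: 100
    channels: 2
    height: 1
    width: 1
  }
}
```

I suspect NetSpec needs to be told to emit two tops, perhaps via something like n.data, n.dummy_label = caffe.layers.MemoryData(..., ntop=2), but I haven't been able to confirm this.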
Suggestions would be greatly appreciated!
As the goal is to have a convenient way to load data into the solver and train it iteratively, any alternative route that avoids the MemoryData layer is also of interest.
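In case it helps: from what I can tell, pycaffe exposes net.set_input_arrays(data, labels) for feeding MemoryData layers, which expects 4-D, C-contiguous float32 arrays. Below is a sketch of how I would prepare my arrays for it (the sizes are my own Nbatch/n_statevar, n_actions is illustrative, and the caffe calls are left as comments since I have not got this working yet):

```python
import numpy as np

Nbatch, n_statevar, n_actions = 100, 2, 3  # my batch/state sizes; n_actions is illustrative

# Raw training data: one row per sample.
states = np.random.randn(Nbatch, n_statevar)
targets = np.random.randn(Nbatch, n_actions)

# MemoryData expects C-contiguous float32 arrays shaped (N, C, H, W),
# so reshape the 2-D sample matrices into 4-D blobs.
data = np.ascontiguousarray(
    states.reshape(Nbatch, n_statevar, 1, 1).astype(np.float32))
labels = np.ascontiguousarray(
    targets.reshape(Nbatch, n_actions, 1, 1).astype(np.float32))

# Then, presumably:
# solver.net.set_input_arrays(data, labels)
# solver.step(1)
```
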
Thanks