Training in Memory and then using trained model?


Charles West

Apr 5, 2016, 2:51:12 PM
to Caffe Users
Hello,

I'm trying to load data into a MemoryDataLayer, train a model, and then use the Forward function to run the trained model and actually do regression. The model appears to train (how well, I do not know), but the program segfaults when I call Forward to get output. I don't know whether the issue is in the architecture of the network or in my C++ code, and I would really appreciate any insight into what might be causing it. I really don't know where I'm going wrong (do you somehow need to change the model to eliminate the data layer and the validation layer?).
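
If a separate deploy-style net is the expected fix, the kind of thing I'm imagining is below. This is an untested sketch: sinApproximatorDeploy.prototxt and predict() are made-up names, and the deploy file would declare a raw 1x1x1x1 input blob in place of the data and loss layers.

#include <caffe/caffe.hpp>

//Untested sketch: run the trained weights through a deploy net that has no
//MemoryDataLayer. Assumes sinApproximatorDeploy.prototxt (a made-up file)
//declares an "input" blob of shape 1x1x1x1 and ends at the prediction layer.
float predict(caffe::Solver<float> &solver, float x)
{
    caffe::Net<float> deployNet("sinApproximatorDeploy.prototxt", caffe::TEST);

    //Borrow the weights from the net the solver just trained
    deployNet.ShareTrainedLayersWith(solver.net().get());

    //Fill the declared input blob and run a forward pass
    deployNet.input_blobs()[0]->mutable_cpu_data()[0] = x;
    deployNet.Forward(); //spelled ForwardPrefilled() on older Caffe versions

    return deployNet.output_blobs()[0]->cpu_data()[0];
}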

I've attached the .prototxt file for the solver, the C++ file for the program, and example output from a run.

Any help would be much appreciated.

Thanks,
Charlie West

#include <cmath>
#include <cstdio>
#include <memory>
#include <vector>

#include <caffe/caffe.hpp>
#include <caffe/layers/memory_data_layer.hpp>

using caffe::Caffe;

int main(int argc, char **argv)
{
    //Create training data as one big vector, as the library seems to prefer
    std::vector<float> inputVector;
    std::vector<float> desiredOutputVector;

    for(double x = 0.0; x <= 2.0*M_PI; x += .001)
    {
        inputVector.emplace_back(x);
        desiredOutputVector.emplace_back(.5*sin(x) + .5);
    }

    //Set to run on CPU
    Caffe::set_mode(Caffe::CPU);

    //Read the prototxt file to get the network structure
    caffe::SolverParameter solver_param;
    caffe::ReadProtoFromTextFileOrDie("sinApproximator.prototxt", &solver_param);

    //Create a solver using the loaded params
    std::unique_ptr<caffe::Solver<float> > solver(
        caffe::SolverRegistry<float>::CreateSolver(solver_param));

    //Load the data into the memory layer (layer 0 of the training net)
    caffe::MemoryDataLayer<float> &dataLayerImplementation =
        *boost::static_pointer_cast<caffe::MemoryDataLayer<float> >(
            solver->net()->layers()[0]);

    dataLayerImplementation.Reset(inputVector.data(), desiredOutputVector.data(),
                                  inputVector.size());

    //Train the network
    solver->Solve();

    //Try to get output from the trained network
    for(size_t index = 0; index < inputVector.size(); index++)
    {
        caffe::Blob<float> inputBlob(1, 1, 1, 1); //A single-example batch

        float *array = inputBlob.mutable_cpu_data();
        array[0] = inputVector[index];

        std::vector<caffe::Blob<float>*> inputBlobVector{&inputBlob};

        const std::vector<caffe::Blob<float>*> &result =
            solver->net()->Forward(inputBlobVector);

        float value = result[0]->cpu_data()[0];
        printf("We got a %f\n", value); //Segfaults here
    }

    return 0;
}
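
In case passing blobs into Forward() just isn't valid for a net whose first layer is a MemoryDataLayer, one workaround I'm considering is to Reset() the memory layer with the test inputs and call the no-argument Forward() instead. Below is an untested sketch that would replace the final loop above: it reuses solver, inputVector, and dataLayerImplementation from the code, dummyLabels is a made-up placeholder, and it assumes the count passed to Reset is a multiple of the layer's batch size.

//Untested sketch: feed test data through the same MemoryDataLayer rather than
//handing blobs to Forward(). Labels are dummies; only the data matters now.
std::vector<float> dummyLabels(inputVector.size(), 0.0f);
dataLayerImplementation.Reset(inputVector.data(), dummyLabels.data(),
                              inputVector.size());

//Each no-argument Forward() consumes one batch from the memory layer
//(spelled ForwardPrefilled() on older Caffe versions)
const std::vector<caffe::Blob<float>*> &out = solver->net()->Forward();

//Assuming the net's first output blob holds the prediction
printf("We got a %f\n", out[0]->cpu_data()[0]);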


log.txt
main.cpp
sinApproximator.prototxt