Hello world in C++ -> approximating a SIN function


Charles West

Apr 3, 2016, 11:55:40 PM
to Caffe Users
Hello all,

I'm just getting started with Caffe and am excited to get more familiar with the project. As part of that, I'm trying to learn the C++ API.

I know that the usual way of doing things is to write text descriptions of the protobuf objects and train them with the project's executable; I've put an untested sketch of what I mean by that route at the bottom of this post. However, I am really interested in using the C++ interface so that I can specify parameter sweeps and potentially generate training data automatically. As a first step, I'm trying to get a minimal example working by approximating a sin function. I know the ReLU network size is too small, but does the below look OK otherwise (it compiles)?

Thanks,
Charlie

#include <cmath>
#include <iostream>
#include <memory>
#include <vector>

#include <caffe/caffe.hpp>
#include <caffe/layers/memory_data_layer.hpp>

using caffe::Caffe;

int main(int argc, char **argv)
{
    //Create the training data as one big vector, as the library seems to prefer.
    //The target is scaled into [0, 1]: y = 0.5*sin(x) + 0.5
    std::vector<float> inputVector;
    std::vector<float> desiredOutputVector;
    for(double x = 0.0; x <= 2.0*M_PI; x += .001)
    {
        inputVector.emplace_back(x);
        desiredOutputVector.emplace_back(.5*sin(x) + .5);
    }

    Caffe::set_mode(Caffe::CPU);

    //Set layer characteristics, mirroring what a prototxt definition would contain
    caffe::LayerParameter dataLayer;
    dataLayer.set_name("data");
    dataLayer.set_type("MemoryData");
    (*dataLayer.add_top()) = "data";
    (*dataLayer.add_top()) = "expected";
    caffe::MemoryDataParameter &memoryDataParameters = (*dataLayer.mutable_memory_data_param());
    memoryDataParameters.set_batch_size(1); //Reset() later requires a multiple of this
    memoryDataParameters.set_channels(1);
    memoryDataParameters.set_height(1);
    memoryDataParameters.set_width(1);

    caffe::LayerParameter firstLayerProduct;
    firstLayerProduct.set_name("ip1");
    firstLayerProduct.set_type("InnerProduct");
    caffe::InnerProductParameter &firstLayerInnerProductParams = (*firstLayerProduct.mutable_inner_product_param());
    (*firstLayerProduct.add_bottom()) = "data";
    (*firstLayerProduct.add_top()) = "ip1";

    firstLayerInnerProductParams.set_num_output(1);
    caffe::FillerParameter &weightFiller = (*firstLayerInnerProductParams.mutable_weight_filler());
    weightFiller.set_type("xavier");
    caffe::FillerParameter &biasFiller = (*firstLayerInnerProductParams.mutable_bias_filler());
    biasFiller.set_type("constant");

    caffe::LayerParameter firstLayerActivation;
    firstLayerActivation.set_name("firstLayerActivation");
    firstLayerActivation.set_type("ReLU");
    (*firstLayerActivation.add_bottom()) = "ip1"; //In-place operation: bottom == top
    (*firstLayerActivation.add_top()) = "ip1";

    caffe::LayerParameter secondLayerProduct;
    secondLayerProduct.set_name("ip2");
    secondLayerProduct.set_type("InnerProduct");
    caffe::InnerProductParameter &secondLayerInnerProductParams = (*secondLayerProduct.mutable_inner_product_param());
    (*secondLayerProduct.add_bottom()) = "ip1";
    (*secondLayerProduct.add_top()) = "ip2";

    secondLayerInnerProductParams.set_num_output(1);
    caffe::FillerParameter &weightFiller2 = (*secondLayerInnerProductParams.mutable_weight_filler());
    weightFiller2.set_type("xavier");
    caffe::FillerParameter &biasFiller2 = (*secondLayerInnerProductParams.mutable_bias_filler());
    biasFiller2.set_type("constant");

    caffe::LayerParameter lossLayer;
    lossLayer.set_name("loss");
    lossLayer.set_type("SoftmaxWithLoss");
    (*lossLayer.add_bottom()) = "ip2";
    (*lossLayer.add_bottom()) = "expected";


    //Top/bottom blob names define the connectivity between layers
    caffe::NetParameter networkDefinition;
    networkDefinition.set_name("BobNet");

    (*networkDefinition.add_layer()) = dataLayer;
    (*networkDefinition.add_layer()) = firstLayerProduct;
    (*networkDefinition.add_layer()) = firstLayerActivation;
    (*networkDefinition.add_layer()) = secondLayerProduct;
    (*networkDefinition.add_layer()) = lossLayer;

    caffe::SolverParameter solver_param;
    (*solver_param.mutable_net_param()) = networkDefinition;

    std::unique_ptr<caffe::Solver<float> > solver(caffe::SolverRegistry<float>::CreateSolver(solver_param));

    //The MemoryData layer is the first layer in the net; hand it the training data
    caffe::MemoryDataLayer<float> &dataLayerImplementation = (*boost::static_pointer_cast<caffe::MemoryDataLayer<float> >(solver->net()->layers()[0]));
    dataLayerImplementation.Reset(inputVector.data(), desiredOutputVector.data(), inputVector.size());

    solver->Solve();
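
    //Untested sketch, beyond my original question: after training, sanity-check
    //the fit with one forward pass. I am assuming Net::blob_by_name() and a
    //second MemoryDataLayer::Reset() call behave the way I expect here.
    float testInput[1] = {1.0f};
    float testLabel[1] = {0.0f}; //MemoryData wants a label pointer; the value is unused here
    dataLayerImplementation.Reset(testInput, testLabel, 1);

    float forwardLoss = 0.0f;
    solver->net()->Forward(&forwardLoss);

    //"ip2" is the network's output blob
    const float *netOutput = solver->net()->blob_by_name("ip2")->cpu_data();
    std::cout << "f(1.0) ~= " << netOutput[0] << " (want " << (.5*sin(1.0) + .5) << ")" << std::endl;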

    return 0;
}
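
For reference, here is what I mean by "the usual way": as far as I understand it, the same NetParameter can be parsed from the prototxt text format instead of being built up call by call. This is an untested sketch, with the layers after the data layer elided:

#include <iostream>
#include <string>

#include <google/protobuf/text_format.h>

#include <caffe/caffe.hpp>

int main()
{
    //Same syntax as a .prototxt file, inlined as a string literal
    const std::string netText =
        "name: 'BobNet'\n"
        "layer {\n"
        "  name: 'data'\n"
        "  type: 'MemoryData'\n"
        "  top: 'data'\n"
        "  top: 'expected'\n"
        "  memory_data_param { batch_size: 1 channels: 1 height: 1 width: 1 }\n"
        "}\n";
    //...the InnerProduct/ReLU/loss layers would follow the same pattern...

    caffe::NetParameter networkDefinition;
    if(!google::protobuf::TextFormat::ParseFromString(netText, &networkDefinition))
    {
        std::cerr << "Failed to parse the net definition" << std::endl;
        return 1;
    }

    //networkDefinition can now go into a SolverParameter exactly as in my code above
    return 0;
}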


