"Trying to copy blobs of different sizes" in caffe

Michael Philippov

Jan 12, 2017, 6:17:25 AM1/12/17
to Caffe Users
Hello!
These are the input dimensions in my deploy file:

layer {
  name: "data"
  type: "Input"
  top: "data"
  input_param { shape: { dim: 64 dim: 1 dim: 28 dim: 28 } }
}

This is my code:

    Net<float> net("/home/phil/caffeData/lenet.prototxt", TEST);
    net.CopyTrainedLayersFrom("/home/phil/caffeData/_iter_5000.caffemodel");

    Datum datum;
    if (!ReadImageToDatum("/home/phil/caffeData/cat.jpg", 64, 28, 28, &datum))
    {
        qDebug() << QObject::tr("Cannot read the image!");
        return;
    }

    Blob<float>* blob = new Blob<float>(1, datum.channels(), datum.height(), datum.width());

    BlobProto proto;
    proto.set_num(1);
    proto.set_channels(datum.channels());
    proto.set_height(datum.height());
    proto.set_width(datum.width());

    const int data_size = datum.channels() * datum.height() * datum.width();
    int size_in_datum = std::max<int>(datum.data().size(), datum.float_data_size());

    for (int i = 0; i < size_in_datum; ++i)
    {
        proto.add_data(0.);
    }

    const string& data = datum.data();
    if (data.size() != 0)
    {
        for (int i = 0; i < size_in_datum; ++i)
        {
            proto.set_data(i, proto.data(i) + (uint8_t)data[i]);
        }
    }

    blob->FromProto(proto);

    vector<Blob<float>*> bottom;
    bottom.push_back(blob);
    float type = 0.0;
    const vector<Blob<float>*>& result = net.Forward(bottom, &type);

When I try to run it, I get the error "Trying to copy blobs of different sizes". What's wrong?

Przemek D

Jan 13, 2017, 9:48:34 AM1/13/17
to Caffe Users
This sounds like the typical error you get when loading weights from a caffemodel that was generated for a different prototxt. It would help if you attached the prototxt you are attempting to load, as well as the one you used to train your network.
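For illustration (the layer names and parameter values below are hypothetical, not taken from your post): CopyTrainedLayersFrom matches layers by name, and it raises "Trying to copy blobs of different sizes" when a layer with the same name has differently shaped weight blobs in the two definitions. A minimal sketch of such a mismatch:

```
# Training prototxt: "conv1" learns 20 filters,
# so its weight blob has shape 20 x 1 x 5 x 5.
layer {
  name: "conv1"
  type: "Convolution"
  bottom: "data"
  top: "conv1"
  convolution_param { num_output: 20 kernel_size: 5 }
}

# Deploy prototxt: same layer name, but 50 filters,
# so the caffemodel's 20-filter weight blob cannot be copied in.
layer {
  name: "conv1"
  type: "Convolution"
  bottom: "data"
  top: "conv1"
  convolution_param { num_output: 50 kernel_size: 5 }
}
```

The same failure occurs if the input shape differs (e.g. 3-channel color input at deploy time versus 1-channel grayscale at training time), since that changes the shape of the first layer's weights.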