Check failed: ShapeEquals(proto) shape mismatch (reshape not set)


Rafael Ruiz Muñoz

Jun 10, 2015, 11:46:16 AM
to caffe...@googlegroups.com
Hello.

I am getting the error below, and I have tried searching the Internet, but found nothing conclusive.

I trained my net successfully with around 82% of accuracy.

Now I'm trying to test it on an image with this command:

python python/classify.py --model_def examples/imagenet/imagenet_deploy.prototxt --pretrained_model caffe_mycaffe_train_iter_10000.caffemodel --images_dim 64,64 data/mycaffe/testingset/cat1/113.png foo --mean_file data/mycaffe/mycaffe_train_mean.binaryproto

Yes, my images are 64x64.

These are the last lines I'm getting:

I0610 15:33:44.868100 28657 net.cpp:194] conv3 does not need backward computation.
I0610 15:33:44.868110 28657 net.cpp:194] norm2 does not need backward computation.
I0610 15:33:44.868120 28657 net.cpp:194] pool2 does not need backward computation.
I0610 15:33:44.868130 28657 net.cpp:194] relu2 does not need backward computation.
I0610 15:33:44.868142 28657 net.cpp:194] conv2 does not need backward computation.
I0610 15:33:44.868152 28657 net.cpp:194] norm1 does not need backward computation.
I0610 15:33:44.868162 28657 net.cpp:194] pool1 does not need backward computation.
I0610 15:33:44.868173 28657 net.cpp:194] relu1 does not need backward computation.
I0610 15:33:44.868182 28657 net.cpp:194] conv1 does not need backward computation.
I0610 15:33:44.868192 28657 net.cpp:235] This network produces output fc8_pascal
I0610 15:33:44.868214 28657 net.cpp:482] Collecting Learning Rate and Weight Decay.
I0610 15:33:44.868238 28657 net.cpp:247] Network initialization done.
I0610 15:33:44.868249 28657 net.cpp:248] Memory required for data: 3136120
F0610 15:33:45.025965 28657 blob.cpp:458] Check failed: ShapeEquals(proto) shape mismatch (reshape not set)
*** Check failure stack trace: ***
Aborted (core dumped)

I've tried not setting --mean_file, among other things, but I'm out of ideas.

This is my imagenet_deploy.prototxt. I've modified some parameters while debugging, but nothing has worked.


name: "MyCaffe"
input: "data"
input_dim: 10
input_dim: 3
input_dim: 64
input_dim: 64
layer {
  name: "conv1"
  type: "Convolution"
  bottom: "data"
  top: "conv1"
  convolution_param {
    num_output: 64
    kernel_size: 11
    stride: 4
  }
}
layer {
  name: "relu1"
  type: "ReLU"
  bottom: "conv1"
  top: "conv1"
}
layer {
  name: "pool1"
  type: "Pooling"
  bottom: "conv1"
  top: "pool1"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "norm1"
  type: "LRN"
  bottom: "pool1"
  top: "norm1"
  lrn_param {
    local_size: 5
    alpha: 0.0001
    beta: 0.75
  }
}
layer {
  name: "conv2"
  type: "Convolution"
  bottom: "norm1"
  top: "conv2"
  convolution_param {
    num_output: 64
    pad: 2
    kernel_size: 5
    group: 2
  }
}
layer {
  name: "relu2"
  type: "ReLU"
  bottom: "conv2"
  top: "conv2"
}
layer {
  name: "pool2"
  type: "Pooling"
  bottom: "conv2"
  top: "pool2"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "norm2"
  type: "LRN"
  bottom: "pool2"
  top: "norm2"
  lrn_param {
    local_size: 5
    alpha: 0.0001
    beta: 0.75
  }
}
layer {
  name: "conv3"
  type: "Convolution"
  bottom: "norm2"
  top: "conv3"
  convolution_param {
    num_output: 384
    pad: 1
    kernel_size: 3
  }
}
layer {
  name: "relu3"
  type: "ReLU"
  bottom: "conv3"
  top: "conv3"
}
layer {
  name: "conv4"
  type: "Convolution"
  bottom: "conv3"
  top: "conv4"
  convolution_param {
    num_output: 384
    pad: 1
    kernel_size: 3
    group: 2
  }
}
layer {
  name: "relu4"
  type: "ReLU"
  bottom: "conv4"
  top: "conv4"
}
layer {
  name: "conv5"
  type: "Convolution"
  bottom: "conv4"
  top: "conv5"
  convolution_param {
    num_output: 64
    pad: 1
    kernel_size: 3
    group: 2
  }
}
layer {
  name: "relu5"
  type: "ReLU"
  bottom: "conv5"
  top: "conv5"
}
layer {
  name: "pool5"
  type: "Pooling"
  bottom: "conv5"
  top: "pool5"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "fc6"
  type: "InnerProduct"
  bottom: "pool5"
  top: "fc6"
  inner_product_param {
    num_output: 4096
  }
}
layer {
  name: "relu6"
  type: "ReLU"
  bottom: "fc6"
  top: "fc6"
}
layer {
  name: "drop6"
  type: "Dropout"
  bottom: "fc6"
  top: "fc6"
  dropout_param {
    dropout_ratio: 0.5
  }
}
layer {
  name: "fc7"
  type: "InnerProduct"
  bottom: "fc6"
  top: "fc7"
  inner_product_param {
    num_output: 4096
  }
}
layer {
  name: "relu7"
  type: "ReLU"
  bottom: "fc7"
  top: "fc7"
}
layer {
  name: "drop7"
  type: "Dropout"
  bottom: "fc7"
  top: "fc7"
  dropout_param {
    dropout_ratio: 0.5
  }
}
layer {
  name: "fc8_pascal"
  type: "InnerProduct"
  bottom: "fc7"
  top: "fc8_pascal"
  inner_product_param {
    num_output: 3
  }
}
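In case it helps with debugging: the "shape mismatch" check fires when a layer's saved weights in the .caffemodel don't match the shape that layer gets in the deploy net. A sketch (my own, not from the post) that propagates the 64x64 input through the conv/pool layers above, using the standard Caffe output-size formulas (floor for convolution, ceil for pooling), to see what flattened size reaches fc6:

```python
import math

def conv_out(size, kernel, stride=1, pad=0):
    """Caffe convolution output size (floor division)."""
    return (size + 2 * pad - kernel) // stride + 1

def pool_out(size, kernel, stride=1, pad=0):
    """Caffe pooling output size (ceil division)."""
    return int(math.ceil((size + 2 * pad - kernel) / float(stride))) + 1

s = 64                      # input_dim: 64
s = conv_out(s, 11, 4)      # conv1: kernel 11, stride 4 -> 14
s = pool_out(s, 3, 2)       # pool1: kernel 3, stride 2  -> 7
s = conv_out(s, 5, 1, 2)    # conv2: kernel 5, pad 2     -> 7
s = pool_out(s, 3, 2)       # pool2                      -> 3
s = conv_out(s, 3, 1, 1)    # conv3: kernel 3, pad 1     -> 3
s = conv_out(s, 3, 1, 1)    # conv4                      -> 3
s = conv_out(s, 3, 1, 1)    # conv5                      -> 3
s = pool_out(s, 3, 2)       # pool5                      -> 1
fc6_inputs = 64 * s * s     # conv5 num_output: 64 channels
print(fc6_inputs)           # 64, so fc6 weights must be 4096 x 64
```

If the training-time prototxt fed fc6 a different flattened size (for example, from different input dimensions or different layer parameters), the stored fc6 weight blob won't match and the load aborts with exactly this check failure.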

Could anyone give me a clue?
Thank you very much.