HDF5 mismatch from Caffe model


Antonio Paes

Apr 20, 2016, 2:21:24 PM
to Caffe Users
Hi guys, 

I'm trying to construct an H5 file from images in MATLAB using h5create() and h5write().

But when I start training in Caffe I get this error:

Check failed: blob->num_axes() == 4 (1 vs. 4) Blob must be 4 dim 

When training starts, the data layer shows this:

Setting up data_data_0_split
I0420 08:08:44.029693 14843 net.cpp:157] Top shape: 5 3 32 32 (15360)
I0420 08:08:44.029698 14843 net.cpp:157] Top shape: 5 3 32 32 (15360)

I think this is not correct, am I right?

thanks

Antonio Paes

Apr 20, 2016, 2:39:16 PM
to Caffe Users
This occurs when the deconvolution layer is created:

I0420 08:33:19.913432 28293 net.cpp:454] fc4 <- fc3
I0420 08:33:19.913439 28293 net.cpp:411] fc4 -> fc4
I0420 08:33:19.997107 28293 net.cpp:150] Setting up fc4
I0420 08:33:19.997143 28293 net.cpp:157] Top shape: 5 16384 (81920)
I0420 08:33:19.997148 28293 net.cpp:165] Memory required for data: 440320
I0420 08:33:19.997159 28293 layer_factory.hpp:76] Creating layer deconv
I0420 08:33:19.997172 28293 net.cpp:106] Creating Layer deconv
I0420 08:33:19.997177 28293 net.cpp:454] deconv <- data_data_0_split_1
I0420 08:33:19.997187 28293 net.cpp:411] deconv -> deconv
F0420 08:33:19.997534 28293 filler.hpp:249] Check failed: blob->num_axes() == 4 (1 vs. 4) Blob must be 4 dim.
*** Check failure stack trace: ***

Antonio Paes

Apr 22, 2016, 2:16:53 PM
to Caffe Users
Anybody?


On Wednesday, April 20, 2016 at 11:21:24 AM UTC-3, Antonio Paes wrote:

Jan

Apr 25, 2016, 2:28:06 PM
to Caffe Users
Well, are your HDF5 datasets 4D or not? Usually when you work with images, they should be. Check with HDFView, for example.
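
The same check can be done from Python with h5py; a minimal sketch, assuming the file is named train.h5:

import h5py

# Print the shape of every dataset in the file; Caffe's HDF5Data layer
# expects 4D datasets laid out as N x C x H x W.
with h5py.File('train.h5', 'r') as f:
    for name, dset in f.items():
        print(name, dset.shape, dset.dtype)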

Jan

Antonio Paes

Apr 25, 2016, 7:15:57 PM
to Caffe Users

Hi Jan,

Thanks for your answer. I checked my h5 file using h5ls, and the output looks correct:

data                     Dataset {32/Inf, 32, 3, 4}
label                    Dataset {32/Inf, 32, 3, 4}

This was generated in MATLAB; I'm trying Python now. Do you have an example of using images in Python?
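
For reference, a minimal sketch of writing a Caffe-compatible HDF5 file with h5py; the sizes and file name are assumptions, not the actual data from this thread:

import h5py
import numpy as np

# Caffe reads HDF5 row-major as N x C x H x W. MATLAB is column-major,
# so an array written with h5write() shows up with its axes reversed;
# with numpy/h5py no axis juggling is needed.
N, C, H, W = 100, 1, 32, 32                        # assumed sizes
data = np.zeros((N, C, H, W), dtype=np.float32)    # fill with input images
label = np.zeros((N, C, H, W), dtype=np.float32)   # fill with targets

with h5py.File('train.h5', 'w') as f:
    f.create_dataset('data', data=data)
    f.create_dataset('label', data=label)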

Jan

Apr 26, 2016, 9:23:57 AM
to Caffe Users
Ah, I noticed that you have one (or several) fc layers before that, which output a 2D blob (N, C), as your log tells you:


I0420 08:33:19.997107 28293 net.cpp:150] Setting up fc4
I0420 08:33:19.997143 28293 net.cpp:157] Top shape: 5 16384 (81920)
I0420 08:33:19.997148 28293 net.cpp:165] Memory required for data: 440320

And the deconv layer expects a 4D blob. So it is not related to the input of the network, but to the shape of the top/bottom blob of a hidden layer. You need to put a reshape layer in between the fc and the deconv layer to make the blob 4D again, with appropriate channel, width and height settings. See https://github.com/BVLC/caffe/blob/master/src/caffe/proto/caffe.proto#L971-L1033 for more info on the params.
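
As a shape sanity check, numpy can stand in for the Reshape layer; the 1 x 128 x 128 factorization of 16384 is an assumption, any C * H * W that multiplies to 16384 would do:

import numpy as np

# fc4 outputs a 2D blob of shape (N, 16384); since 16384 = 128 * 128 it
# can be viewed as a 4D blob (N, 1, 128, 128) before feeding a deconv.
N = 5
fc4 = np.zeros((N, 16384), dtype=np.float32)
print(fc4.reshape(N, 1, 128, 128).shape)   # (5, 1, 128, 128)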

Jan

Antonio Paes

Apr 26, 2016, 3:59:51 PM
to Caffe Users
Thanks Jan,

I definitely need a reshape layer, but the input of my deconvolution layer is "data", which is an image from HDF5. I concatenate that layer's output with the output of the fc layers; after this, the concatenation is processed by convolutional layers.

I am following this paper: http://arxiv.org/abs/1603.07235

Antonio Paes

Apr 26, 2016, 4:35:16 PM
to Caffe Users

This is the entire terminal output:

I0426 10:28:40.381045 52583 layer_factory.hpp:76] Creating layer data
I0426 10:28:40.381078 52583 net.cpp:106] Creating Layer data
I0426 10:28:40.381084 52583 net.cpp:411] data -> data
I0426 10:28:40.381104 52583 net.cpp:411] data -> label
I0426 10:28:40.381115 52583 hdf5_data_layer.cpp:79] Loading list of HDF5 filenames from: /home/antonio/face_hallucination/architectures/train.txt
I0426 10:28:40.381139 52583 hdf5_data_layer.cpp:93] Number of HDF5 files: 1
I0426 10:28:40.382208 52583 hdf5.cpp:32] Datatype class: H5T_FLOAT
I0426 10:28:40.522101 52583 net.cpp:150] Setting up data
I0426 10:28:40.522156 52583 net.cpp:157] Top shape: 128 1 33 33 (139392)
I0426 10:28:40.522168 52583 net.cpp:157] Top shape: 128 1 21 21 (56448)
I0426 10:28:40.522176 52583 net.cpp:165] Memory required for data: 783360
I0426 10:28:40.522189 52583 layer_factory.hpp:76] Creating layer data_data_0_split
I0426 10:28:40.522209 52583 net.cpp:106] Creating Layer data_data_0_split
I0426 10:28:40.522220 52583 net.cpp:454] data_data_0_split <- data
I0426 10:28:40.522240 52583 net.cpp:411] data_data_0_split -> data_data_0_split_0
I0426 10:28:40.522266 52583 net.cpp:411] data_data_0_split -> data_data_0_split_1
I0426 10:28:40.522325 52583 net.cpp:150] Setting up data_data_0_split
I0426 10:28:40.522339 52583 net.cpp:157] Top shape: 128 1 33 33 (139392)
I0426 10:28:40.522347 52583 net.cpp:157] Top shape: 128 1 33 33 (139392)
I0426 10:28:40.522353 52583 net.cpp:165] Memory required for data: 1898496
I0426 10:28:40.522359 52583 layer_factory.hpp:76] Creating layer fc1
I0426 10:28:40.522375 52583 net.cpp:106] Creating Layer fc1
I0426 10:28:40.522383 52583 net.cpp:454] fc1 <- data_data_0_split_0
I0426 10:28:40.522393 52583 net.cpp:411] fc1 -> fc1
I0426 10:28:40.532361 52583 net.cpp:150] Setting up fc1
I0426 10:28:40.532385 52583 net.cpp:157] Top shape: 128 512 (65536)
I0426 10:28:40.532392 52583 net.cpp:165] Memory required for data: 2160640
I0426 10:28:40.532418 52583 layer_factory.hpp:76] Creating layer relu1
I0426 10:28:40.532436 52583 net.cpp:106] Creating Layer relu1
I0426 10:28:40.532444 52583 net.cpp:454] relu1 <- fc1
I0426 10:28:40.532452 52583 net.cpp:397] relu1 -> fc1 (in-place)
I0426 10:28:40.664742 52583 net.cpp:150] Setting up relu1
I0426 10:28:40.664785 52583 net.cpp:157] Top shape: 128 512 (65536)
I0426 10:28:40.664790 52583 net.cpp:165] Memory required for data: 2422784
I0426 10:28:40.664798 52583 layer_factory.hpp:76] Creating layer fc2
I0426 10:28:40.664819 52583 net.cpp:106] Creating Layer fc2
I0426 10:28:40.664825 52583 net.cpp:454] fc2 <- fc1
I0426 10:28:40.664835 52583 net.cpp:411] fc2 -> fc2
I0426 10:28:40.666466 52583 net.cpp:150] Setting up fc2
I0426 10:28:40.666481 52583 net.cpp:157] Top shape: 128 256 (32768)
I0426 10:28:40.666486 52583 net.cpp:165] Memory required for data: 2553856
I0426 10:28:40.666497 52583 layer_factory.hpp:76] Creating layer relu2
I0426 10:28:40.666507 52583 net.cpp:106] Creating Layer relu2
I0426 10:28:40.666512 52583 net.cpp:454] relu2 <- fc2
I0426 10:28:40.666517 52583 net.cpp:397] relu2 -> fc2 (in-place)
I0426 10:28:40.666646 52583 net.cpp:150] Setting up relu2
I0426 10:28:40.666658 52583 net.cpp:157] Top shape: 128 256 (32768)
I0426 10:28:40.666662 52583 net.cpp:165] Memory required for data: 2684928
I0426 10:28:40.666666 52583 layer_factory.hpp:76] Creating layer fc3
I0426 10:28:40.666676 52583 net.cpp:106] Creating Layer fc3
I0426 10:28:40.666679 52583 net.cpp:454] fc3 <- fc2
I0426 10:28:40.666685 52583 net.cpp:411] fc3 -> fc3
I0426 10:28:40.667836 52583 net.cpp:150] Setting up fc3
I0426 10:28:40.667847 52583 net.cpp:157] Top shape: 128 512 (65536)
I0426 10:28:40.667851 52583 net.cpp:165] Memory required for data: 2947072
I0426 10:28:40.667860 52583 layer_factory.hpp:76] Creating layer relu2
I0426 10:28:40.667866 52583 net.cpp:106] Creating Layer relu2
I0426 10:28:40.667870 52583 net.cpp:454] relu2 <- fc3
I0426 10:28:40.667876 52583 net.cpp:397] relu2 -> fc3 (in-place)
I0426 10:28:40.668109 52583 net.cpp:150] Setting up relu2
I0426 10:28:40.668125 52583 net.cpp:157] Top shape: 128 512 (65536)
I0426 10:28:40.668130 52583 net.cpp:165] Memory required for data: 3209216
I0426 10:28:40.668134 52583 layer_factory.hpp:76] Creating layer fc4
I0426 10:28:40.668144 52583 net.cpp:106] Creating Layer fc4
I0426 10:28:40.668148 52583 net.cpp:454] fc4 <- fc3
I0426 10:28:40.668155 52583 net.cpp:411] fc4 -> fc4
I0426 10:28:40.751651 52583 net.cpp:150] Setting up fc4
I0426 10:28:40.751689 52583 net.cpp:157] Top shape: 128 16384 (2097152)
I0426 10:28:40.751694 52583 net.cpp:165] Memory required for data: 11597824
I0426 10:28:40.751708 52583 layer_factory.hpp:76] Creating layer reshape
I0426 10:28:40.751724 52583 net.cpp:106] Creating Layer reshape
I0426 10:28:40.751729 52583 net.cpp:454] reshape <- data_data_0_split_1
I0426 10:28:40.751741 52583 net.cpp:411] reshape -> output
I0426 10:28:40.751783 52583 net.cpp:150] Setting up reshape
I0426 10:28:40.751793 52583 net.cpp:157] Top shape: 128 1 33 33 (139392)
I0426 10:28:40.751798 52583 net.cpp:165] Memory required for data: 12155392
I0426 10:28:40.751802 52583 layer_factory.hpp:76] Creating layer deconv
I0426 10:28:40.751816 52583 net.cpp:106] Creating Layer deconv
I0426 10:28:40.751824 52583 net.cpp:454] deconv <- output
I0426 10:28:40.751832 52583 net.cpp:411] deconv -> deconv
F0426 10:28:40.752082 52583 filler.hpp:249] Check failed: blob->num_axes() == 4 (1 vs. 4) Blob must be 4 dim.


The dimensions are apparently correct, aren't they?

Jan

Apr 27, 2016, 9:03:42 AM
to Caffe Users
Oh, my bad, I thought the fc layer feeds into the deconv layer, but apparently it doesn't. No, in this case you don't need a reshape layer. Look at

I0426 10:28:40.522209 52583 net.cpp:106] Creating Layer data_data_0_split
I0426 10:28:40.522220 52583 net.cpp:454] data_data_0_split <- data
I0426 10:28:40.522240 52583 net.cpp:411] data_data_0_split -> data_data_0_split_0
I0426 10:28:40.522266 52583 net.cpp:411] data_data_0_split -> data_data_0_split_1
I0426 10:28:40.522325 52583 net.cpp:150] Setting up data_data_0_split
I0426 10:28:40.522339 52583 net.cpp:157] Top shape: 128 1 33 33 (139392)
I0426 10:28:40.522347 52583 net.cpp:157] Top shape: 128 1 33 33 (139392)
 
So data_data_0_split_1, which feeds into the deconv layer, is 4D and looks completely ok. So the only thing I can imagine to be wrong is something in the config. Talking about this would be easier if you posted your network config.

Jan

Antonio Paes

Apr 27, 2016, 5:23:03 PM
to Caffe Users
Hi Jan, I finally got past the 4D blob problem, but I still have some issues when I run it. I'll post my architecture:

layer {
  name: "data"
  type: "HDF5Data"
  top: "data"
  top: "label"
  hdf5_data_param {
    source: "/home/antonio/face_hallucination/architectures/train.txt"
    batch_size: 5
  }
  include : { phase: TRAIN }
}
layer {
  name: "data"
  type: "HDF5Data"
  top: "data"
  top: "label"
  hdf5_data_param {
    source: "/home/antonio/face_hallucination/architectures/test.txt"
    batch_size: 5
  }
  include : { phase: TEST }
}
layer {
  name: "fc1"
  type: "InnerProduct"
  bottom: "data"
  top: "fc1"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  param {
    lr_mult: 2
    decay_mult: 0
  }
  inner_product_param {
    num_output: 512
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 0
    }
  }
}
layer {
  name: "relu1"
  type: "ReLU"
  bottom: "fc1"
  top: "fc1"
}
layer {
  name: "fc2"
  type: "InnerProduct"
  bottom: "fc1"
  top: "fc2"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  param {
    lr_mult: 2
    decay_mult: 0
  }
  inner_product_param {
    num_output: 256
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 0
    }
  }
}
layer {
  name: "relu2"
  type: "ReLU"
  bottom: "fc2"
  top: "fc2"
}
layer {
  name: "fc3"
  type: "InnerProduct"
  bottom: "fc2"
  top: "fc3"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  param {
    lr_mult: 2
    decay_mult: 0
  }
  inner_product_param {
    num_output: 512
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 0
    }
  }
}
layer {
  name: "relu2"
  type: "ReLU"
  bottom: "fc3"
  top: "fc3"
}
layer {
  name: "fc4"
  type: "InnerProduct"
  bottom: "fc3"
  top: "fc4"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  param {
    lr_mult: 2
    decay_mult: 0
  }
  inner_product_param {
    num_output: 16384
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 0
    }
  }
}
layer {
  name: "deconv"
  type: "Deconvolution"
  bottom: "data"
  top: "deconv"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  param {
    lr_mult: 2
    decay_mult: 0
  }
  convolution_param {
    num_output: 256
    pad: 1
    kernel_size: 8
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 0
    }
  }
}
layer {
  type: "Flatten"
  bottom: "deconv"
  top: "dec_flatten"
  name: "dec_flatten"
}
layer {
  name: "concat"
  bottom: "fc4"
  bottom: "dec_flatten"
  top: "concat"
  type: "Concat"
  concat_param {
    axis: 1
  }
}

layer {
  name: "reshape"
  type: "Reshape"
  bottom: "concat"
  top: "output"
  reshape_param {
    shape {
      dim: -1  # -1: infer this dimension from the element count
      dim: 1
      dim: 111
      dim: 111
    }
  }
}
layer {
  name: "conv1"
  type: "Convolution"
  bottom: "output"
  top: "conv1"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  param {
    lr_mult: 2
    decay_mult: 0
  }
  convolution_param {
    num_output: 16
    kernel_size: 5
    stride: 1
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 0
    }
  }
}
layer {
  name: "relu3"
  type: "ReLU"
  bottom: "conv1"
  top: "conv1"
}
layer {
  name: "conv2"
  type: "Convolution"
  bottom: "conv1"
  top: "conv2"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  param {
    lr_mult: 2
    decay_mult: 0
  }
  convolution_param {
    num_output: 64
    kernel_size: 7
    stride: 1
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 0
    }
  }
}
layer {
  name: "relu4"
  type: "ReLU"
  bottom: "conv2"
  top: "conv2"
}
layer {
  name: "conv3"
  type: "Convolution"
  bottom: "conv2"
  top: "conv3"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  param {
    lr_mult: 2
    decay_mult: 0
  }
  convolution_param {
    num_output: 16
    kernel_size: 5
    stride: 1
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 0
    }
  }
}
layer {
  name: "loss"
  type: "EuclideanLoss"
  bottom: "conv3"
  bottom: "label"
  top: "loss"
}
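
Once the net constructs, the blob shapes can be inspected with pycaffe; a minimal sketch, assuming the config above is saved as train_val.prototxt:

import caffe

# Build the train-phase net and print every blob's shape; this surfaces
# dimension mismatches more readably than the glog dump.
net = caffe.Net('train_val.prototxt', caffe.TRAIN)
for name, blob in net.blobs.items():
    print(name, blob.data.shape)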

Antonio Paes

Apr 27, 2016, 7:47:11 PM
to Caffe Users
And this is my current problem: for some reason, after the reshape, the number of data samples no longer matches the number of labels:

I0427 13:42:22.268987 127300 layer_factory.hpp:76] Creating layer data
I0427 13:42:22.269012 127300 net.cpp:106] Creating Layer data
I0427 13:42:22.269019 127300 net.cpp:411] data -> data
I0427 13:42:22.269040 127300 net.cpp:411] data -> label
I0427 13:42:22.269054 127300 hdf5_data_layer.cpp:79] Loading list of HDF5 filenames from: /home/antonio/face_hallucination/architectures/train.txt
I0427 13:42:22.269076 127300 hdf5_data_layer.cpp:93] Number of HDF5 files: 1
I0427 13:42:22.270187 127300 hdf5.cpp:32] Datatype class: H5T_FLOAT
I0427 13:42:22.413625 127300 net.cpp:150] Setting up data
I0427 13:42:22.413676 127300 net.cpp:157] Top shape: 5 1 32 32 (5120)
I0427 13:42:22.413683 127300 net.cpp:157] Top shape: 5 1 32 32 (5120)
I0427 13:42:22.413687 127300 net.cpp:165] Memory required for data: 40960
I0427 13:42:22.413697 127300 layer_factory.hpp:76] Creating layer data_data_0_split
I0427 13:42:22.413712 127300 net.cpp:106] Creating Layer data_data_0_split
I0427 13:42:22.413719 127300 net.cpp:454] data_data_0_split <- data
I0427 13:42:22.413733 127300 net.cpp:411] data_data_0_split -> data_data_0_split_0
I0427 13:42:22.413744 127300 net.cpp:411] data_data_0_split -> data_data_0_split_1
I0427 13:42:22.413787 127300 net.cpp:150] Setting up data_data_0_split
I0427 13:42:22.413795 127300 net.cpp:157] Top shape: 5 1 32 32 (5120)
I0427 13:42:22.413800 127300 net.cpp:157] Top shape: 5 1 32 32 (5120)
I0427 13:42:22.413805 127300 net.cpp:165] Memory required for data: 81920
I0427 13:42:22.413808 127300 layer_factory.hpp:76] Creating layer fc1
I0427 13:42:22.413820 127300 net.cpp:106] Creating Layer fc1
I0427 13:42:22.413823 127300 net.cpp:454] fc1 <- data_data_0_split_0
I0427 13:42:22.413830 127300 net.cpp:411] fc1 -> fc1
I0427 13:42:22.420203 127300 net.cpp:150] Setting up fc1
I0427 13:42:22.420230 127300 net.cpp:157] Top shape: 5 512 (2560)
I0427 13:42:22.420235 127300 net.cpp:165] Memory required for data: 92160
I0427 13:42:22.420253 127300 layer_factory.hpp:76] Creating layer relu1
I0427 13:42:22.420261 127300 net.cpp:106] Creating Layer relu1
I0427 13:42:22.420266 127300 net.cpp:454] relu1 <- fc1
I0427 13:42:22.420271 127300 net.cpp:397] relu1 -> fc1 (in-place)
I0427 13:42:22.576689 127300 net.cpp:150] Setting up relu1
I0427 13:42:22.576740 127300 net.cpp:157] Top shape: 5 512 (2560)
I0427 13:42:22.576745 127300 net.cpp:165] Memory required for data: 102400
I0427 13:42:22.576755 127300 layer_factory.hpp:76] Creating layer fc2
I0427 13:42:22.576772 127300 net.cpp:106] Creating Layer fc2
I0427 13:42:22.576779 127300 net.cpp:454] fc2 <- fc1
I0427 13:42:22.576788 127300 net.cpp:411] fc2 -> fc2
I0427 13:42:22.578515 127300 net.cpp:150] Setting up fc2
I0427 13:42:22.578541 127300 net.cpp:157] Top shape: 5 256 (1280)
I0427 13:42:22.578546 127300 net.cpp:165] Memory required for data: 107520
I0427 13:39:41.444768 126749 net.cpp:106] Creating Layer relu2
I0427 13:39:41.444773 126749 net.cpp:454] relu2 <- fc2
I0427 13:39:41.444778 126749 net.cpp:397] relu2 -> fc2 (in-place)
I0427 13:39:41.444942 126749 net.cpp:150] Setting up relu2
I0427 13:39:41.444953 126749 net.cpp:157] Top shape: 5 256 (1280)
I0427 13:39:41.444957 126749 net.cpp:165] Memory required for data: 112640
I0427 13:39:41.444962 126749 layer_factory.hpp:76] Creating layer fc3
I0427 13:39:41.444968 126749 net.cpp:106] Creating Layer fc3
I0427 13:39:41.444991 126749 net.cpp:454] fc3 <- fc2
I0427 13:39:41.444998 126749 net.cpp:411] fc3 -> fc3
I0427 13:39:41.446216 126749 net.cpp:150] Setting up fc3
I0427 13:39:41.446226 126749 net.cpp:157] Top shape: 5 512 (2560)
I0427 13:39:41.446230 126749 net.cpp:165] Memory required for data: 122880
I0427 13:39:41.446238 126749 layer_factory.hpp:76] Creating layer relu2
I0427 13:39:41.446244 126749 net.cpp:106] Creating Layer relu2
I0427 13:39:41.446247 126749 net.cpp:454] relu2 <- fc3
I0427 13:39:41.446251 126749 net.cpp:397] relu2 -> fc3 (in-place)
I0427 13:39:41.446521 126749 net.cpp:150] Setting up relu2
I0427 13:39:41.446535 126749 net.cpp:157] Top shape: 5 512 (2560)
I0427 13:39:41.446539 126749 net.cpp:165] Memory required for data: 133120
I0427 13:39:41.446543 126749 layer_factory.hpp:76] Creating layer fc4
I0427 13:39:41.446552 126749 net.cpp:106] Creating Layer fc4
I0427 13:39:41.446555 126749 net.cpp:454] fc4 <- fc3
I0427 13:39:41.446562 126749 net.cpp:411] fc4 -> fc4
I0427 13:39:41.561897 126749 net.cpp:150] Setting up fc4
I0427 13:39:41.561934 126749 net.cpp:157] Top shape: 5 21904 (109520)
I0427 13:39:41.561939 126749 net.cpp:165] Memory required for data: 571200
I0427 13:39:41.561950 126749 layer_factory.hpp:76] Creating layer deconv
I0427 13:39:41.561964 126749 net.cpp:106] Creating Layer deconv
I0427 13:39:41.561969 126749 net.cpp:454] deconv <- data_data_0_split_1
I0427 13:39:41.561990 126749 net.cpp:411] deconv -> deconv
I0427 13:39:41.563319 126749 net.cpp:150] Setting up deconv
I0427 13:39:41.563333 126749 net.cpp:157] Top shape: 5 256 37 37 (1752320)
I0427 13:39:41.563349 126749 net.cpp:165] Memory required for data: 7580480
I0427 13:39:41.563365 126749 layer_factory.hpp:76] Creating layer dec_flatten
I0427 13:39:41.563374 126749 net.cpp:106] Creating Layer dec_flatten
I0427 13:39:41.563379 126749 net.cpp:454] dec_flatten <- deconv
I0427 13:39:41.563383 126749 net.cpp:411] dec_flatten -> dec_flatten
I0427 13:39:41.563410 126749 net.cpp:150] Setting up dec_flatten
I0427 13:39:41.563417 126749 net.cpp:157] Top shape: 5 350464 (1752320)
I0427 13:39:41.563421 126749 net.cpp:165] Memory required for data: 14589760
I0427 13:39:41.563426 126749 layer_factory.hpp:76] Creating layer concat
I0427 13:39:41.563432 126749 net.cpp:106] Creating Layer concat
I0427 13:39:41.563436 126749 net.cpp:454] concat <- fc4
I0427 13:39:41.563441 126749 net.cpp:454] concat <- dec_flatten
I0427 13:39:41.563446 126749 net.cpp:411] concat -> concat
I0427 13:39:41.563469 126749 net.cpp:150] Setting up concat
I0427 13:39:41.563477 126749 net.cpp:157] Top shape: 5 372368 (1861840)
I0427 13:39:41.563482 126749 net.cpp:165] Memory required for data: 22037120
I0427 13:39:41.563485 126749 layer_factory.hpp:76] Creating layer reshape
I0427 13:39:41.563493 126749 net.cpp:106] Creating Layer reshape
I0427 13:39:41.563498 126749 net.cpp:454] reshape <- concat
I0427 13:39:41.563503 126749 net.cpp:411] reshape -> output
I0427 13:39:41.563529 126749 net.cpp:150] Setting up reshape
I0427 13:39:41.563535 126749 net.cpp:157] Top shape: 85 1 148 148 (1861840)
I0427 13:39:41.563539 126749 net.cpp:165] Memory required for data: 29484480
I0427 13:39:41.563542 126749 layer_factory.hpp:76] Creating layer conv1
I0427 13:39:41.563551 126749 net.cpp:106] Creating Layer conv1
I0427 13:39:41.563556 126749 net.cpp:454] conv1 <- output
I0427 13:39:41.563563 126749 net.cpp:411] conv1 -> conv1
I0427 13:39:41.564617 126749 net.cpp:150] Setting up conv1
I0427 13:39:41.564645 126749 net.cpp:157] Top shape: 85 16 144 144 (28200960)
I0427 13:39:41.564649 126749 net.cpp:165] Memory required for data: 142288320
I0427 13:39:41.564656 126749 layer_factory.hpp:76] Creating layer relu3
I0427 13:39:41.564663 126749 net.cpp:106] Creating Layer relu3
I0427 13:39:41.564668 126749 net.cpp:454] relu3 <- conv1
I0427 13:39:41.564676 126749 net.cpp:397] relu3 -> conv1 (in-place)
I0427 13:39:41.564862 126749 net.cpp:150] Setting up relu3
I0427 13:39:41.564884 126749 net.cpp:157] Top shape: 85 16 144 144 (28200960)
I0427 13:39:41.564888 126749 net.cpp:165] Memory required for data: 255092160
I0427 13:39:41.564913 126749 layer_factory.hpp:76] Creating layer conv2
I0427 13:39:41.564923 126749 net.cpp:106] Creating Layer conv2
I0427 13:39:41.564937 126749 net.cpp:454] conv2 <- conv1
I0427 13:39:41.564944 126749 net.cpp:411] conv2 -> conv2
I0427 13:39:41.566232 126749 net.cpp:150] Setting up conv2
I0427 13:39:41.566259 126749 net.cpp:157] Top shape: 85 64 138 138 (103599360)
I0427 13:39:41.566264 126749 net.cpp:165] Memory required for data: 669489600
I0427 13:39:41.566272 126749 layer_factory.hpp:76] Creating layer relu4
I0427 13:39:41.566280 126749 net.cpp:106] Creating Layer relu4
I0427 13:39:41.566285 126749 net.cpp:454] relu4 <- conv2
I0427 13:39:41.566290 126749 net.cpp:397] relu4 -> conv2 (in-place)
I0427 13:39:41.566498 126749 net.cpp:150] Setting up relu4
I0427 13:39:41.566509 126749 net.cpp:157] Top shape: 85 64 138 138 (103599360)
I0427 13:39:41.566525 126749 net.cpp:165] Memory required for data: 1083887040
I0427 13:39:41.566529 126749 layer_factory.hpp:76] Creating layer conv3
I0427 13:39:41.566550 126749 net.cpp:106] Creating Layer conv3
I0427 13:39:41.566555 126749 net.cpp:454] conv3 <- conv2
I0427 13:39:41.566562 126749 net.cpp:411] conv3 -> conv3
I0427 13:39:41.567746 126749 net.cpp:150] Setting up conv3
I0427 13:39:41.567773 126749 net.cpp:157] Top shape: 85 16 134 134 (24420160)
I0427 13:39:41.567778 126749 net.cpp:165] Memory required for data: 1181567680
I0427 13:39:41.567785 126749 layer_factory.hpp:76] Creating layer loss
I0427 13:39:41.567792 126749 net.cpp:106] Creating Layer loss
I0427 13:39:41.567797 126749 net.cpp:454] loss <- conv3
I0427 13:39:41.567801 126749 net.cpp:454] loss <- label
I0427 13:39:41.567807 126749 net.cpp:411] loss -> loss
F0427 13:39:41.567827 126749 loss_layer.cpp:19] Check failed: bottom[0]->num() == bottom[1]->num() (85 vs. 5) The data and label should have the same number.
*** Check failure stack trace: ***
    @     0x7f4aee742f6d  google::LogMessage::Fail()
    @     0x7f4aee744f23  google::LogMessage::SendToLog()
    @     0x7f4aee742ae9  google::LogMessage::Flush()
    @     0x7f4aee74594e  google::LogMessageFatal::~LogMessageFatal()
    @     0x7f4aeee055b6  caffe::LossLayer<>::Reshape()
    @     0x7f4aeee3cff3  caffe::EuclideanLossLayer<>::Reshape()
    @     0x7f4aeed9dd4a  caffe::Net<>::Init()
    @     0x7f4aeed9f6d8  caffe::Net<>::Net()
    @     0x7f4aeed78012  caffe::Solver<>::InitTrainNet()
    @     0x7f4aeed793b7  caffe::Solver<>::Init()
    @     0x7f4aeed79749  caffe::Solver<>::Solver()
    @     0x7f4aeed5f243  caffe::Creator_SGDSolver<>()
    @           0x40acda  (unknown)
    @           0x407199  (unknown)
    @     0x7f4aed3d3610  __libc_start_main
    @           0x4079e9  (unknown)
Aborted (core dumped)

real    0m2.005s
user    0m0.600s
sys     0m3.500s

Antonio Paes

Apr 27, 2016, 8:15:44 PM
to Caffe Users
I think that when I do the reshape, something happens with the batch size; for some reason 5 changes to 85... haha

Jan

Apr 28, 2016, 7:50:09 AM
to Caffe Users
That's exactly what happens. You shouldn't do that. As I said in my last post, remove the reshaping. See if other errors remain.
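
The arithmetic behind the jump from 5 to 85 can be checked in a couple of lines of Python (numbers taken from the log above):

# Reshape cannot change the total element count, only regroup it.
total = 5 * 372368           # concat top: batch 5, 372368 values each
per_sample = 1 * 148 * 148   # requested C x H x W = 21904
print(total // per_sample)   # 85: what the -1 (inferred) axis became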

Jan

Antonio Paes

Apr 28, 2016, 3:56:15 PM
to Caffe Users
I cannot remove the reshape layer, because I concatenate my Deconvolution layer with the FC layer, and for concatenation the blobs must have the same dimensions. I tried using a Flatten layer after the deconvolution layer, which works for the concatenation, but after that I need to reshape again, because the output of the concatenation is processed by convolutional layers. Do you have any idea how I can reshape without changing the batch size?

Jan

Apr 29, 2016, 7:34:41 AM
to Caffe Users
OK, if you want to do it as in the paper, you need to put the reshape layer after the fc layer, and then concatenate (along the channel axis, which is the default anyway). Your reshape layer definition looks strange. Have you read the documentation in https://github.com/BVLC/caffe/blob/master/src/caffe/proto/caffe.proto#L971-L1033? It is quite extensive. So you want to do something like the following (after the fc layer):

layer {
  type: "Reshape"
  bottom: "<put name here>"
  top: "<put other name here>"
  reshape_param {
    shape {
      dim: 0    # 0: copy this axis (the batch size) from the bottom blob
      dim: -1   # -1: infer this axis from the remaining element count
      dim: <h>  # height of the deconvolution output
      dim: <w>  # width of the deconvolution output
    }
  }
}

Replace <h> and <w> with the same height and width that the result of the deconvolution operation has. Then the result of that layer can be concatenated with the deconvolution's result, as only the channel dimension differs.
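
The deconvolution output size follows out = stride * (in - 1) + kernel - 2 * pad; a small helper to compute it, with the values from the config earlier in the thread:

# Output size of a Deconvolution layer along one spatial axis.
def deconv_out(in_size, kernel, stride=1, pad=0):
    return stride * (in_size - 1) + kernel - 2 * pad

# kernel_size 8, pad 1, default stride 1, on 32x32 input data:
print(deconv_out(32, kernel=8, pad=1))   # 37, matching 'Top shape: 5 256 37 37'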

Jan

Antonio Paes

May 1, 2016, 3:16:15 PM
to Caffe Users
Hi Jan, I used some of the ideas you suggested; the concatenation makes more sense now and works. But when the training constructs the loss layer I get this error:

I0501 08:54:43.176820 14655 net.cpp:106] Creating Layer conv3
I0501 08:54:43.176826 14655 net.cpp:454] conv3 <- conv2
I0501 08:54:43.176831 14655 net.cpp:411] conv3 -> conv3
I0501 08:54:43.177955 14655 net.cpp:150] Setting up conv3
I0501 08:54:43.177970 14655 net.cpp:157] Top shape: 5 16 114 114 (1039680)
I0501 08:54:43.177975 14655 net.cpp:165] Memory required for data: 51416320
I0501 08:54:43.177983 14655 layer_factory.hpp:76] Creating layer loss
I0501 08:54:43.177991 14655 net.cpp:106] Creating Layer loss
I0501 08:54:43.177995 14655 net.cpp:454] loss <- conv3
I0501 08:54:43.178000 14655 net.cpp:454] loss <- label
I0501 08:54:43.178006 14655 net.cpp:411] loss -> loss
F0501 08:54:43.178031 14655 euclidean_loss_layer.cpp:12] Check failed: bottom[0]->count(1) == bottom[1]->count(1) (207936 vs. 1024) Input
*** Check failure stack trace: ***

I changed the bottoms of the Euclidean loss layer, setting both to conv3, and then it runs, but that is wrong, right?

And once more, thanks for your help.

Jan

May 2, 2016, 12:29:23 PM
to Caffe Users
Well, it seems your "label" has the wrong shape. When I look at your second post here, I see that your "label" dataset has the same shape as the "data" dataset, which does not really make sense here, since the whole purpose of this network is to reconstruct a high-resolution version of an input image. So to train a network to do this, the label for a data point is a higher-resolution version of the input image. E.g. if your data input image is 32x32 (which it seems to be), then your output (and hence also your label images) could be e.g. 114x114 (I got this number from your output: the conv3 blob seems to have that shape, so the label needs to have that shape as well to use a Euclidean layer with both of them as bottoms). Also the number of channels must match: if your label image has three channels (probably RGB), your conv3 also needs to have three channels (just set the num_output of the preceding conv layer to 3).
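
The 114 falls out of the conv arithmetic; a sketch, assuming the reshape feeds conv1 with 128x128 (each unpadded, stride-1 conv shrinks the spatial size by kernel - 1):

# Standard conv output size: (in + 2*pad - kernel) // stride + 1.
def conv_out(in_size, kernel, stride=1, pad=0):
    return (in_size + 2 * pad - kernel) // stride + 1

size = 128              # inferred spatial size entering conv1
for k in (5, 7, 5):     # conv1, conv2, conv3 kernel sizes from the config
    size = conv_out(size, k)
print(size)             # 114, so the labels should be 114x114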

Jan

Antonio Paes

May 3, 2016, 2:59:01 PM
to Caffe Users

Hi Jan, regarding my images: they are RGB, but before training I convert them to grayscale and use only the luminance channel, which is the usual practice in the reconstruction literature. About the label dimensions, you are saying that my labels must have the same dimensions as the network output?

In this case, for a 32x32 image my output is 114x114, so the label must have shape [1, 1, 114, 114]?

Jan

May 4, 2016, 9:48:38 AM
to Caffe Users
The first dimension is the batch size, which should be equal for _all_ blobs in your network; usually you set it implicitly by defining the batch size in your data layers.

If you want to reconstruct grayscale images, then yes, your labels need to have shape [1, 114, 114], where the 1 stands for the single (luminance) channel of the 114-high, 114-wide image. You could also try to reconstruct color, but that is probably too hard for the network.

Jan

Antonio Paes

May 4, 2016, 5:37:04 PM
to Caffe Users

Hi Jan, I did that and the training finally seems to be OK. But do the dimensions of the labels have some influence on the learning of the network?

Antonio Paes

May 6, 2016, 6:35:46 PM
to Caffe Users
Hey Jan, I got it, thanks for your help. I need to implement a new loss function described in the paper, but that is a new issue.

Thank you very much.


On Wednesday, April 20, 2016 at 11:21:24 AM UTC-3, Antonio Paes wrote:

Anne

Jun 15, 2016, 9:07:45 AM
to Caffe Users
Hi Antonio,

I saw that you finally succeeded in training your model, but how did you manage to rescale your label images? Is it defined in your label layer?
Thank you in advance for sharing your solution :)

Antonio Paes

Jun 15, 2016, 2:34:10 PM
to Caffe Users
Hi Anne,

I generate the labels at the right scale when I create the .h5 file in MATLAB, so I keep their size fixed; but for this I need to know in advance what the output scale of my network will be.
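
A Python equivalent is straightforward; a minimal sketch with OpenCV, where the sizes match this thread but the function and names are illustrative:

import cv2
import numpy as np

# Make one (low-res input, high-res label) pair at fixed sizes, as
# described above; stack the pairs into N x 1 x H x W arrays and write
# them with h5py as in the earlier sketch.
def make_pair(img_gray, in_size=32, out_size=114):
    lo = cv2.resize(img_gray, (in_size, in_size), interpolation=cv2.INTER_CUBIC)
    hi = cv2.resize(img_gray, (out_size, out_size), interpolation=cv2.INTER_CUBIC)
    return lo.astype(np.float32), hi.astype(np.float32)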