Error: caffe train on CaffeNet for mammography classification (very long log)


신동화

Jan 18, 2016, 1:28:52 PM
to Caffe Users
I have started using Caffe for mammography classification. I am going step by step, but I don't understand it very well yet.

First, I collected mammography data and converted it to TIFF format so it can be imported into Caffe. Then I created the labels: normal (1), cancer (2), benign (3).

Next, I ran convert_imageset:

GLOG_logtostderr=1 ./build/tools/convert_imageset --resize_height=264 --resize_width=264 ./mydata/ ./result.txt ./train_lmdb

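For reference, result.txt is the listing file that convert_imageset reads: one image per line as "relative_path label", with paths relative to the root folder argument (./mydata/ here). A hypothetical excerpt, with invented filenames:

normal/img_0001.tif 0
cancer/img_0002.tif 1
benign/img_0003.tif 2

Note that Caffe's loss layers expect integer class labels starting at 0, so three classes should be labeled 0, 1, 2 rather than the 1, 2, 3 above.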

I also ran compute_image_mean:

./build/tools/compute_image_mean ./train_lmdb ./mean.binaryproto


Now it's training time!

./build/tools/caffe train --solver=models/bvlc_reference_caffenet/solver.prototxt


But unfortunately, it fails.

What should I do? I don't understand this log.

(I'm sorry it's so long!)

<caffe train output>

I0119 03:24:17.752466 31232 caffe.cpp:184] Using GPUs 0

I0119 03:24:18.006865 31232 solver.cpp:48] Initializing solver from parameters: 

test_iter: 1000

test_interval: 1000

base_lr: 0.01

display: 20

max_iter: 450000

lr_policy: "step"

gamma: 0.1

momentum: 0.9

weight_decay: 0.0005

stepsize: 100000

snapshot: 10000

snapshot_prefix: "models/bvlc_reference_caffenet/caffenet_train"

solver_mode: GPU

device_id: 0

net: "models/bvlc_reference_caffenet/train_val.prototxt"

I0119 03:24:18.007097 31232 solver.cpp:91] Creating training net from net file: models/bvlc_reference_caffenet/train_val.prototxt

I0119 03:24:18.007602 31232 net.cpp:322] The NetState phase (0) differed from the phase (1) specified by a rule in layer data

I0119 03:24:18.007637 31232 net.cpp:322] The NetState phase (0) differed from the phase (1) specified by a rule in layer accuracy

I0119 03:24:18.007902 31232 net.cpp:49] Initializing net from parameters: 

name: "CaffeNet"

state {

  phase: TRAIN

}

layer {

  name: "data"

  type: "Data"

  top: "data"

  top: "label"

  include {

    phase: TRAIN

  }

  transform_param {

    mirror: true

    crop_size: 227

    mean_file: "data/ilsvrc12/imagenet_mean.binaryproto"

  }

  data_param {

    source: "examples/imagenet/ilsvrc12_train_lmdb"

    batch_size: 256

    backend: LMDB

  }

}

layer {

  name: "conv1"

  type: "Convolution"

  bottom: "data"

  top: "conv1"

  param {

    lr_mult: 1

    decay_mult: 1

  }

  param {

    lr_mult: 2

    decay_mult: 0

  }

  convolution_param {

    num_output: 96

    kernel_size: 11

    stride: 4

    weight_filler {

      type: "gaussian"

      std: 0.01

    }

    bias_filler {

      type: "constant"

      value: 0

    }

  }

}

layer {

  name: "relu1"

  type: "ReLU"

  bottom: "conv1"

  top: "conv1"

}

layer {

  name: "pool1"

  type: "Pooling"

  bottom: "conv1"

  top: "pool1"

  pooling_param {

    pool: MAX

    kernel_size: 3

    stride: 2

  }

}

layer {

  name: "norm1"

  type: "LRN"

  bottom: "pool1"

  top: "norm1"

  lrn_param {

    local_size: 5

    alpha: 0.0001

    beta: 0.75

  }

}

layer {

  name: "conv2"

  type: "Convolution"

  bottom: "norm1"

  top: "conv2"

  param {

    lr_mult: 1

    decay_mult: 1

  }

  param {

    lr_mult: 2

    decay_mult: 0

  }

  convolution_param {

    num_output: 256

    pad: 2

    kernel_size: 5

    group: 2

    weight_filler {

      type: "gaussian"

      std: 0.01

    }

    bias_filler {

      type: "constant"

      value: 1

    }

  }

}

layer {

  name: "relu2"

  type: "ReLU"

  bottom: "conv2"

  top: "conv2"

}

layer {

  name: "pool2"

  type: "Pooling"

  bottom: "conv2"

  top: "pool2"

  pooling_param {

    pool: MAX

    kernel_size: 3

    stride: 2

  }

}

layer {

  name: "norm2"

  type: "LRN"

  bottom: "pool2"

  top: "norm2"

  lrn_param {

    local_size: 5

    alpha: 0.0001

    beta: 0.75

  }

}

layer {

  name: "conv3"

  type: "Convolution"

  bottom: "norm2"

  top: "conv3"

  param {

    lr_mult: 1

    decay_mult: 1

  }

  param {

    lr_mult: 2

    decay_mult: 0

  }

  convolution_param {

    num_output: 384

    pad: 1

    kernel_size: 3

    weight_filler {

      type: "gaussian"

      std: 0.01

    }

    bias_filler {

      type: "constant"

      value: 0

    }

  }

}

layer {

  name: "relu3"

  type: "ReLU"

  bottom: "conv3"

  top: "conv3"

}

layer {

  name: "conv4"

  type: "Convolution"

  bottom: "conv3"

  top: "conv4"

  param {

    lr_mult: 1

    decay_mult: 1

  }

  param {

    lr_mult: 2

    decay_mult: 0

  }

  convolution_param {

    num_output: 384

    pad: 1

    kernel_size: 3

    group: 2

    weight_filler {

      type: "gaussian"

      std: 0.01

    }

    bias_filler {

      type: "constant"

      value: 1

    }

  }

}

layer {

  name: "relu4"

  type: "ReLU"

  bottom: "conv4"

  top: "conv4"

}

layer {

  name: "conv5"

  type: "Convolution"

  bottom: "conv4"

  top: "conv5"

  param {

    lr_mult: 1

    decay_mult: 1

  }

  param {

    lr_mult: 2

    decay_mult: 0

  }

  convolution_param {

    num_output: 256

    pad: 1

    kernel_size: 3

    group: 2

    weight_filler {

      type: "gaussian"

      std: 0.01

    }

    bias_filler {

      type: "constant"

      value: 1

    }

  }

}

layer {

  name: "relu5"

  type: "ReLU"

  bottom: "conv5"

  top: "conv5"

}

layer {

  name: "pool5"

  type: "Pooling"

  bottom: "conv5"

  top: "pool5"

  pooling_param {

    pool: MAX

    kernel_size: 3

    stride: 2

  }

}

layer {

  name: "fc6"

  type: "InnerProduct"

  bottom: "pool5"

  top: "fc6"

  param {

    lr_mult: 1

    decay_mult: 1

  }

  param {

    lr_mult: 2

    decay_mult: 0

  }

  inner_product_param {

    num_output: 4096

    weight_filler {

      type: "gaussian"

      std: 0.005

    }

    bias_filler {

      type: "constant"

      value: 1

    }

  }

}

layer {

  name: "relu6"

  type: "ReLU"

  bottom: "fc6"

  top: "fc6"

}

layer {

  name: "drop6"

  type: "Dropout"

  bottom: "fc6"

  top: "fc6"

  dropout_param {

    dropout_ratio: 0.5

  }

}

layer {

  name: "fc7"

  type: "InnerProduct"

  bottom: "fc6"

  top: "fc7"

  param {

    lr_mult: 1

    decay_mult: 1

  }

  param {

    lr_mult: 2

    decay_mult: 0

  }

  inner_product_param {

    num_output: 4096

    weight_filler {

      type: "gaussian"

      std: 0.005

    }

    bias_filler {

      type: "constant"

      value: 1

    }

  }

}

layer {

  name: "relu7"

  type: "ReLU"

  bottom: "fc7"

  top: "fc7"

}

layer {

  name: "drop7"

  type: "Dropout"

  bottom: "fc7"

  top: "fc7"

  dropout_param {

    dropout_ratio: 0.5

  }

}

layer {

  name: "fc8"

  type: "InnerProduct"

  bottom: "fc7"

  top: "fc8"

  param {

    lr_mult: 1

    decay_mult: 1

  }

  param {

    lr_mult: 2

    decay_mult: 0

  }

  inner_product_param {

    num_output: 1000

    weight_filler {

      type: "gaussian"

      std: 0.01

    }

    bias_filler {

      type: "constant"

      value: 0

    }

  }

}

layer {

  name: "loss"

  type: "SoftmaxWithLoss"

  bottom: "fc8"

  bottom: "label"

  top: "loss"

}

I0119 03:24:18.008065 31232 layer_factory.hpp:77] Creating layer data

I0119 03:24:18.008761 31232 net.cpp:106] Creating Layer data

I0119 03:24:18.008788 31232 net.cpp:411] data -> data

I0119 03:24:18.008841 31232 net.cpp:411] data -> label

I0119 03:24:18.008864 31232 data_transformer.cpp:25] Loading mean file from: data/ilsvrc12/imagenet_mean.binaryproto

I0119 03:24:18.010421 31235 db_lmdb.cpp:38] Opened lmdb examples/imagenet/ilsvrc12_train_lmdb

I0119 03:24:18.027631 31232 data_layer.cpp:41] output data size: 256,3,227,227

I0119 03:24:18.391597 31232 net.cpp:150] Setting up data

I0119 03:24:18.391686 31232 net.cpp:157] Top shape: 256 3 227 227 (39574272)

I0119 03:24:18.391698 31232 net.cpp:157] Top shape: 256 (256)

I0119 03:24:18.391705 31232 net.cpp:165] Memory required for data: 158298112

I0119 03:24:18.391729 31232 layer_factory.hpp:77] Creating layer conv1

I0119 03:24:18.391778 31232 net.cpp:106] Creating Layer conv1

I0119 03:24:18.391791 31232 net.cpp:454] conv1 <- data

I0119 03:24:18.391816 31232 net.cpp:411] conv1 -> conv1

I0119 03:24:18.598374 31232 net.cpp:150] Setting up conv1

I0119 03:24:18.598435 31232 net.cpp:157] Top shape: 256 96 55 55 (74342400)

I0119 03:24:18.598443 31232 net.cpp:165] Memory required for data: 455667712

I0119 03:24:18.598481 31232 layer_factory.hpp:77] Creating layer relu1

I0119 03:24:18.598502 31232 net.cpp:106] Creating Layer relu1

I0119 03:24:18.598511 31232 net.cpp:454] relu1 <- conv1

I0119 03:24:18.598522 31232 net.cpp:397] relu1 -> conv1 (in-place)

I0119 03:24:18.598687 31232 net.cpp:150] Setting up relu1

I0119 03:24:18.598703 31232 net.cpp:157] Top shape: 256 96 55 55 (74342400)

I0119 03:24:18.598711 31232 net.cpp:165] Memory required for data: 753037312

I0119 03:24:18.598716 31232 layer_factory.hpp:77] Creating layer pool1

I0119 03:24:18.598728 31232 net.cpp:106] Creating Layer pool1

I0119 03:24:18.598736 31232 net.cpp:454] pool1 <- conv1

I0119 03:24:18.598744 31232 net.cpp:411] pool1 -> pool1

I0119 03:24:18.599056 31232 net.cpp:150] Setting up pool1

I0119 03:24:18.599072 31232 net.cpp:157] Top shape: 256 96 27 27 (17915904)

I0119 03:24:18.599078 31232 net.cpp:165] Memory required for data: 824700928

I0119 03:24:18.599086 31232 layer_factory.hpp:77] Creating layer norm1

I0119 03:24:18.599104 31232 net.cpp:106] Creating Layer norm1

I0119 03:24:18.599112 31232 net.cpp:454] norm1 <- pool1

I0119 03:24:18.599133 31232 net.cpp:411] norm1 -> norm1

I0119 03:24:18.599319 31232 net.cpp:150] Setting up norm1

I0119 03:24:18.599334 31232 net.cpp:157] Top shape: 256 96 27 27 (17915904)

I0119 03:24:18.599341 31232 net.cpp:165] Memory required for data: 896364544

I0119 03:24:18.599347 31232 layer_factory.hpp:77] Creating layer conv2

I0119 03:24:18.599367 31232 net.cpp:106] Creating Layer conv2

I0119 03:24:18.599375 31232 net.cpp:454] conv2 <- norm1

I0119 03:24:18.599385 31232 net.cpp:411] conv2 -> conv2

I0119 03:24:18.605765 31232 net.cpp:150] Setting up conv2

I0119 03:24:18.605787 31232 net.cpp:157] Top shape: 256 256 27 27 (47775744)

I0119 03:24:18.605794 31232 net.cpp:165] Memory required for data: 1087467520

I0119 03:24:18.605809 31232 layer_factory.hpp:77] Creating layer relu2

I0119 03:24:18.605823 31232 net.cpp:106] Creating Layer relu2

I0119 03:24:18.605830 31232 net.cpp:454] relu2 <- conv2

I0119 03:24:18.605839 31232 net.cpp:397] relu2 -> conv2 (in-place)

I0119 03:24:18.605983 31232 net.cpp:150] Setting up relu2

I0119 03:24:18.605998 31232 net.cpp:157] Top shape: 256 256 27 27 (47775744)

I0119 03:24:18.606004 31232 net.cpp:165] Memory required for data: 1278570496

I0119 03:24:18.606011 31232 layer_factory.hpp:77] Creating layer pool2

I0119 03:24:18.606020 31232 net.cpp:106] Creating Layer pool2

I0119 03:24:18.606026 31232 net.cpp:454] pool2 <- conv2

I0119 03:24:18.606035 31232 net.cpp:411] pool2 -> pool2

I0119 03:24:18.606320 31232 net.cpp:150] Setting up pool2

I0119 03:24:18.606336 31232 net.cpp:157] Top shape: 256 256 13 13 (11075584)

I0119 03:24:18.606343 31232 net.cpp:165] Memory required for data: 1322872832

I0119 03:24:18.606349 31232 layer_factory.hpp:77] Creating layer norm2

I0119 03:24:18.606362 31232 net.cpp:106] Creating Layer norm2

I0119 03:24:18.606369 31232 net.cpp:454] norm2 <- pool2

I0119 03:24:18.606379 31232 net.cpp:411] norm2 -> norm2

I0119 03:24:18.606694 31232 net.cpp:150] Setting up norm2

I0119 03:24:18.606711 31232 net.cpp:157] Top shape: 256 256 13 13 (11075584)

I0119 03:24:18.606717 31232 net.cpp:165] Memory required for data: 1367175168

I0119 03:24:18.606724 31232 layer_factory.hpp:77] Creating layer conv3

I0119 03:24:18.606737 31232 net.cpp:106] Creating Layer conv3

I0119 03:24:18.606745 31232 net.cpp:454] conv3 <- norm2

I0119 03:24:18.606758 31232 net.cpp:411] conv3 -> conv3

I0119 03:24:18.619767 31232 net.cpp:150] Setting up conv3

I0119 03:24:18.619789 31232 net.cpp:157] Top shape: 256 384 13 13 (16613376)

I0119 03:24:18.619797 31232 net.cpp:165] Memory required for data: 1433628672

I0119 03:24:18.619810 31232 layer_factory.hpp:77] Creating layer relu3

I0119 03:24:18.619822 31232 net.cpp:106] Creating Layer relu3

I0119 03:24:18.619829 31232 net.cpp:454] relu3 <- conv3

I0119 03:24:18.619838 31232 net.cpp:397] relu3 -> conv3 (in-place)

I0119 03:24:18.620112 31232 net.cpp:150] Setting up relu3

I0119 03:24:18.620128 31232 net.cpp:157] Top shape: 256 384 13 13 (16613376)

I0119 03:24:18.620134 31232 net.cpp:165] Memory required for data: 1500082176

I0119 03:24:18.620141 31232 layer_factory.hpp:77] Creating layer conv4

I0119 03:24:18.620157 31232 net.cpp:106] Creating Layer conv4

I0119 03:24:18.620164 31232 net.cpp:454] conv4 <- conv3

I0119 03:24:18.620177 31232 net.cpp:411] conv4 -> conv4

I0119 03:24:18.631274 31232 net.cpp:150] Setting up conv4

I0119 03:24:18.631300 31232 net.cpp:157] Top shape: 256 384 13 13 (16613376)

I0119 03:24:18.631307 31232 net.cpp:165] Memory required for data: 1566535680

I0119 03:24:18.631319 31232 layer_factory.hpp:77] Creating layer relu4

I0119 03:24:18.631328 31232 net.cpp:106] Creating Layer relu4

I0119 03:24:18.631335 31232 net.cpp:454] relu4 <- conv4

I0119 03:24:18.631343 31232 net.cpp:397] relu4 -> conv4 (in-place)

I0119 03:24:18.631609 31232 net.cpp:150] Setting up relu4

I0119 03:24:18.631624 31232 net.cpp:157] Top shape: 256 384 13 13 (16613376)

I0119 03:24:18.631631 31232 net.cpp:165] Memory required for data: 1632989184

I0119 03:24:18.631638 31232 layer_factory.hpp:77] Creating layer conv5

I0119 03:24:18.631652 31232 net.cpp:106] Creating Layer conv5

I0119 03:24:18.631667 31232 net.cpp:454] conv5 <- conv4

I0119 03:24:18.631688 31232 net.cpp:411] conv5 -> conv5

I0119 03:24:18.639650 31232 net.cpp:150] Setting up conv5

I0119 03:24:18.639672 31232 net.cpp:157] Top shape: 256 256 13 13 (11075584)

I0119 03:24:18.639678 31232 net.cpp:165] Memory required for data: 1677291520

I0119 03:24:18.639695 31232 layer_factory.hpp:77] Creating layer relu5

I0119 03:24:18.639706 31232 net.cpp:106] Creating Layer relu5

I0119 03:24:18.639714 31232 net.cpp:454] relu5 <- conv5

I0119 03:24:18.639722 31232 net.cpp:397] relu5 -> conv5 (in-place)

I0119 03:24:18.639874 31232 net.cpp:150] Setting up relu5

I0119 03:24:18.639889 31232 net.cpp:157] Top shape: 256 256 13 13 (11075584)

I0119 03:24:18.639895 31232 net.cpp:165] Memory required for data: 1721593856

I0119 03:24:18.639902 31232 layer_factory.hpp:77] Creating layer pool5

I0119 03:24:18.639914 31232 net.cpp:106] Creating Layer pool5

I0119 03:24:18.639921 31232 net.cpp:454] pool5 <- conv5

I0119 03:24:18.639930 31232 net.cpp:411] pool5 -> pool5

I0119 03:24:18.640234 31232 net.cpp:150] Setting up pool5

I0119 03:24:18.640250 31232 net.cpp:157] Top shape: 256 256 6 6 (2359296)

I0119 03:24:18.640256 31232 net.cpp:165] Memory required for data: 1731031040

I0119 03:24:18.640264 31232 layer_factory.hpp:77] Creating layer fc6

I0119 03:24:18.640281 31232 net.cpp:106] Creating Layer fc6

I0119 03:24:18.640288 31232 net.cpp:454] fc6 <- pool5

I0119 03:24:18.640300 31232 net.cpp:411] fc6 -> fc6

I0119 03:24:19.173610 31232 net.cpp:150] Setting up fc6

I0119 03:24:19.173717 31232 net.cpp:157] Top shape: 256 4096 (1048576)

I0119 03:24:19.173727 31232 net.cpp:165] Memory required for data: 1735225344

I0119 03:24:19.173764 31232 layer_factory.hpp:77] Creating layer relu6

I0119 03:24:19.173811 31232 net.cpp:106] Creating Layer relu6

I0119 03:24:19.173823 31232 net.cpp:454] relu6 <- fc6

I0119 03:24:19.173838 31232 net.cpp:397] relu6 -> fc6 (in-place)

I0119 03:24:19.174217 31232 net.cpp:150] Setting up relu6

I0119 03:24:19.174232 31232 net.cpp:157] Top shape: 256 4096 (1048576)

I0119 03:24:19.174239 31232 net.cpp:165] Memory required for data: 1739419648

I0119 03:24:19.174247 31232 layer_factory.hpp:77] Creating layer drop6

I0119 03:24:19.174278 31232 net.cpp:106] Creating Layer drop6

I0119 03:24:19.174288 31232 net.cpp:454] drop6 <- fc6

I0119 03:24:19.174299 31232 net.cpp:397] drop6 -> fc6 (in-place)

I0119 03:24:19.174340 31232 net.cpp:150] Setting up drop6

I0119 03:24:19.174350 31232 net.cpp:157] Top shape: 256 4096 (1048576)

I0119 03:24:19.174356 31232 net.cpp:165] Memory required for data: 1743613952

I0119 03:24:19.174362 31232 layer_factory.hpp:77] Creating layer fc7

I0119 03:24:19.174377 31232 net.cpp:106] Creating Layer fc7

I0119 03:24:19.174384 31232 net.cpp:454] fc7 <- fc6

I0119 03:24:19.174396 31232 net.cpp:411] fc7 -> fc7

I0119 03:24:19.411005 31232 net.cpp:150] Setting up fc7

I0119 03:24:19.411108 31232 net.cpp:157] Top shape: 256 4096 (1048576)

I0119 03:24:19.411116 31232 net.cpp:165] Memory required for data: 1747808256

I0119 03:24:19.411136 31232 layer_factory.hpp:77] Creating layer relu7

I0119 03:24:19.411154 31232 net.cpp:106] Creating Layer relu7

I0119 03:24:19.411164 31232 net.cpp:454] relu7 <- fc7

I0119 03:24:19.411177 31232 net.cpp:397] relu7 -> fc7 (in-place)

I0119 03:24:19.411723 31232 net.cpp:150] Setting up relu7

I0119 03:24:19.411739 31232 net.cpp:157] Top shape: 256 4096 (1048576)

I0119 03:24:19.411746 31232 net.cpp:165] Memory required for data: 1752002560

I0119 03:24:19.411752 31232 layer_factory.hpp:77] Creating layer drop7

I0119 03:24:19.411766 31232 net.cpp:106] Creating Layer drop7

I0119 03:24:19.411772 31232 net.cpp:454] drop7 <- fc7

I0119 03:24:19.411782 31232 net.cpp:397] drop7 -> fc7 (in-place)

I0119 03:24:19.411813 31232 net.cpp:150] Setting up drop7

I0119 03:24:19.411826 31232 net.cpp:157] Top shape: 256 4096 (1048576)

I0119 03:24:19.411834 31232 net.cpp:165] Memory required for data: 1756196864

I0119 03:24:19.411839 31232 layer_factory.hpp:77] Creating layer fc8

I0119 03:24:19.411852 31232 net.cpp:106] Creating Layer fc8

I0119 03:24:19.411882 31232 net.cpp:454] fc8 <- fc7

I0119 03:24:19.411906 31232 net.cpp:411] fc8 -> fc8

I0119 03:24:19.469455 31232 net.cpp:150] Setting up fc8

I0119 03:24:19.469527 31232 net.cpp:157] Top shape: 256 1000 (256000)

I0119 03:24:19.469533 31232 net.cpp:165] Memory required for data: 1757220864

I0119 03:24:19.469552 31232 layer_factory.hpp:77] Creating layer loss

I0119 03:24:19.469569 31232 net.cpp:106] Creating Layer loss

I0119 03:24:19.469578 31232 net.cpp:454] loss <- fc8

I0119 03:24:19.469588 31232 net.cpp:454] loss <- label

I0119 03:24:19.469606 31232 net.cpp:411] loss -> loss

I0119 03:24:19.469629 31232 layer_factory.hpp:77] Creating layer loss

I0119 03:24:19.470729 31232 net.cpp:150] Setting up loss

I0119 03:24:19.470748 31232 net.cpp:157] Top shape: (1)

I0119 03:24:19.470755 31232 net.cpp:160]     with loss weight 1

I0119 03:24:19.470845 31232 net.cpp:165] Memory required for data: 1757220868

I0119 03:24:19.470854 31232 net.cpp:226] loss needs backward computation.

I0119 03:24:19.470866 31232 net.cpp:226] fc8 needs backward computation.

I0119 03:24:19.470873 31232 net.cpp:226] drop7 needs backward computation.

I0119 03:24:19.470880 31232 net.cpp:226] relu7 needs backward computation.

I0119 03:24:19.470885 31232 net.cpp:226] fc7 needs backward computation.

I0119 03:24:19.470895 31232 net.cpp:226] drop6 needs backward computation.

I0119 03:24:19.470901 31232 net.cpp:226] relu6 needs backward computation.

I0119 03:24:19.470906 31232 net.cpp:226] fc6 needs backward computation.

I0119 03:24:19.470913 31232 net.cpp:226] pool5 needs backward computation.

I0119 03:24:19.470919 31232 net.cpp:226] relu5 needs backward computation.

I0119 03:24:19.470927 31232 net.cpp:226] conv5 needs backward computation.

I0119 03:24:19.470933 31232 net.cpp:226] relu4 needs backward computation.

I0119 03:24:19.470940 31232 net.cpp:226] conv4 needs backward computation.

I0119 03:24:19.470947 31232 net.cpp:226] relu3 needs backward computation.

I0119 03:24:19.470954 31232 net.cpp:226] conv3 needs backward computation.

I0119 03:24:19.470960 31232 net.cpp:226] norm2 needs backward computation.

I0119 03:24:19.470968 31232 net.cpp:226] pool2 needs backward computation.

I0119 03:24:19.470976 31232 net.cpp:226] relu2 needs backward computation.

I0119 03:24:19.470983 31232 net.cpp:226] conv2 needs backward computation.

I0119 03:24:19.470991 31232 net.cpp:226] norm1 needs backward computation.

I0119 03:24:19.470999 31232 net.cpp:226] pool1 needs backward computation.

I0119 03:24:19.471005 31232 net.cpp:226] relu1 needs backward computation.

I0119 03:24:19.471012 31232 net.cpp:226] conv1 needs backward computation.

I0119 03:24:19.471019 31232 net.cpp:228] data does not need backward computation.

I0119 03:24:19.471025 31232 net.cpp:270] This network produces output loss

I0119 03:24:19.471046 31232 net.cpp:283] Network initialization done.

I0119 03:24:19.471745 31232 solver.cpp:181] Creating test net (#0) specified by net file: models/bvlc_reference_caffenet/train_val.prototxt

I0119 03:24:19.471794 31232 net.cpp:322] The NetState phase (1) differed from the phase (0) specified by a rule in layer data

I0119 03:24:19.472040 31232 net.cpp:49] Initializing net from parameters: 

name: "CaffeNet"

state {

  phase: TEST

}

layer {

  name: "data"

  type: "Data"

  top: "data"

  top: "label"

  include {

    phase: TEST

  }

  transform_param {

    mirror: false

    crop_size: 227

    mean_file: "data/ilsvrc12/imagenet_mean.binaryproto"

  }

  data_param {

    source: "examples/imagenet/ilsvrc12_val_lmdb"

    batch_size: 50

    backend: LMDB

  }

}

layer {

  name: "conv1"

  type: "Convolution"

  bottom: "data"

  top: "conv1"

  param {

    lr_mult: 1

    decay_mult: 1

  }

  param {

    lr_mult: 2

    decay_mult: 0

  }

  convolution_param {

    num_output: 96

    kernel_size: 11

    stride: 4

    weight_filler {

      type: "gaussian"

      std: 0.01

    }

    bias_filler {

      type: "constant"

      value: 0

    }

  }

}

layer {

  name: "relu1"

  type: "ReLU"

  bottom: "conv1"

  top: "conv1"

}

layer {

  name: "pool1"

  type: "Pooling"

  bottom: "conv1"

  top: "pool1"

  pooling_param {

    pool: MAX

    kernel_size: 3

    stride: 2

  }

}

layer {

  name: "norm1"

  type: "LRN"

  bottom: "pool1"

  top: "norm1"

  lrn_param {

    local_size: 5

    alpha: 0.0001

    beta: 0.75

  }

}

layer {

  name: "conv2"

  type: "Convolution"

  bottom: "norm1"

  top: "conv2"

  param {

    lr_mult: 1

    decay_mult: 1

  }

  param {

    lr_mult: 2

    decay_mult: 0

  }

  convolution_param {

    num_output: 256

    pad: 2

    kernel_size: 5

    group: 2

    weight_filler {

      type: "gaussian"

      std: 0.01

    }

    bias_filler {

      type: "constant"

      value: 1

    }

  }

}

layer {

  name: "relu2"

  type: "ReLU"

  bottom: "conv2"

  top: "conv2"

}

layer {

  name: "pool2"

  type: "Pooling"

  bottom: "conv2"

  top: "pool2"

  pooling_param {

    pool: MAX

    kernel_size: 3

    stride: 2

  }

}

layer {

  name: "norm2"

  type: "LRN"

  bottom: "pool2"

  top: "norm2"

  lrn_param {

    local_size: 5

    alpha: 0.0001

    beta: 0.75

  }

}

layer {

  name: "conv3"

  type: "Convolution"

  bottom: "norm2"

  top: "conv3"

  param {

    lr_mult: 1

    decay_mult: 1

  }

  param {

    lr_mult: 2

    decay_mult: 0

  }

  convolution_param {

    num_output: 384

    pad: 1

    kernel_size: 3

    weight_filler {

      type: "gaussian"

      std: 0.01

    }

    bias_filler {

      type: "constant"

      value: 0

    }

  }

}

layer {

  name: "relu3"

  type: "ReLU"

  bottom: "conv3"

  top: "conv3"

}

layer {

  name: "conv4"

  type: "Convolution"

  bottom: "conv3"

  top: "conv4"

  param {

    lr_mult: 1

    decay_mult: 1

  }

  param {

    lr_mult: 2

    decay_mult: 0

  }

  convolution_param {

    num_output: 384

    pad: 1

    kernel_size: 3

    group: 2

    weight_filler {

      type: "gaussian"

      std: 0.01

    }

    bias_filler {

      type: "constant"

      value: 1

    }

  }

}

layer {

  name: "relu4"

  type: "ReLU"

  bottom: "conv4"

  top: "conv4"

}

layer {

  name: "conv5"

  type: "Convolution"

  bottom: "conv4"

  top: "conv5"

  param {

    lr_mult: 1

    decay_mult: 1

  }

  param {

    lr_mult: 2

    decay_mult: 0

  }

  convolution_param {

    num_output: 256

    pad: 1

    kernel_size: 3

    group: 2

    weight_filler {

      type: "gaussian"

      std: 0.01

    }

    bias_filler {

      type: "constant"

      value: 1

    }

  }

}

layer {

  name: "relu5"

  type: "ReLU"

  bottom: "conv5"

  top: "conv5"

}

layer {

  name: "pool5"

  type: "Pooling"

  bottom: "conv5"

  top: "pool5"

  pooling_param {

    pool: MAX

    kernel_size: 3

    stride: 2

  }

}

layer {

  name: "fc6"

  type: "InnerProduct"

  bottom: "pool5"

  top: "fc6"

  param {

    lr_mult: 1

    decay_mult: 1

  }

  param {

    lr_mult: 2

    decay_mult: 0

  }

  inner_product_param {

    num_output: 4096

    weight_filler {

      type: "gaussian"

      std: 0.005

    }

    bias_filler {

      type: "constant"

      value: 1

    }

  }

}

layer {

  name: "relu6"

  type: "ReLU"

  bottom: "fc6"

  top: "fc6"

}

layer {

  name: "drop6"

  type: "Dropout"

  bottom: "fc6"

  top: "fc6"

  dropout_param {

    dropout_ratio: 0.5

  }

}

layer {

  name: "fc7"

  type: "InnerProduct"

  bottom: "fc6"

  top: "fc7"

  param {

    lr_mult: 1

    decay_mult: 1

  }

  param {

    lr_mult: 2

    decay_mult: 0

  }

  inner_product_param {

    num_output: 4096

    weight_filler {

      type: "gaussian"

      std: 0.005

    }

    bias_filler {

      type: "constant"

      value: 1

    }

  }

}

layer {

  name: "relu7"

  type: "ReLU"

  bottom: "fc7"

  top: "fc7"

}

layer {

  name: "drop7"

  type: "Dropout"

  bottom: "fc7"

  top: "fc7"

  dropout_param {

    dropout_ratio: 0.5

  }

}

layer {

  name: "fc8"

  type: "InnerProduct"

  bottom: "fc7"

  top: "fc8"

  param {

    lr_mult: 1

    decay_mult: 1

  }

  param {

    lr_mult: 2

    decay_mult: 0

  }

  inner_product_param {

    num_output: 1000

    weight_filler {

      type: "gaussian"

      std: 0.01

    }

    bias_filler {

      type: "constant"

      value: 0

    }

  }

}

layer {

  name: "accuracy"

  type: "Accuracy"

  bottom: "fc8"

  bottom: "label"

  top: "accuracy"

  include {

    phase: TEST

  }

}

layer {

  name: "loss"

  type: "SoftmaxWithLoss"

  bottom: "fc8"

  bottom: "label"

  top: "loss"

}

I0119 03:24:19.472198 31232 layer_factory.hpp:77] Creating layer data

I0119 03:24:19.472345 31232 net.cpp:106] Creating Layer data

I0119 03:24:19.472362 31232 net.cpp:411] data -> data

I0119 03:24:19.472378 31232 net.cpp:411] data -> label

I0119 03:24:19.472390 31232 data_transformer.cpp:25] Loading mean file from: data/ilsvrc12/imagenet_mean.binaryproto

F0119 03:24:19.473767 31237 db_lmdb.hpp:14] Check failed: mdb_status == 0 (2 vs. 0) No such file or directory

*** Check failure stack trace: ***

    @     0x7f4ddcede5cd  google::LogMessage::Fail()

    @     0x7f4ddcee0433  google::LogMessage::SendToLog()

    @     0x7f4ddcede15b  google::LogMessage::Flush()

    @     0x7f4ddcee0e1e  google::LogMessageFatal::~LogMessageFatal()

    @     0x7f4ddd5a2362  caffe::db::LMDB::Open()

    @     0x7f4ddd424396  caffe::DataReader::Body::InternalThreadEntry()

    @     0x7f4ddd41d725  caffe::InternalThread::entry()

    @     0x7f4ddb714bc5  (unknown)

    @     0x7f4ddb4ed6aa  start_thread

    @     0x7f4ddba30eed  clone

    @              (nil)  (unknown)




Jan C Peters

Jan 19, 2016, 4:42:54 AM
to Caffe Users
The last ~10 lines of the log would have sufficed to see the error: Caffe tries to open an LMDB database which does not exist. Why? Because it uses a network definition that tells it to use "examples/imagenet/ilsvrc12_train_lmdb", which is an example database, not the one you want to use. The same goes for the mean file. You created a database for your files and computed the mean file, but the caffe train command you issued is completely ignorant of both.

You cannot just take an existing solver and network definition; you have to adjust them at least to point to your data, your computed mean file, and your training settings. Take your time and read the tutorial documentation first (http://caffe.berkeleyvision.org/tutorial/). Just downloading Caffe and immediately being productive with it will not work. You will have to understand what you are doing.
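Concretely, here is a minimal sketch of the required edits (the directory name models/my_mammo is just an example; adjust all paths to your setup). First copy the reference model so the originals stay intact:

cp -r models/bvlc_reference_caffenet models/my_mammo

In models/my_mammo/train_val.prototxt, point both data layers at your own data (the TEST-phase layer needs its own validation LMDB, built the same way as your train_lmdb):

  transform_param {
    mirror: true
    crop_size: 227
    mean_file: "mean.binaryproto"   # the mean you computed, not the ImageNet one
  }
  data_param {
    source: "train_lmdb"            # the LMDB you created, not the ImageNet example
    batch_size: 32                  # whatever fits in your GPU memory
    backend: LMDB
  }

and change the last InnerProduct layer (fc8) to match your number of classes:

  inner_product_param {
    num_output: 3                   # normal / cancer / benign, not 1000 ImageNet classes
    # (weight_filler and bias_filler unchanged)
  }

SoftmaxWithLoss expects labels in the range 0..num_output-1, so your three classes must be labeled 0, 1, 2 rather than 1, 2, 3. Finally, in models/my_mammo/solver.prototxt update net: and snapshot_prefix: to the new directory, set test_iter and max_iter to values that make sense for the size of your dataset, and then run:

./build/tools/caffe train --solver=models/my_mammo/solver.prototxt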

Jan