I am trying to generate the mean image from my training data in Caffe. My data consists of 256x256 grayscale images. I created the LMDB databases using create_imagenet.sh, replacing --shuffle with --gray.
I edited create_imagenet.sh as follows:
GLOG_logtostderr=1 $TOOLS/convert_imageset \
--resize_height=$RESIZE_HEIGHT \
--resize_width=$RESIZE_WIDTH \
--gray \
$TRAIN_DATA_ROOT \
$DATA/train.txt \
$EXAMPLE/train_lmdb
echo "Creating val lmdb..."
GLOG_logtostderr=1 $TOOLS/convert_imageset \
--resize_height=$RESIZE_HEIGHT \
--resize_width=$RESIZE_WIDTH \
--gray \
$VAL_DATA_ROOT \
$DATA/val.txt \
$EXAMPLE/val_lmdb
echo "Done."
I successfully generated my LMDB databases, but I still get the following error while creating the mean image:
F0105 14:50:52.470038 2191 compute_image_mean.cpp:77] Check failed: size_in_datum == data_size (64000 vs. 65536) Incorrect data field size 64000
*** Check failure stack trace: ***
@ 0x7faa4978d5cd google::LogMessage::Fail()
@ 0x7faa4978f433 google::LogMessage::SendToLog()
@ 0x7faa4978d15b google::LogMessage::Flush()
@ 0x7faa4978fe1e google::LogMessageFatal::~LogMessageFatal()
@ 0x402be1 main
@ 0x7faa486da830 __libc_start_main
@ 0x403249 _start
@ (nil) (unknown)
Aborted (core dumped)
I am sure all the images have the same size and format (a 256x256 grayscale image should be 65536 bytes, which matches the expected data size in the error); I verified the sizes with a quick check like the one below.
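A minimal sketch of that check, assuming ImageMagick's identify is available and the images live under the training root (adjust the path and extension to your data):

# list every distinct WIDTHxHEIGHT found under the training root;
# a single "256x256" line means all images really are that size
find "$TRAIN_DATA_ROOT" -type f -name '*.png' -exec identify -format "%wx%h\n" {} + | sort | uniq -c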
Does anyone have any suggestions for tackling this error? Your help is really appreciated.