Why subtract mean BGR values, and where do those mean values come from in FCN?


JunSik CHOI

Dec 8, 2016, 6:42:54 AM
to Caffe Users
Hi, everyone. 

I am new to Caffe and deep learning, so I'm having a hard time understanding Caffe, deep learning theory, and everything else.
 
Recently, I have been trying to implement the fully convolutional network (FCN) from the original paper by Jonathan Long, Evan Shelhamer, and Trevor Darrell.

In their code, they subtract mean BGR values from every image.
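To make this concrete, here is a minimal sketch of what such mean subtraction could look like, assuming numpy and PIL (the image path is a placeholder, and the mean values are the ones I quote in question 3 below):

import numpy as np
from PIL import Image

# Load an image as a float32 array in H x W x 3 (RGB) order.
im = Image.open('some_image.jpg')          # placeholder path
in_ = np.array(im, dtype=np.float32)

# Reverse the channel order from RGB to BGR (the order Caffe expects).
in_ = in_[:, :, ::-1]

# Subtract the per-channel mean BGR values used in the FCN code.
in_ -= np.array((104.00699, 116.66877, 122.67892))

# Reorder from H x W x C to C x H x W, Caffe's blob layout.
in_ = in_.transpose((2, 0, 1))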

My questions are,

1. Why subtract mean BGR values?
    - Is it to normalize the images for training and validation? But why? The distribution of pixel values stays the same after subtraction, doesn't it?
2. In the test phase, should I also subtract the mean BGR values from the test images?
3. The values in the source code (FCN GitHub) are mean = (104.00699, 116.66877, 122.67892). Where do they come from?
    - Are they the mean values of the images from SBDD train.txt + seg11val.txt?
    - I actually calculated the mean values of the images from SBDD train.txt + seg11val.txt (a sketch of that calculation is below), and they are not the same as (104.00699, 116.66877, 122.67892). So where do they come from?
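For reference, here is a minimal sketch of how such per-channel means could be computed, assuming numpy and PIL ('train_list.txt' is a placeholder for a file listing one image path per line, not the actual SBDD file):

import numpy as np
from PIL import Image

# Accumulate per-channel sums and the total pixel count over all listed images.
channel_sum = np.zeros(3, dtype=np.float64)
pixel_count = 0

with open('train_list.txt') as f:      # placeholder: one image path per line
    for line in f:
        im = np.array(Image.open(line.strip()).convert('RGB'), dtype=np.float64)
        channel_sum += im.reshape(-1, 3).sum(axis=0)
        pixel_count += im.shape[0] * im.shape[1]

mean_rgb = channel_sum / pixel_count
mean_bgr = mean_rgb[::-1]              # flip to BGR order to compare with the FCN values
print(mean_bgr)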

Thank you.

 

Martin Lukačovič

Mar 5, 2017, 8:27:24 AM
to Caffe Users
Hi, is there anyone who can clarify this, please?

@Jun-sik CHOI
How did you calculate the per-channel mean values of the images?

Thanks!


On Thursday, December 8, 2016 at 12:42:54 UTC+1, Jun-sik CHOI wrote:

souZou

May 10, 2017, 8:59:47 AM
to Caffe Users
Hello,
I have the same problem.
Did you resolve this issue?
How do you calculate the mean values of the images for each channel?

Best regards,

puren...@gmail.com

Apr 2, 2018, 5:54:15 AM
to Caffe Users
Hello,

I would like to get an answer to these questions as well.
Also, why do they convert RGB to BGR? 

Best regards