DIGITS preprocessing -- Subtract mean image


Yuguang Lee

Jul 15, 2015, 6:04:35 PM
to digits...@googlegroups.com

Hi guys, 


I'm building a DNN detector that looks for certain cell patterns. The detector responds positively when a black pattern appears in the center of the input image, as shown below.


[images: one positive sample, several negative samples]
I found that subtracting the mean image (a full BGR image) during preprocessing trains a better model than subtracting the mean pixel (a single set of BGR values computed from the mean image). However, from the code posted here: https://github.com/NVIDIA/DIGITS/issues/59, it looks like the transformer is not actually subtracting the whole mean image. If that's the case, is there any way I could do this preprocessing step with DIGITS without touching the DIGITS code?
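To make the distinction concrete, here is a minimal numpy sketch (not DIGITS/Caffe code; the array shapes and data are made up for illustration) of the two preprocessing options being compared. The mean image is a per-pixel average over the training set, so subtracting it removes location-dependent bias, which matters here because the positive pattern always sits in the image center. The mean pixel averages over pixel positions as well, so it only removes a global per-channel offset:

```python
import numpy as np

# Hypothetical mini-dataset: 4 BGR images, 8x8 pixels, shape (N, H, W, C).
rng = np.random.default_rng(0)
images = rng.integers(0, 256, size=(4, 8, 8, 3)).astype(np.float32)

# Mean image: per-pixel, per-channel average over the dataset -> shape (H, W, C).
mean_image = images.mean(axis=0)

# Mean pixel: a single BGR triple, averaged over the dataset and all positions.
mean_pixel = images.mean(axis=(0, 1, 2))

# Option 1: subtract the whole mean image (removes per-location bias).
centered_full = images - mean_image

# Option 2: subtract only the mean pixel (removes a global offset per channel).
centered_pixel = images - mean_pixel

# After mean-image subtraction, every pixel position averages to ~0 across
# the dataset; after mean-pixel subtraction, only the global mean is ~0.
print(np.abs(centered_full.mean(axis=0)).max())
print(np.abs(centered_pixel.mean(axis=(0, 1, 2))).max())
```

If the discriminative signal is tied to a fixed spatial location, option 1 can help the network, which is consistent with the behavior described above.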




Thanks

Luke Yeager

Jul 15, 2015, 8:18:28 PM
to digits...@googlegroups.com, seva...@gmail.com
While looking into answering your question, I realized there's a bug in DIGITS:

> is there any way I could finish this preprocessing procedure using DIGITS without touching the DIGITS code?

To answer your question: no, there's no way to change this behavior at the moment without modifying the code. But the model generated by DIGITS actually does what you want - the full mean image is subtracted - even though it's not supposed to be!