Normalization Layer

yr

Jan 10, 2015, 5:31:28 AM1/10/15
to caffe...@googlegroups.com
Hi, I have recently read the paper "Improving Deep Neural Network Acoustic Models Using Generalized Maxout Networks", which proposes a few variants of the maxout activation. It also suggests a normalization layer, as in Eq (5), to constrain the standard deviation of the unbounded activation values, as follows:

Input:  x_i
Output: y_i
sigma = sqrt(1/K * sum_i (x_i)^2)
if sigma <= 1: y_i = x_i
else:          y_i = x_i / sigma
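For reference, the forward pass above can be sketched in NumPy. This is just a minimal illustration of the math, not a Caffe layer; the function name `norm_layer_forward` is my own, and `K` is assumed to be the number of activations being normalized:

```python
import numpy as np

def norm_layer_forward(x, K=None):
    # Eq (5): sigma = sqrt(1/K * sum_i x_i^2).
    # Hypothetical helper, not part of Caffe; K defaults to len(x).
    K = x.size if K is None else K
    sigma = np.sqrt(np.sum(x ** 2) / K)
    # Rescale only when sigma exceeds 1, so small activations pass through.
    return x if sigma <= 1 else x / sigma
```

Note that when sigma > 1, the output by construction has sigma exactly 1, which is what bounds the standard deviation of the activations.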

I am interested in implementing this in Caffe using existing layers. For sigma, I can use the POWER layer and the ELTWISE layer with the sum operation, but I can't quite figure out how to compute y_i given x_i.
Any suggestions would be appreciated.