[(layers/input input-w input-h 1 :id :data)
 ;; 5x5 kernels, pad 0, stride 1, 20 feature maps
 (layers/convolutional 5 0 1 20)
 ;; 2x2 pooling, pad 0, stride 2
 (layers/max-pooling 2 0 2)
 (layers/dropout 0.9)
 (layers/relu)
 ;; 5x5 kernels, pad 0, stride 1, 50 feature maps
 (layers/convolutional 5 0 1 50)
 (layers/max-pooling 2 0 2)
 (layers/batch-normalization)
 (layers/linear 1000)
 (layers/relu :center-loss {:label-indexes {:stream :labels}
                            :label-inverse-counts {:stream :labels}
                            :labels {:stream :labels}
                            :alpha 0.9
                            :lambda 1e-4})
 (layers/dropout 0.5)
 (layers/linear num-classes)
 (layers/softmax :id :labels)]
It seems like we should get rid of the initial `(layers/dropout 0.9)` in the example. Is there a reason I'm missing for it to be there?
- Carin