Good evening, dear community! I hope this is the right place for this discussion.
Here is my problem: I'm trying to reproduce one of the modern CNN architectures with the Java API. Most of them use BatchNormalization as a layer, which in Python is backed by the tf.nn.batch_normalization() op.
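For context, here is my rough understanding of what tf.nn.batch_normalization computes, sketched with the primitive math ops we do have in the Java API (batchNorm is just my own helper name, not an existing API, so please correct me if I got the formula wrong):

```java
import org.tensorflow.Operand;
import org.tensorflow.op.Ops;
import org.tensorflow.types.TFloat32;

// Sketch of the batch-norm formula y = scale * (x - mean) / sqrt(variance + eps) + offset,
// built only from primitive math ops. scale/offset are the trainable gamma/beta of the layer;
// mean/variance are the batch moments (or moving averages at inference time).
static Operand<TFloat32> batchNorm(Ops tf,
                                   Operand<TFloat32> x,
                                   Operand<TFloat32> mean,
                                   Operand<TFloat32> variance,
                                   Operand<TFloat32> offset,  // beta
                                   Operand<TFloat32> scale,   // gamma
                                   float epsilon) {
  // inv = gamma / sqrt(variance + eps)
  Operand<TFloat32> inv = tf.math.mul(
      tf.math.rsqrt(tf.math.add(variance, tf.constant(epsilon))), scale);
  // y = x * inv + (beta - mean * inv)
  return tf.math.add(tf.math.mul(x, inv),
                     tf.math.sub(offset, tf.math.mul(mean, inv)));
}
```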
I tried to fall back on the old ops like BatchNormWithGlobalNormalization, but got:

```
Exception in thread "main" org.tensorflow.exceptions.TFUnimplementedException: Op BatchNormWithGlobalNormalization is not available in GraphDef version 175. It has been removed in version 9. Use tf.nn.batch_normalization().
	at org.tensorflow.internal.c_api.AbstractTF_Status.throwExceptionIfNotOK(AbstractTF_Status.java:99)
```
That op was deprecated years ago, yet it is still exposed in the 1.15 and 2.x APIs.
* BatchNorm contains trainable parameters, so it also participates in the gradient calculation (its internal state is close to an Optimizer and its internal variables, but it belongs to the model weights).
But it relies on tf.nn.moments, which is not present in our API. Also, all the batch-norm-related ops I know of expect the results of tf.nn.moments as input parameters (which looks strange).
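The only workaround I can see is to rebuild tf.nn.moments myself from tf.math.mean and tf.math.squaredDifference, roughly like this (untested sketch, assuming a recent tensorflow-core-api where op factories take Class<T> type arguments; the input shape is just for illustration):

```java
import org.tensorflow.Graph;
import org.tensorflow.Operand;
import org.tensorflow.ndarray.Shape;
import org.tensorflow.op.Ops;
import org.tensorflow.op.core.Placeholder;
import org.tensorflow.op.math.Mean;
import org.tensorflow.types.TFloat32;
import org.tensorflow.types.TInt32;

try (Graph g = new Graph()) {
  Ops tf = Ops.create(g);
  // NHWC input; -1 marks the unknown batch dimension, 64 channels as an example
  Operand<TFloat32> x = tf.placeholder(TFloat32.class,
      Placeholder.shape(Shape.of(-1, 32, 32, 64)));

  // Replacement for tf.nn.moments: reduce over the batch and spatial axes (N, H, W)
  Operand<TInt32> axes = tf.constant(new int[] {0, 1, 2});
  Operand<TFloat32> mean = tf.math.mean(x, axes, Mean.keepDims(true));
  Operand<TFloat32> variance = tf.math.mean(
      tf.math.squaredDifference(x, mean), axes, Mean.keepDims(true));
  // mean/variance could then be fed into the batch-norm formula sketched above,
  // or swapped for moving averages at inference time.
}
```

Does this look like a reasonable substitute, or am I missing an existing op that already does this?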
@Jim, I know you have explored a lot of the missing pieces in the CC API; have you run into this problem? I suspect the lack of such a layer will be a big obstacle for Java Keras usage.
@Karl, do we have a chance to fix this? Have you dealt with this kind of normalization before?
Does anybody have a working example with BatchNorm? If you have any ideas, snippets, or related experience, please share them in this thread.
Alex