I am using a custom estimator to build a CNN. In the EVAL branch I return the training loss and also the loss computed by tf.metrics.mean_squared_error. Here is the relevant part of the model function:
loss1 = tf.losses.mean_squared_error(labels, dense)
loss = loss1

if mode == tf.estimator.ModeKeys.TRAIN:
    optimizer = tf.train.AdamOptimizer(learning_rate=0.000001)
    train_op = optimizer.minimize(loss, global_step=tf.train.get_global_step())
    return tf.estimator.EstimatorSpec(mode, loss=loss, train_op=train_op)

if mode == tf.estimator.ModeKeys.EVAL:
    metrics = {'pure_loss': tf.metrics.mean_squared_error(labels, tf.cast(dense, tf.float64), name='loss1')}
    return tf.estimator.EstimatorSpec(mode, loss=loss, eval_metric_ops=metrics)
These two losses should be identical, but they are not. Here is the output from one run:
{'loss': 0.9698009, 'pure_loss': 1.1101756, 'global_step': 3}
{'loss': 1.0194854, 'pure_loss': 1.1081429, 'global_step': 6}
{'loss': 1.0608621, 'pure_loss': 1.1064296, 'global_step': 9}
{'loss': 1.3501577, 'pure_loss': 1.1048187, 'global_step': 12}
{'loss': 1.156745, 'pure_loss': 1.1032388, 'global_step': 15}
{'loss': 1.094354, 'pure_loss': 1.1016937, 'global_step': 18}
{'loss': 1.1235374, 'pure_loss': 1.100199, 'global_step': 21}
{'loss': 1.005026, 'pure_loss': 1.0985746, 'global_step': 24}
{'loss': 1.403479, 'pure_loss': 1.0968639, 'global_step': 27}
{'loss': 1.100715, 'pure_loss': 1.0952694, 'global_step': 30}
My goal is to use loss as the total loss (including regularization) and pure_loss as the loss without regularization. That did not work, so I removed the regularization entirely and found that the two values still differ. Is there a problem with my code?
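For reference, this is roughly the setup I was aiming for before I stripped the regularization out. It is only a sketch, not my exact code: build_network is a placeholder for the CNN construction, and I am assuming the penalties are attached through the layers' kernel_regularizer arguments so that tf.losses.get_regularization_loss() picks them up:

import tensorflow as tf

def model_fn(features, labels, mode):
    # build_network is a placeholder for the actual CNN; its layers use
    # kernel_regularizer, so the penalties land in tf.GraphKeys.REGULARIZATION_LOSSES
    dense = build_network(features)

    mse = tf.losses.mean_squared_error(labels, dense)   # loss without regularization
    reg = tf.losses.get_regularization_loss()           # sum of the collected penalties
    loss = mse + reg                                     # total loss, reported as 'loss'

    if mode == tf.estimator.ModeKeys.TRAIN:
        optimizer = tf.train.AdamOptimizer(learning_rate=0.000001)
        train_op = optimizer.minimize(loss, global_step=tf.train.get_global_step())
        return tf.estimator.EstimatorSpec(mode, loss=loss, train_op=train_op)

    if mode == tf.estimator.ModeKeys.EVAL:
        metrics = {'pure_loss': tf.metrics.mean_squared_error(labels, dense, name='loss1')}
        return tf.estimator.EstimatorSpec(mode, loss=loss, eval_metric_ops=metrics)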