The following shows the per-step time reported by the Keras progress bar across repeated model.evaluate() calls:
1/1 [==============================] - 0s 386us/step - loss: 0.0000e+00 - acc: 0.0000e+00
INFO:xtremedistil:Count of instances with label 0.0 is 32
1/1 [==============================] - 0s 249us/step - loss: 0.0000e+00 - acc: 0.0000e+00
INFO:xtremedistil:Count of instances with label 0.0 is 32
1/1 [==============================] - 0s 275us/step - loss: 0.0000e+00 - acc: 0.0000e+00
INFO:xtremedistil:Count of instances with label 0.0 is 32
1/1 [==============================] - 0s 280us/step - loss: 0.0000e+00 - acc: 0.0000e+00
INFO:xtremedistil:Count of instances with label 0.0 is 32
1/1 [==============================] - 0s 250us/step - loss: 0.0000e+00 - acc: 0.0000e+00
INFO:xtremedistil:Count of instances with label 0.0 is 32
1/1 [==============================] - 0s 249us/step - loss: 0.0000e+00 - acc: 0.0000e+00
INFO:xtremedistil:Count of instances with label 0.0 is 32
1/1 [==============================] - 0s 252us/step - loss: 0.0000e+00 - acc: 0.0000e+00
INFO:xtremedistil:Count of instances with label 0.0 is 32
1/1 [==============================] - 0s 255us/step - loss: 0.0000e+00 - acc: 0.0000e+00
INFO:xtremedistil:Count of instances with label 0.0 is 32
1/1 [==============================] - 0s 260us/step - loss: 0.0000e+00 - acc: 0.0000e+00
The following shows the Python time() difference measured around each model.evaluate() call:
python time [2.9290642738342285, 0.07118463516235352, 0.15517759323120117, 0.07076430320739746, 0.06884241104125977, 0.06838345527648926, 0.07424783706665039, 0.06981348991394043, 0.06886076927185059]
The following shows callback.logs from a custom callback function that I wrote and passed to model.evaluate():
callback time [2.894454002380371, 0.039789676666259766, 0.12280726432800293, 0.03963470458984375, 0.0384669303894043, 0.03844141960144043, 0.04326891899108887, 0.039290428161621094, 0.03845667839050293]
As you can see, the time reported is in microseconds in the first case (the progress bar), but is much larger in the second case (time() around model.evaluate()) and the third case (the callback).
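For concreteness, here is a minimal sketch of the comparison I am making, with the numbers copied from the output above (the variable names are just for illustration):

step_time_s = 386e-6                     # "386us/step" from the progress bar (first call)
wall_time_first_s = 2.9290642738342285   # time() around the first model.evaluate() call
wall_time_later_s = 0.07118463516235352  # time() around the second call

print(f"first call, time spent outside the step: {wall_time_first_s - step_time_s:.3f} s")
print(f"later call, time spent outside the step: {wall_time_later_s - step_time_s:.3f} s")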
import os
from time import time

import numpy as np
import tensorflow as tf

with strategy.scope():
    model.compile(
        optimizer=tf.keras.optimizers.Adam(learning_rate=3e-5, epsilon=1e-08),
        loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
        metrics=[tf.keras.metrics.SparseCategoricalAccuracy(name="acc")])
    logger.info(model.summary())
    model.load_weights(os.path.join(args["model_dir"], "x.h5"))
    # time the whole evaluate call with time()
    start = time()
    Y = model.evaluate(X, batch_size=1, callbacks=[cb])
    y = np.argmax(Y, axis=-1)
    end = time()
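For what it's worth, here is a minimal sketch of how I could separate the first call (which I assume includes one-time setup such as graph tracing) from the later calls, so the steady-state numbers are not skewed by the ~2.9 s first reading:

from time import time

model.evaluate(X, batch_size=1, verbose=0)   # warm-up call, not timed

wall_times = []
for _ in range(5):
    start = time()
    model.evaluate(X, batch_size=1, verbose=0)
    wall_times.append(time() - start)
print(wall_times)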
from time import time

import tensorflow as tf

class TimingCallback(tf.keras.callbacks.Callback):
    """Records the wall-clock time of each evaluate() call in self.logs."""
    def __init__(self):
        self.logs = []
    def on_test_begin(self, logs=None):
        self.starttime = time()
    def on_test_end(self, logs=None):
        self.logs.append(time() - self.starttime)
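For completeness, this is how the instance is wired into the evaluate call above (assuming cb in the first snippet is built this way):

cb = TimingCallback()
model.evaluate(X, batch_size=1, callbacks=[cb])
print(cb.logs)  # -> the "callback time" list shown above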
Any help appreciated. Thanks
Deepa