
request a feature in tensorflow-serving which already exists in tensorflow


zhoumin...@gmail.com

Mar 18, 2023, 6:59:55 AM
to TensorFlow End Users - GETTING STARTED, TUTORIALS & HOW-TO'S
Hi, dear all.

        In TensorFlow 2.0, we know that we can save a model to a file with its weights stored as float32, and then load the model from that file and train or run inference on it with float16 computation by enabling the mixed-precision feature, i.e. by running the following snippet of code:

import tensorflow as tf
# Experimental API in TF 2.0-2.3; TF 2.4+ uses tf.keras.mixed_precision.set_global_policy.
policy = tf.keras.mixed_precision.experimental.Policy('mixed_float16')
tf.keras.mixed_precision.experimental.set_policy(policy)

        I wonder whether tensorflow-serving is bestowed with the same capability. For example, can I set some parameter to "mixed_float16" (or something similar) in, say, the model config file that is passed to tensorflow-serving, so that when it loads a model (pb) file whose weights are all float32, the computation (training or inference) is done in float16?
If it can be done, how do I set this parameter, and in which file? If it cannot be done, what are the alternative ways to get the same result? Or is the only option to convert the whole model from float32 to float16 before the model pb file is passed to tensorflow-serving?
What are the solutions in tensorflow-serving? Thanks in advance.
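
For concreteness, the kind of manual conversion I have in mind as a fallback is roughly the sketch below. This is only my own guess at a workaround, not something I found in the tensorflow-serving docs; build_model() and the /models/... paths are hypothetical placeholders for my own model code and export directories.

import tensorflow as tf

# 1. Pull the float32 weights out of the existing Keras SavedModel.
fp32_model = tf.keras.models.load_model('/models/my_model/1')
weights = fp32_model.get_weights()

# 2. Rebuild the same architecture under a mixed_float16 policy
#    (tf.keras.mixed_precision.set_global_policy is the non-experimental API in TF 2.4+).
#    Layers created now compute in float16 while keeping their variables in float32.
tf.keras.mixed_precision.set_global_policy('mixed_float16')
fp16_model = build_model()  # hypothetical function that rebuilds the same architecture
fp16_model.set_weights(weights)

# 3. Re-export. The float16 casts are traced into the SavedModel graph, so
#    tensorflow-serving should run mixed-precision inference without any
#    serving-side switch (if my understanding is correct).
fp16_model.save('/models/my_model_fp16/1')

If something like this re-export is the expected route, that is fine too, but a serving-side option would be much more convenient.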

Best Regards.
