Issue using TF Text with Tensorflow serving


Maxime Poulain

Jun 28, 2019, 10:33:55 AM
to TensorFlow Developers
Hi!

I have a model with TF Text preprocessing included in the graph exported for TensorFlow Serving, but I get the following error:

2019-06-28 13:29:19.048482: E tensorflow_serving/util/retrier.cc:37] Loading servable: {name: my_model version: 1561728496} failed: Not found: Op type not registered 'UnicodeScriptTokenizeWithOffsets' in binary running on 57085cb71277. Make sure the Op and Kernel are registered in the binary running in this process. Note that if you are loading a saved graph which used ops from tf.contrib, accessing (e.g.) `tf.contrib.resampler` should be done before importing the graph, as contrib ops are lazily registered when the module is first accessed.

It seems that the 'UnicodeScriptTokenizeWithOffsets' op from TF Text is not included in the TF Serving binary. I saw that one solution could be to build a custom serving image that includes the TF Text ops and kernels, using the TensorFlow Serving Docker development image. Is that right? How could I do that? Or maybe there is a simpler solution?

I'm using:
- TensorFlow 2.0.0b0
- TensorFlow Text 1.0.0b0

Serving is done with the official Docker image tensorflow/serving:latest.

Thanks,

Maxime

Pedram Pejman

Jul 16, 2019, 5:14:47 PM
to TensorFlow Developers
Hi there!
Your observation is correct. TF Serving only links the ops that are built into TF core, which does not include the tf.text ops.

Can you please take a look at the recently posted "building TF Serving with custom ops" guide [1] to see if it solves your problem?
If not, could you please open an issue on the serving repo so that we can possibly evolve the guide to be more helpful?
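For readers landing on this thread later: at a high level, the guide has you declare the custom op library as a build dependency of the model server and rebuild the serving binary inside the TF Serving development Docker image. A rough sketch of the idea follows; the tf.text repository name and target label here are assumptions for illustration, so check the guide for the exact labels and steps:

```
# tensorflow_serving/model_servers/BUILD  (sketch, not a drop-in patch)
#
# Add the custom op/kernel library to the list of TensorFlow ops linked
# into the model server binary. The label below is an assumed name for
# the tf.text op target -- the guide gives the exact dependency to use.
SUPPORTED_TENSORFLOW_OPS = [
    # ... existing entries ...
    "@org_tensorflow_text//tensorflow_text:ops_lib",  # assumed tf.text target
]

# Then, inside the tensorflow/serving:latest-devel image, rebuild:
#   bazel build //tensorflow_serving/model_servers:tensorflow_model_server
# and copy the resulting binary into a fresh serving image.
```

The key point is that op registration happens at link time, so a stock tensorflow/serving image can never load a graph using ops it was not built with.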