how to use tensorflow-text operators


André Claudino

Jun 23, 2021, 11:22:56 AM
to Rust for TensorFlow
Hi all!

I've deployed universal-sentence-encoder (v4) as an API in Rust. The idea was simple: load the saved model and use the DEFAULT_SERVING_SIGNATURE_DEF_KEY for inference. You can see how I did that in the attached file mod.rs.

The problem began when using universal-sentence-encoder-multilingual (v3). As the model page says, it needs the tensorflow-text library, but the examples only show how to use it in Python. The same Rust code, with this new model, results in the following error:

NotFound: Op type not registered 'SentencepieceOp' in binary running

It means that an operator, SentencepieceOp, is not found when loading the model. How do I register this operator for my Rust code?

Here is the complete stack trace:

cargo run -- --model-path docker/model-path/
Finished dev [unoptimized + debuginfo] target(s) in 0.13s
Running `target/debug/vectorization-service --model-path docker/model-path/`
2021-06-23 12:18:58.562720: I tensorflow/cc/saved_model/reader.cc:31] Reading SavedModel from: docker/model-path/
2021-06-23 12:18:58.579120: I tensorflow/cc/saved_model/reader.cc:54] Reading meta graph with tags { serve }
2021-06-23 12:18:58.601124: I tensorflow/core/platform/cpu_feature_guard.cc:142] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
2021-06-23 12:18:58.628114: I tensorflow/core/platform/profile_utils/cpu_utils.cc:94] CPU Frequency: 2599990000 Hz
2021-06-23 12:18:58.628866: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x55814b458c20 initialized for platform Host (this does not guarantee that XLA will be used). Devices:
2021-06-23 12:18:58.628932: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): Host, Default Version
2021-06-23 12:18:58.707493: I tensorflow/cc/saved_model/loader.cc:202] Restoring SavedModel bundle.
2021-06-23 12:18:59.684364: I tensorflow/cc/saved_model/loader.cc:151] Running initialization op on SavedModel bundle at path: docker/model-path/
2021-06-23 12:19:00.191925: I tensorflow/cc/saved_model/loader.cc:311] SavedModel load for tags { serve }; Status: success. Took 1629205 microseconds.
thread 'main' panicked at 'Error loading load model: {inner:0x55814b410870, NotFound: Op type not registered 'SentencepieceOp' in binary running on trabalho. Make sure the Op and Kernel are registered in the binary running in this process. Note that if you are loading a saved graph which used ops from tf.contrib, accessing (e.g.) `tf.contrib.resampler` should be done before importing the graph, as contrib ops are lazily registered when the module is first accessed.}', src/encoder_model/mod.rs:89:19
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
Panic in Arbiter thread.
mod.rs

Daniel McKenna

Jun 23, 2021, 11:28:24 AM
to Rust for TensorFlow, André Claudino
So with tf.contrib, in Python, TensorFlow will go over the contrib module folder and load any shared objects using TF_LoadLibrary from the C API, so those operations become available in the runtime. You'll want to find the shared objects needed and load them via: https://tensorflow.github.io/rust/tensorflow/struct.Library.html#method.load

It's a bit of a pain. When I last looked into it, the best options I saw were building TensorFlow from source (if you can figure out Bazel), or walking the Python install directory for the package, finding the .so files, and copying them out to your project.
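The second suggestion (walking the installed Python package for shared objects) can be sketched with the standard library alone. The directory path below is hypothetical; the `tensorflow::Library::load` call it points at is the one from the link above:

```rust
use std::fs;
use std::path::{Path, PathBuf};

/// Recursively collect every `.so` file under `dir` (e.g. the
/// `site-packages/tensorflow_text` directory of a Python install).
fn find_shared_objects(dir: &Path, found: &mut Vec<PathBuf>) -> std::io::Result<()> {
    for entry in fs::read_dir(dir)? {
        let path = entry?.path();
        if path.is_dir() {
            find_shared_objects(&path, found)?;
        } else if path.extension().map_or(false, |ext| ext == "so") {
            found.push(path);
        }
    }
    Ok(())
}

fn main() -> std::io::Result<()> {
    // Hypothetical root; pass the real site-packages path as the first argument.
    let root = std::env::args().nth(1).unwrap_or_else(|| ".".to_string());
    let mut libs = Vec::new();
    find_shared_objects(Path::new(&root), &mut libs)?;
    for lib in &libs {
        // Each candidate would then be registered with the runtime, e.g.
        // `tensorflow::Library::load(lib.to_str().unwrap())`, BEFORE the
        // SavedModel is loaded, so its ops are known at graph-load time.
        println!("candidate op library: {}", lib.display());
    }
    Ok(())
}
```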

André Claudino

Jun 23, 2021, 11:44:12 AM
to Rust for TensorFlow, danielm...@gmail.com, André Claudino
Thanks, you helped me a lot. I will try to load the .so files, but I'd prefer to rebuild TensorFlow.

If I build TensorFlow with support for these libraries, do I just need to include the resulting library in LD_LIBRARY_PATH, or is there some need to rebuild or adjust something in the tensorflow-rust crate?

Daniel McKenna

Jun 23, 2021, 11:47:15 AM
to Rust for TensorFlow, André Claudino, Daniel McKenna
There's no need to change anything, but you still need to make the load_library call. The benefit of the Bazel build is that you _should_ have some control over where the contrib modules end up, keeping everything together to make loading easier.

André Claudino

Jun 23, 2021, 2:36:20 PM
to Daniel McKenna, Rust for TensorFlow
Do you have an example of how to build TensorFlow including external libraries? Unfortunately, looking for operators inside the libs is much harder than I thought.

André Claudino
D²x - Ideias
phone: (21) 98227-4731
email: clau...@d2x.com.br

Before printing, check whether you really
need to; nature will thank you!


Daniel McKenna

Jun 23, 2021, 2:42:41 PM
to Rust for TensorFlow, André Claudino, Rust for TensorFlow, Daniel McKenna
No, sorry. TensorFlow is a bit annoying in that it uses pre-1.0 Bazel, and the build system is poorly documented. Whenever I've had to build specific parts, I've looked in CI to see if I can find a Dockerfile or shell script that does the same thing and used that as a starting point. Unfortunately, that's what passes for build documentation in TensorFlow.

André Claudino

Jun 23, 2021, 2:47:10 PM
to Daniel McKenna, Rust for TensorFlow
Ok, no worries. Still a good suggestion.



André Claudino

Jun 24, 2021, 1:55:04 PM
to Rust for TensorFlow, André Claudino, Rust for TensorFlow, danielm...@gmail.com
I resolved the problem using @danielm's suggestions.

For future generations: the libraries are loaded from a glob-pattern path like "/path/to/lib/*.so" (held in the extensions_glob variable).

Screenshot_20210624_145142.png

You need to include the tensorflow-sys crate in your Cargo dependencies:

tensorflow-sys = "^0.19.1"
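For anyone without the screenshot, the pattern can be sketched roughly as follows. This is an assumption-laden reconstruction, not the original code: it uses the `tensorflow` crate's `Library::load` (a wrapper over TF_LoadLibrary) and the `glob` crate for the pattern expansion, and the paths are illustrative.

```rust
use tensorflow::{Graph, Library, SavedModelBundle, SessionOptions};

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Hypothetical pattern; in practice this came from configuration.
    let extensions_glob = "/path/to/lib/*.so";

    // Register every custom-op library BEFORE touching the SavedModel;
    // otherwise SentencepieceOp is still unknown at graph-load time.
    for entry in glob::glob(extensions_glob)? {
        let path = entry?;
        Library::load(path.to_str().expect("non-UTF-8 path"))?;
    }

    // Now the multilingual model loads without the NotFound error.
    let mut graph = Graph::new();
    let bundle = SavedModelBundle::load(
        &SessionOptions::new(),
        ["serve"],
        &mut graph,
        "docker/model-path/",
    )?;
    let _session = &bundle.session;
    Ok(())
}
```

The ordering is the important part: `Library::load` registers the ops with the process-wide runtime, so it must run before `SavedModelBundle::load` deserializes the graph.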

Thanks to the community, and thanks to @danielm for the help.