Hi all!
I've deployed universal-sentence-encoder (v4) as an API in Rust. The idea was simple: load the SavedModel and use the DEFAULT_SERVING_SIGNATURE_DEF_KEY for inference. You can see how I did that in the attached file mod.rs.
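For context, the loading code in mod.rs boils down to something like the sketch below, using the `tensorflow` crate. The signature input/output names (`"inputs"`, `"outputs"`) are assumptions here; the real names can be inspected with `saved_model_cli show`:

```rust
use tensorflow::{Graph, SavedModelBundle, SessionOptions, SessionRunArgs, Status, Tensor};

fn embed(model_path: &str, sentences: &[&str]) -> Result<Tensor<f32>, Status> {
    // Load the SavedModel exported by TF Hub, tagged "serve".
    let mut graph = Graph::new();
    let bundle =
        SavedModelBundle::load(&SessionOptions::new(), &["serve"], &mut graph, model_path)?;

    // DEFAULT_SERVING_SIGNATURE_DEF_KEY in Python is the string "serving_default".
    let signature = bundle.meta_graph_def().get_signature("serving_default")?;

    // "inputs"/"outputs" are assumed names for this model's signature.
    let input_info = signature.get_input("inputs")?;
    let output_info = signature.get_output("outputs")?;
    let input_op = graph.operation_by_name_required(&input_info.name().name)?;
    let output_op = graph.operation_by_name_required(&output_info.name().name)?;

    // Feed the sentences as a 1-D string tensor and fetch the embeddings.
    let values: Vec<String> = sentences.iter().map(|s| s.to_string()).collect();
    let input: Tensor<String> =
        Tensor::new(&[sentences.len() as u64]).with_values(&values)?;

    let mut args = SessionRunArgs::new();
    args.add_feed(&input_op, input_info.name().index, &input);
    let fetch = args.request_fetch(&output_op, output_info.name().index);
    bundle.session.run(&mut args)?;
    args.fetch::<f32>(fetch)
}
```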
When the service loads the model, it panics with:

NotFound: Op type not registered 'SentencepieceOp' in binary running

This means the SentencepieceOp operator is not registered in the TensorFlow binary my process links against, so loading the graph fails. How can I register this operator from my Rust code?
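From what I understand, SentencepieceOp comes from the tensorflow_text custom op library, which Python registers as a side effect of `import tensorflow_text`. I suspect the Rust side needs the equivalent: loading the ops' shared library before importing the graph. A minimal sketch using the crate's `Library::load` (which wraps TF_LoadLibrary); the .so path is an assumption, and the library must be built against the same TensorFlow version the crate links:

```rust
use tensorflow::Library;

// Register custom ops (including SentencepieceOp) with the TF runtime
// BEFORE loading the SavedModel. The path below is hypothetical.
let _text_ops = Library::load("/usr/local/lib/libtensorflow_text.so")?;
```

Is this the right approach, and if so, which shared library from the tensorflow_text distribution should I point it at?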
Here is the complete log:
cargo run -- --model-path docker/model-path/
Finished dev [unoptimized + debuginfo] target(s) in 0.13s
Running `target/debug/vectorization-service --model-path docker/model-path/`
2021-06-23 12:18:58.562720: I tensorflow/cc/saved_model/reader.cc:31] Reading SavedModel from: docker/model-path/
2021-06-23 12:18:58.579120: I tensorflow/cc/saved_model/reader.cc:54] Reading meta graph with tags { serve }
2021-06-23 12:18:58.601124: I tensorflow/core/platform/cpu_feature_guard.cc:142] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
2021-06-23 12:18:58.628114: I tensorflow/core/platform/profile_utils/cpu_utils.cc:94] CPU Frequency: 2599990000 Hz
2021-06-23 12:18:58.628866: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x55814b458c20 initialized for platform Host (this does not guarantee that XLA will be used). Devices:
2021-06-23 12:18:58.628932: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): Host, Default Version
2021-06-23 12:18:58.707493: I tensorflow/cc/saved_model/loader.cc:202] Restoring SavedModel bundle.
2021-06-23 12:18:59.684364: I tensorflow/cc/saved_model/loader.cc:151] Running initialization op on SavedModel bundle at path: docker/model-path/
2021-06-23 12:19:00.191925: I tensorflow/cc/saved_model/loader.cc:311] SavedModel load for tags { serve }; Status: success. Took 1629205 microseconds.
thread 'main' panicked at 'Error loading load model: {inner:0x55814b410870, NotFound: Op type not registered 'SentencepieceOp' in binary running on trabalho. Make sure the Op and Kernel are registered in the binary running in this process. Note that if you are loading a saved graph which used ops from tf.contrib, accessing (e.g.) `tf.contrib.resampler` should be done before importing the graph, as contrib ops are lazily registered when the module is first accessed.}', src/encoder_model/mod.rs:89:19
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
Panic in Arbiter thread.