Using SavedModelBundle for ONNX models?

shubham goyal

Jul 4, 2023, 5:01:51 AM
to SIG JVM
Hi everyone,

I am from Flipkart, and I am fairly new to the JVM world of TensorFlow.

I wanted to understand whether it is possible to run an ONNX model using the SavedModelBundle module of TensorFlow in Java. We want to run the model in memory, inside our own process, because of latency concerns.

We have done a PoC where we exported a model using tf.saved_model and ran it with SavedModelBundle in Java; the latency numbers were great and the PoC was successful. Aside from this, we also want to evaluate other frameworks. We want to do a similar check for PyTorch-based models, where we would first convert the PyTorch model to ONNX format and then serve it with TensorFlow Java.
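For context, this is roughly what the Java side of our PoC looks like. It is only a minimal sketch assuming the newer org.tensorflow (tensorflow-core) API; the model path, the "serve" tag, and the input/output tensor names are placeholders, not our actual model:

import org.tensorflow.SavedModelBundle;
import org.tensorflow.Tensor;
import org.tensorflow.ndarray.StdArrays;
import org.tensorflow.types.TFloat32;

public class SavedModelPoc {
    public static void main(String[] args) {
        // Load the exported SavedModel in-process; "serve" is the tag it was exported with.
        try (SavedModelBundle bundle = SavedModelBundle.load("/path/to/saved_model", "serve")) {
            // Dummy 1x3 float batch; replace with the real feature vector.
            float[][] batch = {{0.1f, 0.2f, 0.3f}};
            try (TFloat32 input = TFloat32.tensorOf(StdArrays.ndCopyOf(batch));
                 Tensor output = bundle.session().runner()
                         .feed("serving_default_input:0", input)  // input tensor name from the signature
                         .fetch("StatefulPartitionedCall:0")      // output tensor name from the signature
                         .run()
                         .get(0)) {
                System.out.println("score = " + ((TFloat32) output).getFloat(0, 0));
            }
        }
    }
}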

If it is not possible to load the ONNX format using TensorFlow Java, are there any alternatives for serving PyTorch models in Java?

Thanks.