TFServing support for custom devices

pksubbarao

Feb 21, 2022, 2:42:53 PM
to TensorFlow Developers
Hello all,

Can someone point me to documentation on how TFServing can be made to work with custom devices/accelerators? The link below covers custom ops, but not custom devices:

For TensorFlow, we have registered our accelerator and the op kernels it supports, and we work with the ops that get offloaded to that device. TF loads the custom device when we set it via tf.device(<custom_device_name>), and it then loads the shared object file from the third_party folder.
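
For reference, here is a minimal sketch of how we place ops on the device; "DEMO" is just a placeholder for whatever name the accelerator was registered under:

import tensorflow as tf

# Log where each op actually runs, so any CPU fallback is visible.
tf.debugging.set_log_device_placement(True)

# "DEMO" is a placeholder for our registered device type.
with tf.device("/device:DEMO:0"):
    a = tf.constant([[1.0, 2.0], [3.0, 4.0]])
    b = tf.matmul(a, a)  # should offload if a MatMul kernel is registered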

We tried to build TF Serving against this locally built TF but ran into a few Bazel issues. After linking our library statically, we can at least build tensorflow-model-server. However, we still see the ops being placed on the CPU, even though some of them are eligible for placement on our device.
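
For anyone else debugging this, a quick local check (outside of Serving) is to turn off soft placement, so that a silent CPU fallback becomes a hard error; again, "DEMO" is a placeholder name:

import tensorflow as tf

# With soft placement off, an op that cannot run on the requested
# device raises an error instead of silently running on the CPU.
tf.config.set_soft_device_placement(False)

# Confirm the runtime sees the device at all.
print(tf.config.list_physical_devices())

with tf.device("/device:DEMO:0"):  # placeholder device type
    x = tf.random.uniform([4, 4])
    y = tf.matmul(x, x)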

Any suggestions/links/documentation on how TFServing can work with custom devices would be appreciated. One suggestion was to use the PluggableDevice interface for TF, but I am not sure if/how that would work with Serving.
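
My understanding of the pluggable route, for what it's worth: a PluggableDevice plugin installed as a pip package is discovered when TensorFlow is imported, so the device should simply show up in the device list, roughly like this (device name hypothetical):

import tensorflow as tf

# A PluggableDevice plugin is loaded at import time; its registered
# device type should then appear alongside the CPU.
print(tf.config.list_physical_devices())
# e.g. [PhysicalDevice(name='/physical_device:CPU:0', device_type='CPU'),
#       PhysicalDevice(name='/physical_device:DEMO:0', device_type='DEMO')]

with tf.device("/device:DEMO:0"):  # hypothetical device type
    print(tf.add(1.0, 2.0))

Whether the Serving binary picks the plugin up the same way is exactly what I am unsure about.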

Thanks and Regards
Prashantha