Hi
I am aware you can run TensorFlow with multiple GPUs, but I was wondering whether TensorFlow Serving supports multiple GPUs. That is, given a model exported for a single GPU, can TensorFlow Serving replicate it on every GPU and then perform the classification?
If TensorFlow Serving is not able to do that, is it possible to export and load a model that itself supports multiple GPUs, the same way the Inception example does for training on multiple GPUs?
https://github.com/tensorflow/models/blob/master/inception/inception/inception_train.py (I am not sure the device configuration for each GPU is stored in the graph when it is exported, and then loaded correctly.)
Thanks in advance,
Jorge