Does TensorFlow Serving support multiple GPUs?


Jorge Muñoz

Jun 30, 2016, 9:03:07 AM
to Discuss
Hi

I am aware that you can run TensorFlow with multiple GPUs, but I was wondering whether TensorFlow Serving supports multiple GPUs as well; that is, whether, given a model exported for a single GPU, TensorFlow Serving is able to replicate it on every GPU and then perform the classification.

If TensorFlow Serving is not able to do that, is it possible to export and load a model that supports multiple GPUs, the same way the Inception example trains on multiple GPUs (https://github.com/tensorflow/models/blob/master/inception/inception/inception_train.py)? (I am not sure the per-GPU configuration is stored in the graph when it is exported and then loaded back correctly.)

Thanks in advance,
Jorge


Martin Wicke

Jun 30, 2016, 2:08:07 PM
to Jorge Muñoz, Discuss
Serving uses TensorFlow in the backend, so it will make use of multiple GPUs if you have them.


Jorge Muñoz

Jun 30, 2016, 2:30:12 PM
to Discuss
Thanks Martin. I couldn't find anywhere in the documentation that TensorFlow Serving uses more than one GPU when more are available.

trialcritic

Jun 30, 2016, 2:34:42 PM
to Discuss
Here is the page on GPUs:

"Using GPUs" (TensorFlow documentation)

It has a section called "Using Multiple GPUs".

Jorge Muñoz

Jun 30, 2016, 2:38:30 PM
to Discuss
Those docs are for TensorFlow itself; I can already run training on multiple GPUs. My question was about TensorFlow Serving: once you export the inference graph and load it in the server, does it make use of multiple GPUs without any tweaking, or do you have to specify the number of GPUs and handle splitting the data yourself, as you do during training?

Martin Wicke

Jun 30, 2016, 3:27:43 PM
to Jorge Muñoz, Discuss
Serving will just take the graph you have and run it. If your graph works on multiple GPUs, it will work on multiple GPUs when serving. If your serving infrastructure is the same as the one you used in training, that should work without touching the model itself.

Martin
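[Editor's note] Martin's point can be sketched in code. Below is a minimal, illustrative multi-GPU "tower" graph of the kind this thread discusses, written with the TF1-era API (via `tensorflow.compat.v1` so it also runs under TF2). The two-GPU count, the placeholder shapes, and the toy `* 2.0` op are all assumptions standing in for a real model; `allow_soft_placement=True` lets the same graph fall back to CPU on machines without GPUs. A graph built this way carries its device placement with it, which is why Serving can run it on multiple GPUs unchanged.

```python
import numpy as np
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

NUM_GPUS = 2  # assumption: the GPU count is baked in at graph-building time

# Split the incoming batch into one shard ("tower") per GPU,
# run the toy model on each device, and merge the results.
x = tf.placeholder(tf.float32, shape=[4, 4], name="input")
towers = []
for i, shard in enumerate(tf.split(x, NUM_GPUS, axis=0)):
    with tf.device("/gpu:%d" % i):
        towers.append(shard * 2.0)  # stand-in for the real inference op
y = tf.concat(towers, axis=0, name="output")

# allow_soft_placement lets the same graph run on CPU-only machines.
config = tf.ConfigProto(allow_soft_placement=True)
with tf.Session(config=config) as sess:
    out = sess.run(y, feed_dict={x: np.ones((4, 4), dtype=np.float32)})
    print(out.sum())
```

This is exactly the graph Serving would "just take and run": the `tf.device` scopes travel with the exported graph.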

Toby Chan

Aug 9, 2016, 10:13:36 PM
to Discuss
If you're using TensorFlow Serving with GPUs, there is one more thing you need to know when building with Bazel.

We had problems compiling and running on GPUs; please check out this issue: https://github.com/tensorflow/tensorflow/issues/3708 . It turns out not to be a bug: you always need to build with the `--config=cuda` flag.
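[Editor's note] For reference, the build invocation looks roughly like this (the target path assumes a tensorflow_serving source checkout; adjust to your setup):

```shell
# Build the model server with CUDA support; without --config=cuda the
# resulting binary is CPU-only and silently ignores your GPUs.
bazel build --config=cuda //tensorflow_serving/model_servers:tensorflow_model_server
```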

黄炎久

Dec 29, 2016, 10:42:41 PM
to Discuss
I have run my graph on multiple GPUs; however, when I enable log_device_placement, it shows that the placer assigns every node to "gpu:0".
If there are rules for building a graph so that it takes advantage of multiple GPUs, please explain them. Thank you.

On Thursday, June 30, 2016 at 9:03:07 PM UTC+8, Jorge Muñoz wrote:
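[Editor's note] A common reason everything lands on gpu:0 is that nothing in the graph pins ops elsewhere: the placer does not spread a single-device graph across GPUs on its own, so each tower must be wrapped in an explicit `tf.device` scope. A minimal sketch (TF1-era API via `tensorflow.compat.v1`; the two-device loop and toy ops are illustrative, and `allow_soft_placement=True` lets it run even without GPUs):

```python
import numpy as np
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

a = tf.constant(np.arange(4.0, dtype=np.float32))
towers = []
for i in range(2):
    # Without an explicit device scope, the placer puts every op on gpu:0.
    with tf.device("/gpu:%d" % i):
        towers.append(a + float(i))
total = tf.add_n(towers)

# log_device_placement prints where each op actually ran.
config = tf.ConfigProto(allow_soft_placement=True,
                        log_device_placement=True)
with tf.Session(config=config) as sess:
    result = sess.run(total)
    print(result)
```

With real GPUs present, the placement log should show the two `add` ops on gpu:0 and gpu:1 respectively.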

ricky singh

May 30, 2018, 4:58:21 AM
to Discuss

Jorge

May 30, 2018, 5:04:45 AM
to ricky singh, Discuss
Please do not bump old threads. The answer to my question is that TF Serving uses the graph you exported, so if your graph supports multiple GPUs, TF Serving supports them too, since it is calling normal TF. But this solution requires a bit more effort and is an ad hoc solution for a fixed number of GPUs, so it is not the best way to generalize across different types of machines.
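[Editor's note] The "ad hoc, fixed number of GPUs" approach amounts to sharding each batch over a hard-coded device list at graph-construction time. A framework-free sketch of just the splitting logic (the function name and device strings are illustrative, not part of any TF API):

```python
def shard_batch(batch, devices):
    """Split `batch` into len(devices) near-equal contiguous shards,
    one per device; earlier devices absorb any remainder."""
    n = len(devices)
    size, rem = divmod(len(batch), n)
    shards, start = {}, 0
    for i, dev in enumerate(devices):
        end = start + size + (1 if i < rem else 0)
        shards[dev] = batch[start:end]
        start = end
    return shards

print(shard_batch(list(range(5)), ["/gpu:0", "/gpu:1"]))
```

Because the device list is fixed when the graph is built, a model exported this way only fits machines with at least that many GPUs, which is exactly the generalization problem described above.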


ricky singh

May 30, 2018, 5:14:20 AM
to Jorge, Discuss
Thanks for the tip! 