how to monitor GPU memory usage


Chris Albertson

Aug 10, 2017, 8:29:04 PM
to Keras-users

I'm a brand-new Keras user. I just got my first ConvNet running on a GPU. One thing I can't figure out is how much of the GPU card's memory my model is using. How does one monitor GPU memory usage?

Daπid

Aug 11, 2017, 5:13:12 AM
to Chris Albertson, Keras-users
You can run nvidia-smi on the command line while your model is running.

Note that, by default, TensorFlow will allocate all of your GPU memory, so if you are using that backend, you'll need to explicitly start the session and limit the starting memory (you can easily find the details on Google).
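For example, with the TensorFlow 1.x backend of that era, a minimal sketch of limiting the allocation before building the model (the 0.5 fraction is just an illustration) would be:

import tensorflow as tf
from keras import backend as K

# Let TensorFlow grow its GPU allocation as needed instead of grabbing
# everything up front; alternatively, cap it at a fixed fraction.
config = tf.ConfigProto()
config.gpu_options.allow_growth = True
# config.gpu_options.per_process_gpu_memory_fraction = 0.5
K.set_session(tf.Session(config=config))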

On 11 August 2017 at 02:29, Chris Albertson <alberts...@gmail.com> wrote:

I'm a brand-new Keras user. I just got my first ConvNet running on a GPU. One thing I can't figure out is how much of the GPU card's memory my model is using. How does one monitor GPU memory usage?


Chris Albertson

Aug 11, 2017, 4:00:39 PM
to Keras-users
On Fri, Aug 11, 2017 at 2:12 AM, Daπid <david...@gmail.com> wrote:
You can run nvidia-smi on the command line while your model is running.
 
Thanks, but now two more questions:

1. This is a current Ubuntu system, and I can build and run Keras networks on my GPU, but there is no "nvidia-smi" on my system. Perhaps it comes with some package I haven't installed? Would anyone know which one? (Yes, I Googled and found comments like "install the drivers," but the drivers must already be installed, since I'm running models on the GPU.)

2. I had hoped there would be a way to figure out memory use programmatically, so my software could adapt to different target environments.
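One common way to query GPU memory from Python is through NVIDIA's NVML library; a minimal sketch, assuming the pynvml bindings are installed:

import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU
info = pynvml.nvmlDeviceGetMemoryInfo(handle)   # sizes are in bytes
print("used:  %d MiB" % (info.used // 1024**2))
print("total: %d MiB" % (info.total // 1024**2))
pynvml.nvmlShutdown()

The same numbers could then drive batch-size or model-size decisions per target machine.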


--

Chris Albertson
Redondo Beach, California

arunran...@gmail.com

Sep 20, 2018, 4:21:45 PM
to Keras-users
Is there a built-in way to track GPU utilization, rather than having to run nvidia-smi manually? That would help with tracking GPU utilization for automated runs.
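One approach that works for automated runs is a small Keras callback that shells out to nvidia-smi after each epoch; a sketch (the callback name is illustrative, and nvidia-smi must be on the PATH):

import subprocess
from keras.callbacks import Callback

class GPUMemoryLogger(Callback):
    """Print GPU memory usage (MiB) at the end of every epoch."""
    def on_epoch_end(self, epoch, logs=None):
        out = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=memory.used,memory.total",
             "--format=csv,noheader,nounits"])
        print("epoch %d GPU memory used/total (MiB): %s"
              % (epoch, out.decode().strip()))

# usage: model.fit(x, y, epochs=10, callbacks=[GPUMemoryLogger()])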


On Friday, August 11, 2017 at 2:13:12 AM UTC-7, David Menéndez Hurtado wrote:
You can run nvidia-smi on the command line while your model is running.

Note that, by default, TensorFlow will allocate all of your GPU memory, so if you are using that backend, you'll need to explicitly start the session and limit the starting memory (you can easily find the details on Google).
On 11 August 2017 at 02:29, Chris Albertson <alberts...@gmail.com> wrote:

I'm a brand-new Keras user. I just got my first ConvNet running on a GPU. One thing I can't figure out is how much of the GPU card's memory my model is using. How does one monitor GPU memory usage?


李逸帆

Sep 20, 2018, 9:31:38 PM
to Keras-users
Maybe you did not set the PATH?


echo 'export CUDA_HOME=/usr/local/cuda-9.0' >> ~/.bashrc
echo 'export PATH=$PATH:$CUDA_HOME/bin' >> ~/.bashrc
echo 'export LD_LIBRARY_PATH=$CUDA_HOME/lib64' >> ~/.bashrc
source ~/.bashrc
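After sourcing, a quick way to check from Python whether nvidia-smi is now discoverable (just an illustration; on Ubuntu it usually ships with the NVIDIA driver package rather than the CUDA toolkit):

import shutil

# prints the full path if nvidia-smi is found on the PATH, otherwise None
print(shutil.which("nvidia-smi"))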

On Friday, August 11, 2017 at 8:29:04 AM UTC+8, Chris Albertson wrote: