I'm a brand new Keras user. I just got my first ConvNet running on a GPU. One thing I can't figure out is how much of the GPU card my model is using. How does one monitor GPU memory usage?
You can run nvidia-smi on the command line while your model is running. Note that, by default, TensorFlow will allocate all of your GPU memory, so if you are using that backend, you'll need to explicitly start the session and limit how much memory it grabs up front (you can easily find the details on Google).
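For the TensorFlow backend, something along these lines usually does the trick (a minimal sketch assuming TensorFlow 1.x and the standalone keras package; allow_growth tells TF to allocate GPU memory on demand instead of claiming the whole card at startup):

    import tensorflow as tf
    from keras import backend as K

    # Allocate GPU memory on demand rather than all at once.
    config = tf.ConfigProto()
    config.gpu_options.allow_growth = True
    # Alternatively, cap the fraction of the card TF may use:
    # config.gpu_options.per_process_gpu_memory_fraction = 0.5
    K.set_session(tf.Session(config=config))

Run that before building your model. Then, in another terminal, "watch -n 1 nvidia-smi" will refresh the memory readout every second so you can see what your model actually uses while it trains.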