Installation of numba (with GPU support) on AWS, e.g. with nvidia-docker?

remi...@gmail.com

Dec 13, 2016, 6:10:56 PM
to Numba Public Discussion - Public
One common way to get access to high-performance GPUs is through AWS.

Does the numba documentation provide guidelines on how to install / set up numba (with GPU support) on AWS? (And if not, is this something that could potentially be added?)
Also, could this be done with nvidia-docker (https://github.com/NVIDIA/nvidia-docker) for portability?

Thanks in advance for your help.

Siu Kwan Lam

Dec 20, 2016, 4:45:51 PM
to Numba Public Discussion - Public
We currently don't have any guidelines for using CUDA on AWS or Docker, but the system setup should be the same as for a CUDA C/C++ application.  With a suitable AMI that has the CUDA driver installed, you can simply download Miniconda to install numba and cudatoolkit, and it is ready to go.  The same is true for Docker.
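
As a rough illustration of what that setup looks like once the pieces are in place (assuming an AMI or nvidia-docker container that already has the NVIDIA driver, Miniconda on the PATH, and "conda install numba cudatoolkit" already run), a minimal sketch like the following checks that numba can see the GPU and runs a trivial kernel. The function and array names are only placeholders:

    import numpy as np
    from numba import cuda

    @cuda.jit
    def add_one(arr):
        # each thread increments one element of the array
        i = cuda.grid(1)
        if i < arr.size:
            arr[i] += 1.0

    if __name__ == "__main__":
        cuda.detect()                    # list the GPUs numba can see
        data = np.zeros(32, dtype=np.float64)
        d_data = cuda.to_device(data)    # copy the array to the device
        add_one[1, 32](d_data)           # launch: 1 block of 32 threads
        print(d_data.copy_to_host())     # expect an array of ones

If cuda.detect() reports no devices inside a container, the usual culprit is running plain docker instead of the nvidia-docker wrapper, since the wrapper is what exposes the host driver and GPU devices to the container.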

--
Siu Kwan Lam
Software Engineer
Continuum Analytics

rl...@lbl.gov

Dec 20, 2016, 8:22:11 PM
to Numba Public Discussion - Public
OK, thanks for your answer. I'll give it a try.