Using the GPU attached to a VM for machine learning only

Andre Sulka

Jan 29, 2020, 2:19:45 PM
to qubes-users
Hi guys,

I would like to use my currently unused MSI GTX 660 for machine learning with R/Python, Keras, PyTorch and TensorFlow in a Qubes VM.
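
For reference, once the card is attached, this is roughly how I would verify that the frameworks actually see it. A quick sketch in Python, assuming the NVIDIA driver, CUDA, torch and tensorflow are already installed in the template:

    # check that the passed-through GPU is visible to the ML frameworks
    import torch
    import tensorflow as tf

    print("PyTorch sees CUDA:", torch.cuda.is_available())
    if torch.cuda.is_available():
        print("Device:", torch.cuda.get_device_name(0))

    # experimental API, present in the TF 2.0/2.1 releases of this era
    print("TensorFlow GPUs:",
          tf.config.experimental.list_physical_devices("GPU"))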

I already sent a question to the list by email but I can't find it here, so if it appears, sorry for the double post.

Currently both of my monitors are connected to the onboard GPU of my Intel i5 CPU, and that is fine.

There are some instructions on how to attach the card and what to install, but they are older and I'm not sure whether I still need to follow exactly the same procedure.

I don't want to connect a monitor to it, AND I don't want to break my system... ;)

So the question is:

How can I use my NVIDIA GTX 660 inside a template-based VM for machine learning, and which steps/mods/installs are needed?
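
From the older guides, the attach itself seems to be done with qvm-pci from dom0. Here is a rough sketch of what I gathered (run in dom0; "ml-vm" and the device address 01_00.0 are placeholders for my qube name and wherever the GTX 660 shows up, and I'm not sure the extra options are still needed):

    # sketch: attach the GPU to a qube from dom0 (Qubes 4.x qvm-pci tool)
    import subprocess

    def run(*cmd):
        print("+", " ".join(cmd))
        subprocess.run(cmd, check=True)

    # list PCI devices known to dom0 to find the card's address
    run("qvm-pci")

    # attach persistently; older guides suggest permissive/no-strict-reset
    # for GPUs, but they may not be required on every setup
    run("qvm-pci", "attach", "--persistent",
        "-o", "permissive=True",
        "-o", "no-strict-reset=True",
        "ml-vm", "dom0:01_00.0")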

Many thanks!

awokd

Jan 30, 2020, 9:37:33 AM
to qubes...@googlegroups.com
Andre Sulka:
1. Replace the Nvidia card with an older AMD video card
2. Follow
https://github.com/Qubes-Community/Contents/blob/master/docs/customization/windows-gaming-hvm.md

Nvidia is not consumer friendly and does its best to prevent users from
using their hardware as they like without spending thousands on a
business version of their cards. I don't think anyone has been able to
get passthrough working with them.
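
If you experiment with it anyway, you can at least check from inside the HVM whether the guest sees the card on its PCI bus at all. A rough sketch (0x10de is NVIDIA's PCI vendor ID; whether the driver then initializes the card is a separate question):

    # inside the VM: scan sysfs for an NVIDIA device on the guest PCI bus
    from pathlib import Path

    for dev in Path("/sys/bus/pci/devices").iterdir():
        vendor = (dev / "vendor").read_text().strip()
        if vendor == "0x10de":  # NVIDIA's PCI vendor ID
            print("NVIDIA device at", dev.name)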

--
Mailing list etiquette:
- don't top post
- trim quoted reply to only relevant portions
- when possible, copy and paste text instead of screenshots

Andre Sulka

Feb 4, 2020, 9:56:03 AM
to qubes-users
Dear awokd, thanks for your answer.

My Nvidia GPU is unused in Qubes, but it is used for gaming on Win10.

I have some switches to enable/disable different HDDs/SSDs. I work on SSD1 (Qubes) and game on a separate SSD2 (Win10).

The only AMD card I own is very old, so sorry, that does not work for me.

No other options?