Quadro RTX6000 Passthrough

Setup: Dell 7525 with 2x EPYC 7F72 and a Quadro RTX 6000. Proxmox 6.4 on the bare metal with PCI passthrough enabled.

VM running Linux Mint 20 with the RTX 6000 passed through and nvidia-driver-470 installed. nvidia-smi loads and shows processes on the GPU, but no utilization.
My test app is obs-studio (snap) with its built-in NVENC support. OBS shows as loaded on the GPU, but there is no utilization; all the rendering is still taking place on the CPU.

Is there something limiting the performance of the RTX 6000?

Hello @bass1957 and welcome to the NVIDIA Developer forums!

Can you try running
nvidia-smi encodersessions
to check whether nvidia-smi reports NVENC being used? As a next step, to verify that the GPU itself works as expected, you should try different rendering applications that exercise GPU features other than NVENC (which is what OBS uses).
Maybe run an example Deep Learning Docker image or some OpenGL samples?

That could help distinguish what exactly is and is not working for your setup.
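One way to exercise NVENC independently of OBS is to push a synthetic stream through ffmpeg's NVENC encoder and watch nvidia-smi in a second terminal. This is just a sketch and assumes the VM has an ffmpeg build with h264_nvenc support available:

```shell
# Encode a generated test pattern with NVENC (no input file needed);
# assumes this ffmpeg was built with NVENC support
ffmpeg -f lavfi -i testsrc2=duration=30:size=1920x1080:rate=30 \
       -c:v h264_nvenc -f null -

# While the encode runs, in another terminal:
nvidia-smi encodersessions   # should now list an active session
nvidia-smi dmon -s u         # the "enc" utilization column should rise above 0
```

If the encode completes and the enc column stays at 0 (or h264_nvenc is not listed by ffmpeg -encoders), that points at the ffmpeg build or driver rather than at OBS.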


# GPU Session    Process   Codec       H       V Average     Average
# Idx      Id         Id    Type     Res     Res     FPS Latency(us)
    0       -          -       -       -       -       -           -
    0       -          -       -       -       -       -           -
    0       -          -       -       -       -       -           -
    0       -          -       -       -       -       -           -

I can send the whole GPU to this VM w/o any paid licensing, correct?

I don’t think I am able to help you further at this point, but I moved your topic into a better suited category where I am sure there will be someone who can give you some suggestions.


Hi @bass1957

Are you still having issues with this setup? The Quadro RTX 6000 is supported in passthrough, and you don't need any additional software, such as vGPU, to make it work.

I am not familiar with Proxmox and how it handles virtual environments. However, since nvidia-smi is detecting the GPU correctly, I assume the NVIDIA driver is working correctly.

I suspect that your application is based on GLX and that whatever method you are using to remote into the server is defaulting to CPU rendering. I recommend running x11vnc on the server to see if that helps.
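The rough idea is to make sure a real X server is running on the NVIDIA GPU and then attach x11vnc to that display, instead of remoting into a CPU-rendered virtual display. A sketch only; the BusID and display number below are assumptions for your system:

```shell
# Find the GPU's PCI BusID, then generate an xorg.conf bound to it
# (replace PCI:1:0:0 with the BusID reported for your card)
nvidia-xconfig --query-gpu-info
sudo nvidia-xconfig --busid=PCI:1:0:0 --allow-empty-initial-configuration

# After restarting the X server, attach x11vnc to the real,
# GPU-backed display (:0 is an assumption)
x11vnc -display :0 -forever
```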


Yes, still trying to work this out. Had some other issues pop up, but I will be getting back to it shortly.

That's what I'm beginning to realize, and I'm looking for a way to run graphics-heavy apps (like OBS) while using the GPU both to render the desktop and to do the encoding work. I was using Proxmox's built-in remote desktop (SPICE or noVNC), so, yeah, I guess all the desktop rendering was done on the host's CPU. :/

That said, I'm moving to try an LXC arrangement on Proxmox rather than QEMU VMs, so I can have multiple, different apps/processes use the GPU simultaneously.
(e.g., spin up a quick VM for someone to render a backlog of video content into more suitable formats, while other regular-use LXCs are using the GPU for object detection and whatnot…)

I've got the drivers loaded on the host, and I can spin up an LXC and install the same version of the drivers, and nvidia-smi is happy. Now I'm working on the apps part of the equation.
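For reference, exposing the host's NVIDIA device nodes to an LXC container on Proxmox is typically done with entries along these lines in the container's config (/etc/pve/lxc/<CTID>.conf). The device major numbers here are assumptions; check yours with ls -l /dev/nvidia* on the host:

```
# Sketch only -- verify major numbers with: ls -l /dev/nvidia*
lxc.cgroup2.devices.allow: c 195:* rwm
lxc.cgroup2.devices.allow: c 509:* rwm
lxc.mount.entry: /dev/nvidia0 dev/nvidia0 none bind,optional,create=file
lxc.mount.entry: /dev/nvidiactl dev/nvidiactl none bind,optional,create=file
lxc.mount.entry: /dev/nvidia-uvm dev/nvidia-uvm none bind,optional,create=file
lxc.mount.entry: /dev/nvidia-uvm-tools dev/nvidia-uvm-tools none bind,optional,create=file
```

Inside the container, the driver then needs to match the host's version and be installed without its kernel module (the host's module is shared), which matches what you describe already doing.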

But just so I'm clear: if I'm running a desktop environment (like MATE) in an LXC, can I have that desktop rendered/processed on the Quadro (using a different remote desktop solution like x11vnc)?
What are the tricks or need to knows?

I've already found out that snaps don't work well in LXCs, so no snap ffmpeg or OBS :(. I'll have to build my own ffmpeg.
Lots of questions, sorry, but thanks in advance for your advice.
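On the ffmpeg build: the usual recipe is to install NVIDIA's ffnvcodec headers first and then enable the NVENC-related configure flags. A sketch, assuming CUDA is installed under /usr/local/cuda (paths and flag set may need adjusting for your setup):

```shell
# Install NVIDIA's codec headers, which give ffmpeg the NVENC/NVDEC API
git clone https://git.videolan.org/git/ffmpeg/nv-codec-headers.git
make -C nv-codec-headers && sudo make -C nv-codec-headers install

# Configure and build ffmpeg with NVENC enabled
# (CUDA install location is an assumption)
git clone https://git.ffmpeg.org/ffmpeg.git && cd ffmpeg
./configure --enable-nonfree --enable-nvenc \
            --enable-cuda-nvcc --enable-libnpp \
            --extra-cflags=-I/usr/local/cuda/include \
            --extra-ldflags=-L/usr/local/cuda/lib64
make -j"$(nproc)"
```

Afterwards, ffmpeg -encoders | grep nvenc should list h264_nvenc (and hevc_nvenc) if the build picked everything up.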