I’m currently running a test environment on a Supermicro X10DRG-HT server with ESXi 6.5 and Horizon 7, using all the latest drivers/versions. I had everything set up and running with 8 VDI VMs (Windows 7) and a Horizon View server (Windows 2012 R2, not using vGPU). Everything ran fine at that point, with each VDI VM configured with an M60-2Q profile (2 GB).
However, I was not very pleased with video playback/2D performance in general (3D performs great, btw), so I procured a Teradici APEX 2800 offload card to help on the 2D side. I put the card in, installed the latest Teradici drivers, and voilà, it ran fine.
However, since installing the APEX card I’ve noticed this weird behavior: only 4 VMs will run at a time. Over SSH I ran "nvidia-smi" and saw a process called "Xorg" (PID 68015, Type G, on GPU 1) using 5 MiB of memory. It seems to be hogging or locking GPU 1 to itself: all my VMs can only run on GPU 0, hence only 4 VMs at a time.
I’m not sure the APEX card is the culprit, since I afterwards unloaded all the APEX drivers and that Xorg process is still there, and still only 4 VMs will run. I also tried ESXi Host --> Configure --> Security Profile --> stopping the Xorg service. That does remove the Xorg process, but then no vGPU can run at all. So the service is required for vGPU, yet it’s hogging an entire GPU??
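In case it helps, here is roughly what I’ve been running on the host to check how the GPUs are claimed (using ESXi 6.5’s `esxcli graphics` namespace and `nvidia-smi`). The `host set` line is a guess at a fix based on the GRID setup docs, not something I’ve confirmed solves this:

```shell
# List the physical GPUs ESXi knows about and how each one is claimed
esxcli graphics device list

# Show the host's default graphics type: "Shared" (vSGA) lets Xorg
# claim a GPU for itself; "SharedPassthru" is the vGPU mode
esxcli graphics host get

# Guess: switch the default to vGPU so Xorg stops claiming GPU 1,
# then bounce the Xorg service so the change takes effect
esxcli graphics host set --default-type SharedPassthru
/etc/init.d/xorg stop
/etc/init.d/xorg start

# Re-check which processes hold each GPU
nvidia-smi
```

(A full host reboot may be needed for the default-type change to stick; I’m going from the ESXi 6.5 / NVIDIA GRID documentation as I remember it.)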
Does anybody have any idea what I am missing?