I have a Supermicro AS-2125HS-TNR with an RTX A2000 (12GB) and 24 × 15TB Micron NVMe drives. I’m trying to use software from GRAID to perform GPU-accelerated NVMe RAID. When I set up the software it works great: I can get over 100GB/s RAID6 on Ubuntu and 30GB/s on Windows Server 2022. But when I reboot, the GPU defaults to WDDM mode and the driver fails to load. nvidia-smi reports an error communicating with the driver because the server’s onboard video is the primary display. If I reboot and change the BIOS to use external video as default, the driver loads once, but it fails to load again on subsequent reboots. By toggling the BIOS back and forth between onboard and external video, I can get the driver to load each time, but that won’t be an option once this server becomes a production Veeam backup repository.
Is there any way to modify the registry to force TCC mode on the GPU at every reboot? I believe Quadro GPUs default to WDDM mode, which I think is the issue.
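For what it’s worth, the documented way to switch the driver model is through nvidia-smi rather than editing the registry directly. A sketch, assuming the A2000 enumerates as GPU index 0 (check first) and an elevated prompt:

```shell
:: Show the current and pending driver model (WDDM or TCC) for GPU 0
nvidia-smi -i 0 --query-gpu=driver_model.current,driver_model.pending --format=csv

:: Request TCC mode for GPU 0; takes effect after the next reboot
nvidia-smi -i 0 -dm 1
```

In principle the setting persists across reboots once applied, though in your case the primary-display/BIOS issue may still prevent the driver from loading before the mode can take effect.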
After some testing: if I change any setting in the BIOS PCIe menu, the driver loads and the A2000 is available, but only once. On every reboot after that change, the device is unavailable and the driver fails to load. Making any additional change in the BIOS PCIe menu lets it load once more.
Hi
I’m not sure I fully understand your config, BUT: Windows needs a GPU in WDDM mode to run its desktop on. You can’t run Windows with only a GPU in TCC mode.
So either you keep the onboard server GPU working so Windows has something to run its desktop on, and then find a way to route your workload to the NVIDIA GPU in TCC mode; or you add a second NVIDIA GPU in WDDM mode so that Windows runs its desktop on NVIDIA hardware, which makes it much easier to send compute workloads to just the GPU in TCC mode.
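If the workload is a plain CUDA application, routing it to the TCC GPU is usually just device selection; a minimal sketch, assuming the TCC A2000 shows up as CUDA device index 0 and a hypothetical app name (whether the GRAID software honors this is a separate question — it may bind to a GPU on its own):

```shell
:: List GPUs with their driver models to find the TCC device's index
nvidia-smi --query-gpu=index,name,driver_model.current --format=csv

:: Restrict CUDA to the TCC GPU only, then launch the workload
:: (index 0 and my_compute_app.exe are placeholders; adjust to your setup)
set CUDA_VISIBLE_DEVICES=0
my_compute_app.exe
```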
Would that make sense in the context of your problem?