Tesla V100 NVLink (Tesla V100-SXM2-32GB) GPU passthrough

Hello,

We’ve got 4x Tesla V100-SXM2-32GB GPUs in a Supermicro 4029GP-TVRT (4U SuperServer) chassis, and we are attempting PCI passthrough of the GPUs into a VM. When we pass through the PCI devices and install the drivers, we get errors from the nvidia-nvlink module.
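As a rough illustration of the host-side setup (a minimal sketch only, assuming a Linux/KVM host with the standard sysfs layout; our exact configuration may differ), something like this lists the NVIDIA PCI functions and which kernel driver each one is currently bound to before they are handed to the VM:

```python
#!/usr/bin/env python3
"""Sketch: list NVIDIA PCI devices on the host and show the kernel driver
each is bound to, as a sanity check before VFIO passthrough.
Assumes a Linux host with the standard sysfs layout."""

from pathlib import Path

NVIDIA_VENDOR_ID = "0x10de"  # PCI vendor ID for NVIDIA

for dev in sorted(Path("/sys/bus/pci/devices").iterdir()):
    vendor = (dev / "vendor").read_text().strip()
    if vendor != NVIDIA_VENDOR_ID:
        continue
    # The 'driver' symlink is absent when no driver is bound to the device.
    driver_link = dev / "driver"
    driver = driver_link.resolve().name if driver_link.exists() else "none"
    print(f"{dev.name}: driver={driver}")
    # For passthrough, each GPU function would normally be bound to
    # vfio-pci on the host rather than to the nvidia driver.
```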

Is GPU passthrough supported for the NVLink variants of the V100 cards?

Hi,

There is no support for NVLink boards. What is the error message? Which hypervisor are you using? Are there any dmesg errors?
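If it helps, a rough sketch like the following (assuming a Linux guest; the keywords are only a guess at what is relevant) would pull out the kernel log lines worth posting here:

```python
#!/usr/bin/env python3
"""Sketch: collect kernel log lines mentioning the NVIDIA driver or NVLink
inside the guest, to answer the 'dmesg errors?' question.
Assumes a Linux guest; may need root if dmesg access is restricted."""

import subprocess

log = subprocess.run(["dmesg"], capture_output=True, text=True, check=True).stdout

keywords = ("NVRM", "nvlink", "nvidia")
for line in log.splitlines():
    if any(k.lower() in line.lower() for k in keywords):
        print(line)
```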

Regards,

Simon