Installing a GPU card into one node of a cluster

Wondering if this is a supported configuration and/or if anyone has done this in a VMware environment. I have a 4 node ESXi cluster and I actually want to install GPU cards into two of the nodes. Then set up rules to prevent GPU enabled VMs from vMotioning to the hosts that don’t have GPU cards.

Thoughts?

I assume you’re thinking of DRS rather than vMotion. When you manually vMotion a VM, you specify the destination host yourself, so you’d already know which hosts have GPUs installed. DRS, on the other hand, migrates VMs automatically based on metrics and policies.

DRS and vGPU do not currently work together (see Section 6.28 - VMware vSphere :: NVIDIA Virtual GPU Software Documentation); when they do, it will be awesome!
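In the meantime, the host-pinning you described can be done with DRS VM/Host groups and a "must run on" rule. Here's a minimal PowerCLI sketch; the vCenter address, cluster, host, and VM names are all placeholders, so substitute your own:

```powershell
# PowerCLI sketch - run from a PowerShell session with the VMware.PowerCLI module installed
Connect-VIServer -Server vcenter.example.com     # placeholder vCenter address

$cluster = Get-Cluster -Name "Prod-Cluster"      # placeholder cluster name

# Group the GPU-equipped hosts (placeholder host names)
New-DrsClusterGroup -Name "GPU-Hosts" -Cluster $cluster `
    -VMHost (Get-VMHost -Name "esxi01.example.com","esxi02.example.com")

# Group the GPU-enabled VMs (placeholder VM names)
New-DrsClusterGroup -Name "GPU-VMs" -Cluster $cluster `
    -VM (Get-VM -Name "gpu-vm01","gpu-vm02")

# Pin the VM group to the host group so DRS never places them elsewhere
New-DrsVMHostRule -Name "GPU-VMs-on-GPU-Hosts" -Cluster $cluster `
    -VMGroup "GPU-VMs" -VMHostGroup "GPU-Hosts" -Type "MustRunOn"
```

A "MustRunOn" rule is a hard constraint, which is what you want for vGPU VMs since they can't start on a host without the card anyway; "ShouldRunOn" would only be a soft preference.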

Regards

Ben

Ben,

Thanks for the reply! I’m more concerned about the configuration itself: two nodes with GPU cards and two without, in the same cluster. Does that pose any problems? This is probably more of a VMware question, but I can’t find anything about it on the VMware site or forums.

Rick.

Hi Rick

Nope, it won’t cause any problems at all. The only real difference is that the NVIDIA .vib will be installed only on the hosts that have GPUs (assuming you’re using vGPU rather than Passthrough). I run various platforms with different GPUs in different hosts (P4s in some and V100s in others within the same Resource Pool), and I just keep the GPU-enabled VMs running on the specific hosts that have the right cards. As long as you have at least two hosts with the same hardware spec, you’re covered for resilience as well.

Regards

Ben

Excellent. That’s what I’m looking for. I simply want to GPU-enable a set of VMs that will be running ArcGIS on two of the four hosts. The rest of the VMs will not be GPU-enabled. Make sense?

Absolutely, and that’ll work without issue.

Which GPU are you using?

Regards

Ben