vGPU installation support on ASUS H200 GPU

Dear NVIDIA Enterprise Support Team,

Good Morning.

We are currently evaluating NVIDIA H200 GPUs for our compute-based virtualization (vGPU / mdev) proof of concept using Proxmox (KVM).
Our goal is to enable GPU slicing (MIG/vGPU) so that multiple virtual machines can share H200 resources for AI/ML workloads in our internal HPC environment.

During the evaluation, we noticed that H200 vGPU (mdev) is not currently supported under the vGPU C-Series 17 (R580) release.
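For reference, here is a minimal sketch of how we checked for mdev (vGPU) capability on the host. The sysfs paths are the standard Linux mediated-device interface, not anything NVIDIA-specific; on a host where vGPU is enabled, each capable GPU exposes an mdev_supported_types directory listing its vGPU profiles:

```shell
#!/bin/sh
# Scan all PCI devices for the standard mdev sysfs interface.
# On our H200 host under the R580 branch, no such directory appears.
found=0
for dir in /sys/bus/pci/devices/*/mdev_supported_types; do
  if [ -d "$dir" ]; then
    echo "mdev-capable device: $dir"
    found=1
  fi
done
if [ "$found" -eq 0 ]; then
  echo "no mdev-capable devices found (vGPU not enabled or not supported)"
fi
```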
We would appreciate your clarification on the following points to help us plan our deployment and licensing:


Key Queries

  1. Please confirm the list of currently supported GPUs for vGPU (C-Series) under the R580 branch (e.g., A100, H100, L40, etc.).

  2. When is vGPU/mdev support expected for the H200 (Hopper) platform — will it be available in C-Series 18 (R590) or later?

  3. Are there any firmware or driver prerequisites required for H200 vGPU enablement once released?

  4. Could you share the licensing details and cost model for NVIDIA vGPU Enterprise (C-Series) — specifically for compute workloads on KVM environments?

  5. Is NVIDIA AI Enterprise licensing mandatory for vGPU usage on Hopper GPUs such as H100/H200?


We would also appreciate any official documentation or roadmap that outlines future H200 vGPU support timelines.

Thank you for your guidance and support.

Regards,

Manu

C-Series is always NVIDIA AI Enterprise, and yes, H100 and H200, like all other pure compute GPUs, require NVIDIA AI Enterprise licensing.

Proxmox, while KVM-based, is not currently supported for AI Enterprise. Other KVM hypervisors are supported. Please check our support matrix:

AI Enterprise is licensed per GPU. NVL GPUs already include a 5-year AI Enterprise license.

Thanks for the response. So, to confirm: if we use vGPU on KVM-based Proxmox, it will not be supported by NVIDIA.
Could you also share the per-GPU licensing cost for the other models we have, such as the A100 and H100?

Regards,

Manu

It should be simple to find out the licensing costs, but here is the direct link: