10-bit HDR for vGPU

Hello.
How do I enable 10-bit (HDR) output on the virtual DVI connector of an M10-1Q profile (NVIDIA vGPU/GRID)? (See https://gridforums.nvidia.com/default/topic/402/.)

  • GPU driver: R386.09
  • NVAPI: R384

I tried:

  • NvAPI_GPU_SetEDID(): the virtual vGPU output is set to a valid HDR/HLG/Dolby Vision EDID (taken from an LG TV); the resolutions are parsed correctly from the EDID and are present in the vGPU.
  • NvAPI_Disp_GetHdrCapabilities(): returns NVAPI_OK but zeroed data (I expected the HDR flags extracted from the EDID).
  • NvAPI_Disp_ColorControl(): returns NVAPI_ERROR.

Does anyone have a working example of NvAPI_Disp_ColorControl()? For example: what is “size” in http://docs.nvidia.com/gameworks/content/gameworkslibrary/coresdk/nvapi/struct__NV__COLOR__DATA__V4.html ? Is the “data” part variable-length (I would expect an array)? …
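For anyone else hitting the “size”/“data” question: a minimal sketch of how the struct is apparently meant to be filled. The NVAPI types are mocked locally here so the snippet is self-contained (the real definitions live in nvapi.h); the field names and the cmd value follow the documented _NV_COLOR_DATA_V4 page and are my assumptions, not verified against the SDK. As far as I can tell, “size” is simply sizeof() of the fixed-size struct, and “data” is an inline struct, not a variable-length array.

```c
#include <stdint.h>
#include <string.h>

/* Mocked NVAPI types -- the real definitions are in nvapi.h.
 * Field names approximate the documented _NV_COLOR_DATA_V4 layout. */
typedef uint32_t NvU32;
typedef uint16_t NvU16;
typedef uint8_t  NvU8;

typedef struct _NV_COLOR_DATA_V4 {
    NvU32 version;              /* structure version (NV_COLOR_DATA_VER4) */
    NvU16 size;                 /* sizeof(NV_COLOR_DATA_V4) -- fixed, NOT variable */
    NvU8  cmd;                  /* get / set / is-supported command */
    struct {                    /* inline struct, not an array */
        NvU8 colorSelectionPolicy;
        NvU8 colorFormat;       /* e.g. RGB / YUV444 / YUV422 */
        NvU8 colorimetry;
        NvU8 dynamicRange;
        NvU8 bpc;               /* bits per component, e.g. 10-bit */
    } data;
} NV_COLOR_DATA_V4;

/* NVAPI version macro pattern: struct size in the low 16 bits,
 * version number in the high 16 bits. */
#define MAKE_NVAPI_VERSION(type, ver) ((NvU32)(sizeof(type) | ((NvU32)(ver) << 16)))
#define NV_COLOR_DATA_VER4 MAKE_NVAPI_VERSION(NV_COLOR_DATA_V4, 4)

/* Fill the struct the way NvAPI_Disp_ColorControl() appears to expect it. */
static void init_color_data(NV_COLOR_DATA_V4 *cd)
{
    memset(cd, 0, sizeof(*cd));
    cd->version = NV_COLOR_DATA_VER4;
    cd->size    = (NvU16)sizeof(NV_COLOR_DATA_V4); /* answer to the "size" question */
    cd->cmd     = 1; /* NV_COLOR_CMD_GET in the real header (value assumed) */
}
```

With the real header you would then pass a display id and &cd to NvAPI_Disp_ColorControl(); a wrong version/size is a plausible cause of NVAPI_ERROR, which is why I initialize both from sizeof().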

The problem was on the NVIDIA driver side. The first driver that meets the Windows 10 requirements for HDR support in VDI is vGPU 10.1 (Windows 10 driver 442.06). More than 3 years of waiting to “enable” a feature that dates from 2009!
Uploading an HDR-aware EDID is sufficient, but HDR must then be switched on from the Windows control panel. NVAPI (including the NVIDIA Control Panel) is still dysfunctional.
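Since the working path is “upload an HDR-aware EDID, then use Windows”, it is worth sanity-checking the EDID blob before handing it to NvAPI_GPU_SetEDID(): per the VESA E-EDID spec, every 128-byte block must sum to 0 mod 256. A small self-contained check (plain C, no NVAPI dependency):

```c
#include <stddef.h>
#include <stdint.h>

#define EDID_BLOCK_SIZE 128

/* Per the VESA E-EDID spec, each 128-byte EDID block carries a checksum
 * byte at offset 127 chosen so that the whole block sums to 0 mod 256.
 * Returns 1 if the block's checksum is valid, 0 otherwise. */
static int edid_block_valid(const uint8_t *block)
{
    uint8_t sum = 0;
    for (size_t i = 0; i < EDID_BLOCK_SIZE; i++)
        sum = (uint8_t)(sum + block[i]);
    return sum == 0;
}
```

Validating each block of the dumped LG TV EDID this way rules out a corrupted dump as the reason the driver ignores the HDR metadata.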

PS: VDI HDR thread: https://gridforums.nvidia.com/default/topic/402/#16296.