glViewportIndexedf() incorrectly rounds down its x and y parameters

We recently started using ARB_clip_control and ARB_viewport_array functionality in Wine to handle the clip-space differences between Direct3D and OpenGL, in place of vertex position fixups in the shader code. It turns out this breaks under the Nvidia binary drivers.
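For context, the clip-control path boils down to something like the following sketch. This assumes an OpenGL 4.5 / ARB_clip_control context is already current; `width`, `height`, and the exact call sequence are illustrative, and the real code in Wine's wined3d may differ:

```c
/* Switch to Direct3D conventions: origin in the upper-left corner and a
 * [0, 1] depth range, instead of OpenGL's lower-left origin and [-1, 1]. */
glClipControl(GL_UPPER_LEFT, GL_ZERO_TO_ONE);

/* Set viewport 0 with a small negative offset on x and y to compensate
 * for the different triangle rasterization rules (value from this report). */
const float offset = -1.0f / 128.0f;
glViewportIndexedf(0, offset, offset, (GLfloat)width, (GLfloat)height);
```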

Specifically, for d3d10 applications we offset the viewport slightly in the negative direction to account for the different triangle rasterization rules. For some reason, on Nvidia this shifts all rendering by one whole pixel. The value we currently use is -1.0f / 128.0f, but even much smaller values like -1.0f / 4096.0f (which, by the way, are well below the advertised subpixel precision; GL_VIEWPORT_SUBPIXEL_BITS reports 8 here) have the same effect, so we suspect some kind of rounding issue in the driver code.
Applying the offset to the vertex position instead of shifting the viewport works fine, and has always worked in the past.

I’ve had the issue reported by multiple people on multiple GPUs, so it’s probably not hardware-specific, but for reference I can reproduce it on a GTX 970. It works fine on Mesa’s r600g.

I’m attaching a small standalone test case for this issue.

This does look like a bug in our driver. We are tracking it as NVIDIA bug 1826133.
Thank you.

Is this bug still not resolved?

FWIW this was fixed in driver version 415.