Adaptive VSync (not G-Sync/VRR) is not syncing correctly on drivers > 430.26

NOTE: By adaptive vsync, I mean the feature where vsync turns off when the frame rate falls below the refresh rate and is re-enabled once the frame rate reaches it again. This is not about FreeSync/VRR/G-Sync.
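For reference, adaptive vsync on Linux is exposed through GLX_EXT_swap_control_tear as a negative swap interval. A minimal sketch of how it is requested (simplified, not our exact engine code; assumes a current GLX context and that the extension was confirmed via glXQueryExtensionsString):

```c
/* Sketch: requesting adaptive vsync through GLX_EXT_swap_control_tear. */
#include <GL/glx.h>
#include <stdio.h>

typedef void (*SwapIntervalEXTFn)(Display *, GLXDrawable, int);

static void set_swap_interval(Display *dpy, GLXDrawable win, int interval)
{
    SwapIntervalEXTFn swap_interval = (SwapIntervalEXTFn)
        glXGetProcAddress((const GLubyte *)"glXSwapIntervalEXT");
    if (!swap_interval) {
        fprintf(stderr, "GLX_EXT_swap_control not available\n");
        return;
    }
    /*  1 = normal vsync: always wait for vblank.
     * -1 = adaptive vsync: wait for vblank, but tear instead of
     *      stalling when the swap is late (GLX_EXT_swap_control_tear). */
    swap_interval(dpy, win, interval);
}
```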

The game engine I’m working on is having issues with adaptive vsync. When the frame rate exceeds 60 FPS, it locks to 62-63 FPS rather than 60, which causes tearing. What’s odd is that this doesn’t happen in every scene: it seems to occur only in scenes with heavy 3D rendering.
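One way to confirm the 62-63 FPS figure independently of the engine’s own frame counter is to time the swap calls directly. A rough sketch (assumes a current GLX context; swap_and_log is a hypothetical stand-in for the engine’s plain glXSwapBuffers call):

```c
/* Sketch: log swap-to-swap time to verify the observed frame rate. */
#include <GL/glx.h>
#include <stdio.h>
#include <time.h>

static double now_ms(void)
{
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return ts.tv_sec * 1000.0 + ts.tv_nsec / 1.0e6;
}

static void swap_and_log(Display *dpy, GLXDrawable win)
{
    static double last = 0.0;
    glXSwapBuffers(dpy, win);
    double t = now_ms();
    if (last != 0.0)
        printf("frame: %.2f ms (%.1f FPS)\n", t - last, 1000.0 / (t - last));
    last = t;
}
```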

If I switch our engine to normal vsync, it locks to 60 FPS without issue. However, the problem did not appear until I updated the NVIDIA drivers past v430.26; I’ve seen it on the 440, 450, and 460 driver series. Falling back to normal vsync isn’t ideal, since the product runs on low-end hardware with an emphasis on responsiveness.

Since the issue did not appear until the driver update, my gut says there is a regression in the newer versions. However, given the inconsistent behavior between 3D-heavy scenes and 2D scenes, is there something we could be doing wrong in code that would cause adaptive vsync to not sync properly?
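One sanity check along those lines: ask the driver what swap state it thinks is in effect on the drawable, in case the engine’s request is being lost or overridden somewhere. A sketch using glXQueryDrawable (assumes GLX; the fallback #defines cover an older glxext.h):

```c
/* Sketch: query the swap interval actually in effect on the drawable.
 * Per GLX_EXT_swap_control(_tear), GLX_SWAP_INTERVAL_EXT reports the
 * interval's magnitude and GLX_LATE_SWAPS_TEAR_EXT reports whether
 * late swaps are allowed to tear (i.e. adaptive behavior). */
#include <GL/glx.h>
#include <stdio.h>

#ifndef GLX_SWAP_INTERVAL_EXT
#define GLX_SWAP_INTERVAL_EXT   0x20F1
#endif
#ifndef GLX_LATE_SWAPS_TEAR_EXT
#define GLX_LATE_SWAPS_TEAR_EXT 0x20F3
#endif

static void print_swap_state(Display *dpy, GLXDrawable win)
{
    unsigned int interval = 0, tear = 0;
    glXQueryDrawable(dpy, win, GLX_SWAP_INTERVAL_EXT, &interval);
    glXQueryDrawable(dpy, win, GLX_LATE_SWAPS_TEAR_EXT, &tear);
    printf("swap interval: %u, late swaps tear: %u\n", interval, tear);
}
```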

Hardware Info:
OS: Xubuntu 18.04 LTS (kernel 4.15)
GPU: GTX 1650
Driver: 460.32 (provided by ppa:graphics-drivers)

Would you be able to send us a test case we can use to reproduce the problem? That would be the easiest way for us to try to track it down if it turns out to be a driver regression.

Unfortunately I haven’t been able to reproduce the issue in a test project, and I’m currently trying to figure out how to cut the engine’s codebase down to something small enough to be easy to run.

In the meantime, is there any other information I can send that might be useful, such as an OpenGL call capture from Nsight or RenderDoc?