NOTE: By adaptive vsync, I mean the feature where vsync turns off when the frame rate falls below the refresh rate and is re-enabled when the frame rate exceeds it. This is not about FreeSync/VRR/G-Sync.
The game engine I’m working in is having issues with adaptive vsync. When frame rates exceed 60 FPS, it locks to 62-63 FPS rather than 60 and causes frame tearing. What’s odd to me is that this doesn’t happen in all scenes; it only seems to happen in scenes with heavy 3D rendering.
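(For reference, the FPS figures above come from frame-interval timing. A minimal sketch of that kind of measurement is below, assuming it is called once per frame right after the engine’s buffer swap; the function name is illustrative, not actual engine code.)

```c
/* Sketch only: per-second FPS logger, assumed to be called once per frame
 * immediately after the buffer swap. */
#include <stdio.h>
#include <time.h>

void log_frame_rate(void)
{
    static struct timespec last;   /* zero-initialized before the first call */
    static double elapsed = 0.0;
    static int frames = 0;

    struct timespec now;
    clock_gettime(CLOCK_MONOTONIC, &now);

    if (last.tv_sec || last.tv_nsec) {
        elapsed += (now.tv_sec - last.tv_sec) +
                   (now.tv_nsec - last.tv_nsec) / 1e9;
        frames++;
        if (elapsed >= 1.0) {
            printf("%.1f FPS (avg frame time %.2f ms)\n",
                   frames / elapsed, 1000.0 * elapsed / frames);
            elapsed = 0.0;
            frames = 0;
        }
    }
    last = now;
}
```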
If I switch our engine to normal vsync, it locks to 60 FPS without issue. The issue also did not appear until I updated the NVIDIA drivers past v430.26; I’ve seen it on the 440, 450, and 460 driver series. Using normal vsync instead of adaptive vsync isn’t ideal, because the product runs on low-end hardware with an emphasis on responsiveness.
Since the issue did not appear until the driver update, my gut says there is a problem with the newer versions. However, given the inconsistent behavior between 3D-heavy scenes and 2D scenes: is there something we could be doing wrong in code that would cause adaptive vsync to not sync properly?
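For reference, here is a simplified sketch of how adaptive vsync is typically requested on the GLX/OpenGL path, in case it helps spot a difference from what the engine does. The dpy/drawable/screen handles are placeholders for whatever the engine already creates, and this illustrates the usual call sequence rather than our actual engine code:

```c
/* Sketch only: typical GLX adaptive-vsync setup, not engine code.
 * Assumes a GLX context is already current on dpy/drawable.
 * Build: gcc vsync_sketch.c -lGL -lX11 */
#include <stdio.h>
#include <string.h>
#include <GL/glx.h>

typedef void (*SwapIntervalEXTFn)(Display *, GLXDrawable, int);

static void request_adaptive_vsync(Display *dpy, GLXDrawable drawable, int screen)
{
    const char *exts = glXQueryExtensionsString(dpy, screen);
    SwapIntervalEXTFn swap_interval = (SwapIntervalEXTFn)
        glXGetProcAddress((const GLubyte *)"glXSwapIntervalEXT");

    if (!exts || !swap_interval || !strstr(exts, "GLX_EXT_swap_control")) {
        fprintf(stderr, "GLX_EXT_swap_control not available\n");
        return;
    }

    if (strstr(exts, "GLX_EXT_swap_control_tear")) {
        /* Negative interval = adaptive vsync per GLX_EXT_swap_control_tear:
         * sync to vblank when the frame is ready in time, tear (skip the
         * wait) when it is late. |interval| is the minimum swap interval. */
        swap_interval(dpy, drawable, -1);
    } else {
        /* Extension missing: fall back to plain vsync, one vblank per swap. */
        swap_interval(dpy, drawable, 1);
    }
}
```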
Hardware Info:
OS: Xubuntu 18.04 LTS (kernel 4.15)
GPU: GTX 1650
Driver: 460.32 (provided by ppa:graphics-drivers)
Desktop compositing disabled
nvidia-bug-report.log.gz (907.1 KB)