Eliminating screen tearing in multi-monitor setup

Hi,

I am running a dual-monitor setup and displaying individual 1080p 60 FPS camera feeds, one on each monitor. Each camera operates independently, so there is no synchronization between them. Even with Vsync enabled, I am seeing screen tearing on one or both monitors, generally as a line that runs through the image. The only way I have been able to eliminate the tearing is by enabling ForceCompositionPipeline (in addition to Vsync) in the Advanced tab of the driver settings for each monitor, but this adds latency to the camera feeds displayed on the monitors. Is there another way to reduce screen tearing that won't add latency?

Specs and software version:

Linux OS: Ubuntu 16.04.4 LTS
Driver version: 418.56
Hardware: Quadro P4000 or RTX 2070 (same issue on both cards)

Thanks

Which player software are you using, and is it playing fullscreen?

Hi Generix,

Yes, both screens are fullscreen. I am using the OpenCV rendering function imshow for the left screen, which runs as an independent program, and I do the same for the right screen in a second independent program. Both screens are rendered using OpenGL, if that is helpful.
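
Roughly, each program boils down to the following (a simplified sketch; the device index and window name are placeholders, and each monitor runs its own copy):

    import cv2

    cap = cv2.VideoCapture(0)  # placeholder device index; each program opens its own camera

    # Create a resizable window and switch it to fullscreen on its monitor
    cv2.namedWindow("feed", cv2.WINDOW_NORMAL)
    cv2.setWindowProperty("feed", cv2.WND_PROP_FULLSCREEN, cv2.WINDOW_FULLSCREEN)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        cv2.imshow("feed", frame)
        if cv2.waitKey(1) == 27:  # Esc quits
            break

    cap.release()
    cv2.destroyAllWindows()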

Thanks

That’s a very simple approach to displaying a video. The desktop compositor’s vsync also can’t help, since compositing is off in fullscreen mode, so your application would have to set up a proper vsync’d GL swapchain, which is outside of OpenCV’s focus. So you could either pipe the video to a player like mpv, which can handle this, or you’ll have to rely on the ForceCompositionPipeline option. I don’t know whether setting the environment variable __GL_MaxFramesAllowed=1 (switching from triple to double buffering), either for your application or system-wide, would help in reducing latency.
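
To make the mpv idea concrete, a rough sketch of piping raw OpenCV frames into mpv over stdin could look like this (assuming BGR 1080p60 frames; the camera index is a placeholder, and you should check the mpv man page for the exact rawvideo demuxer options on your version):

    import subprocess
    import cv2

    W, H, FPS = 1920, 1080, 60
    cap = cv2.VideoCapture(0)  # placeholder device index

    # mpv does the vsync'd GL presentation; run one instance per monitor
    mpv = subprocess.Popen(
        [
            "mpv",
            "--demuxer=rawvideo",
            "--demuxer-rawvideo-w=%d" % W,
            "--demuxer-rawvideo-h=%d" % H,
            "--demuxer-rawvideo-fps=%d" % FPS,
            "--demuxer-rawvideo-mp-format=bgr24",
            "--fs",
            "--fs-screen=0",  # 0/1 selects the monitor for each instance
            "-",              # read the stream from stdin
        ],
        stdin=subprocess.PIPE,
    )

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        mpv.stdin.write(frame.tobytes())  # raw BGR24 bytes, no container needed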

Hi Generix,

Thanks for that information. I’ll look into the vsync’d GL swapchain, but in the short term I am going to try __GL_MaxFramesAllowed=1. To revert to the original state (which is triple buffered), do I simply remove __GL_MaxFramesAllowed=1 from my /etc/environment, or do I have to set it to some other value? I can’t seem to find much information online about __GL_MaxFramesAllowed.
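
From what I have read so far, I think the swapchain idea would look something like this (an untested sketch using GLFW and PyOpenGL rather than OpenCV’s own windowing; the resolution, window title, and camera index are placeholders):

    import cv2
    import glfw
    from OpenGL.GL import GL_BGR, GL_UNSIGNED_BYTE, glDrawPixels, glWindowPos2i

    W, H = 1920, 1080

    glfw.init()
    # Fullscreen window on the primary monitor; pass another handle from
    # glfw.get_monitors() for the second program
    window = glfw.create_window(W, H, "feed", glfw.get_primary_monitor(), None)
    glfw.make_context_current(window)
    glfw.swap_interval(1)  # the key call: sync buffer swaps to vblank

    cap = cv2.VideoCapture(0)  # placeholder device index

    while not glfw.window_should_close(window):
        ok, frame = cap.read()
        if not ok:
            break
        glWindowPos2i(0, 0)
        # OpenCV frames have a top-left origin, OpenGL bottom-left, so flip
        glDrawPixels(W, H, GL_BGR, GL_UNSIGNED_BYTE, cv2.flip(frame, 0).tobytes())
        glfw.swap_buffers(window)  # blocks until vblank due to swap_interval(1)
        glfw.poll_events()

    glfw.terminate()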

Thanks

Those driver-internal settings aren’t well documented, if there’s any information on them at all. Just removing the variable from the environment reverts the driver to its normal behaviour.
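
If you’d rather not touch /etc/environment at all, you can also set the variable for just the one process when launching each program, e.g. from a small wrapper (the script name is a placeholder):

    import os
    import subprocess

    env = dict(os.environ)
    env["__GL_MaxFramesAllowed"] = "1"  # affects only this child process
    subprocess.run(["python3", "camera_left.py"], env=env)  # placeholder script

That way there is nothing to remove later; starting the program without the wrapper is already the default driver behaviour.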