OS: Ubuntu 20.04.5 LTS (GNOME 3.36.8)
CPU: i7-9700K @ 3.60GHz x 8
GPU: GeForce RTX 3060
Driver: 470.199.02
We need to maintain a high, consistent refresh rate for several applications built in Unity, each of which (up to three at a time) is displayed on a separate monitor. After carefully measuring the latencies between the Unity update cycle and the moment the monitor actually refreshes, we found that the effective display refresh rate of a Unity application drops to roughly half when the application does not have focus. Unity's own update cycle does not change, so the CPU/GPU work is not taking any longer; the slowdown appears between the application submitting a frame and the display actually updating.
The decrease persists when PowerMizer is set to “Prefer Maximum Performance” and DPMS is disabled for Xorg via xset or xorg.conf. It also persists when the Unity applications' update rates are lowered (thereby reducing CPU/GPU load), and when the display refresh rate is changed via GNOME's Display settings or the X Server Display Configuration page in nvidia-settings. The relative drop is consistently about half: with the refresh rate set to 120 Hz it falls to roughly 60 Hz without focus, at 60 Hz it falls to roughly 30 Hz, and so on.
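For reference, this is roughly how we applied those settings from a terminal (the GPU index 0 is an assumption about our setup; the exact invocations may have differed slightly):

```
# Disable DPMS and screen blanking for the X session
xset -dpms
xset s off

# Set PowerMizer to "Prefer Maximum Performance" (mode 1) on the first GPU
nvidia-settings -a "[gpu:0]/GPUPowerMizerMode=1"

# Equivalent xorg.conf entry (in the Monitor section):
#   Option "DPMS" "false"
```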
Since we need to drive multiple instances at the same time, we would like to maintain the highest refresh rate possible regardless of whether an application has focus. However, adjusting settings at the Linux system level does not seem to affect this behavior. Is there a way to achieve it, perhaps through graphics-card settings that are not exposed in the X Server Settings GUI?
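To make the question concrete, this is the kind of knob we are hoping exists. The attribute query below is real; the second line is only an illustration of the style of per-process override we imagine, by analogy with existing driver environment variables, and the binary name is a placeholder for one of our builds:

```
# Dump every attribute the driver exposes, to search for anything related
# to focus, occlusion, or frame-presentation throttling
nvidia-settings -q all

# The style of per-process override we are hoping for, analogous to
# existing driver variables such as __GL_SYNC_TO_VBLANK
# (./UnityApp.x86_64 is a placeholder)
__GL_SYNC_TO_VBLANK=1 ./UnityApp.x86_64
```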
Thanks in advance!