I’ve got a very simple OpenGL app that I use for research purposes. The app opens a window covering the entire second monitor, then shows basic shapes in different colors, up to 30 shapes at a time. The time from requesting that a frame be drawn until it actually appears on the screen matters a great deal for what I’m doing.
I have a test setup that we have been using for many years to measure display latency. It involves a photodiode attached to a real-time computer with a data acquisition card. I command a frame to be drawn and measure the time until the photodiode “sees” the frame.
I am using a Quadro K2200.
I have found that driver version 340.52 is one frame faster than any subsequent driver version. For example, with a monitor running at 60 Hz, driver 340.52 may show a latency of ~50 ms, while 340.66 and later show ~65 ms.
I have tested this with: 340.66, 340.84, 341.05, 341.61, 341.81, 341.92, 348.40, 354.35, 354.42, 358.50. I have tested using 2 different LG commercial monitors (large TVs) and 3 different desktop monitors (Lenovo, LG, Asus), at both 60 Hz and 120 Hz. The results are extremely consistent: every driver after 340.52 adds exactly one frame of latency.
I have also tested my program with an Intel HD 4600 and on at least 3 different GeForce cards over the years. All of those setups give me the lower latency values on the same displays.
Does anyone have any idea why this might be? At this point it really looks like a driver bug to me.
Please let me know if more information would be helpful. Thanks!