Need help fixing vsync tearing with dual screen OpenGL (HDMI + DisplayPort)

I originally had problems initializing OpenGL on the second monitor of a dual-monitor setup, but I have cleared that hurdle. Now the problem is tearing (on both monitors); this is detailed in my follow-up post below.

I have managed to get OpenGL to appear on the second monitor by positioning the window at the coordinates obtained by querying the XRandR interface.

However, after initial testing it appears that when my system is configured with dual monitors, any fullscreen OpenGL display I create on either monitor shows very bad tearing.

I tried different combinations of resolutions and refresh rates, for example both monitors at 1920x1080 @ 60 Hz, but every test resulted in tearing on whichever monitor I ran my OpenGL code on.

If I disable the DisplayPort monitor, the HDMI display works without tearing, and if I disable the HDMI display, the DisplayPort display works without tearing. With both monitors enabled, however, I see tearing on whichever monitor I try. I am not trying to display OpenGL on both monitors at the same time (yet); that is the ultimate goal, but first I need to figure out why even a single monitor in a dual-monitor setup cannot display without tearing.

Why is this happening?

If anyone else is having trouble getting OpenGL to display on different monitors in a multi-monitor setup, the code below might help you get an OpenGL window going on the DisplayPort output. The tearing issue remains unresolved for now.

  //the name of the DisplayPort output as reported by XRRGetOutputInfo below
  const char *display = "DP-0";

  XRRCrtcInfo *crtc_info;
  XRRScreenResources *screen = XRRGetScreenResources (x_display, root);
  //printf("outputs: %d\n",screen->noutput);
  for (int i=0; i<screen->noutput; i++)
  {
      XRROutputInfo *info = XRRGetOutputInfo (x_display, screen, screen->outputs[i]);
      //printf("%s\n",info->name);
      //printf("ncrtc: %d\n",info->ncrtc);
      if (strcmp(info->name,display)==0 && info->ncrtc>0)
      {
        crtc_info = XRRGetCrtcInfo (x_display, screen, info->crtcs[0]);

        windowX = crtc_info->x;
        windowY = crtc_info->y;
        windowWidth = crtc_info->width;
        windowHeight = crtc_info->height;

        printf("Found display '%s' at position: %d,%d size: %dx%d\n",info->name,(int)windowX,(int)windowY,(int)windowWidth,(int)windowHeight);

        XRRFreeCrtcInfo(crtc_info);
      }
      XRRFreeOutputInfo (info);
  }
  XRRFreeScreenResources(screen);
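
For context, x_display and root in the snippet above come from the usual Xlib setup done before any XRandR calls. A minimal sketch, assuming the default X screen (link with -lX11 -lXrandr):

  #include <X11/Xlib.h>
  #include <X11/extensions/Xrandr.h>

  Display *x_display = XOpenDisplay(NULL);      // connect to the server named by $DISPLAY
  Window root = DefaultRootWindow(x_display);   // root window spanning the whole X screen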

This topic may help your case.

Thanks WayneWWW, but it’s a very long-winded thread and I’m not sure what I’m supposed to try to address the vsync tearing. Can you offer any further guidance?

Sorry, my fault.

If you are just talking about vsync tearing shown on a dual-display setup, then this post can help.

In brief, adding the option below to your xorg.conf can resolve this issue.

  Option            "ForceCompositionPipeline" "on"
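
For anyone unsure where this goes: the option belongs in the Device section of /etc/X11/xorg.conf. A sketch, assuming a stock NVIDIA driver section (match the Identifier to whatever your existing file uses):

  Section "Device"
      Identifier "Default Device"
      Driver     "nvidia"
      Option     "ForceCompositionPipeline" "on"
  EndSection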

Thanks WayneWWW - I have tried this and I find it does prevent tearing, but it severely hurts the framerate. What used to be smooth 60 Hz motion is now very jittery and inconsistent, averaging around 30 Hz. This is with a very basic scene: a single vertical white bar moving left and right across the screen. Is there some additional change to the OpenGL code that needs to be made to work in this "ForceCompositionPipeline" "on" mode?

Hi trey,

Are both of your displays running 60 Hz modes? Is your application running in full screen?

Yes, both displays are configured to 1920x1080 @ 60 Hz. The application runs fullscreen, yes. I can re-post the code used to initialize the windows if that helps.

Hi trey,

Could you share your sample code and makefile so we can reproduce the issue?

Here you go Wayne. You may need to run the following to be sure you have the needed GL/Mesa/GLUT libraries installed:

sudo apt -y install freeglut3 freeglut3-dev libglew-dev
sudo apt -y install mesa-utils

The code is below. No makefile is needed; you can build like this:

g++ GLTest.cpp -lglut -lGL

GLTest.cpp:

#include <iostream>
#include "GL/freeglut.h"
#include "GL/gl.h"
#include <math.h>
#include <stdlib.h>   // setenv(), putenv(), exit()
#include <sys/time.h>

using namespace std;

double g_pos = 0.5;

double TimeInSeconds()
{ 
  struct timeval tv;
  gettimeofday(&tv, NULL); 
  
  double timeSinceEpoch = (double)tv.tv_sec + (double(tv.tv_usec) / 1000000.0);  // convert to seconds
  return fmod(timeSinceEpoch, 86400.00);  // mod with seconds in day to get seconds since midnight
}

float Pulse(double time) 
{
  const float pi = 3.14159265f;
  const float frequency = .3333;  // Frequency in Hz
  return 0.5*(1 + sin(2 * pi * frequency * time));
}

void drawTriangle()
{
  glClear(GL_COLOR_BUFFER_BIT);

  glColor4f(1.0, 1.0, 1.0, 1.0);
  glMatrixMode(GL_PROJECTION);
  glLoadIdentity();  // reset first; glOrtho multiplies onto the current matrix every frame
  glOrtho(-1.0, 1.0, -1.0, 1.0, -1.0, 1.0);

  // two triangles forming a full-height vertical bar centered on g_pos
  glBegin(GL_TRIANGLES);
  glVertex3f(g_pos - 0.05, -1, 0);
  glVertex3f(g_pos - 0.05, 1, 0);
  glVertex3f(g_pos + 0.05, 1, 0);
  glVertex3f(g_pos + 0.05, -1, 0);
  glVertex3f(g_pos + 0.05, 1, 0);
  glVertex3f(g_pos - 0.05, -1, 0);
  glEnd();

  glutSwapBuffers();
}

void update()
{
  g_pos = -1.f + (Pulse(TimeInSeconds()) * 2.f);
  glutPostRedisplay();
}

void keyboardCallback( unsigned char key, int x, int y )
{
  switch ( key )
  {
    case 27: // Escape key
      exit (0);
      break;
  }
}

int main(int argc, char *argv[])
{
  setenv("DISPLAY", ":0", 1);                // system("export DISPLAY=:0") would only affect a child shell
  putenv((char *) "__GL_SYNC_TO_VBLANK=1");  // ask the NVIDIA driver to sync buffer swaps to vblank
  glutInit(&argc, argv);
  glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA);
  glutInitWindowSize(1280, 720);
  glutInitWindowPosition(0, 0);
  glutCreateWindow("OpenGL - Simple Test");
  glutFullScreen();
  glutKeyboardFunc(keyboardCallback);	
  glutDisplayFunc(drawTriangle);
  glutIdleFunc(update);
  glClearColor(0.0, 0.0, 0.0, 1.0);
  
  glutMainLoop();
  return 0;
}
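
As an aside, vsync can also be requested per-drawable through the GLX_EXT_swap_control extension instead of the environment variable. A minimal sketch, assuming the driver exposes the extension (enableVsync is our own name; call it once after glutCreateWindow, and add #include <GL/glx.h>):

typedef void (*SwapIntervalEXTProc)(Display *, GLXDrawable, int);

void enableVsync()
{
  // look up the extension entry point at runtime; it may be absent
  SwapIntervalEXTProc swapInterval = (SwapIntervalEXTProc)
      glXGetProcAddress((const GLubyte *)"glXSwapIntervalEXT");
  if (swapInterval)
    swapInterval(glXGetCurrentDisplay(), glXGetCurrentDrawable(), 1);  // 1 = one swap per vblank
}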

With this sample code, I observe the following on my dual monitor system:

Immediately upon launch, tearing is visible. If I switch app focus back to the terminal window that launched the app, then the tearing stops. If I click back over on the display with the application running, tearing comes back.

If I disable the second monitor and launch the app, no tearing is visible.

Hi,

Your issue looks like a duplicate of an older issue.

Please try an EGL + GL sample to see if you still see the tearing and the frame rate drop.

GLUT support is still not great.

Which EGL + GL sample do you want me to try? The SimpleGL example mentioned in that post is for the QNX operating system, not Jetson.

For what it’s worth, our main application does use EGL, but there is too much source code in it to post here.

If you know of an EGL + GL sample that works without tearing on dual-screen Jetson Nano systems, please refer me to it and I will simply copy however it initializes the screens.
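
In the meantime, for anyone following along, here is a bare-bones sketch of the EGL-on-X11 initialization path we are working from (our own reduction, not an NVIDIA sample; error handling and X event processing omitted; build with g++ egltest.cpp -lX11 -lEGL -lGL):

#include <X11/Xlib.h>
#include <EGL/egl.h>
#include <GL/gl.h>

int main()
{
  // plain X11 window
  Display *xdpy = XOpenDisplay(NULL);
  Window win = XCreateSimpleWindow(xdpy, DefaultRootWindow(xdpy),
                                   0, 0, 1920, 1080, 0, 0, 0);
  XMapWindow(xdpy, win);
  XFlush(xdpy);

  // EGL on top of that window, using the desktop GL API
  EGLDisplay dpy = eglGetDisplay((EGLNativeDisplayType)xdpy);
  eglInitialize(dpy, NULL, NULL);
  eglBindAPI(EGL_OPENGL_API);

  const EGLint cfgAttribs[] = { EGL_SURFACE_TYPE, EGL_WINDOW_BIT,
                                EGL_RENDERABLE_TYPE, EGL_OPENGL_BIT,
                                EGL_RED_SIZE, 8, EGL_GREEN_SIZE, 8,
                                EGL_BLUE_SIZE, 8, EGL_NONE };
  EGLConfig cfg;
  EGLint ncfg = 0;
  eglChooseConfig(dpy, cfgAttribs, &cfg, 1, &ncfg);

  EGLSurface surf = eglCreateWindowSurface(dpy, cfg, (EGLNativeWindowType)win, NULL);
  EGLContext ctx = eglCreateContext(dpy, cfg, EGL_NO_CONTEXT, NULL);
  eglMakeCurrent(dpy, surf, surf, ctx);
  eglSwapInterval(dpy, 1);  // request one vblank per buffer swap

  for (;;)
  {
    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);
    // ... draw here ...
    eglSwapBuffers(dpy, surf);
  }
}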

Hi,

Please check/try our MMAPI sample (sdkmanager will install it).

Hi Wayne.

I am working with Trey on this issue. I can’t see the sample you are referring to, but I do see these samples; would they be sufficient?

:/usr/src/nvidia/graphics_demos$ ls
bubble         ctree        docs
eglstreamcube  gears-basic  gears-cube
gears-lib      include      Makefile.l4tsdkdefs
nvgldemo       nvtexfont    prebuilts
README         weston-dmabuf-formats

I am running this one:

/usr/src/nvidia/graphics_demos/prebuilts/bin/x11/bubble

and I am not seeing any tearing with two instances of the application, one on each display.

Please check the code and sample here.

/usr/src/jetson_multimedia_api

Please also refer to our documentation for the MMAPI samples.

Hi Wayne

I have been working on this with Trey.

We located the /usr/src/jetson_multimedia_api samples using the SDKManager installer.

I have been testing today with this sample:

/usr/src/jetson_multimedia_api/samples/00_video_decode

I made a simple h264 video file which I have attached to the post below:

WhiteLine_HD.zip (111.5 KB)

I am playing the video file using this command:

/usr/src/jetson_multimedia_api/samples/00_video_decode$ ./video_decode H264 ~/Videos/WhiteLine_HD.h264

When I start the Nano with only the HDMI monitor connected, the video file plays perfectly with no tearing.

When I start the device with both the HDMI and the DisplayPort connected to monitors, the video file tears badly and consistently.

Both monitors are running at 1920x1080 @ 60.0 Hz, verified with xrandr.
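
For reference, that was checked with:

xrandr --query

(the active mode on each output is flagged with an asterisk).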

I also tried:

export __GL_SYNC_TO_VBLANK=1
export __GL_SYNC_TO_VBLANK=0

This often changes the position of the tearing, and sometimes it is less noticeable, but it does not consistently stop the tearing from occurring.

Please can you advise how to stop the tearing when the second monitor is connected? Ultimately we need to draw to both outputs from our application without tearing, but right now we cannot even draw to one output without tearing whenever the second monitor is connected.

Thanks!

Also to note: there is no tearing when I mirror the displays using:

xrandr --output DP-0 --same-as HDMI-0

No tearing

But when I run

xrandr --output DP-0 --right-of HDMI-0

The tearing returns

Your previous suggestion to add this to xorg.conf:

Option "ForceCompositionPipeline" "on"

does stop the tearing, but as Trey pointed out, performance is then very bad: decode of the 30 fps 1080p file is very jittery, with lots of dropped frames.

Hi Wayne. Any thoughts on this?

Thanks

I see the same problem on the AGX Xavier too.

The lack of response or assistance on this matter is frustrating.

Should we conclude that it’s not possible to make an OpenGL/EGL application that can draw to multiple outputs efficiently without tearing on the Jetson platform, then?

For a company such as Nvidia to make a product range with such a major flaw seems pretty shocking!