Jetson TX1 - custom output resolution

Hi,
I’m working on Jetson TX1, and trying to capture the output video with a DVI2USB 3.0 Frame Grabber.
The grabber is connected via an HDMI-to-DVI cable and supports an input resolution of 1920x1200 @ 60 Hz (the resolution is configured in the grabber's EDID file).
I'm trying to set a custom refresh rate of 50 Hz (staying at 1920x1200), without any success.
I have tried the following commands:

cvt 1920 1200 50
Which gives: # 1920x1200 49.93 Hz (CVT 2.30MA) hsync: 61.82 kHz; pclk: 158.25 MHz
Modeline "1920x1200_50.00" 158.25 1920 2040 2240 2560 1200 1203 1209 1238 -hsync +vsync

Then: sudo xrandr --newmode "1920x1200_50.00" 158.25 1920 2040 2240 2560 1200 1203 1209 1238 -hsync +vsync

And then: sudo xrandr --addmode HDMI-0 1920x1200_50.00
But the error I get is:

ubuntu@tegra-ubuntu:~ $ xrandr --addmode HDMI-0 1920x1200_50.00
X Error of failed request: BadMatch (invalid parameter attributes)
Major opcode of failed request: 140 (RANDR)
Minor opcode of failed request: 18 (RRAddOutputMode)
Serial number of failed request: 20
Current serial number in output stream: 21

When I check the supported output resolutions (with the grabber connected) I get:
ubuntu@tegra-ubuntu:~ $ xrandr -q
Screen 0: minimum 16 x 16, current 1920 x 1200, maximum 16384 x 16384
HDMI-0 connected primary 1920x1200+0+0 (normal left inverted right x axis y axis) 400mm x 300mm
1920x1200 59.95*+

When I try to force the frame rate with the command:
ubuntu@tegra-ubuntu:~ $ xrandr -s 1920x1200 -r 50
I get nothing; it looks like it succeeded, but nothing happens. The grabber blinks for one second, but after that it goes back to its normal mode (60 Hz). And when I try:
ubuntu@tegra-ubuntu:~ $ xrandr -s 1920x1200 -r 60
(or any other value, even 59.95), I get:
Rate 60.00 Hz not available for this size

I’ve also tried to set the resolution mode in /etc/X11/xorg.conf
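Roughly this kind of fragment (the Identifier value is a placeholder for my setup; the Modeline is the one cvt generated above):

```
Section "Monitor"
    Identifier "HDMI-0"
    # Modeline generated by: cvt 1920 1200 50
    Modeline "1920x1200_50.00" 158.25 1920 2040 2240 2560 1200 1203 1209 1238 -hsync +vsync
    Option "PreferredMode" "1920x1200_50.00"
EndSection
```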

In all of these cases, the grabber still gets 1920x1200 60 Hz.

If it matters, I'm using Ubuntu 16.04 64-bit with the latest L4T version (24.2.1).

Any suggestions?
Thanks,
Shani.

It's possible the resolution and refresh rate are paired. Have you tried it on any other platform with the same config?

What do you mean by "paired"? The resolution and refresh rate in the frame grabber, or in the TX1?
I haven't tried another platform because this is the only platform I have for testing. But I have tried installing a different EDID file into the frame grabber, one that includes other resolutions (up to 1920x1080), and I was able to change the resolution to one of those supported resolutions.
But I need 1920x1200 50 Hz and have not succeeded yet.

I mean that maybe this monitor only supports this resolution at 60 Hz. You can try it on your PC.

Finally I managed to set up a new EDID file and load it into the frame grabber. The EDID file includes only one supported resolution, 1920x1200 50 Hz, so xrandr has no other option and chooses it as the current resolution.
It didn't solve my first problem: I still get the same error when I try to add a new mode with the same resolution and refresh rate but reduced blanking (a different pixel clock). But the solution I found is good enough for now; I change the EDID file of the frame grabber to the desired resolution, refresh rate, pixel clock, etc.

Thanks for the advice.

Now, I have another issue.
I have an application that renders frames to the screen at 50 Hz.
I want to synchronize the HDMI refresh rate with the program's frame rate, so that one frame is sent every 20 ms (50 Hz).
Is the output resolution setup I did before enough?

(It may be a silly question, but this is the first time I've tried to synchronize the output frame rate, and I'm really not sure what I have to do to synchronize the frames with the HDMI refresh rate.)

Thanks in advance

Hi Shani,

If you want to sync the frame rate between the screen and your program, the vsync function could help. However, the output frame rate may not be what you expect.

Hi WayneWWW,
Yes, I want my program to be synchronized with the screen. What is the "vsync function"?

I want to render a frame exactly every 20 ms.
Assuming I have a monitor that can accept a 50 Hz frame rate, how can I sync my program to the monitor?

This is the pseudo code I thought about (working with a double buffer):


prepare the frame

wait until all commands finish (glFinish)
get timer (t2)
time elapsed = t2 - t1
sleep until 20 ms have passed
get timer (t1)
call swapBuffers

But this rendering is not synchronized to any monitor…
Is there a better way to stay synchronized with a monitor?

Our driver already does vsync for you. It results in synchronization between the app and the screen.

Maybe that is the reason you failed to synchronize to the screen. Please check whether

/sys/module/window/parameters/no_vsync

is on or not.

The /sys/module/window/parameters/no_vsync value is 0.
Is that equivalent to __GL_SYNC_TO_VBLANK == 1?

As I understand it, the TX1 GPU is a "free-running" machine that has an internal sync (depending on the configured screen refresh rate) but is independent of the screen's sync timing.
If the screen resolution is 1920x1200 50 Hz, the TX1 output will also be 1920x1200 50 Hz, but nothing guarantees that the GPU vsync (which the buffer swap waits for) and the screen sync will fire at the same time.

Am I wrong?

As far as I know, __GL_SYNC_TO_VBLANK works for the OpenGL buffer swap, and "no_vsync" controls vsync in the dc (display controller) driver.

To avoid screen tearing, both should be on. However, the side effect of vsync is that the frame rate drops to some fixed value.

Take the following code for example. I try to set the fps to 120 via the delay variable (the line commented "control the frame rate here").
With a 1920x1080 60 Hz screen and __GL_SYNC_TO_VBLANK set to 0, I can see the fps run to almost 120. However, if vsync is on, the fps shown drops to 60.

If I lower that value to 60, with vsync on, the fps shows only 30. Maybe that is the reason you cannot control the synchronization.

#include <stdio.h>
#include <math.h>
#include <stdlib.h>       /* exit() */
#include <unistd.h>       /* usleep() */
#include <GL/freeglut.h>  /* freeglut: needed for glutBitmapString() */

void disp_fps(){
    static GLint frames = 0;
    static GLint t0 = 0;
    static char fps_str[20] = {'\0'};
    GLint t = glutGet(GLUT_ELAPSED_TIME);
    if (t - t0 >= 200) {
        GLfloat seconds = (t - t0) / 1000.0;
        GLfloat fps = frames / seconds;
        sprintf(fps_str, "%6.1f FPS\n", fps);
        printf("%6.1f FPS\n", fps);
        t0 = t;
        frames = 0;
    }
    glColor3f(0.0, 0.0, 1.0);
    glRasterPos2f(0, 0);
    glutBitmapString(GLUT_BITMAP_HELVETICA_18, fps_str);
    frames++;
}

void display() {
    
    float siz = 0.01;
    float inc = 0.05;
    float delay = 1.0 / 120; // control the frame rate here
    
    static double a = 0;
    a = fmod(a+inc, 2*M_PI);
    double x = sin(a);
    
    glClearColor(0.5, 0.5, 0.5, 1.0);
    glClear(GL_COLOR_BUFFER_BIT);
    
    usleep((int)(delay*1000000));
    
    glColor3f(0,1,0);
    glBegin(GL_POLYGON);
    glVertex3f(x-siz,-1.0,0.0);
    glVertex3f(x-siz,1.0,0.0);
    glVertex3f(x+siz,1.0,0.0);
    glVertex3f(x+siz,-1.0,0.0);
    glEnd(); 
    
    disp_fps();
    
    glutPostRedisplay();
    glutSwapBuffers();
    glFinish();
}

void toggle_fullscreen(){
    static int fullscreen = 1;
    if(fullscreen){
        glutFullScreen();
    }else{
        glutReshapeWindow(1800, 900);
        glutPositionWindow(0,0);
    }
    fullscreen = !fullscreen;
}

static void key(unsigned char k, int x, int y)
{
    switch (k) {
        case 'f':
            toggle_fullscreen();
            break;
        case 27:  // Escape
        case 'q':
            exit(0);
            break;
        default:
            return;
    }
}

int main(int argc, char** argv) {
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_RGB | GLUT_DEPTH  | GLUT_DOUBLE);
    glutCreateWindow("Tests");
    glutKeyboardFunc(key);
    glutDisplayFunc(display);
    toggle_fullscreen();
    glutMainLoop();
    return 0;
}

Thanks, I will try it and let you know.
When you say "vsync is on", do you mean no_vsync = 0?

Yes, that is what I mean.

The problem is that I need exactly 50 Hz, not almost 50 Hz, and that's why I cannot count on usleep: it always has an extra delay of a few microseconds, and after some frames it would lose sync.

I think you could render frames at a rate higher than 50 and let vsync make it meet the monitor frequency.

I tried 2 options:

  1. Rendering without sleeping (frame rate higher than 50) + __GL_SYNC_TO_VBLANK = 0 + no_vsync = 0.
    As a result I lost frames. Frames arrived at the screen at 50 Hz, but not all of the frames I tried to send.
  2. Rendering without sleeping (frame rate higher than 50) + __GL_SYNC_TO_VBLANK = 1 + no_vsync = 0 (I wasn't aware of the no_vsync parameter when I ran the simulation yesterday, but its value was 0).
    I let it run for more than an hour and only two frames were lost along the way. It seems like the OpenGL buffer swap meets the dc driver sync. (I think the reason for the two lost frames is some OS process running in the background causing performance issues. Is there any way to avoid this?)

It would be hard to find the root cause. Please raise the cpu/gpu clocks to maximum and see if it still happens.

Yes, the cpu/gpu clocks had already been raised to maximum.