General performance while hi-res monitor attached via DisplayPort

I have been working with the Jetson Nano B01 for a few weeks now on a fresh SD image. I have followed a few tutorials (JetsonHacks and Toptechboy.com) and I am not getting anywhere close to the frame rates they are getting. I followed the setups EXACTLY; the main difference is that my Nano is hooked up via DisplayPort to a 30-inch monitor @ 2560x1600.

Is it fair to say that rendering the Ubuntu desktop at that resolution is causing GStreamer to chug? jtop shows GPU spikes even with no code running, and roughly the same spikes show up when executing code.

I am starting to think I have a bad Nano. I have pulled all-nighters trying to figure out what is different. I am not going to get into specific code, because it is code from tutorials that obviously works well.

  • Running in full power mode
  • jetson_clocks service is enabled (commands for both shown below)
  • Fast SD card
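
For reference, here is how I set the first two (assuming the standard NVIDIA tools):

sudo nvpmodel -m 0    # full power mode (MAXN)
sudo jetson_clocks    # lock clocks at maximum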

I am not sure what exact problem you are seeing.

Please elaborate a bit more.

Sorry that you missed the question. I will ask it again…

Is it fair to say that rendering the Ubuntu desktop at that resolution is causing GStreamer to chug?

The frame rates are underwhelming. My Raspberry Pi gets better FPS using the same OpenCV code.

It depends on what OpenCV code you are running. You mentioned GStreamer, so is it a pipeline in OpenCV code with an appsink? If you have a videoconvert element in your pipeline, it is possible that the performance is bad because videoconvert is a pure CPU-based converter.

Generally, OpenCV uses an X-based API, so the Ubuntu desktop is indeed sharing resources with it. However, this should not cause slow FPS unless your GPU is occupied by something else.
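
As a quick check (a rough sketch on my side; it assumes your OpenCV build has GStreamer support and a CSI camera is attached), you could time cap.read() alone, with no imshow() window, to separate capture cost from desktop rendering cost:

import time
import cv2

# Hypothetical pipeline; substitute the exact string you are using.
pipeline = ('nvarguscamerasrc ! '
            'video/x-raw(memory:NVMM), width=640, height=360, '
            'format=NV12, framerate=30/1 ! '
            'nvvidconv ! video/x-raw, format=BGRx ! '
            'videoconvert ! appsink')

cap = cv2.VideoCapture(pipeline, cv2.CAP_GSTREAMER)
frames = 0
start = time.time()
while frames < 300:
    ok, _ = cap.read()   # capture + color conversion only, no rendering
    if not ok:
        break
    frames += 1
cap.release()
print('capture-only FPS: %.1f' % (frames / (time.time() - start)))

If this number is already low, the desktop rendering is not the bottleneck.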

If you suspect it is a defective Nano, you could try other Nano devices as well if you have extra ones.

I am sorry for not being clear. I am using code from JetsonHacks that clearly works really well. I have not changed anything. It is using hardware encoding.

Hi,

Could you point out which code you are using? JetsonHacks is not an official website, so we know nothing about the sample you are referring to…

Also, could you dump your tegrastats output while the device is idle, since you said there are weird GPU spikes?
Have you ever tried rebooting?
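
For reference, tegrastats ships with the default L4T image and runs straight from a terminal; it prints one line per second including the GR3D (GPU) load:

sudo tegrastats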

I have tried everything. Yeah, I never said JetsonHacks was official - that wasn't my point. My point was that you can clearly see in his videos, given his GitHub code, that it runs as expected. I was just curious about the monitor and you answered the question. Thanks.

Hi,

No offense, but I was just hoping you could at least share a link to the code you are referring to from JetsonHacks, or at least the video link. There is quite a lot of code in the JetsonHacks GitHub. It is better to get us on the same page.

Maybe it is a general bug that we could try to reproduce on our devkit too. Your kind help may also benefit other forum users.

I am using the JetsonHacks CSI camera code, and I also tried AI on the Jetson Nano LESSON 34: Face and Eye Detection with Haar Cascades in OpenCV - YouTube

Both of them use essentially the same pipeline string, and when you watch their videos, the frame rate is perfectly reasonable.

Here is the GStreamer pipeline string I am using with a CSI camera.

('nvarguscamerasrc ! '                                 # CSI camera source
 'video/x-raw(memory:NVMM), '
 'width=(int)640, height=(int)360, '
 'format=(string)NV12, framerate=(fraction)10/1 ! '
 'nvvidconv flip-method=2 ! '                          # hardware flip + convert
 'video/x-raw, width=(int){}, height=(int){}, '
 'format=(string)BGRx ! '
 'videoconvert ! appsink').format(640, 360)            # BGRx -> BGR on the CPU
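
For context, that string is fed to OpenCV roughly like this (my own sketch of the loop; the tutorial code may differ in details, and it assumes OpenCV was built with GStreamer support):

import cv2

pipeline = ('nvarguscamerasrc ! '
            'video/x-raw(memory:NVMM), '
            'width=(int)640, height=(int)360, '
            'format=(string)NV12, framerate=(fraction)10/1 ! '
            'nvvidconv flip-method=2 ! '
            'video/x-raw, width=(int){}, height=(int){}, '
            'format=(string)BGRx ! '
            'videoconvert ! appsink').format(640, 360)

cap = cv2.VideoCapture(pipeline, cv2.CAP_GSTREAMER)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imshow('CSI camera', frame)      # rendered through the X desktop
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break
cap.release()
cv2.destroyAllWindows()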

I have been messing with that string in all sorts of variations over the past couple of weeks with no luck. Keep in mind, I am a hobbyist, not a professional, and I am finding the Nano is just not cut out for simple face-detection development. Maybe it will be fine in a headless configuration with no OpenCV window created. I should try that next.

If I increase the frame rate in that string, the delay gets worse. Obviously there is some buffer de-sync going on, but I just cannot figure out why I cannot reproduce their results.
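
One variation I have not properly tried yet (just a guess on my part, not something from the tutorials): telling appsink to drop stale frames so the pipeline does not back up when my processing is slower than the camera. drop and max-buffers are standard appsink properties:

'videoconvert ! appsink drop=true max-buffers=1'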

Hi,

You could run a similar use case from our GStreamer user guide first.

You could first try playing an mp4 file with the command below and rendering it on the monitor. This pipeline can show whether the problem is due to the GPU or not: only xvimagesink in the pipeline below uses the GPU; the other components all run on other hardware blocks.

gst-launch-1.0 filesrc location=1280x720_30p.mp4 ! qtdemux ! queue !  h264parse ! omxh264dec ! nvvidconv ! 'video/x-raw, format=(string)I420, width=640, height=480' ! xvimagesink -e
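
If you want an actual number rather than judging by eye, you could also swap the sink for fpsdisplaysink (a standard GStreamer element; with -v the measured rate is printed to the console):

gst-launch-1.0 filesrc location=1280x720_30p.mp4 ! qtdemux ! queue ! h264parse ! omxh264dec ! nvvidconv ! 'video/x-raw, format=(string)I420, width=640, height=480' ! fpsdisplaysink text-overlay=false video-sink=xvimagesink -v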

Great idea. I had tried it once before, but this time I watched it with jtop.

To me it seemed the CPUs spiked more than the GPU. I managed to take a screenshot at the GPU's highest point.

How is the frame rate there? Is it normal?

The video seemed to run smoothly. I played the same file on my Windows machine and it appeared to be the same.

I was getting GStreamer errors though.

(gst-launch-1.0:8549): GStreamer-CRITICAL **: 04:30:11.184: gst_caps_is_empty: assertion 'GST_IS_CAPS (caps)' failed

(gst-launch-1.0:8549): GStreamer-CRITICAL **: 04:30:11.185: gst_caps_truncate: assertion 'GST_IS_CAPS (caps)' failed

(gst-launch-1.0:8549): GStreamer-CRITICAL **: 04:30:11.185: gst_caps_fixate: assertion 'GST_IS_CAPS (caps)' failed

(gst-launch-1.0:8549): GStreamer-CRITICAL **: 04:30:11.185: gst_caps_get_structure: assertion 'GST_IS_CAPS (caps)' failed

(gst-launch-1.0:8549): GStreamer-CRITICAL **: 04:30:11.185: gst_structure_get_string: assertion 'structure != NULL' failed

(gst-launch-1.0:8549): GStreamer-CRITICAL **: 04:30:11.185: gst_mini_object_unref: assertion 'mini_object != NULL' failed

This might be related.

Sorry, I didn't see this post earlier. I will keep an eye on this.

This issue is more related to darknet and face recognition. If your OpenCV code is able to run smoothly when face recognition is disabled (pure rendering), then you might refer to that thread.

Yes, when I disable face recognition, the frame rates come back to normal. I completely understand that machine vision eats up CPU/GPU, but it shouldn't be running at 1-3 fps.
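
For anyone who finds this later, the detection loop in that lesson is essentially this shape (my sketch, not the exact lesson code). detectMultiScale on the full frame is where the time goes, and running it on a downscaled copy helps a lot, since the cost grows with image area:

import cv2

# cv2.data.haarcascades is where pip builds of OpenCV keep the XML
# files; a from-source build may put them elsewhere.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + 'haarcascade_frontalface_default.xml')

def detect_faces(frame, scale=0.5):
    # Detect on a half-size grayscale copy, then map the boxes back
    # to full-frame coordinates.
    small = cv2.resize(frame, None, fx=scale, fy=scale)
    gray = cv2.cvtColor(small, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
    return [(int(x / scale), int(y / scale),
             int(w / scale), int(h / scale)) for (x, y, w, h) in faces]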