USB3 UVC camera low frame rate when HDMI is disconnected


I’m facing a strange issue streaming a USB3 UVC camera on the Xavier NX dev kit running Jetson Linux 35.1.
When my external HDMI monitor is connected to the board I get the expected 120FPS from the camera. When the monitor is not connected, the frame rate drops to an unstable 20-50FPS.

I was able to get 120FPS without a monitor by manually overriding HDMI hotplug detection (edid.bin is the EDID read from my monitor):

root@jetson:~# v4l2-ctl -d /dev/video1 --stream-mmap
<<<<<<<<<<< 23.56 fps, dropped buffers: 15
<<<<<<<<<<<< 24.73 fps, dropped buffers: 13
<<<<<<<<<<<< 24.71 fps, dropped buffers: 15
<<<<<<< 24.03 fps, dropped buffers: 13
<<<<<<<<<<<<<< 24.57 fps, dropped buffers: 12
root@jetson:~# cat edid.bin > /sys/kernel/debug/tegradc.0/edid
root@jetson:~# echo 1 > /sys/kernel/debug/tegradc.0/hotplug
root@jetson:~# v4l2-ctl -d /dev/video1 --stream-mmap
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 119.68 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 119.68 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 107.35 fps, dropped buffers: 4
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 107.35 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 113.13 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 119.67 fps

This behavior is 100% deterministic. Any ideas what’s going on here?


This may not be much help, but in many cases the X server is what provides access to the GPU. Many people think of X as a GUI, but it isn’t one; X is an API and interface to the GPU (a standardized API to the framebuffer, which in turn can live in the GPU). Many CUDA and other GPU-accelerated programs fall back to the CPU when the GPU is not available. As an experiment, can you monitor CPU activity (I like htop or xosview; “sudo apt-get install htop” or “sudo apt-get install xosview”) when:

  • The GUI is running locally on the Jetson, and the program is not running.
  • The GUI is running locally on the Jetson, and the program is running.
  • The monitor/GUI is not running, and the program is running.

My guess is that CPU use will be lower in the GUI cases (because the work stays on the GPU), but that without the GUI the program switches to the CPU. I don’t know, but it is easy to check.
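One way to run that experiment non-interactively over SSH is to sample overall CPU utilization from /proc/stat while the camera stream is running. This is a minimal sketch (Linux only, stdlib only); htop and xosview show the same information interactively.

```python
import time

def cpu_percent(interval=1.0):
    """Return overall CPU utilization (%) over `interval` seconds,
    computed from two snapshots of the aggregate line in /proc/stat."""
    def snapshot():
        with open("/proc/stat") as f:
            fields = [int(x) for x in f.readline().split()[1:]]
        idle = fields[3] + fields[4]  # idle + iowait
        return idle, sum(fields)

    idle1, total1 = snapshot()
    time.sleep(interval)
    idle2, total2 = snapshot()
    dt = total2 - total1
    return 100.0 * (1.0 - (idle2 - idle1) / dt) if dt else 0.0

if __name__ == "__main__":
    # Sample once per second while v4l2-ctl streams in another terminal.
    for _ in range(5):
        print(f"CPU: {cpu_percent():.1f}%")
```

Comparing the printed values across the three scenarios above would show whether the low-FPS case is accompanied by a CPU fallback.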

I didn’t observe significant CPU usage in any of these cases, but FWIW I verified that the behavior is the same when X is not running at all (systemctl stop gdm3.service). I’m running my tests over SSH without any GUI.

You might then check whether GPU load shows up in tegrastats. Same goal: compare the load when running with the monitor/GUI and when running without it. If the GPU is being used in both cases, then X should not matter.
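tegrastats prints one status line per interval, including fields like EMC_FREQ (memory controller load and frequency) and GR3D_FREQ (GPU load). A small parser like the one below makes the comparison scriptable; the field format shown here is one common shape and varies between L4T releases, so treat this as a sketch.

```python
import re

# Matches a field of the form "EMC_FREQ 4%@1600" (load% @ MHz),
# as printed by tegrastats on some L4T releases.
EMC_RE = re.compile(r"EMC_FREQ (\d+)%@(\d+)")

def parse_emc(line):
    """Return (load_percent, freq_mhz) from a tegrastats line,
    or None if the EMC field is absent or in another format."""
    m = EMC_RE.search(line)
    return (int(m.group(1)), int(m.group(2))) if m else None

# Example tegrastats line (abridged, assumed format):
sample = "RAM 2500/7765MB SWAP 0/3882MB EMC_FREQ 4%@1600 GR3D_FREQ 0%@318"
print(parse_emc(sample))  # → (4, 1600)
```

Logging these values with and without the monitor connected would show directly whether the EMC frequency differs between the two cases.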

Looks like the issue was that the EMC (external memory controller) clock is set to a lower frequency when the external display is disconnected. Forcing the maximum frequency with jetson_clocks also fixes the camera stream.
I wonder why the xHCI driver doesn’t do this automatically…
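For confirming this, the EMC rate can be read directly from debugfs. On Xavier it is typically exposed at /sys/kernel/debug/bpmp/debug/clk/emc/rate, but the path varies by platform and L4T release and debugfs needs root, so the path here is an assumption to verify on your system.

```python
from pathlib import Path

# Assumed Xavier debugfs path for the EMC clock rate; adjust per platform.
EMC_RATE = Path("/sys/kernel/debug/bpmp/debug/clk/emc/rate")

def read_rate_hz(path=EMC_RATE):
    """Read a clock rate in Hz from a debugfs-style rate file."""
    return int(Path(path).read_text().strip())
```

Reading this value with the monitor attached and detached (or before and after running jetson_clocks) would show the frequency drop described above.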

I couldn’t say, but I think the defaults should scale clocks up automatically under load. It depends on the power model in use: each model sets ranges for clocks, voltages, and temperatures, and jetson_clocks simply pegs the clocks at the upper end of the current model. So another model might do what you want automatically, although it is simple enough to just run jetson_clocks.
