Jetson TX2 Dev Kit fails to capture from 2 USB UVC Cameras on same USB 3.0 Port at 160x120 and 9 FPS

Hi All,

I have a Jetson TX2 running the latest JetPack 4.2 SDK. I have plugged 2 identical USB UVC cameras, each capturing 160x120 at 9 fps, into the Jetson’s USB 3.0 port through a USB 2.0 hub. All software is up to date.

I can capture with GStreamer from the command line with gst-launch from either camera independently, one at a time, but not both at once. I want to capture from both cameras at the same time, but the second capture command always fails with “Failed to allocate required memory”.

I’ve read elsewhere that 2 USB cameras are tough to get working due to USB bandwidth limitations and the way the drivers allocate memory for worst-case frame sizes; however, my frame rates and frame sizes are tiny compared to most modern cameras.

I think my math is correct and the USB 2.0 hub should be able to handle 2 cameras:

160 × 120 × 2 bytes (16-bit) × 9 fps = 345,600 B/s × 8 = 2,764,800 bps (≈ 2.76 Mbps)

2,764,800 bps × 2 cameras = 5,529,600 bps (≈ 5.53 Mbps)

Even a conservative 10 Mbps estimate (below USB 1.1 full speed’s 12 Mbps, let alone USB 2.0’s 480 Mbps) > 5.53 Mbps
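
The same arithmetic as a quick shell sanity check:

nvidia@nvidia:~$ echo $(( 160 * 120 * 2 * 9 * 8 )) bps per camera
2764800 bps per camera
nvidia@nvidia:~$ echo $(( 160 * 120 * 2 * 9 * 8 * 2 )) bps for two cameras
5529600 bps for two cameras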

How can I make this work?

Here is the failure for the second camera after the first is already streaming with the same parameters:

GST_DEBUG=2 gst-launch-1.0 -v v4l2src device=/dev/video2 ! videorate ! video/x-raw, width=160, height=120,format=GRAY16_LE,framerate=5/1 ! fakesink
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = video/x-raw, width=(int)160, height=(int)120, format=(string)GRAY16_LE, framerate=(fraction)9/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstVideoRate:videorate0.GstPad:src: caps = video/x-raw, width=(int)160, height=(int)120, format=(string)GRAY16_LE, framerate=(fraction)5/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw, width=(int)160, height=(int)120, format=(string)GRAY16_LE, framerate=(fraction)5/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstFakeSink:fakesink0.GstPad:sink: caps = video/x-raw, width=(int)160, height=(int)120, format=(string)GRAY16_LE, framerate=(fraction)5/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw, width=(int)160, height=(int)120, format=(string)GRAY16_LE, framerate=(fraction)5/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstVideoRate:videorate0.GstPad:sink: caps = video/x-raw, width=(int)160, height=(int)120, format=(string)GRAY16_LE, framerate=(fraction)9/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1, interlace-mode=(string)progressive
0:00:00.235760607  9015   0x559713f4a0 WARN          v4l2bufferpool gstv4l2bufferpool.c:790:gst_v4l2_buffer_pool_start:<v4l2src0:pool:src> Uncertain or not enough buffers, enabling copy threshold
0:00:00.243462223  9015   0x559713f4a0 ERROR         v4l2bufferpool gstv4l2bufferpool.c:677:gst_v4l2_buffer_pool_streamon:<v4l2src0:pool:src> error with STREAMON 28 (No space left on device)
0:00:00.243520174  9015   0x559713f4a0 ERROR             bufferpool gstbufferpool.c:564:gst_buffer_pool_set_active:<v4l2src0:pool:src> start failed
0:00:00.243951818  9015   0x559713f4a0 WARN                 v4l2src gstv4l2src.c:650:gst_v4l2src_decide_allocation:<v4l2src0> error: Failed to allocate required memory.
0:00:00.243993033  9015   0x559713f4a0 WARN                 v4l2src gstv4l2src.c:650:gst_v4l2src_decide_allocation:<v4l2src0> error: Buffer pool activation failed
0:00:00.244182247  9015   0x559713f4a0 WARN                 basesrc gstbasesrc.c:3275:gst_base_src_prepare_allocation:<v4l2src0> Subclass failed to decide allocation
0:00:00.244241415  9015   0x559713f4a0 WARN                 basesrc gstbasesrc.c:3055:gst_base_src_loop:<v4l2src0> error: Internal data stream error.
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Failed to allocate required memory.
Additional debug info:
gstv4l2src.c(650): gst_v4l2src_decide_allocation (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
Buffer pool activation failed
0:00:00.244280390  9015   0x559713f4a0 WARN                 basesrc gstbasesrc.c:3055:gst_base_src_loop:<v4l2src0> error: streaming stopped, reason not-negotiated (-4)
Execution ended after 0:00:00.021354625
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

Here are the capabilities of both cameras:

nvidia@nvidia:~/Desktop$ v4l2-ctl --list-formats-ext -d /dev/video1
ioctl: VIDIOC_ENUM_FMT
        Index       : 0
        Type        : Video Capture
        Pixel Format: 'UYVY'
        Name        : UYVY 4:2:2
                Size: Discrete 160x120
                        Interval: Discrete 0.111s (9.000 fps)

        Index       : 1
        Type        : Video Capture
        Pixel Format: 'Y16 '
        Name        : 16-bit Greyscale
                Size: Discrete 160x120
                        Interval: Discrete 0.111s (9.000 fps)
                Size: Discrete 160x122
                        Interval: Discrete 0.111s (9.000 fps)

        Index       : 2
        Type        : Video Capture
        Pixel Format: 'GREY'
        Name        : 8-bit Greyscale
                Size: Discrete 160x120
                        Interval: Discrete 0.111s (9.000 fps)

        Index       : 3
        Type        : Video Capture
        Pixel Format: 'RGBP'
        Name        : 16-bit RGB 5-6-5
                Size: Discrete 160x120
                        Interval: Discrete 0.111s (9.000 fps)

        Index       : 4
        Type        : Video Capture
        Pixel Format: 'BGR3'
        Name        : 24-bit BGR 8-8-8
                Size: Discrete 160x120
                        Interval: Discrete 0.111s (9.000 fps)

nvidia@nvidia:~/Desktop$ v4l2-ctl --list-formats-ext -d /dev/video2
ioctl: VIDIOC_ENUM_FMT
        Index       : 0
        Type        : Video Capture
        Pixel Format: 'UYVY'
        Name        : UYVY 4:2:2
                Size: Discrete 160x120
                        Interval: Discrete 0.111s (9.000 fps)

        Index       : 1
        Type        : Video Capture
        Pixel Format: 'Y16 '
        Name        : 16-bit Greyscale
                Size: Discrete 160x120
                        Interval: Discrete 0.111s (9.000 fps)
                Size: Discrete 160x122
                        Interval: Discrete 0.111s (9.000 fps)

        Index       : 2
        Type        : Video Capture
        Pixel Format: 'GREY'
        Name        : 8-bit Greyscale
                Size: Discrete 160x120
                        Interval: Discrete 0.111s (9.000 fps)

        Index       : 3
        Type        : Video Capture
        Pixel Format: 'RGBP'
        Name        : 16-bit RGB 5-6-5
                Size: Discrete 160x120
                        Interval: Discrete 0.111s (9.000 fps)

        Index       : 4
        Type        : Video Capture
        Pixel Format: 'BGR3'
        Name        : 24-bit BGR 8-8-8
                Size: Discrete 160x120
                        Interval: Discrete 0.111s (9.000 fps)

And here are the 2 commands I’d like to run at the same time:

gst-launch-1.0 -v v4l2src device=/dev/video1 ! video/x-raw, width=160, height=120,format=GRAY16_LE ! fakesink

gst-launch-1.0 -v v4l2src device=/dev/video2 ! video/x-raw, width=160, height=120,format=GRAY16_LE ! fakesink

I also tried a brand-new, powered USB 3.0 (5 Gbps) hub; the exact same error happens.

nvidia@nvidia:/mnt/sdcard$ lsusb -t
/:  Bus 02.Port 1: Dev 1, Class=root_hub, Driver=tegra-xusb/3p, 5000M
    |__ Port 1: Dev 2, If 0, Class=Hub, Driver=hub/4p, 5000M
/:  Bus 01.Port 1: Dev 1, Class=root_hub, Driver=tegra-xusb/4p, 480M
    |__ Port 2: Dev 5, If 0, Class=Hub, Driver=hub/4p, 480M
        |__ Port 1: Dev 7, If 0, Class=Video, Driver=uvcvideo, 12M
        |__ Port 1: Dev 7, If 1, Class=Video, Driver=uvcvideo, 12M
        |__ Port 2: Dev 9, If 0, Class=Video, Driver=uvcvideo, 12M
        |__ Port 2: Dev 9, If 1, Class=Video, Driver=uvcvideo, 12M

Hi,
Is the issue seen if you set format=UYVY? GRAY16_LE is not verified since we don’t have USB cameras that can output this kind of format.

Thanks for the response. I do see the same error with UYVY.
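
For reference, the UYVY attempt was the same pipeline with only the format caps changed, along these lines:

gst-launch-1.0 -v v4l2src device=/dev/video2 ! video/x-raw, width=160, height=120,format=UYVY ! fakesink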

If I use a USB-A female to USB micro-B male adapter, I can put the second camera on the micro-USB port of the Jetson TX2 Dev Kit and can then capture from both.

But I believe that at these frame rates and resolutions, one USB 3.0 port with a powered USB 3.0 hub should be able to capture from several cameras. My final goal is to use the USB 3.0 hub to capture from more than 2 of these small cameras. Does that seem right to you? Thanks.

Hi,
We have tried running an e-con CU135 and a Logitech C930e simultaneously, connected through a hub to the type-A port. It looks to be an issue specific to your USB cameras. Do you have other USB cameras to give it a try?

I am still troubleshooting this issue. I will try to patch the JetPack 4.2 kernel for the TX2 per the following post:

https://stackoverflow.com/questions/9781770/capturing-multiple-webcams-uvcvideo-with-opencv-on-linux/23881125#23881125
root@nvidia:/sys/module/uvcvideo/parameters# uname -r
4.9.140-tegra
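
Before rebuilding, one thing I plan to try first (my assumption that it applies to this failure) is the stock driver’s FIX_BANDWIDTH quirk (bit 0x80 of uvcvideo’s quirks parameter), which tells uvcvideo to size its bandwidth request from what the camera reports rather than from a computed worst case:

# Reload uvcvideo with the FIX_BANDWIDTH quirk (0x80 = 128 decimal)
sudo rmmod uvcvideo
sudo modprobe uvcvideo quirks=0x80
cat /sys/module/uvcvideo/parameters/quirks   # should now print 128

If uvcvideo turns out to be built into the kernel rather than a module, the equivalent would be adding uvcvideo.quirks=0x80 to the kernel command line.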

I’ve found blog posts on compiling the kernel for previous JetPack releases, but not for 4.2. Are there instructions for compiling and installing the JetPack 4.2 kernel on the device? If not, for cross-compiling it?

Thank you!

Hi phillip.class,

Please download L4T Documentation and check “Kernel Customization” → “Building the NVIDIA Kernel” section.

Download link: https://developer.nvidia.com/embedded/dlc/l4t-documentation-32-1
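
In short, the flow from that section looks like this (a sketch only; <toolchain> and <kernel_source> are placeholders that depend on your setup, so check the document for the exact values):

# Cross-compiling on a host PC
export CROSS_COMPILE=<toolchain>/bin/aarch64-linux-gnu-
export TEGRA_KERNEL_OUT=$HOME/kernel_out
cd <kernel_source>/kernel/kernel-4.9
mkdir -p $TEGRA_KERNEL_OUT
make ARCH=arm64 O=$TEGRA_KERNEL_OUT tegra_defconfig
make ARCH=arm64 O=$TEGRA_KERNEL_OUT -j$(nproc)
# Then copy $TEGRA_KERNEL_OUT/arch/arm64/boot/Image to /boot/Image on the
# Jetson, install the modules, and reboot.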

It is kind of unusual for a camera to be only USB1.1 (which is mostly what you’d see from slow devices like a keyboard or mouse). On the other hand, your cameras shouldn’t need more than that amount of bandwidth (I’m guessing they are some sort of specialty camera, e.g., thermal).

The HUB itself is running USB2, which is not a problem. However, there are some design differences between HUBs. I’m wondering if the HUB might be part of the issue in the conversion between USB2 and USB1.1.

When two ports of a HUB both need servicing, it is up to the host to scan through the ports. In a naive HUB, the slowest device on a port forces all of the other devices to run at the slower speed. For a bit more money a transaction translator (TT) can be added (a transaction translator adapts between two different USB speeds…it is a kind of buffered USB bridge). One common design uses a single TT and doesn’t cost much to add. Another design is multi-TT, where each port has its own TT (versus an entire bus adapting to the host speed but forcing individual ports to slow down). One might think that since all of your particular camera devices are USB 1.1, single-TT versus multi-TT wouldn’t matter, but even in this case performance can suffer with a single TT, since the host itself issues commands to one port at a time (the time spent servicing one port means the other ports must wait).

If your HUB is not multi-TT, then it is possible that traffic is being choked off inside the HUB. The host itself is quite capable of sending out control commands at USB 2 speeds (480 Mb/s), but each USB 1.1 device is only capable of reacting at USB 1.1 speeds (12 Mb/s). If the point of translation is a single TT, then responsiveness is much slower than if each port has its own TT (multi-TT). The HUB itself could be slowing things down, and the time spent waiting delays the time before a different port can be serviced.

You might want to describe the details of the HUB itself. If you have more HUBs, and especially some of the more expensive ones (the cheap ones never use more than a single TT), you might check for differences.
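
If you want to check what a HUB you already have claims, the hub’s device descriptor distinguishes the two designs (bDeviceProtocol 1 is single TT, 2 is TT per port). Substitute your HUB’s vendor:product ID from a plain “lsusb” listing; the <vendor>:<product> below is a placeholder:

sudo lsusb -v -d <vendor>:<product> | grep -i -E 'bDeviceProtocol|TT'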

I tried patching the kernel and changing the computed bandwidth requirements, but it kept failing. Thank you linuxdev for the comment; I purchased a multi-TT hub, and now I can capture from all four of my cameras simultaneously.

Thanks for the help!

Others might benefit from knowing which HUB model you found to work. Glad it was something simple.

I bought this hub:
https://www.sweetwater.com/store/detail/Overhub--elektron-overhub-7-port-usb-3.0-hub

It is spendy for a USB hub, but the problem is that there are not many actual “Multi-TT” hubs out there. It would be better if my camera manufacturer output USB 2.0 video instead of 1.1…

That’s an interesting HUB. It looks like it was designed for audio or video studio work, which means it is tailored to the same kind of low-latency traffic a Jetson needs. Seven ports would definitely exceed the root HUB’s capacity if they were all USB 3 cameras, but I’m guessing this would work well with keyboards, mice, and other low-speed items, where some other HUBs start losing camera frames if a keyboard/mouse is used on the same HUB. Nice choice.

Yes, it works well for this application. Do you know if the Jetson Nano’s 4 exposed USB ports are all on the same root controller, or did they break out multiple root controllers? I haven’t found whether or not the Jetson Nano SoC has multiple root controllers…

Here is another one:

It is a multi-TT hub, because Yoctopuce’s sensors are all USB 1.1. This one would work better if space-constrained, but it would require some soldering in my case.

I have not looked, but since it is derived from a TX1 it implies they are all separate root HUBs. You can always tell via “lsusb -t”.

Interesting - on my Nano, lsusb shows a Realtek USB 2.0 hub with an ID that points me to this datasheet:

That datasheet says:
MTT (Multiple Transaction Translator)

  • One TT for each downstream port.
  • Better data throughput when multiple downstream ports act on FS concurrently.

lsusb seems to confirm that it is multi-TT; once OpenCV finishes compiling, I will test it out.

I can confirm that the Jetson Nano can capture from all 4 USB 1.1 Cameras at the same time without any powered external hub. It can do this all on a cheap microUSB 5V/2A Power Supply.
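
For anyone who wants to reproduce the test, I launch all four pipelines in the background from one shell (the /dev/video* indices are from my setup and may enumerate differently on yours):

for dev in /dev/video0 /dev/video1 /dev/video2 /dev/video3; do
  gst-launch-1.0 v4l2src device=$dev ! video/x-raw, width=160, height=120,format=GRAY16_LE ! fakesink &
done
wait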