USB 3.0 and UVC devices rendering and bandwidth

Hi Support,

We tested the Toradex Apalis TK1, which embeds the Tegra K1.

I tried JetPack 2.3 with 2x UVC cameras, and I was not able to stream even a single 1080p60 stream in YUY2 or I420. Is it possible that the Tegra K1 doesn't have the necessary USB 3.0 bandwidth for that?
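Back-of-the-envelope numbers for reference (my own rough figures, not measurements):

1920 x 1080 x 2 B/px (YUY2) x 60 fps ≈ 249 MB/s ≈ 2.0 Gbit/s per stream (≈ 1.5 Gbit/s for I420)
USB 3.0: 5 Gbit/s line rate, roughly 4 Gbit/s of usable payload after 8b/10b encoding

So a single raw 1080p60 stream should fit comfortably, while two of them together get close to the limit of one controller.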

We would like to design a system that is able to capture 2x 1080p60 streams from USB 3.0. These streams will be rendered on the HDMI port, saved to a file, or streamed over the network; one use case at a time. I also found that the K1 is able to hardware-encode 2x 1080p60 to H264 without too much trouble using videotestsrc with moving patterns, so the bottleneck seems to come from the USB 3.0 interface. Do you know if this type of setup fits the K1 architecture?
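That dual-encode test can be reproduced with something like the following (a sketch; the pattern and output file names are arbitrary):

gst-launch-1.0 -e \
  videotestsrc is-live=true pattern=ball ! 'video/x-raw, width=(int)1920, height=(int)1080, framerate=(fraction)60/1, format=(string)I420' ! nvvidconv ! 'video/x-raw(memory:NVMM), format=(string)I420' ! omxh264enc ! qtmux ! filesink location=enc0.mp4 \
  videotestsrc is-live=true pattern=ball ! 'video/x-raw, width=(int)1920, height=(int)1080, framerate=(fraction)60/1, format=(string)I420' ! nvvidconv ! 'video/x-raw(memory:NVMM), format=(string)I420' ! omxh264enc ! qtmux ! filesink location=enc1.mp4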

Right now, if I encode a single 1080p60 stream coming from one UVC device using H264 hardware acceleration (omxh264enc), the generated video is only at 30 fps or less. Even when rendering the stream through the HDMI port, what we see on screen is not fluid; it's as if the system cannot handle this bandwidth. We are using the following pipeline.

gst-launch-1.0 v4l2src device=/dev/video0 ! 'video/x-raw, format=(string)I420, width=(int)1920, height=(int)1080' ! nvhdmioverlaysink

No problems with 720p60 in either case.
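To take the encoder and the display out of the equation, the actual capture rate can also be measured with something like this (a sketch; assumes fpsdisplaysink from gst-plugins-bad is installed, and -v prints the current fps in the last-message lines):

gst-launch-1.0 -v v4l2src device=/dev/video0 ! 'video/x-raw, format=(string)I420, width=(int)1920, height=(int)1080, framerate=(fraction)60/1' ! fpsdisplaysink text-overlay=false video-sink=fakesink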

What do you think about it?

Thanks for your time.

Regards,
Jérôme

I can’t answer whether the bandwidth is sufficient, but one thing you do want to be certain of is that the port is actually in USB3 mode. One way to check is that “lsusb -t” tree mode lists “480M” or “5000M” next to a device or hub, indicating whether it is running in USB2 or USB3 mode.
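Roughly what the tree looks like when a camera has enumerated at SuperSpeed (illustrative output only; bus numbers, ports and driver names will differ on your board):

/:  Bus 02.Port 1: Dev 1, Class=root_hub, Driver=xhci_hcd/2p, 5000M
    |__ Port 1: Dev 2, If 0, Class=Video, Driver=uvcvideo, 5000M
    |__ Port 1: Dev 2, If 1, Class=Video, Driver=uvcvideo, 5000M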

Thanks for your feedback.
I used the device in USB 3.0 mode, since the device has the 5000M mark beside it.
Like I said, I am able to render 720p60, which is already more than USB 2.0 bandwidth allows.
I’m just worried that this issue may be kernel-related. There are a lot of threads saying that some devices are not able to achieve their full frame rate.
The UVC device that we are using can deliver 1080p60 streams to an x86_64 PC without problems.

Thanks,
Jérôme

Any insights into this problem?
There are a lot of threads saying that some devices are not able to achieve their full frame rate, and all of this seems related to the V4L2 stack/kernel.
Even Toradex mentioned to us that there might be a problem. Here is their reply:

I simply got told that UVC would be unsuitable for any such use cases, but I am missing the exact context now. Fact is the TK1 can very well handle such USB 3.0 bandwidth, so it does not seem to be a USB issue. On the other hand you claim your UVC device is able to do it, so I guess it would be some issue in the stack in between. Unfortunately I don’t know many details about either the low-level USB part of UVC or the higher-level V4L2 integration thereof. As you seem more proficient in that area, I would suggest you start analysing and comparing, e.g., USB analyser captures and V4L2 and/or GStreamer traces in order to figure out which exact part is misbehaving.
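For what it’s worth, that kind of trace can be gathered along these lines (a sketch; the uvcvideo trace bitmask and GStreamer debug levels are just a starting point):

# enable uvcvideo driver tracing (module parameter; 0xffff turns on all trace categories, output goes to dmesg)
echo 0xffff | sudo tee /sys/module/uvcvideo/parameters/trace
# run the capture with verbose v4l2 element logging and keep the log for comparison
GST_DEBUG=v4l2*:5 gst-launch-1.0 -v v4l2src device=/dev/video0 ! 'video/x-raw, format=(string)I420, width=(int)1920, height=(int)1080, framerate=(fraction)60/1' ! fakesink 2> gst-v4l2.log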

Are you planning to solve this issue in the near future?
Do you think we will have the same problem with the Tegra X1 board?

Thanks,
Jérôme

Any updates?
Thanks.

Jérôme

Hi jbolduc01,

Please try updating the TK1 USB firmware from the thread below, and make sure the CPU is set to its maximum frequency, to see if that helps in your case:
https://devtalk.nvidia.com/default/topic/936302/jetson-tx1/support-for-multiple-high-resolution-usb-3-0-cameras-/post/4928235/#4928235

Thanks

Hi kayccc,
I just tried the new USB firmware and replaced the tegra_xusb_firmware file with the new one. The problem still persists.

I set the CPU to max frequency using the following commands:
echo 0 > /sys/devices/system/cpu/cpuquiet/tegra_cpuquiet/enable
echo 1 > /sys/devices/system/cpu/cpu0/online
echo 1 > /sys/devices/system/cpu/cpu1/online
echo 1 > /sys/devices/system/cpu/cpu2/online
echo 1 > /sys/devices/system/cpu/cpu3/online
echo performance > /sys/devices/system/cpu/cpu0/cpufreq/scaling_governor
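For reference, the result can be double-checked with (standard cpufreq sysfs paths):

cat /sys/devices/system/cpu/cpu*/online
cat /sys/devices/system/cpu/cpu0/cpufreq/scaling_governor
cat /sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq /sys/devices/system/cpu/cpu0/cpufreq/cpuinfo_max_freq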

Problem still persists.

Any ideas?

Jérôme

Could you try the command below to see whether the recording can reach 60 fps or not.

gst-launch-1.0 v4l2src num-buffers=500 ! 'video/x-raw, width=(int)1920, height=(int)1080, format=(string)I420, framerate=(fraction)60/1' ! nvvidconv ! 'video/x-raw(memory:NVMM), format=(string)I420' ! omxh264enc ! qtmux ! filesink location=test.mp4 -v -e

I just tried the proposed command and got the following output in the terminal. The file is empty.

Setting pipeline to PAUSED …
Pipeline is live and does not need PREROLL …
Setting pipeline to PLAYING …
New clock: GstSystemClock
/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = video/x-raw, format=(string)I420, width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, framerate=(fraction)60/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw, format=(string)I420, width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, framerate=(fraction)60/1
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, framerate=(fraction)60/1, format=(string)I420
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, framerate=(fraction)60/1, format=(string)I420
/GstPipeline:pipeline0/GstOMXH264Enc-omxh264enc:omxh264enc-omxh264enc0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, framerate=(fraction)60/1, format=(string)I420
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, framerate=(fraction)60/1, format=(string)I420
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:sink: caps = video/x-raw, format=(string)I420, width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, framerate=(fraction)60/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw, format=(string)I420, width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, framerate=(fraction)60/1
EOS on shutdown enabled -- waiting for EOS after Error
Waiting for EOS…
/GstPipeline:pipeline0/GstQTMux:qtmux0.GstPad:src: caps = video/quicktime, variant=(string)apple
/GstPipeline:pipeline0/GstFileSink:filesink0.GstPad:sink: caps = video/quicktime, variant=(string)apple
handling interrupt.
Interrupt: Stopping pipeline …
Interrupt while waiting for EOS - stopping pipeline…
Execution ended after 0:00:03.831010113
Setting pipeline to PAUSED …
Setting pipeline to READY …
Setting pipeline to NULL …
Freeing pipeline …

Are you able to make the command work with a simple UVC webcam?

Regards,
Jérôme

You must add the UVC device node to the command, as below. Make sure your webcam is at /dev/videoX.

gst-launch-1.0 v4l2src device=/dev/video1 num-buffers=500 ! 'video/x-raw, width=(int)1920, height=(int)1080, format=(string)I420, framerate=(fraction)60/1' ! nvvidconv ! 'video/x-raw(memory:NVMM), format=(string)I420' ! omxh264enc ! qtmux ! filesink location=test.mp4 -v -e
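If it is not clear which node the camera enumerated as, the devices can be listed first (v4l2-ctl from the v4l-utils package):

v4l2-ctl --list-devices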

Yes, I had thought of that. Still the same problem.
Which device did you get this command working with?

Are you able to record 1080p60 to a file using this device?

Jérôme

Below is my device information. In my case I needed to change the width and height to 640x480 because this device doesn’t support 1920x1080; a command adapted along those lines is shown after the listing.

ubuntu@tegra-ubuntu:~$ v4l2-ctl -d /dev/video1 -D
Driver Info (not using libv4l2):
Driver name : uvcvideo
Card type : Microsoft LifeCam
Bus info : usb-tegra-xhci-3
Driver version: 3.10.96
Capabilities : 0x84000001
Video Capture
Streaming
Device Capabilities
Device Caps : 0x04000001
Video Capture
Streaming
ubuntu@tegra-ubuntu:~$ v4l2-ctl -d /dev/video1 --list-formats-ext
ioctl: VIDIOC_ENUM_FMT
Index : 0
Type : Video Capture
Pixel Format: 'YUYV'
Name : YUV 4:2:2 (YUYV)
Size: Discrete 640x480
Interval: Discrete 0.033s (30.000 fps)
Interval: Discrete 0.050s (20.000 fps)
Interval: Discrete 0.067s (15.000 fps)
Interval: Discrete 0.100s (10.000 fps)
Interval: Discrete 0.200s (5.000 fps)
Interval: Discrete 1.000s (1.000 fps)
Size: Discrete 352x288
Interval: Discrete 0.033s (30.000 fps)
Interval: Discrete 0.050s (20.000 fps)
Interval: Discrete 0.067s (15.000 fps)
Interval: Discrete 0.100s (10.000 fps)
Interval: Discrete 0.200s (5.000 fps)
Interval: Discrete 1.000s (1.000 fps)
Size: Discrete 320x240
Interval: Discrete 0.033s (30.000 fps)
Interval: Discrete 0.050s (20.000 fps)
Interval: Discrete 0.067s (15.000 fps)
Interval: Discrete 0.100s (10.000 fps)
Interval: Discrete 0.200s (5.000 fps)
Interval: Discrete 1.000s (1.000 fps)
Size: Discrete 176x144
Interval: Discrete 0.033s (30.000 fps)
Interval: Discrete 0.050s (20.000 fps)
Interval: Discrete 0.067s (15.000 fps)
Interval: Discrete 0.100s (10.000 fps)
Interval: Discrete 0.200s (5.000 fps)
Interval: Discrete 1.000s (1.000 fps)
Size: Discrete 160x120
Interval: Discrete 0.033s (30.000 fps)
Interval: Discrete 0.050s (20.000 fps)
Interval: Discrete 0.067s (15.000 fps)
Interval: Discrete 0.100s (10.000 fps)
Interval: Discrete 0.200s (5.000 fps)
Interval: Discrete 1.000s (1.000 fps)
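Adapted to this camera's capabilities (caps taken from the listing above; videoconvert is added here because the camera only lists YUYV, so treat this as a sketch of the adapted command):

gst-launch-1.0 -v -e v4l2src device=/dev/video1 num-buffers=300 ! 'video/x-raw, width=(int)640, height=(int)480, framerate=(fraction)30/1' ! videoconvert ! 'video/x-raw, format=(string)I420' ! nvvidconv ! 'video/x-raw(memory:NVMM), format=(string)I420' ! omxh264enc ! qtmux ! filesink location=test.mp4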