Hi,
I’m trying to record a single MKV video from two UVC cameras attached to a Jetson Orin NX using the NVIDIA GStreamer plugins. Acquisition and encoding of the individual streams works perfectly with v4l2src, but as soon as I try to mux two different cameras into one container, I hit an unexpected symbol lookup error in libnvargus. Also, nvv4l2camerasrc does not support the input formats of my UVC cameras.
gst-launch-1.0 -e \
v4l2src device=/dev/video1 ! \
queue ! nvvidconv ! nvv4l2h264enc ! h264parse ! queue ! mux.video_0 \
v4l2src device=/dev/video3 ! \
queue ! nvvidconv ! nvv4l2h264enc ! h264parse ! queue ! mux.video_1 \
matroskamux name=mux ! filesink location=output.mkv
Setting pipeline to PAUSED ...
Opening in BLOCKING MODE
Opening in BLOCKING MODE
Opening in BLOCKING MODE
gst-launch-1.0: symbol lookup error: /usr/lib/aarch64-linux-gnu/nvidia/libnvargus.so: undefined symbol: jpeg_destroy_compress
However, these pipelines work individually (for either device):
gst-launch-1.0 -e \
v4l2src device=/dev/video{1,3} ! \
queue name=videoqueue1 ! nvvidconv ! nvv4l2h264enc ! h264parse ! queue ! matroskamux name=mux ! filesink location=output.mkv
If I use nvv4l2camerasrc, the acquisition starts, but none of my UVC cameras output UYVY (the only format the NVIDIA source accepts), so the video does not contain any valid data …
gst-launch-1.0 -e \
nvv4l2camerasrc device=/dev/video1 ! \
queue name=videoqueue1 ! nvvidconv ! nvv4l2h264enc ! h264parse ! queue ! mux.video_0 \
nvv4l2camerasrc device=/dev/video3 ! \
queue name=videoqueue2 ! nvvidconv ! nvv4l2h264enc ! h264parse ! queue ! mux.video_1 \
matroskamux name=mux ! filesink location=output.mkv
Setting pipeline to PAUSED ...
Opening in BLOCKING MODE
Opening in BLOCKING MODE
NvMMLiteOpen : Block : BlockType = 4
===== NvVideo: NVENC =====
NvMMLiteBlockCreate : Block : BlockType = 4
Pipeline is live and does not need PREROLL ...
Redistribute latency...
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
H264: Profile = 66 Level = 0
NVMEDIA: Need to set EMC bandwidth : 846000
NvVideo: bBlitMode is set to TRUE
Redistribute latency...
NvMMLiteOpen : Block : BlockType = 4
===== NvVideo: NVENC =====
NvMMLiteBlockCreate : Block : BlockType = 4
H264: Profile = 66 Level = 0
NVMEDIA: Need to set EMC bandwidth : 846000
NvVideo: bBlitMode is set to TRUE
Redistribute latency...
^Chandling interrupt.
Interrupt: Stopping pipeline ...
EOS on shutdown enabled -- Forcing EOS on the pipeline
Waiting for EOS...
Got EOS from element "pipeline0".
EOS received - stopping pipeline...
Execution ended after 0:00:01.850706139
Setting pipeline to NULL ...
Freeing pipeline ...
… and the NVIDIA source only supports UYVY …
Factory Details:
Rank primary (256)
Long-name NvV4l2CameraSrc
Klass Video/Capture
Description Nvidia V4l2 Camera Source
Author Ashwin Deshpande <ashwind@nvidia.com>
Plugin Details:
Name nvv4l2camerasrc
Description Nvidia v4l2 Source Component
Filename /usr/lib/aarch64-linux-gnu/gstreamer-1.0/libgstnvv4l2camerasrc.so
Version 1.14.5
License Proprietary
Source module nvv4l2camerasrc
Binary package NvV4l2CameraSrc
Origin URL http://nvidia.com/
GObject
+----GInitiallyUnowned
+----GstObject
+----GstElement
+----GstBaseSrc
+----GstNvV4l2CameraSrc
Pad Templates:
SRC template: 'src'
Availability: Always
Capabilities:
video/x-raw(memory:NVMM)
width: [ 1, 2147483647 ]
height: [ 1, 2147483647 ]
format: { (string)UYVY }
interlace-mode: { (string)progressive, (string)interlaced }
framerate: [ 0/1, 2147483647/1 ]
… and my cameras only offer other pixel formats at the source …
v4l2-ctl -d /dev/video1 --all
...
Format Video Capture:
Width/Height : 640/512
Pixel Format : 'NV12' (Y/CbCr 4:2:0)
Field : None
Bytes per Line : 640
Size Image : 491520
Colorspace : Default
Transfer Function : Default (maps to Rec. 709)
YCbCr/HSV Encoding: Default (maps to ITU-R 601)
Quantization : Default (maps to Limited Range)
Flags :
...
and
v4l2-ctl -d /dev/video3 --all
...
Format Video Capture:
Width/Height : 640/480
Pixel Format : 'YUYV' (YUYV 4:2:2)
Field : None
Bytes per Line : 1280
Size Image : 614400
Colorspace : sRGB
Transfer Function : Rec. 709
YCbCr/HSV Encoding: ITU-R 601
Quantization : Default (maps to Limited Range)
Flags :
...
We are using the following JetPack release on a Jetson Orin NX 8 GB:
cat /etc/nv_tegra_release
# R36 (release), REVISION: 4.0, GCID: 37537400, BOARD: generic, EABI: aarch64, DATE: Fri Sep 13 04:36:44 UTC 2024
# KERNEL_VARIANT: oot
TARGET_USERSPACE_LIB_DIR=nvidia
TARGET_USERSPACE_LIB_DIR_PATH=usr/lib/aarch64-linux-gnu/nvidia
uname -a
Linux labforge-orinnx 5.15.148-tegra #125 SMP PREEMPT Tue Nov 19 09:23:11 EST 2024 aarch64 aarch64 aarch64 GNU/Linux
How can I acquire two streams into a single container in the same process? I need to synchronize other sensor inputs with the video feed, so I’d rather not have to synchronize two independent GStreamer processes. This eventually has to go into a Python DeepStream application.
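For context, this is roughly how I plan to assemble the pipeline in the Python application. It is only a sketch: the explicit caps strings and the format names are my assumptions based on the v4l2-ctl output below (video1 = NV12 640x512, video3 = YUYV 640x480; GStreamer names the YUYV fourcc "YUY2"), and the resulting string would be handed to Gst.parse_launch.

```python
# Sketch of the pipeline description for the Python/DeepStream application.
# Device paths and caps are assumptions taken from the v4l2-ctl output.

def camera_branch(device, fmt, width, height, pad):
    # Pin the source caps so v4l2src cannot negotiate a format that the
    # downstream NVIDIA elements reject.
    return (
        f"v4l2src device={device} ! "
        f"video/x-raw,format={fmt},width={width},height={height} ! "
        f"queue ! nvvidconv ! nvv4l2h264enc ! h264parse ! queue ! mux.{pad}"
    )

description = " ".join([
    camera_branch("/dev/video1", "NV12", 640, 512, "video_0"),
    camera_branch("/dev/video3", "YUY2", 640, 480, "video_1"),
    "matroskamux name=mux ! filesink location=output.mkv",
])
print(description)
```

In the application this string would go through `Gst.parse_launch(description)` after `Gst.init(None)`; I expect it to fail the same way as the gst-launch-1.0 invocation above until the libnvargus issue is resolved.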
It is also unclear to me why libargus gets involved at all when streaming from UVC cameras. All of its dependencies appear to be installed:
ldd /usr/lib/aarch64-linux-gnu/nvidia/libnvargus.so
linux-vdso.so.1 (0x0000ffffa4b98000)
libEGL.so.1 => /lib/aarch64-linux-gnu/libEGL.so.1 (0x0000ffffa49b0000)
libnvvic.so => /usr/lib/aarch64-linux-gnu/nvidia/libnvvic.so (0x0000ffffa4980000)
libnvscf.so => /usr/lib/aarch64-linux-gnu/nvidia/libnvscf.so (0x0000ffffa41f0000)
libnvcamerautils.so => /usr/lib/aarch64-linux-gnu/nvidia/libnvcamerautils.so (0x0000ffffa41b0000)
libnvrm_mem.so => /usr/lib/aarch64-linux-gnu/nvidia/libnvrm_mem.so (0x0000ffffa4190000)
libnvrm_sync.so => /usr/lib/aarch64-linux-gnu/nvidia/libnvrm_sync.so (0x0000ffffa4170000)
libnvrm_host1x.so => /usr/lib/aarch64-linux-gnu/nvidia/libnvrm_host1x.so (0x0000ffffa4140000)
libnvrm_surface.so => /usr/lib/aarch64-linux-gnu/nvidia/libnvrm_surface.so (0x0000ffffa4100000)
libnvos.so => /usr/lib/aarch64-linux-gnu/nvidia/libnvos.so (0x0000ffffa40d0000)
libnvbufsurface.so.1.0.0 => /usr/lib/aarch64-linux-gnu/nvidia/libnvbufsurface.so.1.0.0 (0x0000ffffa4000000)
libnvcameratools.so => /usr/lib/aarch64-linux-gnu/nvidia/libnvcameratools.so (0x0000ffffa3f60000)
libnvjpeg.so => /usr/local/cuda/targets/aarch64-linux/lib/libnvjpeg.so (0x0000ffffa3b70000)
libstdc++.so.6 => /lib/aarch64-linux-gnu/libstdc++.so.6 (0x0000ffffa3940000)
libm.so.6 => /lib/aarch64-linux-gnu/libm.so.6 (0x0000ffffa38a0000)
libc.so.6 => /lib/aarch64-linux-gnu/libc.so.6 (0x0000ffffa36f0000)
/lib/ld-linux-aarch64.so.1 (0x0000ffffa4b5f000)
libGLdispatch.so.0 => /lib/aarch64-linux-gnu/libGLdispatch.so.0 (0x0000ffffa3560000)
libnvrm_stream.so => /usr/lib/aarch64-linux-gnu/nvidia/libnvrm_stream.so (0x0000ffffa3540000)
libnvsocsys.so => /usr/lib/aarch64-linux-gnu/nvidia/libnvsocsys.so (0x0000ffffa3520000)
libnvcolorutil.so => /usr/lib/aarch64-linux-gnu/nvidia/libnvcolorutil.so (0x0000ffffa34f0000)
libnvmedia_isp_ext.so => /usr/lib/aarch64-linux-gnu/nvidia/libnvmedia_isp_ext.so (0x0000ffffa34b0000)
libnvisppg.so => /usr/lib/aarch64-linux-gnu/nvidia/libnvisppg.so (0x0000ffffa33a0000)
libcuda.so.1 => /usr/lib/aarch64-linux-gnu/nvidia/libcuda.so.1 (0x0000ffffa0b30000)
libnvcamlog.so => /usr/lib/aarch64-linux-gnu/nvidia/libnvcamlog.so (0x0000ffffa0b10000)
libnvfnetstoredefog.so => /usr/lib/aarch64-linux-gnu/nvidia/libnvfnetstoredefog.so (0x0000ffffa0ac0000)
libnvfnet.so => /usr/lib/aarch64-linux-gnu/nvidia/libnvfnet.so (0x0000ffffa0a00000)
libnvfnetstorehdfx.so => /usr/lib/aarch64-linux-gnu/nvidia/libnvfnetstorehdfx.so (0x0000ffffa09d0000)
libnvrm_chip.so => /usr/lib/aarch64-linux-gnu/nvidia/libnvrm_chip.so (0x0000ffffa09b0000)
libnvodm_imager.so => /usr/lib/aarch64-linux-gnu/nvidia/libnvodm_imager.so (0x0000ffffa0450000)
libnvcapture.so => /usr/lib/aarch64-linux-gnu/nvidia/libnvcapture.so (0x0000ffffa03e0000)
libGLESv2.so.2 => /lib/aarch64-linux-gnu/libGLESv2.so.2 (0x0000ffffa03a0000)
libnvmm_utils.so => /usr/lib/aarch64-linux-gnu/nvidia/libnvmm_utils.so (0x0000ffffa0370000)
libnvtvmr.so => /usr/lib/aarch64-linux-gnu/nvidia/libnvtvmr.so (0x0000ffffa0270000)
libnvphs.so => /usr/lib/aarch64-linux-gnu/nvidia/libnvphs.so (0x0000ffffa0240000)
libnvfusacap.so => /usr/lib/aarch64-linux-gnu/nvidia/libnvfusacap.so (0x0000ffffa01f0000)
libgcc_s.so.1 => /lib/aarch64-linux-gnu/libgcc_s.so.1 (0x0000ffffa01c0000)
libnvsciipc.so => /usr/lib/aarch64-linux-gnu/nvidia/libnvsciipc.so (0x0000ffffa0180000)
libnvbuf_fdmap.so.1.0.0 => /usr/lib/aarch64-linux-gnu/nvidia/libnvbuf_fdmap.so.1.0.0 (0x0000ffffa0160000)
libnvrm_gpu.so => /usr/lib/aarch64-linux-gnu/nvidia/libnvrm_gpu.so (0x0000ffffa00e0000)
libnvcam_imageencoder.so => /usr/lib/aarch64-linux-gnu/nvidia/libnvcam_imageencoder.so (0x0000ffffa00b0000)
librt.so.1 => /lib/aarch64-linux-gnu/librt.so.1 (0x0000ffffa0090000)
libpthread.so.0 => /lib/aarch64-linux-gnu/libpthread.so.0 (0x0000ffffa0070000)
libdl.so.2 => /lib/aarch64-linux-gnu/libdl.so.2 (0x0000ffffa0050000)
libnvisp.so => /usr/lib/aarch64-linux-gnu/nvidia/libnvisp.so (0x0000ffff9ff60000)
libexpat.so.1 => /lib/aarch64-linux-gnu/libexpat.so.1 (0x0000ffff9ff20000)
libnvcamv4l2.so => /usr/lib/aarch64-linux-gnu/nvidia/libnvcamv4l2.so (0x0000ffff9fef0000)
libnvtegrahv.so => /usr/lib/aarch64-linux-gnu/nvidia/libnvtegrahv.so (0x0000ffff9fed0000)
… including the dependencies of the JPEG library itself:
ldd /usr/local/cuda/targets/aarch64-linux/lib/libnvjpeg.so
linux-vdso.so.1 (0x0000ffff9acf2000)
librt.so.1 => /lib/aarch64-linux-gnu/librt.so.1 (0x0000ffff9a880000)
libpthread.so.0 => /lib/aarch64-linux-gnu/libpthread.so.0 (0x0000ffff9a860000)
libdl.so.2 => /lib/aarch64-linux-gnu/libdl.so.2 (0x0000ffff9a840000)
libm.so.6 => /lib/aarch64-linux-gnu/libm.so.6 (0x0000ffff9a7a0000)
libgcc_s.so.1 => /lib/aarch64-linux-gnu/libgcc_s.so.1 (0x0000ffff9a770000)
libc.so.6 => /lib/aarch64-linux-gnu/libc.so.6 (0x0000ffff9a5c0000)
/lib/ld-linux-aarch64.so.1 (0x0000ffff9acb9000)
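To narrow down where the missing symbol should come from, I used a small probe. This is a sketch under an assumption: jpeg_destroy_compress is a classic libjpeg API symbol, and my guess is that the CUDA libnvjpeg that libnvargus resolves against simply does not export it, which would explain the lookup error even though ldd reports no missing libraries. The libjpeg path below is hypothetical and may differ on your system.

```python
import ctypes

# Probe whether a shared object exports a given symbol.
def exports(lib_path, symbol):
    try:
        lib = ctypes.CDLL(lib_path)
    except OSError:
        return None  # library not loadable on this machine
    # CDLL attribute lookup performs a dlsym(); missing symbols raise
    # AttributeError, which hasattr() turns into False.
    return hasattr(lib, symbol)

for path in (
    "/usr/local/cuda/targets/aarch64-linux/lib/libnvjpeg.so",
    "/usr/lib/aarch64-linux-gnu/libjpeg.so.8",  # assumed libjpeg location
):
    print(path, "->", exports(path, "jpeg_destroy_compress"))
```

If the CUDA libnvjpeg prints False while the system libjpeg prints True, that would support the theory that libnvargus is being linked against the wrong JPEG library.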