GStreamer or libargus to merge two camera sources


I’m trying to merge two camera sources into one GStreamer pipeline, but can’t find any guidance.


Any good examples?

I need the performance of 2x 4K@30fps.

hello Fredde,

you may use the nvarguscamerasrc plugin for each camera source, and you may use tee to split the data into multiple pads for different usages.
for example, here’s a pipeline to preview a single camera stream while also performing video recording.

$ gst-launch-1.0 -e nvarguscamerasrc num-buffers=300 ! 'video/x-raw(memory:NVMM), width=2952, height=1944, format=NV12, framerate=30/1' ! tee name=streams streams. ! queue ! nvv4l2h265enc bitrate=8000000 ! h265parse ! qtmux ! filesink location=video0.mp4 streams. ! queue ! nvoverlaysink
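if the goal is simply to merge two (unsynchronized) camera streams into one pipeline, a sketch using two nvarguscamerasrc instances and the nvcompositor plugin might look like the following. the sensor-id values, resolutions, and the nvcompositor pad properties (sink_0::xpos and so on) are assumptions that depend on your board and JetPack version:

```shell
# sketch: composite two CSI cameras side by side in one pipeline
# (sensor-id 0/1 and the scaled 1920x1080 tiles are assumptions)
gst-launch-1.0 \
  nvcompositor name=comp \
    sink_0::xpos=0    sink_0::width=1920 sink_0::height=1080 \
    sink_1::xpos=1920 sink_1::width=1920 sink_1::height=1080 \
  ! 'video/x-raw(memory:NVMM)' ! nvoverlaysink \
  nvarguscamerasrc sensor-id=0 \
  ! 'video/x-raw(memory:NVMM), width=3840, height=2160, framerate=30/1' ! comp.sink_0 \
  nvarguscamerasrc sensor-id=1 \
  ! 'video/x-raw(memory:NVMM), width=3840, height=2160, framerate=30/1' ! comp.sink_1
```

note the two sources are not synchronized frame-by-frame here; for synchronized dual capture in a single request, the Argus approach below is needed.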

please refer to the Argus sample below if you would like to enable dual cameras in a single capture request.

How can I use the /syncSensor example for feeding dual cameras into GStreamer?

hello Fredde,

please refer to the Camera Architecture Stack.
the syncSensor example is a libargus application, implemented directly on top of libargus; a GStreamer pipeline uses the nvarguscamerasrc plugin to access the camera stack.

you may look into the syncSensor sample code; it uses a single captureSession to create a capture request for dual camera devices.
AFAIK, the nvarguscamerasrc plugin in a GStreamer pipeline only supports a single camera source at a time.
please base your implementation on the Argus samples.
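as a rough sketch, the syncSensor sample can typically be built from the Multimedia API sources shipped with JetPack; the paths and target names below are assumptions that match a typical JetPack install and may differ on your release:

```shell
# sketch: build and run the syncSensor Argus sample
# (source path and binary name are assumptions; check your JetPack release)
cd /usr/src/jetson_multimedia_api/argus
mkdir -p build && cd build
cmake ..
make syncSensor
# the sample binary is built under the samples directory
./samples/syncSensor/argus_syncsensor
```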

So, if I choose to write an application like syncSensor, I will need to write the “recording” and “RTMP streaming” parts into the application, right?

As I’m new to this community, I’m still surprised that there are no “good examples” of feeding multiple sensors/cameras into GStreamer pipelines.

hello Fredde,

it depends on your use case. as I mentioned, please check the syncSensor sample code if you would like synchronized captures.
there are also some Multimedia API Sample Applications you may refer to.
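for the recording plus RTMP-streaming use case mentioned above, one possible single-camera GStreamer sketch looks like this; the RTMP URL is a placeholder, and note that RTMP generally carries H.264 rather than H.265:

```shell
# sketch: record to MP4 and stream to an RTMP server from one camera
# rtmp://example.com/live/stream is a placeholder URL, not a real endpoint
gst-launch-1.0 -e nvarguscamerasrc sensor-id=0 \
  ! 'video/x-raw(memory:NVMM), width=3840, height=2160, framerate=30/1' \
  ! nvv4l2h264enc bitrate=8000000 ! h264parse ! tee name=t \
  t. ! queue ! qtmux ! filesink location=video0.mp4 \
  t. ! queue ! flvmux streamable=true \
     ! rtmpsink location='rtmp://example.com/live/stream'
```

in an Argus-based application, the equivalent would be feeding the captured buffers into an encoder and muxer yourself, which is what the “recording” and “RTMP streaming” parts of the application would implement.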