Stereo camera frame offset in GStreamer pipeline

My setup

• Jetson AGX Xavier
• DeepStream Version 6.3
• JetPack Version 5.1.2
• CUDA Version 11.4

Issue

Hey,
I am running a dual camera setup. I want to acquire synchronised frames (the cameras have an external trigger); however, when I run the pipeline the frames themselves are synchronised but the file indexes are offset (for example left7.jpg corresponds to right10.jpg). The pipeline I used to acquire images:

gst-launch-1.0 \
    multiqueue max-size-buffers=1 name=mqueue \
    nvarguscamerasrc sensor-id=0 sensor-mode=0 ! 'video/x-raw(memory:NVMM), framerate=60/1' ! mqueue.sink_1 \
    nvarguscamerasrc sensor-id=1 sensor-mode=0 ! 'video/x-raw(memory:NVMM), framerate=60/1' ! mqueue.sink_2 \
    mqueue.src_1 ! nvvidconv ! 'video/x-raw' ! nvjpegenc ! multifilesink location=left%d.jpg \
    mqueue.src_2 ! nvvidconv ! 'video/x-raw' ! nvjpegenc ! multifilesink location=right%d.jpg

I also tried a pipeline with nvstreammux, shown below, but the same issue occurs.

gst-launch-1.0 \
    nvarguscamerasrc sensor-id=0 sensor-mode=0 ! 'video/x-raw(memory:NVMM), framerate=60/1' ! mux.sink_1 \
    nvarguscamerasrc sensor-id=1 sensor-mode=0 ! 'video/x-raw(memory:NVMM), framerate=60/1' ! mux.sink_2 \
    nvstreammux name=mux batch-size=2 width=1920 height=1080 ! \
    nvmultistreamtiler rows=1 columns=2 width=3840 height=1080 ! nvvidconv ! 'video/x-raw' ! nvjpegenc ! multifilesink location=frame%d.jpg

I want to eliminate the offset so that the file indexes are identical. My assumption is that the issue is related to the initialisation of the data streams in the nvarguscamerasrc plugin. If that is the case, is there a way to make the pipeline wait for both cameras to start acquisition?
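To quantify the offset, the PTS of every buffer on each branch can be logged with a pad probe. Below is a minimal sketch of such a logger (the leftsink/rightsink element names are placeholders I added, not part of the pipelines above, and error handling is mostly omitted):

#include <gst/gst.h>

// Print the PTS of every buffer that passes the probed pad, tagged with
// the branch name, so left/right timestamps can be compared offline.
static GstPadProbeReturn
print_pts (GstPad *pad, GstPadProbeInfo *info, gpointer user_data)
{
  GstBuffer *buf = GST_PAD_PROBE_INFO_BUFFER (info);
  g_print ("%s PTS: %" GST_TIME_FORMAT "\n",
           (const gchar *) user_data, GST_TIME_ARGS (GST_BUFFER_PTS (buf)));
  return GST_PAD_PROBE_OK;
}

int
main (int argc, char *argv[])
{
  gst_init (&argc, &argv);

  // Two independent branches, as in the first pipeline above; the sinks
  // are named so their pads can be found and probed.
  GError *error = NULL;
  GstElement *pipeline = gst_parse_launch (
      "nvarguscamerasrc sensor-id=0 sensor-mode=0 ! "
      "video/x-raw(memory:NVMM),framerate=60/1 ! nvvidconv ! video/x-raw ! "
      "nvjpegenc ! multifilesink name=leftsink location=left%d.jpg "
      "nvarguscamerasrc sensor-id=1 sensor-mode=0 ! "
      "video/x-raw(memory:NVMM),framerate=60/1 ! nvvidconv ! video/x-raw ! "
      "nvjpegenc ! multifilesink name=rightsink location=right%d.jpg",
      &error);
  if (pipeline == NULL) {
    g_printerr ("Failed to build pipeline: %s\n", error->message);
    return 1;
  }

  const gchar *sinks[] = { "leftsink", "rightsink" };
  const gchar *tags[] = { "left", "right" };
  for (int i = 0; i < 2; i++) {
    GstElement *sink = gst_bin_get_by_name (GST_BIN (pipeline), sinks[i]);
    GstPad *pad = gst_element_get_static_pad (sink, "sink");
    gst_pad_add_probe (pad, GST_PAD_PROBE_TYPE_BUFFER, print_pts,
                       (gpointer) tags[i], NULL);
    gst_object_unref (pad);
    gst_object_unref (sink);
  }

  gst_element_set_state (pipeline, GST_STATE_PLAYING);
  GMainLoop *loop = g_main_loop_new (NULL, FALSE);
  g_main_loop_run (loop);
  return 0;
}

If the printed timestamps differ by a constant number of frame periods, the offset is a start-up artefact rather than drift.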

You should have both left/right cameras synchronised for correct results. You may enable the hardware sync pin to sync both sensor frames.
However, there is also a software approach: please check the syncSensor sample.
That example uses multiple sensors per single capture session; it duplicates a single capture request to the dual cameras, so the capture results should be close enough.
In addition, you may also check getSensorTimestamp() to enable the timestamp comparison.
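For reference, the core structure of that sample looks roughly like this (a trimmed sketch of the syncSensor sample from jetson_multimedia_api; error handling, stream configuration and the EGLStream frame consumers are omitted):

#include <Argus/Argus.h>
#include <vector>

using namespace Argus;

// Trimmed sketch of the syncSensor structure: one capture session drives
// both sensors, so a single request yields a left/right pair.
int
main ()
{
  UniqueObj<CameraProvider> provider (CameraProvider::create ());
  ICameraProvider *iProvider = interface_cast<ICameraProvider> (provider);

  std::vector<CameraDevice *> devices;
  iProvider->getCameraDevices (&devices);
  if (devices.size () < 2)
    return 1;

  // One session for BOTH devices -- this is what pairs the captures.
  std::vector<CameraDevice *> stereo (devices.begin (), devices.begin () + 2);
  UniqueObj<CaptureSession> session (iProvider->createCaptureSession (stereo));
  ICaptureSession *iSession = interface_cast<ICaptureSession> (session);

  // One output stream per sensor, both enabled on the same request.
  UniqueObj<OutputStreamSettings> settings (
      iSession->createOutputStreamSettings (STREAM_TYPE_EGL));
  IOutputStreamSettings *iSettings =
      interface_cast<IOutputStreamSettings> (settings);

  iSettings->setCameraDevice (stereo[0]);
  UniqueObj<OutputStream> left (iSession->createOutputStream (settings.get ()));
  iSettings->setCameraDevice (stereo[1]);
  UniqueObj<OutputStream> right (iSession->createOutputStream (settings.get ()));

  UniqueObj<Request> request (iSession->createRequest ());
  IRequest *iRequest = interface_cast<IRequest> (request);
  iRequest->enableOutputStream (left.get ());
  iRequest->enableOutputStream (right.get ());

  // Every repeated request captures one frame from each sensor.
  iSession->repeat (request.get ());
  // ... attach an EGLStream::FrameConsumer to each stream and process pairs ...
  return 0;
}

Because both sensors belong to one capture session, every submitted request produces one frame from each camera, which keeps the results paired.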
Thanks

Hi ShaneCCC,
Thank you for the response. I already have the cameras synchronised using the external trigger (I am driving their synchronisation pins from an external board) and I would rather not use software synchronisation.

I have tried the example code in syncSensor. This approach generates frames without an offset, and the sensor timestamps from getSensorTimestamp() are identical for both cameras, as you would expect given the external trigger.
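For completeness, the timestamp comparison follows the pattern of the sample; roughly like this (a sketch, assuming leftCons/rightCons are the FrameConsumers attached to the two output streams of the shared session, with error handling omitted):

#include <cstdio>
#include <Argus/Argus.h>
#include <EGLStream/EGLStream.h>

using namespace Argus;

// Acquire one frame from each consumer and compare the sensor timestamps.
static void
compareTimestamps (EGLStream::IFrameConsumer *leftCons,
                   EGLStream::IFrameConsumer *rightCons)
{
  UniqueObj<EGLStream::Frame> leftFrame (leftCons->acquireFrame ());
  UniqueObj<EGLStream::Frame> rightFrame (rightCons->acquireFrame ());

  // The capture metadata travels with each frame on the EGL stream.
  ICaptureMetadata *leftMeta = interface_cast<ICaptureMetadata> (
      interface_cast<EGLStream::IArgusCaptureMetadata> (leftFrame)->getMetadata ());
  ICaptureMetadata *rightMeta = interface_cast<ICaptureMetadata> (
      interface_cast<EGLStream::IArgusCaptureMetadata> (rightFrame)->getMetadata ());

  long long delta = (long long) leftMeta->getSensorTimestamp ()
                  - (long long) rightMeta->getSensorTimestamp ();
  printf ("left %llu ns  right %llu ns  delta %lld ns\n",
          (unsigned long long) leftMeta->getSensorTimestamp (),
          (unsigned long long) rightMeta->getSensorTimestamp (), delta);
}

With the external trigger driving both sensors, the delta comes out as zero for every pair.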

But I’m still having the issue with the GStreamer pipeline. As I mentioned above, I can find two corresponding frames; however, the indexes resulting from my pipeline are offset. Is there a way to configure nvarguscamerasrc to work similarly to the syncSensor example, or do I have to create a custom GStreamer plugin?


I don’t think nvarguscamerasrc is able to do that like syncSensor with the current implementation.

Please consider implementing your app like syncSensor.

Thanks
