Sync IMX-219-83 with DeepStream

Hello, guys!

My question is about syncing two cameras (they are actually a single hardware device, the IMX-219-83 stereo module). I managed to do this with the threading lib in my Python app, but I haven't been able to get the same result with DeepStream, so please help me do this.
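For reference, my threading version looks roughly like this (the pipeline strings, class and variable names are only illustrative, not my exact code, and it assumes OpenCV built with GStreamer support):

import threading
import time
import cv2

def gst_pipeline(sensor_id):
    # nvarguscamerasrc capture pipeline for one CSI sensor (illustrative caps)
    return (
        f"nvarguscamerasrc sensor-id={sensor_id} ! "
        "video/x-raw(memory:NVMM), width=1280, height=720, framerate=60/1 ! "
        "nvvidconv ! video/x-raw, format=BGRx ! "
        "videoconvert ! video/x-raw, format=BGR ! appsink drop=true"
    )

class CameraThread(threading.Thread):
    def __init__(self, sensor_id):
        super().__init__(daemon=True)
        self.cap = cv2.VideoCapture(gst_pipeline(sensor_id), cv2.CAP_GSTREAMER)
        self.frame = None
        self.stamp = 0.0
        self.lock = threading.Lock()

    def run(self):
        while self.cap.isOpened():
            ok, frame = self.cap.read()
            if not ok:
                break
            with self.lock:
                # keep only the latest frame and the host time it was grabbed
                self.frame = frame
                self.stamp = time.monotonic()

left, right = CameraThread(0), CameraThread(1)
left.start()
right.start()

while True:
    time.sleep(0.5)
    with left.lock, right.lock:
        if left.frame is None or right.frame is None:
            continue
        skew_ms = abs(left.stamp - right.stamp) * 1000.0
    # a real app would pair left.frame/right.frame here for stereo matching
    print(f"host-side grab skew: {skew_ms:.1f} ms")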

Here is the config I'm using for this test:

[application]
enable-perf-measurement=1
perf-measurement-interval-sec=4

[tiled-display]
enable=1
rows=2
columns=1
width=1280
height=1440
gpu-id=0
nvbuf-memory-type=0

[source0]
enable=1
#Type - 1=CameraV4L2 2=URI 3=MultiURI 4=RTSP 5=CSI
type=5
camera-width=1280
camera-height=720
camera-fps-n=60
camera-fps-d=1
camera-csi-sensor-id=0
num-sources=1
gpu-id=0

[source1]
enable=1
#Type - 1=CameraV4L2 2=URI 3=MultiURI 4=RTSP 5=CSI
type=5
camera-width=1280
camera-height=720
camera-fps-n=60
camera-fps-d=1
camera-csi-sensor-id=1
num-sources=1
gpu-id=0

[streammux]
gpu-id=0
batch-size=2
batched-push-timeout=40000
#Set muxer output width and height
width=1280
height=1440

[sink0]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File
type=2
sync=1
source-id=0
gpu-id=0

[sink1]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File
type=2
sync=1
source-id=1
gpu-id=0

[osd]
enable=1
gpu-id=0
border-width=3
text-size=15
text-color=1;1;1;1;
text-bg-color=0.3;0.3;0.3;1
font=Arial

The result is not really synced. I need an exact match so I can compute a disparity map and use stereo algorithms.

How do you check that they are synced?

For now, just with my eyes. Well, the threading implementation is more synced… I want to know whether my config is wrong for getting the synchronization the way it should be done with the DeepStream SDK.
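If it helps, one way I could check more precisely than by eye is a pad probe after nvstreammux that compares the per-source buffer timestamps in each batch. This is only a minimal sketch, assuming a Python (pyds) pipeline rather than deepstream-app; the probe name and attachment point are just an illustration:

import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst
import pyds

def sync_check_probe(pad, info, u_data):
    # Read the batch meta attached by nvstreammux and collect each source's PTS
    gst_buffer = info.get_buffer()
    if not gst_buffer:
        return Gst.PadProbeReturn.OK
    batch_meta = pyds.gst_buffer_get_nvds_batch_meta(hash(gst_buffer))
    stamps = {}
    l_frame = batch_meta.frame_meta_list
    while l_frame is not None:
        try:
            frame_meta = pyds.NvDsFrameMeta.cast(l_frame.data)
        except StopIteration:
            break
        stamps[frame_meta.source_id] = frame_meta.buf_pts  # nanoseconds
        try:
            l_frame = l_frame.next
        except StopIteration:
            break
    if 0 in stamps and 1 in stamps:
        skew_ms = abs(stamps[0] - stamps[1]) / 1e6
        print(f"per-batch camera skew: {skew_ms:.2f} ms")
    return Gst.PadProbeReturn.OK

# attach to the muxer output, e.g.:
# streammux.get_static_pad("src").add_probe(Gst.PadProbeType.BUFFER, sync_check_probe, 0)

If the two sensors were really started in sync, the reported skew should stay close to zero across batches; with free-running sensors it will drift.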

How about checking with the vendor how they verify the synchronization for their design?

Well, maybe. Setting the IMX-219-83 aside - is my config the best solution for receiving synced frames from two CSI cameras connected to a Jetson? ShaneCCC, my question is not about the IMX, it's about the config for DeepStream.

Hi,
The case is supported in Argus. Please refer to the sample:

/usr/src/jetson_multimedia_api/argus/samples/syncSensor

It is not supported in the DeepStream SDK. If you would like to use the DeepStream SDK, we suggest checking with the camera vendor whether the cameras can be initialized in a synchronized mode.

Another possible solution is to use jetson_multimedia_api and implement syncSensor + TensorRT. You can refer to these two samples and integrate them:

/usr/src/jetson_multimedia_api/argus/samples/syncSensor
/usr/src/jetson_multimedia_api/samples/04_video_dec_trt
