How to sync up frames from two filesrc using nvstreammux

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU) AGX Xavier
• DeepStream Version 5.0
• JetPack Version (valid for Jetson only) 4.4

I have a DeepStream pipeline that feeds two file sources, muxes them with nvstreammux, and pipes the batch into nvinfer and then into a stereo plugin, as below:

gst-launch-1.0 \
  filesrc location=left.h264 ! h264parse ! nvv4l2decoder name=c102 \
  filesrc location=right.h264 ! h264parse ! nvv4l2decoder name=c104 \
  nvstreammux name=m batch-size=1 width=1920 height=1080 \
  c102. ! nvvideoconvert nvbuf-memory-type=3 ! m.sink_0 \
  c104. ! nvvideoconvert nvbuf-memory-type=3 ! m.sink_1 \
  m. ! nvinfer config-file-path=dstest1_pgie_config.txt ! nvvideoconvert ! \
  stereo ! nvmultistreamtiler width=2560 height=720 ! nvdsosd ! nveglglessink sync=true

The problem with the above pipeline is that there is no guarantee that frame 1 of left.h264 will arrive alongside frame 1 of right.h264 at the stereo plugin. Sometimes left.h264 is ahead of right.h264 by 4 frames, sometimes by 3. The two streams are almost never in sync, even though I added sync=true at the end of the pipeline.

Question: is there a way to guarantee that left.h264 and right.h264 stay in sync through nvstreammux, i.e. that either stream is ahead of the other by at most 1 frame? How? Please advise. Thanks a lot for your help.

No. They will never be in sync.

Two filesrc elements mean two separate sources. There is no mechanism in open-source GStreamer to sync these two videos. nvstreammux is a general-purpose plugin for inference preparation; it cannot know about the special relationship between these two videos.

Thank you for the information.

Then is there a way to use, for example, Python to read frames from both files and feed them to the pipeline? Or is there another API I should use instead of DeepStream to feed the left/right images to the nvinfer object detector and get the bounding-box results? Please advise. Thank you.

Do you mean you want to combine the two streams outside the pipeline and then input them as one stream to the DeepStream pipeline?

Yes, that could be one of the options:

  1. combine the left frame and the right frame into one frame and input it to the pipeline
  2. read a left frame and a right frame, then push the left frame into appsrc 1 and the right frame into appsrc 2, so that the app gates the file reads and buffer pushes to guarantee sync

For the software we have already developed using DeepStream, we prefer option 2.

For two appsrc plugins, they are still two separate streams. It is just the same case as with two filesrc plugins. Maybe you can try to attach the same timestamp to the frames which you think belong together in your appsrc implementation. But even this is not guaranteed.
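To illustrate the "attach the same timestamp" idea, below is a minimal sketch of the gating logic from option 2: read one left frame and one right frame per iteration and stamp both buffers with the same PTS before pushing. The `push_left`/`push_right` callbacks are hypothetical stand-ins for the actual `appsrc` `push-buffer` calls in your application; the frame rate is an assumption.

```python
# Hedged sketch of the gated dual-appsrc push (option 2 above).
# push_left / push_right are placeholders for the real appsrc pushes
# (e.g. appsrc.emit("push-buffer", gst_buffer) in a PyGObject app).

FPS = 30                              # assumed frame rate of both files
FRAME_DURATION_NS = 10**9 // FPS      # GStreamer timestamps are nanoseconds

def gated_push(left_frames, right_frames, push_left, push_right):
    """Push stereo frame pairs in lockstep, both with the same PTS."""
    for idx, (left, right) in enumerate(zip(left_frames, right_frames)):
        pts = idx * FRAME_DURATION_NS
        # Both halves of a stereo pair carry identical PTS and duration,
        # so a downstream plugin can match them by timestamp.
        push_left(left, pts, FRAME_DURATION_NS)
        push_right(right, pts, FRAME_DURATION_NS)

# Usage: record what would be pushed, to show the pairing.
pushed = []
gated_push(
    [b"L0", b"L1"], [b"R0", b"R1"],
    lambda data, pts, dur: pushed.append(("left", data, pts)),
    lambda data, pts, dur: pushed.append(("right", data, pts)),
)
```

Even with identical timestamps, as noted above, nvstreammux gives no hard guarantee that the two buffers land in the same batch; the gating only bounds how far the streams can drift at the source side.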