Use demux with multiple inputs and multiple output mp4 files

Please provide complete information as applicable to your setup.

• Hardware Platform: GPU (V100)
• DeepStream Version: 6.2-triton (Docker container)
• NVIDIA GPU Driver Version: 525.85.12
• Issue Type: questions
• How to reproduce the issue?

I am building a pipeline that takes 50 input videos, runs them through a detection model with nvinfer, and then saves each input video to its own MP4 file with OSD overlays. With my old pipeline, streammux → pgie → nvvidconv → queue → tiler → nvosd → nvvidconv1 → capsfilter → encoder → codeparser → sink, I found that 50 input videos produced only one output video. After some research I found that nvstreamdemux is needed for this, so I followed https://github.com/NVIDIA-AI-IOT/deepstream_python_apps/blob/master/apps/deepstream-demux-multi-in-multi-out/deepstream_demux_multi_in_multi_out.py. The pipeline now runs to the end of the videos, but no output files are saved and the following error appears:
nvstreammux: Successfully handled EOS for source_id=0
[Gst-queue0] gst-stream-error-quark: Internal data stream error. (1): gstqueue.c(988): gst_queue_handle_sink_event (): /GstPipeline:pipeline0/GstQueue:queue0:
streaming stopped, reason not-linked (-1)

Can you tell me why and how to fix it?

This is the current code I am using. Please let me know where it is wrong and how to fix it. Thank you very much.
main.py (6.5 KB)
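
For context, this is roughly how I attach each output branch to an nvstreamdemux pad, following the sample (a simplified sketch, not my exact main.py; “attach_branch” and “branch_head” are illustrative names):

import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

def attach_branch(demux, branch_head, index):
    """Request demux pad src_<index> and link it to the head of one output branch.

    `branch_head` is the queue that starts the per-stream osd/encode/save chain
    and is assumed to already be added to the pipeline.
    """
    # nvstreamdemux source pads are request pads named "src_%u"; they are
    # requested up front, before the pipeline starts. A pad that is never
    # requested and linked leaves its downstream queue dangling, which shows
    # up as "streaming stopped, reason not-linked".
    src_pad = demux.get_request_pad(f"src_{index}")
    sink_pad = branch_head.get_static_pad("sink")
    if src_pad is None or src_pad.link(sink_pad) != Gst.PadLinkReturn.OK:
        raise RuntimeError(f"Could not link demux src_{index} to its branch")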

You need to make your pipeline like this:

gst-launch-1.0 -e uridecodebin uri=file:///opt/nvidia/deepstream/deepstream/samples/streams/sample_1080p_h264.mp4 ! mux.sink_0 nvstreammux name=mux batch-size=2 width=1920 height=1080 ! nvinfer config-file-path=/opt/nvidia/deepstream/deepstream/samples/configs/deepstream-app/config_infer_primary.txt ! nvstreamdemux name=demux demux.src_0 ! queue ! nvdsosd ! nvvideoconvert ! 'video/x-raw,format=I420' ! x264enc ! h264parse ! splitmuxsink location=out_0.mp4 uridecodebin uri=file:///opt/nvidia/deepstream/deepstream/samples/streams/sample_1080p_h265.mp4 ! mux.sink_1 demux.src_1 ! queue ! nvdsosd ! nvvideoconvert ! 'video/x-raw,format=I420' ! x264enc ! h264parse ! splitmuxsink location=out_1.mp4

Please pay attention to the positions of “nvdsosd” and “capsfilter”.
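
If it is easier to test from Python first, the same pipeline (reduced to a single source here) can be handed to Gst.parse_launch. This is only a rough sketch; the batch size, sample file paths, and output location are placeholders to adapt:

import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst, GLib

Gst.init(None)

# Single-source variant of the suggested pipeline, driven from Python.
desc = (
    "uridecodebin uri=file:///opt/nvidia/deepstream/deepstream/samples/streams/sample_1080p_h264.mp4 ! mux.sink_0 "
    "nvstreammux name=mux batch-size=1 width=1920 height=1080 ! "
    "nvinfer config-file-path=/opt/nvidia/deepstream/deepstream/samples/configs/deepstream-app/config_infer_primary.txt ! "
    "nvstreamdemux name=demux "
    "demux.src_0 ! queue ! nvdsosd ! nvvideoconvert ! video/x-raw,format=I420 "
    "! x264enc ! h264parse ! splitmuxsink location=out_0.mp4"
)

pipeline = Gst.parse_launch(desc)
loop = GLib.MainLoop()
bus = pipeline.get_bus()
bus.add_signal_watch()
bus.connect("message::eos", lambda bus, msg: loop.quit())    # stop at end of stream
bus.connect("message::error", lambda bus, msg: loop.quit())  # or on the first error
pipeline.set_state(Gst.State.PLAYING)
loop.run()
pipeline.set_state(Gst.State.NULL)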

Thank you very much, I will try it.

I tried it, but it still gives an error.

Please use “export GST_DEBUG=3” to enable more detailed logs for debugging.

And please use “gst-inspect-1.0 x264enc” to check whether x264enc is available on your platform.
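
If it is more convenient, both checks can also be done from inside the Python app (a small sketch; note that GST_DEBUG must be set before Gst.init is called):

import os
os.environ["GST_DEBUG"] = "3"  # equivalent to `export GST_DEBUG=3`; must be set before Gst.init

import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

# Equivalent to `gst-inspect-1.0 x264enc`: find() returns None when the element is not installed.
if Gst.ElementFactory.find("x264enc") is None:
    print("x264enc is NOT available on this platform")
else:
    print("x264enc is available")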

I checked and confirmed that x264enc is available.

I also ran it with GST_DEBUG=3; here is the resulting log:
log.txt (28.5 KB)
I checked the log; it is almost all warnings, and the only error is:

ERROR: from element /GstPipeline:pipeline0/GstNvInfer:nvinfer0: Internal data stream error.
Additional debug information:
gstnvinfer.cpp(2369): gst_nvinfer_output_loop (): /GstPipeline:pipeline0/GstNvInfer:nvinfer0:
streaming stopped, reason not-negotiated (-4)

I don’t know why this happens or how to fix it. Please help me.

Does the pipeline I gave work on your machine?

It still fails even when I run the exact pipeline you provided:

gst-launch-1.0 -e uridecodebin uri=file:///opt/nvidia/deepstream/deepstream/samples/streams/sample_1080p_h264.mp4 ! mux.sink_0 nvstreammux name=mux batch-size=2 width=1920 height=1080 ! nvinfer config-file-path=/opt/nvidia/deepstream/deepstream/samples/configs/deepstream-app/config_infer_primary.txt ! nvstreamdemux name=demux demux.src_0 ! queue ! nvdsosd ! nvvideoconvert ! 'video/x-raw,format=I420' ! x264enc ! h264parse ! splitmuxsink location=out_0.mp4 uridecodebin uri=file:///opt/nvidia/deepstream/deepstream/samples/streams/sample_1080p_h265.mp4 ! mux.sink_1 demux.src_1 ! queue ! nvdsosd ! nvvideoconvert ! 'video/x-raw,format=I420' ! x264enc ! h264parse ! splitmuxsink location=out_1.mp4

This is the log file I recorded when running the above pipeline:
log.txt (27.4 KB)

This is a bug in DeepStream 6.2. Please upgrade to DeepStream 6.3 GA.

Thank you very much. I will try it.