Please provide complete information as applicable to your setup.
• Hardware Platform (Jetson / GPU): Both
• DeepStream Version: 7.0
• JetPack Version (valid for Jetson only): 6.0
• TensorRT Version:
• NVIDIA GPU Driver Version (valid for GPU only): 560.35.03
• Issue Type (questions, new requirements, bugs): Bug
• How to reproduce the issue? (This is for bugs. Include which sample app is used, the configuration file contents, the command line used, and other details for reproducing.)
I am currently developing a parallel-model analysis program based on the deepstream_parallel_inference_app code.
Using the metamux plugin to run the models in parallel and streaming the video over RTSP through the tiler works as expected. However, when the nvstreamdemux plugin is added to the pipeline after the metamux plugin, the pipeline transitions to the PLAYING state but no video is ever received.
Even when we modify the deepstream_parallel_inference_app code itself (outside of our program) by adding nvstreamdemux on the sink side and run it, the same issue occurs.
If nvstreamdemux is removed from that same pipeline, everything works correctly.
• Requirement details (This is for new requirements. Include the module name, i.e. for which plugin or which sample application, and the function description.)
What do you mean by "fails to receive any video"? How did you observe this?
As shown in the pipeline below, if you attach two fakesinks after nvstreamdemux, does the app run well?
nvstreamdemux -> fakesink1
              -> fakesink2
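For reference, the fakesink test sketched above can be wired up in code roughly as follows. This is a minimal C sketch under standard GStreamer/DeepStream conventions (nvstreamdemux exposes request pads named "src_%u", one per stream); the function name and the surrounding pipeline setup are illustrative, not taken from the sample app.

```c
#include <gst/gst.h>

/* Hypothetical helper: attach one fakesink to the nvstreamdemux request
 * pad for a given stream, as in the diagram above. Call once per stream. */
static void
attach_fakesink_branch (GstPipeline *pipeline, GstElement *demux, guint source_id)
{
  /* nvstreamdemux exposes request pads named "src_%u".
   * On GStreamer < 1.20 use gst_element_get_request_pad() instead. */
  gchar *pad_name = g_strdup_printf ("src_%u", source_id);
  GstPad *src_pad = gst_element_request_pad_simple (demux, pad_name);
  g_free (pad_name);

  GstElement *sink = gst_element_factory_make ("fakesink", NULL);
  /* sync=false avoids clock-related stalls while checking whether buffers flow. */
  g_object_set (sink, "sync", FALSE, NULL);
  gst_bin_add (GST_BIN (pipeline), sink);

  GstPad *sink_pad = gst_element_get_static_pad (sink, "sink");
  if (gst_pad_link (src_pad, sink_pad) != GST_PAD_LINK_OK)
    g_printerr ("Failed to link nvstreamdemux src_%u to fakesink\n", source_id);

  gst_object_unref (sink_pad);
  gst_object_unref (src_pad);
  gst_element_sync_state_with_parent (sink);
}
```

If buffers still do not reach the fakesinks, adding a pad probe on the demux src pads (or running with GST_DEBUG) helps confirm where the data stops.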
The "not receiving any video" symptom can be observed in the following logs. Following the suggested method, we connected fakesinks after nvstreamdemux and also used a tee to split off one branch for rendering via the OSD renderer, so we could verify visually. With nvstreamdemux in the pipeline, the renderer screen never appeared and the logs below were produced; with nvstreamdemux removed, it functioned correctly. The pipeline diagram is attached below.
Thanks for sharing! DeepStream 7.0 corresponds to driver R535; please install the matching component versions (please refer to this link). Also, can you reproduce this issue on the latest DeepStream 7.1?
The issue occurs identically on Jetson Orin NX and Jetson Orin Nano.
With JetPack 6.0 the DeepStream version is 7.0, and the same problem arises when running the above pipeline in that environment. While we can test DeepStream 7.1 in a server environment, updating the firmware on the Jetson boards is expected to take a significant amount of time. Could the development team verify this on their end?
Sorry for the late reply! Using "nvmetamux + nvstreamdemux + fakesink" in deepstream_parallel_inference_app, my app also can't run well. It is related to the complex usage of nvstreamdemux: as you know, there will be two nvstreamdemux elements in the pipeline.
We are investigating and will get back to you if there are any updates.
Please apply this fix, fix.diff (2.2 KB), to deepstream_parallel_inference_app, and then add your own logic.
After adding this fix, the app with the "metamux + nvstreamdemux + 4 fakesink" pipeline works well. Here are my test code, log, and pipeline graph. I used "./apps/deepstream-parallel-infer/deepstream-parallel-infer -c configs/apps/bodypose_yolo/source4_1080p_dec_parallel_infer.yml". deepstream_parallel_infer_app_demux.cpp (48.2 KB) log-116.txt (108.3 KB) demux+4fakesink.zip (1.1 MB)