Please provide complete information as applicable to your setup.
• Hardware Platform (Jetson / GPU) Jetson AGX Orin
• DeepStream Version 6.2
• JetPack Version (valid for Jetson only) 5.1
• TensorRT Version 8.5.2 (the forum scraper garbled this field into an IP-like number; 8.5.2 is the TensorRT release bundled with JetPack 5.1)
• NVIDIA GPU Driver Version (valid for GPU only)
• Issue Type( questions, new requirements, bugs)
• How to reproduce the issue ? (This is for bugs. Including which sample app is using, the configuration files content, the command line used and other details for reproducing)
• Requirement details( This is for new requirement. Including the module name-for which plugin or for which sample application, the function description)
I am currently working on a parallel DeepStream pipeline, based on the following repo: https://github.com/NVIDIA-AI-IOT/deepstream_parallel_inference_app
My pipeline is like the picture below.
When I save the output videos, say output_1 and output_2, an issue arises:
the YOLOv4 bounding boxes appear in the second video, output_2, together with the Bodypose2D bboxes.
Is it not possible to encode two separate videos in the same pipeline?
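For reference, here is a rough text sketch of the topology I am aiming for (the original post had this as a picture; the exact per-branch elements are an assumption based on the parallel-inference sample):

```
nvstreammux → tee ─┬─ queue → pgie_0 (YOLOv4)     → nvstreamdemux → nvvideoconvert → nvdsosd → encoder → filesink (output_1)
                   └─ queue → pgie_1 (Bodypose2D) → nvstreamdemux → nvvideoconvert → nvdsosd → encoder → filesink (output_2)
```

The intent is that each branch after the tee draws only its own model's metadata before encoding.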
The configurations decide which model's output goes to which stream. Can you post all your configurations?
Thanks for the reply Fiona.
Here are my config files.
## 0=FP32, 1=INT8, 2=FP16 mode
Well, I built the pipeline with the code from the following link, so those two are the config files I use.
Does nvstreamdemux have to be attached in the pipeline?
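For context, a minimal sketch of the relevant part of an nvinfer config file (the values here are placeholders, not my actual settings; the key point is that each model instance carries its own gie-unique-id so its metadata can be told apart downstream):

```
[property]
gpu-id=0
## 0=FP32, 1=INT8, 2=FP16 mode
network-mode=2
# each nvinfer instance needs a distinct id;
# downstream elements use it to pick out that model's metadata
gie-unique-id=1
```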
Yes. The nvstreamdemux should be in the pipeline, and you need to implement the logic for removing the corresponding sink branch when a source is removed.
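A pseudocode sketch of the branch removal described above (standard GStreamer dynamic-pipeline steps; the branch contents are an assumption based on the parallel-inference sample):

```
# when source N is removed:
1. block the nvstreamdemux src pad "src_N" with an idle/block pad probe
2. inside the probe callback:
   - set the branch elements (queue → nvvideoconvert → encoder → filesink) to NULL state
   - unlink the demux pad from the branch and remove the elements from the pipeline
3. release the demux request pad
```

Blocking the pad first ensures no buffer is in flight through the branch while it is being torn down.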
Okay, I’ll try and share the result :) Thanks a lot!
I have one more related issue; it may be a simple question about queue.
Before adding nvstreamdemux–nvstreammux after the tee (i.e., between tee and pgie), I added a queue there, just like in the picture below,
and the overlay issue (the output of pgie_0 also showing on the output of pgie_1) no longer happens on every frame.
What is the reason for this? Just adding a queue made the branches more independent.
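One likely factor (this is an interpretation, worth confirming against the GStreamer docs): each queue element creates its own streaming thread, so after the tee the two branches no longer run in one shared thread:

```
tee ─┬─ (streaming thread A) queue → pgie_0 → ...
     └─ (streaming thread B) queue → pgie_1 → ...
```

Note that a queue only decouples timing; both branches still see the same batch metadata attached to the shared buffers, which may be why the overlap becomes intermittent rather than disappearing entirely.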
Hi @young2theMax ,
Do you still need support for this topic? Or should we close it? Thanks.
This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.