FPS drop with multiple nvinfer and adding a queue between pgie and sgie

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU) GPU
• DeepStream Version 7.0
• JetPack Version (valid for Jetson only)
• TensorRT Version: as per the DeepStream 7.0 installation docs
• NVIDIA GPU Driver Version (valid for GPU only) 535
• Issue Type( questions, new requirements, bugs)
• How to reproduce the issue ? (This is for bugs. Including which sample app is using, the configuration files content, the command line used and other details for reproducing)
• Requirement details( This is for new requirement. Including the module name-for which plugin or for which sample application, the function description)

Hi, I want to add a queue between pgie and sgie, but it didn't work. I did it to increase the FPS in my pipeline.
But when I add multiple nvinfer elements to my pipeline I get a huge FPS drop. For example, I tried two YOLOv4 trained models from two …
When I use them separately they work correctly, and I can even add 3 cameras at 20 FPS (I didn't try more cameras until now).
But when I use them in a row (I mean using operate-on-gie) I get under 7 FPS on each camera. Do you have any idea about it?
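For reference, here is a minimal Python sketch of the pgie → queue → sgie linkage described above, using the GStreamer bindings. The element names and config file paths are hypothetical placeholders, the sources, streammux and sink are omitted, and the sgie config is assumed to set operate-on-gie-id to the pgie's gie-unique-id:

```python
# Sketch of the two-stage nvinfer linkage described in the question.
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)
pipeline = Gst.Pipeline.new("two-stage-inference")

pgie = Gst.ElementFactory.make("nvinfer", "primary-yolov4")
queue = Gst.ElementFactory.make("queue", "pgie-sgie-queue")
sgie = Gst.ElementFactory.make("nvinfer", "secondary-yolov4")

# Hypothetical config paths -- replace with your real nvinfer config files.
# The sgie config is assumed to contain operate-on-gie-id matching the pgie.
pgie.set_property("config-file-path", "pgie_yolov4_config.txt")
sgie.set_property("config-file-path", "sgie_yolov4_config.txt")

for elem in (pgie, queue, sgie):
    pipeline.add(elem)

# The queue only decouples the two elements into separate threads;
# it does not reduce the GPU work each nvinfer instance performs.
pgie.link(queue)
queue.link(sgie)
```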

Adding queues does not improve performance.

When you add multiple nvinfer instances (pgie + sgie, or more), more GPU resources are consumed. You can use nvidia-smi to monitor GPU usage and check whether the FPS drop is caused by the GPU being overloaded.

The number of objects detected by the pgie also affects GPU usage.
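As a rough sketch, assuming nvidia-smi is on PATH, you could sample GPU utilization while the pipeline runs to see whether the FPS drop coincides with the GPU being saturated:

```python
# Poll nvidia-smi while the DeepStream pipeline is running.
import subprocess
import time

def sample_gpu_utilization(interval_s: float = 1.0, samples: int = 30) -> None:
    for _ in range(samples):
        out = subprocess.run(
            ["nvidia-smi",
             "--query-gpu=utilization.gpu,memory.used",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
        print(f"gpu_util %, mem_used MiB: {out}")
        time.sleep(interval_s)

if __name__ == "__main__":
    sample_gpu_utilization()
```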

If you want to use the YOLO series with DeepStream, you can refer to this repository.

Thanks a lot, but I'm using DeepStream 7.0. Which TensorRT branch should I use? It doesn't have 8.6.2.3 as stated in the DeepStream 7.0 installation documents.

There is no branch for DS 7.0. Although we have only tested it against DS 7.1, the code is independent of the TensorRT version and should be easy to port to DS 7.0.


So can I use the TensorRT v8.6 branch with DeepStream?

You can give it a try; I think there will be no problem.
