Running multiple primary inference models in parallel

Hi,
I have tried running YOLOv5 and SSD nvinfer models in two different pipelines with the same IP camera input source, and I am seeing an FPS drop compared to running each model individually.
Is there any way to run two primary inference models in parallel on the same input source without any FPS drop?

Hi,

Do you use Deepstream?
If yes, could you first check the GPU utilization with tegrastats to see whether any resources remain?

Thanks.

Yes, DeepStream is used. It's 2K utilised; I don't need to run more… I need to run these 2 in parallel with DeepStream without an FPS drop.

Hi,

It sounds like there is some room for parallelism.

Could you share a complete source/script with us so we can check it directly?

Thanks.

I can't share the script because of confidentiality matters, but it's the deepstream-test3 code with two pipelines: one with nvinfer using a YOLO model and the other with nvinfer using an SSD model as the primary inference engine.
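For reference, each pipeline is roughly equivalent to the gst-launch sketch below. This is only a minimal illustration of the described setup, not the actual script; the camera URI, config file names, and resolution are placeholders:

```
# Pipeline 1: YOLO model as primary inference (placeholder URI and config path)
gst-launch-1.0 rtspsrc location=rtsp://<camera-ip>/stream ! rtph264depay ! h264parse ! \
  nvv4l2decoder ! m.sink_0 nvstreammux name=m batch-size=1 width=1920 height=1080 ! \
  nvinfer config-file-path=config_infer_yolo.txt ! nvvideoconvert ! nvdsosd ! fakesink

# Pipeline 2: SSD model as primary inference, same source (placeholder URI and config path)
gst-launch-1.0 rtspsrc location=rtsp://<camera-ip>/stream ! rtph264depay ! h264parse ! \
  nvv4l2decoder ! m2.sink_0 nvstreammux name=m2 batch-size=1 width=1920 height=1080 ! \
  nvinfer config-file-path=config_infer_ssd.txt ! nvvideoconvert ! nvdsosd ! fakesink
```

Note that running two separate pipelines decodes the same RTSP stream twice. One common alternative is a single pipeline that branches the decoded stream with `tee` into both nvinfer branches, so decoding happens once; whether that recovers the lost FPS depends on the remaining GPU headroom.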