DeepStream on GPU

Please provide complete information as applicable to your setup.

• Hardware Platform - GPU
• DeepStream Version - 6.0

How many RTSP cameras, and at what FPS, can an NVIDIA A2 Tensor Core GPU (16 GB) support when running DeepStream 6.0?

I need to run around 250 RTSP cameras with the PeopleNet model and 3-4 classification models.
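As a rough back-of-envelope check, the aggregate throughput the primary detector must sustain is (number of streams × per-stream FPS). A minimal sketch, assuming a typical 25 FPS camera rate (the frame rate is an assumption, not stated in the question):

```python
# Rough capacity estimate for the requested setup (hypothetical numbers).
num_streams = 250        # RTSP cameras, from the question above
fps_per_stream = 25      # assumed camera frame rate

# Frames per second the primary GIE (PeopleNet) must process in aggregate.
primary_throughput = num_streams * fps_per_stream
print(f"Primary GIE must sustain ~{primary_throughput} inferences/s")

# Note: secondary classifiers run per detected object, so their load
# depends on how many people appear per frame, not on raw stream count.
```

Any candidate GPU would need a PeopleNet benchmark well above that figure (or frame skipping via the nvinfer `interval` property) to keep up.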

Please suggest a GPU that would be suitable for the above requirement.


DeepStream - 6.0
GPU - RTX 6000

TensorRT - 8.2.1
CUDA - 11.4

  1. We have a video at 25 FPS. If we use the PeopleNet model for inference with a streammux batch size of 1, we get around 60 FPS. Similarly, if I increase the streammux batch size to 2, the FPS increases to around 120.

  2. A similar setup was made in which the same video was used as the source 30 times. Initially we achieved around 8 to 9 FPS. After we increased the streammux batch size to 64, all streams were inferenced at 15 to 20 FPS. GPU memory consumption was 8 GB / 24 GB.

When we increased the sources to 40, the FPS dropped to around 11 to 13. We increased the streammux batch size to 128, but the FPS did not improve. We also increased the batch size of the primary GIE to 128; that did not solve the issue either. GPU memory consumption was 10 GB / 24 GB.

We further increased the batch size to 196, 200, and 256, assuming that more GPU memory would be utilized, but we got a core-dump error.

Following are some questions.

  1. What is the maximum batch size that DeepStream supports?
  2. Is DeepStream limiting GPU usage? If not, why does the FPS drop even though the GPU is not fully utilized? (fakesink and EGL sink both output the same FPS)
  3. Could we have a config file for deepstream-test-5 that utilizes the GPU to its full potential and runs 50 video streams at a higher FPS?

You should select a proper batch size for nvstreammux and the nvinfer model to balance memory consumption and performance.
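As an illustration only (the values below are hypothetical, not a tuning recommendation), matching the nvstreammux and primary-gie batch sizes to the live stream count in a deepstream-app style config might look like:

```
[streammux]
# Usually set to the number of live sources (40 in the scenario above).
batch-size=40
# Max time (us) to wait before pushing a partially filled batch.
batched-push-timeout=33000
width=1280
height=720

[primary-gie]
# Typically matches the streammux batch size; oversizing it wastes memory.
batch-size=40
# Run inference on every 3rd frame to trade accuracy for throughput.
interval=2
```

Note that raising the nvinfer batch-size above what the TensorRT engine was built for forces an engine rebuild, and very large batches can exhaust GPU memory, which is consistent with the core dump reported at batch sizes of 196 and above.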

For batch-size guidance, please refer to Frequently Asked Questions — DeepStream 6.1.1 Release documentation.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.