What is the maximum number of serial models supported by nvinfer?

For model inference, I have 5 TensorRT engine models and I want to run them serially in one pipeline. I want to know the maximum number of serial models supported by nvinfer.

  • deepstream-app version 6.1.0
  • DeepStreamSDK 6.1.0
  • CUDA Driver Version: 11.4
  • CUDA Runtime Version: 11.0
  • TensorRT Version: 8.2
  • cuDNN Version: 8.4
  • libNVWarp360 Version: 2.0.1d3
  • Device: Jetson Orin
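For reference, a serial (cascaded) arrangement of five nvinfer instances can be sketched as a single pipeline, with one primary GIE followed by secondary GIEs. This is only an illustrative sketch: the config file names below are placeholders, and each config must set a distinct `unique-id` (with secondaries setting `operate-on-gie-id` to consume upstream results).

```shell
# Hypothetical sketch: five nvinfer elements chained back-to-back.
# pgie_config.txt / sgieN_config.txt are placeholder config files.
gst-launch-1.0 \
  filesrc location=sample.h264 ! h264parse ! nvv4l2decoder ! \
  mux.sink_0 nvstreammux name=mux batch-size=1 width=1280 height=720 ! \
  nvinfer config-file-path=pgie_config.txt ! \
  nvinfer config-file-path=sgie1_config.txt ! \
  nvinfer config-file-path=sgie2_config.txt ! \
  nvinfer config-file-path=sgie3_config.txt ! \
  nvinfer config-file-path=sgie4_config.txt ! \
  nvvideoconvert ! nvdsosd ! nveglglessink
```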

Which do you mean: running multiple models with one nvinfer, or running multiple nvinfers at the same time?

Running multiple nvinfers at the same time.

There has been no update from you for a while, so we assume this is no longer an issue and are closing this topic. If you need further support, please open a new one.
Thanks

It depends on the memory size, codec capability, etc. of your device.
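To make the concurrent case concrete, one way to run multiple nvinfers at the same time is to launch independent pipelines, each with its own nvinfer instance. This is a hedged sketch only: the source files and config paths are placeholders, and how many such pipelines fit depends on the device's GPU memory, decoder (NVDEC) capacity, and compute headroom, as noted above.

```shell
# Hypothetical sketch: two independent pipelines, each with its own
# nvinfer, run concurrently. a.h264/b.h264 and the config files are
# placeholders; there is no fixed upper limit other than resources.
gst-launch-1.0 filesrc location=a.h264 ! h264parse ! nvv4l2decoder ! \
  m1.sink_0 nvstreammux name=m1 batch-size=1 width=1280 height=720 ! \
  nvinfer config-file-path=model_a_config.txt ! fakesink &
gst-launch-1.0 filesrc location=b.h264 ! h264parse ! nvv4l2decoder ! \
  m2.sink_0 nvstreammux name=m2 batch-size=1 width=1280 height=720 ! \
  nvinfer config-file-path=model_b_config.txt ! fakesink &
wait
```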
