Can DeepStream run one application across different GPUs?

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU): H100
• DeepStream Version: 7.0
• Issue Type (questions, new requirements, bugs): question

Hi, I would like to ask:

  1. Can I use DeepStream to run across 2 different GPUs? For example, I would like decoding and nvstreammux to run on GPU ID 0 and inference to run on GPU ID 1.
  2. Let's say I am decoding 1 stream at 25fps and it uses 500MB of VRAM. If I read the stream at 1fps instead, will the VRAM usage drop? Assume the 1fps is set at my camera source, not by dropping frames within DeepStream.

  1. Yes. You can set the gpu-id property of the nvstreammux and nvinfer plugins (see the sketch below this reply).
  2. In theory, the VRAM usage will drop. When decoding at 25fps, the decoder needs to hold more buffers for reference frames.
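
For illustration only, here is a minimal Python (GStreamer/DeepStream) sketch of such a split, assuming the dGPU plugins and the default (legacy) nvstreammux. The input file `sample_720p.h264` and the nvinfer config `config_infer.txt` are placeholder names, not files from this thread:

```python
# Minimal sketch (not a complete app): decode + nvstreammux on GPU 0, nvinfer on GPU 1.
# "sample_720p.h264" and "config_infer.txt" are placeholders; error handling,
# the bus watch, and the main loop are omitted for brevity.
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)
pipeline = Gst.Pipeline.new("multi-gpu-demo")

source = Gst.ElementFactory.make("filesrc", "source")
source.set_property("location", "sample_720p.h264")   # placeholder H.264 elementary stream
parser = Gst.ElementFactory.make("h264parse", "parser")

decoder = Gst.ElementFactory.make("nvv4l2decoder", "decoder")
decoder.set_property("gpu-id", 0)                      # decode on GPU 0

streammux = Gst.ElementFactory.make("nvstreammux", "mux")
streammux.set_property("gpu-id", 0)                    # batching/scaling on GPU 0
streammux.set_property("batch-size", 1)
streammux.set_property("width", 1280)
streammux.set_property("height", 720)

pgie = Gst.ElementFactory.make("nvinfer", "primary-infer")
pgie.set_property("gpu-id", 1)                         # inference on GPU 1
pgie.set_property("config-file-path", "config_infer.txt")  # placeholder model config

sink = Gst.ElementFactory.make("fakesink", "sink")

for elem in (source, parser, decoder, streammux, pgie, sink):
    pipeline.add(elem)

source.link(parser)
parser.link(decoder)
# nvstreammux takes one request sink pad (sink_0, sink_1, ...) per input stream.
# On GStreamer < 1.20, use get_request_pad instead of request_pad_simple.
decoder.get_static_pad("src").link(streammux.request_pad_simple("sink_0"))
streammux.link(pgie)
pgie.link(sink)

pipeline.set_state(Gst.State.PLAYING)
```

With the pipeline running, nvidia-smi should show the decoder and mux allocations on GPU 0 and the TensorRT engine memory on GPU 1.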

There has been no update from you for a while, so we assume this is no longer an issue and are closing this topic. If you need further support, please open a new one. Thanks.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.