I would like to know if I can use DeepStream 6.3 and JetPack 5.1.2 with an NVIDIA RTX 4090 graphics card to do inference.
I would also like to know whether it is possible to add more than one GPU (RTX 4090) to the same server for inference.
Your setup is not very clear, can you elaborate?
JetPack 5.1.2 is for Jetson; it does not support a dGPU such as the RTX 4090, so I'm not sure how you combined them.
For a workstation with multiple GPUs, you can select the GPU used for inference with the
[gpu-id](https://docs.nvidia.com/metropolis/deepstream/dev-guide/text/DS_plugin_gst-nvinfer.html) option of the nvinfer plugin.
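For illustration, a minimal nvinfer configuration fragment might set `gpu-id` like this (a hedged sketch: the engine file name and other values are placeholders, not taken from this thread):

```ini
[property]
gpu-id=1                # run this inference engine on the second GPU (0-based index)
model-engine-file=model_b1_gpu1_fp16.engine   # placeholder engine path
batch-size=1
network-mode=2          # 2 = FP16 precision
```

The engine must have been built on (or for) the same GPU index it runs on, so the `gpu-id` here should match the device used when the TensorRT engine was generated.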
You're right; I mentioned JetPack just to keep the question short.
my setup is:
- x86_64 environment
- CUDA 11.4
- TensorRT 8.x
- DeepStream 6.3
I would like to know whether the RTX 4090 currently works with DeepStream applications.
I would also like to know whether I can extend my setup by adding one more RTX 4090 and run inference on multiple GPUs.
I'm closing this topic since there has been no update from you for a while, so I assume the issue was resolved. If you still need support, please open a new topic. Thanks.
The setup is fine; this is the dGPU setup. Please install the dependencies as below:
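As a sketch of the multi-GPU case asked about above (assuming a deepstream-app style configuration file; the source URI and inference config name are placeholders), each component group can be pinned to a GPU via its `gpu-id` key:

```ini
[source0]
enable=1
type=3                 # 3 = URI source
uri=file:///path/to/stream.mp4   # placeholder input
gpu-id=0               # decode this source on GPU 0

[streammux]
gpu-id=0
batch-size=1
width=1920
height=1080

[primary-gie]
enable=1
gpu-id=1               # run primary inference on the second RTX 4090
config-file=config_infer_primary.txt   # placeholder nvinfer config
```

Splitting decode and inference across GPUs this way is one common pattern; alternatively, separate pipelines can each be given their own `gpu-id` to load-balance streams across the two cards.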