PGIE creation with nvinferserver doesn't work

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU): NVIDIA GeForce RTX 3090
• DeepStream Version: 6.3
• JetPack Version (valid for Jetson only): N/A
• TensorRT Version: 8.6.1.6
• NVIDIA GPU Driver Version (valid for GPU only): 535.113.01
• Issue Type (questions, new requirements, bugs): Question
• How to reproduce the issue?: I am interested in using the NVIDIA® Triton Inference Server along with DeepStream. I have tried running the Python sample deepstream-rtsp-in-rtsp-out, but when selecting the nvinferserver option the PGIE can't be created:

[image: PGIE creation error output]

I haven't made any changes to the code, and I have followed the instructions in the README.
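For reference, this is how the sample is launched per its README (a sketch; the input URI is a placeholder, and `-g` selects between the nvinfer and nvinferserver plugins):

```shell
# Run from the deepstream-rtsp-in-rtsp-out sample directory.
# -i : input stream URI (RTSP or file; placeholder here)
# -g : inference plugin to use (nvinfer or nvinferserver)
python3 deepstream_test1_rtsp_in_rtsp_out.py \
    -i file:///opt/nvidia/deepstream/deepstream/samples/streams/sample_720p.h264 \
    -g nvinferserver
```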

Do I have to do something else?

Thank you,

Mikel

Please install the libs according to the README. What is the start command line? It seems that creating the nvinfer/nvinferserver plugin failed. Could you share the results of "ldd /opt/nvidia/deepstream/deepstream/lib/gst-plugins/libnvdsgst_inferserver.so" and "ldd /opt/nvidia/deepstream/deepstream/lib/libnvds_infer_server.so"?
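The two checks above can be combined into a small script that flags only the unresolved dependencies (a sketch; paths are the standard DeepStream install locations):

```shell
# Report unresolved dependencies for the nvinferserver plugin and its backend.
for lib in /opt/nvidia/deepstream/deepstream/lib/gst-plugins/libnvdsgst_inferserver.so \
           /opt/nvidia/deepstream/deepstream/lib/libnvds_infer_server.so; do
    if [ -f "$lib" ]; then
        echo "== $lib =="
        # Print only the libraries the dynamic linker cannot find.
        ldd "$lib" | grep "not found" || echo "all dependencies resolved"
    else
        echo "missing: $lib"
    fi
done
```

If `libtritonserver.so` appears in the "not found" output, the plugin fails to load for exactly the reason discussed below.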

The “ldd /opt/nvidia/deepstream/deepstream/lib/gst-plugins/libnvdsgst_inferserver.so” result:


And the result of “ldd /opt/nvidia/deepstream/deepstream/lib/libnvds_infer_server.so”:

Seems like the tritonserver lib is not installed. How should I install it?

Please refer to /opt/nvidia/deepstream/deepstream/samples/configs/deepstream-app-triton/README. You can use the DeepStream Triton docker.
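A sketch of starting the DeepStream 6.3 Triton container; the image tag is an assumption, so verify the exact tag on the NGC catalog before pulling:

```shell
# Pull and start the DeepStream Triton container (tag assumed; check NGC).
# --gpus all    : expose the host GPUs to the container
# -v /tmp/.X11-unix and -e DISPLAY : allow on-screen rendering (optional)
docker run --gpus all -it --rm \
    --net=host \
    -e DISPLAY=$DISPLAY \
    -v /tmp/.X11-unix:/tmp/.X11-unix \
    nvcr.io/nvidia/deepstream:6.3-triton-multiarch
```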

So I should run deepstream-app -c inside the Triton docker?

Yes. In the DeepStream Triton docker, all Triton libs are already installed.
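Inside the container, the usual flow is to prepare the sample Triton model repository and then run deepstream-app with one of the Triton configs. This is a sketch: the script and config filenames below are taken from the deepstream-app-triton samples layout, so confirm them against the README there.

```shell
# Inside the DeepStream Triton container:
cd /opt/nvidia/deepstream/deepstream/samples
# Build TensorRT engines for the sample models (filename assumed; see README).
./prepare_ds_triton_model_repo.sh
# Run deepstream-app with a sample Triton config (filename assumed; see README).
cd configs/deepstream-app-triton
deepstream-app -c source4_1080p_dec_infer-resnet_tracker_sgie_tiled_display_int8.txt
```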

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.