Please provide complete information as applicable to your setup.
• Hardware Platform (Jetson / GPU): NVIDIA GeForce RTX 3090
• DeepStream Version: 6.3
• JetPack Version (valid for Jetson only):
• TensorRT Version: 8.6.1.6
• NVIDIA GPU Driver Version (valid for GPU only): 535.113.01
• Issue Type (questions, new requirements, bugs): Question
• How to reproduce the issue? (This is for bugs. Include which sample app is used, the contents of the configuration files, the command line used, and other details for reproducing.)

I am interested in using the NVIDIA® Triton Inference Server along with DeepStream. I tried running the Python sample deepstream-rtsp-in-rtsp-out, but when I select the nvinferserver option the pgie cannot be created:
I haven’t made any changes to the code, and I have followed the instructions in the README:
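For reference, the invocation I used follows the sample's README. This is a sketch of that command line, not a guaranteed-exact reproduction: the script name, the `-i`/`-g` flags, and the sample stream path are what I recall from the deepstream_python_apps repository and a default DeepStream 6.3 install, and may differ on other setups.

```shell
# Hypothetical command line, assumed from the deepstream-rtsp-in-rtsp-out README:
#   -i : input URI(s); here a stock DeepStream sample stream (path assumed)
#   -g : which inference plugin to use (nvinfer or nvinferserver)
CMD="python3 deepstream_test1_rtsp_in_rtsp_out.py \
  -i file:///opt/nvidia/deepstream/deepstream/samples/streams/sample_720p.h264 \
  -g nvinferserver"
echo "$CMD"
```

Running the same command with `-g nvinfer` works, so the failure appears specific to the nvinferserver path.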
Please install the libraries according to the README. What is the start command line? It seems that creating the nvinfer/nvinferserver plugin failed. Could you share the output of “ldd /opt/nvidia/deepstream/deepstream/lib/gst-plugins/libnvdsgst_inferserver.so” and “ldd /opt/nvidia/deepstream/deepstream/lib/libnvds_infer_server.so”?
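A common reason the nvinferserver plugin fails to load is an unresolved shared-library dependency (typically the Triton client libraries). As a convenience, the two ldd checks requested above can be wrapped in a small script that counts "not found" entries; the DeepStream paths below are the ones quoted in this thread and assume a default install prefix.

```shell
#!/bin/sh
# Count unresolved shared-library dependencies of a .so file.
# Prints 0 when every dependency resolves; any nonzero count means
# the dynamic loader cannot find some library the plugin needs.
count_missing() {
    ldd "$1" 2>/dev/null | grep -c "not found"
}

for so in /opt/nvidia/deepstream/deepstream/lib/gst-plugins/libnvdsgst_inferserver.so \
          /opt/nvidia/deepstream/deepstream/lib/libnvds_infer_server.so; do
    echo "$so: $(count_missing "$so") unresolved dependencies"
done
```

If either library reports unresolved dependencies, installing the Triton client libraries per the DeepStream README (or running inside the DeepStream Triton container) is the usual fix.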