DeepStream with standalone Triton server

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU) GPU
• DeepStream Version 5.1
• JetPack Version (valid for Jetson only) 4.5.1
• TensorRT Version 7.2.3
• NVIDIA GPU Driver Version (valid for GPU only) 360.32.03
• Issue Type( questions, new requirements, bugs) Questions
• How to reproduce the issue ? (This is for bugs. Including which sample app is using, the configuration files content, the command line used and other details for reproducing)
• Requirement details( This is for new requirement. Including the module name-for which plugin or for which sample application, the function description)

Hello,

I need to know whether using DeepStream with a standalone Triton server is supported. I want to use a single Triton server for all my pipelines. Can you please share a reference example of how to do so?
I want to build my pipeline like this:

camera → uridecodebin → streammux → Triton client → Triton server (external) → Triton client (metadata) → nvosd → RTSP server

I have tried nvinferserver and am able to run the nvinferserver plugin in a DeepStream pipeline, but I want to run multiple cameras, or the same camera for different analytics use cases, on the same system. I heard that using a standalone Triton server would be a more optimized approach and would let me leverage the Triton server's capabilities.
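For reference, below is a minimal gst-launch-1.0 sketch of roughly that pipeline with two cameras batched through one nvstreammux into a single nvinferserver instance. The RTSP URIs, resolutions, and the config file name config_infer_triton.txt are placeholders, and the display sink stands in for the RTSP output stage:

    # Two camera sources batched into one nvinferserver (placeholders throughout)
    gst-launch-1.0 \
      nvstreammux name=mux batch-size=2 width=1920 height=1080 ! \
      nvinferserver config-file-path=config_infer_triton.txt ! \
      nvmultistreamtiler rows=1 columns=2 width=1920 height=540 ! \
      nvvideoconvert ! nvdsosd ! nveglglessink \
      uridecodebin uri=rtsp://camera-1/stream ! mux.sink_0 \
      uridecodebin uri=rtsp://camera-2/stream ! mux.sink_1

In the same way, additional nvinferserver elements (each with its own config file and unique_id) can be chained after the first one for different analytics use cases on the same streams.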

What is your suggestion?

Sorry for the late response. Is this still an issue that needs support? Thanks.

Yes, if you could guide me it would be a great help. Thanks for your response.

nvinferserver is the only way to integrate Triton with DeepStream. See: Gst-nvinferserver — DeepStream 6.1.1 Release documentation
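As a side note on the standalone-server question: in the DeepStream 6.x releases covered by that documentation, nvinferserver can also run in gRPC mode against a separately launched Triton server instead of the embedded (CAPI) one, so several pipelines can share one server. A minimal config sketch, assuming a Triton instance already serving a model named my_detector on localhost:8001 (the model name, URL, label file, and class count are placeholders):

    infer_config {
      unique_id: 1
      gpu_ids: [0]
      max_batch_size: 4
      backend {
        triton {
          model_name: "my_detector"
          version: -1
          # Remote standalone Triton server instead of the embedded CAPI instance
          grpc {
            url: "localhost:8001"
          }
        }
      }
      preprocess {
        network_format: IMAGE_FORMAT_RGB
        tensor_order: TENSOR_ORDER_LINEAR
        normalize { scale_factor: 0.003921569 }
      }
      postprocess {
        labelfile_path: "labels.txt"
        detection {
          num_detected_classes: 4
          simple_cluster { threshold: 0.3 }
        }
      }
    }
    input_control {
      process_mode: PROCESS_MODE_FULL_FRAME
      interval: 0
    }

The standalone server itself is started separately (for example, tritonserver --model-repository=<path> inside the Triton container), and each DeepStream pipeline then points its nvinferserver config at that URL. Note that DeepStream 5.1's nvinferserver only supports the embedded CAPI mode; the gRPC option was added in the 6.x releases, as described in the linked documentation.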
