Please provide complete information as applicable to your setup.
• Hardware Platform (Jetson / GPU): T4
• DeepStream Version: 6.0
• JetPack Version (valid for Jetson only):
• TensorRT Version:
• NVIDIA GPU Driver Version (valid for GPU only): 440.33.01
• Issue Type (questions, new requirements, bugs): bugs
• How to reproduce the issue? (This is for bugs. Include which sample app is used, the configuration file contents, the command line used, and other details for reproducing.)
• Requirement details (This is for new requirements. Include the module name, i.e. which plugin or which sample application, and the function description.)
Hi, I’m using the Python API. When I run the program inside DeepStream’s Triton Inference Server container, it works fine. But when I run it in DeepStream’s container and connect to a standalone Triton Inference Server through gRPC, I get this error. The Triton Inference Server version is 21.08, the same as the one in DeepStream’s Triton Inference Server container.
The error print is:
Sorry!
DeepStream’s Triton Inference Server container is “nvcr.io/nvidia/deepstream:6.0-triton”, right?
What is “DeepStream’s container using Triton Inference Server”?
In addition to supporting native inference, DeepStream applications can communicate with independent/remote instances of Triton Inference Server using gRPC, allowing the implementation of distributed inference solutions.
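As a rough sketch, the Gst-nvinferserver configuration for this gRPC mode only needs the backend to point at the remote endpoint; every value below (address, model name, preprocessing settings) is a placeholder for illustration, not taken from this thread:

```
# Minimal Gst-nvinferserver config sketch for gRPC mode (placeholders throughout).
infer_config {
  unique_id: 1
  gpu_ids: [0]
  max_batch_size: 1
  backend {
    triton {
      model_name: "my_model"   # placeholder: a model served by the remote Triton
      version: -1              # -1 = use the latest available version
      grpc {
        url: "10.0.0.2:8001"   # machine B's address; 8001 is Triton's default gRPC port
      }
    }
  }
  preprocess {
    network_format: IMAGE_FORMAT_RGB
    tensor_order: TENSOR_ORDER_LINEAR
    normalize { scale_factor: 0.0039215697 }
  }
}
input_control {
  process_mode: PROCESS_MODE_FULL_FRAME
  interval: 0
}
```

For native inference, the `grpc` block is replaced by a local `model_repo` block; the rest of the pipeline stays the same.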
Here is my setup: machine A has DeepStream installed and handles video decoding; machine B has Triton installed and handles inference. Machine A communicates with machine B over gRPC.
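For reference, a typical way to bring up the standalone server on machine B with the matching 21.08 release looks like this (the model repository path is a placeholder):

```
# Run Triton 21.08 on machine B, exposing HTTP (8000), gRPC (8001), and metrics (8002).
docker run --gpus all --rm \
  -p 8000:8000 -p 8001:8001 -p 8002:8002 \
  -v /path/to/model_repository:/models \
  nvcr.io/nvidia/tritonserver:21.08-py3 \
  tritonserver --model-repository=/models
```

Machine A’s nvinferserver config then points its grpc url at machine B’s IP on port 8001, as in the sketch above.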
How can I use nvcr.io/nvidia/deepstream:6.0-triton instead of the Triton server Docker image on machine B?