Hi all, does TensorRT Inference Server (TRTIS) support running video analytics inference with DeepStream? I read that DeepStream takes TensorRT-optimized inference engines as input to run inference, and I need to know whether DeepStream also works with TRTIS; if so, where can I find more information about it?
The DeepStream SDK can be used with TensorRT. Please see https://developer.nvidia.com/deepstream-sdk?ncid=em-ded-nddmsk20ntsrvtwr-44658
Are you running into specific issues using TRTIS with DeepStream?
Hi NVES, does the DeepStream SDK work with the TensorRT Inference Server, or only with TensorRT? I need to know whether I can run several models with DeepStream on TRTIS. The link you sent only describes DeepStream with TensorRT, not DeepStream with TRTIS. Thanks in advance for your comments and help.
I don’t know of a reason why TRTIS wouldn’t work with multiple DeepStream models. If you experience issues, please let us know.
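For what it's worth, TRTIS serves multiple models side by side from a single model repository, independently of which client (DeepStream or otherwise) sends the requests. A minimal sketch of such a repository holding two TensorRT engines (the model names, directory names, and tensor shapes below are hypothetical examples, not from any shipped sample):

```
model_repository/
├── detector/
│   ├── config.pbtxt        # platform: "tensorrt_plan", declares inputs/outputs
│   └── 1/
│       └── model.plan      # serialized TensorRT engine, version 1
└── classifier/
    ├── config.pbtxt
    └── 1/
        └── model.plan
```

Each model's `config.pbtxt` names the model, sets `platform: "tensorrt_plan"`, and declares its input and output tensors; TRTIS loads every model found in the repository and exposes each one by name over its HTTP/gRPC endpoints.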
Is there any technical documentation that explains how to use TRTIS with multiple DeepStream models?