Using TensorRT 8.4 with Triton Server

Hi,
I’m using TensorRT 8.4 to export my model (I can’t use an older version due to an incompatibility), and I saw that there currently isn’t a ready-made Docker image that runs a TRT 8.4 backend.
I tried to work out whether it’s possible, and how, to compose a new Docker image with the newer TRT version.
I’m currently using the latest Triton version available (22.06).
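Something along these lines is what I had in mind, just as a rough sketch; the base tag is the 22.06 release, and the TensorRT tarball name is only my guess at what the download page provides:

```
# Rough sketch: extend the 22.06 Triton image with TensorRT 8.4.
# The tarball name below is an assumption; use the exact file you
# download for your CUDA/cuDNN combination.
FROM nvcr.io/nvidia/tritonserver:22.06-py3

# Unpack the TensorRT 8.4 tarball (downloaded separately) into /opt
COPY TensorRT-8.4.1.5.Linux.x86_64-gnu.cuda-11.6.cudnn8.4.tar.gz /tmp/
RUN tar -xzf /tmp/TensorRT-8.4.1.5.Linux.x86_64-gnu.cuda-11.6.cudnn8.4.tar.gz -C /opt \
    && rm /tmp/TensorRT-8.4.1.5.Linux.x86_64-gnu.cuda-11.6.cudnn8.4.tar.gz

# Put the newer libnvinfer ahead of the copy the image ships with
ENV LD_LIBRARY_PATH=/opt/TensorRT-8.4.1.5/lib:${LD_LIBRARY_PATH}
```

I don’t know whether the Triton TensorRT backend will actually tolerate the swapped libraries, which is exactly what I’m asking about.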
Thanks

Hi,
The UFF and Caffe parsers have been deprecated from TensorRT 7 onwards, so we request that you try the ONNX parser instead.
Please check the link below for details.
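For example, once your model is in ONNX, a common path is to build a TensorRT engine with trtexec and place the resulting plan in the Triton model repository. A minimal sketch (file and model names are placeholders):

```
# Build a TensorRT engine from an ONNX model (names are placeholders)
trtexec --onnx=model.onnx \
        --saveEngine=model.plan \
        --fp16   # optional: also build FP16 kernels

# Triton expects the engine in a versioned model repository layout:
#   model_repository/<model_name>/1/model.plan
mkdir -p model_repository/my_model/1
cp model.plan model_repository/my_model/1/model.plan
```

Note that a serialized engine is only loadable by the same TensorRT version it was built with, so the engine must be built with the TRT version that ships inside the Triton container you run.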

Thanks!

Hi, thanks for the response, but I’m not sure I understand how this solves my problem.
I’d like to run the model with Triton Server, and I’m not sure whether onnx-tensorrt can help in that scenario.
I hope I’m wrong though : )
Thanks

Hi,

Sorry, you may need to wait for the upcoming Triton container release.
You could try upgrading TensorRT inside the container by downloading it from the link below, but we cannot guarantee that it will work.
https://developer.nvidia.com/nvidia-tensorrt-8x-download
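If you do try it, something along these lines can confirm what the container actually picks up (a sketch; it assumes the Python bindings were installed alongside the libraries):

```
# Check which libnvinfer the dynamic linker resolves to
ldconfig -p | grep libnvinfer

# Check the version the Python bindings report (if installed)
python3 -c "import tensorrt; print(tensorrt.__version__)"
```

Keep in mind that any engines served by the TensorRT backend would also need to be rebuilt with the upgraded 8.4 version.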

Thank you.