Hello everyone,
I’m setting up NVIDIA Triton Inference Server on my local machine and have been following the latest version requirements closely to ensure compatibility and optimal performance. According to the documentation and setup guides, the recommended versions for a successful installation are as follows:
- Triton Inference Server: Version 2.44
- Operating System: Ubuntu 22.04
- CUDA Toolkit: NVIDIA CUDA 12.4.0.41
- TensorRT: Version 8.6.3
However, while gathering the necessary components, I encountered a roadblock with TensorRT. It appears that TensorRT 8.6.3, as specified in the requirements, is not available for download or installation. The highest version I was able to find under the TensorRT 8.6 series is 8.6.1.
This discrepancy has left me unsure how to proceed: should I go ahead with TensorRT 8.6.1, or is 8.6.3 distributed some other way (for example, bundled inside the Triton container) that I’ve missed?
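For anyone double-checking their own setup against these requirements, here’s a minimal sketch of the version comparison involved. The helper names are my own for illustration, not part of any NVIDIA tooling; it just shows that a standalone 8.6.1 install falls short of the 8.6.3 listed in the requirements:

```python
def parse_version(v: str) -> tuple:
    """Split a dotted version string like '8.6.1' into a tuple of ints."""
    return tuple(int(part) for part in v.split("."))

def satisfies(installed: str, required: str) -> bool:
    """True if the installed version meets or exceeds the required one."""
    return parse_version(installed) >= parse_version(required)

# 8.6.1 is the newest 8.6.x build available for standalone download,
# but the Triton 2.44 requirements list 8.6.3:
print(satisfies("8.6.1", "8.6.3"))  # → False
print(satisfies("8.6.3", "8.6.3"))  # → True
```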
Thank you in advance.