NVIDIA TensorRT deployment on a machine with no internet access except through a repository manager



I want to enable deployment of nvidia-tensorrt on a computer that has no access to the outside world except to a repository manager (Artifactory) during the deployment process.

We configured a proxy to https://developer.download.nvidia.com/compute/redist/ and managed to fetch nvidia-tensorrt, but we fail to fetch its direct dependencies: nvidia-cublas, nvidia-cuda-nvrtc, nvidia-cuda-runtime, and nvidia-cudnn.
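For context, my understanding (to be verified) is that these dependency wheels are resolved from NVIDIA's Python package index at https://pypi.ngc.nvidia.com — which the nvidia-pyindex helper package normally configures — rather than from the /compute/redist/ path alone. A minimal offline sketch of the equivalent hand-written pip configuration:

```shell
# Offline sketch of the pip configuration that nvidia-pyindex would
# normally write. Assumption: the nvidia-cublas / nvidia-cuda-* /
# nvidia-cudnn wheels resolve from https://pypi.ngc.nvidia.com, so that
# index would also need to be proxied by the repository manager.
cat > /tmp/nvidia-pip.conf <<'EOF'
[global]
index-url = https://pypi.org/simple
extra-index-url = https://pypi.ngc.nvidia.com
EOF
# Point pip at this file for the current shell session:
export PIP_CONFIG_FILE=/tmp/nvidia-pip.conf
```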

  1. Which steps are needed to enable a repository manager to provide access to nvidia-tensorrt and its dependencies?

  2. Are there additional repositories to link to so that the whole TensorRT dependency tree can be accessed during installation?
    Following Installation Guide :: NVIDIA Deep Learning TensorRT Documentation, I found only the NVIDIA repository mentioned above.

  3. Should I connect to these repositories with the credentials of a registered user or with a specific key?
    Is any additional configuration required?
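For reference, here is the client-side pip configuration I would expect on the air-gapped machine, assuming two Artifactory remote PyPI repositories — one proxying https://pypi.org and one proxying https://pypi.ngc.nvidia.com. The host name `artifactory.example.com` and the repository keys `pypi-remote` and `nvidia-pypi-remote` are placeholders, not real endpoints:

```shell
# Sketch of pip.conf on the air-gapped machine. All Artifactory names
# below are placeholders: pypi-remote proxies https://pypi.org and
# nvidia-pypi-remote proxies https://pypi.ngc.nvidia.com. Credentials,
# if required, can go in the URL (https://user:token@host/...) or in
# ~/.netrc.
PIP_CONF="${HOME}/.config/pip/pip.conf"
mkdir -p "$(dirname "$PIP_CONF")"
cat > "$PIP_CONF" <<'EOF'
[global]
index-url = https://artifactory.example.com/artifactory/api/pypi/pypi-remote/simple
extra-index-url = https://artifactory.example.com/artifactory/api/pypi/nvidia-pypi-remote/simple
EOF
```

With this in place, `pip install nvidia-tensorrt` should be able to resolve the metapackage and its nvidia-cublas / nvidia-cuda-* / nvidia-cudnn dependencies through Artifactory alone.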



TensorRT Version:
CUDA Version: 11.2
CUDNN Version: 8.1.1
Operating System + Version: Ubuntu 18.04
Python Version (if applicable): 3.8
TensorFlow Version (if applicable): 2.5.5

Please refer to the latest installation steps.