Is there any way to run a TensorRT engine without the full TensorRT environment? In other words, what is the minimal TRT runtime environment?


I have two machines, say machine A and machine B, with identical hardware and CUDA environments, but only machine A has TensorRT installed. I have built an engine on machine A. How can I run it on machine B with the Python API?

Can I just run pip install tensorrt? If not, what else should I do?

Please refer to the installation steps in the link below in case you are missing anything.

Also, we suggest using the TensorRT NGC containers to avoid any system-dependency related issues.
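For reference, once the TensorRT Python package (matching the version used to build the engine) is installed on machine B, deserializing and preparing a prebuilt engine looks roughly like this. This is a sketch, not a complete inference script: the engine filename "model.engine" is a placeholder, and running inference additionally requires allocating device buffers (e.g. with pycuda or cuda-python), which is omitted here.

```python
# Minimal sketch of loading a prebuilt TensorRT engine on machine B.
# Assumptions: same GPU architecture and CUDA driver as machine A, and
# the SAME TensorRT version installed that built the engine (serialized
# engines are not portable across TRT versions or GPU architectures).
try:
    import tensorrt as trt
    HAVE_TRT = True
except ImportError:
    # tensorrt is not installed; install the wheel matching the builder version
    HAVE_TRT = False
    print("tensorrt not installed; run: pip install tensorrt")

def load_engine(path):
    """Deserialize a saved engine file and create an execution context."""
    logger = trt.Logger(trt.Logger.WARNING)
    runtime = trt.Runtime(logger)
    with open(path, "rb") as f:
        engine = runtime.deserialize_cuda_engine(f.read())
    context = engine.create_execution_context()
    return engine, context

if HAVE_TRT:
    # "model.engine" is a placeholder for the file built on machine A
    engine, context = load_engine("model.engine")
```

Note that only the runtime pieces are needed to execute an engine: the logger, the Runtime, and the CUDA driver libraries. The builder, parsers, and ONNX tooling used on machine A are not required on machine B.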