TensorRT minimum runtime on Docker

Hi, everyone

I would like to know whether NVIDIA provides an official minimal runtime image for TensorRT. The TensorRT images on NGC are very large, and I would like a lightweight runtime image. Does such an official image exist? If not, does that mean I need to build one myself from a base image?

Looking forward to your reply

Hi @1342868324 ,
If an image does not suit you, you can also install TensorRT locally using any one of the documented installation methods.

As for a lighter version, I am checking with the team.
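As a sketch of the build-it-yourself route, something like the Dockerfile below can produce a much smaller runtime image than the full NGC TensorRT image. The base image tag and the pip-installed `tensorrt` package version are assumptions; check the TensorRT support matrix for combinations that match your driver and CUDA setup.

```dockerfile
# Hypothetical minimal runtime image: start from a CUDA *runtime* base
# (much smaller than the full NGC TensorRT image, which bundles dev
# tools, samples, and docs) and add only the TensorRT runtime wheel.
# The base tag below is a placeholder - pick one matching your setup.
FROM nvidia/cuda:12.4.1-runtime-ubuntu22.04

RUN apt-get update && \
    apt-get install -y --no-install-recommends python3 python3-pip && \
    rm -rf /var/lib/apt/lists/*

# TensorRT is published as pip wheels; this pulls in the runtime
# libraries without the extra tooling that bloats the NGC image.
RUN python3 -m pip install --no-cache-dir tensorrt

ENTRYPOINT ["python3"]
```

Whether this lands in your 0.5 GB-2 GB target depends mainly on the CUDA base image you choose; a multi-stage build that copies in only the shared libraries your engine actually loads can shrink it further.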


Thank you for your reply.

Lightweight runtime images are very helpful for deployment and management (k3s/k8s). The images downloaded from NGC use approximately 5 GB-10 GB of disk, and I hope this can be reduced to 0.5 GB-2 GB. I believe many companies run into this issue when deploying TensorRT images. Please consider this requirement.

Thank you for your feedback. I will pass this on to the engineering team.

Can you try running your model with the trtexec command and share the "--verbose" log if the issue persists?

You can refer to the link below for the full list of supported operators. If any operator is not supported, you will need to create a custom plugin to support that operation.

Also, please share your model and script, if you have not already, so that we can help you better.

Meanwhile, for common errors and queries, please refer to the link below:

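For reference, a typical trtexec invocation with verbose logging might look like the following. The model path is a placeholder; `--onnx` and `--verbose` are standard trtexec flags.

```shell
# Build an engine from an ONNX model (path is a placeholder) and
# capture the verbose layer-by-layer log for sharing.
trtexec --onnx=model.onnx --verbose 2>&1 | tee trtexec_verbose.log
```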

Hello, thank you for your reply.

My TensorRT model converts and runs inference without any issues. I just want to know whether a minimal TensorRT runtime image can reduce the image size.