where is trtexec?


I have seen many examples using `trtexec` to profile networks, but how do I install it? I am using SDK Manager with a Jetson Xavier.

Hi ydjian,

In the NGC TensorRT container (https://ngc.nvidia.com/catalog/containers/nvidia:tensorrt), trtexec is on the PATH by default, so you can just use it.
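For example, something like the following should work (the container tag here is only an illustration; check NGC for a release matching your driver/CUDA setup):

```shell
# Pull an NGC TensorRT container (tag is an example; pick a current release on NGC)
docker pull nvcr.io/nvidia/tensorrt:20.07-py3

# Run it with GPU access; trtexec is already on the PATH inside the container
docker run --gpus all -it --rm nvcr.io/nvidia/tensorrt:20.07-py3 trtexec --help
```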

If TensorRT is installed manually, I believe you can find the source for trtexec in /usr/src/tensorrt/samples/trtexec/, where you can run make to build it.

Once it’s built, it should be located in /usr/src/tensorrt/bin, or a similar path.
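The build-and-use steps above can be sketched as follows (paths assume a standard JetPack install and may differ on your system):

```shell
# Build trtexec from the samples shipped with TensorRT
cd /usr/src/tensorrt/samples/trtexec
sudo make

# The resulting binary lands in the TensorRT bin directory;
# add it to PATH so the shell can find the trtexec command
export PATH=/usr/src/tensorrt/bin:$PATH
trtexec --help
```

Note that building the sample does not install the binary system-wide, which is why the shell may report "command not found" until the bin directory is added to PATH.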

If these files aren’t located under /usr/src/…, you may be able to find similar paths by running this on the command line:

find / -name tensorrt

NVIDIA Enterprise Support


I have the same issue. I tried building trtexec from /usr/src/tensorrt/samples/trtexec, but the shell still does not recognize the trtexec command. I tried it on a Xavier and a TX2 with JetPack 4.4, TensorRT 7.1.3, and CUDA 10.2.
Then I tried the Docker image you mentioned, but I get the following error:

docker: Error response from daemon: OCI runtime create failed: container_linux.go:349: starting container process caused "process_linux.go:449: container init caused "process_linux.go:432: running prestart hook 0 caused \"error running hook: exit status 1, stdout: , stderr: exec command: [/usr/bin/nvidia-container-cli --load-kmods configure --ldconfig=@/sbin/ldconfig.real --device=all --compute --utility --video --require=cuda>=9.0 --pid=461 /var/lib/docker/overlay2/0b97283d6c083846d22a2a2ddb1b293a2b17046cb107c3bb5cd8e79934ce5779/merged]\nnvidia-container-cli: mount error: mount operation failed: /usr/src/tensorrt: no such file or directory\n"": unknown.

What I intend to do is build TRT engines from my .onnx models. Can you help me with this issue?
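For reference, once trtexec is working, the ONNX-to-engine conversion described here is typically done like this (model.onnx and model.engine are placeholder names; --fp16 is an optional precision flag):

```shell
# Sketch: build and serialize a TensorRT engine from an ONNX model
trtexec --onnx=model.onnx \
        --saveEngine=model.engine \
        --fp16
```

trtexec also reports per-iteration latency and throughput after the build, so the same command doubles as a quick profiling run.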

Please refer to the installation steps in the link below in case you are missing anything.
Also, we suggest using the TensorRT NGC containers to avoid any system-dependency issues.


Thank you for your reply.
As I mentioned, I am using a Jetson Xavier, so TensorRT is already installed with JetPack. I also tried the Docker image from the link you provided, and it gives me the error I posted, so it does not really help. Any other suggestions?

Is there a way to install from the network repo that includes trtexec?

The instructions in section 4.1.1 of the Installation Guide :: NVIDIA Deep Learning TensorRT Documentation do not include trtexec.



Please check out
TensorRT/samples/trtexec at master · NVIDIA/TensorRT · GitHub.

Thank you.