TensorRT supported on Jetson Nano

According to this GitHub repo https://github.com/dusty-nv/jetson-inference

and

Deep Learning SDK Documentation

https://docs.nvidia.com/deeplearning/sdk/tensorrt-developer-guide/index.html?#deploy_cloud
https://docs.nvidia.com/deeplearning/sdk/tensorrt-developer-guide/index.html?#deploy_embed__ul_shg_cx2_rdb

TensorRT is available on Jetson

But when I try to use

import tensorrt

I get this

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ModuleNotFoundError: No module named 'tensorrt'

Is TensorRT supported on the Jetson Nano? If not, will there ever be support for the Jetson Nano?

https://docs.nvidia.com/deeplearning/sdk/tensorrt-sample-support-guide/index.html?#samples
Most of these TensorRT samples use TensorRT from Python.

Hi santhosh.dc, yes, TensorRT is supported on Nano.

It should be located under /usr/src/tensorrt

Do you see it there?
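If it is easier to check from Python than from a shell, a minimal sketch like this (assuming the default JetPack install location) will tell you whether the samples directory exists and whether the tensorrt module is visible to your interpreter:

import importlib.util
import os

# Default JetPack location for the TensorRT samples (assumption: stock
# JetPack image, no custom install prefix).
samples_dir = "/usr/src/tensorrt"
print("samples directory present:", os.path.isdir(samples_dir))

# Check whether the 'tensorrt' Python module can be located at all.
spec = importlib.util.find_spec("tensorrt")
print("tensorrt Python module found:", spec is not None)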


According to this thread, https://devtalk.nvidia.com/default/topic/1043135/jetson-tx2/aboat-tensorrt-python-api-on-tx2/#, the Python API for TensorRT is not available, the reason being PyCUDA.

I was able to install PyCUDA on the Jetson Nano (https://devtalk.nvidia.com/default/topic/1027116/jetson-tx2/installing-pycuda-on-jetson-tx2/post/5352556/#5352556), with a few failures in the test drivers after installation.
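For anyone following the same route, a short check that PyCUDA actually sees the Nano's GPU after installation, using only the standard PyCUDA API, looks roughly like this:

import pycuda.autoinit          # creates a CUDA context on the default device
import pycuda.driver as cuda

device = cuda.Device(0)         # the Nano has a single integrated GPU
print("Device:", device.name())
print("Compute capability:", device.compute_capability())

free_mem, total_mem = cuda.mem_get_info()
print("Memory (free/total MiB):", free_mem // 2**20, "/", total_mem // 2**20)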

I can’t find any online guide on how to create a Python wrapper around the C++ TensorRT API so that it can be used from Python.

Since that thread was posted, the TensorRT Python API has become available in JetPack for Jetson. You can find the Python samples in /usr/src/tensorrt/samples/python/.
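Once the Python bindings from JetPack are installed, a quick sanity check along these lines should work (the exact version string will depend on your JetPack release):

import tensorrt as trt

print("TensorRT version:", trt.__version__)

# Creating a logger and builder is enough to confirm the Python bindings are
# wired up to the underlying C++ library.
logger = trt.Logger(trt.Logger.WARNING)
builder = trt.Builder(logger)
print("Platform has fast FP16:", builder.platform_has_fast_fp16)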

See the python branch of jetson-inference for this; it contains Python bindings to the C++ code using TensorRT:
https://github.com/dusty-nv/jetson-inference/tree/python
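As a rough sketch of what those bindings look like in use, based on the classification example in that repo (the network name and image path here are just placeholders, so check the imagenet console sample in the repo for the authoritative version):

import jetson.inference
import jetson.utils

# Load a pre-trained classification network (models are downloaded on first use).
net = jetson.inference.imageNet("googlenet")

# Load an image into shared CPU/GPU memory and classify it.
img, width, height = jetson.utils.loadImageRGBA("orange.jpg")
class_idx, confidence = net.Classify(img, width, height)

print("Class:", net.GetClassDesc(class_idx), "confidence:", confidence)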


To anyone who stumbles onto this post with the same issue, clone the python branch:

$ git clone -b python https://github.com/dusty-nv/jetson-inference