Hi, I have installed TensorFlow, and I want to know: is TensorRT's Python API supported on the TX2 now?
Thanks!
Hi,
The TensorRT Python API is not supported on the Jetson platform, due to its pyCUDA dependency.
However, the Python parser works well.
Here are some alternatives for your reference:
1. Python → [Wrapper] → C++ inference
https://github.com/AastaNV/ChatBot
2. TensorFlow-TensorRT
https://github.com/NVIDIA-Jetson/tf_trt_models
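The first alternative, calling a compiled C++ inference library from Python through a thin wrapper, can be sketched with `ctypes` from the standard library. Below, the C math library stands in for your own compiled inference library; a real TensorRT wrapper would export its own entry points (an `infer(float*, int)`-style function, for example), so the library and function names here are only illustrative:

```python
import ctypes
import ctypes.util

# Load a shared library. Here we use the C math library as a stand-in
# for a compiled C++ inference library built around the TensorRT C++ API.
libm = ctypes.CDLL(ctypes.util.find_library("m"))

# Declare the C signature before calling, just as you would for an
# exported inference entry point.
libm.sqrt.restype = ctypes.c_double
libm.sqrt.argtypes = [ctypes.c_double]

print(libm.sqrt(16.0))  # → 4.0
```

The same pattern applies to a TensorRT engine: keep all engine creation and execution in C++, expose a small C-style interface, and call it from Python.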
Thanks.
I’ve created a TensorRT GoogLeNet example in which I used Cython to wrap the C++ code, so that I can run TensorRT inference directly from Python. Hope it helps.
- Running TensorRT Optimized GoogLeNet on Jetson Nano: https://jkjung-avt.github.io/tensorrt-googlenet/
- jkjung-avt/tensorrt_demos: https://github.com/jkjung-avt/tensorrt_demos
The code was tested on a Jetson Nano, but it should work on the Jetson TX2 as well.
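To illustrate the Cython approach, here is a minimal sketch of what such a wrapper might look like. The header name, class name, and method signatures are hypothetical placeholders, not the actual API of the tensorrt_demos repository, and the real thing must be compiled against the TensorRT headers and libraries:

```cython
# trt_wrapper.pyx -- illustrative sketch only; all names are placeholders.
# distutils: language = c++
# distutils: libraries = nvinfer

# Declare the C++ class exported by your own inference code.
cdef extern from "trt_engine.h":
    cdef cppclass TrtEngine:
        TrtEngine(const char* plan_path) except +
        void infer(float* input, float* output, int batch)

cdef class PyTrtEngine:
    cdef TrtEngine* engine

    def __cinit__(self, plan_path: bytes):
        self.engine = new TrtEngine(plan_path)

    def __dealloc__(self):
        del self.engine

    def infer(self, float[::1] inp, float[::1] out, int batch):
        # Typed memoryviews give direct pointers into NumPy arrays,
        # so no data is copied on the way into the C++ engine.
        self.engine.infer(&inp[0], &out[0], batch)
```

You would build this with a standard `setup.py` that calls `cythonize()`, then import the resulting extension module from Python like any other module.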