TensorRT 5 Python API for Jetson AGX Xavier?

Any idea when we might get a Python API for TensorRT? I have some things I’d like to try that currently run with TensorRT 5 on a P4000, and I’d like to see how they perform with FP16 on the AGX Xavier.



The TensorRT Python API doesn’t support Jetson yet.
But there are two alternatives (a minimal TF-TRT sketch follows the list):

1. Python -> [Wrapper] -> C++ inference

2. TensorFlow-TensorRT
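
For option 2, here is a minimal sketch of converting a frozen TensorFlow graph with TF-TRT. It assumes TensorFlow 1.x with the tensorflow.contrib.tensorrt module; the file name frozen_resnet50.pb and the output node name 'prob' are hypothetical.

import tensorflow as tf
from tensorflow.contrib import tensorrt as tftrt

# Load a frozen GraphDef (file name is hypothetical)
with tf.gfile.GFile('frozen_resnet50.pb', 'rb') as f:
    frozen_graph = tf.GraphDef()
    frozen_graph.ParseFromString(f.read())

# Replace supported subgraphs with TensorRT engines, using FP16
trt_graph = tftrt.create_inference_graph(
    input_graph_def=frozen_graph,
    outputs=['prob'],                    # hypothetical output node name
    max_batch_size=8,
    max_workspace_size_bytes=1 << 28,
    precision_mode='FP16')

# trt_graph can then be imported with tf.import_graph_def and run in a tf.Session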



According to the JetPack 4.2 webpage (https://developer.nvidia.com/embedded/jetpack), the TensorRT Python API is now available on Jetson.

So, is this Python API what we should use now, instead of the “trtexec” command line in a terminal? I’d like to reproduce the benchmarks shown here: https://developer.nvidia.com/embedded/jetson-agx-xavier-dl-inference-benchmarks

But that page only shows command-line invocations, for example:

./trtexec --avgRuns=100 --deploy=resnet50.prototxt --int8 --batch=8 --iterations=10000 --output=prob --useSpinWait

I’ve tried trtexec with and without the ./ , and neither is recognized. I’ve flashed JetPack 4.2 on my Jetson AGX Xavier, and I can confirm that CUDA, cuDNN, and TensorRT are installed using dpkg -l | grep XXXX
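
As a quick sanity check (a minimal sketch; it assumes the JetPack 4.2 Python bindings, e.g. the python3-libnvinfer package, are installed), the TensorRT Python module can be queried directly:

import tensorrt as trt
print(trt.__version__)   # should print a 5.x version on JetPack 4.2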

Is there a new webpage documenting this (the Python API) that I missed? Thank you!

To answer my own question, there is indeed documentation of the Python API: https://docs.nvidia.com/deeplearning/sdk/tensorrt-api/python_api/index.html
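
For reference, here is a minimal sketch of building an FP16 engine from the same ResNet-50 Caffe model with the TensorRT 5 Python API. The file names resnet50.prototxt / resnet50.caffemodel and the output blob 'prob' are taken from the trtexec example above; treat this as an outline rather than a drop-in script.

import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

with trt.Builder(TRT_LOGGER) as builder, \
     builder.create_network() as network, \
     trt.CaffeParser() as parser:
    builder.max_batch_size = 8
    builder.max_workspace_size = 1 << 28   # 256 MiB of build workspace
    builder.fp16_mode = True               # use FP16 kernels on Xavier
    # Parse the Caffe deploy file and weights into the TensorRT network
    model_tensors = parser.parse(deploy='resnet50.prototxt',
                                 model='resnet50.caffemodel',
                                 network=network,
                                 dtype=trt.float32)
    network.mark_output(model_tensors.find('prob'))
    engine = builder.build_cuda_engine(network)
    with open('resnet50_fp16.engine', 'wb') as f:
        f.write(engine.serialize())

The serialized engine can later be reloaded with trt.Runtime(TRT_LOGGER).deserialize_cuda_engine(...) and run through an execution context.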