Description
Is there TensorRT inference sample code in Python that caters to inference with dynamic batch sizes?
I am running inference on a Jetson AGX Orin using JetPack 5.1.
Hi,
Please refer to the link below for the sample guide.
Refer to the installation steps in the link in case you are missing anything.
However, the suggested approach is to use the TRT NGC containers to avoid any system-dependency-related issues.
To run the Python samples, make sure the TRT Python packages are installed while using the NGC container:
/opt/tensorrt/python/python_setup.sh
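For reference, below is a minimal sketch (not an official sample) of how inference with a dynamic batch dimension can look with the TensorRT Python API and PyCUDA. It assumes the engine was already built with an optimization profile covering the desired batch range (for example via trtexec with --minShapes/--optShapes/--maxShapes), that only the batch dimension is dynamic, and that there is a single input binding followed by a single output binding. The engine path, batch size, and dummy input data are placeholders.

import numpy as np
import pycuda.autoinit  # noqa: F401 - creates and activates a CUDA context
import pycuda.driver as cuda
import tensorrt as trt

ENGINE_PATH = "model.engine"  # placeholder: serialized engine built with a dynamic-batch profile
BATCH_SIZE = 4                # any value within the profile's [min, max] batch range

logger = trt.Logger(trt.Logger.WARNING)

# Deserialize the engine and create an execution context.
with open(ENGINE_PATH, "rb") as f, trt.Runtime(logger) as runtime:
    engine = runtime.deserialize_cuda_engine(f.read())
context = engine.create_execution_context()

stream = cuda.Stream()
bindings = [None] * engine.num_bindings
host_in = host_out = dev_in = dev_out = None

for idx in range(engine.num_bindings):
    dtype = trt.nptype(engine.get_binding_dtype(idx))
    if engine.binding_is_input(idx):
        # Resolve the dynamic batch dimension (-1) to the actual batch size.
        shape = list(engine.get_binding_shape(idx))
        shape[0] = BATCH_SIZE
        context.set_binding_shape(idx, tuple(shape))
        host_in = cuda.pagelocked_empty(shape, dtype)
        host_in[...] = np.random.random(shape)  # dummy input data
        dev_in = cuda.mem_alloc(host_in.nbytes)
        bindings[idx] = int(dev_in)
    else:
        # Output shapes are only known after all input shapes have been set.
        shape = list(context.get_binding_shape(idx))
        host_out = cuda.pagelocked_empty(shape, dtype)
        dev_out = cuda.mem_alloc(host_out.nbytes)
        bindings[idx] = int(dev_out)

# Copy the input to the device, run inference, and copy the result back.
cuda.memcpy_htod_async(dev_in, host_in, stream)
context.execute_async_v2(bindings=bindings, stream_handle=stream.handle)
cuda.memcpy_dtoh_async(host_out, dev_out, stream)
stream.synchronize()

print("Output shape:", host_out.shape)

The key step for dynamic batching is setting the actual input shape on the execution context before allocating output buffers and launching inference; any batch size inside the optimization profile's range can then be used per invocation.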
If you are trying to run a custom model, please share your model and script with us so that we can assist you better.
Thanks!