Hi @AK51, I’ve not tried the TAO Toolkit pose estimation models through jetson-inference before. They very likely require different pre/post-processing than what the jetson-inference poseNet code does today (which was written to support models from trt_pose). I will look into this, but for now I would recommend running the TAO pose models through DeepStream instead.
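To illustrate why the pre-processing differs: trt_pose models (which jetson-inference poseNet targets) expect the standard torchvision-style input normalization, while TAO models generally ship with their own normalization and output decoding. Below is a minimal NumPy sketch of that trt_pose-style pre-processing; the function name and the assumption that the frame is already resized RGB are mine, not from jetson-inference.

```python
import numpy as np

def trt_pose_preprocess(frame):
    """Normalize an HWC uint8 RGB frame the way trt_pose-style models expect:
    scale to [0, 1], apply ImageNet mean/std, and transpose to CHW.
    Assumes the frame has already been resized to the network input size."""
    mean = np.array([0.485, 0.456, 0.406], dtype=np.float32)
    std = np.array([0.229, 0.224, 0.225], dtype=np.float32)
    x = frame.astype(np.float32) / 255.0   # uint8 [0,255] -> float [0,1]
    x = (x - mean) / std                   # per-channel ImageNet normalization
    return x.transpose(2, 0, 1)            # HWC -> CHW for the TensorRT input
```

A TAO pose model trained with a different normalization (or a different keypoint/heatmap output layout) would produce garbage if fed through this path unchanged, which is why DeepStream with the matching TAO config is the safer route today.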
Related topics
| Topic | Replies | Views | Activity |
|---|---|---|---|
| How to use .trt file for inference on jetson nano | 4 | 1553 | October 18, 2021 |
| How to use tlt trained model on Jetson Nano | 7 | 2111 | October 12, 2021 |
| How to run purpose built model Peoplenet on Jetson Nano in my own application? | 11 | 3744 | September 7, 2021 |
| Python App Custom Model on the Jetson Nano | 10 | 1184 | October 12, 2021 |
| TensorRT Jetson Nano ONNX Inference | 1 | 550 | August 25, 2020 |
| TRT POSE integration | 2 | 530 | October 12, 2021 |
| Doing tlt inference only with tensorrt | 3 | 948 | October 9, 2021 |
| trt engine inference in python without deepstream | 9 | 1423 | October 12, 2021 |
| Failed to generate TRT .engine from ONNX model generated using TAO | 7 | 85 | November 18, 2024 |
| Deepstream6.3 tao_pretrained_models \| peopleNet .onnx to .engine | 5 | 591 | January 16, 2024 |