Hi,
We’ve been using TensorRT for several years with different neural networks on different platforms: Jetson (Xavier), desktop (2080), server (T4), …
We’ve just started supporting Jetson Orin with our current models, and we have found an odd issue: some networks return different output values on Jetson Orin AGX with JetPack 5.1.
I have created an example using pose estimation to reproduce the problem; you can download the code from the following link:
Steps to reproduce the problem:
1 - Install onnxruntime
$> pip3 install onnxruntime
2 - Build the plan file using trtexec
$> ./trtexec_pose.sh build
3 - Run the plan file and save the outputs
$> ./trtexec_pose.sh infer
4 - Get results for the same input using onnxruntime (see the sketch after the steps)
$> python3 onnxruntime_pose.py
5 - Finally, compare the trtexec and onnxruntime results (see the second sketch after the steps)
$> python3 check_results.py
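
For reference, this is roughly what the onnxruntime reference run looks like. It is a minimal sketch, not the exact onnxruntime_pose.py from the archive; the model path, input file, and output file names here are placeholders:

# Sketch of the onnxruntime reference run (file and tensor names are placeholders).
import numpy as np
import onnxruntime as ort

sess = ort.InferenceSession("pose.onnx", providers=["CPUExecutionProvider"])
input_name = sess.get_inputs()[0].name

# Load the same input tensor that was fed to trtexec so both runs are comparable.
x = np.load("input.npy").astype(np.float32)

# Run the network and save the outputs for the comparison step.
outputs = sess.run(None, {input_name: x})
np.save("onnxruntime_output.npy", outputs[0])
print("saved onnxruntime_output.npy, shape:", outputs[0].shape)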
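
The comparison step is essentially the following (again a sketch, assuming both runs dumped their outputs to .npy files; the file names and the tolerance are placeholders, not the exact check_results.py):

# Sketch of the comparison between trtexec and onnxruntime outputs.
import numpy as np

trt_out = np.load("trtexec_output.npy").astype(np.float32)
ort_out = np.load("onnxruntime_output.npy").astype(np.float32)

diff = np.abs(trt_out - ort_out)
print("max abs diff :", diff.max())
print("mean abs diff:", diff.mean())

if np.allclose(trt_out, ort_out, rtol=1e-3, atol=1e-3):
    print("PASS: trtexec and onnxruntime outputs match within tolerance")
else:
    print("FAIL: trtexec and onnxruntime outputs differ")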
The same steps on Jetson Xavier AGX (JetPack 4.5), Tesla T4, RTX 2080 Ti, … give equivalent results whether we use onnxruntime or TensorRT.
We have confirmed this issue can be reproduced on Orin with JetPack 5.1.1.
Our internal team is checking on this. We will share more information with you later.