Problem converting trt_pose model to ONNX

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU)
Orin NX 16 GB - Seeed A608 carrier

• DeepStream Version

• JetPack Version (valid for Jetson only)

• TensorRT Version

• Issue Type( questions, new requirements, bugs)
Cannot correctly convert the .pth model to ONNX; the resulting model is not usable in DeepStream.

• How to reproduce the issue ? (This is for bugs. Including which sample app is using, the configuration files content, the command line used and other details for reproducing)
I am trying to integrate trt_pose into a DeepStream pipeline, so I am following the instructions from this tutorial:
and this repository: GitHub - NVIDIA-AI-IOT/deepstream_pose_estimation: This is a DeepStream application to demonstrate a human pose estimation pipeline.

When I try to convert the .pth file to ONNX format using the script as explained here:
I obtain an ONNX model. When I include this model in my DeepStream pipeline, it does not work as expected (the joints are drawn randomly).

On the other hand, if I use the ONNX file generated by the author of the tutorial, which is here: deepstream_pose_estimation/pose_estimation.onnx at master · NVIDIA-AI-IOT/deepstream_pose_estimation · GitHub, it runs correctly in my pipeline (the body pose looks correct).

In both cases the DeepStream pipeline is the same; the only difference is that in the first case I use the ONNX file generated by the export_for_isaac script, and in the second case I take the ONNX file stored in the GitHub repository.

As the repository is old, I am wondering whether this problem is related to my TensorRT or PyTorch version.
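To make the version question concrete, here is a small report script you can run on the Jetson and paste into the thread; the package names listed are the ones trt_pose typically depends on (an assumption on my part), and missing packages are reported rather than raising:

```python
# Print the versions relevant to the .pth -> ONNX export so they can be
# compared against what the repository was built with.
import importlib

for name in ("torch", "torch2trt", "tensorrt", "onnx"):
    try:
        mod = importlib.import_module(name)
        print(name, getattr(mod, "__version__", "unknown"))
    except ImportError:
        print(name, "not installed")
```

Filling the empty DeepStream/JetPack/TensorRT fields at the top of this template with these values will also make the issue easier for others to reproduce.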

Could you please help me with this?
Are you able to reproduce this issue?