I tried exporting as described, but failed with both `dynamo_export` and `export`. I wonder if it is possible to export through TAO instead.
There has been no update from you for a while, so we assume this is no longer an issue and are closing this topic. If you need further support, please open a new one. Thanks.
A .pth model trained via the TAO Toolkit can certainly be exported to an ONNX file.
For example, see https://github.com/NVIDIA/tao_pytorch_backend/blob/e5010af08121404dfb696152248467eee85ab3a7/nvidia_tao_pytorch/cv/dino/scripts/export.py.
The command line can be found in the corresponding user guide or notebook.
Refer to DINO - NVIDIA Docs.
There are several dockers from TAO. See TAO Toolkit | NVIDIA NGC. One is from the PyTorch backend.
You can log in to the docker and run the export command.
$ docker run --runtime=nvidia -it --rm nvcr.io/nvidia/tao/tao-toolkit:5.1.0-pyt /bin/bash
Then, inside the docker, run the command without "tao" at the beginning. For example, for the DINO network,
$ dino export xxx
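The interactive steps above can also be collapsed into a single non-interactive invocation. This is a sketch only: the mounted host path and the `dino export` arguments (including the `-e` spec flag and the spec file location) are hypothetical placeholders; take the real arguments from the DINO user guide or notebook.

```shell
# One-shot variant of the interactive steps above.
# /local/workspace and the export arguments are hypothetical placeholders.
docker run --runtime=nvidia --rm \
  -v /local/workspace:/workspace \
  nvcr.io/nvidia/tao/tao-toolkit:5.1.0-pyt \
  dino export -e /workspace/specs/export.yaml
```

Mounting a host directory with `-v` lets the exported ONNX file survive after the container exits, since `--rm` deletes the container's own filesystem.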
This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.