How to convert a UFF model to ONNX format




TensorRT Version: 7.2.0
GPU Type: V100
Nvidia Driver Version:
CUDA Version: 12.0
CUDNN Version:
Operating System + Version: Ubuntu 18.04
Python Version (if applicable): 3.10
TensorFlow Version (if applicable): 2.0
PyTorch Version (if applicable): 1.10
Baremetal or Container (if container which image + tag):

Hi, I am trying to convert a UFF model to ONNX format. Can you let me know the process or procedure? I do not have the base TensorFlow/PyTorch model with me. The model architecture is a customized object detection model.

Thanks in advance

Load the model in PyTorch, then use the following export command (here model is your loaded network and the dummy input only fixes the input shape):

import torch

dummy_input = torch.rand((200, 200)).cuda()
torch.onnx.export(model, dummy_input, "model.onnx")

Hi,

As mentioned previously, I do not have the PyTorch model with me. I only have the UFF model, and I want to convert UFF → ONNX.

The UFF and Caffe parsers have been deprecated from TensorRT 7 onwards, so we request that you try the ONNX parser instead.
Please check the link below for the same.
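For reference, once you do have an ONNX model, a TensorRT engine can be built without writing any parser code by using the trtexec tool that ships with TensorRT. This is a sketch; model.onnx and model.engine are placeholder file names:

```shell
# Build a TensorRT engine from an ONNX model.
# trtexec is bundled with TensorRT (typically under <TensorRT dir>/bin).
trtexec --onnx=model.onnx --saveEngine=model.engine
```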


But this example is for the case where I already have an ONNX model. I have only a UFF model, so either I have to convert it to TensorRT and then to ONNX, or convert it directly to ONNX. Is there any example you can share that performs this process?


Please try tf2onnx in case it helps you (UFF → .pb → ONNX).
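Once you have recovered a frozen TensorFlow graph (.pb), the second step of that pipeline can be done with the tf2onnx command-line converter. A minimal sketch, assuming a frozen GraphDef; the file name and the input/output tensor names (input:0, output:0) are placeholders you would replace with your model's actual names:

```shell
pip install tf2onnx

python -m tf2onnx.convert \
    --graphdef model.pb \
    --inputs input:0 \
    --outputs output:0 \
    --output model.onnx
```

tf2onnx also accepts --saved-model and --checkpoint instead of --graphdef if your TensorFlow model is in one of those formats.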

If you still need further assistance, please reach out to the ONNX community channels to get better help.

Thank you.