TensorRT Version: 7.2.0
GPU Type: V100
Nvidia Driver Version:
CUDA Version: 12.0
CUDNN Version:
Operating System + Version: Ubuntu 18.04
Python Version (if applicable): 3.10
TensorFlow Version (if applicable): 2.0
PyTorch Version (if applicable): 1.10
Baremetal or Container (if container which image + tag):
Hi, I am trying to convert a UFF model to ONNX format. Can you let me know the process or procedure? I do not have the base TensorFlow/PyTorch model with me. The model architecture is a customized object detection model.
But that example is for the case where I already have an ONNX model, whereas I only have a UFF model. Do I have to convert it to TensorRT first and then to ONNX, or can I convert it directly to ONNX? Is there an example you can share that performs this process?