Cannot convert ONNX model using tao_byom

Hello.

I have a YOLO ONNX model which I downloaded from here.

I created a Conda environment and followed the steps in the tao_byom documentation here. When I try to convert the model, it throws this error: `google.protobuf.message.DecodeError: Error parsing message with type 'onnx.ModelProto'`

Please help me understand/resolve this issue.

• Hardware: x86 (RTX 3090 and AMD Threadripper)
• Network Type: YOLOv3
• How to reproduce the issue? Download any ONNX model from the link above and try to convert it with tao_byom; the error will reproduce.

Hi,
From BYOM Converter — TAO Toolkit 3.22.05 documentation
Currently, only the following models are supported by the TAO BYOM Converter:

  • Classification
  • UNet

So, if I understand correctly, we cannot directly convert any object detection model using TAO BYOM. For any other model, you are forcing the user to run tao-converter on the same system where they want to deploy the model (source).

I have an ONNX model which I want to convert to a .plan file to deploy with ROS. If I use tao-converter, it won't let me convert because I have to provide an encoding key, which I don't have for this ONNX model.

How do you convert the model then? I am confused.

Thanks for the quick reply
Mayank

Yes, for BYOM, currently only UNet and Classification models are supported by the TAO BYOM Converter.

The tao-converter tool is used to convert an .etlt model into a TensorRT engine.

If the .onnx model is not UNet or Classification, it currently cannot be converted into a pretrained model for use with TAO.

But you can build a TensorRT engine from ONNX using the trtexec tool. See Quick Start Guide :: NVIDIA Deep Learning TensorRT Documentation. This is not a TAO topic, though; it is a TensorRT topic.
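For reference, a minimal trtexec invocation along those lines might look like this (file names are placeholders, and `--fp16` is optional; trtexec ships with the TensorRT installation and needs no encoding key for plain ONNX models):

```shell
# Build a TensorRT engine (.plan) directly from an ONNX model.
trtexec --onnx=yolov3.onnx \
        --saveEngine=yolov3.plan \
        --fp16
```

Note that the resulting engine is specific to the GPU and TensorRT version it was built with, so run this on the deployment machine (here, the RTX 3090 box).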


This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.