Decoding .etlt files to ONNX with batch size defined

Hi,

So, I have a .etlt model file for a yolo_v4 model generated from a tao3.xtf1.15 container, and I am using the code given in tao_toolkit_recipes/tao_forum_faq/FAQ.md at main · NVIDIA-AI-IOT/tao_toolkit_recipes · GitHub to convert it to .onnx. The issue I am facing is that when I try to validate the model with the PyPI onnx library's onnx.checker.check_model(model), I get the following error:

Traceback (most recent call last):
  File "/opt/paralaxiom/vast/test/scripts/validate_onnx_model.py", line 5, in <module>
    onnx.checker.check_model(model)
  File "/home/vast/miniconda3/envs/onnx_env/lib/python3.12/site-packages/onnx/checker.py", line 179, in check_model
    C.check_model(
onnx.onnx_cpp2py_export.checker.ValidationError: Field 'shape' of 'type' is required but missing.

I checked the model graph and I can see that the input batch dimension is dynamic, which may be the reason for the incomplete shape tensors. How do I debug this? I suspect the error would be resolved if I exported the model with the batch size defined.
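In case it helps with debugging, below is a minimal sketch of how the input shapes could be inspected through the standard onnx protobuf API before running the checker (the model path is a placeholder):

import onnx

model = onnx.load("path/to/your/model.onnx")  # placeholder path
for graph_input in model.graph.input:
    dims = graph_input.type.tensor_type.shape.dim
    # dim_param holds the symbolic name of a dynamic dimension; dim_value holds a fixed size.
    # If the 'shape' field is missing entirely (what the checker complains about), dims is empty here.
    print(graph_input.name, [d.dim_param if d.dim_param else d.dim_value for d in dims])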

Please try modifying the dynamic batch size of "-1" to a fixed batch size using ONNX GraphSurgeon.
For example:

  1. First, import the necessary libraries:
import onnx
import onnx_graphsurgeon as gs
  2. Load the ONNX model:
model = onnx.load("path/to/your/model.onnx")
graph = gs.import_onnx(model)
  3. Modify the input shapes:
for graph_input in graph.inputs:
    graph_input.shape[0] = 64  # Replace 64 with your desired batch size
  4. Update any Reshape nodes that might be using dynamic batch size:
for node in graph.nodes:
    if node.op == 'Reshape':
        shape = node.inputs[1]
        if isinstance(shape, gs.Constant):
            new_shape = shape.values.copy()  # Copy so the original constant is not mutated in place
            if new_shape[0] == -1:
                new_shape[0] = 64  # Replace 64 with your desired batch size
                node.inputs[1] = gs.Constant(name=shape.name, values=new_shape)
  5. Export the modified graph back to ONNX:
onnx.save(gs.export_onnx(graph), "path/to/your/modified_model.onnx")
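
After saving, you can re-run the checker on the modified file and print its input shapes to confirm the batch dimension is now fixed (a minimal sketch, reusing the output path from step 5):

import onnx

model = onnx.load("path/to/your/modified_model.onnx")
onnx.checker.check_model(model)  # should pass once all inputs carry fully specified shapes
for graph_input in model.graph.input:
    dims = [d.dim_param if d.dim_param else d.dim_value for d in graph_input.type.tensor_type.shape.dim]
    print(graph_input.name, dims)  # the first dimension should now be 64 (or your chosen batch size)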
