TensorRT Parsing ONNX Model Error

Hi NVES,

Thanks very much.

  1. I ran the check_model.py snippet on my ONNX model file. It completed quietly with no output. Since onnx.checker.check_model raises an exception when validation fails, I take the silent completion to mean the model validated successfully. The full code of my check_model.py is below.
    import onnx

    filename = r"C:\mypath\mymodel.onnx"
    model = onnx.load(filename)
    # Raises an exception (e.g. onnx.checker.ValidationError) if the model is invalid
    onnx.checker.check_model(model, full_check=True)

  2. As for the full log, could I work with NVIDIA directly instead of sharing that information on the public forum?