Error on parsing ONNX model

Hi,

I tried importing an ONNX model and got the following error:

ONNX IR version: 0.0.3
Opset version: 6
Producer name: pytorch
Producer version: 0.3
Domain:
Model version: 0
Doc string:

ERROR: (Unnamed Layer* 76) [ElementWise]: elementwise inputs must have same dimensions or follow broadcast rules (input dimensions were [256,16,32] and [256,15,31])
While parsing node number 80 [Conv -> "321"]:
ERROR: /home/erisuser/p4sw/sw/gpgpu/MachineLearning/DIT/release/5.0/parsers/onnxOpenSource/builtin_op_importers.cpp:430 In function importConv:
[8] Assertion failed: tensor_ptr->getDimensions().nbDims == 3
ERROR: failed to parse onnx file
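One common source of an off-by-one spatial mismatch like [256,16,32] vs. [256,15,31] is a pooling layer exported with `ceil_mode=True`, which older ONNX opsets could not represent faithfully. A quick sketch of the output-size arithmetic (the kernel/stride/padding values below are assumed for illustration, not taken from the actual model):

```python
import math

# Hypothetical pooling configuration chosen to reproduce the 16-vs-15
# discrepancy seen in the log; the real layer parameters are not in the post.
size, kernel, stride, pad = 31, 2, 2, 0

# Standard (floor) output-size formula used by the ONNX pooling default.
floor_out = math.floor((size + 2 * pad - kernel) / stride) + 1
# With ceil_mode=True the framework rounds up instead, adding one element.
ceil_out = math.ceil((size + 2 * pad - kernel) / stride) + 1

print(floor_out, ceil_out)  # 15 vs 16 for a 31-wide input
```

If the two branches feeding the failing ElementWise layer disagree on this rounding, their shapes end up one element apart exactly as in the error above.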

I also see that the TensorRT guide says: "The ONNX Parser shipped with TensorRT 5.0 RC supports ONNX IR (Intermediate Representation) version 0.0.3, opset version 7. In general, the newer version of the ONNX Parser is designed to be backward compatible; therefore, encountering a model file produced by an earlier version of the ONNX exporter should not cause a problem."

Did anyone else encounter a similar problem/situation?

Hello,

To help us debug, could you please share a small repro package including the ONNX model and the source used to import it? You can DM me if you don't want to share it publicly on the forum.

regards,
NVIDIA Enterprise Support

Hey, is there a solution for this in the meantime?