Hi everyone,
So, I have been using TensorFlow 1.13 and have had no problems parsing TF 1.13 models converted to ONNX (via tf2onnx) with TensorRT 6.0.1.5 on my Windows/Linux laptop.
I have flashed my Jetson Nano memory card with the latest JetPack (which ships TensorRT 5.1.6.1), and its ONNX parser cannot parse my models. I have also tried training a model with TF 1.12 and converting it to ONNX with tf2onnx, but I still get the same error. The error is as follows:
ONNX IR version: 0.0.6
Opset version: 7
Producer name: tf2onnx
Producer version: 1.5.3
Domain:
Model version: 0
Doc string:
WARNING: ONNX model has a newer ir_version (0.0.6) than this parser was built against (0.0.3).
While parsing node number 0 [Mul]:
ERROR: builtin_op_importers.cpp:353 In function importScaleOp:
[8] Assertion failed: get_shape_size(weights.shape) == get_shape_size(dims)
Parsing the ONNX model failed.
Could you please help me figure out what the problem is?
Thanks :)