Tf2onnx

I converted a TensorFlow .pb model to ONNX with tf2onnx, but I get the following error in DeepStream:
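The conversion was done with a tf2onnx command roughly like the one below; the tensor and file names are placeholders rather than the exact ones from my model:

$ python -m tf2onnx.convert --input frozen_graph.pb --inputs input:0 --outputs output:0 --opset 10 --output model.onnx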

ONNX IR version: 0.0.5
Opset version: 10
Producer name: tf2onnx
Producer version: 1.6.2

[07/22/2020-11:48:20] [W] [TRT] /root/onnx-tensorrt/onnx2trt_utils.cpp:235: Your ONNX model has been generated with INT64 weights, while TensorRT does not natively support INT64. Attempting to cast down to INT32.
While parsing node number 101 [Resize]:
ERROR: /root/onnx-tensorrt/builtin_op_importers.cpp:2540 In function importResize:
[8] Assertion failed: scales.is_weights() && "Resize scales must be an initializer!"

How can I fix it?
• Hardware Platform (Jetson / GPU)
• DeepStream Version
• JetPack Version (valid for Jetson only)
• TensorRT Version
• NVIDIA GPU Driver Version (valid for GPU only)

Hi,

Here is a similar issue that was fixed by using the opset-11 ONNX format:
https://github.com/NVIDIA/TensorRT/issues/386#issuecomment-604183816

May I know which opset version you are using?
If it is not opset-11, would you mind giving it a try?
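If you are not sure which opset the model was exported with, you can read it from the ONNX file itself, for example (assuming the onnx Python package is installed and the file is named model.onnx):

$ python -c "import onnx; print(onnx.load('model.onnx').opset_import)"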

You can check if this issue is fixed by running our trtexec binary.

$ /usr/src/tensorrt/bin/trtexec --explicitBatch --onnx=[name].onnx

Thanks.


I have the same issue. I re-exported with opset 11, and now the error is:
ERROR: ModelImporter.cpp:92 In function parseGraph:
[8] Assertion failed: convertOnnxWeights(initializer, &weights, ctx)
ERROR: Failed to parse onnx file
ERROR: failed to build network since parsing model errors.
ERROR: failed to build network.

TensorRT 7 reports an error when loading the ONNX Resize op and says that using a fixed shape works. How can I use a fixed shape in DeepStream?
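If I understand correctly, tf2onnx can pin the input shape at export time by appending the shape to the input name, something like this (the tensor names and the [1,416,416,3] shape are only examples, not my real model's values):

$ python -m tf2onnx.convert --input frozen_graph.pb --inputs input:0[1,416,416,3] --outputs output:0 --opset 11 --output model_fixed.onnx

Is this the right way to get a fixed-shape model that DeepStream/TensorRT can parse?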

Hi,

We would like to reproduce this issue on our environment before giving a further suggestion.
Could you share your ONNX model with us first?

Thanks.

It was solved by using onnx-tensorrt.
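For reference, building the parser and converter from the onnx/onnx-tensorrt repository typically looks roughly like this; the branch and CMake options depend on your TensorRT version, and the TensorRT path below is a placeholder, so treat this as an outline rather than exact steps:

$ git clone --recursive https://github.com/onnx/onnx-tensorrt.git
$ cd onnx-tensorrt && mkdir build && cd build
$ cmake .. -DTENSORRT_ROOT=<path_to_TensorRT>
$ make -j && sudo make install
$ onnx2trt model.onnx -o model.trt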