Missing output dimension when converting ONNX to TensorRT

• Hardware Platform (Jetson / GPU) GPU
• DeepStream Version 5.0
• TensorRT Version 7.1.3.4
• NVIDIA GPU Driver Version (valid for GPU only) CUDA 10.2
Hi,
I am trying to convert an ONNX model to TensorRT. The ONNX model outputs a 78x1x246 tensor, but after converting to TensorRT the output only has 1x246 dims and the 78 dimension is lost. I checked the ONNX file with onnxruntime, and it outputs a 78x1x246 tensor.
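
For reference, this is roughly how I checked the shape with onnxruntime (a minimal sketch; the dummy input assumes a single input with fully-defined dimensions, and the local file name is assumed to be crnn.onnx):

import numpy as np
import onnxruntime as ort

# Load the model and build a dummy input matching the declared input shape.
sess = ort.InferenceSession("crnn.onnx")
inp = sess.get_inputs()[0]
shape = [d if isinstance(d, int) else 1 for d in inp.shape]
dummy = np.random.rand(*shape).astype(np.float32)

# Run inference and print the output shape; onnxruntime reports (78, 1, 246).
outputs = sess.run(None, {inp.name: dummy})
print(outputs[0].shape)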
Do you know why this happens? How can I fix it?
Thanks.

Hi,

How do you convert the ONNX model into TensorRT?
Would you mind trying it with trtexec first?

$ /usr/src/tensorrt/bin/trtexec --onnx=[your/model/name] --verbose

Thanks.

Hi,
I ran the command below.

$ /usr/src/tensorrt/bin/trtexec --onnx=[your/model/name] --verbose

Thanks.

Here is my ONNX file: https://drive.google.com/file/d/1Prh6zd9kz9AbMmC9XyfcADqXTOi9VnTl/view?usp=sharing
Can you try converting it to TensorRT? Thanks so much.

Hi,

Would you mind sharing the log output from trtexec with us?
Thanks.

Hi,
The log outputs nothing, and I can run inference with the engine file in DeepStream, but the output is incorrect.
Thanks.

Hi,

Thanks for sharing your model.
We can reproduce this issue internally and will pass it to our internal team for further checking.

We will share more information with you once we get any feedback.

Hi,

We have double-checked this issue, and TensorRT does generate the correct 78x1x246 output without any issue.

Could you check it again?

$ /usr/src/tensorrt/bin/trtexec --onnx=crnn.onnx  --dumpOutput


[11/06/2020-11:02:22] [I] median: 19.316 ms
[11/06/2020-11:02:22] [I] percentile: 19.3729 ms at 99%
[11/06/2020-11:02:22] [I] total compute time: 3.03206 s
[11/06/2020-11:02:22] [I] Output Tensors:
[11/06/2020-11:02:22] [I] output0: (78x1x246)
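
If it helps, you can also inspect the engine's binding shapes programmatically. Below is a minimal sketch with the TensorRT Python API (the engine file name crnn.engine is an assumption; you can produce it with trtexec --saveEngine=crnn.engine):

import tensorrt as trt

# Deserialize the engine saved by trtexec and print every binding shape.
TRT_LOGGER = trt.Logger(trt.Logger.WARNING)
with open("crnn.engine", "rb") as f, trt.Runtime(TRT_LOGGER) as runtime:
    engine = runtime.deserialize_cuda_engine(f.read())

for i in range(engine.num_bindings):
    kind = "input" if engine.binding_is_input(i) else "output"
    # For this model the output binding should report (78, 1, 246).
    print(kind, engine.get_binding_name(i), tuple(engine.get_binding_shape(i)))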

Thanks.