TensorRT inference randomly returns NaN

Description

I use onnx-tensorrt to run inference on my model.
In my Python code, I call the TensorRT 8.0 API to load and parse the ONNX model, and it runs well. But in my C++ code, it randomly returns NaN.
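A quick way to confirm whether a given run actually produced NaNs (and how many) is to scan the output buffer after each inference. This is a hedged sketch on the Python side, assuming `output` is the host buffer copied back after execution (the name is illustrative, not from the original post); the same check can be mirrored in the C++ path with `std::isnan`/`std::isinf` to localize which run goes bad.

```python
import numpy as np

def count_non_finite(output):
    """Count NaN/Inf values in an inference output buffer.

    Returns 0 for a healthy output; a non-zero count pinpoints
    the runs where the C++ path diverges from the Python path.
    """
    arr = np.asarray(output, dtype=np.float32)
    return int((~np.isfinite(arr)).sum())
```

Running this after every inference in a loop makes the "randomly" part measurable: you can log which iterations produce non-finite values and compare against the Python results on the same inputs.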

Environment

TensorRT Version: 8.4.1
Nvidia Driver Version: 470.182.03
CUDA Version: 11.4
CUDNN Version: 8.4.0
Operating System + Version: Linux 18.04

Relevant Files

Steps To Reproduce

Hi,
Can you try running your model with the trtexec command, and share the `--verbose` log in case the issue persists?

You can refer to the link below for the full list of supported operators. If any operator is not supported, you need to create a custom plugin to support that operation.

Also, we request you to share your model and script, if not shared already, so that we can help you better.

Meanwhile, for some common errors and queries, please refer to the link below:

Thanks!

verbose from trtexec.odt (74.4 KB)
Hi,
I have uploaded my verbose log.