TensorRT output issue


When I run inference with TensorRT in C++, all of the confidence values in the model's output are very small, so no detection results are produced. However, when I run inference with TensorRT in Python, it works well. Can you give me some advice? Thank you!


TensorRT Version: 8.4.1
GPU Type: 3060
Nvidia Driver Version: 463
CUDA Version: 11.2
CUDNN Version: 8.2.1
Operating System + Version: Windows 10


Could you please share a minimal repro with us (model and scripts) so we can try it from our end for better debugging?

Thank you.