Accuracy drop when converting from ONNX to TensorRT format

• Hardware Platform (Jetson / GPU) : GeForce RTX 4090
• DeepStream Version : 6.3
• JetPack Version (valid for Jetson only)
• TensorRT Version: 8.5
• NVIDIA GPU Driver Version (valid for GPU only): 535.171.04
• Issue Type (questions, new requirements, bugs): Question

I have a trained YOLOv8 model in ONNX; its validation mAP was 87.95%. However, after converting the model to a TensorRT engine file, the mAP dropped to 84.64%. Could you tell me why this happens, and is there a way to prevent it?

This is TensorRT related. Please raise the topic on the TensorRT forum.
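
Before doing so, it may help to narrow down whether the gap comes from the engine build itself (e.g. an FP16/INT8 build) or from preprocessing/postprocessing differences in the pipeline. Below is a minimal diagnostic sketch, assuming NVIDIA's Polygraphy tool is installed (`python -m pip install polygraphy`); the model path `yolov8.onnx` is a placeholder for your exported ONNX file.

```python
from polygraphy.backend.onnxrt import OnnxrtRunner, SessionFromOnnx
from polygraphy.backend.trt import (CreateConfig, EngineFromNetwork,
                                    NetworkFromOnnxPath, TrtRunner)
from polygraphy.comparator import Comparator

ONNX_PATH = "yolov8.onnx"  # placeholder: path to the exported YOLOv8 ONNX model

# Build a TensorRT engine in full FP32 to rule out reduced-precision effects;
# set fp16=True to reproduce a typical FP16 deployment build.
build_engine = EngineFromNetwork(
    NetworkFromOnnxPath(ONNX_PATH),
    config=CreateConfig(fp16=False),
)

runners = [
    OnnxrtRunner(SessionFromOnnx(ONNX_PATH)),  # reference: ONNX Runtime
    TrtRunner(build_engine),                   # candidate: TensorRT engine
]

# Run both backends on the same (randomly generated) input and compare outputs.
results = Comparator.run(runners)
if bool(Comparator.compare_accuracy(results)):
    print("TensorRT matches ONNX Runtime within tolerance; "
          "check preprocessing/NMS settings in the inference pipeline instead.")
else:
    print("Outputs diverge; inspect precision (FP16/INT8) and per-output tolerances.")
```

If the FP32 engine matches ONNX Runtime but the FP16 build does not, the mAP loss is most likely a precision effect rather than a conversion bug; otherwise the difference is worth reporting on the TensorRT forum with the comparison output attached.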