I am getting the following error while running inference on a TensorRT engine.
The engine was built from the object detection model `ssd_mobilenet_v2_fpnlite_320x320_coco17_tpu-8`.
[TensorRT] ERROR: 2: [pluginV2DynamicExtRunner.cpp::execute::115] Error Code 2: Internal Error (Assertion status == kSTATUS_SUCCESS failed.)
TensorRT version: 126.96.36.199
ONNX version: 1.10.2
I have uploaded the ONNX model, the TRT engine file, and the inference script I am using.
To convert the ONNX model to a TRT engine, I followed the code in this post: https://developer.nvidia.com/blog/speeding-up-deep-learning-inference-using-tensorflow-onnx-and-tensorrt/
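For context, the blog post builds the engine with the TensorRT Python builder API; as a sketch (file names here are placeholders for my actual files), an equivalent engine can also be built with `trtexec`, which is a useful cross-check for whether the error comes from my inference script or from the engine build itself:

```shell
# Hypothetical paths; the blog post uses the Python builder API, but
# trtexec builds an engine from the same ONNX file and reports any
# parser/builder errors directly on the console.
trtexec --onnx=ssd_mobilenet_v2_fpnlite_320x320.onnx \
        --saveEngine=ssd_mobilenet_v2_fpnlite_320x320.trt \
        --fp16 \
        --verbose
```

If `trtexec` builds and runs the engine cleanly (it also times inference by default), that would point to the inference script rather than the conversion.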
Please have a look.