TensorRT inference error

Description

python: engine.cpp:1104: bool nvinfer1::rt::Engine::deserialize(const void*, std::size_t, nvinfer1::IGpuAllocator&, nvinfer1::IPluginFactory*): Assertion `size >= bsize && "Mismatch between allocated memory size and expected size of serialized engine."' failed.
Aborted (core dumped)

Environment

TensorRT Version: 5.1.5
GPU Type: GeForce RTX 2060
Nvidia Driver Version: 418.67
CUDA Version: 10.1
CUDNN Version: 10.1
Operating System + Version: Ubuntu 16.04
Python Version (if applicable): 3.6.8
PyTorch Version (if applicable): torch 1.6.0+cu101

YOLOv5 (PyTorch)
I am using a pretrained YOLOv5 model. I have successfully created the yolov5s.engine file, but something goes wrong in the yolov5_trt.py file.
Reference link: tensorrtx/yolov5 at master · wang-xinyu/tensorrtx · GitHub
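
For context, the failing call is the engine deserialization step. A minimal sketch of what that step typically looks like in Python (the file name yolov5s.engine is taken from the post above; this assertion usually fires when the bytes handed to the runtime do not form a complete serialized engine, e.g. a partially read file or an engine built with a different TensorRT version):

import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

# Read the serialized engine in binary mode. A truncated or text-mode
# read is one common cause of the "Mismatch between allocated memory
# size and expected size of serialized engine" assertion.
with open("yolov5s.engine", "rb") as f:  # file name from the post above
    engine_data = f.read()

runtime = trt.Runtime(TRT_LOGGER)
engine = runtime.deserialize_cuda_engine(engine_data)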

Hi @yugendra,
You are using an old version of TRT.
We recommend you use the latest release.
https://developer.nvidia.com/nvidia-tensorrt-7x-download
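
Note that serialized engines are tied to the TensorRT version that built them, so yolov5s.engine will need to be regenerated after upgrading. A quick way to confirm the installed runtime version:

import tensorrt as trt

# Engines serialized by one TensorRT version generally cannot be
# deserialized by another; rebuild the .engine file after upgrading.
print(trt.__version__)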

Thanks!

Hi @AakankshaS,

Thank you for the quick response.