Converted engine file (mmdet2trt) is not working in DeepStream

Hi @Morganh,

I have trained an object detection model using mmdet HRNet. To run it on DeepStream, I created an engine file using the mmdet2trt converter (GitHub - grimoire/mmdetection-to-tensorrt: convert mmdetection model to tensorrt, support fp16, int8, batch input, dynamic shape etc.). The engine file was generated successfully.

The config I am using is:
config_infer_primary_fasterRCNN.txt

After running the DeepStream command, I get this error:

Any help is appreciated.
Thanks!

The TensorRT versions do not match. Please make sure the TensorRT version used to generate the engine matches the TensorRT version in the environment where you run inference — serialized engines are not portable across TensorRT versions.
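As a minimal sketch of the check implied above (the helper name and version strings are hypothetical, not from the original post): TensorRT engines are tied to the exact version they were built with, so the build-side and runtime-side versions should agree on the full major.minor.patch triplet.

```python
def versions_match(build_ver: str, runtime_ver: str) -> bool:
    # A serialized TensorRT engine is only loadable by the same
    # TensorRT version that built it, so compare major.minor.patch.
    return build_ver.split(".")[:3] == runtime_ver.split(".")[:3]

# Hypothetical example: an engine built under one 8.x release
# will not deserialize under a different 8.x release.
print(versions_match("8.2.1.8", "8.4.0.6"))  # False
print(versions_match("8.2.1.8", "8.2.1.8"))  # True
```

In practice, run `python3 -c "import tensorrt; print(tensorrt.__version__)"` in both the conversion environment and the DeepStream environment and confirm the two printed versions are identical.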
