DeepStream YOLOv3: "getPluginCreator could not find plugin YoloLayerV3_TRT version 1"

Hi,
I ran DeepStream with YOLOv3 using “deepstream-app -c deepstream_app_config_yoloV3.txt” and everything works normally, but when I run it with this pipeline:

gst-launch-1.0 filesrc location=/opt/nvidia/deepstream/deepstream-4.0/samples/streams/sample_1080p_h264.mp4 ! qtdemux ! h264parse ! nvv4l2decoder ! m.sink_0 nvstreammux name=m batch-size=1 width=1920 height=1080 ! nvinfer config-file-path=/opt/nvidia/deepstream/deepstream-4.0/sources/objectDetector_Yolo/config_infer_primary_yoloV3.txt batch-size=1 unique-id=1 ! nvvideoconvert ! nvdsosd ! nveglglessink

I got an error like this.

0:02:22.411401748 10264 0x55abf7579400 INFO                 nvinfer gstnvinfer.cpp:519:gst_nvinfer_logger:<nvinfer0> NvDsInferContext[UID 1]:generateTRTModel(): Storing the serialized cuda engine to file at /opt/nvidia/deepstream/deepstream-4.0/sources/objectDetector_Yolo/model_b1_int8.engine
0:02:23.409265274 10264 0x55abf7579400 ERROR                nvinfer gstnvinfer.cpp:511:gst_nvinfer_logger:<nvinfer0> NvDsInferContext[UID 1]:log(): INVALID_ARGUMENT: getPluginCreator could not find plugin YoloLayerV3_TRT version 1
0:02:23.409307270 10264 0x55abf7579400 ERROR                nvinfer gstnvinfer.cpp:511:gst_nvinfer_logger:<nvinfer0> NvDsInferContext[UID 1]:log(): safeDeserializationUtils.cpp (259) - Serialization Error in load: 0 (Cannot deserialize plugin since corresponding IPluginCreator not found in Plugin Registry)
0:02:23.409339140 10264 0x55abf7579400 ERROR                nvinfer gstnvinfer.cpp:511:gst_nvinfer_logger:<nvinfer0> NvDsInferContext[UID 1]:log(): INVALID_STATE: std::exception
0:02:23.409616195 10264 0x55abf7579400 ERROR                nvinfer gstnvinfer.cpp:511:gst_nvinfer_logger:<nvinfer0> NvDsInferContext[UID 1]:log(): INVALID_CONFIG: Deserialize the cuda engine failed.
0:02:23.413256084 10264 0x55abf7579400 ERROR                nvinfer gstnvinfer.cpp:511:gst_nvinfer_logger:<nvinfer0> NvDsInferContext[UID 1]:initialize(): Failed to create engine from serialized stream
0:02:23.413296975 10264 0x55abf7579400 WARN                 nvinfer gstnvinfer.cpp:692:gst_nvinfer_start:<nvinfer0> error: Failed to create NvDsInferContext instance
0:02:23.413321517 10264 0x55abf7579400 WARN                 nvinfer gstnvinfer.cpp:692:gst_nvinfer_start:<nvinfer0> error: Config file path: /opt/nvidia/deepstream/deepstream-4.0/sources/objectDetector_Yolo/config_infer_primary_yoloV3.txt, NvDsInfer Error: NVDSINFER_TENSORRT_ERROR
ERROR: Pipeline doesn't want to pause.
Got context from element 'eglglessink0': gst.egl.EGLDisplay=context, display=(GstEGLDisplay)NULL;
ERROR: from element /GstPipeline:pipeline0/GstNvInfer:nvinfer0: Failed to create NvDsInferContext instance
Additional debug info:
gstnvinfer.cpp(692): gst_nvinfer_start (): /GstPipeline:pipeline0/GstNvInfer:nvinfer0:
Config file path: /opt/nvidia/deepstream/deepstream-4.0/sources/objectDetector_Yolo/config_infer_primary_yoloV3.txt, NvDsInfer Error: NVDSINFER_TENSORRT_ERROR
Setting pipeline to NULL ...
Freeing pipeline ...
double free or corruption (!prev)
Aborted (core dumped)

I am using an RTX 2060, CUDA 10.2, TensorRT 7, and DeepStream 4.0.
My questions are: what is the “YoloLayerV3_TRT version 1” plugin? How do I install it? And can DeepStream run YOLOv3 with “config_infer_primary_yoloV3.txt”?

Hi @tunggalmat
For the error “could not find plugin YoloLayerV3_TRT version 1”: did you run “make -C nvdsinfer_custom_impl_Yolo”? Could you confirm that the library /opt/nvidia/deepstream/deepstream-4.0/lib/libnvdsinfer_custom_impl_Yolo.so exists?

root@addf398b188a:/opt/nvidia/deepstream/deepstream-4.0/sources/objectDetector_Yolo# make -C nvdsinfer_custom_impl_Yolo
make: Entering directory '/opt/nvidia/deepstream/deepstream-5.0/sources/objectDetector_Yolo/nvdsinfer_custom_impl_Yolo'
g++ -o libnvdsinfer_custom_impl_Yolo.so nvdsinfer_yolo_engine.o nvdsparsebbox_Yolo.o yoloPlugins.o trt_utils.o yolo.o kernels.o -shared -Wl,--start-group -lnvinfer_plugin -lnvinfer -lnvparsers -L/usr/local/cuda-10.2/lib64 -lcudart -lcublas -lstdc++fs -Wl,--end-group
make: Leaving directory '/opt/nvidia/deepstream/deepstream-5.0/sources/objectDetector_Yolo/nvdsinfer_custom_impl_Yolo'
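
To double-check on your side, something like the following should work (the paths assume the default DeepStream 4.0 install; the nm/grep step is only a rough check that the YOLO plugin code was compiled into the library):

# confirm the custom library is present
ls -l /opt/nvidia/deepstream/deepstream-4.0/lib/libnvdsinfer_custom_impl_Yolo.so

# rough check that YoloLayerV3-related symbols are exported by the library
nm -DC /opt/nvidia/deepstream/deepstream-4.0/lib/libnvdsinfer_custom_impl_Yolo.so | grep -i yolo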

Hi @mchi
Yes, I ran “make -C nvdsinfer_custom_impl_Yolo” before doing anything else, and I confirmed the library exists at /opt/nvidia/deepstream/deepstream-4.0/lib/libnvdsinfer_custom_impl_Yolo.so.

I also tried running DeepStream with YOLOv3 on a Jetson Nano and a DGX-2, and everything worked fine with none of the errors above. This problem only occurs on my RTX 2060.

So, what should I do? Is there another way to install the YoloLayerV3_TRT version 1 plugin?

“YoloLayerV3_TRT” is implemented in libnvdsinfer_custom_impl_Yolo.so, and normally /opt/nvidia/deepstream/deepstream-4.0/lib/libnvdsinfer_custom_impl_Yolo.so is loaded to register “YoloLayerV3_TRT” for inference.
Also, I tried your command and it works on my side. Did you modify config_infer_primary_yoloV3.txt? It specifies custom-lib-path=nvdsinfer_custom_impl_Yolo/libnvdsinfer_custom_impl_Yolo.so.
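
As a quick sanity check (assuming the default sample layout; adjust the path if your install differs), you can verify from the objectDetector_Yolo directory that the path in the config resolves to an existing library:

cd /opt/nvidia/deepstream/deepstream-4.0/sources/objectDetector_Yolo
ls -l nvdsinfer_custom_impl_Yolo/libnvdsinfer_custom_impl_Yolo.so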

“this problem only occurs on my rtx2060”

Do you mean this is GPU card related?

@mchi
I did not modify config_infer_primary_yoloV3.txt; everything is the same as the default, and the config also specifies custom-lib-path=nvdsinfer_custom_impl_Yolo/libnvdsinfer_custom_impl_Yolo.so.

This is my config file:
[property]
gpu-id=0
net-scale-factor=1
#0=RGB, 1=BGR
model-color-format=0
custom-network-config=yolov3.cfg
model-file=yolov3.weights
#model-engine-file=model_b1_int8.engine
labelfile-path=labels.txt
int8-calib-file=yolov3-calibration.table.trt5.1
## 0=FP32, 1=INT8, 2=FP16 mode
network-mode=1
num-detected-classes=80
gie-unique-id=1
is-classifier=0
maintain-aspect-ratio=1
parse-bbox-func-name=NvDsInferParseCustomYoloV3
custom-lib-path=nvdsinfer_custom_impl_Yolo/libnvdsinfer_custom_impl_Yolo.so

Yes, I am using a laptop with an RTX 2060 GPU. Does the YoloLayerV3_TRT plugin not work on the RTX 2060?

May I know if you have ever tried this on other GPUs?

Thanks!

Hi tunggalmat,

I’m closing this topic since there has been no update from you for a while, assuming the issue was resolved. If you still need support, please open a new topic. Thanks.

The solution was to upgrade the DeepStream SDK to version 5.0, and now everything works well. DeepStream 5.0 is the release that pairs with TensorRT 7 and CUDA 10.2, which is what my machine has, whereas DeepStream 4.0 was built against older TensorRT releases.
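
In case it helps anyone with the same setup, the rough steps after installing DeepStream 5.0 are to rebuild the custom YOLO library against the new SDK and let the engine be regenerated. A sketch, assuming the 5.0 sample lives in the default location and that its Makefile expects CUDA_VER to be set to your CUDA version:

# rebuild the custom library (and the YoloLayerV3_TRT plugin) against DeepStream 5.0 / TensorRT 7
export CUDA_VER=10.2
cd /opt/nvidia/deepstream/deepstream-5.0/sources/objectDetector_Yolo
make -C nvdsinfer_custom_impl_Yolo

# remove any engine serialized under the old setup so it gets rebuilt
rm -f model_b1_int8.engine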