Inference using Yolov5l6.engine

Please provide the following information when requesting support.

Hardware - GPU RTX3080
Hardware - CPU 11th Gen Intel® Core™ i7-11850H
Operating System Ubuntu 20.04
Riva Version
TLT Version (if relevant)
How to reproduce the issue? Run python3 deepstream_rtsp.py

ERROR: [TRT]: 3: getPluginCreator could not find plugin: YoloLayer_TRT version: 1
ERROR: [TRT]: 1: [pluginV2Runner.cpp::load::292] Error Code 1: Serialization (Serialization assertion creator failed.Cannot deserialize plugin since corresponding IPluginCreator not found in Plugin Registry)
ERROR: [TRT]: 4: [runtime.cpp::deserializeCudaEngine::76] Error Code 4: Internal Error (Engine deserialization failed.)
ERROR: …/nvdsinfer/nvdsinfer_model_builder.cpp:1528 Deserialize engine failed from file: /opt/nvidia/deepstream/deepstream-6.0/sources/deepstream_python_apps/apps/own_test/models/yolov5l6.engine

I am not able to load the custom YOLO model engine file.
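The first error ("getPluginCreator could not find plugin: YoloLayer_TRT") means the engine was serialized with a custom plugin that is not registered in the TensorRT plugin registry at deserialization time. In DeepStream this is normally handled by pointing custom-lib-path in the nvinfer config file at the compiled YOLO plugin library; outside DeepStream you can preload the library yourself. A minimal sketch, assuming a hypothetical plugin library name (libyololayer.so) and the engine path from the log:

```python
import ctypes
import tensorrt as trt

# Preload the custom plugin library so its plugin creators register globally.
# "libyololayer.so" is a placeholder -- use the .so built by your YOLO repo.
ctypes.CDLL("./libyololayer.so", mode=ctypes.RTLD_GLOBAL)

TRT_LOGGER = trt.Logger(trt.Logger.INFO)
# Also register TensorRT's built-in plugins.
trt.init_libnvinfer_plugins(TRT_LOGGER, "")

engine_path = "models/yolov5l6.engine"
with open(engine_path, "rb") as f, trt.Runtime(TRT_LOGGER) as runtime:
    engine = runtime.deserialize_cuda_engine(f.read())
    print("engine loaded:", engine is not None)
```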

Please refer to this CSDN article (CSDN - Professional IT Technical Community).

That repo says it is for DeepStream 5.0; we are using DeepStream 6.0 with TensorRT 8.1 and CUDA 11.2.

Please try to port it.
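Porting also means rebuilding the engine: a serialized TensorRT engine is tied to the TensorRT version (and GPU) it was built with, so an engine produced against the repo's DeepStream 5.0 stack will not deserialize under TensorRT 8.1. A rough sketch of regenerating the engine on the target machine, assuming a hypothetical ONNX export of the model (if the repo instead builds the network in C++ with its own plugin, rebuild it with that repo's tooling compiled against TensorRT 8.1):

```python
import tensorrt as trt

logger = trt.Logger(trt.Logger.INFO)
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, logger)

# "yolov5l6.onnx" is a placeholder for an export made for this setup.
with open("yolov5l6.onnx", "rb") as f:
    if not parser.parse(f.read()):
        for i in range(parser.num_errors):
            print(parser.get_error(i))
        raise SystemExit("ONNX parse failed")

config = builder.create_builder_config()
config.max_workspace_size = 1 << 30  # 1 GiB workspace (TensorRT 8.1 API)

serialized = builder.build_serialized_network(network, config)
with open("models/yolov5l6.engine", "wb") as f:
    f.write(serialized)
```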

Is there any tutorial for this, i.e. for implementing custom YOLO nvinfer functions?

You can find it in the TensorRT docs: Developer Guide :: NVIDIA Deep Learning TensorRT Documentation
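As a quick sanity check while following that guide, you can list which plugin creators are actually registered after loading your custom library; the missing-plugin error above means "YoloLayer_TRT" never appears in this list. A short sketch (the .so name is again a placeholder):

```python
import ctypes
import tensorrt as trt

# Placeholder name -- point this at the plugin library you compiled.
ctypes.CDLL("./libyololayer.so", mode=ctypes.RTLD_GLOBAL)

logger = trt.Logger(trt.Logger.WARNING)
trt.init_libnvinfer_plugins(logger, "")

# Print every registered plugin creator and its version.
for creator in trt.get_plugin_registry().plugin_creator_list:
    print(creator.name, creator.plugin_version)
# Deserialization can only succeed if "YoloLayer_TRT" (version "1") is listed.
```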
