How to run the yolov5s model on the Nano?

My Nano 2G environment is as follows.

[screenshot: environment details]

I compiled and ran yolov5_tensorrt, which generated the yolov5s.engine file. Running the yolov5s program for image detection and video detection works normally. However, when I use yolov5s.engine with deepstream-test5-app, the following errors occur. Please advise me on how to deal with this problem.

$ /opt/nvidia/deepstream/deepstream/sources/apps/sample_apps/deepstream-test5/deepstream-test5-app -tc config.txt

 *** DeepStream: Launched RTSP Streaming at rtsp://localhost:8554/ds-test ***

Opening in BLOCKING MODE
0:00:02.791838930 11903   0x557aa88690 WARN                 nvinfer gstnvinfer.cpp:635:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::initialize() <nvdsinfer_context_impl.cpp:1161> [UID = 1]: Warning, OpenCV has been deprecated. Using NMS for clustering instead of cv::groupRectangles with topK = 20 and NMS Threshold = 0.5
ERROR: [TRT]: 1: [pluginV2Runner.cpp::load::290] Error Code 1: Serialization (Serialization assertion creator failed.Cannot deserialize plugin since corresponding IPluginCreator not found in Plugin Registry)
ERROR: [TRT]: 4: [runtime.cpp::deserializeCudaEngine::50] Error Code 4: Internal Error (Engine deserialization failed.)
ERROR: Deserialize engine failed from file: /home/jetson/JetsonX/yolov5s.engine
0:00:08.335878414 11903   0x557aa88690 WARN                 nvinfer gstnvinfer.cpp:635:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1889> [UID = 1]: deserialize engine from file :/home/jetson/JetsonX/yolov5s.engine failed
0:00:08.338798607 11903   0x557aa88690 WARN                 nvinfer gstnvinfer.cpp:635:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:1996> [UID = 1]: deserialize backend context from engine from file :/home/jetson/JetsonX/yolov5s.engine failed, try rebuild
0:00:08.338859023 11903   0x557aa88690 INFO                 nvinfer gstnvinfer.cpp:638:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1914> [UID = 1]: Trying to create engine from model files
ERROR: failed to build network since there is no model file matched.
ERROR: failed to build network.
0:00:08.901675892 11903   0x557aa88690 ERROR                nvinfer gstnvinfer.cpp:632:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Error in NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1934> [UID = 1]: build engine file failed
0:00:08.902872812 11903   0x557aa88690 ERROR                nvinfer gstnvinfer.cpp:632:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Error in NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:2020> [UID = 1]: build backend context failed
0:00:08.902925676 11903   0x557aa88690 ERROR                nvinfer gstnvinfer.cpp:632:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Error in NvDsInferContextImpl::initialize() <nvdsinfer_context_impl.cpp:1257> [UID = 1]: generate backend failed, check config file settings
0:00:08.903049217 11903   0x557aa88690 WARN                 nvinfer gstnvinfer.cpp:841:gst_nvinfer_start:<primary_gie> error: Failed to create NvDsInferContext instance
0:00:08.903076665 11903   0x557aa88690 WARN                 nvinfer gstnvinfer.cpp:841:gst_nvinfer_start:<primary_gie> error: Config file path: /home/jetson/JetsonX/config_infer_primary_nano.txt, NvDsInfer Error: NVDSINFER_CONFIG_FAILED
** ERROR: <main:1455>: Failed to set pipeline to PAUSED
Quitting
ERROR from primary_gie: Failed to create NvDsInferContext instance
Debug info: /dvs/git/dirty/git-master_linux/deepstream/sdk/src/gst-plugins/gst-nvinfer/gstnvinfer.cpp(841): gst_nvinfer_start (): /GstPipeline:pipeline/GstBin:primary_gie_bin/GstNvInfer:primary_gie:
Config file path: /home/jetson/JetsonX/config_infer_primary_nano.txt, NvDsInfer Error: NVDSINFER_CONFIG_FAILED
App run failed

Hi,

ERROR: [TRT]: 1: [pluginV2Runner.cpp::load::290] Error Code 1: Serialization (Serialization assertion creator failed.Cannot deserialize plugin since corresponding IPluginCreator not found in Plugin Registry)

Do you use the same plugin source that you used for generating the TensorRT engine?
Based on the log, it complains that a plugin implementation used when creating the engine does not exist in the Plugin Registry at load time.

Moreover, since the Nano 2GB has very limited resources, YOLOv5s might not be able to fit into memory.
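If the engine was built by an application that statically links or loads a custom plugin library (the yolov5 TensorRT samples typically build one; the name libmyplugins.so and its path below are assumptions), one common workaround is to preload that library so its plugin creators register before DeepStream deserializes the engine. A sketch, not a verified fix:

```shell
# Assumed path to the plugin library built alongside yolov5s.engine
export PLUGIN_LIB=/home/jetson/JetsonX/libmyplugins.so

# Preload it so the library's plugin registration code runs before
# deepstream-test5-app tries to deserialize the engine
LD_PRELOAD=$PLUGIN_LIB \
  /opt/nvidia/deepstream/deepstream/sources/apps/sample_apps/deepstream-test5/deepstream-test5-app -tc config.txt
```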

Thanks.


Because yolov5s, the program built by yolov5-tensorrt, can successfully load yolov5s.engine and detect images, I think the memory of the Nano 2GB should be sufficient for this model.

I tried changing the engine file to resnet10.caffemodel_b1_gpu0_fp16.engine and test5 works normally, but if I change it back to yolov5s.engine, the above error occurs. I guess this is because yolov5s.engine contains a plugin whose IPluginCreator is not registered in test5. If so, how can I find the code for this plugin and register it?

[primary-gie]
enable=1
gpu-id=0
#model-engine-file=resnet10.caffemodel_b1_gpu0_fp16.engine
model-engine-file=yolov5s.engine
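For reference, the nvinfer config file (config_infer_primary_nano.txt in this setup) can also point directly at the library that implements the custom plugin and output parser. A sketch only: the library path and parser function name below are assumptions based on typical YOLOv5 DeepStream setups, and depend on how your plugin library was built:

```ini
[property]
model-engine-file=yolov5s.engine
# Assumed: library that registers the custom plugin's IPluginCreator
# and exports the bounding-box parser; name/path depend on your build
custom-lib-path=/home/jetson/JetsonX/libmyplugins.so
parse-bbox-func-name=NvDsInferParseCustomYoloV5
```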

Hi,

Do you use the same JetPack and TensorRT versions for conversion and deserialization?

The plugin must exist at conversion time; please also load it when running inference on the engine.
For TensorRT's trtexec, this can be done via the --plugins option.

Thanks.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.