Please provide complete information as applicable to your setup.
Hardware Platform (Jetson / GPU): Jetson Xavier NX
DeepStream Version: 6.0
JetPack Version (valid for Jetson only): 4.6.0
TensorRT Version: 8.0.1
Reproduction:
1. Before the update, the pipeline runs well.
2. A timeout callback is used to detect whether a model update is needed.
3. If an update is needed, the following API is called to reset the model engine file:
pgie.set_property("model-engine-file", model_file)
(model_file is the same as before, just for testing.)
4. Error logs:
0:00:19.580137584 1399 0x7f2c080f50 INFO nvinfer gstnvinfer.cpp:638:gst_nvinfer_logger:<primary-inference> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:2004> [UID = 1]: Use deserialized engine model: model/model_2023-11-11/yolox_s.engine
0:00:19.694541567 1399 0x807d320 INFO nvinfer gstnvinfer_impl.cpp:313:notifyLoadModelStatus: [UID 1]: Load new model:model/model_2023-11-11/yolox_s.engine sucessfully
0:00:19.717468515 1399 0x807c0a0 ERROR nvinfer gstnvinfer.cpp:632:gst_nvinfer_logger: NvDsInferContext[UID 1]: Error in NvDsInferContextImpl::parseBoundingBox() <nvdsinfer_context_impl_output_parsing.cpp:59> [UID = 1]: Could not find output coverage layer for parsing objects
0:00:19.717600997 1399 0x807c0a0 ERROR nvinfer gstnvinfer.cpp:632:gst_nvinfer_logger: NvDsInferContext[UID 1]: Error in NvDsInferContextImpl::fillDetectionOutput() <nvdsinfer_context_impl_output_parsing.cpp:735> [UID = 1]: Failed to parse bboxes
0:00:19.740391271 1399 0x807c0a0 ERROR nvinfer gstnvinfer.cpp:632:gst_nvinfer_logger: NvDsInferContext[UID 1]: Error in NvDsInferContextImpl::parseBoundingBox() <nvdsinfer_context_impl_output_parsing.cpp:59> [UID = 1]: Could not find output coverage layer for parsing objects
But the model engine file is the same as before, and the postprocess lib is the same too. Why does this error happen?
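For reference, the timeout-driven update in steps 2 and 3 can be sketched as below. This is only an illustration: `make_update_callback`, `need_update`, and the engine path are placeholder names, not code from the actual app.

```python
# Sketch of the timeout-driven model update from steps 2-3.
# `need_update`, `pgie`, and the file path are placeholders for the real app.

def make_update_callback(pgie, model_file, need_update):
    """Build a GLib-style timeout callback that reloads the engine on demand."""
    def on_timeout():
        if need_update():
            # Setting model-engine-file on a running nvinfer element triggers
            # DeepStream's on-the-fly model update.
            pgie.set_property("model-engine-file", model_file)
        return True  # returning True keeps a GLib timeout scheduled
    return on_timeout

# In the real app this callback would be registered on the main loop, e.g.:
#   GLib.timeout_add_seconds(5, make_update_callback(pgie, path, need_update))
```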
From the logs, the model's output layer name is "output", so output-blob-names should be set to output. If the error persists, please share the nvinfer configuration file.
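For reference, the relevant keys in an nvinfer configuration file might look like the fragment below. The parser function name and library path are illustrative placeholders, not values from this setup; only the engine path and layer name come from the logs above.

```ini
[property]
model-engine-file=model/model_2023-11-11/yolox_s.engine
# Layer name taken from the logs above
output-blob-names=output
# Custom YOLOX parser (names here are illustrative)
parse-bbox-func-name=NvDsInferParseCustomYoloX
custom-lib-path=/path/to/libnvds_yolox_parser.so
```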
Removed it; the error still occurs.
A strange thing: even if I set output-blob-names to a wrong value, such as coverage, the pipeline runs well, except after a model update.
The nvinfer plugin is open source. This "Could not find output coverage layer for parsing objects" error is raised in DetectPostprocessor::parseBoundingBox.
Did you use the correct configuration file? If parse-bbox-func-name and custom-lib-path are set, nvinfer will use the custom parsing function. You can add logging in DetectPostprocessor::fillDetectionOutput to check why the custom parsing function did not take effect. Note that after modifying the code you need to rebuild the plugin and replace /opt/nvidia/deepstream/deepstream/lib/gst-plugins/libnvdsgst_infer.so with the new .so.
Before updating the model, the pipeline runs well, so the custom parsing function is correct. The error happens only when I update the model while the pipeline is running.
Thanks for sharing. This happens because parse-bbox-func-name and custom-lib-path are not reused after updating the model. Here is a solution.
In the function DsNvInferImpl::initNewInferModelParams in /opt/nvidia/deepstream/deepstream/sources/gst-plugins/gst-nvinfer/gstnvinfer_impl.cpp, add the following code (in bold).
Then rebuild libnvdsgst_infer.so according to the README, and replace /opt/nvidia/deepstream/deepstream/lib/gst-plugins/libnvdsgst_infer.so with the new .so.
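The rebuild-and-replace step might look like the following on a Jetson. This is a sketch assuming the default DeepStream install paths; CUDA_VER must match your JetPack release (JetPack 4.6 ships CUDA 10.2).

```shell
# Rebuild gst-nvinfer after editing gstnvinfer_impl.cpp.
# Paths assume the default DeepStream install location.
cd /opt/nvidia/deepstream/deepstream/sources/gst-plugins/gst-nvinfer
export CUDA_VER=10.2   # must match the CUDA version of your JetPack
make
# Back up the stock plugin, then install the rebuilt one
sudo cp /opt/nvidia/deepstream/deepstream/lib/gst-plugins/libnvdsgst_infer.so \
        /opt/nvidia/deepstream/deepstream/lib/gst-plugins/libnvdsgst_infer.so.bak
sudo cp libnvdsgst_infer.so /opt/nvidia/deepstream/deepstream/lib/gst-plugins/
# Clear GStreamer's plugin cache so the new build is picked up on next run
rm -rf ~/.cache/gstreamer-1.0
```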