Error while parsing SSD model in DeepStream app

Hi,

I created an engine file for a Caffe-based pure SSD model using the sample /usr/src/tensorrt/samples/sampleSSD.

I added the lines below to save the engine file:

    ofstream p("./ssd.engine", ios::binary);  // binary mode, so the serialized engine is written byte-for-byte
    p.write((const char *)(*trtModelStream)->data(), (*trtModelStream)->size());
    p.close();

right after:

    (*trtModelStream) = engine->serialize();

I updated model-engine-file in dstest3_pgie_config.txt with the engine I created.

Also,

parse-bbox-func-name=NvDsInferParseCustomSSD
custom-lib-path=…/…/…/…/sources/objectDetector_SSD/nvdsinfer_custom_impl_ssd/libnvdsinfer_custom_impl_ssd.so
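For reference, the relevant keys in dstest3_pgie_config.txt look roughly like this (the library path below is a placeholder; substitute your actual path):

```
[property]
model-engine-file=ssd.engine
parse-bbox-func-name=NvDsInferParseCustomSSD
custom-lib-path=<path-to>/nvdsinfer_custom_impl_ssd/libnvdsinfer_custom_impl_ssd.so
```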

When I run the sample, I get this error:

Could not find NMS layer buffer while parsing
0:00:04.853884113 27835 0x5597856f20 ERROR nvinfer gstnvinfer.cpp:511:gst_nvinfer_logger: NvDsInferContext[UID 1]:fillDetectionOutput(): Failed to parse bboxes using custom parse function

Can you please guide me how to solve this?

Hi,

I assume you are using a Caffe-based SSD model. Please correct me if that is not the case.

The error indicates that there is no NMS layer inside your model.
Please note that the SSD architecture differs slightly across DL frameworks.

Do you use the same architecture as VGG_VOC0712_SSD_300x300_iter_120000.caffemodel?
If yes, the plugins (Normalize, PriorBox, and DetectionOutput) should already be built in.
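For reference, the stock Caffe SSD deploy.prototxt ends in a DetectionOutput layer along these lines (the parameter values shown are from the public SSD300 VOC model; check them against your own prototxt):

```
layer {
  name: "detection_out"
  type: "DetectionOutput"
  bottom: "mbox_loc"
  bottom: "mbox_conf_flatten"
  bottom: "mbox_priorbox"
  top: "detection_out"
  detection_output_param {
    num_classes: 21
    share_location: true
    nms_param {
      nms_threshold: 0.45
      top_k: 400
    }
    keep_top_k: 200
    confidence_threshold: 0.01
  }
}
```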

You can first try deepstream-app with the model path updated to see if it works.
Thanks.

Yes, I use the same Caffe-based SSD model as you mentioned.

I updated model-file with VGG_VOC0712_SSD_300x300_iter_120000.caffemodel and proto-file with deploy.prototxt in the deepstream-app configuration file.

I also updated output-blob-names to detection_out;keep_count.

When I run deepstream-app, I get this error:

0:01:49.582022653 4850 0x3b507190 ERROR nvinfer gstnvinfer.cpp:511:gst_nvinfer_logger:<primary_gie_classifier> NvDsInferContext[UID 1]:parseBoundingBox(): Could not find output coverage layer for parsing objects
0:01:49.582138178 4850 0x3b507190 ERROR nvinfer gstnvinfer.cpp:511:gst_nvinfer_logger:<primary_gie_classifier> NvDsInferContext[UID 1]:fillDetectionOutput(): Failed to parse bboxes

Hi,

Please check whether the DeepStream config has been updated correctly.

Could not find output coverage layer for parsing objects
Failed to parse bboxes

It looks like your DeepStream pipeline is still looking for the coverage and bbox layers as outputs.
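In other words, when no custom parser is configured, the nvinfer element falls back to its default DetectNet-style parser, which expects coverage/bbox output layers. A sketch of the keys that route SSD output to a custom parser instead (paths are placeholders; adapt to your setup):

```
[property]
model-file=VGG_VOC0712_SSD_300x300_iter_120000.caffemodel
proto-file=deploy.prototxt
output-blob-names=detection_out;keep_count
# Without these two keys the default coverage/bbox parser is used:
parse-bbox-func-name=NvDsInferParseCustomSSD
custom-lib-path=<path-to>/nvdsinfer_custom_impl_ssd/libnvdsinfer_custom_impl_ssd.so
```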
Thanks.

Hi AastaLL,

I have a similar error with SSD

Could not find NMS layer buffer while parsing

// ISSUE RESOLVED

As I understand it, the parameter below in the DeepStream config file is not the right one.

parse-bbox-func-name=NvDsInferParseCustomSSD

Maybe NvDsInferParseCustomSSD, which is implemented in a cpp file, does not cover all the output layers of your network, or some of them are missing. I don't know exactly how to edit the cpp file and add layers to it, but maybe we can inspect the network in TensorRT and modify the file in
/sources/objectDetector_SSD/nvdsinfer_custom_impl_ssd
which is in the deepstream-app folder. To start, I'm trying to figure out the difference between the file in the location above and the successfully implemented sample at the link below:
https://github.com/NVIDIA/retinanet-examples/blob/master/extras/deepstream/deepstream-sample/nvdsparsebbox_retinanet.cpp

By cross-checking and looking into the network architectures, the problem may be fixable.
I'm also working on it; if I find a way, I will let you know.

Thanks

Hi Psgr,

We have solved it. Thanks for the support!

Hi @Ravik, could you please share the steps of how you were able to solve it?