DeepStream: loading a custom detection model trained with DIGITS

I followed exactly the DIGITS tutorial for retraining the detection model from this page: https://github.com/dusty-nv/jetson-inference/blob/master/docs/detectnet-training.md#testing-detectnet-model-inference-in-digits

At the end, I’ve got the dog detection model. It works perfectly when I test it in DIGITS, so I downloaded the model and extracted it on my Xavier to get the following files:

train_val.prototxt
deploy.prototxt
original.prototxt
solver.prototxt
mean.binaryproto
snapshot_iter_38600.caffemodel
info.json

Then I modified the DeepStream sample config files “source4_720p_dec_infer-resnet_tracker_sgie_tiled_display_int8.txt” and “config_infer_primary.txt” to use my own dog detection model. Specifically, I changed the [property] group to the following:

[property]
net-scale-factor=0.0039215697906911373
model-file=../../models/DOG_Model/snapshot_iter_38600.caffemodel
proto-file=../../models/DOG_Model/deploy.prototxt
labelfile-path=../../models/DOG_Model/label.txt
output-blob-names=bboxes
batch-size=30

# 0=FP32, 1=INT8, 2=FP16 mode

network-mode=2
num-detected-classes=2
interval=0
gie-unique-id=1
parse-func=1

Also, my label.txt simply contains (based on the DIGITS tutorial page):

dontcare
dog


But when I run deepstream-app, I only get the video playing with no bounding boxes shown. In addition, the console prints the following messages:

Error: Could not find coverage layer while parsing output.
Error: Could not find coverage layer while parsing output.
.
.
.
Error: Could not find coverage layer while parsing output.
Error: Could not find coverage layer while parsing output.
** INFO: <bus_callback:121>: Received EOS. Exiting …

Quitting
App run successful


I don’t know what went wrong. Is it because I can’t simply use a model trained with DIGITS in the DeepStream sample, or because label.txt should only include “dog”, or something else? I am kind of lost.

Hi,

The default supported format in DeepStream is ResNet.
For DetectNet, here is a sample for your reference:
https://github.com/AastaNV/DeepStream
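
Also note: the “Could not find coverage layer” error above suggests the parser is looking for two output blobs, while the config lists only bboxes. A minimal sketch of the relevant line (the blob names assume a standard DIGITS DetectNet deploy.prototxt; check your own prototxt for the actual output names):

```ini
# DetectNet exports a coverage grid and a bbox blob; the parser needs both.
output-blob-names=coverage;bboxes
```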

Thanks.

Hi,

Thank you for pointing me in a direction. But since I am using DeepStream 3.0 on Xavier, I am still not able to run deepstream-app after I make and get the custom lib file “libnvparsebbox.so”. I modified the config files to fit the DeepStream 3.0 requirements. Specifically, I changed the [property] group in my “config_infer_dog.txt” to the following:

[property]
net-scale-factor=1
model-file=../../models/DOG_Model/snapshot_iter_38600.caffemodel
proto-file=../../models/DOG_Model/deploy.prototxt
labelfile-path=../../models/DOG_Model/label.txt
batch-size=4

# 0=FP32, 1=INT8, 2=FP16 mode

network-mode=2
num-detected-classes=1
interval=0
gie-unique-id=1

parse-func=0
output-blob-names=coverage;bboxes
parse-bbox-func-name=parse_bbox_custom_detectnet
custom-lib-path=/home/nvidia/Downloads/deepstream_sdk_on_jetston/sources/apps/detectNet/libnvparsebbox.so

#enable-dbscan=1


And I changed the [primary-gie] group in the application configuration file to the following, to link it to my “config_infer_dog.txt”:

[primary-gie]
enable=1
labelfile-path=../../models/DOG_Model/label.txt
batch-size=2
bbox-border-color0=1;0;0;1
bbox-border-color1=0;1;1;1
bbox-border-color2=0;0;1;1
bbox-border-color3=0;1;0;1
interval=0
gie-unique-id=1
config-file=config_infer_dog.txt


However, I think because of some underlying differences between DeepStream 1.5 and DeepStream 3.0, when I run deepstream-app the console shows the following error messages:

nvidia@jetson-0424318032697:~/Downloads/deepstream_sdk_on_jetson$ deepstream-app -c ./samples/configs/deepstream-app/dog_detect.txt

Using winsys: x11
Error. Could not open library containing custom implementation
nvbuf_utils: dmabuf_fd 0 mapped entry NOT found
nvbuf_utils: Can not get HW buffer from FD… Exiting…
nvbuf_utils: dmabuf_fd 0 mapped entry NOT found
nvbuf_utils: Can not get HW buffer from FD… Exiting…
.
.
.
nvbuf_utils: dmabuf_fd 0 mapped entry NOT found
nvbuf_utils: Can not get HW buffer from FD… Exiting…
nvbuf_utils: dmabuf_fd 0 mapped entry NOT found
nvbuf_utils: Can not get HW buffer from FD… Exiting…
** ERROR: main:613: Failed to set pipeline to PAUSED
Quitting
ERROR from primary_gie_classifier: Failed to initialize infer context
Debug info: /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvinfer/gstnvinfer.c(2170): gst_nv_infer_start (): /GstPipeline:pipeline/GstBin:primary_gie_bin/GstNvInfer:primary_gie_classifier
App run failed


Can you help me figure out how to solve this problem? I guess the error happens because the custom lib built for DeepStream 1.5 can’t be used with 3.0, so I have been looking through the DeepStream 3.0 plugin manual for clues on implementing my own DeepStream 3.0 parser function for DetectNet. But I couldn’t quite figure out how to implement one by referencing the sample code and Makefile in sources/libs/nvdsparsebbox.

Is there any Deepstream 3.0 sample for DetectNet?

(Or, I am happy to retrain another model that can run on DeepStream 3.0. Where can I find the pretrained ResNet model and a ResNet-derived DetectNet network like the ones provided in the tutorial?)

Hi,

You can use this custom plugin sample as a base:
{deepstream_sdk_on_jetson}/sources/objectDetector_SSD
and update nvdsparsebbox_ssd.cpp directly with the parser code for DetectNet.

For ResNet model, you can find it in the package directly:
${deepstream_sdk_on_jetson}/samples/models/Primary_Detector/resnet10.prototxt

Thanks.

Hi,

Thank you for guiding me further.

I tried to update the nvdsparsebbox_ssd.cpp with the parser code for DetectNet. I tried two ways:

  1. Concatenate the DetectNet parsing code onto the code in nvdsparsebbox_ssd.cpp and include “nvparsebbox.h”.

  2. Replace the function “NvDsInferParseCustomSSD” entirely with the DetectNet parsing code and run “make”.

I wonder if I was really doing it right. Apparently neither method works for me, as the console shows the following error message:

error: conflicting declaration of C function ‘bool parse_bbox_custom_detectnet(const std::vector&, const NvDsInferNetworkInfo&, const NvDsInferParseDetectionParams&, std::vector&)’
CHECK_CUSTOM_PARSE_FUNC_PROTOTYPE(parse_bbox_custom_detectnet);


So I tried to dive into the DeepStream 3.0 source code, and I noticed that the way a custom parsing function is implemented differs between the provided DetectNet parsing function and any custom parsing function in DeepStream 3.0 (e.g. the “NvDsInferParseCustomSSD” function in nvdsparsebbox_ssd.cpp). Most importantly, the DeepStream 3.0 documentation in the “nvdsinfer_custom_impl.h” file says that the custom parsing function should be of the type NvDsInferParseCustomFunc, which has a totally different function signature from the provided DetectNet parsing function for DeepStream 1.5. (This is the main reason I got the error message above.) I am wondering how I can update the provided parser code into “nvdsparsebbox_ssd.cpp” intuitively. And I really don’t want to implement the entire DetectNet parsing code from scratch, since I don’t have the knowledge of how exactly DetectNet works…

In addition, the parsing function in nvdsparsebbox_ssd.cpp is for the SSD UFF model, whereas the provided parsing code is for DetectNet, so I doubt I can simply update the file without too much work.


Could you please give me more detailed instructions? I am just one step away from integrating the DeepStream and DIGITS workflow on my Xavier so I can see the results of the models I trained in DIGITS.

Thank you.

Hi,

You will need some updates due to the API changes across versions,
but the core implementation of the parser is identical.

Thanks.

Hi AastaLLL,
The resnet10.prototxt files differ between your repository and what comes with DeepStream 4.0; I guess the model has been upgraded.

My question is: is the parser given in your repository still valid for the DeepStream 4.0 resnet10 model?
Actually, I want to make a Python parser for the resnet10 model.

Thanks.