Faster R-CNN on DeepStream 5.0

I successfully installed DeepStream 5.0 on my Jetson Nano. I have also trained a Faster R-CNN model using TLT 2.0 and generated the appropriate .engine file.

My goal is to deploy it using DeepStream 5.0. I am following the instructions here to install the prerequisites:
https://docs.nvidia.com/metropolis/TLT/tlt-getting-started-guide/#intg_fasterrcnn_model

There were two prerequisites. The first was installing the custom parsers, which I did successfully using this repository (the Metropolis documentation mentioned a GitLab link, but it is broken). The second was downloading the TensorRT open-source (OSS) components to get the custom plugins, specifically the cropAndResize plugin and the proposal plugin.
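For reference, building the TensorRT OSS plugins on a Jetson typically looks something like the sketch below. The branch name, library version suffix, and `GPU_ARCHS` value (53 for the Nano's Maxwell GPU) are assumptions; check the TensorRT OSS README for the branch matching your installed TensorRT version before running this.

```shell
# Sketch: build the TensorRT OSS plugin library (which includes the
# cropAndResize and proposal plugins) and swap it in for the stock one.
# Branch, paths, and version numbers are assumptions for JetPack 4.4.
git clone -b release/7.1 https://github.com/NVIDIA/TensorRT.git
cd TensorRT
git submodule update --init --recursive
mkdir -p build && cd build
# GPU_ARCHS=53 targets the Jetson Nano's Maxwell GPU
cmake .. -DGPU_ARCHS=53 \
         -DTRT_LIB_DIR=/usr/lib/aarch64-linux-gnu \
         -DTRT_OUT_DIR=$(pwd)/out
make nvinfer_plugin -j"$(nproc)"
# Back up the stock plugin library, then replace it with the rebuilt one
sudo cp /usr/lib/aarch64-linux-gnu/libnvinfer_plugin.so.7.1.3 \
        ~/libnvinfer_plugin.so.7.1.3.bak
sudo cp out/libnvinfer_plugin.so.7.1.3 \
        /usr/lib/aarch64-linux-gnu/libnvinfer_plugin.so.7.1.3
sudo ldconfig
```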

Attached are the results I am getting when running Faster R-CNN on DeepStream 5.0. Clearly something is missing with the proposal plugin, because these exact frames yield perfect results with the same model on TLT 2.0.

Hi ishan,
First, regarding your comment that "the metropolis documentation mentioned a link to gitlab which is broken":
could you try it again? I did not see any problem with it.

Actually, for Faster R-CNN, GitHub - NVIDIA-AI-IOT/deepstream_tao_apps: Sample apps to demonstrate how to deploy models trained with TAO on DeepStream covers all the prerequisites. You can just follow it.
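Getting the sample apps from that repository onto the Nano is roughly the following sketch. The `CUDA_VER` value is an assumption for JetPack 4.4; see the repository's README for the exact build steps and model download instructions for your setup.

```shell
# Sketch: fetch and build the TAO/TLT DeepStream sample apps on a Jetson.
# CUDA_VER=10.2 is an assumption for JetPack 4.4 -- adjust for your JetPack.
git clone https://github.com/NVIDIA-AI-IOT/deepstream_tao_apps.git
cd deepstream_tao_apps
export CUDA_VER=10.2
make
```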

For your attached PNG, is it the result from "tlt-infer"?

How about the result of deepstream?
Can you run the command below successfully?

./deepstream-custom -c pgie_config_file -i  xxx.h264 -d

That image is the result from running inference on DeepStream. I used the same dataset to train a DetectNet_v2 model on TLT 2.0 and converted it for DeepStream, and those inference results are great, but not for Faster R-CNN, as the image above shows.

For faster-rcnn, could you please run “tlt-infer” against this test image?
Please paste the result here. I want to know whether "tlt-infer" gives the same result or not.

I will do that.