How to run FasterRCNN VGG caffemodel with Python test app 3


Please provide complete information as applicable to your setup.

• Hardware Platform: Jetson
• DeepStream Version: 5.0
• JetPack Version: 4.4.1

I am using the primary config and DeepStream config from the objectDetector_FasterRCNN folder together with the deepstream-test3 Python app, and I get the following error:
Error: gst-library-error-quark: Configuration file parsing failed (5)
These are the DeepStream config and primary config files I am using with the test app:
config_infer_primary_fasterRCNN.txt (3.6 KB)
dstestrdx_pgie_config.txt (2.3 KB)
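For reference, the key entries of config_infer_primary_fasterRCNN.txt in the stock sample look roughly like the sketch below; the exact keys and values are my assumptions based on the standard objectDetector_FasterRCNN layout, so the attached copy may differ:

[property]
gpu-id=0
net-scale-factor=1
model-file=VGG16_faster_rcnn_final.caffemodel
proto-file=faster_rcnn_test_iplugin.prototxt
labelfile-path=labels.txt
batch-size=1
network-mode=0
num-detected-classes=21
gie-unique-id=1
parse-bbox-func-name=NvDsInferParseCustomFasterRCNN
custom-lib-path=nvdsinfer_custom_impl_fasterRCNN/libnvdsinfer_custom_impl_fasterRCNN.so

Note that these relative paths only resolve when the app is started from the objectDetector_FasterRCNN directory. When the config is handed to the deepstream-test3 Python app from another location, an unresolved model-file, proto-file, labelfile-path or custom-lib-path entry is a common reason for the "Configuration file parsing failed (5)" error, so switching them to absolute paths is worth trying.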

Hi,

Could you check the topic below to see if you are hitting the same issue?

Thanks

I have the following folder structure for running the FasterRCNN sample:

I am following the commands in the README:

Download model tar file using:
$ wget --no-check-certificate https://dl.dropboxusercontent.com/s/o6ii098bu51d139/faster_rcnn_models.tgz?dl=0 -O faster-rcnn.tgz

  • Untar the model using:
    $ tar zxvf faster-rcnn.tgz -C . --strip-components=1 --exclude=ZF_*
  • Copy the prototxt file “faster_rcnn_test_iplugin.prototxt” from the data/faster-rcnn directory in TensorRT samples to this directory.
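After these two steps, the model and prototxt should sit next to the config files in the objectDetector_FasterRCNN directory. A quick sanity check (the caffemodel file name is my assumption based on the standard faster_rcnn_models archive):

$ ls -l VGG16_faster_rcnn_final.caffemodel faster_rcnn_test_iplugin.prototxt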

Compile the custom library:
$ make -C nvdsinfer_custom_impl_fasterRCNN
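If the build succeeds, the shared object should exist and should link against the TensorRT plugin library that provides the RPN/ROI plugin symbols. A hedged check (library locations depend on the JetPack install):

$ ls -l nvdsinfer_custom_impl_fasterRCNN/libnvdsinfer_custom_impl_fasterRCNN.so
$ ldd nvdsinfer_custom_impl_fasterRCNN/libnvdsinfer_custom_impl_fasterRCNN.so | grep nvinfer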

When I run the following command, I get the following error:
gst-launch-1.0 filesrc location=../../samples/streams/sample_1080p_h264.mp4 ! \
        decodebin ! m.sink_0 nvstreammux name=m batch-size=1 width=1920 \
        height=1080 ! nvinfer config-file-path=config_infer_primary_fasterRCNN.txt ! \
        nvvideoconvert ! nvdsosd ! nveglglessink sync=0

ERROR: gst-launch-1.0: symbol lookup error: /home/dipesh/deepstream-5.0/sources/objectDetector_FasterRCNN/nvdsinfer_custom_impl_fasterRCNN/libnvdsinfer_custom_impl_fasterRCNN.so: undefined symbol: createRPNROIPlugin

Please help.

Hi,

We tested the pipeline with both gst-launch-1.0 and deepstream-app, and both work correctly in our environment.

Based on your log, it looks like there is an issue with the TensorRT libraries.
Could you first verify that the libraries are installed correctly?

$ dpkg -l | grep TensorRT
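If the packages look correct, you can also check directly whether the missing symbol is exported by the TensorRT plugin library; the path below assumes a default JetPack 4.4 aarch64 install and is only a sketch:

$ nm -D /usr/lib/aarch64-linux-gnu/libnvinfer_plugin.so.7 | grep createRPNROIPlugin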

Thanks.

This is the output:

Hi,

Are you using a Jetson device?
The packages you installed are for the desktop environment (amd64).

For Jetson, please install the packages from the JetPack installer.
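As a rough sketch, assuming the board was flashed with JetPack 4.4 and the NVIDIA apt repositories are configured, the matching arm64 packages can be installed through the JetPack meta-package, after which the dpkg check should list arm64 rather than amd64 entries:

$ sudo apt update
$ sudo apt install nvidia-jetpack
$ dpkg -l | grep -Ei "tensorrt|nvinfer"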
Thanks.