When I run ./sample_fasterRCNN, I get the following output:

&&&& RUNNING TensorRT.sample_fasterRCNN # ./sample_fasterRCNN
[I] Begin parsing model…
[libprotobuf ERROR google/protobuf/text_format.cc:298] Error parsing text-format ditcaffe.NetParameter: 353:25: Message type "ditcaffe.LayerParameter" has no field named "region_proposal_param".
[E] [TRT] CaffeParser: Could not parse deploy file

I'm currently having the same issue when I try to build the engine from a Caffe model on a Jetson Nano with TensorRT 5.0.6. This works on my host machine with TensorRT 5.1, though, and there I can serialize the engine. I would appreciate any hints on how to get this working with TensorRT 5.0.6.

This is the relevant topic on the forums, but it talks about TensorRT 4: https://devtalk.nvidia.com/default/topic/1045943/tensorrt/faster-rcnn-using-googlenet-as-feature-extractor-in-tensorrt-4/

I tried using the FRCNNPluginFactory provided in factoryFasterRCNN.h, passing it to the parser via parser->setPluginFactoryV2(), but the error remains.
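For reference, this is roughly the call order I used: the factory is installed on the parser before parse() is called. This is only a sketch assuming the sample's factoryFasterRCNN.h; the prototxt and caffemodel file names are placeholders, not the actual paths from my setup.

```cpp
#include <iostream>

#include "NvInfer.h"
#include "NvCaffeParser.h"
#include "factoryFasterRCNN.h" // FRCNNPluginFactory, from the sampleFasterRCNN sources

// Minimal logger required by createInferBuilder.
class Logger : public nvinfer1::ILogger
{
    void log(Severity severity, const char* msg) override
    {
        if (severity != Severity::kINFO)
            std::cout << msg << std::endl;
    }
} gLogger;

int main()
{
    nvinfer1::IBuilder* builder = nvinfer1::createInferBuilder(gLogger);
    nvinfer1::INetworkDefinition* network = builder->createNetwork();

    nvcaffeparser1::ICaffeParser* parser = nvcaffeparser1::createCaffeParser();
    FRCNNPluginFactory pluginFactory;           // from factoryFasterRCNN.h
    parser->setPluginFactoryV2(&pluginFactory); // installed BEFORE parse()

    // Placeholder file names; the parse still fails on TensorRT 5.0.6
    // with the "region_proposal_param" error shown above.
    const nvcaffeparser1::IBlobNameToTensor* blobNameToTensor =
        parser->parse("faster_rcnn_test_iplugin.prototxt",
                      "VGG16_faster_rcnn_final.caffemodel",
                      *network, nvinfer1::DataType::kFLOAT);

    // ... build and serialize the engine from here.
    return blobNameToTensor == nullptr;
}
```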

Also, to my understanding, after the call to initLibNvInferPlugins() the RPNROIPlugin should be registered, and the Caffe parser should then be able to understand the "region_proposal_param" in the "RPROIFused" layer present in the prototxt. But this doesn't happen for some reason.
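To make the ordering concrete, this is the registration sequence I'm describing: the built-in plugins are registered with the plugin registry before the parser is created. This is a sketch of my understanding of how the registry path is supposed to work, not a confirmed fix for 5.0.6.

```cpp
#include "NvCaffeParser.h"
#include "NvInferPlugin.h" // declares initLibNvInferPlugins()

// gLogger is assumed to be an nvinfer1::ILogger instance defined elsewhere,
// as in the TensorRT samples' common.h.
nvcaffeparser1::ICaffeParser* createParserWithPlugins(nvinfer1::ILogger& gLogger)
{
    // Register all built-in TensorRT plugins (including the fused RPN/ROI
    // plugin) with the global plugin registry, using the empty namespace.
    initLibNvInferPlugins(&gLogger, "");

    // Only after registration is the parser created; in principle it should
    // now be able to resolve the "RPROIFused" layer from the registry.
    return nvcaffeparser1::createCaffeParser();
}
```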

I have the same issue. Do you have any update?