Support for Faster R-CNN and SSD MobileNet in Python for TensorRT 5.0.2.6

Linux distro and version: Ubuntu 16.04 LTS
GPU type: GeForce GTX 1080
NVIDIA driver version: 396.44
CUDA version: 9.0
cuDNN version: 7.0.5
Python version [if using python]: 3.5.2
TensorFlow version: tensorflow-gpu 1.9
TensorRT version: 5.0.2.6
If Jetson, OS, hw versions: N/A

Describe the problem:

I tried the uff_ssd sample provided with TensorRT 5.0.2.6 and can see that it is specific to SSD Inception. I need to run SSD MobileNet and Faster R-CNN models on an embedded platform using TensorRT.
Does TensorRT 5.0.2.6 support SSD MobileNet and Faster R-CNN models? If yes, could someone suggest the steps to follow and share some example scripts to convert a TensorFlow model (.pb) to a TensorRT model, roughly along the lines of the sketch below?
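For concreteness, the kind of script I have in mind starts with something like the snippet below, using the uff converter that ships with TensorRT 5. The file name and the "NMS" output node name are only placeholders; for an SSD graph the real output node normally comes from a graphsurgeon preprocessing step, as in the uff_ssd sample.

    import uff

    # Convert a frozen TensorFlow graph to UFF.
    # "frozen_inference_graph.pb" and the output node name "NMS" are
    # placeholders -- they depend on the model and on any graphsurgeon
    # preprocessing applied before conversion.
    uff_model = uff.from_tensorflow_frozen_model(
        "frozen_inference_graph.pb",
        output_nodes=["NMS"],
        output_filename="model.uff")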

Hello,

Please reference the TensorRT Developer Guide for

Faster_RCNN: https://docs.nvidia.com/deeplearning/sdk/tensorrt-developer-guide/index.html#fasterrcnn_sample

SSD_MobileNet: not supported out of the box, but you can reference the SSD MobileNet/TRT sample at GitHub - chenzhi1992/TensorRT-SSD: Use TensorRT API to implement Caffe-SSD, SSD(channel pruning), Mobilenet-SSD
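For the SSD side, a minimal Python sketch of the flow the uff_ssd sample implements (parse the UFF file and build a serialized engine) would look roughly like the block below. The input/output names, the (3, 300, 300) input shape, and the file names are assumptions taken from that sample's conventions and will differ for other graphs.

    import tensorrt as trt

    TRT_LOGGER = trt.Logger(trt.Logger.WARNING)
    # SSD graphs rely on plugin layers (e.g. NMS), so register the
    # built-in TensorRT plugins before parsing.
    trt.init_libnvinfer_plugins(TRT_LOGGER, "")

    with trt.Builder(TRT_LOGGER) as builder, \
         builder.create_network() as network, \
         trt.UffParser() as parser:
        builder.max_workspace_size = 1 << 30   # 1 GB of build workspace
        builder.max_batch_size = 1

        # Names and shape below follow the uff_ssd sample's conventions;
        # they depend on the graphsurgeon config used during UFF conversion.
        parser.register_input("Input", (3, 300, 300))
        parser.register_output("NMS")
        parser.parse("model.uff", network)

        # Build and serialize the engine for later deployment.
        engine = builder.build_cuda_engine(network)
        with open("model.engine", "wb") as f:
            f.write(engine.serialize())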

Thanks for the valuable suggestion. It looks like the suggested Faster R-CNN sample is C++ only.
Is the same supported in Python too?

I'm having the same issue. Can a Faster R-CNN model built with TensorFlow be used with the Python API?
If not, what is the workaround?