TF-TRT with DeepStream SDK

I have a .pb file created for TF-TRT inference. Is there a way I can use this file with the current DeepStream 3.0 SDK?

Hi
I just converted a .pb file to a TensorRT engine with the help of the following:
[url]https://github.com/jeng1220/KerasToTensorRT/blob/master/trt_example.py[/url]

I was able to generate the engine file and modified the config file accordingly, but the app was not running. I also want to run my engine file for inference with the DeepStream 3.0 SDK.
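For reference, this is roughly how a pre-built TensorRT engine is wired into a DeepStream nvinfer config file. This is a minimal sketch, not my actual config; the file paths, label file, and class count are placeholders, and the remaining keys depend on the model:

```ini
# Sketch of the [property] group in the nvinfer config file.
# Paths and num-detected-classes below are placeholders.
[property]
gpu-id=0
# Point directly at the serialized TensorRT engine instead of a caffemodel/uff
model-engine-file=/path/to/model.engine
labelfile-path=/path/to/labels.txt
batch-size=1
num-detected-classes=4
# 0 = FP32, 1 = INT8, 2 = FP16 (must match how the engine was built)
network-mode=0
```

Note that the engine must have been serialized with the same TensorRT version and on the same GPU architecture that DeepStream runs on, otherwise deserialization fails and the pipeline will not start.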
I have also posted regarding this in the following post:

https://devtalk.nvidia.com/default/topic/1049029/deepstream-for-tesla/getting-a-black-video-on-running-a-sample-deepstream-application/