Deploy TensorFlow model on DeepStream

I want to use a .pb model file with DeepStream. Is there a way to integrate the .pb file or otherwise use it with DeepStream?
I am using:
Jetson Nano
DeepStream 5
JetPack 2.2
TensorRT 7.0

Hi,

DeepStream uses TensorRT as its inference engine.
TensorRT currently supports the Caffe, UFF, and ONNX formats.

So please convert your .pb model into UFF or ONNX format first.
Here is a useful converter for your reference:
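
As a rough sketch of the conversion step (not the linked tool): if you take the UFF route, the uff Python package that ships with TensorRT on JetPack can convert a frozen .pb graph directly. The file and node names below are placeholders for your own model.

```python
import uff  # UFF converter bundled with TensorRT on JetPack

# Placeholder file and node names -- replace with your model's actual values.
uff.from_tensorflow_frozen_model(
    frozen_file="frozen_inference_graph.pb",  # your frozen TensorFlow graph
    output_nodes=["logits"],                  # names of the graph's output ops
    output_filename="model.uff",              # UFF file to point nvinfer/TensorRT at
)

# Alternatively, for ONNX, the tf2onnx tool can convert the same frozen graph:
#   python3 -m tf2onnx.convert --graphdef frozen_inference_graph.pb \
#       --inputs input:0 --outputs logits:0 --output model.onnx
```

Once you have the .uff or .onnx file, you can reference it from your nvinfer model configuration and let TensorRT build the engine on the first run.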

Thanks.