How to use a .pb model with DeepStream 5.0

Hello,
I have trained a fire-detection .pb model using this tutorial: https://pylessons.com/YOLOv3-TF2-custrom-train/ and I want to run this .pb model with DeepStream 5.0. How can I do this, please?

Thanks.

DeepStream 5.0

There are two solutions:

  1. Convert the TF model to ONNX and deploy it in DeepStream, which will use gst-nvinfer (TensorRT) to do inference; see the conversion sketch after this list.
    References:
    1.1 Converting to ONNX: https://elinux.org/TensorRT/ONNX#How_to_convert_your_model_to_onnx.3F
    1.2 DeepStream doc: https://docs.nvidia.com/metropolis/deepstream/dev-guide/text/DS_plugin_gst-nvinfer.html

  2. Deploy the TF model directly, which will use gst-nvinferserver (Triton) to do inference; see the model-repository sketch after this list.
    DeepStream doc: https://docs.nvidia.com/metropolis/deepstream/dev-guide/text/DS_plugin_gst-nvinferserver.html
    DeepStream sample: sources/apps/sample_apps/deepstream-segmentation-test
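
For option 1, here is a minimal conversion sketch using the tf2onnx Python API. The file name and the input/output tensor names below are assumptions, not values from your model; inspect the frozen graph (for example with Netron) and substitute the real tensor names. The tf2onnx command-line converter described in the elinux.org link above works as well.

```python
# Sketch: convert a frozen TensorFlow graph (.pb) to ONNX with tf2onnx.
# The file path and tensor names are placeholders; verify the real
# input/output tensor names of your YOLOv3 graph before running.
import tensorflow as tf
import tf2onnx

GRAPH_PB = "fire_detection.pb"    # hypothetical path to your frozen graph
INPUT_NAMES = ["input_1:0"]       # placeholder: actual input tensor name
OUTPUT_NAMES = ["Identity:0"]     # placeholder: actual output tensor name(s)

# Load the frozen GraphDef from disk.
graph_def = tf.compat.v1.GraphDef()
with tf.io.gfile.GFile(GRAPH_PB, "rb") as f:
    graph_def.ParseFromString(f.read())

# Convert to ONNX; opset 11 is widely supported by the TensorRT version
# shipped with DeepStream 5.0.
model_proto, _ = tf2onnx.convert.from_graph_def(
    graph_def,
    input_names=INPUT_NAMES,
    output_names=OUTPUT_NAMES,
    opset=11,
    output_path="fire_detection.onnx",
)
print("Wrote fire_detection.onnx")
```

After the conversion, point the gst-nvinfer config file at the result with the onnx-file property; DeepStream will build a TensorRT engine from it on first run. Note that YOLOv3 outputs usually also need a custom bounding-box parser configured in nvinfer.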
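
For option 2, gst-nvinferserver reads the model from a Triton model repository. Below is a minimal sketch that lays out such a repository for the frozen graph; the repository path, model name, tensor names, and shapes are assumptions and need to be replaced with your model's actual values. The gst-nvinferserver config file then points to this repository (see the doc and sample above).

```python
# Sketch: lay out a Triton (gst-nvinferserver) model repository for a frozen .pb.
# The repository path, model name, tensor names and dims are placeholders;
# check your graph (e.g. with Netron) and adjust config.pbtxt accordingly.
import os
import shutil

MODEL_REPO = "trtis_model_repo"   # hypothetical repository root
MODEL_NAME = "fire_detection"     # hypothetical model name
FROZEN_PB = "fire_detection.pb"   # hypothetical path to the frozen graph

# Triton expects <repo>/<model_name>/<version>/model.graphdef for a frozen graph.
version_dir = os.path.join(MODEL_REPO, MODEL_NAME, "1")
os.makedirs(version_dir, exist_ok=True)
shutil.copy(FROZEN_PB, os.path.join(version_dir, "model.graphdef"))

# Minimal config.pbtxt; input/output names and dims are placeholders.
CONFIG_PBTXT = """
name: "fire_detection"
platform: "tensorflow_graphdef"
max_batch_size: 1
input [
  {
    name: "input_1"          # placeholder: actual input tensor name
    data_type: TYPE_FP32
    dims: [ 416, 416, 3 ]    # placeholder: H x W x C expected by the model
  }
]
output [
  {
    name: "Identity"         # placeholder: actual output tensor name
    data_type: TYPE_FP32
    dims: [ -1, -1 ]         # placeholder: actual output shape
  }
]
"""
with open(os.path.join(MODEL_REPO, MODEL_NAME, "config.pbtxt"), "w") as f:
    f.write(CONFIG_PBTXT.strip() + "\n")

print("Model repository written to", os.path.join(MODEL_REPO, MODEL_NAME))
```

The repository root and model name configured for gst-nvinferserver must match this layout, and pre/post-processing for the YOLOv3 outputs still has to be set up in the nvinferserver config.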

Thanks.