Converting frozen_inference_graph.pb to .uff fails

I tried to convert the ssd_mobilenet_v2_coco model to .uff format, following this link:
[GitHub - smistad/convert-tensorflow-model-to-tensorrt-uff: Simple script to convert a frozen tensorflow .pb file to TensorRT UFF format]

The script generated a .uff file of only 164 bytes, and it does not work.

Need help!!
Thanks

Hi @sajitsalahuddin,
The UFF parser has been deprecated from TensorRT 7 onwards.
Hence we request you to use the ONNX parser instead to generate the TensorRT engine: first convert the TensorFlow frozen graph to ONNX, then build the engine from the ONNX file.
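As a rough sketch of that workflow, you can convert the frozen graph to ONNX with the tf2onnx tool and then build a TensorRT engine with trtexec. The input/output tensor names shown below (`image_tensor:0`, the `detection_*` outputs) are the usual ones for TensorFlow Object Detection API SSD models, but please verify them for your exact graph (e.g. with Netron), and adjust the file names to your setup:

```
# Convert the frozen TensorFlow graph to ONNX
# (tensor names below are assumptions for ssd_mobilenet_v2_coco; verify for your graph)
python -m tf2onnx.convert \
    --graphdef frozen_inference_graph.pb \
    --output model.onnx \
    --inputs image_tensor:0 \
    --outputs detection_boxes:0,detection_scores:0,detection_classes:0,num_detections:0

# Build a TensorRT engine from the ONNX model
trtexec --onnx=model.onnx --saveEngine=model.engine
```

Note that some TF Object Detection API ops (e.g. NonMaxSuppression variants) may need opset or plugin adjustments, so check the tf2onnx and TensorRT release notes for your versions.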

Thanks!