Convert frozen_inference_graph.pb to .uff failed!

I tried to convert the ssd_mobilenet_v2_coco model to .uff format, following this link:
https://github.com/smistad/convert-tensorflow-model-to-tensorrt-uff

The script generated a .uff file of only 164 bytes, which does not work.

Need help!!
Thanks

Hi @sajitsalahuddin,
The UFF parser has been deprecated from TRT 7 onwards.
Hence we request you to use the ONNX parser to generate the TRT engine instead.
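As a rough sketch of the ONNX route (the exact flags and opset may need adjusting for your setup; this assumes the `tf2onnx` package and the `trtexec` tool that ships with TensorRT are installed):

```
# Convert the frozen TensorFlow graph to ONNX with tf2onnx.
# The input/output tensor names below are the usual ones for the
# TF Object Detection API SSD models; verify them for your graph.
python -m tf2onnx.convert \
    --graphdef frozen_inference_graph.pb \
    --output model.onnx \
    --inputs image_tensor:0 \
    --outputs detection_boxes:0,detection_scores:0,detection_classes:0,num_detections:0 \
    --opset 11

# Build a TensorRT engine from the ONNX model using trtexec.
trtexec --onnx=model.onnx --saveEngine=model.trt
```

If `trtexec` reports unsupported ops, you may need to simplify the graph (e.g. with `onnx-simplifier`) or use a model exported in a TRT-friendly format.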


Thanks!