tensorflow.python.framework.errors_impl.InvalidArgumentError: Failed to import metagraph

I’m trying to convert a frozen TensorFlow inference model (ssd_mobilenet_v2) to TensorRT, but I’m getting the following error:

2019-11-17 12:59:30.640324: E tensorflow/core/grappler/grappler_item_builder.cc:330] Failed to detect the fetch node(s), skipping this input
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/compiler/tensorrt/trt_convert.py", line 298, in convert
    self._convert_graph_def()
  File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/compiler/tensorrt/trt_convert.py", line 226, in _convert_graph_def
    self._run_conversion()
  File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/compiler/tensorrt/trt_convert.py", line 204, in _run_conversion
    graph_id=b"tf_graph")
  File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/grappler/tf_optimizer.py", line 41, in OptimizeGraph
    verbose, graph_id)
tensorflow.python.framework.errors_impl.InvalidArgumentError: Failed to import metagraph, check error log for more info.

Here is my code:

import tensorflow as tf
from tensorflow.python.compiler.tensorrt import trt_convert as trt

# Load the frozen GraphDef (get_frozen_graph is a small helper defined elsewhere)
graph1 = get_frozen_graph("/home/xavier2/frozen_graphs/ssd_tomato_l1_frozen_graph.pb")
# Build the TF-TRT converter in FP16 mode and run the conversion
converter = trt.TrtGraphConverter(input_graph_def=graph1, precision_mode=trt.TrtPrecisionMode.FP16)
converted_graph_def = converter.convert()
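
For context, get_frozen_graph isn’t shown above; it’s just a small helper that reads the .pb file into a GraphDef, roughly like this (a minimal sketch, assuming TF 1.x-style APIs, not my exact helper):

def get_frozen_graph(pb_path):
    """Read a frozen .pb file from disk and parse it into a tf.GraphDef."""
    graph_def = tf.compat.v1.GraphDef()
    with tf.io.gfile.GFile(pb_path, "rb") as f:
        graph_def.ParseFromString(f.read())
    return graph_def

The resulting GraphDef is what gets passed to TrtGraphConverter via input_graph_def, so the converter sees the whole frozen graph and not a SavedModel.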

Hi vedanshu,

Have you already solved this problem? I’m running into the same issue today.

Thank you.