Error while converting object detection model to TensorRT

Hey, I'm trying to run the TensorRT example and I keep getting this error.
Any help is appreciated, thanks.

Converting ssd_inception_v2_coco to trt…
2020-03-25 12:56:57.029303: F tensorflow/core/util/device_name_utils.cc:92] Check failed: IsJobName(job)
Aborted (core dumped)

Hi,

Some other users have also met this issue. The root cause is that NMS cannot be run on the CPU on Jetson.
You can find more details here: JetPack-4.3 for Jetson Nano

To work around this issue, please apply the following patch provided by jkjung-avt.

diff --git a/utils/od_utils.py b/utils/od_utils.py
index 2755bb5..b8ebe1b 100644
--- a/utils/od_utils.py
+++ b/utils/od_utils.py
@@ -52,7 +52,9 @@ def build_trt_pb(model_name, pb_path, download_dir='data'):
             get_egohands_model(model_name)
     frozen_graph_def, input_names, output_names = build_detection_graph(
         config=config_path,
-        checkpoint=checkpoint_path
+        checkpoint=checkpoint_path,
+        force_nms_cpu=False,
+        force_frcn2_cpu=False,
     )
     trt_graph_def = trt.create_inference_graph(
         input_graph_def=frozen_graph_def,
@@ -77,8 +79,8 @@ def load_trt_pb(pb_path):
             node.device = '/device:GPU:0'
         if 'faster_rcnn_' in pb_path and 'SecondStage' in node.name:
             node.device = '/device:GPU:0'
-        if 'NonMaxSuppression' in node.name:
-            node.device = '/device:CPU:0'
+        #if 'NonMaxSuppression' in node.name:
+        #    node.device = '/device:CPU:0'
     with tf.Graph().as_default() as trt_graph:
         tf.import_graph_def(trt_graph_def, name='')
     return trt_graph
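
If it helps, here is a minimal usage sketch of the two patched helpers; the import path, model name, and output path below are just examples based on the file shown above, so adjust them to your setup:

# Minimal usage sketch of the patched helpers in utils/od_utils.py.
# Import path, model name, and pb_path are examples only.
from utils.od_utils import build_trt_pb, load_trt_pb

model_name = 'ssd_inception_v2_coco'
pb_path = 'data/{}_trt.pb'.format(model_name)

# Build the TF-TRT optimized graph once and cache it on disk.
build_trt_pb(model_name, pb_path, download_dir='data')

# Load the cached graph; with the patch applied, NMS nodes are no
# longer pinned to the CPU, which avoids the device-name abort.
trt_graph = load_trt_pb(pb_path)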

Thanks.

Thanks for your quick reply. He is changing the code where the TRT model is built and where the converted model is loaded; however, I am having issues in the create_inference_graph function.
Also, making the changes he suggested results in the following error:

File "/home/mehul/Robotics/jetson-detectors/src/loader/trtloader.py", line 32, in load_model
    force_frcn2_cpu=False
TypeError: build_detection_graph() got an unexpected keyword argument 'force_frcn2_cpu'

Solved! The solution was to add force_nms_cpu=False to the build_detection_graph call:
frozen_graph, input_names, output_names = build_detection_graph(
    config=config_path,
    checkpoint=checkpoint_path,
    force_nms_cpu=False
)
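
For context, here is roughly how that call fits into the rest of the conversion (a sketch assuming TensorFlow 1.x with the tf.contrib TF-TRT API; the import path follows the tf_trt_models examples, and the paths and create_inference_graph settings are illustrative, not the exact values from the repo):

# Sketch of the surrounding conversion step, assuming TensorFlow 1.x with
# TF-TRT from tf.contrib; paths and TRT settings are illustrative examples.
import tensorflow.contrib.tensorrt as trt
from tf_trt_models.detection import build_detection_graph

config_path = 'data/ssd_inception_v2_coco.config'          # example path
checkpoint_path = 'data/ssd_inception_v2_coco/model.ckpt'  # example path

frozen_graph, input_names, output_names = build_detection_graph(
    config=config_path,
    checkpoint=checkpoint_path,
    force_nms_cpu=False  # keep NMS on the GPU to avoid the crash above
)

trt_graph_def = trt.create_inference_graph(
    input_graph_def=frozen_graph,
    outputs=output_names,
    max_batch_size=1,
    max_workspace_size_bytes=1 << 25,
    precision_mode='FP16',
    minimum_segment_size=50
)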

Good to know this!
Thanks for sharing your status with us.