Error while converting object detection model to TensorRT

Hey, I'm trying to run the TensorRT example and I keep getting the error below.
Any help is appreciated, thanks.

Converting ssd_inception_v2_coco to trt…
2020-03-25 12:56:57.029303: F tensorflow/core/util/] Check failed: IsJobName(job)
Aborted (core dumped)


Some users have also met this issue. The root cause is that NMS (NonMaxSuppression) cannot be run in CPU mode on Jetson.
You can find more detail here:

To work around this issue, please use the following patch provided by jkjung-avt.

diff --git a/utils/ b/utils/
index 2755bb5..b8ebe1b 100644
--- a/utils/
+++ b/utils/
@@ -52,7 +52,9 @@ def build_trt_pb(model_name, pb_path, download_dir='data'):
     frozen_graph_def, input_names, output_names = build_detection_graph(
-        checkpoint=checkpoint_path
+        checkpoint=checkpoint_path,
+        force_nms_cpu=False,
+        force_frcn2_cpu=False,
     )
     trt_graph_def = trt.create_inference_graph(
@@ -77,8 +79,8 @@ def load_trt_pb(pb_path):
             node.device = '/device:GPU:0'
         if 'faster_rcnn_' in pb_path and 'SecondStage' in node.name:
             node.device = '/device:GPU:0'
-        if 'NonMaxSuppression' in node.name:
-            node.device = '/device:CPU:0'
+        #if 'NonMaxSuppression' in node.name:
+        #    node.device = '/device:CPU:0'
     with tf.Graph().as_default() as trt_graph:
         tf.import_graph_def(trt_graph_def, name='')
     return trt_graph
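
For reference, here is a rough usage sketch of the two patched helpers, assuming they can be imported from the utils module shown in the diff (the exact file name is truncated there, so the import line is a placeholder):

import tensorflow as tf
from utils.od_utils import build_trt_pb, load_trt_pb  # placeholder import; match your tree

pb_path = 'data/ssd_inception_v2_coco_trt.pb'   # illustrative output path
build_trt_pb('ssd_inception_v2_coco', pb_path)  # build and save the TRT-optimized graph once
trt_graph = load_trt_pb(pb_path)                # reload it with the patched GPU placement
sess = tf.Session(graph=trt_graph)              # then run detection as usual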


Thanks for your quick reply. He is changing the code that builds the TRT model and loads the converted model; however, I am having issues in the create_inference_graph function.
Also, making the changes he suggested results in the following error:

File "/home/mehul/Robotics/jetson-detectors/src/loader/", line 32, in load_model
TypeError: build_detection_graph() got an unexpected keyword argument 'force_frcn2_cpu'
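
The TypeError means the installed build_detection_graph does not accept a force_frcn2_cpu keyword at all. A quick, generic way to check which keywords your copy actually supports (the tf_trt_models.detection import path is an assumption; adjust it to your installation):

import inspect
from tf_trt_models.detection import build_detection_graph  # assumed import path

# Print the full parameter list, e.g. to see whether force_nms_cpu is available
print(inspect.signature(build_detection_graph))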

Solved! The solution was to add force_nms_cpu=False to the build_detection_graph call and drop the unsupported force_frcn2_cpu argument.
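A minimal sketch of the fixed call, reusing the checkpoint keyword from the patch above (any other arguments you pass stay as they were):

frozen_graph, input_names, output_names = build_detection_graph(
    checkpoint=checkpoint_path,  # same as before
    force_nms_cpu=False          # the fix: don't force NMS onto the CPU
)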

Good to know!
Thanks for sharing your status with us.