Converting a TensorFlow SSD model to UFF: "Warning: No conversion function registered for layer: NMS_TRT yet."

I want to run inference with TensorRT 5. Following the sample in samples/sampleUffSSD, I converted the model with:

convert-to-uff ssd_inception_v2_coco_2017_11_17/frozen_inference_graph.pb -p config.py
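For context, the sample's config.py maps TensorFlow subgraphs to TensorRT plugin ops via graphsurgeon before conversion. A minimal sketch of that kind of config.py is below — the NMS attribute values mirror the deduced NMS node in the log that follows, while the namespace mapping and GridAnchor parameters are assumptions based on the sample and may differ for other models:

```python
# config.py sketch for convert-to-uff (assumptions noted above)
import graphsurgeon as gs
import tensorflow as tf

# Replace the preprocessing subgraph with a plain NCHW placeholder.
Input = gs.create_node("Input", op="Placeholder",
                       dtype=tf.float32, shape=[1, 3, 300, 300])

# Map the postprocessing subgraph to the NMS_TRT plugin; values match
# the attributes deduced by the converter for this model.
NMS = gs.create_plugin_node(name="NMS", op="NMS_TRT",
                            shareLocation=1, varianceEncodedInTarget=0,
                            backgroundLabelId=0, confidenceThreshold=1e-8,
                            nmsThreshold=0.6, topK=100, keepTopK=100,
                            numClasses=91, inputOrder=[0, 2, 1],
                            confSigmoid=1, isNormalized=1)

# Anchor generation and the two flatten-concat inputs to NMS.
PriorBox = gs.create_plugin_node(name="GridAnchor", op="GridAnchor_TRT",
                                 numLayers=6, minSize=0.2, maxSize=0.95)
concat_box_loc = gs.create_plugin_node("concat_box_loc",
                                       op="FlattenConcat_TRT")
concat_box_conf = gs.create_plugin_node("concat_box_conf",
                                        op="FlattenConcat_TRT")

# Collapse whole TF namespaces into the plugin nodes above.
namespace_plugin_map = {
    "Preprocessor": Input,
    "Postprocessor": NMS,
    "MultipleGridAnchorGenerator": PriorBox,
    "concat": concat_box_loc,
    "concat_1": concat_box_conf,
}

def preprocess(dynamic_graph):
    dynamic_graph.collapse_namespaces(namespace_plugin_map)
    # Drop the original graph outputs so NMS becomes the sole output.
    dynamic_graph.remove(dynamic_graph.graph_outputs,
                         remove_exclusive_dependencies=False)
```

convert-to-uff calls `preprocess()` on the loaded graph before emitting the UFF file, which is why the plugin op names (NMS_TRT, GridAnchor_TRT, FlattenConcat_TRT) appear in the conversion log.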

I got the following warnings:

Using output node NMS
Converting to UFF graph
Warning: No conversion function registered for layer: NMS_TRT yet.
Converting NMS as custom op: NMS_TRT
Warning: No conversion function registered for layer: FlattenConcat_TRT yet.
Converting concat_box_conf as custom op: FlattenConcat_TRT
Warning: No conversion function registered for layer: GridAnchor_TRT yet.
Converting GridAnchor as custom op: GridAnchor_TRT
Warning: No conversion function registered for layer: FlattenConcat_TRT yet.
Converting concat_box_loc as custom op: FlattenConcat_TRT
No. nodes: 563
UFF Output written to ssd_inception_v2_coco_2017_11_17/frozen_inference_graph.uff

The full output:

Loading ssd_inception_v2_coco_2017_11_17/frozen_inference_graph.pb
WARNING: To create TensorRT plugin nodes, please use the `create_plugin_node` function instead.
WARNING: To create TensorRT plugin nodes, please use the `create_plugin_node` function instead.
UFF Version 0.5.5
=== Automatically deduced input nodes ===
[name: "Input"
op: "Placeholder"
attr {
  key: "dtype"
  value {
    type: DT_FLOAT
  }
}
attr {
  key: "shape"
  value {
    shape {
      dim {
        size: 1
      }
      dim {
        size: 3
      }
      dim {
        size: 300
      }
      dim {
        size: 300
      }
    }
  }
}
]
=========================================

=== Automatically deduced output nodes ===
[name: "NMS"
op: "NMS_TRT"
input: "concat_box_loc"
input: "concat_priorbox"
input: "concat_box_conf"
attr {
  key: "backgroundLabelId_u_int"
  value {
    i: 0
  }
}
attr {
  key: "confSigmoid_u_int"
  value {
    i: 1
  }
}
attr {
  key: "confidenceThreshold_u_float"
  value {
    f: 9.99999993923e-09
  }
}
attr {
  key: "inputOrder_u_ilist"
  value {
    list {
      i: 0
      i: 2
      i: 1
    }
  }
}
attr {
  key: "isNormalized_u_int"
  value {
    i: 1
  }
}
attr {
  key: "keepTopK_u_int"
  value {
    i: 100
  }
}
attr {
  key: "nmsThreshold_u_float"
  value {
    f: 0.600000023842
  }
}
attr {
  key: "numClasses_u_int"
  value {
    i: 91
  }
}
attr {
  key: "scoreConverter_u_str"
  value {
    s: "SIGMOID"
  }
}
attr {
  key: "shareLocation_u_int"
  value {
    i: 1
  }
}
attr {
  key: "topK_u_int"
  value {
    i: 100
  }
}
attr {
  key: "varianceEncodedInTarget_u_int"
  value {
    i: 0
  }
}
]
==========================================

Using output node NMS
Converting to UFF graph
Warning: No conversion function registered for layer: NMS_TRT yet.
Converting NMS as custom op: NMS_TRT
Warning: No conversion function registered for layer: FlattenConcat_TRT yet.
Converting concat_box_conf as custom op: FlattenConcat_TRT
Warning: No conversion function registered for layer: GridAnchor_TRT yet.
Converting GridAnchor as custom op: GridAnchor_TRT
Warning: No conversion function registered for layer: FlattenConcat_TRT yet.
Converting concat_box_loc as custom op: FlattenConcat_TRT
No. nodes: 563
UFF Output written to ssd_inception_v2_coco_2017_11_17/frozen_inference_graph.uff

Hi,

From TRT Engineering: the UFF converter has no way of knowing whether a plugin exists for a given op, so it emits a warning for any op it doesn't recognize and converts it as a custom op instead. You can safely ignore these warnings.
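The plugin implementations are only looked up later, when the UFF file is parsed and the engine is built. A sketch using the TensorRT 5 Python API (the file path and input/output names come from the log above; note that NMS_TRT and GridAnchor_TRT ship in libnvinfer_plugin, while FlattenConcat_TRT may require the custom plugin from the sample depending on your TensorRT version):

```python
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

# Register the built-in *_TRT plugins from libnvinfer_plugin with the
# plugin registry; without this, parsing the UFF fails on NMS_TRT etc.
trt.init_libnvinfer_plugins(TRT_LOGGER, '')

with trt.Builder(TRT_LOGGER) as builder, \
     builder.create_network() as network, \
     trt.UffParser() as parser:
    # Input shape matches the deduced Placeholder: 1x3x300x300 (CHW).
    parser.register_input("Input", (3, 300, 300))
    parser.register_output("NMS")
    parser.parse("ssd_inception_v2_coco_2017_11_17/frozen_inference_graph.uff",
                 network)
    builder.max_workspace_size = 1 << 30
    engine = builder.build_cuda_engine(network)
```

If the engine builds, the warnings from convert-to-uff had no effect on the result.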

Thanks,
NVIDIA Enterprise Support