ssdlite_mobilenetv2 can't be converted to a TensorRT engine

Environment

TensorRT Version: 7.0
GPU Type: TX2
Nvidia Driver Version:
CUDA Version: 10.2
CUDNN Version:
Operating System + Version: ubuntu 18.04
Python Version (if applicable): 3.6
TensorFlow Version (if applicable): 1.15
PyTorch Version (if applicable):
Baremetal or Container (if container which image + tag):

Description

When I try to convert the ssdlite_mobilenetv2 model to a TensorRT engine, the conversion to UFF succeeds, but the UFF file can't be converted to an engine.

Using output node NMS
Converting to UFF graph
Warning: No conversion function registered for layer: NMS_TRT yet.
Converting NMS as custom op: NMS_TRT
WARNING:tensorflow:From /usr/lib/python3.6/dist-packages/uff/converters/tensorflow/converter.py:179: The name tf.AttrValue is deprecated. Please use tf.compat.v1.AttrValue instead.

Warning: No conversion function registered for layer: FlattenConcat_TRT yet.
Converting concat_box_conf as custom op: FlattenConcat_TRT
Warning: No conversion function registered for layer: FlattenConcat_TRT yet.
Converting concat_box_loc as custom op: FlattenConcat_TRT
DEBUG [/usr/lib/python3.6/dist-packages/uff/converters/tensorflow/converter.py:96] Marking ['NMS'] as outputs
No. nodes: 589
UFF Output written to tmp.uff
#assertionnmsPlugin.cpp,82
Aborted (core dumped)

But I can convert ssd_mobilenetv2 successfully. Comparing the two graphs, I found that ssdlite_mobilenetv2 has no "Concatenate" operation, so the PriorBox op's output can't be connected to the NMS op's input, which leads to the nmsPlugin assertion error. I don't know how to modify the config.py below, though. Can someone help? Thanks!


import graphsurgeon as gs

path = 'model/ssd_mobilenet_v2_coco_2018_03_29/frozen_inference_graph.pb'
TRTbin = 'TRT_ssd_mobilenet_v2_coco_2018_03_29.bin'
output_name = ['NMS']
dims = [3,300,300]
layout = 7

def add_plugin(graph):
    all_assert_nodes = graph.find_nodes_by_op("Assert")
    graph.remove(all_assert_nodes, remove_exclusive_dependencies=True)

    all_identity_nodes = graph.find_nodes_by_op("Identity")
    graph.forward_inputs(all_identity_nodes)

    Input = gs.create_plugin_node(
        name="Input",
        op="Placeholder",
        shape=[1, 3, 300, 300]
    )

    PriorBox = gs.create_plugin_node(
        name="GridAnchor",
        op="GridAnchor_TRT",
        minSize=0.2,
        maxSize=0.95,
        aspectRatios=[1.0, 2.0, 0.5, 3.0, 0.33],
        variance=[0.1,0.1,0.2,0.2],
        featureMapShapes=[19, 10, 5, 3, 2, 1],
        numLayers=6
    )

    NMS = gs.create_plugin_node(
        name="NMS",
        op="NMS_TRT",
        shareLocation=1,
        varianceEncodedInTarget=0,
        backgroundLabelId=0,
        confidenceThreshold=1e-8,
        nmsThreshold=0.6,
        topK=100,
        keepTopK=100,
        numClasses=91,
        inputOrder=[1, 0, 2],
        confSigmoid=1,
        isNormalized=1
    )

    concat_priorbox = gs.create_node(
        "concat_priorbox",
        op="ConcatV2",
        axis=2
    )

    concat_box_loc = gs.create_plugin_node(
        "concat_box_loc",
        op="FlattenConcat_TRT",
    )

    concat_box_conf = gs.create_plugin_node(
        "concat_box_conf",
        op="FlattenConcat_TRT",
    )

    namespace_plugin_map = {
        "MultipleGridAnchorGenerator": PriorBox,
        "Postprocessor": NMS,
        "Preprocessor": Input,
        "ToFloat": Input,
        "image_tensor": Input,
        "Concatenate": concat_priorbox,
        "concat": concat_box_loc,
        "concat_1": concat_box_conf
    }

    graph.collapse_namespaces(namespace_plugin_map)
    # strip the old graph outputs, and drop the spurious "Input" edge that
    # collapsing the namespaces leaves on the NMS node
    graph.remove(graph.graph_outputs, remove_exclusive_dependencies=False)
    graph.find_nodes_by_op("NMS_TRT")[0].input.remove("Input")

    return graph
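Since ssdlite_mobilenetv2 has no "Concatenate" namespace to map, concat_priorbox is never wired into the graph, so the NMS node ends up without its priorbox input and the nmsPlugin assertion fires. One possible direction (a sketch only, untested on this model; the node names "GridAnchor" and "concat_priorbox" and the input wiring are assumptions carried over from the script above) is to append the concat node explicitly at the end of add_plugin and connect it by hand:

```
    # Sketch: wire GridAnchor -> concat_priorbox -> NMS manually, since
    # there is no "Concatenate" namespace to collapse in ssdlite.
    graph.collapse_namespaces(namespace_plugin_map)
    graph.remove(graph.graph_outputs, remove_exclusive_dependencies=False)

    graph.append(concat_priorbox)  # node created earlier with gs.create_node

    nms_node = graph.find_nodes_by_op("NMS_TRT")[0]
    if "Input" in nms_node.input:
        nms_node.input.remove("Input")

    # feed the collapsed GridAnchor node into the new concat, then feed
    # the concat into NMS as its priorbox input
    concat_priorbox.input.append("GridAnchor")
    if "concat_priorbox" not in nms_node.input:
        nms_node.input.append("concat_priorbox")

    return graph
```

Whether the three inputs then arrive at the NMS plugin in the order it expects still depends on the inputOrder parameter, so that may need adjusting as well.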


Hi @zryang_ok,

Please refer to the below post.

If the issue persists, please share your model and script.

Thanks!

Hi, @AdkankshaS.
Thanks for your reply. I can run the ssd_mobilenetv2 sample from the TensorRT samples successfully. Now I want to run ssdlite_mobilenetv2 with TensorRT, but it fails. The model and script are here:
ssdlite_mobilenetv2 to tensorrt
Can you take a look? Thanks!

Hi,
Apologies for the delayed response.
I couldn't reproduce your issue.
However, I could convert your model to ONNX successfully, but when converting that to TRT I ran into "Unsupported ONNX data type: UINT8 (2)", as UINT8 input is currently not supported in TRT.

Thanks!