Problem converting customised trained ssd mobilenet v2 to tensorrt

Loading models/ssd_mobilenet_v2/frozen_inference_graph.pb
NOTE: UFF has been tested with TensorFlow 1.12.0. Other versions are not guaranteed to work
WARNING: The version of TensorFlow installed on this system is not guaranteed to work with UFF.
WARNING: To create TensorRT plugin nodes, please use the create_plugin_node function instead.
WARNING: To create TensorRT plugin nodes, please use the create_plugin_node function instead.
UFF Version 0.6.3
=== Automatically deduced input nodes ===
[name: "Input"
op: "Placeholder"
input: "Cast"
attr {
key: "dtype"
value {
type: DT_FLOAT
}
}
attr {
key: "shape"
value {
shape {
dim {
size: 1
}
dim {
size: 3
}
dim {
size: 300
}
dim {
size: 300
}
}
}
}
]

Using output node NMS
Converting to UFF graph
Warning: No conversion function registered for layer: NMS_TRT yet.
Converting NMS as custom op: NMS_TRT
Warning: No conversion function registered for layer: FlattenConcat_TRT yet.
Converting concat_box_loc as custom op: FlattenConcat_TRT
Warning: No conversion function registered for layer: Cast yet.
Converting Cast as custom op: Cast
Warning: No conversion function registered for layer: GridAnchor_TRT yet.
Converting GridAnchor as custom op: GridAnchor_TRT
Warning: No conversion function registered for layer: FlattenConcat_TRT yet.
Converting concat_box_conf as custom op: FlattenConcat_TRT
No. nodes: 571
UFF Output written to models/ssd_mobilenet_v2/frozen_inference_graph.uff

Hi,

I got the warnings above when converting ssd_mobilenet_v2 to the TensorRT UFF format.

Any help would be appreciated!

Hi Nvidia Team,

It would be helpful if you could reply to this as soon as possible.

Irrespective of whether a node is a plugin node or simply unsupported, the UFF conversion will print these warnings. You can go ahead with the created UFF file, as long as you are sure you will cover all the unsupported layers with plugin nodes.

Hi Sir,

Thank you for your reply. For the unsupported operations above, I don't know how to write the plugin layers and map them to TensorRT (I am facing an issue while converting the UFF file to an engine).

It would be very helpful if you could share any code or links; that would give me a better understanding and help me clear the error.

Hi,

The warnings indicate that some layers of your model are not supported by TensorRT.
This will require you to write a plugin implementation for the unsupported layers.

For ssd_mobilenet, we have already implemented the required plugins and included them in our sample.
You can find it in this folder: /usr/src/tensorrt/samples/python/uff_ssd/
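For reference, the uff_ssd sample maps the unsupported TensorFlow nodes onto TensorRT plugin ops with a graphsurgeon preprocessing config passed to convert-to-uff via `-p`. The sketch below follows that sample's pattern; the plugin parameters (e.g. `numClasses`, `featureMapShapes`, `inputOrder`) and the namespace names must be adjusted to match your own trained graph, so treat these values as placeholders rather than a drop-in config.

```python
# config.py -- graphsurgeon preprocessing sketch for convert-to-uff,
# modeled on the uff_ssd sample. Values below are assumptions for a
# stock 300x300 SSD MobileNet V2; adjust them for a custom model
# (in particular numClasses = your classes + 1 background).
import graphsurgeon as gs
import tensorflow as tf

Input = gs.create_node("Input", op="Placeholder",
                       dtype=tf.float32, shape=[1, 3, 300, 300])
PriorBox = gs.create_plugin_node(
    name="GridAnchor", op="GridAnchor_TRT",
    minSize=0.2, maxSize=0.95,
    aspectRatios=[1.0, 2.0, 0.5, 3.0, 0.33],
    variance=[0.1, 0.1, 0.2, 0.2],
    featureMapShapes=[19, 10, 5, 3, 2, 1], numLayers=6)
NMS = gs.create_plugin_node(
    name="NMS", op="NMS_TRT",
    shareLocation=1, varianceEncodedInTarget=0, backgroundLabelId=0,
    confidenceThreshold=1e-8, nmsThreshold=0.6, topK=100, keepTopK=100,
    numClasses=91, inputOrder=[0, 2, 1], confSigmoid=1, isNormalized=1)
concat_priorbox = gs.create_node("concat_priorbox", op="ConcatV2",
                                 dtype=tf.float32, axis=2)
concat_box_loc = gs.create_plugin_node("concat_box_loc",
                                       op="FlattenConcat_TRT",
                                       dtype=tf.float32, axis=1, ignoreBatch=0)
concat_box_conf = gs.create_plugin_node("concat_box_conf",
                                        op="FlattenConcat_TRT",
                                        dtype=tf.float32, axis=1, ignoreBatch=0)

# Collapse whole TF namespaces into single plugin nodes.
namespace_plugin_map = {
    "MultipleGridAnchorGenerator": PriorBox,
    "Postprocessor": NMS,
    "Preprocessor": Input,
    "ToFloat": Input,
    "image_tensor": Input,
    "MultipleGridAnchorGenerator/Concatenate": concat_priorbox,
    "concat": concat_box_loc,
    "concat_1": concat_box_conf,
}

def preprocess(dynamic_graph):
    dynamic_graph.collapse_namespaces(namespace_plugin_map)
    # Drop the old graph outputs so NMS becomes the sole output.
    dynamic_graph.remove(dynamic_graph.graph_outputs,
                         remove_exclusive_dependencies=False)
```

It would then be invoked as, for example, `convert-to-uff frozen_inference_graph.pb -O NMS -p config.py`.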

Thanks.

Hello,

I have built "libflattenconcat.so" successfully and added its path to convert_to_uff.py, but the same warnings still occur
when converting ssd_mobilenet_v2 to TensorRT:

Using output node NMS
Converting to UFF graph
Warning: No conversion function registered for layer: NMS_TRT yet.
Converting NMS as custom op: NMS_TRT
Warning: No conversion function registered for layer: GridAnchor_TRT yet.
Converting GridAnchor as custom op: GridAnchor_TRT
Warning: No conversion function registered for layer: FlattenConcat_TRT yet.
Converting concat_box_loc as custom op: FlattenConcat_TRT
Warning: No conversion function registered for layer: FlattenConcat_TRT yet.
Converting concat_box_conf as custom op: FlattenConcat_TRT
No. nodes: 1112
UFF Output written to sample_ssd_relu6.uff

Hi,

Do you see any actual "ERROR" with it?
The log looks good to me.

Please note that TensorRT links the plugin implementation when creating the engine.
The UFF parser has no way to know whether a plugin implementation exists,
so it prints these warning messages, but they don't affect the generation of the UFF file.

Thanks.