Converting ssd_inception_v2_coco to TensorRT


I am trying to convert a retrained ssd_inception_v2_coco model.

The original model I downloaded was this one:

I used the "uff_ssd" sample and I get the following error:

[TensorRT] ERROR: UFFParser: Validator error: Cast: Unsupported operation _Cast
Building TensorRT engine. This may take few minutes.
[TensorRT] ERROR: Network must have at least one output
Traceback (most recent call last):
  File "", line 218, in <module>
  File "", line 215, in main
  File "/usr/src/tensorrt/samples/python/uff_ssd/utils/", line 70, in __init__
    engine_utils.save_engine(self.trt_engine, trt_engine_path)
  File "/usr/src/tensorrt/samples/python/uff_ssd/utils/", line 83, in save_engine
    buf = engine.serialize()
AttributeError: 'NoneType' object has no attribute 'serialize'

I uploaded my source code as well as my retrained model:

I do have an input named Cast, but I cannot remove it.

Any idea what I can do?


The model file has "Cast" operations, which are not supported by the UFF parser.
Please refer to the link below for the supported UFF operations:

You can try tf2onnx + the ONNX parser as an alternative.
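For reference, a tf2onnx conversion might look like the following. The tensor names are assumptions based on a typical TF Object Detection API SSD graph; verify them against your own frozen graph before running:

```shell
# Convert the frozen TensorFlow graph to ONNX (requires: pip install tf2onnx).
# Input/output tensor names below are typical for TF Object Detection API
# SSD models -- check your own graph, they may differ.
python -m tf2onnx.convert \
    --input frozen_inference_graph.pb \
    --inputs image_tensor:0 \
    --outputs detection_boxes:0,detection_scores:0,detection_classes:0,num_detections:0 \
    --output ssd_inception_v2.onnx \
    --opset 11
```

The resulting ONNX file can then be fed to TensorRT's ONNX parser instead of the UFF path.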

Supported ops:



As far as I know, starting from tensorflow models r1.13.0, when you use the script "", it automatically adds the "Cast" operation to your frozen model.

I fixed this issue by checking out r1.12.0 again, exporting the frozen model, and then converting it to UFF.
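That workflow, sketched as shell commands. The paths, the export script name, and its flags are illustrative assumptions (they depend on your training setup); the source post does not name them:

```shell
# Illustrative only: return to the r1.12.0 tag of tensorflow/models,
# re-export the frozen graph, then convert it to UFF.
cd models
git checkout r1.12.0

# Re-export with the Object Detection API export script
# (script name and flags assumed -- adjust to your setup).
python object_detection/export_inference_graph.py \
    --input_type image_tensor \
    --pipeline_config_path /path/to/pipeline.config \
    --trained_checkpoint_prefix /path/to/model.ckpt \
    --output_directory /path/to/exported

# convert-to-uff ships with TensorRT; -p points at the config.py
# that defines the namespace_plugin_map.
convert-to-uff /path/to/exported/frozen_inference_graph.pb -O NMS -p config.py
```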

In the file, add the "Cast" operation as an Input, for example:

namespace_plugin_map = {
    "MultipleGridAnchorGenerator": PriorBox,
    "Postprocessor": NMS,
    "Preprocessor": Input,
    "Cast": Input,
    "ToFloat": Input,
    "image_tensor": Input,
    "Concatenate": concat_priorbox,
    "Identity": concat_priorbox,
    "concat": concat_box_loc,
    "concat_1": concat_box_conf,
}
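To illustrate what this map does, here is a small self-contained sketch of namespace collapsing. It mimics graphsurgeon's `collapse_namespaces` conceptually (every node under a mapped top-level namespace collapses into one plugin node); it is not the real implementation, and the node names are made up:

```python
# Toy illustration: nodes whose top-level namespace appears in the map
# are replaced by the mapped plugin; one plugin node per namespace.
namespace_plugin_map = {
    "Postprocessor": "NMS",
    "Preprocessor": "Input",
    "Cast": "Input",
    "MultipleGridAnchorGenerator": "PriorBox",
}

def collapse(node_names, plugin_map):
    """Map each node to a plugin by its top-level namespace,
    keeping unmapped nodes unchanged and deduplicating in order."""
    collapsed = [plugin_map.get(name.split("/")[0], name)
                 for name in node_names]
    seen, result = set(), []
    for n in collapsed:
        if n not in seen:
            seen.add(n)
            result.append(n)
    return result

nodes = ["Cast", "Preprocessor/sub", "Preprocessor/mul",
         "FeatureExtractor/conv1", "Postprocessor/BatchMultiClassNMS"]
print(collapse(nodes, namespace_plugin_map))
# ['Input', 'FeatureExtractor/conv1', 'NMS']
```

Mapping both "Cast" and "Preprocessor" to Input means the parser never sees the unsupported Cast node: it is absorbed into the Input placeholder.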