Retraining a custom MobilenetV2-fn model and using it on TensorRT

Hello, I trained a custom mobilenetv2_fn model with 6 classes on TensorFlow 1.15.
I then converted it to a frozen .pb file.
But when building the TensorRT engine, I get the following error:

Using output node NMS
Converting to UFF graph
Warning: No conversion function registered for layer: NMS_TRT yet.
Converting NMS as custom op: NMS_TRT
Warning: No conversion function registered for layer: FlattenConcat_TRT yet.
Converting concat_box_conf as custom op: FlattenConcat_TRT
Warning: No conversion function registered for layer: Unpack yet.
Converting Preprocessor/unstack as custom op: Unpack
Warning: No conversion function registered for layer: GridAnchor_TRT yet.
Converting GridAnchor as custom op: GridAnchor_TRT
Warning: No conversion function registered for layer: FlattenConcat_TRT yet.
Converting concat_box_loc as custom op: FlattenConcat_TRT
DEBUG [/usr/lib/python3.6/dist-packages/uff/converters/tensorflow/converter.py:96] Marking ['NMS'] as outputs
No. nodes: 612
UFF Output written to ssd_mobilenet_v2_coco_2018_03_29/frozen_inference_graph_1.uff
[TensorRT] ERROR: UffParser: Validator error: Preprocessor/unstack: Unsupported operation _Unpack
Building TensorRT engine, this may take a few minutes…
[TensorRT] ERROR: Network must have at least one output
Traceback (most recent call last):
File "main.py", line 371, in
buf = trt_engine.serialize()
AttributeError: 'NoneType' object has no attribute 'serialize'

Could you please help me fix this?
I have attached the .pb file and the Python script I used.
Any help would be much appreciated.

https://drive.google.com/drive/folders/1DrCFP3T0mFSm1GNzRp8aude-Ona6SoMz?usp=sharing
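For reference, the graphsurgeon preprocessing config I pass to convert-to-uff is modeled on NVIDIA's sampleUffSSD config.py. The plugin node names match the warnings above; the anchor parameters, the inputOrder, and the class count (6 classes + background = 7) are my assumptions and may need adjusting for a retrained graph:

```python
# config.py for convert-to-uff, modeled on TensorRT's sampleUffSSD sample
import graphsurgeon as gs
import tensorflow as tf

Input = gs.create_node("Input", op="Placeholder",
                       dtype=tf.float32, shape=[1, 3, 300, 300])

PriorBox = gs.create_plugin_node(name="GridAnchor", op="GridAnchor_TRT",
    numLayers=6, minSize=0.2, maxSize=0.95,
    aspectRatios=[1.0, 2.0, 0.5, 3.0, 0.33],
    variance=[0.1, 0.1, 0.2, 0.2],
    featureMapShapes=[19, 10, 5, 3, 2, 1])

NMS = gs.create_plugin_node(name="NMS", op="NMS_TRT",
    shareLocation=1, varianceEncodedInTarget=0, backgroundLabelId=0,
    confidenceThreshold=1e-8, nmsThreshold=0.6, topK=100, keepTopK=100,
    numClasses=7,            # assumption: 6 classes + background
    inputOrder=[0, 2, 1],    # assumption: depends on the exported graph
    confSigmoid=1, isNormalized=1)

concat_priorbox = gs.create_node("concat_priorbox", op="ConcatV2",
                                 dtype=tf.float32, axis=2)
concat_box_loc = gs.create_plugin_node("concat_box_loc", op="FlattenConcat_TRT",
                                       dtype=tf.float32, axis=1, ignoreBatch=0)
concat_box_conf = gs.create_plugin_node("concat_box_conf", op="FlattenConcat_TRT",
                                        dtype=tf.float32, axis=1, ignoreBatch=0)

namespace_plugin_map = {
    "MultipleGridAnchorGenerator": PriorBox,
    "Postprocessor": NMS,
    "Preprocessor": Input,   # also swallows Preprocessor/unstack (Unpack)
    "ToFloat": Input,
    "image_tensor": Input,
    "MultipleGridAnchorGenerator/Concatenate": concat_priorbox,
    "concat": concat_box_loc,
    "concat_1": concat_box_conf,
}

def preprocess(dynamic_graph):
    # fold the unsupported TF namespaces into the plugin nodes above
    dynamic_graph.collapse_namespaces(namespace_plugin_map)
    # NMS becomes the sole output, so drop the old graph outputs
    dynamic_graph.remove(dynamic_graph.graph_outputs,
                         remove_exclusive_dependencies=False)
```

Collapsing the Preprocessor namespace into the Input node is what should remove the unsupported Preprocessor/unstack (Unpack) op from the converted graph.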

Hi @akulov.eugen,
The UFF parser has been deprecated from TRT 7 onwards, so we recommend using the ONNX parser with the latest TRT release instead.

Thanks!
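If you do move to the ONNX route, the usual flow is tf2onnx followed by trtexec. The tensor names and flags below are illustrative and would need to match your actual export:

```shell
# 1) export the frozen graph to ONNX (input/output tensor names must
#    match your export; opset 11 is a common choice for SSD graphs)
python3 -m tf2onnx.convert \
    --graphdef frozen_inference_graph.pb \
    --inputs image_tensor:0 \
    --outputs detection_boxes:0,detection_scores:0,detection_classes:0,num_detections:0 \
    --opset 11 \
    --output model.onnx

# 2) build and benchmark a TensorRT engine from the ONNX file
trtexec --onnx=model.onnx --saveEngine=model.engine --fp16
```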

I am using TRT 6 for now, and I believe the ONNX-TensorRT path is slower than native TensorRT. I have run into two issues.
====== First Issue ======
[TensorRT] ERROR: UffParser: Validator error: FeatureExtractor/MobilenetV2/Cast_2: Unsupported operation _Cast
====== Second Issue ======
I downloaded the pretrained checkpoint and .pb files from Google.
But when I convert that checkpoint to a .pb myself following the guide and compare it with the downloaded .pb file, the graph structures differ.
What is wrong?
Could you please help me?

Hi @akulov.eugen
The UFF parser is not supported in the latest TRT releases.
Another option you can try is TF-TRT.

Thanks!
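As a sketch of what the TF-TRT route looks like on TF 1.15 (the graph filename and the "NMS" output node name are assumptions for this model):

```python
import tensorflow as tf
from tensorflow.python.compiler.tensorrt import trt_convert as trt

# load the frozen GraphDef
with tf.io.gfile.GFile("frozen_inference_graph.pb", "rb") as f:
    frozen_graph_def = tf.compat.v1.GraphDef()
    frozen_graph_def.ParseFromString(f.read())

# replace TensorRT-compatible subgraphs with TRT engine ops; anything
# unsupported (e.g. the NMS postprocessing) keeps running in TensorFlow
converter = trt.TrtGraphConverter(
    input_graph_def=frozen_graph_def,
    nodes_blacklist=["NMS"],   # output node(s) to keep in TF
    precision_mode="FP16",
    is_dynamic_op=True)
trt_graph_def = converter.convert()
```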

To be clear, I am using TensorRT 6, not the latest version.

Could you please help?

Hi @akulov.eugen
UFF support itself has been deprecated.
Hence we suggest moving to the latest TRT release.

Thanks!

I really do need to stay on the old version.
Could you help with that?

You may need to write a custom plugin to support the unsupported layers.

Thanks!
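On the TRT 6 Python side, using plugins mostly means making sure their creators are registered before parsing. A sketch (the .so path is an assumption for a self-compiled plugin):

```python
import ctypes
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

# make the stock plugins (GridAnchor_TRT, NMS_TRT, ...) visible to the parser
trt.init_libnvinfer_plugins(TRT_LOGGER, "")

# a plugin you compile yourself just needs to be loaded so its creator
# self-registers with the plugin registry
ctypes.CDLL("./libflattenconcat.so")  # assumption: path to your build
```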

I tried the tensorflow → onnx → tensorrt route for my custom SSD MobileNet model.
I generated the ONNX file successfully, but converting it to TensorRT keeps failing.
I fixed the uint8 issue, but many other issues remain.
I have gone through several related articles without finding a solution that works for me.
Please help.

Hi @akulov.eugen,
Could you please share your ONNX model?

Thanks!