[TensorRT] ERROR: UffParser: Validator error: TRTEngineOp_0: Unsupported operation _TRTEngineOp


I tried to convert a .pb model into a serialized TensorRT engine (.bin), but I got this error:

Warning: No conversion function registered for layer: TRTEngineOp yet.
Converting TRTEngineOp_0 as custom op: TRTEngineOp
DEBUG [/usr/lib/python3.6/dist-packages/uff/converters/tensorflow/converter.py:96] Marking ['NMS'] as outputs
No. nodes: 19
UFF Output written to /home/hyebin/project/tensorrt_demos/ssd/tmp_v2_face.uff
UFF Text Output written to /home/hyebin/project/tensorrt_demos/ssd/tmp_v2_face.pbtxt
[TensorRT] ERROR: UffParser: Validator error: TRTEngineOp_0: Unsupported operation _TRTEngineOp
[TensorRT] ERROR: Network must have at least one output
Traceback (most recent call last):
  File "build_engine.py", line 230, in <module>
  File "build_engine.py", line 224, in main
    buf = engine.serialize()
AttributeError: 'NoneType' object has no attribute 'serialize'

How can I solve this?


TensorRT Version: 6
GPU Type: Jetson TX2, JetPack 4.3
Nvidia Driver Version:
CUDA Version: 10
CUDNN Version: 7
Operating System + Version:
Python Version (if applicable): 3.6
TensorFlow Version (if applicable):
PyTorch Version (if applicable):
Baremetal or Container (if container which image + tag):

TRTEngineOp is a TF-TRT node that wraps an already TensorRT-optimized subgraph, so ideally it should not appear in a UFF or .pb model file at all.
It looks like your .pb model is a TF-TRT model, which contains TRTEngineOp nodes. The UFF parser won't work on that. You need to either get the original TensorFlow model and run the UFF parser on that, or just use TF-TRT directly.
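One quick way to confirm whether a frozen graph has already been through TF-TRT is to look for TRTEngineOp nodes in the GraphDef dump. As a minimal sketch (the helper name `find_ops` and the regex over the .pbtxt text are my own; this assumes the text-format dump, like the tmp_v2_face.pbtxt your converter wrote, declares each node's type as `op: "..."`):

```python
import re

def find_ops(pbtxt_text):
    """Collect the op types declared in a GraphDef .pbtxt dump."""
    return set(re.findall(r'op:\s*"([^"]+)"', pbtxt_text))

# In practice you would read your real dump, e.g.:
#   pbtxt_text = open("tmp_v2_face.pbtxt").read()
# Here is a tiny inline sample for illustration:
sample = '''
node {
  name: "TRTEngineOp_0"
  op: "TRTEngineOp"
}
node {
  name: "NMS"
  op: "NMS_TRT"
}
'''

ops = find_ops(sample)
if "TRTEngineOp" in ops:
    # A TF-TRT engine node is present, so the UFF parser will reject the model.
    print("Graph contains TRTEngineOp: this is a TF-TRT model, not a plain TF graph.")
```

If `TRTEngineOp` shows up, go back to the original (pre-TF-TRT) frozen graph before running the UFF converter.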