Describe the problem
I am converting my trained model to UFF and running inference on it.
The trained model is ‘ssd_resnet_v1_fpn’ from ‘[url]https://github.com/tensorflow/models/tree/master/research/object_detection[/url]’.
I have referred to ‘/workspace/tensorrt/samples/sampleUffSSD/README.txt’ inside the nvidia-docker container.
* line 10: “Steps to generate UFF file” → this step works; the UFF file is generated successfully.
From /workspace/tensorrt/bin#
./sample_uff_ssd
----------------- error result -----------------
…/data/ssd/sample_ssd.uff
Begin parsing model…
ERROR: UFFParser: Validator error: MultiscaleGridAnchorGenerator/ToNormalizedCoordinates_4/Cast_1: Unsupported operation _Cast
ERROR: sample_uff_ssd: Fail to parse
sample_uff_ssd: sampleUffSSD.cpp:667: int main(int, char): Assertion `tmpEngine != nullptr’ failed.
Aborted (core dumped)
For unsupported layers, users can extend TensorRT functionality by implementing custom layers using the IPluginV2 class for the C++ and Python APIs. Custom layers, often referred to as plugins, are implemented and instantiated by an application, and their lifetime must span their use within a TensorRT engine. https://docs.nvidia.com/deeplearning/sdk/tensorrt-developer-guide/index.html#extending
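Besides writing a plugin, a common workaround for this particular error is to remove the unsupported `Cast` nodes during UFF conversion with a graphsurgeon preprocessing script, since they usually perform only a dtype conversion that the parser can skip. The sketch below is a hypothetical `config.py` to pass to `convert-to-uff` via `-p`; it assumes the `_Cast` nodes in this graph can safely be bypassed, which you would need to verify for your model.

```python
# config.py -- hypothetical preprocessing script for convert-to-uff (-p config.py).
# Assumption: the Cast nodes (e.g.
# MultiscaleGridAnchorGenerator/ToNormalizedCoordinates_4/Cast_1) only change
# dtype and can be removed without affecting the network's outputs.
import graphsurgeon as gs

def preprocess(dynamic_graph):
    # Collect every node whose op is "Cast".
    cast_nodes = dynamic_graph.find_nodes_by_op("Cast")
    # Rewire each Cast node's inputs directly to its consumers and drop the
    # node itself, so the UFF parser never encounters the unsupported _Cast op.
    dynamic_graph.forward_inputs(cast_nodes)
```

You would then regenerate the UFF file with something like `convert-to-uff frozen_inference_graph.pb -O NMS -p config.py` (adjust the output node name and paths to your setup) and rerun `./sample_uff_ssd`.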