For unsupported layers, users can extend TensorRT functionality by implementing custom layers with the IPluginV2 class in the C++ and Python APIs. Custom layers, often referred to as plugins, are implemented and instantiated by an application, and their lifetime must span their use within a TensorRT engine. See: https://docs.nvidia.com/deeplearning/sdk/tensorrt-developer-guide/index.html#extending
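For reference, here is a minimal Python sketch of how a registered plugin can be looked up in the plugin registry and added to a network. The plugin name "MyPlugin", its version, and its fields are placeholders and must match whatever your plugin library actually registers:

```python
import numpy as np
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

# Load the built-in plugins (plus any custom plugins in linked libraries).
trt.init_libnvinfer_plugins(TRT_LOGGER, "")

# Look up the creator for a hypothetical plugin called "MyPlugin", version "1".
registry = trt.get_plugin_registry()
creator = registry.get_plugin_creator("MyPlugin", "1")
assert creator is not None, "Plugin not found in the registry"

# Instantiate the plugin with its fields (names/types depend on the plugin).
fields = trt.PluginFieldCollection([
    trt.PluginField("scale", np.array([2.0], dtype=np.float32),
                    trt.PluginFieldType.FLOAT32),
])
plugin = creator.create_plugin("my_plugin_instance", fields)

# Add the plugin layer to a network (input shape is just an example).
builder = trt.Builder(TRT_LOGGER)
network = builder.create_network()
inp = network.add_input("input", trt.float32, (3, 224, 224))
layer = network.add_plugin_v2(inputs=[inp], plugin=plugin)
network.mark_output(layer.get_output(0))
```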
Hi! Sorry to bump this old topic to the top of the list, but I'm trying to build a TRT engine from a custom frozen.pb file. It converts to the UFF format successfully, but the conversion to TRT always fails with:
[TensorRT] ERROR: UffParser: Graph error: Cycle graph detected
[TensorRT] ERROR: Network must have at least one output
I saw that @madhav.chamle removed the map_fn operation from the TensorFlow graph, but I don't know whether that is a good idea, and if it is, where to find and remove this operation. If you have any other ideas, I would really appreciate them.
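In case it helps anyone hitting the same errors: the "Cycle graph detected" message usually comes from the control-flow and TensorArray nodes (Enter, Exit, Merge, NextIteration, etc.) that tf.map_fn inserts, which the UFF parser cannot handle, and "Network must have at least one output" typically means no output node was registered or resolved. Below is a rough sketch for locating those nodes and registering the output explicitly when parsing. The file name frozen.pb, the output node name, and the input name/shape are placeholders for your own model:

```python
import tensorflow as tf
import tensorrt as trt
import uff

FROZEN_PB = "frozen.pb"          # path to your frozen graph (assumption)
OUTPUT_NODE = "logits/BiasAdd"   # placeholder -- use your model's real output node

# 1) List the control-flow / TensorArray ops that tf.map_fn generates;
#    these are what trigger the "Cycle graph detected" error.
graph_def = tf.compat.v1.GraphDef()
with open(FROZEN_PB, "rb") as f:
    graph_def.ParseFromString(f.read())

loop_ops = {"Enter", "Exit", "Merge", "NextIteration", "LoopCond", "TensorArrayV3"}
for node in graph_def.node:
    if node.op in loop_ops:
        print("loop-related node:", node.name, node.op)

# 2) Convert to UFF and parse, registering input/output explicitly so the
#    parser knows where the network ends.
uff_model = uff.from_tensorflow_frozen_model(FROZEN_PB, output_nodes=[OUTPUT_NODE])

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)
with trt.Builder(TRT_LOGGER) as builder, builder.create_network() as network, \
        trt.UffParser() as parser:
    parser.register_input("input", (3, 224, 224))   # name/shape are placeholders
    parser.register_output(OUTPUT_NODE)
    if not parser.parse_buffer(uff_model, network):
        print("UFF parse failed -- the map_fn subgraph likely needs to be "
              "rewritten or replaced with a plugin before conversion.")
```

If the loop-related nodes do show up, the usual options are to rewrite the model so map_fn is replaced with batched/vectorized ops before freezing, or to cut that subgraph out and implement it as a custom IPluginV2 plugin as described above.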