[TensorRT] ERROR: UFFParser: Graph error: Cycle graph detected

TensorRT 5.0.2
TensorFlow 1.6
Ubuntu 16.04
Graphics Quadro M2200
Python version 2.7
CUDA version 9.0

Here is the script:

import numpy as np
import pycuda.driver as cuda
# This import causes pycuda to automatically manage CUDA context creation and cleanup.
import pycuda.autoinit
import tensorrt as trt
import sys, os

# You can set the logger severity higher to suppress messages (or lower to display more messages).
TRT_LOGGER = trt.Logger(trt.Logger.WARNING)
model_file = "./frozen_model.uff"

with trt.Builder(TRT_LOGGER) as builder, builder.create_network() as network, trt.UffParser() as parser:
    parser.register_input("input/X_placeholder", (3, 160, 320))
    parser.register_input("input/Placeholder", (3, 160, 320))
    parser.register_output("output/Reshape_1")
    parser.parse("./frozen_model.uff", network)
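    # (Not part of the failing repro above; for completeness, once parse()
    # succeeds the engine would be built roughly like this. The batch size and
    # workspace size are placeholder values, not taken from the real script.)
    builder.max_batch_size = 1
    builder.max_workspace_size = 1 << 30
    engine = builder.build_cuda_engine(network)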


Thanks,
Madhav

Hello,

To help us debug, can you share "./frozen_model.uff"?

Hello,

parser.parse("./frozen_model.uff", network)

I am getting the error on this line.

thanks,
Madhav

Correct. It'd help us debug if we could see what frozen_model.uff is. Can you share it?

Hello,

Is there any tool or API that we can use to read or visualize the UFF file?

If yes, please let me know; I will inspect it and share the relevant info regarding the error.

Thanks,
Madhav

Hello,

The "cycle graph detected" error was solved by removing the map_fn operation from the TensorFlow graph.

The error I am getting now is related to Switch:

[TensorRT] ERROR: UFFParser: Validator error: stage4_3/Gconv1x1/Gconv1x1_0/cond/Switch: Unsupported operation _Switch
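For anyone hitting the same thing, a rough way to locate these nodes before converting is to scan the frozen graph for control-flow ops (map_fn shows up as while-loop scaffolding, tf.cond as Switch/Merge). The sketch below is only an illustration, not the exact script I used, and it assumes the frozen graph is saved as ./frozen_model.pb:

import tensorflow as tf

graph_def = tf.GraphDef()
with tf.gfile.GFile("./frozen_model.pb", "rb") as f:
    graph_def.ParseFromString(f.read())

# map_fn produces while-loop nodes (Enter/Exit/NextIteration/LoopCond),
# tf.cond produces Switch/Merge nodes.
control_flow_ops = {"Switch", "Merge", "Enter", "Exit", "NextIteration", "LoopCond"}
for node in graph_def.node:
    if node.op in control_flow_ops:
        print("%s %s" % (node.op, node.name))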

Thanks,
Madhav Chamle

Hello,
It looks like the model you are converting to TensorRT contains an unsupported operation "_Switch".
For a list of supported operations, please reference: https://docs.nvidia.com/deeplearning/sdk/tensorrt-developer-guide/index.html#support_op

For unsupported layers, users can extend TensorRT functionality by implementing custom layers with the IPluginV2 class in the C++ and Python APIs. Custom layers, often referred to as plugins, are implemented and instantiated by an application, and their lifetime must span their use within a TensorRT engine. https://docs.nvidia.com/deeplearning/sdk/tensorrt-developer-guide/index.html#extending
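As a rough illustration only (the plugin type "MyCondPlugin" and its empty field collection below are placeholders, not a plugin shipped with TensorRT), instantiating a registered plugin from the Python API looks roughly like this; exact calls may differ slightly between TensorRT versions:

import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)
trt.init_libnvinfer_plugins(TRT_LOGGER, "")  # load the plugins shipped with TensorRT

registry = trt.get_plugin_registry()
creator = registry.get_plugin_creator("MyCondPlugin", "1")  # placeholder type name / version
if creator is not None:
    plugin = creator.create_plugin("my_cond", trt.PluginFieldCollection())
    # "some_tensor" would be an ITensor already present in the network:
    # layer = network.add_plugin_v2(inputs=[some_tensor], plugin=plugin)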

regards,
NVIDIA Enterprise Support

Hi! Sorry to bump this old topic to the top of the list, but I'm trying to build a TRT engine from a custom frozen .pb file. It converted to the UFF format successfully, but the conversion to TRT always gives me the error:

[TensorRT] ERROR: UffParser: Graph error: Cycle graph detected
[TensorRT] ERROR: Network must have at least one output

I saw that @madhav.chamle removed the map_fn operation from TensorFlow, but I don't know whether that's a good idea, and if it is, where to find and remove this operation. If you have any other ideas, I would really appreciate them.