How to solve UFFParser error?

Hello, I want to run a model (SSD MobileNet v2) trained on my own data on a Jetson Xavier.

First of all, we exported frozen_inference_graph.pb from the model.ckpt-0000 checkpoint.

We want to use this weight file for object detection on the Xavier with the detectnet C++ application.

To do this, the .pb file must be converted to a .uff file, but the following error occurred:
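For context, this is roughly how the conversion was attempted. This is an illustrative sketch only: the `convert-to-uff` tool ships with the TensorRT UFF package, `-O NMS` names the output node, and `config.py` is the graphsurgeon preprocessing script from TensorRT's sampleUffSSD sample (which remaps unsupported TensorFlow ops to plugin nodes); the filenames are placeholders.

```shell
# Sketch: convert the frozen TensorFlow graph to UFF on the host PC.
# config.py (from TensorRT's sampleUffSSD) rewrites graph nodes that the
# UFF parser cannot handle directly.
convert-to-uff frozen_inference_graph.pb \
    -o ssd_mobilenet_v2_coco.uff \
    -O NMS \
    -p config.py
```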

UFFParser: Validator error: FeatureExtractor/MobilenetV2/expanded_conv_15/output: Unsupported operation Identity
failed to parse UFF model ‘networks/SSD-Mobilenet-v2/ssd_mobilenet_v2_coco.uff’
device GPU, failed to load networks/SSD-Mobilenet-v2/ssd_mobilenet_v2_coco.uff
detectnet – failed to initialize.
detectnet-camera: failed to load detectNet model

The UFF conversion was performed on a different PC (not the Xavier), where the TensorRT version is 6.0.1.5.

The Xavier's TensorRT version is 5.0.3.

I have searched this problem a number of times.
I think the Xavier's TensorRT version (5.0.3) does not support the Identity operation, so we should upgrade the TensorRT version. Is this right?
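One way to confirm the mismatch is to check the TensorRT version on both machines before converting, since a .uff file generated with a newer converter may contain ops the older parser rejects. These are standard diagnostic commands; output will vary by system:

```shell
# Check the installed TensorRT packages (Debian-based systems such as
# Jetson L4T and Ubuntu hosts).
dpkg -l | grep -i tensorrt

# If the Python binding is installed, print its version directly.
python3 -c "import tensorrt; print(tensorrt.__version__)"
```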

I hope you are staying safe from the coronavirus. I will wait for your reply.

Hi gulb1602,
Did you encounter the error while using TLT? I am afraid this is not a TLT-related topic; UFF conversion is not compatible with the TLT workflow.