TensorRT and TensorFlow: conversion to UFF succeeded, but the model seems to use random weights

I have a ResNet model in .pb format with an accuracy of 93.5%, but after converting it to UFF format the accuracy drops to 0.49% (1000 classes), as if the model were using random weights. I don't know how to solve this problem and would appreciate a reply.

Package versions:

tensorrt 5.1.5.0
uff 0.6.3
tensorflow-gpu 1.13.1
graphsurgeon 0.4.1

NOTE: UFF has been tested with TensorFlow 1.12.0. Other versions are not guaranteed to work
WARNING: The version of TensorFlow installed on this system is not guaranteed to work with UFF.
UFF Version 0.6.3
=== Automatically deduced input nodes ===
[name: "input"
op: "Placeholder"
attr {
  key: "dtype"
  value {
    type: DT_FLOAT
  }
}
attr {
  key: "shape"
  value {
    shape {
      dim {
        size: 1
      }
      dim {
        size: 224
      }
      dim {
        size: 224
      }
      dim {
        size: 3
      }
    }
  }
}
]

Using output node logits
Converting to UFF graph
Warning: keepdims is ignored by the UFF Parser and defaults to True
No. nodes: 279
UFF Output written to /root/tensorrt/tensorrt-master/tftrt/examples/image-classification/model/r50_93.5.uff
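
For reference, the conversion was done with the standard UFF converter. A minimal sketch of the equivalent Python API call is below; the .pb path is a placeholder, and the output node name is the one reported in the log above.

import uff

# Sketch of the pb -> UFF conversion (assumes the .pb is a *frozen* graph,
# i.e. all variables have already been folded into constants).
uff.from_tensorflow_frozen_model(
    frozen_file="model/r50_93.5.pb",        # placeholder path to the frozen graph
    output_nodes=["logits"],                # output node reported in the log above
    output_filename="model/r50_93.5.uff",   # same output path as in the log
)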

Hi,
Try converting your model to ONNX with tf2onnx instead, and then convert it to TensorRT using the ONNX parser. Any layers that are not supported need to be replaced with custom plugins (see the sketch after the links below).

https://github.com/onnx/tensorflow-onnx
https://github.com/onnx/onnx-tensorrt/blob/master/operators.md
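
A minimal sketch of the two steps, assuming a frozen graph at model/r50_93.5.pb and the input/output tensor names from the UFF log above (adjust the paths and tensor names to your graph):

import subprocess
import tensorrt as trt

# Step 1: frozen .pb -> ONNX using the tf2onnx command-line converter.
subprocess.run(
    [
        "python", "-m", "tf2onnx.convert",
        "--input", "model/r50_93.5.pb",      # frozen TensorFlow graph (placeholder path)
        "--inputs", "input:0",               # input tensor, per the UFF log above
        "--outputs", "logits:0",             # output tensor, per the UFF log above
        "--output", "model/r50_93.5.onnx",
    ],
    check=True,
)

# Step 2: parse the ONNX model and build an engine (TensorRT 5.x Python API sketch).
TRT_LOGGER = trt.Logger(trt.Logger.WARNING)
builder = trt.Builder(TRT_LOGGER)
network = builder.create_network()
parser = trt.OnnxParser(network, TRT_LOGGER)
with open("model/r50_93.5.onnx", "rb") as f:
    if not parser.parse(f.read()):
        for i in range(parser.num_errors):
            print(parser.get_error(i))       # unsupported layers show up here
builder.max_batch_size = 1
builder.max_workspace_size = 1 << 30         # 1 GB
engine = builder.build_cuda_engine(network)

If the parser reports errors, those are the layers that need a custom plugin or a graphsurgeon replacement.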

Thanks