Convert pb to uff

Hello, I am new to the Jetson Nano. I trained a MobileNet SSD V2 model and saved the checkpoint (ckpt) file, then converted the frozen .pb graph to a UFF file, but the conversion failed with `IndexError: list index out of range`. How can I solve this?
My JetPack version is 4.2.3 and TensorRT is 5.1.6.1.

frozen_inference_graph.pb (18.5 MB)

Hi @sheena40920
The UFF and Caffe parsers have been deprecated from TensorRT 7 onwards, so we request that you try the ONNX parser instead.
Please check the link below for the same.

Thanks

OK, I’ll try it. Thank you for your reply.

Sorry, I would like to ask: if I update TensorRT to version 7.2 from GitHub, do I have to move to the matching latest JetPack version before it can be used?

Sorry, I would like to ask: I tried the .pb-to-ONNX conversion using GitHub - onnx/tensorflow-onnx: Convert TensorFlow, Keras and Tflite models to ONNX and it succeeded, but when I run inference, it reports that uint8 is not supported. Is there any way to solve this?

model.onnx (17.9 MB)

Hi @sheena40920 ,

Please refer to the link below:

Thanks

Sorry, I would like to ask: during training I had the following graph_rewriter section in the ssd_mobilenet_v2_quantized_300x300_coco.config file:

```
graph_rewriter {
  quantization {
    delay: 48000
    weight_bits: 8
    activation_bits: 8
  }
}
```

With this quantization section in place, uint8 appeared when I converted the model to an ONNX file. What is the reason?

ssd_mobilenet_v2_quantized_300x300_coco.config (4.9 KB)