Description
Hi, I recently upgraded to TensorRT 10.1.0.27 but can no longer load my UFF model, since the UFF parser was deprecated and has been removed. I can't convert the model to ONNX because I no longer have the original .pb or .h5 model. Is there any way to run inference on this model in TensorRT 10 without retraining a new .pb and creating an ONNX from it?
Thanks.
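For reference, here is a minimal sketch of how I checked that the UFF parser is gone from the Python bindings. The `uff_parser_status` helper name is mine; it just probes whether `tensorrt` is importable and whether it still exposes `UffParser` (present in 8.x, removed in 10.x):

```python
import importlib.util


def uff_parser_status():
    """Report whether the installed TensorRT Python bindings expose UffParser.

    Returns 'available' on versions that still ship the UFF parser (e.g. 8.x),
    'removed' on versions where it is gone (e.g. 10.x), or a note if the
    tensorrt package is not installed at all.
    """
    spec = importlib.util.find_spec("tensorrt")
    if spec is None:
        return "tensorrt not installed"
    import tensorrt as trt
    return "available" if hasattr(trt, "UffParser") else "removed"


print(uff_parser_status())
```

On my TensorRT 10.1.0.27 install this reports `removed`, which is why the model can no longer be loaded.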
Environment
TensorRT Version: 10.1.0.27
GPU Type: RTX 3070
Nvidia Driver Version: r555
CUDA Version: 12.5
CUDNN Version: 8.9.7.29
Operating System + Version: Linux/Windows
Python Version (if applicable): N/A
TensorFlow Version (if applicable): N/A
PyTorch Version (if applicable): N/A
Baremetal or Container (if container which image + tag): Baremetal