Importing an ONNX model for inference with TensorRT

Hi,

Are you seeing an error like "Unsupported ONNX data type: UINT8"?

This is a known issue: ONNX models exported from some frameworks (TensorFlow, for example) declare UINT8 as the input data type, while TensorRT expects FP32 and does not support UINT8 network inputs.
You can fix this by changing the input data type with our ONNX GraphSurgeon API.
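For reference, here is a minimal sketch of one way to do this (assuming the onnx and onnx-graphsurgeon Python packages are installed; the file names model_uint8.onnx and model_fp32.onnx are placeholders):

```python
import numpy as np
import onnx
import onnx_graphsurgeon as gs

# Load the model that TensorRT rejects because of its UINT8 input.
graph = gs.import_onnx(onnx.load("model_uint8.onnx"))

# Retype every UINT8 graph input to FP32 so the TensorRT ONNX parser accepts it.
for inp in graph.inputs:
    if inp.dtype == np.uint8:
        inp.dtype = np.float32

# Save the patched model; feed FP32 data to this input at inference time.
onnx.save(gs.export_onnx(graph), "model_fp32.onnx")
```

Note that this only changes the declared input type: your application must now feed FP32 data, and if the graph has an explicit Cast node right after the input, you may need to remove or adjust that node as well.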

Please check the comment below for detailed information:

Thanks.