Unsupported ONNX data type: UINT8 (2)

Hi,
I used to have the same problem when I tried to convert ssd_mobilenet_v3.pb → ONNX → TensorRT engine.
Converting to ONNX is not a problem, and the model still works in ONNX Runtime. The problem is the TensorRT ONNX parser, which does not support UINT8 inputs.
So I replaced the input type in the TensorFlow frozen graph from UINT8 to Float32. I described it with a code example here