Hi @sheena40920 ,
Looks like this is a known issue: TensorRT does not natively support the UINT8 data type. When the ONNX model is generated, the observed input data type is UINT8, and modifying the ONNX model by hand can be error-prone.
Please refer to the following; you may find it useful.
GitHub issue — opened 21 Feb 2020, 02:57 PM UTC · labels: bug, triaged
Following the tutorial notebook https://github.com/onnx/tensorflow-onnx/blob/master/tutorials/ConvertingSSDMobilenetToONNX.ipynb, I am trying to convert MobileNetV2 and V3 frozen models from TensorFlow (frozen_inference_graph.pb or saved_model.pb) to ONNX and then to TensorRT files.
Under the NGC dockers 20.01-tf1-py3 and 19.05-py3, I am using both this project and tensorflow-onnx.
I always get different issues; the furthest I got was under 20.01-tf1-py3 with both onnx-tensorrt and tensorflow-onnx on their master branches, installing both projects from source.
I was able to create the .onnx file, but when I try to create the .trt file I get the following:
```
onnx2trt /media/bnascimento/project/frozen_inference_graph.onnx -o /media/bnascimento/project/frozen_inference_graph.trt
----------------------------------------------------------------
Input filename: /media/bnascimento/project/frozen_inference_graph.onnx
ONNX IR version: 0.0.6
Opset version: 10
Producer name: tf2onnx
Producer version: 1.6.0
Domain:
Model version: 0
Doc string:
----------------------------------------------------------------
Parsing model
Unsupported ONNX data type: UINT8 (2)
ERROR: image_tensor:0:190 In function importInput:
[8] Assertion failed: convertDtype(onnxDtype.elem_type(), &trtDtype)
```
I suspect this has to do with the input tensor for the image, but I don't know how to avoid this issue. Has anyone run into similar issues before?
Cheers
Bruno
Hi everyone,
I have trained an SSD MobileNet v2 model on my dataset. It converted to ONNX successfully, but converting the ONNX model to a TensorRT engine throws an error due to the unsupported data type UINT8.
Is there any workaround to generate the TensorRT engine? If I have to retrain the model with a supported data type, how do I change the model's data type from UINT8 to a supported one?
TensorRT version: 7.1
Thank You
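Retraining should not be necessary: once the graph input is declared FLOAT, the remaining change is host-side, casting each uint8 image to float32 before feeding the engine. A minimal NumPy sketch (the `[0, 1]` scaling shown here is an assumption; match whatever normalization the original TensorFlow graph performed):

```python
import numpy as np

def preprocess(image_u8: np.ndarray) -> np.ndarray:
    """Cast an NHWC uint8 image batch to float32 on the host, so the
    TensorRT engine can be built with a FLOAT input instead of UINT8.
    The /255 scaling is illustrative; mirror the original graph's math."""
    return image_u8.astype(np.float32) / 255.0

img = np.zeros((1, 300, 300, 3), dtype=np.uint8)
batch = preprocess(img)
print(batch.dtype, batch.shape)  # float32 (1, 300, 300, 3)
```

The float32 batch is then what gets copied into the engine's input binding.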
If you still need further assistance, we recommend you post your query on the Jetson forum to get better help.
Discussions relating to the Jetson DevKits and other Embedded computing devices
Thank you.