I'm trying to convert the TensorFlow model zoo implementation of Mask R-CNN to TensorRT using the ONNX parser.
I'm running in the NGC container TensorRT 19.02 (TensorRT v5.0.2) with CUDA 10.0 on a GTX 1050.
When I load the generated ONNX file I get the error "Unsupported ONNX data type: UINT8 (2)".
The code to load the ONNX file is taken from the NVIDIA docs:
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)
with trt.Builder(TRT_LOGGER) as builder, builder.create_network() as network, trt.OnnxParser(network, TRT_LOGGER) as parser:
    with open(onnx_file_path, 'rb') as model:  # onnx_file_path: path to my .onnx file
        parser.parse(model.read())
(I've found https://github.com/onnx/onnx-tensorrt/issues/400, which suggests using ONNX GraphSurgeon to edit the ONNX file. That led to the following error during parsing: "[TensorRT] ERROR: Parameter check failed at: …/builder/Network.cpp::addInput::406, condition: isValidDims(dims)", for which I have not found any suggestions online.)
I am not able to upload the ONNX file; it's 53 MB and the upload simply fails.
I've tried check_model.py; it returns:
onnx.checker.check_model(model)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/root/.local/lib/python3.6/site-packages/onnx/checker.py", line 102, in check_model
    C.check_model(protobuf_string)
onnx.onnx_cpp2py_export.checker.ValidationError: No Op registered for LogicalAnd with domain_version of 6
==> Context: Bad node spec: input: "trip_count__153"…
…
Unsupported ONNX data type: UINT8 (2)
ERROR: /home/erisuser/p4sw/sw/gpgpu/MachineLearning/DIT/release/5.0/parsers/onnxOpenSource/ModelImporter.cpp:54 In function importInput:
[8] Assertion failed: convert_dtype(onnx_tensor_type.elem_type(), &trt_dtype)
failed to parse onnx file
Engine could not be created
Engine could not be created
Hi @omerbrandis,
If your model is not passing the ONNX checker, it won't parse through trtexec, so you first need to check the TF-to-ONNX conversion.
If the issue persists, you can raise it here.
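As a sketch of that re-conversion step (all paths and tensor names below are illustrative, taken from the usual TF model zoo frozen-graph layout): ops such as LogicalAnd often fail to validate at low opsets, so re-exporting with tf2onnx at a higher opset is worth trying first.

```
# Re-export the frozen graph at a higher opset; the model zoo
# Mask R-CNN ships a uint8 image_tensor input, which also explains
# the UINT8 error on the TensorRT side.
python -m tf2onnx.convert \
    --graphdef frozen_inference_graph.pb \
    --inputs image_tensor:0 \
    --outputs num_detections:0,detection_boxes:0,detection_scores:0,detection_classes:0 \
    --output mask_rcnn.onnx \
    --opset 11
```

Then re-run onnx.checker.check_model on the new file before handing it to the TensorRT parser.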
Also, support for TRT <= 5 has been deprecated, so we recommend moving to the latest TRT.
Thanks!