Error while parsing pre-trained ONNX model with TensorRT on Jetson Nano

Hi,

I am trying to parse this ONNX model from the ONNX Model Zoo:

https://s3.amazonaws.com/onnx-model-zoo/arcface/resnet100.onnx

I use the following code to parse the model:

import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

def build_engine_onnx(model_file):
    # Create the builder, an empty network, and an ONNX parser
    with trt.Builder(TRT_LOGGER) as builder, builder.create_network() as network, trt.OnnxParser(network, TRT_LOGGER) as parser:
        # Read the serialized ONNX model and populate the network
        with open(model_file, 'rb') as model:
            parser.parse(model.read())
        return builder.build_cuda_engine(network)

engine = build_engine_onnx('resnet100.onnx')

When I run this function I get the following error:

[TensorRT] ERROR: Network must have at least one output

system info:
TensorRT: 5.0.6.3-1+cuda10.0
CUDA: 10.0
Jetpack: 4.2

This is due to a number of unsupported layers in my model.

It looks like your ONNX model is currently not supported by TensorRT.
As you mentioned, this is likely due to a number of unsupported layers in the model.
If you can recreate the model using only supported layers and data types, you should get further in the parsing step.
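To see exactly which layers are rejected, note that `parser.parse()` returns `False` on failure and records each individual error, which you can read back with `parser.num_errors` and `parser.get_error(i)`. Below is a minimal diagnostic sketch based on the code in the original post (it assumes the TensorRT 5.x Python API installed by JetPack; `diagnose_onnx` is a hypothetical helper name):

```python
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

def diagnose_onnx(model_file):
    with trt.Builder(TRT_LOGGER) as builder, \
         builder.create_network() as network, \
         trt.OnnxParser(network, TRT_LOGGER) as parser:
        with open(model_file, 'rb') as model:
            ok = parser.parse(model.read())
        if not ok:
            # Print every error the parser recorded, e.g. which
            # ONNX ops TensorRT could not convert
            for i in range(parser.num_errors):
                print(parser.get_error(i))
            return None
        return builder.build_cuda_engine(network)

engine = diagnose_onnx('resnet100.onnx')
```

The printed errors should name the unsupported ops, which tells you what to replace (or implement as a plugin) before the network can be built.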