Error while parsing pre-trained ONNX model with TensorRT on Jetson Nano


I am trying to parse this ONNX model from the ONNX model zoo:

I use the following code to parse the model:

def build_engine_onnx(model_file):
    with trt.Builder(TRT_LOGGER) as builder, builder.create_network() as network, trt.OnnxParser(network, TRT_LOGGER) as parser:
        with open(model_file, 'rb') as model:
            # Parse the ONNX file into the TensorRT network definition.
            parser.parse(model.read())
        return builder.build_cuda_engine(network)

engine = build_engine_onnx('resnet100.onnx')

When I run this function I get the following error:

[TensorRT] ERROR: Network must have at least one output

system info:
CUDA: 10.0
Jetpack: 4.2

I suspect this is due to a number of unsupported layers in my model.

It seems your ONNX model is currently not supported by TensorRT.
As you mentioned, this may be due to a number of unsupported layers in the model.
If you can recreate the model using only supported layers and datatypes, you may be able to get further in the parsing step.
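One way to find out which layers are being rejected is to check the boolean that `parser.parse()` returns and dump the parser's error list (`num_errors` and `get_error(i)` are part of the `trt.OnnxParser` Python API). This is just a sketch assuming the builder/network/parser setup from your snippet:

```python
def report_parse_errors(parser, model_path):
    """Parse model_path with a trt.OnnxParser and print every error
    TensorRT recorded. Returns True if parsing succeeded."""
    with open(model_path, 'rb') as f:
        ok = parser.parse(f.read())
    if not ok:
        # Each error entry typically names the unsupported op/layer.
        for i in range(parser.num_errors):
            print(parser.get_error(i))
    return ok
```

If this prints errors before the "Network must have at least one output" message, those are the layers you would need to replace or implement as plugins.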