Failed to convert onnx to tensorrt model


Hello, I'm trying to convert an ONNX model to a TensorRT engine, but the engine fails to build with the following error:

After concat removal: 148 layers
Graph construction and optimization completed in 0.126679 seconds.
Constructing optimization profile number 0 [1/1].
Builder timing cache: created 0 entries, 0 hit(s)
…/builder/cudnnBuilder2.cpp (2025) - Assertion Error in getSupportedFormats: 0 (!formats.empty())
create engine failed
Aborted (core dumped)
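A quick sanity check, assuming the `trtexec` binary that ships with TensorRT is available on the PATH, is to try the same conversion from the command line and see whether the failure reproduces outside the application (the file names here are placeholders):

```shell
# Build an engine directly from the ONNX file; --verbose logs
# which layer the builder is processing when it aborts.
trtexec --onnx=model.onnx --saveEngine=model.engine --fp16 --verbose
```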

My code is as follows:

void compile_onnx(vector<string> args) {
    bool use_fp16{false};
    if ((args.size() >= 4) && args[3] == "--fp16") use_fp16 = true;

    TrtSharedEnginePtr engine = parse_to_engine(args[1], use_fp16);
    serialize(engine, args[2]);
}
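For context, here is a minimal sketch of what my `parse_to_engine` does with the TensorRT 7 C++ API (`TrtSharedEnginePtr` and the logger are my own helpers, not part of any official sample):

```cpp
#include <NvInfer.h>
#include <NvOnnxParser.h>
#include <iostream>
#include <memory>
#include <string>

using TrtSharedEnginePtr = std::shared_ptr<nvinfer1::ICudaEngine>;

// Minimal logger required by the TensorRT builder.
class Logger : public nvinfer1::ILogger {
    void log(Severity severity, const char* msg) override {
        if (severity <= Severity::kWARNING) std::cout << msg << std::endl;
    }
} gLogger;

TrtSharedEnginePtr parse_to_engine(const std::string& onnx_path, bool use_fp16) {
    auto builder = nvinfer1::createInferBuilder(gLogger);
    // ONNX models require an explicit-batch network definition.
    const auto flags = 1U << static_cast<uint32_t>(
        nvinfer1::NetworkDefinitionCreationFlag::kEXPLICIT_BATCH);
    auto network = builder->createNetworkV2(flags);
    auto parser = nvonnxparser::createParser(*network, gLogger);

    if (!parser->parseFromFile(onnx_path.c_str(),
            static_cast<int>(nvinfer1::ILogger::Severity::kWARNING))) {
        std::cerr << "failed to parse " << onnx_path << std::endl;
        return nullptr;
    }

    auto config = builder->createBuilderConfig();
    config->setMaxWorkspaceSize(1U << 30);  // 1 GiB workspace
    if (use_fp16 && builder->platformHasFastFp16())
        config->setFlag(nvinfer1::BuilderFlag::kFP16);

    auto engine = builder->buildEngineWithConfig(*network, *config);
    return TrtSharedEnginePtr(engine,
        [](nvinfer1::ICudaEngine* e) { if (e) e->destroy(); });
}
```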



TensorRT Version: 7.1.3
GPU Type: TX2
Nvidia Driver Version: JetPack 4.4.1
CUDA Version: 10.2
cuDNN Version: 8.0
Operating System + Version: Ubuntu 18.04

Hi @863268961,
Could you share your ONNX model so that we can assist you better?

How can I share my ONNX model? Uploading the file here is not supported.

Hi @863268961,
You can DM your model, or upload it to Drive and share the link.