ERROR: builtin_op_importers.cpp:2179 In function importPad inputs.at(1).is_weights()

Description

I’m converting a model from Darknet -> PyTorch -> ONNX -> TensorRT. I can run inference on the ONNX model correctly now, but I get an error when I use trtexec to convert the model.
The error message is as follows:

ERROR: builtin_op_importers.cpp:2179 In function importPad:
[8] Assertion failed: inputs.at(1).is_weights()

Environment

TensorRT Version: 7.1.3
GPU Type: Jetson NX iGPU
Nvidia Driver Version:
CUDA Version: 10.2
CUDNN Version: 8
Operating System + Version: Ubuntu 18.04

Relevant Files

https://drive.google.com/file/d/1pi8gYR7t8sPJkRFINB3QgcyVM10IhdK3/view?usp=sharing

Steps To Reproduce

sudo ./trtexec --onnx=test.onnx --explicitBatch

Hi @jackgao0323,

Kindly provide access to the model.
Thanks!

OK, I have given you access.

Hi @jackgao0323,
I could reproduce the issue with your model.
But I am afraid TRT currently does not support convolutions where the weights are (dynamic) tensors rather than constants.

Thanks!

Hi @AakankshaS
The error is from padding, not from convolutions, isn’t it? How can I solve this problem?

Thanks!

Could you please try onnx_graphsurgeon’s fold_constants functionality? It might be able to resolve this issue.
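Something like this minimal sketch should work, assuming your model file is test.onnx:

import onnx
import onnx_graphsurgeon as gs

# Load the graph and fold constant subgraphs, e.g. the computation
# that produces the Pad node's pads input.
graph = gs.import_onnx(onnx.load("test.onnx"))
graph.fold_constants()
graph.cleanup().toposort()

onnx.save(gs.export_onnx(graph), "test_folded.onnx")

Then pass test_folded.onnx to trtexec as before.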
Thanks!

When I use PyTorch to export the ONNX model, I have already enabled constant folding. Should I run onnx-graphsurgeon’s fold_constants again?
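For reference, this is roughly my export call (a minimal sketch; the model and input here are stand-ins for my actual Darknet->PyTorch network, and the opset version is illustrative):

import torch
import torch.nn as nn

# Stand-ins for the actual network and a sample input of the right shape.
model = nn.Sequential(nn.ReflectionPad2d(1), nn.Conv2d(3, 16, 3))
dummy_input = torch.randn(1, 3, 416, 416)

torch.onnx.export(
    model,
    dummy_input,
    "test.onnx",
    opset_version=11,          # illustrative opset version
    do_constant_folding=True,  # the export-time constant folding I mentioned
)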

Hi @jackgao0323,
Apologies for late response.
You might be able to resolve this by manually setting the Pad node’s pads input to a constant 1-D tensor using onnx-graphsurgeon, as sketched below.
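For example, a minimal sketch along these lines (the pad values below are illustrative; use the values your model actually needs):

import numpy as np
import onnx
import onnx_graphsurgeon as gs

graph = gs.import_onnx(onnx.load("test.onnx"))

for node in graph.nodes:
    # Replace a dynamically computed pads input with a constant.
    if node.op == "Pad" and not isinstance(node.inputs[1], gs.Constant):
        # ONNX Pad expects a 1-D int64 tensor of length 2 * rank, e.g.
        # [0, 0, 1, 1, 0, 0, 1, 1] pads an NCHW tensor by 1 on each
        # spatial border. Illustrative values only.
        pads = np.array([0, 0, 1, 1, 0, 0, 1, 1], dtype=np.int64)
        node.inputs[1] = gs.Constant(name=node.name + "_pads", values=pads)

graph.cleanup().toposort()
onnx.save(gs.export_onnx(graph), "test_fixed.onnx")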

Thanks!

Hi @AakankshaS,

I’ve got the answer in

Thanks for answering.
