[8] Assertion failed: ctx->network()->hasExplicitPrecision() && "TensorRT only supports multi-input conv for explicit precision QAT networks!"

I'm hitting the error in the title: a simple ONNX model cannot be converted to a TensorRT engine using onnx2trt.

I posted a script that reproduces the issue on the GitHub repo, but no one has responded…

Please help!!

Hi @LucasJin,

The weights need to be initialized in the ONNX model. You can try ONNX Optimizer and ONNX Simplifier on the ONNX model, and then run TensorRT with the processed model.

For your reference:
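Here is a minimal sketch of running both tools from Python; the file names (`model.onnx`, `model_processed.onnx`) are placeholders, not paths from your setup:

```python
import onnx
from onnxsim import simplify  # onnx-simplifier
import onnxoptimizer

# Load the original model (placeholder path)
model = onnx.load("model.onnx")

# ONNX Simplifier: folds constants and removes redundant nodes
model_simp, check = simplify(model)
assert check, "Simplified model could not be validated"

# ONNX Optimizer: apply the default optimization passes
model_opt = onnxoptimizer.optimize(model_simp)

onnx.save(model_opt, "model_processed.onnx")
```

Then point onnx2trt (or trtexec) at the processed model.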

Thank you.

@spolisetty I tried; onnx-simplifier and the optimizer don't actually optimize this model. In other words, the processed model is still the same as the original.

And the simplified model raises the same error.
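Since simplification leaves the model unchanged, it may be worth checking whether the Conv weights are actually stored as initializers, which is what the assertion about multi-input conv usually points at. A minimal sketch, assuming the model file is `model.onnx` (placeholder path):

```python
import onnx

model = onnx.load("model.onnx")  # placeholder path
graph = model.graph

# Names of all tensors stored as initializers (i.e. constant weights)
initializer_names = {init.name for init in graph.initializer}

# Flag Conv nodes whose weight input is not an initializer; the assertion
# fires when a Conv takes its weights as a dynamic tensor input outside of
# an explicit-precision (QAT) network.
for node in graph.node:
    if node.op_type == "Conv" and len(node.input) >= 2:
        weight_name = node.input[1]
        if weight_name not in initializer_names:
            print(f"Conv '{node.name}' reads weights from '{weight_name}', "
                  "which is not an initializer")
```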

Hi @LucasJin,

We are looking into this issue. Please follow up here for further assistance.

Thank you.