I want to build a TensorRT engine from an ONNX model using the trtexec tool, but it fails:

[03/06/2022-16:51:43] [E] [TRT] ModelImporter.cpp:751: --- End node ---
[03/06/2022-16:51:43] [E] [TRT] ModelImporter.cpp:754: ERROR: builtin_op_importers.cpp:2122 In function importIf:
[6] Assertion failed: condTensor && "Failed to convert the input cond to a scalar."
[03/06/2022-16:51:43] [E] Failed to parse onnx file
[03/06/2022-16:51:43] [I] Finish parsing network model
[03/06/2022-16:51:43] [E] Parsing model failed
[03/06/2022-16:51:43] [E] Failed to create engine from model.
[03/06/2022-16:51:43] [E] Engine set up failed
&&&& FAILED TensorRT.trtexec [TensorRT v8400] # trtexec --onnx=unet.onnx --fp16 --workspace=64 --optShapes=input:1x3x512x512 --buildOnly --saveEngine=unet.engine

Why?
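The `importIf` assertion appears to indicate that the ONNX parser hit an `If` node whose `cond` input could not be reduced to a scalar. As a first diagnostic step, the exported graph can be inspected to see which `If` nodes it contains and where each `cond` tensor comes from. This is a minimal sketch, assuming the same `unet.onnx` used in the command above and the `onnx` Python package installed:

```python
# Minimal sketch (assumptions: unet.onnx from the trtexec command above,
# `onnx` Python package installed): list the If nodes in the graph and
# show which node produces each one's `cond` input.
import onnx

model = onnx.load("unet.onnx")

if_nodes = [n for n in model.graph.node if n.op_type == "If"]
print(f"Found {len(if_nodes)} If node(s)")

for node in if_nodes:
    cond = node.input[0]  # the boolean condition tensor of the If node
    producers = [p.op_type for p in model.graph.node if cond in p.output]
    is_initializer = any(init.name == cond for init in model.graph.initializer)
    print(f"If node '{node.name}': cond='{cond}', "
          f"produced by {producers or 'graph input/initializer'}, "
          f"initializer={is_initializer}")
```

If the `cond` tensors turn out to be produced by chains of shape/constant operations, constant-folding the model before running trtexec (for example with onnx-simplifier or Polygraphy's `surgeon sanitize --fold-constants`) is a common way to eliminate such `If` nodes, though whether that applies here depends on how the model was exported.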

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU)
• DeepStream Version
• JetPack Version (valid for Jetson only)
• TensorRT Version
• NVIDIA GPU Driver Version (valid for GPU only)
• Issue Type (questions, new requirements, bugs)
• How to reproduce the issue? (For bugs: include which sample app is being used, the configuration file contents, the command line used, and other details needed to reproduce the issue)
• Requirement details (For new requirements: include the module name, i.e. which plugin or which sample application it concerns, and a description of the function)
