Jetson Thor: converting ONNX to engine fails

When converting an ONNX model to an engine file, the issue is:
[12/01/2025-02:32:02] [I] Finished parsing network model. Parse time: 0.0682791
[12/01/2025-02:32:03] [I] [TRT] Local timing cache in use. Profiling results in this builder pass will not be stored.
[12/01/2025-02:32:03] [I] [TRT] Compiler backend is used during engine build.
Internal Error: MyelinCheckException: peephole_epilogue_fusion.cpp:587: CHECK(curr->inputs()[1]->tensor() == def_opnd->tensor()) failed.
[12/01/2025-02:32:07] [E] Error[9]: Error Code: 9: Skipping tactic 0x0000000000000000 due to exception [myelin_graph.h:attachExceptionMsgToGraph:1146] MyelinCheckException: peephole_epilogue_fusion.cpp:587: CHECK(curr->inputs()[1]->tensor() == def_opnd->tensor()) failed. In compileGraph at optimizer/myelin/codeGenerator.cpp:1425
[12/01/2025-02:32:07] [E] Error[10]: IBuilder::buildSerializedNetworkToStream: Error Code 10: Internal Error (Could not find any implementation for node {ForeignNode[input2_cast…output_castOut]}. In computeCosts at optimizer/common/tactic/optimizer.cpp:4115)
[12/01/2025-02:32:07] [I] Created engine with size: 0 MiB
[12/01/2025-02:32:07] [I] Engine built in 4.7451 sec.
[12/01/2025-02:32:07] [E] Assertion failure: false && "Attempting to access an empty engine!"
How can I solve it?
I used `/usr/src/tensorrt/bin/trtexec --onnx=igev_conv_final_sim.onnx --saveEngine=igev_conv_final_sim_fp16.engine --fp16`
I am using this ONNX model.
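While waiting on a fix, a few things worth trying to narrow down the failing Myelin fusion. This is a hedged sketch, not a confirmed workaround: the flags are standard `trtexec` options, the model filename is the one from this thread, and the last step assumes Polygraphy is installed (`pip install polygraphy`).

```shell
# 1) Rebuild without --fp16 to check whether the failing fusion
#    only triggers in the FP16 build path.
/usr/src/tensorrt/bin/trtexec \
    --onnx=igev_conv_final_sim.onnx \
    --saveEngine=igev_conv_final_sim_fp32.engine

# 2) Lower the builder optimization level and enable verbose logs;
#    a lower level can avoid the aggressive peephole/epilogue fusion
#    that raises the MyelinCheckException.
/usr/src/tensorrt/bin/trtexec \
    --onnx=igev_conv_final_sim.onnx \
    --saveEngine=igev_conv_final_sim_fp16.engine \
    --fp16 --builderOptimizationLevel=0 --verbose

# 3) Constant-fold and sanitize the ONNX graph first, then retry
#    the FP16 build on the folded model.
polygraphy surgeon sanitize igev_conv_final_sim.onnx \
    --fold-constants -o igev_conv_final_sim_folded.onnx
```

If the FP32 build succeeds or a lower optimization level avoids the crash, that information would also help narrow down the regression between TensorRT versions.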

Hi,

Could you share the ONNX model with us so we can check it further?
Thanks.

Converting this model to an engine on TensorRT v8.6.0.2 works fine.

The issue occurs on TensorRT v10.13.0.3.

Could you please confirm if you are able to access this ONNX model? When might we expect a conclusion? This matter is urgent.

Hi, just wanted to check if there’s any progress on this issue? I’m currently blocked by this and would appreciate any guidance. Thanks!

Hi,

We would like to check this issue with the model you provided,
but we don't have permission to download the ONNX file.

Could you help with this?

Thanks.

Here is the ONNX file link: https://drive.google.com/file/d/1fNJx8Z2LnD4kSxbAZmIztrU-GCw7Ki41/view?usp=sharing

Hi,

Thanks for sharing the model.

We were able to download the file and will try to reproduce the issue locally.
We will keep you updated as we make progress on this issue.

Thanks.