I am converting an ONNX model to a TensorRT engine, but I get the following error.
[08/10/2023-06:31:48] [W] [TRT] onnx2trt_utils.cpp:366: Your ONNX model has been generated with INT64 weights, while TensorRT does not natively support INT64. Attempting to cast down to INT32.
[08/10/2023-06:31:51] [W] [TRT] onnx2trt_utils.cpp:392: One or more weights outside the range of INT32 was clamped
[08/10/2023-06:31:51] [E] Error[4]: [graphShapeAnalyzer.cpp::analyzeShapes::1285] Error Code 4: Miscellaneous (IElementWiseLayer Mul_1384: broadcast dimensions must be conformable)
[08/10/2023-06:31:51] [E] [TRT] ModelImporter.cpp:773: While parsing node number 1384 [Mul -> "2291"]:
[08/10/2023-06:31:51] [E] [TRT] ModelImporter.cpp:774: --- Begin node ---
[08/10/2023-06:31:51] [E] [TRT] ModelImporter.cpp:775: input: "2288"
input: "3095"
output: "2291"
name: "Mul_1384"
op_type: "Mul"
[08/10/2023-06:31:51] [E] [TRT] ModelImporter.cpp:776: --- End node ---
[08/10/2023-06:31:51] [E] [TRT] ModelImporter.cpp:779: ERROR: ModelImporter.cpp:179 In function parseGraph:
[6] Invalid Node - Mul_1384
[graphShapeAnalyzer.cpp::analyzeShapes::1285] Error Code 4: Miscellaneous (IElementWiseLayer Mul_1384: broadcast dimensions must be conformable)
[08/10/2023-06:31:51] [E] Failed to parse onnx file
[08/10/2023-06:31:51] [I] Finish parsing network model
[08/10/2023-06:31:51] [E] Parsing model failed
[08/10/2023-06:31:51] [E] Failed to create engine from model.
[08/10/2023-06:31:51] [E] Engine set up fail
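For reference, an equivalent minimal reproduction of the parsing step that fails above, written against the TensorRT Python API, looks roughly like the sketch below. This is only a sketch: the model path is a placeholder, and the actual conversion was run from the command line (which produced the log above).

```python
# Sketch only: a minimal Python reproduction of the ONNX parsing step
# that fails above. "model.onnx" is a placeholder path.
import tensorrt as trt

logger = trt.Logger(trt.Logger.WARNING)
builder = trt.Builder(logger)

# Explicit-batch network definition, as required by the ONNX parser.
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH)
)
parser = trt.OnnxParser(network, logger)

with open("model.onnx", "rb") as f:
    ok = parser.parse(f.read())

if not ok:
    # Print the same parser errors that show up in the log above.
    for i in range(parser.num_errors):
        print(parser.get_error(i))
```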
How can I resolve this problem? I tried exporting with both implicit and explicit batch, and the error was the same.
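In case it helps with diagnosing the broadcast error, here is a small sketch for checking the inferred shapes of the two tensors feeding Mul_1384 ("2288" and "3095", taken from the log above). The model file name is a placeholder. If those two shapes turn out not to be broadcast-compatible, that would point at the exported graph itself rather than at TensorRT.

```python
# Sketch: inspect the shapes of the two inputs of the failing Mul node.
# "model.onnx" is a placeholder; the tensor names come from the error log.
import onnx
from onnx import shape_inference

model = onnx.load("model.onnx")
inferred = shape_inference.infer_shapes(model)

# Shapes of activations resolved by ONNX shape inference.
value_shapes = {}
for vi in (list(inferred.graph.value_info)
           + list(inferred.graph.input)
           + list(inferred.graph.output)):
    dims = [d.dim_param if d.dim_param else d.dim_value
            for d in vi.type.tensor_type.shape.dim]
    value_shapes[vi.name] = dims

# Shapes of constant weights (initializers) are stored on the tensor itself.
init_shapes = {init.name: list(init.dims) for init in inferred.graph.initializer}

for name in ("2288", "3095"):
    shape = value_shapes.get(name, init_shapes.get(name, "shape not available"))
    print(name, shape)
```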