How to convert .onnx to .trtmodel with trtexec, with an optional output size

Environment

TensorRT Version: 8.6.1
GPU Type: Nvidia T4

I am using the following C++ code to convert an ONNX file to a TensorRT engine, and it works fine; however, when moving to another PC, the engine needs to be rebuilt. I tried trtexec and it also works fine on a different PC:
(/usr/src/tensorrt/bin/trtexec --onnx=model.onnx --saveEngine=model.trtmodel)
However, I need the engine to take 1900x1900 input instead of the original 640x640. I tried "/usr/src/tensorrt/bin/trtexec --onnx=model.onnx --saveEngine=model.trtmodel --explicitBatch --optShapes=input0:1x3x1900x1900 --maxShapes=input0:6x3x1900x1900 --minShapes=input0:1x3x1900x1900"
but it gives the error "onnx2trt_utils.cpp:374: Your ONNX model has been generated with INT64 weights, while TensorRT does not natively support INT64. Attempting to cast down to INT32."
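Put together, what I am trying can be sketched as a small script. This is only a sketch under some assumptions: the input tensor name "input0" is taken from my command and may not match the model (it can be checked in Netron), and as far as I understand the ONNX model must have been exported with dynamic height/width axes for trtexec to accept shapes other than 640x640. On TensorRT 8.x the --explicitBatch flag is deprecated (networks are explicit-batch by default), so I left it out here:

```shell
# Hypothetical sketch: build an engine for 1900x1900 input with trtexec.
# Assumes the ONNX input tensor is named "input0" (verify in Netron) and
# that the model was exported with dynamic spatial axes.
TRTEXEC=/usr/src/tensorrt/bin/trtexec
SHAPES="--minShapes=input0:1x3x1900x1900 --optShapes=input0:1x3x1900x1900 --maxShapes=input0:6x3x1900x1900"
if [ -x "$TRTEXEC" ]; then
    # TensorRT is installed: actually build the engine.
    "$TRTEXEC" --onnx=model.onnx --saveEngine=model.trtmodel $SHAPES
else
    # TensorRT not installed on this machine: just print the command.
    echo "$TRTEXEC --onnx=model.onnx --saveEngine=model.trtmodel $SHAPES"
fi
```

If the export was done with fixed 640x640 dimensions, I expect these shape flags to be rejected, so the dynamic-axes export would be the first thing to verify.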

It seems I did something wrong. Please help me; sorry, my English is not good.

main.cpp.txt (6.9 KB)

My problem seems similar to this topic, but I still haven't found a solution.
