Please provide the following info (check/uncheck the boxes after creating this topic):
Software Version
DRIVE OS Linux 5.2.6
DRIVE OS Linux 5.2.6 and DriveWorks 4.0
DRIVE OS Linux 5.2.0
DRIVE OS Linux 5.2.0 and DriveWorks 3.5
NVIDIA DRIVE™ Software 10.0 (Linux)
NVIDIA DRIVE™ Software 9.0 (Linux)
other DRIVE OS version
other
Host Machine Version
native Ubuntu 18.04
other
I have a Keras TensorFlow model, which I converted to an ONNX model using the tf2onnx package.
I am now trying to optimize the converted ONNX model with the TensorRT optimization tool.
I am getting the following error:
Network has dynamic or shape inputs, but no optimization profile has been defined
Attaching the required files (ONNX model, error screenshot, error log file, TensorFlow Keras model). Mar3_3.onnx (1.9 MB)
I also tested the model using the trtexec tool and got the following error:
Dynamic dimensions required for input conv2d_input
Attaching the error screenshot and log file.
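For reference, trtexec expects an optimization profile for every dynamic input, passed as one shape specification per flag in the form name:dims. Below is a minimal sketch of building an engine for this model, assuming (from the error messages in this thread) that the input is named conv2d_input and that only the batch dimension is dynamic; the 1–32 batch range is an illustrative choice, not a recommendation:

```shell
# Sketch only: build a TensorRT engine for an ONNX model whose input has a
# dynamic batch dimension. The input name and 145x145x3 dims are assumptions
# taken from the error messages; --saveEngine writes the serialized engine.
trtexec --onnx=Mar3_3.onnx \
        --minShapes=conv2d_input:1x145x145x3 \
        --optShapes=conv2d_input:16x145x145x3 \
        --maxShapes=conv2d_input:32x145x145x3 \
        --saveEngine=Mar3_3.trt
```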
Dear @alksainath.medam,
Could you check with
--minShapes=conv2d_input:1x145x145x3 --optShapes=conv2d_input:16x145x145x3 --maxShapes=conv2d_input:256x256x256x3 --shapes=conv2d_input:5x145x145x3
instead of
--minShapes=conv2d_input:0:1x145x145x3 --optShapes=conv2d_input:0:16x145x145x3 --maxShapes=conv2d_input:0:256x256x256x3 --shapes=conv2d_input:0:5x145x145x3?
I tried as you suggested and am getting “Cuda failure: out of memory”.
I shared my model previously and am sharing it again.
Can you try it on your side and share the optimized model along with the process/steps?
Sharing the log file and screenshot along with the model.
I tried as you suggested and got “Cuda failure: out of memory”.
After trying on a high-end system, I got the following error:
“Required optimization profile is invalid”
Attaching the screenshots.
Dear @alksainath.medam,
The model looks fairly big. May I know which GPU you tested on? Could you attach the complete log containing the “Required optimization profile is invalid” message? Also, please use text instead of images so that others in the community can search the messages.
[03/24/2023-13:33:02] [W] [TRT] onnx2trt_utils.cpp:194: Your ONNX model has been generated with INT64 weights, while TensorRT does not natively support INT64. Attempting to cast down to INT32.
[03/24/2023-13:33:02] [E] [TRT] Parameter check failed at: optimizationProfile.cpp::setDimensions::129, condition: validate(newEntry, true)
[03/24/2023-13:33:02] [E] Required optimization profile is invalid
terminate called after throwing an instance of ‘std::runtime_error’
what(): Failed to create object
Aborted (core dumped)
GPU.txt (966 Bytes) log.txt (3.8 KB) Verbose.txt (26.1 KB)
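A guess at the failing check (optimizationProfile.cpp::setDimensions): TensorRT requires min <= opt <= max element-wise, and any dimension the network fixes (i.e. not -1) must keep the same value in all three profile shapes. If conv2d_input is -1x145x145x3 (an assumption based on the shapes used earlier in this thread), then --maxShapes=conv2d_input:256x256x256x3 changes the fixed 145x145 spatial dimensions, and the profile would be rejected. A small shell sketch of that fixed-dimension check:

```shell
# Sketch: a profile shape may only differ from the network input shape in
# dimensions the network marks as dynamic (-1). Values below are assumptions
# taken from the trtexec flags discussed in this thread.
net=(-1 145 145 3)   # assumed network input shape; -1 = dynamic batch
max=(256 256 256 3)  # from the suggested --maxShapes

valid=yes
for i in "${!net[@]}"; do
    if (( net[i] != -1 && net[i] != max[i] )); then
        valid=no     # a fixed dimension was changed -> profile invalid
    fi
done
echo "profile valid: $valid"   # prints: profile valid: no
```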