How to calculate the proper TensorRT workspace size?

Hi, I am using trtexec to convert an ONNX model to an engine, and the log says: “Some tactics do not have sufficient workspace memory to run. Increasing workspace size may increase performance”. So I want to use ./trtexec --workspace=N to set a proper workspace size, but I face a question: how do I know the proper workspace size for my model? Thanks!
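One practical way to pick a value, since the builder only reports the warning per build, is to sweep a few sizes and check which logs still contain it. This is a sketch, not an official recipe: model.onnx and the engine/log filenames are placeholders, and --workspace takes the size in MiB.

```shell
# Build the same model at several workspace sizes (MiB) and save each log.
# model.onnx is a placeholder for your own model.
for ws in 256 512 1024 2048 4096; do
  ./trtexec --onnx=model.onnx --workspace=$ws \
    --saveEngine=model_ws${ws}.engine 2>&1 | tee build_ws${ws}.log
done

# List the logs that do NOT contain the warning, i.e. the sizes that
# were already sufficient for every tactic the builder tried:
grep -L "Some tactics do not have sufficient workspace" build_ws*.log
```

Note that a larger workspace never hurts correctness; it only lets the builder consider more tactics, so erring on the large side is safe if the GPU memory is available.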

Hi @wade.wang,
Quoting “How do I choose the optimal workspace size?” from the post below:

Please refer to the link.
Thanks!

I read it, but it only says “Applications should therefore allow the TensorRT builder as much workspace as they can afford”, which gives the upper limit; it doesn’t say anything about the lower limit. Is there a formula that can calculate the lower limit?
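As far as I know, TensorRT does not expose a formula for the minimum workspace (it depends on which tactics the builder tries for your specific model and GPU), so the practical approach is empirical: start small and double the size until the warning disappears. A minimal sketch of that search, where `builds_without_warning` is a hypothetical hook that would actually run trtexec (or the builder API) at the given size and scan the log:

```python
def find_min_workspace_mb(builds_without_warning, start_mb=16, cap_mb=8192):
    """Return the smallest tried workspace size (MiB) that builds cleanly.

    `builds_without_warning(mb)` is a user-supplied predicate: it should
    build the engine with an `mb`-MiB workspace and return True if the
    "insufficient workspace" warning did not appear in the log.
    """
    ws = start_mb
    while ws <= cap_mb:
        if builds_without_warning(ws):
            return ws
        ws *= 2  # geometric search keeps the number of rebuilds small
    return None  # even the cap was not enough


# Example with a fake predicate standing in for a real build, assuming a
# hypothetical model that needs roughly 300 MiB of workspace:
print(find_min_workspace_mb(lambda mb: mb >= 300))  # prints 512
```

Because the search is geometric, the returned value is within a factor of two of the true minimum, which is usually good enough in practice.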

Hi @wade.wang,
The link below will help you with choosing optimization profiles:
https://docs.nvidia.com/deeplearning/tensorrt/api/python_api/infer/Core/OptimizationProfile.html

Thanks!