We are currently working with TAO and tlt-converter and have a question regarding workspace size. We are able to create the .engine file using tlt-converter; however, it prints a warning:

[INFO] Some tactics do not have sufficient workspace memory to run. Increasing workspace size may increase performance, please check verbose output.
My question is: should I be worried about this even though the .engine file is generated correctly? I looked at the TensorRT documentation, and it says that increasing the workspace size may increase performance. Does "increase performance" refer to the model's ability to detect objects, or to the speed at which the .engine file is generated?
I understand that increasing the workspace size helps when there is an OOM error during engine building, but we are not hitting an error, only this warning. We want to make sure we aren't limiting the model's ability to detect objects by not passing a larger workspace size.
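For reference, this is roughly how we invoke the converter. The key, input dimensions, output node names, and file paths below are placeholders for our actual values, and the exact flags may differ between tlt-converter versions; the relevant part is the `-w` option, which sets the maximum workspace size in bytes:

```shell
# Placeholder invocation; -k, -d, -o, and file names are stand-ins
# for our real key, input dims, output nodes, and model files.
tlt-converter \
  -k $ENCODE_KEY \
  -d 3,544,960 \
  -o output_cov/Sigmoid,output_bbox/BiasAdd \
  -w 2147483648 \
  -e model.engine \
  model.etlt
```

Here `-w 2147483648` would request a 2 GB workspace instead of the default; is bumping this value the right response to the warning?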