Tlt-converter workspace-size

We are currently working with TAO and tlt-converter and have a question regarding workspace-size. We are able to create the .engine file using tlt-converter, but it prints a warning:

[INFO] Some tactics do not have sufficient workspace memory to run. Increasing workspace size may increase performance, please check verbose output.

My question is: should I be worried about this even if the .engine file is generated correctly? I looked at the TensorRT documentation, and it says that increasing the workspace size may increase performance. Does "increase performance" refer to the model's ability to detect objects, or to the speed at which the .engine file is generated?

I understand that if there is an OOM error during engine building, increasing this value will help, but we are not facing an error, only a warning. We want to make sure we aren't limiting the model's ability to detect objects by not passing a higher workspace size.

You can ignore the warning if the .engine file is generated correctly.

No, it is not related to the model's ability to detect objects. The workspace size only limits how much scratch memory TensorRT's optimization tactics may use while building the engine; a larger workspace lets the builder evaluate more tactics, which can improve inference speed, but it does not change the model's weights or its detection accuracy.
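If you want to silence the warning (and possibly get a faster engine), you can pass a larger workspace to tlt-converter with its `-w` flag, which takes a size in bytes. A minimal sketch, assuming a model with a 3x224x224 input; the file names and the `$KEY` variable are placeholders, and the other flags should be taken from `tlt-converter -h` for your model type:

```shell
# tlt-converter's -w flag takes the workspace size in bytes,
# so compute 4 GiB explicitly to avoid unit mistakes.
WORKSPACE=$((4 * 1024 * 1024 * 1024))
echo "$WORKSPACE"   # 4294967296

# Hedged sketch (placeholder paths/key; adjust -d to your input dims):
# tlt-converter model.etlt \
#   -k "$KEY" \
#   -d 3,224,224 \
#   -w "$WORKSPACE" \
#   -e model.engine
```

If the GPU doesn't have enough free memory for the requested workspace, the builder simply falls back to tactics that fit, so overshooting a little is generally safe.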

There has been no update from you for a while, so we assume this is no longer an issue and are closing this topic. If you need further support, please open a new one.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.