Is the INFO `[I] [TRT] Some tactics do not have sufficient workspace memory to run. Increasing workspace size may increase performance, please check verbose output.` critical?

When building an engine with TensorRT, the following INFO message is printed.

[I] [TRT] Some tactics do not have sufficient workspace memory to run. Increasing workspace size may increase performance, please check verbose output.
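
Since the message says to check the verbose output, one way to see which tactics were actually skipped is to rebuild with a verbose logger. A minimal sketch using the TensorRT Python API (only the logger setup is shown; the rest of the build code is omitted):

```python
import tensorrt as trt

# A VERBOSE logger makes the builder report, per layer, which tactics were
# tried and which were skipped because the workspace was too small.
TRT_LOGGER = trt.Logger(trt.Logger.VERBOSE)

# Pass the same logger to every TensorRT object (builder, parser, runtime)
# so the verbose build messages actually show up.
builder = trt.Builder(TRT_LOGGER)
```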

We use a Tesla T4 GPU, so the total GPU memory is 15109 MiB = 15,842,934,784 bytes.
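
Note that the full 15109 MiB is never actually free at runtime: the driver, the CUDA context and the network itself already occupy part of it, which is presumably why a workspace close to the total memory fails. A small sketch to check how much memory is really free (this assumes pycuda is installed, which is not stated above):

```python
import pycuda.autoinit  # creates a CUDA context on GPU 0
import pycuda.driver as cuda

# mem_get_info() returns (free, total) in bytes for the current context.
# "free" already accounts for the driver, the CUDA context and anything
# else resident on the GPU, so it is a safer upper bound for
# max_workspace_size than the 15109 MiB reported by nvidia-smi.
free_bytes, total_bytes = cuda.mem_get_info()
print("free: %.0f MiB / total: %.0f MiB"
      % (free_bytes / 2**20, total_bytes / 2**20))
```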

We get an out-of-memory error when max_workspace_size = 15870700000 is set.

Building succeeds when max_workspace_size = 15000000000 is set, but the INFO message above is still printed.
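
For reference, this is roughly where that value is set when building an engine. A minimal sketch against the TensorRT 6 Python API, assuming an ONNX model; the file name model.onnx is a placeholder, and the 15000000000-byte workspace is the value mentioned above:

```python
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.INFO)

def build_engine(onnx_path, workspace_bytes):
    builder = trt.Builder(TRT_LOGGER)
    # Explicit-batch network; the ONNX parser expects this in newer releases.
    network = builder.create_network(
        1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
    parser = trt.OnnxParser(network, TRT_LOGGER)
    with open(onnx_path, "rb") as f:
        if not parser.parse(f.read()):
            raise RuntimeError("failed to parse %s" % onnx_path)

    # The workspace is scratch memory the builder may hand to each tactic;
    # tactics that need more than this are skipped, which is what triggers
    # the INFO message in question.
    builder.max_workspace_size = workspace_bytes
    return builder.build_cuda_engine(network)

engine = build_engine("model.onnx", 15000000000)  # hypothetical model path
```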

The same INFO message also appears in the TensorRT README:

https://github.com/NVIDIA/TensorRT#install-the-tensorrt-oss-components-optional

[08/23/2019-22:08:59] [I] [TRT] Some tactics do not have sufficient workspace memory to run. Increasing workspace size may increase performance, please check verbose output.

The following NVIDIA blog post also says that you should allocate enough workspace memory:

https://devblogs.nvidia.com/speed-up-inference-tensorrt/

See the “Set the Maximum Workspace Sizes” section:

https://devblogs.nvidia.com/speed-up-inference-tensorrt/#h.97pc4btg9tu3

Question

Is this INFO message a critical issue? Should it be resolved, or can it be left alone?

Our operating environment

  • GPU : Tesla T4
  • Host OS : Ubuntu 16.04.6 LTS
  • TensorRT : 6.0.1
  • NVIDIA Driver : 430.26
  • CUDA : 10.1

@inazuka.daiki Did you ever find a solution to your problem? I seem to be encountering this as well.

@solarflarefx Thanks for your comment. Unfortunately, I haven’t solved this problem yet.