Hi,
I need to convert a Caffe model to TensorRT with float32.
When the GPU has 4 GB of free memory,
I get an out-of-memory error if MaxWorkspaceSize is set to 6 GB,
but it works fine with MaxWorkspace set to 2 GB.
Did I miss any config?
Hi,
MaxWorkspaceSize is the maximum amount of temporary GPU memory that the TensorRT engine can use at execution time, so it needs to fit within the memory actually free on the device.
Please refer to the troubleshooting section linked below for guidance on how to choose the optimal workspace size:
https://docs.nvidia.com/deeplearning/sdk/tensorrt-archived/tensorrt-700/tensorrt-developer-guide/index.html#troubleshooting
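As a reference, here is a minimal sketch (Python API, TensorRT 7 style) of building an FP32 engine from a Caffe model with an explicit workspace limit; the file names, output blob name, and the 1 GiB limit are placeholders, not values from your setup:

```python
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

def build_engine(deploy_file, model_file, output_name, workspace_bytes=1 << 30):
    """Build an FP32 TensorRT engine from a Caffe prototxt/caffemodel pair."""
    with trt.Builder(TRT_LOGGER) as builder, \
         builder.create_network() as network, \
         trt.CaffeParser() as parser, \
         builder.create_builder_config() as config:
        # Workspace is scratch memory TensorRT may use per layer; keep it
        # below the GPU memory that is actually free (e.g. well under 4 GB).
        config.max_workspace_size = workspace_bytes
        # FP32 is the default precision, so no extra precision flags are set.
        model_tensors = parser.parse(deploy=deploy_file, model=model_file,
                                     network=network, dtype=trt.float32)
        network.mark_output(model_tensors.find(output_name))
        return builder.build_engine(network, config)
```

With only 4 GB free, requesting a 6 GB workspace asks for more scratch memory than the device can provide, which is why the smaller 2 GB setting succeeds.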
Thanks