OutOfMemory Error in computeCosts in TensorRT

Hi, I am running a TensorRT-optimized Caffe model and I am getting the error below:

"Begin parsing model…
End parsing model…
Begin building engine…
ERROR: Internal error: could not find any implementation for node upsample, try increasing the workspace size with IBuilder::setMaxWorkspaceSize()
ERROR: …/builder/tacticOptimizer.cpp (1023) - OutOfMemory Error in computeCosts
sample_fasterRCNN: samplePVA.cpp:134: void caffeToTRTModel(const string&, const string&, const std::vector<std::__cxx11::basic_string >&, unsigned int, nvcaffeparser1::IPluginFactory*, nvinfer1::IHostMemory**): Assertion `engine’ failed.
Aborted (core dumped)
"

However, the GPU memory usage does not even cross 2 GB, and I have an 8 GB GPU.
I am using TensorRT 4.0.1.6.

Kindly help me.

Thanks

Hi,

Two recommendations:

  1. Decrease the batch size

  2. Increase the workspace size with IBuilder::setMaxWorkspaceSize() (see the sketch after this list).
    Please check this page for more information:
    https://docs.nvidia.com/deeplearning/sdk/tensorrt-api/c_api/classnvinfer1_1_1_i_builder.html
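As a rough illustration of both suggestions, here is a minimal sketch of the TensorRT 4.x Caffe build flow (similar to caffeToTRTModel() in your samplePVA.cpp). The file names, the output blob name, and the 1 GiB workspace value are placeholder assumptions, and the IPluginFactory you register for the custom upsample layer is omitted for brevity:

    #include <cassert>
    #include <iostream>

    #include "NvInfer.h"
    #include "NvCaffeParser.h"

    using namespace nvinfer1;
    using namespace nvcaffeparser1;

    // Minimal logger required by createInferBuilder().
    class Logger : public ILogger
    {
        void log(Severity severity, const char* msg) override
        {
            if (severity != Severity::kINFO)
                std::cout << msg << std::endl;
        }
    } gLogger;

    int main()
    {
        IBuilder* builder = createInferBuilder(gLogger);
        INetworkDefinition* network = builder->createNetwork();
        ICaffeParser* parser = createCaffeParser();

        // Placeholder paths and output blob name; use your own files.
        const IBlobNameToTensor* blobs = parser->parse("deploy.prototxt",
                                                       "model.caffemodel",
                                                       *network,
                                                       DataType::kFLOAT);
        network->markOutput(*blobs->find("output"));

        // Recommendation 1: a smaller max batch size reduces the memory
        // each layer implementation needs at build time.
        builder->setMaxBatchSize(1);

        // Recommendation 2: give the builder a larger scratch workspace
        // (here 1 GiB) so it can find an implementation for every node.
        builder->setMaxWorkspaceSize(1ULL << 30);

        ICudaEngine* engine = builder->buildCudaEngine(*network);
        assert(engine && "engine build failed");

        // ... serialize the engine or run inference here ...

        engine->destroy();
        parser->destroy();
        network->destroy();
        builder->destroy();
        return 0;
    }

Note that the workspace limit is separate from overall GPU memory usage: the value is only an upper bound on the scratch memory the builder may use per layer, which is why the build can fail even though only about 2 GB of your 8 GB is in use.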

Thanks.