Hi,
I’m trying to load a custom model with imagenet-console from jetson-inference (TensorRT 2.1.2 on a Jetson TX1).
I got the following error:
[GIE] Internal error: could not find any implementation for node conv2_1/dw + relu2_1/dw, try increasing the workspace size with IBuilder::setMaxWorkSpace()
[GIE] cudnnBuilder2.cpp (586) - OutOfMemory Error in buildSingleLayer
I tried increasing the workspace by calling builder->setMaxWorkspaceSize(16 << 24) (256 MB), but the error persists. A sketch of where I made the change is below. Help would be greatly appreciated.
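
Roughly, this is how I’m building the engine, following the same flow as jetson-inference’s tensorNet.cpp (the file paths and the output blob name "prob" are placeholders for my custom network):

```cpp
#include "NvInfer.h"
#include "NvCaffeParser.h"
#include <iostream>

using namespace nvinfer1;
using namespace nvcaffeparser1;

// Minimal logger required by the TensorRT builder.
class Logger : public ILogger
{
    void log(Severity severity, const char* msg) override
    {
        if (severity != Severity::kINFO)
            std::cout << msg << std::endl;
    }
} gLogger;

// Mirrors the caffe-to-engine flow in tensorNet.cpp.
// deployFile/modelFile point at my custom prototxt and caffemodel.
bool buildEngine(const char* deployFile, const char* modelFile)
{
    IBuilder* builder = createInferBuilder(gLogger);
    INetworkDefinition* network = builder->createNetwork();

    // Parse the caffe model into the network definition.
    ICaffeParser* parser = createCaffeParser();
    const IBlobNameToTensor* blobNameToTensor =
        parser->parse(deployFile, modelFile, *network, DataType::kFLOAT);

    if (!blobNameToTensor)
        return false;

    // "prob" stands in for the actual output blob of my model.
    network->markOutput(*blobNameToTensor->find("prob"));

    builder->setMaxBatchSize(1);
    builder->setMaxWorkspaceSize(16 << 24);   // 256 MB -- the value I tried

    ICudaEngine* engine = builder->buildCudaEngine(*network);

    // cleanup of parser/network/builder omitted for brevity
    return engine != nullptr;
}
```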