Hi,
I’m trying to parse a .caffemodel file using the ICaffeParser interface from TensorRT (v3.0.2). My model is 1.1 GB in size, which results in the following protobuf error:
[libprotobuf ERROR google/protobuf/io/coded_stream.cc:207] A protocol message was rejected because it was too big (more than 1073741824 bytes). To increase the limit (or to disable these warnings), see CodedInputStream::SetTotalBytesLimit() in google/protobuf/io/coded_stream.h.
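For context, the limit quoted in the error is exactly 2^30 bytes (1 GiB), which a 1.1 GB model necessarily exceeds. A quick sanity check of the arithmetic (the model size here is an approximation, assuming "1.1 GB" means decimal gigabytes):

```python
# Limit from the libprotobuf error message: 1073741824 bytes.
DEFAULT_LIMIT = 1073741824
assert DEFAULT_LIMIT == 2**30  # i.e. exactly 1 GiB

# Approximate model size, assuming 1.1 GB in decimal units.
model_size = int(1.1 * 10**9)
print(model_size > DEFAULT_LIMIT)  # the model exceeds protobuf's default limit
```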
I tried raising the limit with the ICaffeParser::setProtobufBufferSize() method, but any value greater than 2^30 bytes (1 GiB) is apparently set to zero internally, giving the following error:
[libprotobuf ERROR google/protobuf/io/coded_stream.cc:207] A protocol message was rejected because it was too big (more than 0 bytes). To increase the limit (or to disable these warnings), see CodedInputStream::SetTotalBytesLimit() in google/protobuf/io/coded_stream.h.
What can be done to fix this, or is this a hard limit on model size in TensorRT?
Thanks in advance,
Maarten