I had implemented a workaround using a padding layer, but it was not entirely correct. The problem has since been fixed in a better way in the original repository, so I follow that approach here.
Here is roughly what you need to change.
trt_utils.h defines a class that computes padding sizes for MaxPool layers:
class YoloTinyMaxpoolPaddingFormula : public nvinfer1::IOutputDimensionsFormula
You can then give your network a pointer to a YoloTinyMaxpoolPaddingFormula instance, which will be used to compute the correct padding for every MaxPool layer added to the network.
It uses "valid" padding for all layers except those that are explicitly registered by name as requiring "same" padding.
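To illustrate that valid-vs-same logic, here is a self-contained sketch. It uses plain C++ with no TensorRT types: DimsHW is a minimal stand-in for nvinfer1::DimsHW, the class name is hypothetical, and compute only mirrors the shape arithmetic that an IOutputDimensionsFormula subclass would perform (addSamePaddingLayer corresponds to the method used in the snippet below).

```cpp
#include <set>
#include <string>

// Minimal stand-in for nvinfer1::DimsHW, for illustration only.
struct DimsHW { int h; int w; };

// Sketch of the padding formula: "valid" padding everywhere, except for
// layers registered by name via addSamePaddingLayer, which get "same" padding.
class TinyMaxpoolPaddingFormulaSketch {
public:
    void addSamePaddingLayer(const std::string& name) { m_samePaddingLayers.insert(name); }

    DimsHW compute(DimsHW input, DimsHW kernel, DimsHW stride, const char* layerName) const {
        if (m_samePaddingLayers.count(layerName)) {
            // "same" padding: output = ceil(input / stride)
            return { (input.h + stride.h - 1) / stride.h,
                     (input.w + stride.w - 1) / stride.w };
        }
        // "valid" padding: output = floor((input - kernel) / stride) + 1
        return { (input.h - kernel.h) / stride.h + 1,
                 (input.w - kernel.w) / stride.w + 1 };
    }

private:
    std::set<std::string> m_samePaddingLayers;
};
```

In the real code the formula instance is handed to the network (via INetworkDefinition::setPoolingOutputDimensionsFormula in the pre-TensorRT-8 API) before any maxpool layers are added, so TensorRT consults it for every pooling layer's output dimensions.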
else if (m_configBlocks.at(i).at("type") == "maxpool")
{
    // Add same padding layers
    if (m_configBlocks.at(i).at("size") == "2" && m_configBlocks.at(i).at("stride") == "1")
    {
        m_TinyMaxpoolPaddingFormula->addSamePaddingLayer("maxpool_" + std::to_string(i));
    }
    std::string inputVol = dimsToString(previous->getDimensions());
    nvinfer1::ILayer* out = netAddMaxpool(i, m_configBlocks.at(i), previous, network);
    previous = out->getOutput(0);
    assert(previous != nullptr);
    std::string outputVol = dimsToString(previous->getDimensions());
    printLayerInfo(layerIndex, "maxpool", inputVol, outputVol, std::to_string(weightPtr));
}
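Why only the size-2/stride-1 case is registered: with stride 1, "valid" pooling shrinks a W×W feature map to (W-1)×(W-1), which breaks the shapes YOLOv3-tiny expects, whereas the stride-2 maxpools already produce the right output size without extra padding when W is even. The one-dimensional arithmetic can be sketched as (plain C++, illustration only; the function names are mine, not from the repository):

```cpp
// Output size of a maxpool along one dimension.
// "valid": no padding, output = floor((input - kernel) / stride) + 1.
int validPool(int in, int kernel, int stride) { return (in - kernel) / stride + 1; }

// "same": pad so that output = ceil(input / stride).
int samePool(int in, int stride) { return (in + stride - 1) / stride; }
```

For a 13×13 map, a size-2/stride-1 maxpool yields 12×12 under "valid" padding but keeps 13×13 under "same" padding, which is why exactly those layers are registered; a size-2/stride-2 maxpool on a 26×26 map yields 13×13 either way.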
Note that the formula only works for square networks. If your network input is rectangular, you will need to make some additional minor changes. It is best to run this code in Debug mode, since there are assertions that will warn you when things go wrong.