I am creating a TensorRT engine from an ONNX model with a dynamic batch input.
I have two ONNX models. One has a fixed input of 1x24x94x3.
The other has a dynamic batch, so its input is Unknown x 24x94x3.
I can verify both of these in Netron.
Once the network is parsed, the input dimensions can be checked with network->getInput(0)->getDimensions().
For the fixed-input model, this prints 1x24x94x3.
For the dynamic model, the input shape is -1x24x94x3.
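For reference, this is roughly how I print the input dimensions after parsing. Only the printing loop is mine; the builder/parser setup follows the sample, and the variable names are my own:

// Sketch: inspect the parsed input shape inside the build function.
// "network" is the INetworkDefinition created with the explicit-batch flag.
nvinfer1::ITensor* input = network->getInput(0);
nvinfer1::Dims dims = input->getDimensions();
std::cout << "Input dims:";
for (int i = 0; i < dims.nbDims; ++i)
{
    std::cout << " " << dims.d[i]; // prints -1 for the dynamic batch axis
}
std::cout << std::endl;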
For that -1 input, TensorRT does not accept the shape. The error is:
…/builder/cudnnBuilderGraph.cpp (794) - Assertion Error in checkDimsSanity: 0 (dims.d[i] >= 0)
I am using this dynamic shape sample, so you can reproduce it.
The error happens while building the preprocessor engine:
bool SampleDynamicReshape::buildPreprocessorEngine(const SampleUniquePtr<nvinfer1::IBuilder>& builder)
{
..........
mPreprocessorEngine = makeUnique(builder->buildEngineWithConfig(*preprocessorNetwork, *preprocessorConfig));
........
}
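My understanding from the dynamic shapes documentation is that a -1 dimension has to be covered by an optimization profile before buildEngineWithConfig is called. The sketch below is only what I would expect to add, not code from the sample; the tensor name "input" and the 1/4/8 batch values are placeholders:

// Sketch (assumption): register an optimization profile covering the dynamic batch axis.
// "input" is a placeholder tensor name; the real ONNX input name may differ.
auto profile = builder->createOptimizationProfile();
profile->setDimensions("input", nvinfer1::OptProfileSelector::kMIN, nvinfer1::Dims4{1, 24, 94, 3});
profile->setDimensions("input", nvinfer1::OptProfileSelector::kOPT, nvinfer1::Dims4{4, 24, 94, 3});
profile->setDimensions("input", nvinfer1::OptProfileSelector::kMAX, nvinfer1::Dims4{8, 24, 94, 3});
preprocessorConfig->addOptimizationProfile(profile);

I am not sure whether a missing profile is even related to the assertion above.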
Moreover, the following three lines also do not accept the -1 input shape.
bool SampleDynamicReshape::prepare()
{
.....
mPredictionInput.resize(mPredictionInputDims);
mOutput.hostBuffer.resize(mPredictionOutputDims);
mOutput.deviceBuffer.resize(mPredictionOutputDims);
return true;
.......
}
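If I understand the failure, mPredictionInputDims still carries -1 for the batch axis at this point, so a valid buffer volume cannot be computed. The sketch below is only what I imagine would be needed; kBatchSize is a placeholder value I introduced, not part of the sample:

// Sketch (assumption): replace the -1 batch axis with a concrete value
// before sizing the buffers. kBatchSize is a placeholder I introduced.
const int kBatchSize = 1;
nvinfer1::Dims inputDims = mPredictionInputDims;
for (int i = 0; i < inputDims.nbDims; ++i)
{
    if (inputDims.d[i] == -1)
    {
        inputDims.d[i] = kBatchSize;
    }
}
mPredictionInput.resize(inputDims);

I am not sure this is the intended way to handle a dynamic batch here.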
It seems strange: the program is designed for explicit batch, yet the dynamic shape -1 is not accepted.
Is this a limitation of TensorRT, or did I do something wrong?