I have an inception module whose width (number of branches) is a hyperparameter. I've found that TensorRT can handle my model as long as the width of the inception module is not too large: with 8 branches in the module it is fine, but I get errors when the number of branches reaches 12.
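For reference, here is a minimal sketch of how the branch count parameterizes the module's width (the function name and the per-branch channel counts are hypothetical, not taken from my actual model): the branches run in parallel and are concatenated along the channel axis, so the module's output width is the sum of the branch widths.

```python
def inception_output_channels(branch_channels):
    # An inception module concatenates its parallel branches along
    # the channel axis, so the module's total output width is the
    # sum of the individual branch widths.
    return sum(branch_channels)

# Hypothetical example: 8 branches of 16 channels each -> 128 channels out
print(inception_output_channels([16] * 8))   # 128
# 12 branches of 16 channels each -> 192 channels out
print(inception_output_channels([16] * 12))  # 192
```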
The error message I get with 12 branches is:
cudnnBuilder2.cpp:1006: nvinfer1::cudnn::Engine* nvinfer1::builder::buildEngine(nvinfer1::CudaEngineBuildConfig&, const nvinfer1::cudnn::HardwareContext&, const nvinfer1::Network&): Assertion `it != tensorScales.end()' failed.
Any feedback would be great, as this is the only difference between the models, and it will be a few days before I can finish training a new model to test a different number of branches.