I’m trying to build TensorRT engines via the TF -> ONNX -> TensorRT workflow.
My model has custom ops, so I hacked the ONNX parser to populate the INetworkDefinition correctly.
However, the call to IBuilder::buildEngineWithConfig returns
nullptr, and I could not find any useful debug information for it.
Is there any way to debug this and find out what went wrong during the build?
NOTES: I provided a custom plugin to TensorRT, registering it both via
REGISTER_TENSORRT_PLUGIN and via a custom registration function.
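For reference, a stripped-down sketch of how I drive the build, with a logger that forwards everything including kVERBOSE messages (I use the stock ONNX parser here for brevity instead of my hacked one; the model path and workspace size are placeholders):

```cpp
#include <cstdint>
#include <iostream>

#include "NvInfer.h"
#include "NvOnnxParser.h"

// Logger that forwards every message, including kVERBOSE, to stderr,
// so the builder prints as much detail as it can.
class VerboseLogger : public nvinfer1::ILogger
{
public:
    void log(Severity severity, const char* msg) override
    {
        std::cerr << "[TRT " << static_cast<int>(severity) << "] " << msg << std::endl;
    }
};

int main()
{
    VerboseLogger logger;
    auto* builder = nvinfer1::createInferBuilder(logger);

    // Explicit-batch network, as required for ONNX models.
    const auto flags =
        1U << static_cast<uint32_t>(nvinfer1::NetworkDefinitionCreationFlag::kEXPLICIT_BATCH);
    auto* network = builder->createNetworkV2(flags);
    auto* parser  = nvonnxparser::createParser(*network, logger);

    // "model.onnx" is a placeholder path.
    if (!parser->parseFromFile("model.onnx",
            static_cast<int>(nvinfer1::ILogger::Severity::kVERBOSE)))
    {
        std::cerr << "ONNX parse failed" << std::endl;
        return 1;
    }

    auto* config = builder->createBuilderConfig();
    config->setMaxWorkspaceSize(1ULL << 30); // 1 GiB, placeholder

    // This is the call that returns nullptr with no obvious error message.
    nvinfer1::ICudaEngine* engine = builder->buildEngineWithConfig(*network, *config);
    if (engine == nullptr)
    {
        std::cerr << "buildEngineWithConfig returned nullptr" << std::endl;
        return 1;
    }
    return 0;
}
```

Even with this logger attached, the only output I get around the failing call is ordinary builder progress, nothing that explains the nullptr.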
TensorRT Version: 7.0.0
GPU Type: 2080Ti
Nvidia Driver Version: 440
CUDA Version: 10.2
Operating System + Version: Ubuntu 16.04
Python Version (if applicable):
TensorFlow Version (if applicable):
PyTorch Version (if applicable):
Baremetal or Container (if container which image + tag):