I’m looking to deploy a detectron2/Mask R-CNN model using TensorRT on a Jetson AGX Xavier. So far I haven’t had any success importing the model into TensorRT.
By default, detectron2 exports to Caffe2; however, TensorRT doesn’t appear to accept Caffe2 as an input format. I’ve been able to export Mask R-CNN to ONNX, but I get the following errors with the latest onnx2trt:
Parsing model
Unsupported ONNX data type: UINT8 (2)
ERROR: data:192 In function importInput:
[8] Assertion failed: convertDtype(onnxDtype.elem_type(), &trtDtype)
I’m still exploring a few other options for importing the model into TensorRT, but things don’t look promising. I’m hoping someone here has either done this before or is knowledgeable enough to suggest a better path forward. Currently I’m trying to work out why the model contains UINT8 tensors and how to convert them.
I am aware of https://github.com/NVIDIA/TensorRT/tree/master/samples/opensource/sampleUffMaskRCNN, but I’m looking for a solution based on detectron2. If switching to TensorFlow is the simplest option, however, then I’ll go that route.
Thank you in advance for any help!