Exporting TensorFlow models to Jetson Nano

Did you find a working solution?

Not yet, sorry.

I have the same problem with SSD-ResNet34. After some research, I came across this thread about the NMS plugin problem in ONNX, where they propose, instead of adding a custom plugin at the TensorRT level, changing the NMS node at the ONNX graph level with the GraphSurgeon tool so that it maps to TensorRT's prebuilt plugin.

I still haven't tried the provided scripts, but they may help someone with a similar issue. A rough sketch of the idea is below.
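For anyone who wants to experiment before trying the linked scripts, here is a minimal, untested sketch of that rewrite with ONNX GraphSurgeon. The file names, the tensor layout, and every plugin attribute value are assumptions that must be adapted to the actual model.

import numpy as np
import onnx
import onnx_graphsurgeon as gs

graph = gs.import_onnx(onnx.load("model.onnx"))

# Locate the NMS node produced by the TensorFlow export.
nms = next(node for node in graph.nodes if node.op == "NonMaxSuppression")

# Output tensors of the BatchedNMS_TRT plugin
# (batch size 1 and keepTopK=100 are assumptions).
outputs = [
    gs.Variable("num_detections", dtype=np.int32, shape=(1, 1)),
    gs.Variable("nmsed_boxes", dtype=np.float32, shape=(1, 100, 4)),
    gs.Variable("nmsed_scores", dtype=np.float32, shape=(1, 100)),
    gs.Variable("nmsed_classes", dtype=np.float32, shape=(1, 100)),
]

plugin = gs.Node(
    op="BatchedNMS_TRT",    # must match the name of the registered TensorRT plugin
    inputs=nms.inputs[:2],  # boxes and scores; their layout must match what the plugin expects
    outputs=outputs,
    attrs={                 # every value below is a model-specific assumption
        "shareLocation": True,
        "backgroundLabelId": -1,
        "numClasses": 91,
        "topK": 1024,
        "keepTopK": 100,
        "scoreThreshold": 0.3,
        "iouThreshold": 0.5,
        "isNormalized": True,
        "clipBoxes": True,
    },
)
graph.nodes.append(plugin)

# Disconnect the original node and expose the plugin outputs instead.
nms.outputs.clear()
graph.outputs = outputs
graph.cleanup().toposort()

onnx.save(gs.export_onnx(graph), "model_updated.onnx")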

Hi AastaLLL, I followed your instructions but got a new error (the complete log file is attached). Any idea how to fix it?

Error before applying Graphsurgeon:
Unsupported ONNX data type: UINT8 (2)
ERROR: image_tensor:0:189 In function importInput:
[8] Assertion failed: convertDtype(onnxDtype.elem_type(), &trtDtype)
[01/19/2021-19:48:27] [E] Failed to parse onnx file
[01/19/2021-19:48:27] [E] Parsing model failed
[01/19/2021-19:48:27] [E] Engine creation failed
[01/19/2021-19:48:27] [E] Engine set up failed
&&&& FAILED TensorRT.trtexec # trtexec --onnx=/faster_rcnn_inceptionv2_coco.onnx --explicitBatch
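
The first failure is the ONNX parser rejecting the UINT8 graph input. A common workaround is to rewrite the input dtype to FLOAT before calling trtexec. A minimal, untested sketch with the onnx Python package; the input name image_tensor:0 comes from the log, the output file name is made up, and a Cast node right after the input may also need adjusting:

import onnx

model = onnx.load("faster_rcnn_inceptionv2_coco.onnx")

# Rewrite the graph input from UINT8 to FLOAT so TensorRT can parse it.
for graph_input in model.graph.input:
    if graph_input.name == "image_tensor:0":
        graph_input.type.tensor_type.elem_type = onnx.TensorProto.FLOAT

onnx.save(model, "faster_rcnn_inceptionv2_coco_float_input.onnx")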

Error after applying Graphsurgeon:
$ trtexec --onnx=/faster_rcnn_inceptionv2_coco_updated_model.onnx --explicitBatch
While parsing node number 7 [Loop]:
ERROR: ModelImporter.cpp:92 In function parseGraph:
[8] Assertion failed: convertOnnxWeights(initializer, &weights, ctx)
[01/19/2021-20:35:59] [E] Failed to parse onnx file
[01/19/2021-20:35:59] [E] Parsing model failed
[01/19/2021-20:35:59] [E] Engine creation failed
[01/19/2021-20:35:59] [E] Engine set up failed
&&&& FAILED TensorRT.trtexec # trtexec --onnx=/faster_rcnn_inceptionv2_coco_updated_model.onnx --explicitBatch

log_onnx_tensorrt.txt (76.2 KB)
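
As for the second failure: Loop nodes usually come from TensorFlow control flow left in the exported graph, and the convertOnnxWeights assertion typically means one of the Loop's initializers has a dtype TensorRT cannot convert. Constant folding sometimes removes these subgraphs entirely before TensorRT sees them. A sketch with ONNX GraphSurgeon; file names are placeholders, and fold_constants needs onnxruntime installed:

import onnx
import onnx_graphsurgeon as gs

graph = gs.import_onnx(onnx.load("faster_rcnn_inceptionv2_coco_updated_model.onnx"))

# Evaluate statically computable subgraphs and drop the dead nodes;
# this can eliminate Loop nodes that only iterate over constants.
graph.fold_constants().cleanup().toposort()

onnx.save(gs.export_onnx(graph), "faster_rcnn_inceptionv2_coco_folded.onnx")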

Please open a new topic if this is still an issue. Thanks.