Custom model on Jetson Nano


I’m currently working on an object detection problem: detecting garbage bags using a Jetson Nano and a webcam. I previously trained a custom model using MobileNet-V2 320x320 and exported it in both SavedModel and TFLite formats. However, I haven’t been able to import it into DetectNet, even though I converted it to ONNX format using the tf2onnx repository.
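For reference, the conversion step looked roughly like this (the export directory, output path, and opset are placeholders, not the exact values I used):

```python
# Rough shape of the tf2onnx conversion, invoked as a CLI via subprocess.
# Paths and the opset number are placeholders.
cmd = [
    "python", "-m", "tf2onnx.convert",
    "--saved-model", "exported/saved_model",  # placeholder SavedModel dir
    "--output", "model.onnx",                 # placeholder output path
    "--opset", "11",
]
# To actually run it:
# import subprocess; subprocess.run(cmd, check=True)
```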

When I pass the path to the model to DetectNet I get this error:

[TRT] ModelImporter.cpp:773: While parsing node number 265 [TopK -> "TopK__875:0"]:
[TRT] ModelImporter.cpp:774: --- Begin node ---
[TRT] ModelImporter.cpp:775: input: "GatherND__867:0"
input: "Cast__873:0"
output: "TopK__875:0"
output: "TopK__875:1"
name: "TopK__875"
op_type: "TopK"
domain: ""

[TRT] ModelImporter.cpp:776: --- End node ---
[TRT] ModelImporter.cpp:779: ERROR: builtin_op_importers.cpp:4519 In function importTopK:
[8] Assertion failed: ( && "This version of TensorRT only supports input K as an initializer."
[TRT] failed to parse ONNX model '/home/colbits/model/onnx_model/model.onnx'
[TRT] device GPU, failed to load /home/colbits/model/onnx_model/model.onnx
[TRT] detectNet -- failed to initialize.

I tried this solution, but then I get this error when running it:

AttributeError: 'NoneType' object has no attribute 'op'

Can you help with this?

There has been no update from you for a while, so we assume this is no longer an issue and are closing this topic. If you need further support, please open a new one.


Which JetPack version are you using?
If you are not using our latest JetPack 4.6.3, could you upgrade and try it again?
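If you are unsure which version is installed, the L4T release (which maps to a JetPack version) can be read from /etc/nv_tegra_release on the device. A small parsing sketch; the L4T-to-JetPack mapping table is my own assumption and covers only a few 4.x releases:

```python
# Parse the L4T version out of /etc/nv_tegra_release and map it to a
# JetPack release. The mapping below is an assumption (JetPack 4.x subset).
import re

L4T_TO_JETPACK = {
    "32.7.3": "4.6.3",
    "32.7.2": "4.6.2",
    "32.7.1": "4.6.1",
    "32.6.1": "4.6",
}

def l4t_version(release_text):
    """Extract 'major.revision' L4T version from nv_tegra_release contents."""
    m = re.search(r"R(\d+).*?REVISION:\s*([\d.]+)", release_text)
    if not m:
        return None
    return f"{m.group(1)}.{m.group(2)}"

# Usage on a Jetson (path exists only on the device):
# with open("/etc/nv_tegra_release") as f:
#     ver = l4t_version(f.read())
#     print(ver, "->", L4T_TO_JETPACK.get(ver, "unknown JetPack"))
```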