Issue deploying custom Tensorflow model on Xavier NX

Hello, I'm trying to deploy a custom TensorFlow model on my Jetson Xavier NX. The model was trained from ssd_mobilenet_v2_fpnlite_320x320_coco17_tpu-8.

The model is then converted to ONNX using this method: GitHub - pskiran1/TensorRT-support-for-Tensorflow-2-Object-Detection-Models

I have also tried converting it with this method: TensorRT/samples/python/tensorflow_object_detection_api at main · NVIDIA/TensorRT · GitHub

The model passes the trtexec test.

When I try to do inference I get this error: 2: [pluginV2DynamicExtRunner.cpp::execute::115] Error Code 2: Internal Error (Assertion status == kSTATUS_SUCCESS failed. )

The Xavier is running JetPack 4.6.1.


Would you mind sharing the complete output log with us?

Here's a picture of the error message after running detectnet with this command:

detectnet --model=models/thewinningone.onnx --labels=models/test/labels.txt --input-blob=input_tensor --output-cvg=detection_scores --threshold=0.15 --output-bbox=detection_boxes /dev/video0

I also uploaded the model.
thewinningone.onnx (10.2 MB)

Hi @bruker58, can you try running this on a single image? I wonder if a CUDA error is actually occurring during the post-processing, which messes up TensorRT on the next frame.
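In case it helps, a single-image run could look like the following (the image filenames `test.jpg` and `result.jpg` are placeholders; the other arguments are taken from your command above):

```shell
# Run detectnet once on a static image instead of the camera stream,
# writing the annotated output to result.jpg
detectnet --model=models/thewinningone.onnx \
          --labels=models/test/labels.txt \
          --input-blob=input_tensor \
          --output-cvg=detection_scores \
          --output-bbox=detection_boxes \
          --threshold=0.15 \
          test.jpg result.jpg
```

If the error also appears on a static image, the problem is in the engine/plugin execution itself rather than the camera pipeline.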

Anyhow, the ONNX pre/post-processing in jetson.inference.detectNet is configured for the models trained with the Hello AI World tutorial in PyTorch. It probably needs adjusting to support your different model (such as a different interpretation of the output tensors, and possibly different coefficients used for mean-pixel subtraction during pre-processing, etc.)

Hi dusty_nv, thanks so much for responding. I tried running the inference on a single image. Here's the result:

Do you have any other suggestions on how to run the model on the Xavier with a decent framerate?

@AastaLLL might have other suggestions about how to convert it, but I would give TF-TRT a try:
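A minimal TF-TRT conversion sketch, assuming a TF2 SavedModel export of your SSD model (the input/output paths are placeholders, and FP16 is only a suggested precision for the Xavier NX):

```python
# Convert a TensorFlow 2 SavedModel with TF-TRT so that supported
# subgraphs run through TensorRT at inference time.
from tensorflow.python.compiler.tensorrt import trt_convert as trt

converter = trt.TrtGraphConverterV2(
    input_saved_model_dir="exported_model/saved_model",  # placeholder path
    precision_mode=trt.TrtPrecisionMode.FP16,            # FP16 tends to help on Xavier NX
)
converter.convert()
converter.save("exported_model/tftrt_saved_model")       # placeholder output path
```

The converted SavedModel is then loaded and run with the normal TF2 `tf.saved_model.load` API, which sidesteps the ONNX plugin path entirely.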

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.