JetPack 6 | TensorFlow model | Object detection | Inference | Jetson Orin Nano module

Description

I am a newbie to Jetson devices.

I have trained an object detection model on a custom dataset in Colab and got a TensorFlow model. I want to run inference with it on my Jetson Orin Nano. Could you please point me to any docs on how to do this correctly and in an optimal way? How do I run inference on the Jetson device using "detectnet" (if possible) and point the --network option to my custom model?
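For reference, the sketch below is roughly what I am hoping to end up with. It is based on the jetson-inference Hello AI World examples and is not something I have verified on my model: it assumes the TensorFlow model has first been converted to ONNX (e.g. with tf2onnx, via python3 -m tf2onnx.convert --saved-model <model_dir> --output model.onnx), and the paths and layer names (model.onnx, labels.txt, input_0, scores, boxes) are placeholders that would have to match the actual export. As I understand it, detectnet expects SSD-style inputs/outputs, so an arbitrary TensorFlow export may need extra work.

```python
# Hedged sketch of custom-model inference with jetson-inference's Python API.
# Paths and layer names are assumptions taken from the Hello AI World SSD
# tutorial -- they must match however the ONNX model was actually exported.
from jetson_inference import detectNet
from jetson_utils import videoSource, videoOutput

net = detectNet(
    model="models/custom/model.onnx",   # assumed: TF model converted to ONNX
    labels="models/custom/labels.txt",  # assumed: one class name per line
    input_blob="input_0",               # assumed input tensor name
    output_cvg="scores",                # assumed confidence output tensor
    output_bbox="boxes",                # assumed bounding-box output tensor
    threshold=0.5,
)

camera = videoSource("/dev/video0")     # or a video file / RTSP URI
display = videoOutput("display://0")

while display.IsStreaming():
    img = camera.Capture()
    if img is None:                     # capture timeout, try again
        continue
    detections = net.Detect(img)        # runs TensorRT inference + overlay
    display.Render(img)
    display.SetStatus(f"detectNet | {net.GetNetworkFPS():.0f} FPS")
```

If this is right, the equivalent command-line call would be something like detectnet --model=models/custom/model.onnx --labels=models/custom/labels.txt --input-blob=input_0 --output-cvg=scores --output-bbox=boxes /dev/video0, but I would appreciate confirmation that this is the intended workflow.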

Environment

GPU Type: NVIDIA GeForce RTX
Nvidia Driver Version:
CUDA Version: 12.0
CUDNN Version: 8.9
Operating System + Version: Ubuntu 22.04
Python Version (if applicable):
TensorFlow Version (if applicable): 2.15
PyTorch Version (if applicable):
Baremetal or Container (if container which image + tag):


Hi @daydreamers0423,
We would recommend you raise this query on the Jetson Orin forum.
Thanks

OK, thanks.