Jetson Orin Nano Inference GPU

Why can’t I run a YOLO model for inference on the GPU? I ordered one to evaluate for my company, which plans to use it for object detection going forward. I have been working with it for two weeks now and still have no way to run object detection on the GPU.
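For anyone hitting the same wall: a first sanity check is whether PyTorch can see the GPU at all, since YOLO frameworks fall back to CPU silently when it can’t. This is a minimal sketch, not an official diagnostic; the `"yolov8n.pt"` weights and image path in the commented lines are illustrative, and it assumes the `ultralytics` package if you uncomment them.

```python
def pick_device():
    """Return the device argument for YOLO inference: GPU 0 if CUDA works, else CPU."""
    try:
        import torch  # assumption: a CUDA-enabled PyTorch build is installed
    except ImportError:
        return "cpu"
    return "0" if torch.cuda.is_available() else "cpu"

device = pick_device()
print("Inference device:", device)

# Illustrative usage with the ultralytics package (uncomment to run):
# from ultralytics import YOLO
# model = YOLO("yolov8n.pt")          # hypothetical model file
# model.predict("image.jpg", device=device)
```

If this prints `cpu` on an Orin Nano, the problem is the PyTorch install, not YOLO itself.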

Hi,

Would you mind sharing more info about the issue you are seeing?
Which model are you using, and which repo/framework are you trying?
What kind of error do you get, and is there an output log you can share with us?

Thanks.

Yes, thank you for the response. The documentation on installing PyTorch for Jetson devices has changed and no longer works. I had to find a YouTube video (three months old) showing the installation, pause it, and type the commands into the terminal manually. Are any fixes planned?
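A common failure mode behind this (an assumption on my part, not confirmed in the thread): installing `torch` straight from PyPI on a Jetson typically pulls a CPU-only aarch64 wheel, while the CUDA-enabled wheel has to come from NVIDIA’s Jetson-specific builds. A quick sketch to tell the two apart:

```python
def torch_build_info():
    """Report whether the installed PyTorch wheel was built with CUDA support."""
    try:
        import torch
    except ImportError:
        return "torch is not installed"
    if torch.version.cuda is None:
        # CPU-only wheel: likely installed from PyPI rather than NVIDIA's Jetson wheel
        return "torch %s: CPU-only build" % torch.__version__
    return "torch %s: built for CUDA %s" % (torch.__version__, torch.version.cuda)

print(torch_build_info())
```

If this reports a CPU-only build, reinstalling from the wheel matching your JetPack version should fix GPU inference.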

There has been no update from you for a while, so we assume this is no longer an issue.
Hence we are closing this topic. If you need further support, please open a new one.
Thanks

Hi,

Would you mind sharing more info about the error you encountered?
Is there any error message or output log?

Thanks