Run the Hello AI World project on my Orin Nano

I’ve been following the “Hello AI World” tutorial but ran into issues—specifically, TensorRT 10.3 no longer supports legacy Caffe models, which caused the build to fail. After searching the forums, I tried to rebuild the model with PyTorch instead, but couldn’t get PyTorch to install successfully on JetPack 6.2.1, which is what I’m running now.

Has there been an updated version of the “Hello AI World” tutorial with clearer, more foolproof instructions? I’d appreciate any guidance or updated resources.

Hi,

Please use JetPack 6.0 if you want to test “Hello AI World”.

As we are moving toward generative AI, please try our generative AI tutorials in an environment with JetPack 6.1+.

Thanks.

Thank you for the reply.
Just wondering: is there a plan to migrate “Hello AI World” to JetPack 6.2+ in the near future? Also, if I reflash my system back to JetPack 6.0, will I still have the “TensorRT 10.3 no longer supports legacy Caffe models” problem?

If you reflash back to JetPack 6.0, you will get TensorRT 8.6+, which is compatible with legacy Caffe models, and most of the “Hello AI World” project will function properly.
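The version split above can be sketched as a small check. This is a sketch only: `caffe_parser_available` is a hypothetical helper, and the cutoff assumes the legacy Caffe (and UFF) parsers were dropped after the TensorRT 8.x series, which is why JetPack 6.0 (TensorRT 8.6) works while JetPack 6.2 (TensorRT 10.3) does not.

```python
def caffe_parser_available(trt_version: str) -> bool:
    """Return True when this TensorRT release still ships the
    legacy Caffe parser (assumed to be the 8.x series and earlier);
    anything newer needs the model converted to ONNX instead."""
    major = int(trt_version.split(".")[0])
    return major <= 8

# JetPack 6.0 ships TensorRT 8.6.x -> legacy Caffe models load
print(caffe_parser_available("8.6.2"))   # True
# JetPack 6.2 ships TensorRT 10.3 -> legacy Caffe models are rejected
print(caffe_parser_available("10.3.0"))  # False
```

On a newer JetPack, the usual workaround is to export the network to ONNX and build the TensorRT engine from that instead of the Caffe prototxt/caffemodel pair.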

I hope NVIDIA directly answers the question “…is there a plan to migrate ‘Hello AI World’ to JetPack 6.2+…”. Yes or no?

Hi both,

Hello AI World mainly depends on Caffe models, so you will need JetPack 6.0.
If you want to run inference on JetPack 6.1+, please check out Jetson AI Lab or the DeepStream SDK.

Jetson AI Lab: focuses on LLMs

DeepStream SDK: focuses on vision and multimedia

Thanks.