Hello – I have a Jetson Orin Nano with JetPack 6.2.1, L4T 36.4.7, CUDA 12.6.68. I have tried to install jetson-inference from source, but I cannot run any of the imagenet programs. It looks like the Caffe models it relies on are not supported in the new JetPack. Can anyone suggest how to go about this? From jetson_utils, I could only get the videoSource, videoOutput, resize, and color-conversion functions to work. I also tried the Docker container and made more progress with it, but even that crashed after displaying the image and detections. Thanks.
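For reference, the jetson_utils functions the poster reports as working can be combined into a minimal capture/resize/convert/display loop like the sketch below. This is only an illustration, not the poster's exact code: the camera URI (`csi://0`), display sink, and half-resolution downscale are assumptions to adjust for your own setup, and it requires a Jetson with jetson-utils built from source.

```python
# Minimal jetson_utils pipeline sketch: capture -> resize -> color convert -> render.
# Assumptions: CSI camera at csi://0 and a local display; swap in your own URIs.
from jetson_utils import (videoSource, videoOutput,
                          cudaAllocMapped, cudaResize, cudaConvertColor)

source = videoSource("csi://0")        # or "/dev/video0", "file://input.mp4"
output = videoOutput("display://0")    # or "file://out.mp4", "rtsp://@:8554/out"

while output.IsStreaming():
    img = source.Capture()
    if img is None:                    # capture timeout, no frame yet
        continue

    # Downscale to half resolution (illustrative choice).
    small = cudaAllocMapped(width=img.width // 2,
                            height=img.height // 2,
                            format=img.format)
    cudaResize(img, small)

    # Convert the resized frame to grayscale.
    gray = cudaAllocMapped(width=small.width, height=small.height,
                           format="gray8")
    cudaConvertColor(small, gray)

    output.Render(gray)
    if not source.IsStreaming():       # camera/stream ended
        break
```

If this loop runs but the imagenet/detectnet examples do not, that points the problem at model loading rather than the video plumbing, which is consistent with the Caffe-model issue described above.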
Hi Botz,
I have moved your post to the Jetpack section for better visibility. Thanks for posting on the forums!
Best,
AHarpster
Hi,
jetson-inference is deprecated.
Please set up the device with JetPack 6.0 if you need to run the examples.
Thanks.
This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.