Failed to initialize imageNet

Hi,
I followed the steps for building jetson-inference for the Deep Learning Nodes for ROS/ROS2 package from the link below:

https://github.com/dusty-nv/ros_deep_learning
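
For context, jetson-inference itself was built from source, roughly following its standard build steps (paraphrased from the jetson-inference build docs; the exact apt package list may differ between releases):

sudo apt-get update
sudo apt-get install git cmake libpython3-dev python3-numpy
git clone --recursive https://github.com/dusty-nv/jetson-inference
cd jetson-inference
mkdir build && cd build
cmake ../
make -j$(nproc)
sudo make install
sudo ldconfig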

Even though the build was successful, I cannot run the jetson-inference samples shown on the website below; they throw the error attached here. The same error appears with the other samples as well.

https://github.com/dusty-nv/jetson-inference/blob/master/docs/imagenet-console-2.md

The example used is shared below:

./imagenet images/strawberry_0.jpg images/test/output_1.jpg
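
For reference, the sample is normally run from the build output directory, along these lines (paths assume a default from-source build):

cd jetson-inference/build/aarch64/bin
./imagenet images/strawberry_0.jpg images/test/output_1.jpg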

Error message (attached):

Here is what is inside the networks directory of the cloned jetson-inference repo:

Could you please let me know the root cause of the issue, so that I can proceed with the same setup in ROS?

The full log of the above example is attached:
full_log_error.txt (24.3 KB)

Hi @krishnaprasad.k, which Jetson AGX Xavier are you using? Also which version of JetPack-L4T are you on? (you can check this with cat /etc/nv_tegra_release)

Typically that error means you are running an older version of JetPack-L4T or TensorRT that doesn’t support your particular device yet.
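
For example, the check looks roughly like this (the output shown is illustrative only; your release and revision numbers will differ):

cat /etc/nv_tegra_release
# example output for JetPack 4.6.1 (L4T R32.7.1), shown for illustration:
# R32 (release), REVISION: 7.1, GCID: ..., BOARD: t186ref, EABI: aarch64, ...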

Hi dusty_nv,

We are using JetPack 4.6.1 on a Xavier NX.

The output of the nv_tegra_release file is attached.

We also tried the Docker container inside the jetson-inference directory, but the sample failed to run with the same error.

Could you please clarify which JetPack and TensorRT versions are required to run the jetson-inference sample models properly?
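
For completeness, the container was launched roughly like this, using the run script shipped with jetson-inference (the script selects the image tag automatically), and the same sample was tried inside it:

cd jetson-inference
docker/run.sh
# inside the container:
cd build/aarch64/bin
./imagenet images/strawberry_0.jpg images/test/output_1.jpg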

Hi @krishnaprasad.k, if you are using Xavier NX 16GB, please refer to this post below:

Hi @dusty_nv,

Thanks for sharing the post.
When we upgrade from JetPack 4.6.1 to JetPack 5.0.2 for the compatible TensorRT version, the ROS2 version will change to Foxy on Ubuntu 20.04.

However, the ros_deep_learning repo uses ROS2 Eloquent throughout its documentation, which targets Ubuntu 18.04:
ros_deep_learning

Could you please let us know whether the ROS2 version will conflict when running the samples with the deep learning nodes on ROS2 after upgrading JetPack to 5.0.2, or whether applying the overlay patch given on the JetPack 4.6.1 page is the only option?

Hi @krishnaprasad.k, the ros_deep_learning nodes compile against Foxy/Galactic/Humble as well (I should mention this in the docs). Here’s an example where they get built with Foxy: https://github.com/dusty-nv/jetbot_ros/blob/d8e5ee1b17f5c66d038017b86f8920a496197ea9/Dockerfile#L123

If you want to use ros_deep_learning, use the PyTorch variants of my ROS containers as the base container, because these include the dependencies needed for the ros_deep_learning package.
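
As an illustration only (the image tag below is an assumption; pick the dustynv/ros tag that matches your L4T release):

# hypothetical tag shown for illustration; match it to your L4T version
sudo docker pull dustynv/ros:foxy-pytorch-l4t-r35.1.0
sudo docker run --runtime nvidia -it --rm --network host dustynv/ros:foxy-pytorch-l4t-r35.1.0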

Hi @dusty_nv,

Could you please clarify whether the ros_deep_learning nodes can be compiled with ROS2 Foxy without using a Docker container?

Presuming that you already have ROS2 Foxy built/installed outside of Docker, yes, ros_deep_learning can also be built outside of a container (just install jetson-inference first).

My ROS2 containers are provided for convenience and to contain the build environment; however, you can follow the same steps as in the Dockerfile and should be able to build ROS2 outside of a container that way too.
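
A rough sketch of that flow, assuming jetson-inference is already installed system-wide and Foxy is already available (the workspace path and launch file are just examples):

# jetson-inference installed first (sudo make install && sudo ldconfig), then:
source /opt/ros/foxy/setup.bash            # or the setup.bash of your from-source Foxy build
mkdir -p ~/ros2_ws/src && cd ~/ros2_ws/src
git clone https://github.com/dusty-nv/ros_deep_learning
cd ~/ros2_ws
colcon build --symlink-install
source install/setup.bash
ros2 launch ros_deep_learning imagenet.ros2.launch   # example launch file from the repo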

Hi @dusty_nv,

Thank you. Let us try it on our side and get back to you.
