I followed the steps for building jetson-inference for the Deep Learning Nodes for ROS/ROS2 package from the link below.
Even though the build was successful, I cannot run the jetson-inference samples shown on that website; they throw the error attached below. The same error is shown with the other samples as well.
The example used is shared below:
./imagenet images/strawberry_0.jpg images/test/output_1.jpg
Here is what is inside the networks directory of the cloned jetson-inference repo.
Could you please let me know the root cause of the issue, so that I can proceed with the same on ROS?
I have attached the full log of the above example.
full_log_error.txt (24.3 KB)
@krishnaprasad.k, which Jetson AGX Xavier are you using? Also, which version of JetPack-L4T are you on? (you can check this with cat /etc/nv_tegra_release)
Typically that error means you are running an older version of JetPack-L4T or TensorRT that doesn’t support your particular device yet.
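For reference, a quick way to gather both pieces of info (the /etc/nv_tegra_release file is standard on JetPack/L4T images; the dpkg query assumes the stock apt-based JetPack image is in use):

```shell
# Print the L4T release string; the file only exists on Jetson/L4T systems.
# JetPack 4.6.1 corresponds to L4T R32.7.1.
if [ -f /etc/nv_tegra_release ]; then
    cat /etc/nv_tegra_release
else
    echo "not an L4T system"
fi

# List installed TensorRT packages (assumes the stock dpkg-based JetPack image)
dpkg -l 2>/dev/null | grep -i tensorrt || true
```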
We are using JetPack 4.6.1 on a Xavier NX.
Attached is the output of the nv_tegra_release file.
We also tried with the Docker container inside the jetson-inference directory, but failed to run the sample; the same error is thrown.
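For context, the container workflow we followed is the one from the jetson-inference docs; docker/run.sh picks a container tag matching the L4T version on the device (paths below follow those docs):

```shell
# From the root of the cloned repo; docker/run.sh starts a container
# whose tag matches the L4T version reported in /etc/nv_tegra_release.
cd jetson-inference
docker/run.sh

# Inside the container, the samples are under build/aarch64/bin:
cd build/aarch64/bin
./imagenet images/strawberry_0.jpg images/test/output_1.jpg
```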
Could you please clarify which JetPack and TensorRT versions are needed to run the jetson-inference sample models properly?
@krishnaprasad.k, if you are using Xavier NX 16GB, please refer to this post below:
As mentioned in previous posts 213791 and 214020, there is an issue with the TensorRT version in JetPack 4.6.1 that makes it incompatible with the Xavier NX eMMC 16GB, and upgrading to JetPack 5.0.1 is suggested. However, we are using an Auvidea JNxxx board, which only provides firmware up to JetPack 4.6. Is there any other workaround for the missing TensorRT device entry?
Thanks for sharing the post.
When we upgrade from JetPack 4.6.1 to JetPack 5.0.2 for the compatible TensorRT version, the ROS2 version will change to Foxy on Ubuntu 20.04.
But the ros_deep_learning repo uses ROS2 Eloquent throughout the site, which supports Ubuntu 18.04.
Could you please let us know whether the ROS2 version will conflict when running samples with the DeepStream nodes on ROS2 after we upgrade JetPack to 5.0.2, or whether applying the overlay patch given on the JetPack 4.6.1 website is the only option?
@krishnaprasad.k, the ros_deep_learning nodes compile against Foxy/Galactic/Humble as well (I should mention this in the docs). Here’s an example where they get built with Foxy: https://github.com/dusty-nv/jetbot_ros/blob/d8e5ee1b17f5c66d038017b86f8920a496197ea9/Dockerfile#L123
If you want to use ros_deep_learning, use the PyTorch variants of my ROS containers as the base container, because these include the dependencies needed for the ros_deep_learning package.
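As a sketch, basing an image on one of those PyTorch variants might look like this (the exact tag here is an assumption; pick the foxy-pytorch tag that matches your L4T release from the dusty-nv container listings):

```shell
# Hypothetical tag: substitute the foxy-pytorch tag matching your L4T release.
BASE_IMAGE=dustynv/ros:foxy-pytorch-l4t-r32.7.1
docker pull "$BASE_IMAGE"

# In your own Dockerfile, you would then start from it:
#   FROM dustynv/ros:foxy-pytorch-l4t-r32.7.1
```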
Could you please clarify whether the ros_deep_learning nodes can be compiled on ROS2 Foxy without the support of a Docker container?
Presuming that you already have ROS2 Foxy built/installed outside of Docker, yes, ros_deep_learning can also be built outside of a container (just install jetson-inference first).
My ROS2 containers are provided for convenience and to encapsulate the build environment; however, you can follow the same steps as inside the Dockerfile and should be able to build ROS2 outside of the container that way too.
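Assuming ROS2 Foxy is already installed natively, the sequence sketched below combines the documented jetson-inference source build with a standard colcon workspace build (the workspace path ~/ros2_ws is illustrative, not prescribed by the repo):

```shell
# 1) Build and install jetson-inference from source (documented upstream steps)
git clone --recursive https://github.com/dusty-nv/jetson-inference
cd jetson-inference
mkdir build && cd build
cmake ../
make -j"$(nproc)"
sudo make install
sudo ldconfig

# 2) Build ros_deep_learning in a colcon workspace (workspace path is illustrative)
mkdir -p ~/ros2_ws/src && cd ~/ros2_ws/src
git clone https://github.com/dusty-nv/ros_deep_learning
cd ~/ros2_ws
source /opt/ros/foxy/setup.bash
colcon build --packages-select ros_deep_learning
source install/setup.bash
```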
Thank you. Let us try it on our side and get back to you.
December 7, 2022, 5:04am
This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.