Even though the build was successful, I cannot run the jetson-inference samples shown on the website below; they throw the error attached below. The same error appears with the other samples as well.
Hi @krishnaprasad.k, which Jetson AGX Xavier are you using? Also which version of JetPack-L4T are you on? (you can check this with cat /etc/nv_tegra_release)
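For reference, a quick sketch of how you might check both versions from a terminal (the TensorRT check via dpkg is an assumption based on a typical Debian-based JetPack install; adjust if your setup differs):

```shell
# Print the L4T release string if this is a Jetson device
if [ -f /etc/nv_tegra_release ]; then
    cat /etc/nv_tegra_release
else
    echo "no /etc/nv_tegra_release found (not a Jetson?)"
fi

# List installed TensorRT packages (assumes a dpkg-based JetPack install)
dpkg -l 2>/dev/null | grep -i tensorrt || echo "no TensorRT packages found"
```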
Typically that error means you are running an older version of JetPack-L4T or TensorRT that doesn’t support your particular device yet.
Thanks for sharing the post.
When we upgrade from JetPack 4.6.1 to JetPack 5.0.2 for the compatible TensorRT version, the ROS2 version will change to Foxy on Ubuntu 20.04.
But the ros_deep_learning repo uses ROS2 Eloquent throughout, which supports Ubuntu 18.04: ros_deep_learning
Could you please let us know whether the ROS2 version will conflict when running samples with DeepStream nodes on ROS2 after we upgrade JetPack to 5.0.2, or whether applying the overlay patch given on the JetPack 4.6.1 website is the only option?
If you want to use ros_deep_learning, use the PyTorch variants of my ROS containers as the base container, because these include the dependencies needed by the ros_deep_learning package.
Presuming that you already have ROS2 Foxy built/installed outside of Docker, yes, ros_deep_learning can also be built outside of a container (just install jetson-inference first).
My ROS2 containers are provided for convenience and to contain the build environment; however, you can follow the same steps as in the Dockerfile and should be able to build ROS2 outside of a container that way too.
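A rough sketch of those steps outside a container follows. The workspace path and the `/opt/ros/foxy` setup-script location are assumptions based on a typical install; check the jetson-inference and ros_deep_learning READMEs for the authoritative instructions:

```shell
# 1. Build and install jetson-inference from source
git clone --recursive https://github.com/dusty-nv/jetson-inference
cd jetson-inference
mkdir build && cd build
cmake ../
make -j"$(nproc)"
sudo make install
sudo ldconfig

# 2. Build ros_deep_learning in a ROS2 workspace
#    (assumes ROS2 Foxy is installed under /opt/ros/foxy)
source /opt/ros/foxy/setup.bash
mkdir -p ~/ros_ws/src && cd ~/ros_ws/src
git clone https://github.com/dusty-nv/ros_deep_learning
cd ~/ros_ws
colcon build --symlink-install

# 3. Source the workspace overlay before launching any of the nodes
source install/setup.bash
```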