Jetson Orin Nano Developer Kit with NVIDIA Docker and ROS2

Hey,
I am trying to get a Docker container working on the Jetson Orin Nano. The software needs GPU acceleration and has to communicate with a robot over the network via ROS 2 Humble.

I tried different setups:
Docker base image: nvcr.io/nvidia/l4t-base:r36.2.0 with new ros2 installation
and dustynv/ros:humble-ros-base-l4t-r36.2.0
I followed this setup guide: jetson-containers/docs/setup.md at master · dusty-nv/jetson-containers · GitHub, including setting the Docker default runtime.
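For completeness, the default-runtime part of that setup boils down to the following daemon.json change (a sketch of what the guide describes, followed by a Docker restart):

```bash
# Make the NVIDIA runtime the default for all containers, as in the
# jetson-containers setup guide, then restart the Docker daemon.
sudo tee /etc/docker/daemon.json > /dev/null <<'EOF'
{
    "runtimes": {
        "nvidia": {
            "path": "nvidia-container-runtime",
            "runtimeArgs": []
        }
    },
    "default-runtime": "nvidia"
}
EOF
sudo systemctl restart docker
```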
I use the following command to start the container: sudo docker run --net=host -it --rm --name l4t_ros l4t_ros.
When I set the correct ROS_DOMAIN_ID I can also receive the correct topics, but anything I publish is not received by the counterpart. One weird observation: when I start the container with sudo docker run --net=host --runtime=nvidia -it --rm --name l4t_ros l4t_ros, the topics do not show up at all.
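For anyone hitting the same one-way communication, a minimal sketch of the checks I would run on both sides to see whether DDS discovery and multicast work at all (assumes ros2cli is available in the container and on the robot; the domain ID is just an example value):

```bash
# Use the same domain ID on the container and on the robot (example value)
export ROS_DOMAIN_ID=10

# Check whether UDP multicast crosses between the two machines:
# run 'receive' on one side, then 'send' on the other
ros2 multicast receive
ros2 multicast send

# Restart the discovery daemon so no stale graph info is cached, then re-check
ros2 daemon stop && ros2 daemon start
ros2 topic list
```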

I also tried out the official ros2 humble base image, and with that I was able to send the messages, but only at a low rate. I am not sure whether that was caused by slow inference from not using CUDA at that point.
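To rule the GPU side in or out, a quick sketch of what can be checked inside the container when it is started with the NVIDIA runtime (paths are the usual L4T ones; depending on the base image, CUDA may be installed in the image rather than mounted from the host):

```bash
# Inside a container started with --runtime=nvidia (or the nvidia default runtime):
# check that the CUDA toolkit is visible
ls /usr/local/cuda*/bin/nvcc 2>/dev/null && /usr/local/cuda/bin/nvcc --version
# check that GPU device nodes were passed through (exact names vary by L4T release)
ls /dev/nv* 2>/dev/null
```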

Host system: Jetson Linux 36.2

Hi @jonathan.kinzel, what RMW_IMPLEMENTATION are the other hosts using? I believe the dustynv/ros:humble-ros-base-l4t-r36.2.0 image was built with rmw_cyclonedds_cpp as the default (although newer builds have changed to rmw_fastrtps_cpp - those get set here).
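A quick way to compare that on each peer (just a sketch; rmw_fastrtps_cpp below is only one option and has to be present in the image):

```bash
# On the container and on the other hosts: see which RMW is actually in effect
printenv RMW_IMPLEMENTATION          # empty means the image's built-in default
ros2 doctor --report | grep -i middleware

# Then pin the same middleware and domain on every peer, e.g. Fast DDS
export RMW_IMPLEMENTATION=rmw_fastrtps_cpp
export ROS_DOMAIN_ID=10              # example value, must match everywhere
```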

Although it sounds like you also encounter this behavior when installing Humble from the ROS apt repo? @Raffaello may be able to lend his experience. Alternatively you can try the Isaac ROS apt packages and container.
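If it does turn out to be a middleware or domain mismatch, one way to pin both at container start is to pass them in as environment variables (same image/container name as in your command; the values are only examples):

```bash
sudo docker run --net=host --runtime=nvidia -it --rm \
  -e RMW_IMPLEMENTATION=rmw_fastrtps_cpp \
  -e ROS_DOMAIN_ID=10 \
  --name l4t_ros l4t_ros
```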

Hey,
I am not totally sure why, but with my own installation it works now. Thank you for your quick response.
