Why do you need to use apt install *nvblox* inside Docker when we cloned it into the workspace?

Why do you need to use apt install nvblox inside Docker when we cloned it into the workspace? Can we do colcon build and then source /workspaces/isaac_ros-dev/install/setup.bash instead? This may be a stupid question, but it came up. Maybe a package cloned into the workspace has a higher priority even after it is changed, for example when remapping various topics, to check how things work with a TurtleBot3 Waffle with a RealSense R200.
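
To be concrete, this is the kind of workflow I have in mind (a rough sketch; the nvblox_ros package name is just my guess, and the paths follow the default isaac_ros-dev workspace layout):

# inside the Isaac ROS dev container: build the cloned sources as an overlay
cd /workspaces/isaac_ros-dev
colcon build --symlink-install --packages-up-to nvblox_ros
source /workspaces/isaac_ros-dev/install/setup.bash
# check which copy of the package the environment now resolves to;
# the workspace overlay should shadow an apt-installed version
ros2 pkg prefix nvblox_ros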

Second question:
I started the standard launch:
ros2 launch turtlebot3_gazebo turtlebot3_world.launch.py
with the Waffle model set up and the RealSense R200 topics being published.
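
Concretely, I select the Waffle model before launching and then check which camera topics appear (a sketch; the grep filter is only illustrative):

# on the host, before launching the simulation
export TURTLEBOT3_MODEL=waffle
# after launch, in another terminal: see which RealSense topics are published
ros2 topic list | grep camera
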
How can I make it easier to pair the topics that show up in rqt_graph after running:

ros2 launch nvblox_examples_bringup realsense_example.launch.py mode:=dynamic

In general, how can I make nvblox work nicely with TurtleBot3, since many novice roboticists probably start with TurtleBot3? Thank you very much.
Run:
# terminal 0, outside Docker
ros2 launch turtlebot3_gazebo turtlebot3_world.launch.py
# terminal 1, inside Docker
ros2 launch nvblox_examples_bringup realsense_example.launch.py mode:=dynamic

Run:
# terminal 0, outside Docker
ros2 launch turtlebot3_gazebo turtlebot3_world.launch.py
# terminal 1, inside Docker
ros2 launch nvblox_examples_bringup nvblox_carter_navigation.launch.py mode:=dynamic
# terminal 2, inside Docker
ros2 launch nvblox_examples_bringup realsense_example.launch.py mode:=dynamic
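
To confirm that the host and the container see each other's topics, I run something like this inside Docker after starting the simulation on the host (a sketch; it assumes both sides share the same ROS_DOMAIN_ID, that DDS discovery works across the container network, and the topic name is only an assumption):

# inside Docker, after launching Gazebo on the host
ros2 topic list
ros2 topic echo /camera/depth/camera_info --once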

Hi @j15home

Welcome to the Isaac ROS forum! Let me reply to your questions.

The purpose of this documentation is to provide a step-by-step guide on how the Isaac ROS packages work and how to install them. When building your own setup, it is recommended to create a custom Docker container with all the necessary packages and customizations already included.

I suggest following our documentation to build your own Isaac ROS image: Isaac ROS Dev - isaac_ros_docs documentation
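
Roughly, the idea is to add your own layer on top of the Isaac ROS dev image so that every package you need is already baked in. A minimal sketch, assuming a generic Dockerfile on top of the dev image (the base image tag and the extra package are only examples; the linked page describes the supported customization mechanism):

# write a small Dockerfile that extends the dev image
cat > Dockerfile.custom <<'EOF'
# assumed base image tag; use the tag the Isaac ROS dev scripts build on your machine
FROM isaac_ros_dev-x86_64:latest
# example extra package (name assumed)
RUN apt-get update && apt-get install -y ros-humble-topic-tools \
    && rm -rf /var/lib/apt/lists/*
EOF
docker build -f Dockerfile.custom -t my-isaac-ros-dev .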

I'm quite worried that the RealSense R200 may not be fast enough for Isaac ROS nvblox; please check if it meets the minimum requirements.

https://nvidia-isaac-ros.github.io/repositories_and_packages/isaac_ros_nvblox/index.html#camera-system-requirements

Please check our API documentation for Isaac ROS nvblox: ROS Topics and Services - isaac_ros_docs documentation

The main topics to publish to allow Isaac ROS nvblox to work are listed below:

| ROS Topic | Interface | Description |
| --- | --- | --- |
| camera_{i}/color/image | sensor_msgs/Image | Optional input color image to be integrated. Must be paired with the camera_info message below. Only used to color the mesh. |
| camera_{i}/color/camera_info | sensor_msgs/CameraInfo | Optional topic along with the color image above. Contains intrinsics of the color camera. |
| camera_{i}/depth/image | sensor_msgs/Image | The input depth image to be integrated. Must be paired with the camera_info message below. Supports both floating-point (depth in meters) and uint16 (depth in millimeters, OpenNI format). |
| camera_{i}/depth/camera_info | sensor_msgs/CameraInfo | Required topic along with the depth image above. Contains intrinsic calibration parameters of the depth camera. |
| pointcloud | sensor_msgs/PointCloud2 | Input 3D LIDAR pointcloud. You must set the LIDAR intrinsic parameters if using this input, as they are used to convert the pointcloud into a depth image. |

The i in the subscription list above can be replaced with 0-3, for example camera_0/color/image. To connect a simulated camera, you can remap or relay its topics onto these names, as in the sketch below.
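
As a rough sketch (the topic names on the left are only assumptions for what the Gazebo camera plugin publishes, so check them with ros2 topic list; each relay runs in its own terminal), you could bridge the simulated camera onto the nvblox input names with topic_tools:

# see what the simulation actually publishes
ros2 topic list
# relay the simulated RGB-D topics onto the nvblox input names
ros2 run topic_tools relay /camera/color/image_raw /camera_0/color/image
ros2 run topic_tools relay /camera/color/camera_info /camera_0/color/camera_info
ros2 run topic_tools relay /camera/depth/image_raw /camera_0/depth/image
ros2 run topic_tools relay /camera/depth/camera_info /camera_0/depth/camera_info

Alternatively, the same mapping can be done with remappings in your launch file instead of separate relay processes.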

Best,
Raffaello
