I tried the Isaac Perceptor tutorial below on Isaac Sim 4.1.0 (without a Nova Carter or Nova Orin, just Isaac Sim on an x86_64 system). https://nvidia-isaac-ros.github.io/reference_workflows/isaac_perceptor/run_perceptor_in_sim.html
But the result shows some failures.
Could you give me some instructions to get the expected result?
I need to demonstrate Isaac Perceptor at an exhibition, so I really need your help.
I got the result shown in the video below (it was reproducible).
It shows no Nvblox voxels, and the goal wanders around on its own.
I also tried the Nvblox sample in the same container, and it worked fine.
My system environment is as follows:
OS: Ubuntu 22.04
RAM: 96 GB
GPU: RTX A6000 × 1
NVIDIA Driver Version: 555.58.02
CUDA Version: 12.6
Isaac Sim Version: 4.1.0 (ROS2 Humble Bridge is enabled)
ROS2 Version: Humble
Docker and NVIDIA Container Toolkit are installed and work fine
Thank you for bringing this issue to our attention. To better understand and assist you, could you please do the following?
1. Provide a log of running the perceptor example with perceptor_sample_scene.usd.
2. While running the perceptor example, check whether any messages are being published to the topic /nvblox_node/static_esdf_pointcloud. You can try the command ros2 topic hz /nvblox_node/static_esdf_pointcloud, which will show the frequency at which messages are published (it should be around 10 Hz).
3. Instead of sending a goal via rviz2, try sending a command to the /cmd_vel topic and see if the robot still moves randomly. For example, you can make the robot move in a circle by sending ros2 topic pub --once /cmd_vel geometry_msgs/msg/Twist "{linear: {x: 0.2, y: 0.0, z: 0.0}, angular: {x: 0.0, y: 0.0, z: 0.2}}". Or you can make the robot stop by sending all-zero linear and angular velocities.
> While running the perceptor example, check whether any messages are being published to the topic /nvblox_node/static_esdf_pointcloud. …

Yes, messages were published at around 10 Hz.

> Instead of sending a goal via rviz2, try sending a command to the /cmd_vel topic and see if the robot still moves randomly. …

I attached a video following your instructions 2 and 3.
It seems the commands sent to the Nova Carter via the /cmd_vel topic worked fine.
Thanks for your logs! It seems the nvblox node is able to publish the ESDF messages. Now the question is why rviz is not visualizing them. Could you please try ros2 node info /rviz and see if the topic /nvblox_node/static_esdf_pointcloud is in its subscriber list? If it is, could you please run ros2 topic echo /nvblox_node/static_esdf_pointcloud, save the output to a log, and attach it here as well? Also, could you please attach your rviz config file? In rviz, you can click File > Save Config As and save it to your desired path.
As for perceptor generating random goal poses, I was able to reproduce this issue and have passed it to our internal engineering team. They will take a look and try to root-cause it. Again, thanks for bringing this issue up!
> As for perceptor generating random goal poses, I was able to reproduce this issue …

Thanks a lot for your reproduction test.
The unsteady goal issue is important, but at least we can set a goal via the terminal.
The more pressing issue right now is the hidden nvblox output in this demonstration.
(Of course, I want to solve the random goal issue as well!)
So, was the hidden nvblox issue not reproduced in your test?
My apologies; I just noticed that you are using Isaac Sim 4.1.0 while the tutorial uses the scene localhost/NVIDIA/Assets/Isaac/4.0/Isaac/Samples/NvBlox/perceptor_sample_scene.usd.
I tried different combinations of Isaac Sim and asset versions. I am able to completely replicate your issue if I use Isaac Sim 4.1.0 + Assets 4.0 or Isaac Sim 4.0.0 + Assets 4.1.
Do you happen to be using Isaac Sim 4.1.0 + Assets 4.0?
When I tried Isaac Sim 4.0.0 with Assets 4.0, I could visualize the ESDF in rviz, but the robot would follow random goal poses. This happened in 4 out of 5 of my trials.
When I tried Isaac Sim 4.1.0 with Assets 4.1, I could not visualize the ESDF in rviz (empty messages, as you shared in your log, and an empty local costmap as well), but the robot was able to navigate well given a goal pose (although it runs into obstacles because of the empty local costmap).
In our tutorial, Isaac ROS is validated with Isaac Sim 4.0.0 (link).
So it is recommended to use Isaac Sim 4.0.0. But as I said, I still encounter the "random goal pose" issue with this version, and I will update you once our engineering team has more information.
As for supporting Isaac Sim 4.1.0, I will ask our team about the timeline and get back to you once I have an answer.
Could you tell me a command that sets a goal and makes the robot drive to it?
(Of course, I know I should learn ROS 2 myself, but I need to test this demonstration quickly for the exhibition.)
Also, the nvblox result shown in RViz had a much lower framerate than the RGB camera, the depth camera, and the Isaac Sim viewport.
Is this caused by the performance of the graphics board I am using (one RTX A6000)?
(GPU usage was always around 90% while the demonstration was running.)
However, the nvblox demonstration I tested previously ran smoothly.
I don't know why there is such a difference; could you give me some ideas?
I was able to confirm with our engineers that the "random goal poses" behavior is an issue for Isaac Sim 4.0.0 with Isaac Perceptor. It happens quite often in our tests (around 4 out of 5 times). An internal ticket has been created to track it. I will let you know once this issue is fixed.
For now, the issue still exists even if you publish a goal pose message to the /goal_pose topic. But I can share the command with you: ros2 topic pub --once /goal_pose geometry_msgs/msg/PoseStamped "{header: {stamp: {sec: 0, nanosec: 0}, frame_id: 'odom_vslam'}, pose: {position: {x: 1, y: 1, z: 0}, orientation: {x: 0, y: 0, z: 0, w: 1}}}"
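For reference, here is a small Python sketch of how that /goal_pose payload is structured. The helper name goal_pose_cmd is mine, not part of Isaac ROS; it just rebuilds the same command string programmatically (JSON is valid YAML, so ros2 topic pub accepts the result directly):

```python
import json

def goal_pose_cmd(x, y, frame_id="odom_vslam"):
    # Build the PoseStamped payload as a dict; field names follow
    # geometry_msgs/msg/PoseStamped. Identity orientation, z = 0.
    msg = {
        "header": {"stamp": {"sec": 0, "nanosec": 0}, "frame_id": frame_id},
        "pose": {
            "position": {"x": x, "y": y, "z": 0.0},
            "orientation": {"x": 0.0, "y": 0.0, "z": 0.0, "w": 1.0},
        },
    }
    return ("ros2 topic pub --once /goal_pose "
            "geometry_msgs/msg/PoseStamped '" + json.dumps(msg) + "'")

print(goal_pose_cmd(1.0, 1.0))
```

Running the printed command inside the perceptor container sends the robot to (x, y) in the odom_vslam frame, the same as the one-liner above.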
Our engineers were able to root-cause the problem, and they came up with a temporary fix. After you install the required Debian packages and assets, please run sudo sed -i "76i \ \ \ \ \ \ \ \ \ \ \ \ 'image_qos': 'DEFAULT'," /opt/ros/humble/share/isaac_ros_perceptor_bringup/launch/algorithms/vslam.launch.py inside the Docker container.
Then you can start the simulation in Isaac Sim and launch the perceptor example.
Based on my tests, this resolves both the "random goal pose" issue and the "low robot pose frequency" issue. I think your observation that the nvblox result in RViz had a low framerate was likely caused by the low robot pose frequency. Without this temporary fix, the robot pose publish frequency was under 1 Hz; with it, the frequency increases to 30-40 Hz. But please note that the nvblox ESDF publish frequency is around 16 Hz, so it is less than half of the camera framerate (~40 Hz).
You can verify this with the command ros2 topic hz ${topic_name}.
Please give it a try and see if it resolves all of the issues you have seen so far.
I can now set a fixed goal and get smooth nvblox results in RViz by following your instruction below.

> After you install the required debian packages and assets, please run sudo sed -i "76i \ \ \ \ \ \ \ \ \ \ \ \ 'image_qos': 'DEFAULT'," /opt/ros/humble/share/isaac_ros_perceptor_bringup/launch/algorithms/vslam.launch.py in the docker.

I understand that command should make some change to vslam.launch.py.
But I can't find any difference in /opt/ros/humble/share/isaac_ros_perceptor_bringup/launch/vslam.launch.py between before and after executing that command.
You are right. This command adds the line 'image_qos': 'DEFAULT', at line 76 of the file /opt/ros/humble/share/isaac_ros_perceptor_bringup/launch/algorithms/vslam.launch.py (note the algorithms subdirectory). Here is my file after the edit: vslam.launch.py (4.0 KB)
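To make the effect of that sed one-liner concrete, here is a small Python sketch that performs the same insertion. The helper insert_image_qos and the throwaway demo file are mine, for illustration only; the line number (76) and the inserted text come from the command above:

```python
import os
import tempfile
from pathlib import Path

def insert_image_qos(path, line_no=76):
    # Equivalent of `sed -i "76i ..."`: insert the 'image_qos' parameter
    # (12 leading spaces, to match the launch file's indentation) so it
    # becomes the new line `line_no` of the file.
    lines = Path(path).read_text().splitlines(keepends=True)
    lines.insert(line_no - 1, "            'image_qos': 'DEFAULT',\n")
    Path(path).write_text("".join(lines))

# Demo on a throwaway file with 80 placeholder lines.
fd, demo = tempfile.mkstemp(suffix=".py")
os.close(fd)
Path(demo).write_text("".join(f"# line {i}\n" for i in range(1, 81)))
insert_image_qos(demo)
print(Path(demo).read_text().splitlines()[75])  # the new line 76
```

After running the real sed command, line 76 of the algorithms/vslam.launch.py file should contain the 'image_qos' entry; if it does not, the edit was applied to a different file.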
I think you might have attached the wrong file for “edited”. The two files you attached are the same (with the same file name and the same content).
This added line sets the QoS to DEFAULT for cuVSLAM. For more information about QoS, please refer to the official ROS 2 documentation.
If you check the YAML file for nvblox in the isaac_ros_perceptor_bringup package, it sets the QoS to SENSOR_DATA. Since cuVSLAM did not specify a QoS setting, it also used SENSOR_DATA. The following is from the official ROS 2 documentation:

> For sensor data, in most cases it's more important to receive readings in a timely fashion, rather than ensuring that all of them arrive. That is, developers want the latest samples as soon as they are captured, at the expense of maybe losing some. For that reason the sensor data profile uses best effort reliability and a smaller queue size.

As the documentation states, with this profile Perceptor is subject to data drops, which prevents cuVSLAM from generating robot poses at a good frequency, thereby affecting ESDF and costmap construction in nvblox and ultimately impacting navigation.
Setting the QoS to DEFAULT for cuVSLAM avoids these data drops, and the issue is resolved.
Here is the explanation of the DEFAULT QoS from the official ROS 2 documentation:

> In order to make the transition from ROS 1 to ROS 2 easier, exercising a similar network behavior is desirable. By default, publishers and subscriptions in ROS 2 have "keep last" for history with a queue size of 10, "reliable" for reliability, "volatile" for durability, and "system default" for liveliness. Deadline, lifespan, and lease durations are also all set to "default".
Thank you for your explanation of the cause of this issue.
I now understand what the bottleneck was.
I also set a goal from the terminal with your instruction below.

> The issue still exists even if you publish a goal pose message to the /goal_pose topic for now. But I can share the command with you: ros2 topic pub --once /goal_pose geometry_msgs/msg/PoseStamped "{header: {stamp: {sec: 0, nanosec: 0}, frame_id: 'odom_vslam'}, pose: {position: {x: 1, y: 1, z: 0}, orientation: {x: 0, y: 0, z: 0, w: 1}}}"

I can't thank you enough. I will put on a good demonstration at the exhibition.