'OgnROS2RtxLidarHelperInternalState' object has no attribute 'sensor'

Hello,
I am using Isaac Sim 2023.1.0 and ROS Humble on Ubuntu 22.04.
I have added 2 PhysX Lidars to a mobile robot. The Action graph is -

It is giving me the following error when I run the simulation -

2023-11-15 09:02:33 [144,372ms] [Error] [omni.graph.core.plugin] /World/mir/ros_lidars/ros2_rtx_lidar_helper: [/World/mir/ros_lidars] Assertion raised in compute - 'OgnROS2RtxLidarHelperInternalState' object has no attribute 'sensor'
File "/home/labtower2/.local/share/ov/pkg/isaac_sim-2023.1.0/exts/omni.isaac.ros2_bridge/omni/isaac/ros2_bridge/ogn/python/nodes/OgnROS2RtxLidarHelper.py", line 76, in compute
if db.internal_state.sensor is None:

2023-11-15 09:02:33 [144,377ms] [Warning] [omni.fabric.plugin] No source has valid data array=0x1ed56520 usdValid=0 cpuValid=0
2023-11-15 09:02:33 [144,377ms] [Warning] [omni.fabric.plugin] No source has valid data array=0x1ed59488 usdValid=0 cpuValid=0
2023-11-15 09:02:33 [144,378ms] [Error] [omni.graph.core.plugin] /World/mir/ros_lidars/ros2_rtx_lidar_helper_01: [/World/mir/ros_lidars] Assertion raised in compute - 'OgnROS2RtxLidarHelperInternalState' object has no attribute 'sensor'
File "/home/labtower2/.local/share/ov/pkg/isaac_sim-2023.1.0/exts/omni.isaac.ros2_bridge/omni/isaac/ros2_bridge/ogn/python/nodes/OgnROS2RtxLidarHelper.py", line 76, in compute
if db.internal_state.sensor is None:

2023-11-15 09:02:36 [147,358ms] [Warning] [carb] Client omni.anim.graph.core.plugin has acquired [carb::tasking::ITasking v2.4] 100 times. Consider accessing this interface with carb::getCachedInterface() (Performance warning)
2023-11-15 09:02:37 [148,138ms] [Warning] [carb] Plugin interface for a client: omni.hydratexture.plugin was already released.

I cannot see the lidar topics in RViz either.

Is it because I am using PhysX lidars instead of RTX lidars? If so, how can I configure an RTX lidar with a 270 degree horizontal FOV? I am trying to simulate a SICK nanoScan3 laser scanner.

Thanks!

Hi @rudresh.lonkar,

I believe the RTX lidar produces the render product that the helper nodes you are using expect, but the PhysX lidars don't.

Instead, try using the Isaac Read Lidar Point Cloud node together with a ROS2 Publish Point Cloud node.
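If you'd rather wire that up from the script editor than click it together in the graph editor, an untested sketch of the graph spec is below. The node type names are taken from the omni.isaac.range_sensor and omni.isaac.ros2_bridge extensions in 2023.1, but the attribute names (execIn, pointCloudData, etc.) are my guesses from memory, so check them against the node's property panel before using this:

```python
# Untested sketch: per-lidar node/connection spec for one
# PhysX lidar -> ROS 2 point cloud pipeline. Pure data here;
# the Isaac Sim call that consumes it is shown in comments below.

def lidar_pointcloud_graph_spec(lidar_prim_path, topic, frame_id):
    """Build a node/connection spec for one lidar -> ROS 2 point cloud."""
    return {
        "create_nodes": [
            ("tick", "omni.graph.action.OnPlaybackTick"),
            ("read_lidar", "omni.isaac.range_sensor.IsaacReadLidarPointCloud"),
            ("publish", "omni.isaac.ros2_bridge.ROS2PublishPointCloud"),
        ],
        "connect": [
            # exec flow: tick -> read -> publish
            ("tick.outputs:tick", "read_lidar.inputs:execIn"),
            ("read_lidar.outputs:execOut", "publish.inputs:execIn"),
            # data flow: attribute names are assumptions, verify in the UI
            ("read_lidar.outputs:pointCloudData", "publish.inputs:pointCloudData"),
        ],
        "set_values": [
            ("read_lidar.inputs:lidarPrim", lidar_prim_path),
            ("publish.inputs:topicName", topic),
            ("publish.inputs:frameId", frame_id),
        ],
    }

# Prim path, topic, and frame below are made-up placeholders.
spec = lidar_pointcloud_graph_spec("/World/mir/lidar_front", "/scan_front", "laser_link_front")

# Inside Isaac Sim's script editor you would then hand this to og.Controller,
# roughly like:
#
#   import omni.graph.core as og
#   og.Controller.edit("/World/mir/ros_lidars", {
#       og.Controller.Keys.CREATE_NODES: spec["create_nodes"],
#       og.Controller.Keys.CONNECT: spec["connect"],
#       og.Controller.Keys.SET_VALUES: spec["set_values"],
#   })
```

Building the spec once per lidar also makes it easy to stamp out both of your sensors with different prim paths and topics.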

Hi @aspen.eyers,

Thanks for your reply! Your solution worked. It is also covered in the ROS 1 sensors tutorial:
https://docs.omniverse.nvidia.com/isaacsim/latest/ros_tutorials/tutorial_ros_sensors.html

But now I have a different problem. I am using two lidars, but both point clouds are transformed w.r.t. base_link / World.

How do I transform the lidar data from each laser_link to base_link with the proper transformation? I am planning on using a laser merger package in ROS to handle this.

That’s great to hear @rudresh.lonkar!
If that’s the solution to your original question feel free to click mark as solution ;)

You probably need to set up your transform tree properly and visualise in RViz from the map frame.
Or, if you're happy with a local frame, you can view in the base_link frame.

map → base_link (this tf comes from odometry) → laser_link (this comes from a published tf tree)
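A tf2 listener (or the laser merger you mentioned) does this chaining for you, but composing the frames by hand is a useful sanity check that the tree is set up the way you expect. A minimal numpy sketch, with made-up pose values rather than anything from your robot:

```python
# Composing the map -> base_link -> laser_link chain by hand with
# homogeneous transforms. All offsets and angles here are made-up
# examples, not values from any particular robot.
import numpy as np

def make_tf(x, y, yaw):
    """4x4 homogeneous transform for a planar pose (x, y, yaw)."""
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([
        [c,  -s,  0.0, x],
        [s,   c,  0.0, y],
        [0.0, 0.0, 1.0, 0.0],
        [0.0, 0.0, 0.0, 1.0],
    ])

# map -> base_link (this would come from odometry)
T_map_base = make_tf(2.0, 1.0, np.pi / 2)
# base_link -> laser_link (static mount offset of the lidar)
T_base_laser = make_tf(0.3, 0.0, 0.0)

# A point measured 1 m ahead of the lidar, in the laser_link frame...
p_laser = np.array([1.0, 0.0, 0.0, 1.0])
# ...re-expressed in the map frame by chaining the transforms.
p_map = T_map_base @ T_base_laser @ p_laser
```

If the scan looks wrong in RViz, comparing a few points against this kind of hand computation quickly tells you whether the published tf or the publishing frame_id is at fault.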

It looks like Isaac Sim can publish tf trees for you: 7.1.5. Transform Trees and Odometry — Omniverse IsaacSim latest documentation. I haven't used this functionality myself yet.

Hey @aspen.eyers,
That worked. I used the World frame from the simulation and could see the scans properly. I will still have to set up the map and odom frames for navigation.

Thanks for all your help!