RTX Lidar as render product

Hi everyone,
I have a requirement to record a 360-degree point cloud.
This should be possible with an RTX Lidar, which lets me imitate a lidar via a config file and publish the point cloud over ROS.
But I was hoping there would be a way to record the point cloud from the RTX Lidar render product rather than just from the camera.

For example:

# Roughly what I am after; command name and config filled in from the RTX lidar docs
import omni.kit.commands
import omni.replicator.core as rep
from pxr import Gf

_, sensor = omni.kit.commands.execute(
    "IsaacSensorCreateRtxLidar",
    path="/sensor",
    config="Example_Rotary",
    translation=(0, 0, 1.0),
    orientation=Gf.Quatd(0.5, 0.5, -0.5, -0.5),
)
render_product = rep.create.render_product(sensor.GetPath(), [1, 1])

pointcloud_annotator = rep.AnnotatorRegistry.get_annotator("pointcloud")
pointcloud_annotator.attach([render_product])

Can someone please guide me on how I can achieve this?


Does anyone have any idea?

I’m going to move this to the IsaacSim channel. I think they’ll be able to better assist you there.

Hello! Hopefully this reply is not too late to be of some help.

There are a couple ways to do this:

  1. Use ROS tools to record the point cloud data. This is well documented and should be straightforward; see: 12. Publish RTX Lidar Point Cloud — Omniverse IsaacSim latest documentation

  2. Create your own node, using the printRTXLidar node as a base: RTX Lidar Node Descriptions — Omniverse IsaacSim latest documentation

I would suggest (1).
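Whichever route you take, each frame ultimately hands you an array of XYZ points that you need to persist somewhere. As a minimal, self-contained sketch (plain Python, no Isaac Sim or ROS APIs; `write_ascii_pcd` and the example points are my own illustration, not part of any SDK), here is one way to dump a frame to an ASCII PCD file that tools such as CloudCompare can open:

```python
def write_ascii_pcd(path, points):
    """Write a list of (x, y, z) tuples as an ASCII PCD v0.7 file."""
    header = "\n".join([
        "# .PCD v0.7 - Point Cloud Data file format",
        "VERSION 0.7",
        "FIELDS x y z",
        "SIZE 4 4 4",
        "TYPE F F F",
        "COUNT 1 1 1",
        f"WIDTH {len(points)}",
        "HEIGHT 1",
        "VIEWPOINT 0 0 0 1 0 0 0",
        f"POINTS {len(points)}",
        "DATA ascii",
    ])
    with open(path, "w") as f:
        f.write(header + "\n")
        for x, y, z in points:
            f.write(f"{x} {y} {z}\n")

# One fake "frame" of three points standing in for annotator output
write_ascii_pcd("frame_000.pcd", [(0.0, 0.0, 1.0), (1.0, 0.0, 1.0), (0.0, 1.0, 1.0)])
```

In practice you would call this once per simulation tick with the points the annotator (or your ROS subscriber) hands you, numbering the files by frame.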

Although I may have misunderstood your question. If you just want to get the points to render on screen, you need a section of code like this:

# RTX sensors are cameras and must be assigned to their own render product
_, render_product_path = create_hydra_texture([1, 1], sensor.GetPath().pathString)

# Create the debug draw pipeline in the post process graph
writer = rep.writers.get("RtxLidar" + "DebugDrawPointCloud")
writer.attach([render_product_path])

Isaac Sim 2023.1.0 now has an IsaacCreateRTXLidarScanBuffer node, which keeps a running update of the full 360-degree scan.
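If you are on an older release, you can emulate that behaviour yourself. The sketch below is plain Python with no Isaac Sim APIs; the azimuth-binning scheme is my own illustration of the idea (keep a rolling buffer and overwrite each angular sector as fresh points arrive), not what the node actually does internally:

```python
import math

class ScanBuffer:
    """Keep the newest points for each azimuth bin, emulating a running
    360-degree scan assembled from partial per-frame lidar returns."""

    def __init__(self, bins=360):
        self.bins = bins
        self.slots = {}  # azimuth bin index -> list of (x, y, z)

    def add_frame(self, points):
        # Group the incoming points by azimuth bin, then overwrite
        # only the bins this frame actually covered.
        fresh = {}
        for x, y, z in points:
            azimuth = math.degrees(math.atan2(y, x)) % 360.0
            bin_idx = int(azimuth * self.bins / 360.0) % self.bins
            fresh.setdefault(bin_idx, []).append((x, y, z))
        self.slots.update(fresh)

    def full_scan(self):
        # Flatten all bins into one point list.
        return [p for pts in self.slots.values() for p in pts]

buf = ScanBuffer(bins=4)
buf.add_frame([(1.0, 0.0, 0.0)])  # ~0 deg sector
buf.add_frame([(0.0, 1.0, 0.0)])  # ~90 deg sector
buf.add_frame([(2.0, 0.1, 0.0)])  # ~0 deg again: replaces the first point
print(len(buf.full_scan()))       # 2 points: one per populated bin
```

Feed it one frame per tick and read `full_scan()` whenever you want the current 360-degree cloud.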
