Custom RTX Lidar configuration no longer working in 5.0.0

Isaac Sim Version

5.0.0

Operating System

Ubuntu 24.04

GPU Information

Model: GeForce RTX 3050 Ti Laptop
Driver Version: 550.163.01

Topic Description

Detailed Description

I’m trying to use a custom lidar configuration. Up until 4.5.0 I could add the parent directory containing the custom_lidar.json to app.sensors.nv.lidar.profileBaseFolder in /source/extensions/isaacsim.sensors.rtx/config/extension.toml and call IsaacSensorCreateRtxLidar with my custom settings. This no longer works and throws:

2025-09-11T15:02:31Z [32,911ms] [Error] [isaacsim.sensors.rtx.plugin] CUDA error 700: cudaErrorIllegalAddress - an illegal memory access was encountered at ../../../source/extensions/isaacsim.sensors.rtx/nodes/IsaacSimSensorsRTXCuda.cu:304
2025-09-11T15:02:31Z [32,911ms] [Error] [isaacsim.ros2.bridge.plugin] CUDA error 700: cudaErrorIllegalAddress - an illegal memory access was encountered at ../../../source/extensions/isaacsim.ros2.bridge/nodes/OgnROS2PublishPointCloud.cpp:175
2025-09-11T15:02:31Z [32,911ms] [Error] [isaacsim.ros2.bridge.plugin] CUDA error 700: cudaErrorIllegalAddress - an illegal memory access was encountered at ../../../source/extensions/isaacsim.ros2.bridge/nodes/OgnROS2PublishPointCloud.cpp:177
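For reference, the setting I changed in extension.toml looked roughly like this. This is a sketch from memory, not the exact file contents: the stock data path and the list form of the setting are how I recall them, and /home/user/my_lidar_configs is a placeholder for the parent directory containing custom_lidar.json.

```toml
[settings]
# Folders searched for lidar profile JSON files.
# The second entry (placeholder path) is the parent directory of custom_lidar.json.
app.sensors.nv.lidar.profileBaseFolder = [
    "${app}/../extensions/isaacsim.sensors.rtx/data/lidar_configs/",
    "/home/user/my_lidar_configs/",
]
```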

In a second attempt, I tried adding another configuration alongside the existing ones (e.g. “OS0_REV6_32ch10hz1024res” in the “OS0” directory) and registering it in supported_lidar_configs.py. This crashes with a segfault, both with an existing config copied from another file and with my custom one (see the steps below).

Lastly, I tried modifying an existing lidar config, e.g. “OS1_REV6_32ch20hz1024res”, but this seems to have no effect on the result.

Is this a bug or is there a different way of using custom RTX Lidar configurations in 5.0.0?

Steps to Reproduce

  1. From the root of the latest compiled isaacsim repo, run ./_build/linux-x86_64/release/python.sh source/standalone_examples/api/isaacsim.ros2.bridge/rtx_lidar.py
    In another ROS2 Jazzy-sourced terminal, ros2 topic echo /point_cloud will show data.

  2. Change config="Example_Rotary" to config="OS0_REV6_128ch10hz512res" and repeat step 1. This will work correctly as before.

  3. Make a copy of “OS0_REV6_128ch10hz512res” in source/extensions/isaacsim.sensors.rtx/data/lidar_configs/Ouster/OS0 and rename it “OS0_REV6_128ch10hz512res2”.

  4. Add “OS0_REV6_128ch10hz512res2” to “/Isaac/Sensors/Ouster/OS0/OS0.usd” in source/extensions/isaacsim.sensors.rtx/python/impl/supported_lidar_configs.py.

  5. In rtx_lidar.py change the config to “OS0_REV6_128ch10hz512res2”. Run ./build.sh.

  6. Repeat step 1 → Crash with segfault.
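For clarity, the edit in step 4 amounts to registering the new config name under the OS0 asset path. A minimal, hypothetical sketch of that mapping is below; the real file's structure may differ, and the lookup helper is mine, added only to illustrate how a config name would resolve to its asset:

```python
# Hypothetical sketch of the name -> asset registration in
# supported_lidar_configs.py (the real structure may differ).
SUPPORTED_LIDAR_CONFIGS = {
    "/Isaac/Sensors/Ouster/OS0/OS0.usd": [
        "OS0_REV6_128ch10hz512res",
        "OS0_REV6_128ch10hz512res2",  # the copied config from step 3/4
    ],
}

def find_asset_for_config(config_name):
    """Return the USD asset path whose config list contains config_name, else None."""
    for asset_path, configs in SUPPORTED_LIDAR_CONFIGS.items():
        if config_name in configs:
            return asset_path
    return None

print(find_asset_for_config("OS0_REV6_128ch10hz512res2"))
# -> /Isaac/Sensors/Ouster/OS0/OS0.usd
```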

Thank you very much for your help!

Hi @konstantin.sommer we recently released 5.1.0 OSS on GitHub. Could you please try that and see if this issue persists? Thanks.

Hi @zhengwang, thank you for the response! The issue unfortunately persists.

When repeating the steps above on 5.1.0, step 6 no longer causes a segfault but defaults to the Example_Rotary configuration instead of my custom one. The logs look like this:

[Error] [rtx.rtxsensor.plugin] RtxSensor misconfiguration: No schema-based sensor parameters were provided and no legacy config string was provided.
[Warning] [omni.sensors.nv.lidar.lidar_core.plugin] No config file given - use default Example_Rotary

Thank you for your help!

Hi @konstantin.sommer sorry for my late reply, and thanks for bringing this issue up. I am able to replicate this issue on my end. Let me reach out to the internal team about it.

The IsaacSensorCreateRtxLidar command assumes the paths in supported_lidar_configs.py are hosted on S3 or isaac-dev.ov.nvidia.com. You can pass force_camera_prim=True and config=[name of file without the .json suffix] to the command, and it should set the Camera prim attributes appropriately.
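A rough sketch of that call, based on the suggestion above. This must run inside the Isaac Sim Python environment; the prim path is a placeholder, and "custom_lidar" stands for a custom_lidar.json located in one of the profileBaseFolder directories:

```python
import omni.kit.commands

# Sketch of the suggested workaround (run inside Isaac Sim).
# force_camera_prim=True avoids the supported_lidar_configs.py asset lookup.
_, sensor = omni.kit.commands.execute(
    "IsaacSensorCreateRtxLidar",
    path="/CustomLidar",    # placeholder prim path
    parent=None,
    config="custom_lidar",  # JSON file name without the .json suffix
    force_camera_prim=True,
)
```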

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.