LIDAR Parallel Sim

Hi,
Can I get LIDAR depth data in a parallel RL setting? I was able to visualize the LIDAR in a parallel setting, but is there support for reading the depth data and storing it in the observation buffer?

Also, something I noticed: the LIDAR is attached in the USD of the mobile robot even when I don't read its data, but when I run it in the parallel sim, the LIDAR rays don't seem to get blocked by obstacles and the depth value is always 100.

I was able to make it work by setting “enable_scene_query” to true.
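In case it helps anyone else, here is a minimal sketch of how that flag can be set on the physics scene via USD. This assumes the scene prim lives at /World/physicsScene and that the PhysX scene exposes an enableSceneQuerySupport attribute; the exact prim path and attribute accessors may differ between Isaac Sim versions, so treat this as illustrative only:

```python
# Sketch: enable scene query support on the PhysicsScene prim so that the
# PhysX LIDAR raycasts can hit colliders when running the parallel/GPU pipeline.
# The prim path and attribute name below are assumptions from my own setup.
import omni.usd
from pxr import PhysxSchema

stage = omni.usd.get_context().get_stage()
scene_prim = stage.GetPrimAtPath("/World/physicsScene")  # adjust to your scene path

physx_scene = PhysxSchema.PhysxSceneAPI.Apply(scene_prim)
physx_scene.CreateEnableSceneQuerySupportAttr(True)
```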

Hello there, I also want to use LIDAR sensors for RL, but so far I don't have a clear idea of how. The docs and examples don't show how to access the camera and LIDAR sensors using the Tensor API. It would be great if you could share some snippets. Thanks!


Hi @btx0424 - Hope this documentation will be helpful to you: https://docs.omniverse.nvidia.com/app_isaacsim/app_isaacsim/ext_omni_isaac_range_sensor.html#isaac-sim-physx-lidar-example
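For anyone landing here later, the PhysX lidar example in that page boils down to something like the sketch below: acquire the lidar sensor interface and read per-beam depth, then copy it into a per-environment observation buffer. The prim paths, environment count, and beam count are placeholders for illustration, and the interface functions are the ones shown in the linked example, so double-check them against your Isaac Sim version:

```python
# Rough sketch of reading LIDAR depth into a per-env observation buffer,
# following the PhysX lidar example from the linked docs.
import numpy as np
from omni.isaac.range_sensor import _range_sensor

lidar_interface = _range_sensor.acquire_lidar_sensor_interface()

NUM_ENVS = 64    # placeholder: number of parallel environments
NUM_BEAMS = 360  # placeholder: beams per scan in the lidar config
obs_buf = np.zeros((NUM_ENVS, NUM_BEAMS), dtype=np.float32)

def fill_lidar_obs():
    # Assumes one lidar per cloned environment at /World/envs/env_<i>/Robot/Lidar.
    for i in range(NUM_ENVS):
        path = f"/World/envs/env_{i}/Robot/Lidar"
        # get_linear_depth_data returns per-ray depth in meters (per the docs).
        # It is a CPU-side call, so this loops over envs rather than using the
        # Tensor API directly.
        depth = lidar_interface.get_linear_depth_data(path)
        obs_buf[i] = np.asarray(depth, dtype=np.float32).reshape(-1)[:NUM_BEAMS]
    return obs_buf
```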