Omni.Anim.People + Lidar

I'm currently using a wheeled robot with a lidar to do SLAM. Following the guide, I added the omni.anim.people extension to generate richer tests for the navigation stack, only to find that the animated actors are not detected by the lidar (even if I add collisions to their meshes, the collision geometry does not move in conjunction with the animated meshes). I saw in the thread "Omni.Anim.People People can not be detected by Lidars" that an RTX lidar could be used instead, but I found no way to construct the Omni action graph to publish the laser scan using it.

I tried the setup shown in https://docs.omniverse.nvidia.com/app_isaacsim/_images/isaac_sim_sensors_rtx_based_lidar_node_overview.png and connected it to the ROS2 Publish Laser Scan graph node, but it does not work.

Hi @tomas.lobo.it, please take a look at this. You should be able to set up and use an RTX Lidar sensor, which will publish the lidar scan over to the navigation stack. People detections should come through on the occupancy map/laser scan with it.
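
For reference, here is a rough sketch of one way to drive the ROS 2 laser scan publishing from a standalone script, following the pattern of the RTX Lidar standalone examples. The command, writer, config, and topic names below are my assumptions for a recent Isaac Sim (2023.x with the ros2_bridge extension enabled) and may differ in other versions:

```python
# Sketch only: create an RTX Lidar and publish a LaserScan over ROS 2
# using a Replicator writer, as in the standalone RTX Lidar examples.
import omni.kit.commands
import omni.replicator.core as rep
from pxr import Gf

# Create an RTX Lidar prim (the "Example_Rotary" config and pose are illustrative).
_, sensor = omni.kit.commands.execute(
    "IsaacSensorCreateRtxLidar",
    path="/sensor",
    parent=None,
    config="Example_Rotary",
    translation=(0.0, 0.0, 1.0),
    orientation=Gf.Quatd(1.0, 0.0, 0.0, 0.0),
)

# A 1x1 render product drives the RTX lidar simulation.
render_product = rep.create.render_product(sensor.GetPath(), [1, 1])

# Attach the LaserScan writer registered by the ros2_bridge extension
# (topic and frame names are illustrative).
writer = rep.writers.get("RtxLidarROS2PublishLaserScan")
writer.initialize(topicName="scan", frameId="sim_lidar")
writer.attach([render_product])
```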

Update: the RTX Lidar does detect the actors' visuals properly, though it has an issue. I have been checking the RTX lidar documentation, but no matter what values I set for '[start,end]AzimuthDeg' or 'valid[Start,End]AzimuthDeg', I can't get the result to be anything other than 360 deg.

The other issue (though it does not change much) is that if I change numberOfEmitters to 1 and leave only a single emitter with "azimuthDeg": 0.0, "elevationDeg": 0.0 and "fireTimeNs": 0, nothing works and it just crashes. Any help would be appreciated.

@rchadha friendly ping

Same here for the following issue:
If I change numberOfEmitters to 1 and leave only a single emitter with "azimuthDeg": 0.0, "elevationDeg": 0.0 and "fireTimeNs": 0, nothing works and it just crashes.

@mcarlson1 can you please help with the RTX Lidar issues mentioned above?

Your link does not work, so I can only guess what is going on. However, let me mention a few common problems.

See this page:
https://docs.omniverse.nvidia.com/isaacsim/latest/isaac_sim_sensors_rtx_based_lidar/node_overview.html

  1. Make sure you are not on Windows; the RTX lidar only works on Linux.
  2. Make sure you can run ./python.sh standalone_examples/api/omni.isaac.debug_draw/rtx_lidar.py, just to confirm that the RTX lidar is working for you.
  3. When you edit a lidar configuration file, as mentioned in this documentation, make sure that your emitter state parameters (here) are all arrays of the same length, and that this length matches the numberOfEmitters parameter; otherwise you will get a crash every time (see the sanity-check sketch after this list).
  4. If you make your own configuration file, make sure the terminal output does not report that it can't be found. It should be in one of the folders set with the app.sensors.nv.lidar.profileBaseFolder carb setting from the ./exts/omni.isaac.sensor/config/extension.toml file.
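
On point 3, here is a minimal sanity-check sketch for a profile JSON, assuming the emitter keys quoted earlier in this thread ("azimuthDeg", "elevationDeg", "fireTimeNs"); the exact nesting and key names of your profile may differ by version:

```python
# Sketch: verify that each emitter array in a lidar profile JSON has
# exactly numberOfEmitters entries, since a mismatch causes a crash.
import json

def check_emitter_lengths(profile_path: str) -> None:
    with open(profile_path) as f:
        cfg = json.load(f)

    # Some profiles nest the parameters under a "profile" object; handle both layouts.
    profile = cfg.get("profile", cfg)
    n = profile["numberOfEmitters"]
    emitters = profile["emitters"]

    for key in ("azimuthDeg", "elevationDeg", "fireTimeNs"):
        length = len(emitters[key])
        status = "OK" if length == n else f"MISMATCH (numberOfEmitters is {n})"
        print(f"{key}: {length} entries -> {status}")

check_emitter_lengths("MyLidar.json")  # file name is illustrative
```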

1 → All the tests were run on Ubuntu.
2 → I can run rtx_lidar without issues; I'm currently using it at 360°.
3 → All the array lengths matched numberOfEmitters.
4 → The file was found.

My question for now would be: can you make the RTX lidar cover 180 degrees instead of 360, for example? I was not able to.

Hi @tomas.lobo.it - Yes, you can adjust the field of view of the RTX Lidar sensor in Isaac Sim by modifying the configuration file associated with the sensor.
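
As an illustration rather than the authoritative schema, here is a small sketch that derives a 180-degree profile from an existing one by editing the azimuth limits mentioned earlier in the thread; whether your lidar model reads startAzimuthDeg/endAzimuthDeg, the valid* variants, or both depends on the profile version, and the file names below are illustrative:

```python
# Sketch: copy an existing lidar profile and restrict its azimuth range to 180 degrees.
import json

with open("Example_Rotary.json") as f:  # source profile name is an assumption
    cfg = json.load(f)

# Some profiles nest the parameters under a "profile" object; handle both layouts.
profile = cfg.get("profile", cfg)
profile["startAzimuthDeg"] = -90.0
profile["endAzimuthDeg"] = 90.0
profile["validStartAzimuthDeg"] = -90.0
profile["validEndAzimuthDeg"] = 90.0

with open("Example_Rotary_180.json", "w") as f:
    json.dump(cfg, f, indent=4)
```

Save the new file in one of the folders listed in the app.sensors.nv.lidar.profileBaseFolder carb setting, then point the sensor's config parameter at the new profile name.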