I am currently using a wheeled robot with a lidar to do SLAM. I added the omni.anim.people extension following the guide to generate richer tests with the navigation stack, only to find that the animated actors are not detected by the lidar (even if I add collisions to their meshes, since the collision geometry does not move in conjunction with the animated meshes). I saw in Omni.Anim.People People can not be detected by Lidars that an RTX lidar could be used instead, but I found no way to construct the OmniGraph action graph to publish the laser scan with it.
I tried the graph shown in https://docs.omniverse.nvidia.com/app_isaacsim/_images/isaac_sim_sensors_rtx_based_lidar_node_overview.png and connected it to the ROS2 Publish Laser Scan graph node, but it does not work.
Hi @tomas.lobo.it, please take a look at this. You should be able to set up an RTX Lidar sensor that publishes the lidar scan to the navigation stack. People detections should then come through on the occupancy map/laser scan.
Update: the RTX lidar does detect the actors' visuals properly, though there is an issue. I have been checking the RTX lidar documentation, but no matter what values I set for '[start,end]AzimuthDeg' or 'valid[Start,End]AzimuthDeg', I can't get the result to be anything other than 380 deg.
The other issue (though it does not change much) is that if I change numberOfEmitters to 1 and leave only a single emitter with "azimuthDeg": 0.0, "elevationDeg": 0.0 and "fireTimeNs": 0, nothing works and the app just crashes. Any help would be appreciated.
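For reference, here is a minimal sketch of the relevant part of a lidar config JSON. This is an assumption based on the schema of the config files shipped with recent Isaac Sim builds (field names such as "emitterStates" and "emitterStateCount" come from those files and may differ between versions). One thing worth checking: in the shipped configs, each emitterStates entry holds arrays with one value per emitter, so with numberOfEmitters set to 1 every array should contain exactly one element; a length mismatch between those arrays and numberOfEmitters is one plausible cause of the crash.

```json
{
  "profile": {
    "scanType": "rotary",
    "rotationRateHz": 10.0,
    "nearRangeM": 0.1,
    "farRangeM": 100.0,
    "startAzimuthDeg": 0.0,
    "endAzimuthDeg": 360.0,
    "validStartAzimuthDeg": 0.0,
    "validEndAzimuthDeg": 90.0,
    "numberOfEmitters": 1,
    "emitterStateCount": 1,
    "emitterStates": [
      {
        "azimuthDeg": [0.0],
        "elevationDeg": [0.0],
        "fireTimeNs": [0]
      }
    ]
  }
}
```

Field names and nesting are taken from the example configs, not from a schema reference, so comparing against one of the working configs that ships with your Isaac Sim version before editing is advisable.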
Same here for the following issue:

> if I change the numberOfEmitters to 1 and leave only a single emitter with "azimuthDeg": 0.0, "elevationDeg": 0.0 and "fireTimeNs": 0, nothing works and it just crashes.