I currently have a LiDAR sensor set up in a scene to collect point cloud and semantic data. Running on a 3090 Ti with everything enabled, I get around 4 fps.
I understand that with a dense LiDAR the computational cost of getting the semantic value at each ray is high. To combat this I wanted to take "snapshots" at time intervals. However, when I attempt this by enabling/disabling either the LiDAR itself or its semantic mode, I get either no point clouds or no semantic information, respectively. My assumption is that the state change takes time, so when I request the data through the interface it has not actually been collected yet.
My question is: does anyone know of a method to efficiently sample at n Hz by enabling/disabling the LiDAR, while avoiding the issue above?
Also, if anyone knows a good way to increase fps without a "snapshot" technique or reducing the LiDAR's sampling density, that would be a great help. (This is probably not possible.)
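For clarity, the n Hz gating itself is trivial; here is a minimal sketch of the rate gate I have in mind (all names here are my own, nothing from the Isaac Sim SDK):

```python
import time

class SnapshotGate:
    """Fires at most once per 1/freq_hz seconds.

    The idea is to call ready() every sim frame and only enable the
    LiDAR / read its data on the frames where it returns True.
    """

    def __init__(self, freq_hz, clock=time.monotonic):
        self.period = 1.0 / freq_hz  # seconds between snapshots
        self.clock = clock           # injectable for testing
        self.next_time = clock()     # fire immediately on first call

    def ready(self):
        now = self.clock()
        if now >= self.next_time:
            self.next_time = now + self.period
            return True
        return False
```

The open question is what to do between `ready()` returning True and the data actually being available, which is where my enable/disable attempts fall over.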
Below are some code snippets:
```python
import omni.kit.commands
from omni.isaac.range_sensor import _range_sensor

result, prim = omni.kit.commands.execute(
    "RangeSensorCreateLidar",
    path=lidarPath,
    parent="/World",
    min_range=0.4,
    max_range=100.0,
    draw_points=False,
    draw_lines=False,
    horizontal_fov=360.0,
    vertical_fov=60.0,
    horizontal_resolution=0.4,
    vertical_resolution=0.4,
    rotation_rate=0,
    high_lod=True,
    yaw_offset=0.0,
    enable_semantics=True,
)
self.lidarInterface = _range_sensor.acquire_lidar_sensor_interface()
```
Snippet to retrieve point cloud and semantic data (called at the target frequency or by a button):
```python
pointcloud = self.lidarInterface.get_point_cloud_data("/World" + self.lidarPath)
semantics = self.lidarInterface.get_semantic_data("/World" + self.lidarPath)
```
Snippet used to enable the LiDAR/semantics:
```python
lidar_att = prim.GetAttribute("enabled")
# lidar_att = prim.GetAttribute("enableSemantics")
lidar_att.Set(1)
```
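To show what I mean by "the data hasn't been collected yet", here is a generic sketch of the toggle-and-read flow I am trying to get working: enable the sensor, let a couple of sim frames run so the sensor actually produces data, read, then disable. Everything below is my own scaffolding; in Isaac Sim I would expect `wait_frame` to be something like `omni.kit.app.get_app().next_update_async` (my assumption), `enable`/`disable` to set the `enabled` attribute as in the snippet above, and `read` to call `get_point_cloud_data`/`get_semantic_data`.

```python
import asyncio

async def snapshot(enable, disable, read, wait_frame, settle_frames=2):
    """Toggle a sensor on, wait for it to produce data, read, toggle off.

    enable/disable/read are callables supplied by the caller;
    wait_frame is an awaitable factory that resolves after one sim frame.
    settle_frames is how many frames to let run before reading
    (how many are actually needed is exactly my question).
    """
    enable()
    for _ in range(settle_frames):
        await wait_frame()  # give the sensor time to fill its buffers
    data = read()
    disable()
    return data
```

With plain stand-in callables this runs as expected, so the question is only whether waiting frames like this is the sanctioned way to do it in Isaac Sim, and how many frames the LiDAR needs.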