I just noticed that the depth camera image and lidar scans treat objects with a transparent material just like any opaque material (e.g. omniverse://localhost/NVIDIA/Materials/Base/Glass/Clear_Glass.mdl).
In reality, glass would usually not be detected by lidars or depth cameras.
This sparked the question of whether the material properties (especially reflection) of objects in Isaac Sim are taken into account at all when computing the result of a lidar scan or a depth image. I have the impression that this is currently not the case; if it could be supported in the future, that would be really great!
We are planning to add RTX-based sensor support in the future, which will allow the materials applied to geometry to influence the lidar simulation.
Currently the lidar sensor uses PhysX raycasts, so it interacts with the collision geometry instead of the visual geometry, which limits how materials can influence the lidar simulation.
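In the meantime, one possible workaround that follows from this: because the raycasts only see collision geometry, you can make a glass object effectively transparent to the lidar by disabling its collision through the UsdPhysics schema (at the cost of the prim no longer participating in physics). Below is a minimal sketch, assuming it is run from the Script Editor inside Isaac Sim and that there is a glass prim at the hypothetical path /World/GlassPanel with a collision API applied:

```python
# Sketch of an interim workaround (the prim path /World/GlassPanel is hypothetical):
# the PhysX lidar raycasts against collision geometry, so disabling collision on a
# glass prim lets the beam pass through it. Note that this also removes the prim
# from rigid-body collisions.
import omni.usd
from pxr import UsdPhysics

stage = omni.usd.get_context().get_stage()
glass_prim = stage.GetPrimAtPath("/World/GlassPanel")  # hypothetical glass prim

# Only touch the prim if a collision API has actually been applied to it
if glass_prim.IsValid() and glass_prim.HasAPI(UsdPhysics.CollisionAPI):
    # physics:collisionEnabled = False -> PhysX raycasts (and thus the lidar) ignore it
    UsdPhysics.CollisionAPI(glass_prim).CreateCollisionEnabledAttr(False)
```

Note that this only affects the PhysX-based lidar; the depth camera gets its values from the renderer and the visual geometry, so the glass prim will still appear there.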