For visualizing the output of a depth camera within Isaac Sim, the only way I've found is setting the viewport's render annotator to 'distance to camera'.
However, the color scale is fixed to the 'meters per unit' root layer setting, as explained in this answer. Thank you very much @ahaidu!
The issue is that changing the 'meters per unit' setting does adjust the color scale, but it also has unwanted side effects: it changes physics behavior, textures, and model importing. This parameter is used by other parts of Isaac Sim that should not change just to get a better color scale for the depth image.
Is there a better way to visualize the output of a depth camera in IsaacSim?
It seems like a core use case for every robotics application built on Isaac Sim!
Thanks for the reply! The goal is a live visualization for interactive development and demonstrations.
Saving the data to disk via replicator is not enough for this application.
You could take a look at how the sensor images are published in the UI and publish the scaled depth data yourself: /omni.syntheticdata/omni/syntheticdata/scripts/visualizer_window.py uses the ImageProvider class from /omni.ui/omni/ui/_ui.pyi to draw the images in the UI.
Here are the relevant snippets on setting this up:
# create the UI
with ui.VStack():
    ui.Label(sensor, alignment=ui.Alignment.CENTER, height=20)
    ui.ImageWithProvider(
        self._activated_visualization_data[sensor], alignment=ui.Alignment.CENTER_TOP
    )
    ui.Spacer(height=20)
# ...
# set the data
self._activated_visualization_data[sensor] = ui.ImageProvider()
self._activated_visualization_data[sensor].set_image_data(...)
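For the scaling step itself, one option is to normalize the raw depth array to 8-bit pixels yourself before handing it to the provider, so the color scale adapts to the scene content instead of the stage's 'meters per unit' setting. Here is a minimal sketch of such a conversion; the function name `depth_to_rgba` and the min/max normalization strategy are my own assumptions, not an Isaac Sim API:

```python
import numpy as np

def depth_to_rgba(depth, near=None, far=None):
    """Normalize a float depth image to 8-bit RGBA bytes for display.

    near/far default to the image's own min/max, so the color scale
    stretches over the actually observed depth range (assumption:
    per-frame auto-scaling is acceptable for live visualization).
    """
    depth = np.asarray(depth, dtype=np.float32)
    if near is None:
        near = float(np.nanmin(depth))
    if far is None:
        far = float(np.nanmax(depth))
    # avoid division by zero for a flat depth image
    scale = max(far - near, 1e-6)
    norm = np.clip((depth - near) / scale, 0.0, 1.0)
    gray = (norm * 255).astype(np.uint8)
    # replicate the gray channel into RGBA with full alpha
    rgba = np.dstack([gray, gray, gray, np.full_like(gray, 255)])
    return rgba.tobytes()
```

You would then pass the resulting bytes (plus width/height) to the provider's set_image_data call from the snippet above; check the ImageProvider stubs in _ui.pyi for the exact signature your Isaac Sim version expects. Fixing near/far to known scene bounds instead of per-frame min/max gives a temporally stable color scale, at the cost of less contrast.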