I came across this video on Isaac ROS Stereo Camera Depth Perception:
May I ask where the output images are streamed to and visualized? And is it possible to stream these images to Omniverse Isaac Sim via a ROS topic and visualize them there?
The live images are recorded on the robot with a HAWK camera, while the simulated images come from Isaac Sim. The overlays are generated by Isaac ROS DNN Stereo Disparity, Isaac ROS Image Pipeline, and Isaac ROS Proximity Segmentation, respectively. Each has its own simple Python script that colorizes the output for visualization in a window.