There are some interesting examples in the SDK, e.g. StereoDepthDNN and CoarseToFineStereoDepth, which require a ZED camera to run.
I have seen that other examples (e.g. Apriltags) support a RealSense camera, so I'm wondering whether these examples could also run with some minor adaptations.
The RealSense D435 provides only a single RGB camera image, so neither of the two stereo depth algorithms will work with it out of the box.
All examples requiring a ZED camera need a physical ZED camera for input.
I'm just wondering if there's a way to simulate the ZED camera's inputs through Isaac Sim?
Apparently Carter uses a ZED camera alongside a v4l2 lidar, but I've had little success adapting the carter_sim app to provide the correct simulated output for these examples.
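For context on what I tried: Isaac SDK apps are wired up as JSON graphs, so my rough idea was to replace the ZED driver node's output edges with edges from the simulation bridge, along these lines (this is only a sketch — the node, component, and channel names below are illustrative guesses, not actual identifiers from the SDK; the real channel names would have to be taken from the stereo depth app's own *.app.json):

```json
{
  "graph": {
    "edges": [
      {
        "source": "simulation.interface/output/left_camera",
        "target": "stereo_depth.subgraph/interface/left_image"
      },
      {
        "source": "simulation.interface/output/right_camera",
        "target": "stereo_depth.subgraph/interface/right_image"
      }
    ]
  }
}
```

The idea being that, as long as the simulated channels publish the same message types the ZED driver would, the downstream stereo nodes shouldn't care where the images come from. Is that the right approach, or is there something ZED-specific (calibration, message proto) that blocks it?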
Can anyone help out?