3D SLAM on a drone with a Jetson TX2

Dear people, I have read this page:

where it talks about the ZED's SLAM:

To perform tracking, the ZED uses a novel depth-based SLAM (Simultaneous Localization and Mapping) technology that was developed from scratch and optimized to run at high speed. The camera maps the three-dimensional world in front of it in real time and understands how the user moves through space. On desktop and laptop GPUs, tracking runs at camera frame-rate, which means you can get up to 100Hz tracking frequency in WVGA mode.
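Stereolabs has not published how its depth-based tracking works internally, but the general idea behind frame-rate positional tracking is simple: each new camera frame yields an estimate of the motion since the previous frame, and chaining those relative motions gives the camera's pose in the world. A toy 2D sketch of that pose-chaining step (purely illustrative, not the ZED's actual algorithm; the `compose` helper and the synthetic per-frame motions are made up for the example):

```python
import math

def compose(pose, delta):
    """Compose a 2D pose (x, y, theta) with a relative motion
    (dx, dy, dtheta) expressed in the pose's own frame."""
    x, y, th = pose
    dx, dy, dth = delta
    return (x + dx * math.cos(th) - dy * math.sin(th),
            y + dx * math.sin(th) + dy * math.cos(th),
            th + dth)

# Pretend each camera frame estimated this much motion since the last frame:
# drive 1 m forward in 10 steps, turn 90 degrees, drive 1 m forward again.
per_frame_motions = ([(0.1, 0.0, 0.0)] * 10
                     + [(0.0, 0.0, math.pi / 2)]
                     + [(0.1, 0.0, 0.0)] * 10)

pose = (0.0, 0.0, 0.0)
for delta in per_frame_motions:
    pose = compose(pose, delta)

print(pose)  # ends near (1.0, 1.0, pi/2)
```

A real SLAM system does the same composition in 3D (rotation matrices or quaternions instead of a single angle) and additionally corrects accumulated drift against its map.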

The positional tracking API is available in ZED SDK 1.0 for Windows, Linux, and Jetson. The tracking API also supports Unity, ROS, and other third-party libraries. Check out our samples on GitHub and get started.

The question is:

Where can I get the source code of this depth-based SLAM (Simultaneous Localization and Mapping) technology, and how can I integrate it into ROS?

Or is there an already-implemented SLAM solution for a drone with a ZED and a Jetson TX2?

Thank you

Hi chavezalfredo, there are two real-time open-source SLAM solutions already in ROS that the Jetson community has been able to run on the TX2:

I recommend looking into these packages. Also, if you need a ROS wrapper for the Stereolabs ZED SDK, see the zed-ros-wrapper node.
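For reference, a minimal launch file that brings up the ZED node from zed-ros-wrapper might look like the sketch below. This assumes the wrapper builds as the `zed_wrapper` package and ships a `zed.launch` file, as in the Stereolabs GitHub repository at the time of writing; the file name `my_zed_slam.launch` is just an example.

```xml
<!-- my_zed_slam.launch (hypothetical example file name):
     starts the ZED camera node, which publishes depth images,
     a point cloud, and visual odometry that a SLAM package can consume. -->
<launch>
  <include file="$(find zed_wrapper)/launch/zed.launch" />
</launch>
```

After cloning zed-ros-wrapper into your catkin workspace and building it, you would start this with `roslaunch my_zed_slam.launch` and point your SLAM package at the topics the ZED node publishes.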