Actual Carter Robot Integration

Apologies if this is not the right place for this question!
I am trying to assemble Carter with an AGX Orin Devkit as part of a lab project, and I was attempting to follow this guide (which has already been removed): https://docs.nvidia.com/isaac/doc/tutorials/carter_hardware.html

However, I am very new to robotics and all of this. I have been trying to find resources online to help with the overarching goal of allowing the robot to map, localize, and run SLAM within my physical lab.

Several issues that I am facing:

  • I am unable to find a ROS 2 segwayrmp package. I want to use keyboard input to control the robot via teleop tools, so that I can later move on to some form of SLAM/navigation stack that also publishes Twist messages to control the robot autonomously.
  • I am trying to create a map of the lab area, but I am not sure how to do so. I am able to record point-cloud data from the Velodyne lidar that is set up.
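For the keyboard-teleop part, the usual ROS 2 approach is a tool like teleop_twist_keyboard, which publishes geometry_msgs/Twist on /cmd_vel. As a minimal sketch of the idea (pure Python, no ROS dependency; the key bindings and speed values here are illustrative, not the actual package's implementation):

```python
# Sketch of the key-to-velocity mapping used by keyboard teleop tools such as
# teleop_twist_keyboard (which publishes geometry_msgs/Twist on /cmd_vel).
# Pure Python, no ROS dependency; key bindings and speeds are illustrative.

# Each key maps to (linear.x, angular.z) scale factors, similar to a typical
# non-holonomic teleop layout.
KEY_BINDINGS = {
    "i": (1.0, 0.0),   # forward
    ",": (-1.0, 0.0),  # backward
    "j": (0.0, 1.0),   # rotate left
    "l": (0.0, -1.0),  # rotate right
    "k": (0.0, 0.0),   # stop
}

def key_to_twist(key, linear_speed=0.5, angular_speed=1.0):
    """Translate a keypress into (linear.x, angular.z) in m/s and rad/s."""
    lin, ang = KEY_BINDINGS.get(key, (0.0, 0.0))  # unknown keys stop the robot
    return (lin * linear_speed, ang * angular_speed)

if __name__ == "__main__":
    print(key_to_twist("i"))  # forward at the default 0.5 m/s
```

A real teleop node would wrap this mapping in a loop that reads raw terminal input and publishes the resulting Twist at a fixed rate; the base driver then only needs to subscribe to /cmd_vel.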

There seems to be a new documentation site, but I would like to be able to do mapping, localization, and SLAM using ROS 2 packages (or any other way that doesn’t involve contacting an NVIDIA rep every time I want to build a map, because that sounds unintuitive to me, although please correct me if I am wrong).
https://docs.nvidia.com/isaac/doc/tutorials/mapping.html#mapping-services
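On the ROS 2 side, mapping from lidar is typically handled by a SLAM package (e.g. slam_toolbox fed by laser scans). For intuition about what such a pipeline produces, here is a toy sketch of projecting 2D points into an occupancy grid; this is pure illustration, not any NVIDIA or ROS API, and all parameter values are made up:

```python
# Toy sketch: projecting 2D lidar points into an occupancy grid, roughly the
# kind of map a SLAM package (e.g. slam_toolbox) maintains internally.
# Pure Python for illustration only; resolution/size values are arbitrary.

def build_occupancy_grid(points, resolution=0.25, size=20.0):
    """Mark grid cells containing at least one (x, y) point as occupied.

    points: iterable of (x, y) in metres, with the robot at the grid centre.
    Returns a dict {(col, row): True} of occupied cells.
    """
    half = size / 2.0
    grid = {}
    for x, y in points:
        if -half <= x < half and -half <= y < half:  # drop out-of-range hits
            col = int((x + half) / resolution)
            row = int((y + half) / resolution)
            grid[(col, row)] = True  # occupied
    return grid

if __name__ == "__main__":
    # Two nearby returns fall into the same cell; the third is elsewhere.
    scan = [(1.0, 0.0), (1.02, 0.01), (-3.0, 2.5)]
    print(len(build_occupancy_grid(scan)))
```

A real SLAM stack additionally estimates the robot pose while it maps (that is the "simultaneous" part) and marks free space along each ray, but the grid data structure is the same idea.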

Would appreciate all help and advice regarding this, thank you!

@ychua041 I am just another user and have limited experience in Isaac Sim. I am wondering if this is a tutorial that could be of help?
https://docs.omniverse.nvidia.com/isaacsim/latest/tutorial_external_ros_gems.html

There is also a dedicated ROS 2 tutorials section in the left-hand menu (in case you are interested). Unfortunately, I am not equipped to address your questions with any specificity, so I’ll let the mods/devs from the team provide the guidance you need moving forward.

@Simplychenable Thanks for the reply. I have looked through the link and I have a lot of questions regarding the whole process.

From what I understand, Isaac Sim is a simulation platform. So if I followed the steps, I would be running the software on the simulation platform instead of the physical robot… is that correct? How can I translate what I’m doing onto the physical robot?

That’s a loaded question with an answer outside my field of expertise, but I think you are on the right track. I’ll let the devs/mods address your questions in greater detail when they cross this thread. There are a lot of videos pertaining to Isaac Sim on the NVIDIA On-Demand playlist, though, so perhaps you could find your answer there.

I did come across a video on Isaac Cortex that touches on the overarching process and might have some of the information you are looking for (it will require an account).

If you’d like to dig into their whole library of videos, you can target your specific field through Search | NVIDIA On-Demand.

Side note: here is the page for the Carter hardware guide you were looking for. It seems they archived it.

https://docs.nvidia.com/isaac/archive/2021.1/doc/tutorials/carter_hardware.html

Hi @ychua041 - Here is the documentation that you can refer to: NVIDIA Isaac AMR - NVIDIA Docs

Thanks for the replies! Regarding the archived docs: they use an AGX Xavier instead of an Orin on the robot, and while I was following that guide before it was removed, one of the main roadblocks was that it relies on Isaac SDK, which only runs on Ubuntu 18.04, while the Orin runs on Ubuntu 20.04.

One of the main blockers I am facing now is getting the Segway RMP Lite 220 to move via keyboard keypresses. It seems there is a ROS 1 package on GitHub that interfaces with it, but no dice for ROS 2.
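Whatever driver ends up bridging /cmd_vel to the base, for a differential-drive platform like the RMP Lite 220 it ultimately has to convert each Twist command into left/right wheel speeds. A sketch of that inverse kinematics, assuming an illustrative wheel separation (not the real RMP Lite 220 spec):

```python
# Sketch of the core of a differential-drive base driver: converting a
# geometry_msgs/Twist command (linear.x, angular.z) into left and right
# wheel speeds. The wheel_base value is illustrative, not the actual
# RMP Lite 220 specification.

def twist_to_wheel_speeds(linear_x, angular_z, wheel_base=0.5):
    """Differential-drive inverse kinematics.

    linear_x: forward speed in m/s; angular_z: yaw rate in rad/s;
    wheel_base: distance between the wheels in metres.
    Returns (left, right) wheel linear speeds in m/s.
    """
    left = linear_x - angular_z * wheel_base / 2.0
    right = linear_x + angular_z * wheel_base / 2.0
    return left, right

if __name__ == "__main__":
    print(twist_to_wheel_speeds(0.5, 0.0))  # driving straight: wheels match
    print(twist_to_wheel_speeds(0.0, 1.0))  # turning in place: wheels oppose
```

In a ROS 2 node this function would sit inside the /cmd_vel subscription callback, with the resulting wheel speeds sent to the base over its serial/CAN interface.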

Genuine question: from the NVIDIA Isaac AMR docs, it seems it can do mapping, autonomy, etc., which sounds similar to Nav2? I was advised by my lab supervisors to look into ROS 2 packages (e.g. explore Nav2). May I ask what the difference between these stacks is? Are they essentially an NVIDIA release versus an open-source project with similar goals and functionality?