Hello,
I am trying to build a digital twin of a mobile robot and its environment. I have already built the environment in Create with the required rendering and lighting. The robot model is imported from a URDF, verified, and textured to match the real robot precisely.
The robot operates indoors (let's assume a single room to keep the task simple). I want to capture the robot's motion in the virtual/digital twin to keep the two synchronized. What sensors or IoT devices should the environment and the robot be equipped with?
I noticed there is not much difference between my application and tracking autonomous mobile robots (AMRs) in warehouses. My focus here is on tracking the pose, not the robot's autonomy. Assuming the robot is remotely controlled, I want to capture its current pose data and stream it into Omniverse. What are the best sensors for precisely tracking the robot's pose in this case, and what workflow should I consider to bring the data into Isaac Sim?
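To make the data flow I'm imagining concrete, here is a minimal sketch (plain Python, no Omniverse dependencies; the pose format and function names are my own assumptions, not an established API): an external tracker publishes a planar pose (x, y, yaw), and on the twin side each update would be converted into a 4x4 homogeneous transform that could then be written to the robot prim's transform via the USD API.

```python
import math

def pose_to_matrix(x, y, yaw):
    """Convert a planar pose (x, y, yaw in radians) into a 4x4
    homogeneous transform (row-major): rotation about Z plus a
    translation in the XY plane. This is the value one would
    write to the robot prim's local transform each update."""
    c, s = math.cos(yaw), math.sin(yaw)
    return [
        [c,  -s,  0.0, x],
        [s,   c,  0.0, y],
        [0.0, 0.0, 1.0, 0.0],
        [0.0, 0.0, 0.0, 1.0],
    ]

# Example: robot at (1.0, 2.0), facing 90 degrees (pi/2 rad)
m = pose_to_matrix(1.0, 2.0, math.pi / 2)
```

The open question for me is what produces those (x, y, yaw) updates reliably: wheel odometry alone drifts, so presumably some external reference (motion capture, UWB beacons, or a camera system) is needed.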
Thanks for your help!