Isaac Sim provides methods for manipulator control, but how can these be applied to real robots after testing them in Isaac Sim?
As far as I know, many robots operate with a trajectory controller. How can a manipulator application developed in Isaac Sim interoperate with such robots?
I know Isaac Sim also provides ROS support, but does it include ROS interfaces for manipulators (trajectory control)? Or should I use the joint control from the Isaac Sim tutorials?
To be concrete: there is a robot I want to control in a real environment, so I took its URDF file and loaded it into Isaac Sim. After building an environment in Isaac Sim and implementing an application that controls this robot using various sensors, I am curious how to apply this application to the real robot. The object I am interested in is a robot arm, and I know that many robots operate with a trajectory controller. What interface should I use between the application developed in Isaac Sim and the actual robot arm? Please forgive my unfamiliarity with this area; any advice is appreciated.
You need to stream the simulated robot's joint configurations out over ROS to an external controller. That controller might be implemented by the OEM and may need to interpolate and buffer the streamed configurations.
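As a rough sketch of the interpolation/buffering such an external controller may need to do, here is a plain-Python illustration (no ROS dependencies; the `TrajectoryBuffer` class and its methods are illustrative assumptions, not an Isaac Sim or ROS API). The idea is that the simulator streams configurations at a low rate while the robot's control loop samples the buffer at a much higher rate:

```python
import bisect

class TrajectoryBuffer:
    """Buffers timestamped joint configurations streamed from the simulator
    and linearly interpolates between them for the robot's control loop."""

    def __init__(self):
        self._times = []     # monotonically increasing timestamps (seconds)
        self._configs = []   # one joint configuration (list of floats) per timestamp

    def push(self, t, q):
        # Append a streamed configuration; assumes timestamps arrive in order.
        self._times.append(t)
        self._configs.append(list(q))

    def sample(self, t):
        # Return the linearly interpolated joint configuration at time t,
        # clamped to the first/last buffered configuration.
        if t <= self._times[0]:
            return list(self._configs[0])
        if t >= self._times[-1]:
            return list(self._configs[-1])
        i = bisect.bisect_right(self._times, t)
        t0, t1 = self._times[i - 1], self._times[i]
        a = (t - t0) / (t1 - t0)
        return [q0 + a * (q1 - q0)
                for q0, q1 in zip(self._configs[i - 1], self._configs[i])]

# Example: simulator streams at 20 Hz, controller samples in between.
buf = TrajectoryBuffer()
buf.push(0.00, [0.0, 0.0])
buf.push(0.05, [0.1, -0.2])
print(buf.sample(0.025))  # midway between the two configurations
```

A real controller would replace the `print` with the robot's low-level command interface and would typically also smooth velocities, but the buffering pattern is the same.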
An important consideration is that the simulation needs to be synchronized with the physical robot: the physical robot should stream its measured joint configurations out on ROS, and before you try to control the physical robot, you need to make sure the simulated robot starts from the measured joint configuration.
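A minimal sketch of that handshake, in plain Python (the function names, tolerance, and example joint values are assumptions for illustration, not an Isaac Sim API):

```python
def joints_in_sync(simulated, measured, tol_rad=1e-3):
    """Return True when every simulated joint matches the measured joint
    to within tol_rad, i.e. it is safe to start streaming commands."""
    return all(abs(s - m) <= tol_rad for s, m in zip(simulated, measured))

def initialize_from_measurement(measured):
    """Before control starts, reset the simulated robot to the pose the
    physical robot is reporting, so both start from the same state."""
    # In Isaac Sim this is where you would set the articulation's joint
    # positions; here we just return the configuration to seed the sim with.
    return list(measured)

# Handshake: seed the sim from the robot's measured configuration,
# then refuse to stream commands until the two agree.
measured = [0.0, -0.785, 0.0, -2.356, 0.0, 1.571, 0.785]  # illustrative 7-DOF pose
sim_start = initialize_from_measurement(measured)
assert joints_in_sync(sim_start, measured)
```

Without this step, the first streamed command can encode a large jump from the robot's actual pose, which the physical arm would try to execute at full speed.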
We have an example for Franka using the FrankaROS interface that demonstrates this pipeline and handshake in the Omni Isaac Cortex system. It is not yet implemented out of the box for general robots, but those tools are a good starting point for understanding how such a pipeline can be designed, and they are largely robot agnostic. See Cortex: Controlling Physical Robots.