1. I’d like to perform path planning using a UR10 robot instead of the Franka.
Q1) In my first attempt, I tried to create a user example for manipulator path planning and import a UR10 robot, but it didn’t work. I would like to know how to import a UR10 robot into this example.
Q2) I’d like to learn how to add a new target object to the example mentioned above.
Q3) As a second approach, I imported the UR10 robot from the “import robots” section. I want to open the code for this file and integrate the path planning code. Is that possible?
What didn’t work when you tried to rewrite the example? Did you get error messages?
2. I want to create paths using the Artificial Potential Field (APF) path planning algorithm instead of the default one.
Q4) Similarly, I’m wondering if it’s possible to use the APF path planning algorithm in the manipulator path planning examples. Please provide guidance on how to do this.
If you want to bring your own planning and control algorithms, you can use the Isaac Articulation interface to send joint position commands for the robot.
Here is an example of how to use the Articulation interface for the UR10.
First I add a UR10 to the scene using “Create → Isaac → Robots → Universal Robots → UR10”
Then I press play to start the simulation and open the script window using “Window → Script Editor”
With the simulation running, you can then use a little script to see the names of each joint in the articulation, set target positions and read joint positions:
from omni.isaac.core.articulations import Articulation
from omni.isaac.core.utils.types import ArticulationAction
import numpy as np
# Load the robot at the prim path it was created under
robot = Articulation("/UR10")
robot.initialize()
print("DOF names:", robot.dof_names)
# Command a target position for all six joints
robot.apply_action(ArticulationAction(joint_positions=np.array([1.0, 0.0, 2.0, 3.0, 0.0, 0.0])))
# Read back the current joint positions
position = robot.get_joint_positions()
print("position:", position)
Here is my expected output from running it twice. Note that the first time, the position readout is near zero because the positions are read in the same simulation step the action is applied, so the robot hasn’t had time to move yet. The second run’s output closely approximates the target position.
DOF names: ['shoulder_pan_joint', 'shoulder_lift_joint', 'elbow_joint', 'wrist_1_joint', 'wrist_2_joint', 'wrist_3_joint']
position: [ 1.2860776e-06 7.5394190e-03 1.6797542e-03 -1.4944457e-06
1.0166268e-07 -6.6924440e-07]
DOF names: ['shoulder_pan_joint', 'shoulder_lift_joint', 'elbow_joint', 'wrist_1_joint', 'wrist_2_joint', 'wrist_3_joint']
position: [ 9.9996537e-01 3.5953731e-03 1.9993634e+00 3.0001130e+00
-6.8680338e-06 3.4853120e-09]
This Articulation API can be used as part of a standalone Python application or a custom extension (see 1.2. Isaac Sim Workflows — Omniverse Isaac Sim documentation for both workflows).
If you decide to go the custom extension route, there is a set of Isaac Sim extension templates that make a great starting point: 3.1. Isaac Sim Extension Templates — Omniverse Isaac Sim documentation
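Regarding Q4: the APF planner itself is independent of Isaac Sim. You can compute a sequence of joint waypoints with your own potential-field code and then send each waypoint to the robot via apply_action, as in the script above. Below is a minimal joint-space APF sketch in plain NumPy. The gains, influence radius, and the simplification of representing the obstacle as a single joint-space configuration are all illustrative assumptions, not anything provided by the Isaac Sim API:

```python
import numpy as np

def apf_step(q, q_goal, q_obstacle, k_att=0.5, k_rep=0.1, rho0=1.0, step=0.1):
    """One gradient-descent step on an attractive + repulsive potential."""
    # Attractive gradient: pulls q toward the goal, proportional to distance.
    grad = k_att * (q - q_goal)
    # Repulsive gradient: only active within influence distance rho0 of the
    # obstacle; its magnitude grows as q approaches the obstacle.
    diff = q - q_obstacle
    rho = np.linalg.norm(diff)
    if 0.0 < rho < rho0:
        grad += -k_rep * (1.0 / rho - 1.0 / rho0) * (1.0 / rho**2) * (diff / rho)
    return q - step * grad

def plan_apf(q_start, q_goal, q_obstacle, max_iters=500, tol=1e-3):
    """Iterate apf_step until the goal is (approximately) reached."""
    path = [np.asarray(q_start, dtype=float)]
    for _ in range(max_iters):
        q_next = apf_step(path[-1], q_goal, q_obstacle)
        path.append(q_next)
        if np.linalg.norm(q_next - q_goal) < tol:
            break
    return path

# Plan for a 6-DOF arm (the UR10 has six joints). Each waypoint in the result
# could then be sent with robot.apply_action(ArticulationAction(waypoint)).
q_start = np.zeros(6)
q_goal = np.array([1.0, 0.0, 2.0, 3.0, 0.0, 0.0])
q_obstacle = np.array([0.5, -0.8, 1.0, 1.5, 0.0, 0.0])  # off the straight-line path
path = plan_apf(q_start, q_goal, q_obstacle)
print("waypoints:", len(path), "final error:", np.linalg.norm(path[-1] - q_goal))
```

One caveat worth knowing: plain APF can get stuck in a local minimum when an obstacle lies directly between the start and goal configurations, so in practice it is usually combined with random perturbations or a global planner.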