Attempting to create custom Robot

Hi, I’m trying to move the joints of a custom robot I’ve imported using the URDF importer in Python.
I’m having trouble getting the articulation of the robot, which I am trying to do with these steps:

(ret, articulationJointAPI) = execute("ApplyAPISchemaCommand", api=PhysicsSchema.ArticulationJointAPI, prim=kinova_j1)
articulationJointAPI.CreateArticulationTypeAttr().Set("articulatedJoint")

dc.get_articulation(kinova_prim.GetPath().pathString)

I get the following error:

2021-05-04 18:13:58 [424,518ms] [Warning] [omni.isaac.dynamic_control.plugin] DcGetArticulationBodyCount: Function called while not simulating

2021-05-04 18:13:58 [424,518ms] [Error] [omni.isaac.dynamic_control.plugin] DcGetArticulationBodyCount: Invalid or expired articulation handle

I see that there is an is_simulating() function for dc, but no "play()" or "start()" function. Is the error referring to the editor? Is there something I need to call for this to load successfully? Or do I need to edit my URDF?

Hi @ss14499

This post could help:

Also, you can play the editor using the following code:

editor = omni.kit.editor.get_editor_interface()
if not editor.is_playing():
    editor.play()

When I call editor.play(), I can see that editor.is_playing() returns True but dc.is_simulating() still returns False. Is that expected? When does dc.is_simulating() return True?

Hi,

Are you calling the method inside the editor step event?
Could you provide more details about how you are running your code?

I was calling it outside the step event before.

And what happens if it is called within the step event?
By the way, how are you running your code: using OmniKitHelper, the Script Editor, as an extension, etc.?

Yes, it worked within the step callback function. I’m running my script using OmniKitHelper.

I was able to use the link you provided to make a callback function that gets called when the kit updates. Within that, I can see that dc.is_simulating() evaluates to True.
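In case it helps anyone else hitting the same warnings, the pattern that made it work is: acquire the articulation handle lazily inside the update callback, and bail out until dc.is_simulating() is True. Here is a rough sketch of that guard logic, with a plain-Python stub standing in for the real dynamic_control interface (the stub, the path, and the return values are illustrative only, not the Isaac Sim API):

```python
class FakeDc:
    """Stub standing in for the dynamic_control interface; the real API differs."""

    def __init__(self):
        self._simulating = False

    def start(self):
        # In the real editor this flips after editor.play() and the first physics step.
        self._simulating = True

    def is_simulating(self):
        return self._simulating

    def get_articulation(self, path):
        # The real call returns a valid handle only while simulating.
        return 42 if self._simulating else 0  # 0 ~ invalid handle


dc = FakeDc()
ar = None


def editor_update(step):
    """Per-frame callback: acquire the handle lazily, and only once simulating."""
    global ar
    if not dc.is_simulating():
        return  # too early: physics has not stepped yet
    if not ar:
        ar = dc.get_articulation("/robot")
    # ... drive joints here ...


editor_update(0.016)   # before play(): handle stays unset
assert ar is None
dc.start()             # stands in for editor.play() taking effect
editor_update(0.016)
assert ar == 42
```

With the real interface, the same guard goes at the top of the callback passed to subscribe_to_update_events.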

I’m still getting a warning when I try to add articulation:

**2021-05-05 17:21:50 [80,431ms] [Warning] [omni.isaac.dynamic_control.plugin] Failed to find articulation at '/GEN3_6DOF_NO_VISION_URDF_ARM_V01/shoulder_link/joint_4'**

Does my URDF need any additional tags on these joints? Maybe I don’t understand what it means to make an articulation. Can both revolute joints and continuous joints be articulations? Is that just something you need when you want to move a part of your model?

I recommend reading the documentation on Rigging a Robot. It covers the definition, use cases, and configuration for joints, articulations, etc.

Could you share your stage and some code snippet?


Yes, here’s the main code:

kit = OmniKitHelper(config=CONFIG)
editor = omni.kit.editor.get_editor_interface()
stage = kit.get_stage()

physxIFace = _physx.acquire_physx_interface()

dc = _dynamic_control.acquire_dynamic_control_interface()
mp = _motion_planning.acquire_motion_planning_interface()

editor.set_camera_position("/OmniverseKit_Persp", -200, 150, 200, True)
editor.set_camera_target("/OmniverseKit_Persp", 0,0,0, True)

editor.stop()

physxIFace.release_physics_objects()


allegro = MovingArm(stage, editor, dc, mp)
allegro.load_kinova()
urdf = allegro.get_urdf()


kinova_prim = stage.GetDefaultPrim()

config_kinova(stage, dc, editor, kinova_prim) #add drive to arm
setup_physics(stage) #gravity to scene

wristPrim = stage.DefinePrim("/GEN3_6DOF_NO_VISION_URDF_ARM_V01/spherical_wrist_1_link/hand", "Xform")
stage.SetDefaultPrim(wristPrim)
allegro.load_hand()
set_translate(stage.GetDefaultPrim(),Gf.Vec3d(0, 0, 127))

#move path of prims
src="/allegro_hand_right"
dst ="/GEN3_6DOF_NO_VISION_URDF_ARM_V01/spherical_wrist_1_link/hand"
MovePrimCommand(src,dst).do()

scene = PhysicsSchema.PhysicsScene.Define(stage, Sdf.Path("/physicsScene"))
scene.CreateGravityAttr().Set(Gf.Vec3f(0.0, 0.0, -981.0))


physics_api = PhysicsSchema.PhysicsAPI.Apply(kinova_prim)
physics_api.CreateBodyTypeAttr("rigid")
# Enable collision on prim
collision_api = PhysicsSchema.CollisionAPI.Apply(kinova_prim)
collision_api.CreateCollisionEnabledAttr().Set(True)
# Set the collision approximation to convexHull
mesh_collision_api = PhysxSchema.PhysxMeshCollisionAPI.Apply(kinova_prim)
mesh_collision_api.CreatePhysxMeshCollisionApproximationAttr().Set("convexHull")

physicsArticulationAPI = PhysicsSchema.ArticulationAPI.Get(stage, kinova_prim.GetPath())
physicsArticulationAPI.GetFixBaseAttr().Set(True)


editor.play()

update_sub = kit.editor.subscribe_to_update_events(allegro.editor_update)

vid = output.VideoOutput()
for i in range(100):
	kit.update()
	vid.add_image()
	time.sleep(0.01)

vid.save_vid("testvid.mp4")

And here are the additional functions called to load the arm and hand, and to update at each time step:
def load_kinova(self):
	_urdf_interface = _urdf.acquire_urdf_interface()
	urdf_config = _urdf.ImportConfig()

	if self._stage:
		if UsdGeom.GetStageUpAxis(self._stage) == UsdGeom.Tokens.y:
			urdf_config.set_up_vector(0, 1, 0)
		if UsdGeom.GetStageUpAxis(self._stage) == UsdGeom.Tokens.z:
			urdf_config.set_up_vector(0, 0, 1)

	root_path, filename = os.path.split(os.path.abspath(KINOVA_PATH))
	import_kinova = _urdf_interface.parse_urdf(root_path, filename, urdf_config)
	self.urdf = _urdf_interface.import_robot(root_path, filename, import_kinova, urdf_config)


def load_hand(self):
	_urdf_interface = _urdf.acquire_urdf_interface()
	urdf_config = _urdf.ImportConfig()

	if self._stage:
		if UsdGeom.GetStageUpAxis(self._stage) == UsdGeom.Tokens.y:
			urdf_config.set_up_vector(0, 1, 0)
		if UsdGeom.GetStageUpAxis(self._stage) == UsdGeom.Tokens.z:
			urdf_config.set_up_vector(0, 0, 1)

	root_path, filename = os.path.split(os.path.abspath(ALLEGRO_PATH))
	import_allegro = _urdf_interface.parse_urdf(root_path, filename, urdf_config)
	self.urdf = _urdf_interface.import_robot(root_path, filename, import_allegro, urdf_config)

def editor_update(self,step):
	import time
	print("step", step)

	if not self.ar:
		self.ar = self._dc.get_articulation("/GEN3_6DOF_NO_VISION_URDF_ARM_V01/shoulder_link/joint_4")

	self._dc.wake_up_articulation(self.ar)

	joint_target = np.random.rand(9) * 2 - 1  # random targets in [-1, 1]
	print(joint_target)
	self._dc.set_articulation_dof_position_targets(self.ar, joint_target)
	print("step done", time.time())

For context, I’m trying to load two URDF files, for an arm (Kinova) and a hand (Allegro), into the sim and attach them together. I don’t have easy access to a GUI at the moment. I want to be able to load them and have the different joints move randomly; here I’m just trying one joint.

This is the config helper function. The numbers are taken from a referenced Franka file for now:

    def config_kinova(stage, dc, editor, kinova_prim):
        j1_prim = stage.GetPrimAtPath("/GEN3_6DOF_NO_VISION_URDF_ARM_V01/base_link/joint_1")
        j2_prim = stage.GetPrimAtPath("/GEN3_6DOF_NO_VISION_URDF_ARM_V01/shoulder_link/joint_2")
        j3_prim = stage.GetPrimAtPath("/GEN3_6DOF_NO_VISION_URDF_ARM_V01/bicep_link/joint_3")
        j4_prim = stage.GetPrimAtPath("/GEN3_6DOF_NO_VISION_URDF_ARM_V01/forearm_link/joint_4")
        j5_prim = stage.GetPrimAtPath("/GEN3_6DOF_NO_VISION_URDF_ARM_V01/spherical_wrist_1_link/joint_5")
        j6_prim = stage.GetPrimAtPath("/GEN3_6DOF_NO_VISION_URDF_ARM_V01/spherical_wrist_2_link/joint_6")

        (ret, articulationJointAPI) = execute("ApplyAPISchemaCommand", api=PhysicsSchema.ArticulationJointAPI, prim=j1_prim)
        articulationJointAPI.CreateArticulationTypeAttr().Set("articulatedJoint")

        joint_1 = PhysicsSchema.DriveAPI.Get(j1_prim, "revolute")
        joint_2 = PhysicsSchema.DriveAPI.Get(j2_prim, "revolute")
        joint_3 = PhysicsSchema.DriveAPI.Get(j3_prim, "revolute")
        joint_4 = PhysicsSchema.DriveAPI.Get(j4_prim, "continuous")
        joint_5 = PhysicsSchema.DriveAPI.Get(j5_prim, "revolute")
        joint_6 = PhysicsSchema.DriveAPI.Get(j6_prim, "continuous")

        # Set the drive mode, target, stiffness, damping and max force for each joint
        set_drive_parameters(joint_1, "position", 0.012, 60000, 3000, 8700)
        set_drive_parameters(joint_2, "position", -0.57, 60000, 3000, 8700)
        set_drive_parameters(joint_3, "position", 0, 60000, 3000, 8700)
        set_drive_parameters(joint_4, "position", -2.81, 60000, 3000, 8700)
        set_drive_parameters(joint_5, "position", 0, 25000, 3000, 1200)
        set_drive_parameters(joint_6, "position", 3.037, 15000, 3000, 1200)

        # Set max joint velocity on all joints
        for joint_path in [
            "/GEN3_6DOF_NO_VISION_URDF_ARM_V01/base_link/joint_1",
            "/GEN3_6DOF_NO_VISION_URDF_ARM_V01/shoulder_link/joint_2",
            "/GEN3_6DOF_NO_VISION_URDF_ARM_V01/bicep_link/joint_3",
            "/GEN3_6DOF_NO_VISION_URDF_ARM_V01/forearm_link/joint_4",
            "/GEN3_6DOF_NO_VISION_URDF_ARM_V01/spherical_wrist_1_link/joint_5",
            "/GEN3_6DOF_NO_VISION_URDF_ARM_V01/spherical_wrist_2_link/joint_6",
        ]:
            PhysxSchema.PhysxArticulationJointAPI.Get(stage, joint_path).CreatePhysxArticulationJointMaxJointVelocityAttr(10.0)

This looks like a challenging task at first glance. Have you tried remote deployment using Docker? This deployment can give you access to the interface using the kit-client…

Yes, the problem is I’m working on macOS, and my Windows machine renders rather slowly…
The arm and the hand align and I’ve nested the hand under the arm’s prim path, so now I would like to get the joints moving programmatically. Is there something that documents the corresponding Python functions for every step in Rigging a Robot?

Well, as far as I know, no…

An alternative “solution” to this specific problem could be to combine the arm and the hand by using xacro to generate the final URDF and then import it…
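For illustration, a minimal xacro sketch of that combination could look like this (the package names, file names, link names, and offset are all made up; yours will differ):

```xml
<?xml version="1.0"?>
<robot name="kinova_with_allegro" xmlns:xacro="http://www.ros.org/wiki/xacro">
  <!-- Pull in both robot descriptions (paths are hypothetical) -->
  <xacro:include filename="$(find kinova_description)/urdf/gen3_6dof.xacro"/>
  <xacro:include filename="$(find allegro_hand_description)/urdf/allegro_hand_right.xacro"/>

  <!-- Fixed joint attaching the hand to the arm's wrist link -->
  <joint name="wrist_to_hand" type="fixed">
    <parent link="spherical_wrist_1_link"/>
    <child link="allegro_base_link"/>
    <origin xyz="0 0 0.127" rpy="0 0 0"/>
  </joint>
</robot>
```

Running `xacro` on this should yield a single URDF, so the importer would create one articulation containing both the arm and the hand joints.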


Right, but the main problem is making sure I register all of the joints in both the hand and arm correctly. Are the calls I make in config_kinova(stage, dc, editor, kinova_prim) necessary and do I have to do the same for the hand?

These steps are necessary to set up the joints according to the type of drive you want to use:

“Currently, two types of drives are supported: position and velocity. For position drive, set a relative high stiffness comparing to the damping, and set a reasonable maxForce. For velocity drive, set only damping and set stiffness to 0.” [joint-drives]
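In other words, as a rule of thumb (this little helper is just my own illustration of the quoted guidance, not an Isaac Sim API, and the default numbers are placeholders):

```python
def drive_gains(mode, stiffness=60000.0, damping=3000.0):
    """Return (stiffness, damping) for a joint drive.

    Position drive: stiffness high relative to damping, so the joint tracks a target angle.
    Velocity drive: stiffness must be 0, so damping alone tracks a target speed.
    """
    if mode == "position":
        return stiffness, damping
    if mode == "velocity":
        return 0.0, damping
    raise ValueError(f"unknown drive mode: {mode}")


# Position drive keeps a high stiffness relative to damping
assert drive_gains("position") == (60000.0, 3000.0)
# Velocity drive zeroes stiffness so only damping acts
assert drive_gains("velocity") == (0.0, 3000.0)
```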

By the way, have you tried the tricks described in this post to speed up rendering?

Thanks for the help. And I have not… I will try that, thanks. But I suspect my machine is the real problem.

If I were to try using the remote kit again, how do I save the scene I have currently loaded in my python script to the nucleus server?

I think this snippet from /isaac-sim/_build/linux-x86_64/release/exts/omni.isaac.step_importer/omni/isaac/step_importer/scripts/mesh_list_widget.py line 21 could help

            usd_context = omni.usd.get_context()

            async def save_and_open(path):
                if omni.usd.get_context().has_pending_edit():
                    await omni.kit.asyncapi.save_stage()
                omni.usd.get_context().close_stage(lambda a, b: omni.usd.get_context().open_stage(path, None))

            asyncio.ensure_future(save_and_open(item.get_usd_path().strip()))