How can I check the transformation relationship between the robot base (or gripper) and the target position?


I am trying to use a ‘UR5e + 2f-140 + RealSense D435i (hand-eye)’ setup in several tasks, and I am now making the UR5e pick and place a cube. The code saves depth and RGB images to "${HOME}/isaac_sim_image".

[Question 1]
When I use another USD file in which the robot has a different home position, the UR5e ends up in a different configuration even though the target position is the same. It looks like a transformation offset from the gripper is being applied. How can I fix this problem?

(My original second question has been solved and removed; the remaining questions are renumbered below.)

[Question 2]
I want to normalize the depth images between 0 and a maximum value set by the camera. Which value should I use as the maximum? Is it OK to use the focal distance as the maximum value?

=> I checked the raw output of the depth camera. The range was 0–70 m, so I think I have to adjust the normalization range depending on my environment.
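Following that idea, here is a minimal sketch (plain NumPy, outside Isaac Sim) of clipping raw metric depth to an environment-specific maximum range and scaling it to [0, 1]. The 3 m value is a hypothetical choice for a tabletop pick-and-place scene, not something prescribed by the camera:

```python
import numpy as np

def normalize_depth(depth_m, max_range_m=3.0):
    """Clip raw metric depth to [0, max_range_m] and scale to [0, 1].

    max_range_m should be chosen for your scene (e.g. the farthest
    surface you care about), rather than taken from the camera's
    focal distance.
    """
    clipped = np.clip(depth_m, 0.0, max_range_m)
    return clipped / max_range_m

# Hypothetical raw depth values in meters
raw = np.array([0.0, 0.5, 1.5, 10.0], dtype=np.float32)
print(normalize_depth(raw, max_range_m=3.0))  # -> [0.  0.16666667  0.5  1. ]
```

Anything beyond the chosen range saturates at 1.0, so far-away background (like the 70 m readings) no longer compresses the useful near-field depth into a tiny fraction of the output range.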

[Question 3]
How can I set the initial viewpoint right after running the code? The default viewpoint is the perspective camera, which is what I want, but my code opens with the hand-eye camera view instead.

I am sharing my code; the problematic USD file is named ‘ur5e_handeye_gripper_upright.usd’. (6.8 MB)


Hi - Sorry for the delay. Someone from our team will review and respond to you.


[Question 1] - The position solver always assumes the robot is at the origin, so you should read the robot base pose and transform the global target position into the local base frame (base inverse * global target).
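The base-inverse transform above can be sketched in plain NumPy with 4x4 homogeneous matrices (the base pose and target values here are made up for illustration; in practice you would read them from your USD stage or robot API):

```python
import numpy as np

def pose_matrix(rotation_3x3, translation_xyz):
    """Build a 4x4 homogeneous transform from a rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = rotation_3x3
    T[:3, 3] = translation_xyz
    return T

# Hypothetical base pose: identity rotation, translated to (1, 2, 0) in world
base_in_world = pose_matrix(np.eye(3), [1.0, 2.0, 0.0])

# Global target position as a homogeneous point
target_world = np.array([1.5, 2.0, 0.3, 1.0])

# base inverse * global target -> target expressed in the robot base frame
target_in_base = np.linalg.inv(base_in_world) @ target_world
print(target_in_base[:3])  # -> [0.5 0.  0.3]
```

Feeding `target_in_base` to the solver instead of the world-frame target should give the same arm configuration regardless of where the robot's home position places its base.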

[Question 3]
Alternative 1 (you also need to give it a camera pose and a target to look at):

from omni.isaac.core.utils.viewports import set_camera_view
set_camera_view(eye=[-6, -15.5, 6.5], target=[-6, 10.5, -1], camera_prim_path="/OmniverseKit_Persp")

Alternative 2:

import omni.kit.viewport.utility

viewport = omni.kit.viewport.utility.get_active_viewport()
viewport.camera_path = "/OmniverseKit_Persp"


This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.