Where is the world origin?

Hi guys,

I am having some trouble figuring out where the world origin is and the position of my robot (carter) with respect to this origin at any given time.

I currently have 2 different approaches which I am using to try and figure out where the robot is with respect to the world.

Option 1: Pose Tree - This is the approach I feel is the correct one, as the PoseTree is where we should go for all pose-related information. However, I was getting some unusual results and wanted to check other options for getting the ground-truth pose.

Option 2: RigidBody3Proto - I am led to understand that the pose information in these messages comes straight from the simulation. I wanted to see if it gave identical results to the PoseTree and, if not, which one was correct.

The Issue

I cannot say 100% confidently which is the correct method or where the global “world” origin truly is.

My current thinking is that the PoseTree world2robot is correct, but I cannot see the direct relation to the RigidBody3Proto poses.


  1. Can somebody confirm my understanding and explain how the poses in RigidBody3Proto work?
  2. Do Isaac Sim (Unreal) and the SDK use the same coordinate system and, if not, how do they differ?
  3. Is RigidBody3Proto in the SDK coordinate system or in the Unreal coordinate system?
  4. Is the reference frame/origin for RigidBody3Proto the spawn point of the robot?
  5. What is the true world origin that should be used in applications and how does it relate to the PoseTree and the RigidBody3Proto poses?
  6. How do you set the world origin in a config file?

Any help with any or all of these questions would be greatly appreciated.

Apologies for the rambling nature of the post but I have been wrestling with this for a while now.

The rigid body pose is specified in the simulator coordinate frame, and the origin of the coordinate system depends on the environment. As each map in Unreal is different, each has its own origin.

The one slight difference between the SDK and the simulation is that the rigid body proto is specified in meters and a right-handed coordinate system, while Unreal natively uses left-handed coordinates and centimeters. A pose sent from the SDK is automatically converted to Unreal's native coordinates and vice versa, so that poses on the SDK side remain consistent.

There is currently no relationship between the SDK PoseTree and the simulator, as each environment has its own origin and the mapping from environment to the SDK is not specified. The mapping itself is a constant transform that could be computed on the SDK side if needed.

You could, for example, specify an SDK-simulator transform in the SDK pose tree, depending on the environment and on where you wanted to define your own world origin. This would then let you convert between your coordinate system and the simulator one transparently.
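To sketch what composing such a constant transform could look like (using a minimal 2D pose type purely for illustration, not the SDK's actual pose types):

```cpp
#include <cmath>

// Minimal 2D pose (translation in meters + heading in radians), standing
// in for the SDK's pose types purely for illustration. worldTsim is the
// constant transform you would author once per environment; simTrobot is
// what the simulator reports, already converted to SDK units.
struct Pose2 {
  double x, y, yaw;
};

// Compose aTb * bTc = aTc: rotate b's translation into a's frame, then
// add a's translation; headings add.
Pose2 Compose(const Pose2& a, const Pose2& b) {
  const double c = std::cos(a.yaw), s = std::sin(a.yaw);
  return {a.x + c * b.x - s * b.y,
          a.y + s * b.x + c * b.y,
          a.yaw + b.yaw};
}
```

With worldTsim authored once per map, `Compose(worldTsim, simTrobot)` would give the robot pose in your own world frame, and the inverse transform would take you back the other way.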

Let me know if you need help computing this transform, and I can provide an example for the warehouse environment.

OK I think that has helped me get started. Just a few things I want to confirm before I get going.

So a RigidBody3Proto message should contain the refTBody component, which is the pose of the body with respect to some reference frame. When no reference frame is provided (as seems to be the case for carter_sim), how do I find what that reference location is?

I think your suggestion of a transform to map the environment to the SDK would be the best option for what I am trying to do, and I would greatly appreciate an example for the warehouse environment. What I feel I need is a direct transform from the reference frame in refTBody to the world frame defined in my SDK application.

How would I find where this frame is with respect to the SDK world frame?

I have quickly tried to find where it is myself but think I may have just gotten myself more confused. When I print out the RigidBody3Proto refTBody pose I get the following:

Translation: {X: -0.028863, Y: -10.9996, Z: 0.809041}

This lines up well with what I see in the carter_full_config.json and the carter.config.json which set the actor carter_1 and the worldTrobot_init of robot_pose_initializer respectively to:

"pose": [1, 0.0, 0.0, 0, 0, -11.0, 0.9] and "pose": [1, 0.0, 0.0, 0, 0, 11.0, 0.9]

which are the settings defined for the Hospital environment https://docs.nvidia.com/isaac/isaac_sim/content/maps/hospital.html

However, when I print worldTrobot_init as obtained by:

std::optional<isaac::Pose3d> temp_worldtrobotinit = node()->pose().tryGet("world", "robot_init", timestamp);

I get the following output:

Translation: {X: 48.56, Y: 5.4, Z: 0}

which does not match up to anything I have seen so far. Has carter_sim_joystick redefined the world origin somewhere along the line?

Apologies if this should all be simple but any help you can give is very much appreciated.

OK, as an update: some of my confusion ended up coming from a typo in my code from where I had changed robot_pose_initializer.

From my understanding here is what is happening:

  1. RigidBodyPoses are all given with the reference frame being the world origin in Unreal.
  2. The worldTrobot_init is not based on this same world origin, but on the origin of the map from the localizer.
  3. The initial position of the robot's RigidBodyPose is defined as the carter_1 pose in carter_full_config.json.

From previous messages I understand that the RigidBodyProto undergoes some transform to go from Unreal coordinates to SDK coordinates, and that this shifts us from LH coordinates to RH coordinates. Does this also shift the origin of the object (robot) from the centre of the object to the base of the object?

Do you have an example of how to convert from original Unreal coordinates to the SDK coordinates (from LH coordinates with origins in the centre of the object to RH coordinates with origins at the base of the object)?

Hi d20,

Glad to hear you figured it all out.
Your conclusions are correct. As you can see in the note at the beginning of the contents chapter (https://docs.nvidia.com/isaac/isaac_sim/content/maps/index.html), the differences in coordinates between Sim and the SDK are 1) LH vs. RH and 2) cm vs. meters.

We don’t have any example for the transformation, but I am guessing it should be straightforward to calculate and test.


Hey all,

Sorry to bring this thread up again but I had left it alone for a while thinking I was all good and everything was sorted … but it wasn’t. I thought I would bring up the issue here again to check if it is something I am doing wrong or if Isaac is not doing something it should be.

The crux of the matter is that there is confusion between the examples and the documentation, and I want to know which is the “correct” way things should be interpreted in Isaac. I am pretty sure I have figured out what the situation is, but I want to be sure, and to point out the issue in case it needs to be fixed for the next release of Isaac.

On the coordinate frames page (https://docs.nvidia.com/isaac/isaac/packages/perception/doc/coord_frame.html#robot-coordinate-frame) it says that for most robots, the center of the coordinate frame is, “generally placed on the ground and in the center of rotation of the robot”. It also says though that for Carter “the robot coordinate frame is centered between the two main wheels”.

Now this all makes sense to me. What doesn’t make as much sense is the disparity between what I see in the carter_full_config.json, what is defined in the carter.config.json, what I get from the RigidBodyProto, and what I see in the Unreal Editor.

Between carter.config.json and carter_full_config.json I see that sensors like the camera and Lidar are offset by 0.24, with carter.config.json always being higher than carter_full_config.json. From this I had originally inferred that, in the SDK, the robot coordinate frame was at the base of the robot, as the wheel radius for Carter is 0.24 m. So in my mind that meant the SDK uses a “robot base” coordinate frame despite what is said in the documentation.

When I look at the output from the RigidBodyProto for the Carter robot and check the coordinates the Carter robot has within the Unreal Editor, I get almost what I expected. The x coordinates are the same, y is inverted due to the change from LH to RH coordinate frames, and the SDK is in metres rather than centimetres. However, the z values for both the RigidBodyProto and the Unreal simulator are the same (0.24 m). Given what I had found before, I would have expected the z value in the RigidBodyProto (which is now in the SDK coordinate system) to be zero, assuming that the SDK is using the “robot base” coordinate system.

What I am currently assuming is that somewhere along the line the Carter coordinate frame in the SDK was switched from the base of the robot to the centre of the wheels, and the sensor configuration in carter.config.json wasn’t updated to match, having assumed a “robot base” coordinate frame. My plan is simply to change carter.config.json to match this assumption.

I would like confirmation that my reasoning is correct here though as it is critically important that I know, within the SDK, which robot coordinate system to use when determining where sensors are relative to the robot. Then I can use the RigidBodyProto reliably alongside this information to get the ground-truth location of the camera sensor within the simulated environment.

Apologies for the long post and many thanks in advance for any help you can provide.