Robot Pose uncertainty

In a current robotics project, we are using the ParticleFilterLocalization component from the navigation package. It gives the pose of the robot relative to the world frame in a known map and writes it to the pose tree. This pose is accessible and is used successfully in our own path-planning algorithm. My questions are:

  1. Is this pose tree updated purely by the ParticleFilterLocalization component, or is there also a motion model writing to the pose tree in the background (i.e., a prediction of our pose)?

If not, then we will add the motion model ourselves and write to the pose tree; we just need to know the answer.

  2. Does there exist an uncertainty (covariance) matrix for the robot pose when using ParticleFilterLocalization, or anywhere else in the Isaac SDK, and how can it be accessed? Or is the ParticleFilterLocalization output taken as absolute truth? That would make it hard to integrate other sensors and the codelets we write for them, since the lidar would always end up trumping our other measurements.

I would like to read and write this matrix as more sensors are integrated into the system, to maintain the best estimate of the pose as well as the certainty of that pose.
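For concreteness, here is a minimal sketch (plain NumPy, not Isaac SDK API) of what such a covariance matrix would enable: fusing a lidar-based pose estimate with a second, less certain sensor's estimate through a covariance-weighted (static Kalman) update. All names and numbers are made up for illustration.

```python
import numpy as np

def fuse_gaussian_estimates(x1, P1, x2, P2):
    """Fuse two Gaussian pose estimates (mean, covariance) by
    covariance-weighted averaging (a static Kalman update)."""
    # Gain weighting estimate 2 against estimate 1
    K = P1 @ np.linalg.inv(P1 + P2)
    x = x1 + K @ (x2 - x1)
    P = (np.eye(len(x1)) - K) @ P1
    return x, P

# Hypothetical lidar-based pose estimate (x, y, heading) and covariance ...
x_lidar = np.array([1.0, 2.0, 0.10])
P_lidar = np.diag([0.02, 0.02, 0.01])
# ... and a second, noisier estimate from another sensor.
x_other = np.array([1.2, 1.9, 0.15])
P_other = np.diag([0.20, 0.20, 0.05])

x_fused, P_fused = fuse_gaussian_estimates(x_lidar, P_lidar, x_other, P_other)
# The fused covariance is never larger than either input covariance,
# which is exactly why access to the pose covariance matters for fusion.
```

Without a published covariance, there is no principled way to compute the gain `K`, which is the core of the integration problem described above.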

I see in the API overview that, under isaac.navigation.particlefilterlocalization, it says:

“For every tick the particle distribution is updated based on an ego motion estimate read from the pose tree”

I interpret this as the robot's motion-model prediction. The rest of the description then explains how the filter uses the range results to update the pose tree. Is this interpretation correct?
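To show how I am reading that sentence, here is a toy, self-contained 1-D particle filter in plain Python/NumPy (not Isaac code): the predict step consumes an ego-motion estimate (the role I believe the pose tree plays in the quote), and the update step reweights and resamples particles against a range measurement.

```python
import numpy as np

rng = np.random.default_rng(0)

def predict(particles, ego_motion, motion_noise=0.05):
    # Prediction step: apply the ego-motion estimate with added noise.
    return particles + ego_motion + rng.normal(0.0, motion_noise, size=particles.shape)

def update(particles, measured_range, wall_position, sensor_noise=0.1):
    # Update step: weight each particle by the likelihood of the range reading.
    expected = wall_position - particles
    weights = np.exp(-0.5 * ((expected - measured_range) / sensor_noise) ** 2)
    weights /= weights.sum()
    # Resample particles in proportion to their weights.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx]

particles = rng.uniform(0.0, 5.0, size=1000)    # initial belief over x
true_x = 1.0
for _ in range(20):
    true_x += 0.1                               # robot moves 0.1 m per tick
    particles = predict(particles, 0.1)         # ego-motion (predict phase)
    z = (10.0 - true_x) + rng.normal(0.0, 0.1)  # noisy range to a wall at x = 10
    particles = update(particles, z, 10.0)      # range-based update phase

estimate = particles.mean()
spread = particles.std()  # scalar stand-in for the pose uncertainty I am after
```

Note that the particle spread above is exactly the kind of uncertainty information I am asking whether the SDK exposes.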

This still leaves the question of how to access the uncertainty of the robot's location. Does this live in the pose tree as well? Also, is there a way to inspect this motion model? If not, I'm sure it can be inferred; there is plenty of documentation on differential-drive robot motion. I'm just curious whether the model is viewable.
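In case it helps frame the question, this is the standard differential-drive kinematic model I have in mind (a simple Euler-integration sketch, not necessarily the model Isaac uses internally):

```python
import math

def diff_drive_step(x, y, theta, v_left, v_right, wheel_base, dt):
    """Standard differential-drive kinematics: advance the pose
    (x, y, theta) by one time step given left/right wheel speeds."""
    v = 0.5 * (v_left + v_right)             # forward velocity
    omega = (v_right - v_left) / wheel_base  # angular velocity
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += omega * dt
    return x, y, theta

# Driving straight for 1 s at 1 m/s should advance x by 1 m.
pose = (0.0, 0.0, 0.0)
for _ in range(10):
    pose = diff_drive_step(*pose, v_left=1.0, v_right=1.0,
                           wheel_base=0.5, dt=0.1)
```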