Zero values in /visual_slam/tracking/odometry topic

Hi,

I am using the isaac_ros_vslam package with an Intel RealSense D455, integrated with the robot_localization package for robot localization.

Here, I’m facing the following problem: some covariance values in the /visual_slam/tracking/odometry topic are 0. This doesn’t happen all the time, but it happens quite often, which makes the EKF from robot_localization unstable.

As a workaround, I’m using /visual_slam/tracking/vo_pose_covariance instead, as its values don’t have this problem.
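For context, this is roughly how that topic can be fed to the robot_localization EKF as a pose input. This is a minimal, illustrative sketch of ekf_node parameters, not my full configuration; the frame names and the pose0_config mask are assumptions:

```yaml
# Illustrative ekf_node (robot_localization) parameters: fuse the
# PoseWithCovarianceStamped topic instead of the odometry topic.
ekf_filter_node:
  ros__parameters:
    frequency: 30.0
    world_frame: odom
    base_link_frame: base_link
    pose0: /visual_slam/tracking/vo_pose_covariance
    # Fuse x, y, z, roll, pitch, yaw; ignore velocities/accelerations
    # (not present in a pose message):
    pose0_config: [true,  true,  true,
                   true,  true,  true,
                   false, false, false,
                   false, false, false,
                   false, false, false]
```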

Is this the expected behavior or is there some kind of issue?

This is an example of my /visual_slam/tracking/odometry topic:

```yaml
header:
  stamp:
    sec: 1718006137
    nanosec: 72434082
  frame_id: map
child_frame_id: base_link
pose:
  pose:
    position:
      x: -0.18829065561294556
      y: -0.7475811839103699
      z: -0.14418889582157135
    orientation:
      x: 0.0009961834115019804
      y: -0.03407587459564546
      z: -0.05355979702336192
      w: 0.997982553100213
  covariance:
  - 0.0
  - 0.0
  - 0.0
  - 3.556602952334916e-20
  - -6.852714414421259e-19
  - 3.608248659603303e-19
  - 0.0
  - 0.0
  - 0.0
  - 5.659126527693679e-19
  - -3.7443462688228286e-18
  - -1.4718286660177989e-18
  - 0.0
  - 0.0
  - 0.0
  - -1.2553646994699983e-19
  - 1.0993890695959257e-18
  - 3.493265634971272e-18
  - 3.556602952334916e-20
  - 5.659126527693679e-19
  - -1.2553646994699983e-19
  - 7.932368483580105e-15
  - -6.72113073918508e-15
  - -8.448243274648833e-15
  - -6.852714414421259e-19
  - -3.7443462688228286e-18
  - 1.0993890695959257e-18
  - -6.721130739185079e-15
  - 1.8764574328284905e-14
  - 1.3875256018809103e-14
  - 3.608248659603303e-19
  - -1.4718286660177989e-18
  - 3.493265634971272e-18
  - -8.448243274648832e-15
  - 1.3875256018809105e-14
  - 2.5194641515562394e-14
twist:
  twist:
    linear:
      x: 0.0
      y: 0.0
      z: 0.0
    angular:
      x: 7.583187826818249e-09
      y: -3.1724994084912666e-07
      z: -4.5952138011680034e-07
  covariance:
  - 0.0
  - 0.0
  - 0.0
  - 0.0
  - 0.0
  - 0.0
  - 0.0
  - 0.0
  - 0.0
  - 0.0
  - 0.0
  - 0.0
  - 0.0
  - 0.0
  - 0.0
  - 0.0
  - 0.0
  - 0.0
  - 0.0
  - 0.0
  - 0.0
  - 5.6311578418487885e-14
  - -1.559435015746044e-14
  - -5.398157851896417e-14
  - 0.0
  - 0.0
  - 0.0
  - -1.559435015746044e-14
  - 2.510679075672562e-14
  - 3.039358842826922e-14
  - 0.0
  - 0.0
  - 0.0
  - -5.398157851896417e-14
  - 3.039358842826922e-14
  - 8.488513363067446e-14
```

Hi @maciekPR

The engineers are debugging your issue, and I will keep you posted for updates.

Raffaello

Thanks! Just to mention that I’m using release 2.1.


Hi @maciekPR

To better assist you, could you please answer the following questions:

  • What setup are you currently using?
  • Can you update to Isaac ROS 3.0?

Hi @Raffaello ,

At the moment I won’t be able to update to Isaac ROS 3.0, as I’m working in a customized Docker environment and would probably need to modify it to avoid dependency errors. At some point I may be able to test the node following the quickstart guide (I’ve already done it in the past, but I didn’t check the topic covariance) and see whether I can reproduce this behavior on versions 2.1 and 3.0. I’ll let you know.

About the setup, I’m using the node with a RealSense D455 camera.

Hi @maciekPR

We wouldn’t recommend using an EKF with visual odometry. An EKF requires a known covariance; however, due to the nature of visual odometry, a robust covariance calculation is impossible in some cases, such as when the robot sees a large moving object, a white wall, or a repetitive structure.
It’s better to assume that visual odometry is consistently robust most of the time, but that it is sometimes completely wrong, and that this case is impossible to detect robustly.

Multicamera visual odometry was implemented in the Isaac ROS 3.0 release to increase the robustness of visual odometry.

For practical applications, it’s recommended to use an external “odometry fail detector” and a constant covariance (measure your own, using your camera, environment, and resolution).
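The constant-covariance part of that suggestion can be sketched in plain Python. This is only an illustration: the EPS threshold and the CONSTANT_DIAG values below are made-up placeholders, not values from this thread; you would substitute a covariance you measured yourself with your camera, environment, and resolution.

```python
# Sketch: replace a degenerate VO covariance with a pre-measured constant.
# EPS and CONSTANT_DIAG are illustrative assumptions; measure your own
# values, as recommended above.

EPS = 1e-12  # diagonal entries below this are treated as "no information"

# Hypothetical constant 6x6 covariance, stored as its diagonal
# (x, y, z, roll, pitch, yaw):
CONSTANT_DIAG = [0.01, 0.01, 0.01, 0.005, 0.005, 0.005]

def sanitize_covariance(cov36):
    """Return a 36-element covariance whose diagonal is usable by an EKF.

    If any diagonal entry of the incoming 6x6 covariance is ~0 (as in the
    odometry messages shown in this thread), fall back to a constant
    diagonal covariance instead.
    """
    assert len(cov36) == 36
    diag = [cov36[i * 6 + i] for i in range(6)]
    if all(abs(d) > EPS for d in diag):
        return list(cov36)  # covariance looks well-formed; keep it
    out = [0.0] * 36
    for i in range(6):
        out[i * 6 + i] = CONSTANT_DIAG[i]
    return out
```

In a ROS 2 node, this would run inside the odometry callback before republishing the message for robot_localization to consume.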

Right now, both topics with covariances return only approximate values:

  1. /visual_slam/tracking/vo_pose_covariance - covariance from the last PnP.

  2. covariance from /visual_slam/tracking/odometry - a rough approximation for low-speed motion. Deprecated; it will be removed in future releases.

You were right to notice some issues in the covariance values.

Thanks for that.
The next minor release will fix it.

However, neither output can be used to detect cases where odometry fails.

Hi @Raffaello ,

Thanks for your analysis. We’ll keep it in mind.

For now, we are using /visual_slam/tracking/vo_pose_covariance, integrating it into the robot_localization EKF, and we are getting quite good results.

But, in your opinion, what would be the best way to use the VSLAM package? Using it alone, without integrating it into the robot_localization EKF, and letting it publish the odom>map transform?

It’s great to hear that you are having success with the vo_pose_covariance.

For localization, we recommend enabling the enable_localization_n_mapping flag. This flag enables loop closure correction and produces a much more accurate pose, but with jumps, which makes it unusable for EKF.
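For reference, the flag mentioned above would be set as a node parameter. This is a hedged sketch only; the node name and exact parameter layout may differ between Isaac ROS versions:

```yaml
# Illustrative parameter fragment for the Isaac ROS Visual SLAM node;
# node name assumed here, check your launch file for the actual one.
visual_slam_node:
  ros__parameters:
    enable_localization_n_mapping: true
```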

Hi @Raffaello ,

Just one question regarding this

Can you provide me some link (tutorial, github/branch, launch file…) to get started with the multicamera visual odometry?

Thank you

Hi @maciekPR

We made an example of multicamera visual odometry for the Hawk camera:

Best,
Raffaello