Bypassing the IMU Proto for Stereo Visual Odometry

Hello,
I use an IMU with an accelerometer, a gyroscope, and a compass, so I have acceleration, angular speed, and heading.
I gather those and use a carefully calibrated Madgwick AHRS algorithm to get a precise orientation estimate as a quaternion.
Is there a way to send the quaternion directly to the SVO app so I can get better results than with simple bias zeroing?

Thank you,
regards

Hello Planktos,

In the current release, there is no way to provide an external orientation to Stereo Visual Odometry (SVO). We plan to add this feature in the foreseeable future.

To increase SVO accuracy and performance, you can provide raw IMU data, which SVO uses for motion prediction, for checking results, and as a failover. All of these measures help to decrease odometry drift, but they cannot eliminate it.

Consider making an external system (another codelet) that fuses the SVO output with other sensors. For example, you can detect horizontal drift using the quaternion computed from the IMU and reset SVO (or update the World<->SVO mapping), as in the sketch below.
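
A minimal sketch of such a fusion codelet (untested; the frame names "world" and "svo", the channel name ahrs, and the codelet name are all illustrative, and the ImuProto angles are assumed to be in radians):

#pragma once
#include "engine/alice/alice_codelet.hpp"
#include "engine/core/math/pose3.hpp"
#include "messages/imu.capnp.h"

namespace isaac {
// Reads an externally estimated orientation (e.g. from a Madgwick AHRS)
// and overwrites the World<->SVO mapping in the pose tree.
class SvoDriftCorrector : public alice::Codelet {
 public:
  void start() override { tickOnMessage(rx_ahrs()); }
  void tick() override {
    auto imu = rx_ahrs().getProto();
    // Rebuild the AHRS orientation from the integrated Euler angles.
    const Quaterniond q =
        Eigen::AngleAxisd(imu.getAngleYaw(), Vector3d::UnitZ()) *
        Eigen::AngleAxisd(imu.getAnglePitch(), Vector3d::UnitY()) *
        Eigen::AngleAxisd(imu.getAngleRoll(), Vector3d::UnitX());
    // Orientation-only sketch: translation is left at zero here; in practice
    // you would preserve the current SVO translation estimate.
    Pose3d world_T_svo;
    world_T_svo.rotation = SO3<double>::FromQuaternion(q);
    world_T_svo.translation = Vector3d::Zero();
    node()->pose().trySet("world", "svo", world_T_svo, getTickTime());
  }
  ISAAC_PROTO_RX(ImuProto, ahrs);
};
}  // namespace isaac
ISAAC_ALICE_REGISTER_CODELET(isaac::SvoDriftCorrector);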

Dmitry


Hello @Planktos and @dslepichev

Thanks for this thread. I’m using the SVIO codelet for the RealSense D435 on the Jetson Nano. I am looking to get better accuracy and was thinking about providing data from an external IMU.

@dslepichev from your comment, it looks like this can be done. Do you have any reference code I can follow? How difficult is it to integrate readings from an external IMU into the SVIO gem?

Hello @arjunsengupta98,
This is the IMU proto:

# Message published by an inertial measurement unit (IMU)
struct ImuProto {
  # The linear acceleration of the body frame along the primary axes
  linearAccelerationX @0: Float32;
  linearAccelerationY @1: Float32;
  linearAccelerationZ @2: Float32;
  # The angular velocity of the body frame around the primary axes
  angularVelocityX @3: Float32;
  angularVelocityY @4: Float32;
  angularVelocityZ @5: Float32;
  # Optional angles as integrated by a potential internal estimator
  angleYaw @6: Float32;
  anglePitch @7: Float32;
  angleRoll @8: Float32;
}
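
Note that there is no quaternion field, only the three optional Euler angles, so a quaternion from a Madgwick AHRS has to be converted before it can go into the proto. A minimal sketch with Eigen (the helper name fillAngles is mine, and q is assumed to be normalized):

#include <Eigen/Geometry>
#include "messages/imu.capnp.h"

// Decompose a quaternion into ZYX (yaw, pitch, roll) Euler angles and write
// them into the optional angle fields of an ImuProto builder.
void fillAngles(const Eigen::Quaterniond& q, ImuProto::Builder proto) {
  const Eigen::Vector3d ypr = q.toRotationMatrix().eulerAngles(2, 1, 0);
  proto.setAngleYaw(static_cast<float>(ypr[0]));
  proto.setAnglePitch(static_cast<float>(ypr[1]));
  proto.setAngleRoll(static_cast<float>(ypr[2]));
}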

You can feed the IMU proto to the SVO node with this edge:

      {
        "source": "zed/ZedImuReader/imu_raw",
        "target": "tracker/StereoVisualOdometry/imu"
      },

If you use a BMI160:

  • please follow this
  • and take a look at the imu.app.json in Isaac/apps/samples/imu

If you don’t use a BMI160, you need the following (I’m probably making some typos here, but this is the general idea):

  • new node :
      {
        "name": "my_imu",
        "components": [
          {
            "name": "isaac.alice.MessageLedger",
            "type": "isaac::alice::MessageLedger"
          },
          {
            "name": "my_imu",
            "type": "isaac::my_imu"
          }
        ]
      }
  • Modify the IMU edges (the new codelet receives on start_trigger and publishes on my_imu_out):
      {
        "source": "zed/ZedImuReader/imu_raw",
        "target": "my_imu/my_imu/start_trigger"
      },
      {
        "source": "my_imu/my_imu/my_imu_out",
        "target": "tracker/StereoVisualOdometry/imu"
      },
  • my_imu.hpp:
#pragma once
#include "engine/alice/alice_codelet.hpp"
#include "messages/imu.capnp.h"

namespace isaac {
class my_imu : public alice::Codelet {
 public:
  void start() override;
  void tick() override;
  ISAAC_PROTO_RX(ImuProto, start_trigger);
  ISAAC_PROTO_TX(ImuProto, my_imu_out);
};
}  // namespace isaac
ISAAC_ALICE_REGISTER_CODELET(isaac::my_imu);
  • my_imu.cpp:
#include "my_imu.hpp"

namespace isaac {

void my_imu::start() { tickOnMessage(rx_start_trigger()); }

void my_imu::tick() {
  auto my_imu_proto = tx_my_imu_out().initProto();
  float my_data[9] = {0, 0, 0, 0, 0, 0, 0, 0, 0};
  // Fill your data here, e.g.:
  // My_serial_function(my_data);  // TODO: read the 9 values from your IMU
  // Set the ImuProto
  my_imu_proto.setLinearAccelerationX(my_data[0]);
  my_imu_proto.setLinearAccelerationY(my_data[1]);
  my_imu_proto.setLinearAccelerationZ(my_data[2]);
  my_imu_proto.setAngularVelocityX(my_data[3]);
  my_imu_proto.setAngularVelocityY(my_data[4]);
  my_imu_proto.setAngularVelocityZ(my_data[5]);
  // Optional angles as integrated by a potential internal estimator:
  // my_imu_proto.setAngleYaw(my_data[6]);
  // my_imu_proto.setAnglePitch(my_data[7]);
  // my_imu_proto.setAngleRoll(my_data[8]);
  // Publish the ImuProto with the trigger's acquisition time
  tx_my_imu_out().publish(rx_start_trigger().acqtime());
}

}  // namespace isaac

  • BUILD sources:
load("//engine/build:isaac.bzl", "isaac_app", "isaac_py_app", "isaac_cc_module")
isaac_cc_module(
    name = "my_imu",
    srcs = ["my_imu.cpp"],
    hdrs = ["my_imu.hpp"],
    visibility = ["//visibility:public"],
    deps = ["//engine/core/math",],
)

In two weeks I will receive an I2C IMU (I’m currently using a serial IMU), so if you can wait two weeks I could send you my SVO project, including the I2C part, with clean working code (the code above is not tested yet).

Regards,
Planktos

I use a BMI160 + RealSense D435. In stereo_vo, if I add imu_raw from the BMI160, it stops.
What is the procedure to add a BMI160 IMU to stereo_vo? Must I add some pose tree elements?

Hi @kaspars.cabs,
What is the error when it stops?
The IMU pose is initialised with this line: "imu_frame": "zed_imu"
You can specify your own IMU frame, e.g. "imu_frame": "BMI_imu", and write the corresponding pose in a .cpp codelet (see the sketch below).
Also don’t forget to change "enable_imu": true to false in the ZED config.
If there is no error when it stops, could you share your code so I can take a look?
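
A minimal sketch of a codelet that writes that pose in start() (untested; it assumes the IMU sits at the left camera with zero offset, and the frame names match the config used in this thread):

#include "engine/alice/alice_codelet.hpp"
#include "engine/core/math/pose3.hpp"

namespace isaac {
class BmiPoseSetter : public alice::Codelet {
 public:
  void start() override {
    // Identity rotation and zero translation: adjust to your real mounting.
    const Pose3d left_camera_T_imu = Pose3d::Identity();
    node()->pose().trySet("left_ir_camera", "BMI_imu", left_camera_T_imu, 0.0);
  }
};
}  // namespace isaac
ISAAC_ALICE_REGISTER_CODELET(isaac::BmiPoseSetter);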
Regards,
Planktos

Maybe you forgot to set the tick method of the IMU… can you print the IMU data in another app to see if the IMU works? (Something like the sketch below.)
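
For example, a tiny debug codelet along these lines (untested; the codelet name and the imu channel are illustrative):

#include "engine/alice/alice_codelet.hpp"
#include "messages/imu.capnp.h"

namespace isaac {
class ImuPrinter : public alice::Codelet {
 public:
  void start() override { tickOnMessage(rx_imu()); }
  void tick() override {
    auto imu = rx_imu().getProto();
    // Log the raw accelerations to verify the stream is alive.
    LOG_INFO("accel = (%f, %f, %f)", imu.getLinearAccelerationX(),
             imu.getLinearAccelerationY(), imu.getLinearAccelerationZ());
  }
  ISAAC_PROTO_RX(ImuProto, imu);
};
}  // namespace isaac
ISAAC_ALICE_REGISTER_CODELET(isaac::ImuPrinter);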

The IMU is fully working; I use it for differential-speed odometry.
My config part:

"config": {
    "tracker": {
        "StereoVisualOdometry": {
            "horizontal_stereo_camera": true,
            "process_imu_readings": false,
            "lhs_camera_frame": "left_ir_camera",
            "rhs_camera_frame": "right_ir_camera",
            "imu_frame": "imu",
            "tick_period": "1000s",
            "num_points": 50
        },
        "splitim_component": {
            "lhs_camera_frame": "left_ir_camera",
            "rhs_camera_frame": "right_ir_camera"
        }

Split_im splits the RealSense camera output into left and right images.
My edge part:

"edges": [
    {
        "source": "subgraph/interface/left_ir",
        "target": "tracker/splitim_component/colorimage_left"
    },
    {
        "source": "subgraph/interface/right_ir",
        "target": "tracker/splitim_component/colorimage_right"
    },
    {
        "source": "tracker/splitim_component/image_left",
        "target": "tracker/StereoVisualOdometry/left_image"
    },
    {
        "source": "tracker/splitim_component/image_right",
        "target": "tracker/StereoVisualOdometry/right_image"
    },
    {
        "source": "tracker/splitim_component/intrinsics_left",
        "target": "tracker/StereoVisualOdometry/left_intrinsics"
    },
    {
        "source": "tracker/splitim_component/intrinsics_right",
        "target": "tracker/StereoVisualOdometry/right_intrinsics"
    },
    {
        "source": "subgraph/interface/Odometry",
        "target": "tracker/StereoVisualOdometry/imu"
    },
    {
        "source": "tracker/StereoVisualOdometry/coordinates",
        "target": "subgraph/interface/coordinates"
    },
    {
        "source": "tracker/StereoVisualOdometry/features",
        "target": "subgraph/interface/features"
    },
    {
        "source": "tracker/StereoVisualOdometry/left_camera_pose",
        "target": "subgraph/interface/camerapose"
    }
]

subgraph/interface/Odometry is imu_raw from the main JSON. With "process_imu_readings": false, stereo_vo works (though with errors in angular movement), but if I set "process_imu_readings": true, it stops.

Thanks a ton @Planktos! I too am waiting for my IMU to arrive. I shall be using a BMI088. Shall try this out as soon as it arrives. This should be super helpful :)

@kaspars.cabs, can you try without "tick_period": "1000s" and "num_points": 50?
Is subgraph/interface/Odometry ticking at the right frequency?

@arjunsengupta98 very cool. I will use a BNO080 and a generic stereo camera from ELP for less than $100. It could be nice to share our data to make a hardware benchmark once we have all succeeded, and also to try an AHRS or Kalman filter to improve accuracy!


Sounds good!

Hello @kaspars.cabs, were you able to have any success? Would it be possible for you to share your code?

For odometry, I created my own image-splitter solution.

Main points:
The output from ColorCameraProto to ImageProto and CameraIntrinsicsProto is synchronized in time.
The NVIDIA solution probably does not synchronize the timestamps, or does not set the poses.

Code for the left image (images from the RealSense):

    void clsynchrosplitim::tick()
    {
        // Use the left image acquisition time for everything published this
        // tick, so image and intrinsics stay synchronized.
        const int64_t time = rx_colorimage_left().acqtime();

        const auto &isimleft = rx_colorimage_left().getProto();
        auto imageproto_left = isimleft.getImage();
        auto ph_left = isimleft.getPinhole();
        //auto ds_left = isimleft.getDistortion();

        const auto &isimright = rx_colorimage_right().getProto();
        auto imageproto_right = isimright.getImage();
        auto ph_right = isimright.getPinhole();
        //auto ds_right = isimright.getDistortion();

        ImageConstView1ub image_left;
        if (FromProto(imageproto_left, rx_colorimage_left().buffers(), image_left))
        {
            Image1ub output_image(image_left.rows(), image_left.cols());
            Copy(image_left, output_image);

            auto output = tx_image_left().initProto();
            output.setChannels(imageproto_left.getChannels());
            output.setCols(imageproto_left.getCols());
            output.setRows(imageproto_left.getRows());
            output.setElementType(imageproto_left.getElementType());

            ToProto(std::move(output_image), output, tx_image_left().buffers());
            tx_image_left().publish(time);

            auto intr = tx_intrinsics_left().initProto();

            // initDistortion() already attaches the struct to the message,
            // so no extra setDistortion() call is needed afterwards.
            auto distortionL = intr.initDistortion();
            distortionL.setModel(DistortionProto::DistortionModel::BROWN);
            Vector5d disto;
            disto << 0, 0, 0, 0, 0;
            ToProto(disto, distortionL.getCoefficients());

            intr.setPinhole(ph_left);
            tx_intrinsics_left().publish(time);
        }
        // The right image is handled the same way.
    }

Pose set:

        Pose3d extrinsics;
        // Identity quaternion: Eigen's pointer constructor expects the
        // coefficients in (x, y, z, w) order, so identity is {0, 0, 0, 1}.
        // An all-zero quaternion is degenerate and cannot be normalized.
        double qq[4] = {0, 0, 0, 1};
        extrinsics.rotation = SO3<double>::FromQuaternion(Quaterniond(qq));
        extrinsics.translation = Vector3d(0.06, 0, 0);
        const bool bleft_T_right_camera_set = node()->pose().trySet(
            get_lhs_camera_frame(), get_rhs_camera_frame(), extrinsics, 0);
        if (!bleft_T_right_camera_set)
            printf("Pose set status: %i\n", (int)bleft_T_right_camera_set);

Okay, I see. Thanks for sharing your code, though.

Hey @Planktos

Any success on this?

Hi @arjunsengupta98
I am working on it.
I have some issues reading my IMU at 200 Hz: I’m using USB but can’t go above 20 Hz. As soon as I get past this issue I will be able to optimize SVO and improve my accuracy.
Later I will use servo motors to get a precise measurement of SVO accuracy and responsiveness in rotation and translation.
At the same time I’m working on measuring the accuracy of OpenPose with a laser pointing at a specified bone.
Thank you for checking in!


Hi again @arjunsengupta98,
I solved my issue with the IMU frequency.
I’m working on rendering a 3D skeleton based on the human pose gem and the coarse-to-fine depth gem in my Qt GUI using OpenGL, so it’s taking some time;
however the SVO output looks consistent and should work perfectly with the right parameters.
The next step is optimising the accuracy, but I have to say it’s not very obvious how to get all the camera parameters right when they are not provided by the manufacturer.
All in all I should have decent accuracy next month, and will then begin accuracy measurement and optimisation with some servo motors.
I’m also thinking of making my code open source on GitHub, but we’ll see when the time comes.
(Currently I have the gamepad, motors, human pose, and GUI working perfectly; I just need some more 3D and SVO optimisation.)
And after that it shouldn’t be long before I release my product at last! :D

Hope you are doing well,
Regards,
Planktos


Hello @Planktos

That’s great to hear! I’m good, thanks - as I hope you are too.

Sorry for the late response - I hadn’t logged into my Nvidia developer account in a while!
I won’t be working on the project anymore since I’ve left my previous company to pursue my graduate studies. I shall definitely share this thread with my former colleagues though (those continuing work on the project). You guys may benefit from connecting with each other.

I shall drop in a personal message to you as well. All the best on your project! :D

Regards,
Arjun
