Scan localization during runtime

Hello everyone,
I’m running a holonomic Unity simulation and the robot desynchronizes from the map over time.
The initial pose estimation works fine, but not afterwards…
Do you have any advice on where my error might be?

[Screenshot: Selection_006]

Thank you
Best Markus

Hi Markus, which Isaac application with holonomic base navigation did you run? Did you implement the holonomic base model in Unity yourself (the default Unity simulation that ships with the SDK only supports a differential base)? Does the robot move in Unity? Is the movement consistent with the commands sent from the navigation app?

In general, losing localization over time indicates that the “base state” reported by the simulation does not match the actual movement of the robot in sim.
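
A quick way to check this is to compare what you publish against what the Rigidbody actually does. For example, a small debug script like the sketch below (class and field names are placeholders, not from your project) integrates the reported turn rate and compares it with the Rigidbody’s actual yaw; a wrong sign or axis shows up immediately as the two values drifting apart:

    using UnityEngine;

    // Sanity-check sketch: dead-reckon the angular speed reported in the base
    // state and compare it with the Rigidbody's actual yaw. If the two drift
    // apart while driving, the sign or axis of the reported turn rate is wrong,
    // which shows up as a slow desynchronization from the map.
    public class HeadingSanityCheck : MonoBehaviour
    {
        public Rigidbody body;       // assign the robot's Rigidbody
        float integratedHeading;     // rad, integrated from the reported turn rate
        float accumulatedYaw;        // rad, actual yaw change of the Rigidbody
        float previousYawDeg;

        void Start()
        {
            previousYawDeg = body.rotation.eulerAngles.y;
        }

        // Call once per physics step with the same angular speed you publish.
        public void Accumulate(float reportedAngularSpeed, float dt)
        {
            integratedHeading += reportedAngularSpeed * dt;

            float currentYawDeg = body.rotation.eulerAngles.y;
            // Unity yaw grows clockwise seen from above, so negate the delta to
            // compare against a counter-clockwise-positive robot heading.
            accumulatedYaw -= Mathf.DeltaAngle(previousYawDeg, currentYawDeg) * Mathf.Deg2Rad;
            previousYawDeg = currentYawDeg;

            Debug.Log($"heading from reported state: {integratedHeading:F3} rad, " +
                      $"actual yaw change: {accumulatedYaw:F3} rad");
        }
    }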


Hi qianl, thank you for your help!!

I am running the “//apps/navsim:navsim_navigate” navigation app, and I changed all differential_base references that I could find to their holonomic counterparts.
The navigation subgraph I am using is “packages/navigation/apps/holonomic_base_navigation.subgraph.json”.
I implemented my own Unity code, but mostly copied it from the differential_base:

            // Measure the current speed and acceleration in the robot frame:
            // forward speed, sideward speed, and counter-clockwise turn rate.
            Vector3 vecForward = body.rotation * Vector3.right;
            Vector3 vecSideward = body.rotation * Vector3.forward;
            Vector3 measuredSpeed = new Vector3(
                Vector3.Dot(body.velocity, vecForward),
                Vector3.Dot(body.velocity, vecSideward),
                -body.angularVelocity.y);
            Vector3 measuredAcceleration = (measuredSpeed - lastSpeed) / Time.deltaTime;
            // Low-pass filter the finite-difference acceleration to reduce physics jitter.
            lastAcceleration += TimedSmoothingFactor(Time.deltaTime, accelerationSmoothing)
                * (measuredAcceleration - lastAcceleration);

            // Publish the current base state as a 1x1x6 float64 tensor carried in a separate buffer.
            IsaacMessage.State state = new IsaacMessage.State
            {
                schema = "",
                pack = IsaacMessage.CreateTensor(IsaacMessage.ElementType.float64, new int[] { 1, 1, 6 }, 0),
                data = new double[0],
            };
            // Order: forward speed, sideward speed, angular speed, then the matching accelerations.
            double[] real_data = new double[] {
              measuredSpeed[0],
              measuredSpeed[1],
              measuredSpeed[2],
              lastAcceleration[0],
              lastAcceleration[1],
              lastAcceleration[2],
            };
            byte[][] buffers2 = new byte[1][];
            buffers2[0] = new byte[real_data.Length * sizeof(double)];
            Buffer.BlockCopy(real_data, 0, buffers2[0], 0, buffers2[0].Length);
            Publish(outputComponent, stateChannelName, state, IsaacMessage.StateProtoId, buffers2);

            // Store the current speed for calculating the acceleration at the next step.
            lastSpeed = measuredSpeed;

            // The P-controller output (tracking the commanded speed) is currently
            // bypassed; the robot is driven directly from keyboard input instead.
            (Vector3 dir, float rot) = PContoller(commandedSpeed);
            //moveRobot(dir, rot);
            Vector2 debug_move = new Vector2(movement[0], movement[2]);
            moveRobot(debug_move, rotation);

            // Clamp the linear speed to the configured maximum.
            if (body.velocity.magnitude > maxSpeed)
            {
                body.velocity = Vector3.ClampMagnitude(body.velocity, maxSpeed);
            }
        }
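
TimedSmoothingFactor is a small helper that is not shown in the snippet; judging from its use above, it returns the fraction of the newest sample to blend in each frame. For anyone who wants to reproduce the snippet, a minimal version could look like the sketch below (the original helper may be implemented differently; it lives in the same MonoBehaviour, so Mathf is available):

    // Sketch only: a frame-rate-independent low-pass blend factor. Returns the
    // fraction of the newest sample to mix in this frame; the original helper
    // may differ.
    static float TimedSmoothingFactor(float deltaTime, float smoothingTime)
    {
        if (smoothingTime <= 0.0f)
        {
            return 1.0f;  // no smoothing requested: take the newest sample as-is
        }
        // First-order low-pass: alpha = 1 - exp(-dt / tau)
        return 1.0f - Mathf.Exp(-deltaTime / smoothingTime);
    }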

The robot can be moved with commands from the navigation stack, but for testing the localization I mostly move it with the keyboard.

When I start, it kind of works at first, but over time the position “drifts” away. My understanding was that the localization would compensate for that drift using the lidar scans?
[Screenshots: Selection_016, Selection_017, Selection_018]

Update:
I got a big improvement by reducing the sigma values in isaac.flatscan_localization.ParticleFilterLocalization from:

    "initial_sigma": [0.1, 0.1, 0.1],
    "absolute_predict_sigma": [0.001, 0.001, 0.002],
    "relative_predict_sigma": [0.2, 0.2, 0.2],

to:

    "initial_sigma": [0.001, 0.001, 0.001],
    "absolute_predict_sigma": [0.00001, 0.00001, 0.00002],
    "relative_predict_sigma": [0.002, 0.002, 0.002],

It’s not perfect, but it is much better.
To be honest, I have no idea why… can anyone explain it to me?
I would have expected the opposite effect.
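
My (possibly wrong) mental model of the prediction step is roughly the sketch below. This is not the actual Isaac code, just how I picture the sigmas being used: each particle gets moved by the odometry delta plus Gaussian noise with an absolute part and a part proportional to the motion.

    using System;

    // Sketch of how I picture ParticleFilterLocalization's prediction step
    // using these sigmas (not the actual Isaac implementation).
    static class PredictStepSketch
    {
        static readonly Random rng = new Random();

        // pose and odometryDelta are (x, y, heading); the sigma arrays are per-dimension.
        public static double[] Predict(double[] pose, double[] odometryDelta,
                                       double[] absoluteSigma, double[] relativeSigma)
        {
            var predicted = new double[3];
            for (int i = 0; i < 3; i++)
            {
                // Noise grows with the observed motion plus a constant floor.
                double sigma = absoluteSigma[i] + relativeSigma[i] * Math.Abs(odometryDelta[i]);
                predicted[i] = pose[i] + odometryDelta[i] + sigma * Gaussian();
            }
            return predicted;
        }

        // Standard normal sample via the Box-Muller transform.
        static double Gaussian()
        {
            double u1 = 1.0 - rng.NextDouble();
            double u2 = rng.NextDouble();
            return Math.Sqrt(-2.0 * Math.Log(u1)) * Math.Sin(2.0 * Math.PI * u2);
        }
    }

With that picture, smaller sigmas just keep the particles packed more tightly around the odometry estimate, so I would have expected the scan matching to have less room to correct errors, not more.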

Thank you
Markus

Hi Markus,

Can you see the odometry values in Sight? Open http://localhost:3000/, filter the channels on the left by “state”, then right-click and plot. Do they look correct as you move the robot? Also, is the “use_imu” parameter set to false? You can check this in Sight by filtering in the top right.

Best,
Oguz


Hi Oguz,

Yes, I can see the values in Sight. After some movement I can also see the desynchronization there.
The orientation according to Unity is 180°, while the heading value in Sight is 2.98 rad (≈171°), so there is a mismatch of roughly 9° (see picture).
“use_imu” is turned off.
Thank you for your help


Best,
Markus

Thanks for the info. Would you mind attaching the files that you have changed so that we can reproduce and debug?


Sure, thank you for helping!
I hope Google Drive is ok; the forum uploader won’t let me upload these files.
Included are:

  • the Unity script which I use to move the robot (WASD + QE) and to report back on the base_state path (it is attached to the Rigidbody of my robot)
  • the navsim_navigate_app.json, which I changed to use the holonomic modules
  • the holonomic_base_navigation.subgraph.json, where I changed the sigma values

https://drive.google.com/drive/folders/1_fsIAvO4HfksceBd3zhW2YumRWlCqxAd?usp=sharing

I hope this helps.

Thanks for the files. They made me realize we forgot to ask what version of the SDK and simulator you are using. Would you mind trying 2020.1 from https://developer.nvidia.com/isaac/downloads?


I’m currently using version 2019.3 (SDK and Sim) and Unity version 2018.3.11 on Ubuntu 18.04.

Sure, I will try 2020.1 and report back!

In 2020.1 I can’t detect any noteworthy desynchronization anymore (with the default values).
Thank you!