DRIVE OS Version: 6.0.10
Issue Description:
Hi everyone,
I’m trying to use the DriveWorks LiDAR self-calibration sample with a Hesai AT128P (custom LiDAR integration), and I’m currently stuck with the calibration status never progressing.
Goal / setup
I want to run the LiDAR self-calibration tool for a Hesai AT128P (custom LiDAR), i.e. use the calibration engine/routine with a LiDAR type that is not natively supported by the calibration module.
For that, I had to adapt the NVIDIA sample code. One adaptation is that I spoof the LiDAR device string inside m_lidarPropertiesCalib (declared as dwLidarProperties m_lidarPropertiesCalib{}); see the code below.
Also, I use CAN-only egomotion (DW_EGOMOTION_ODOMETRY), no IMU yet.
What already works
All other samples/tools apart from the calibration sample work:
- Replay / recording (original NVIDIA Driveworks samples) pipeline works
- Point cloud processing works (I adapted the pointcloudprocessing sample)
Inside this customized calibration sample:
- Egomotion works (CAN-only, DW_EGOMOTION_ODOMETRY, no IMU yet); I feed VehicleIO states from CAN
- I can compute and use egomotion transforms (not the root issue here)
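For completeness, the egomotion setup looks roughly like this. This is a sketch, not my exact code; the odometry-feeding calls follow the DriveWorks egomotion sample and the exact function/enum names may differ slightly in 6.0.10:

```cpp
// Sketch of the CAN-only egomotion setup (assumption: call names as in the
// DriveWorks egomotion sample; may differ slightly per SDK version).
dwEgomotionParameters egoParams{};
egoParams.motionModel = DW_EGOMOTION_ODOMETRY; // CAN-only, no IMU
CHECK_DW_ERROR(dwEgomotion_initialize(&m_egomotion, &egoParams, m_context));

// Per CAN frame: feed speed / steering derived from the VehicleIO state,
// then advance the motion model.
CHECK_DW_ERROR(dwEgomotion_addOdometry(DW_EGOMOTION_MEASUREMENT_VELOCITY,
                                       speed, timestamp, m_egomotion));
CHECK_DW_ERROR(dwEgomotion_addOdometry(DW_EGOMOTION_MEASUREMENT_STEERINGANGLE,
                                       steeringAngle, timestamp, m_egomotion));
CHECK_DW_ERROR(dwEgomotion_update(timestamp, m_egomotion));
```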
Observed problem
- m_updatedLidarToRig remains equal to the nominal transform (m_lidarInfo.nominalSensorToRig)
- therefore m_correctionR and m_correctionT also remain zero/identity (expected consequence)
- more importantly: the calibration status (dwCalibrationStatus m_status{}) stays at 0.00% / “Not Accepted” the whole time
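For reference, I read the status and transform roughly like this each frame (following the original sample; a sketch, and the exact getter names may vary by DriveWorks version):

```cpp
// Sketch (following the original sample): poll the calibration routine.
CHECK_DW_ERROR(dwCalibrationEngine_getCalibrationStatus(&m_status,
                                                        m_calibRoutine,
                                                        m_calibEngine));
CHECK_DW_ERROR(dwCalibrationEngine_getSensorToRigTransformation(&m_updatedLidarToRig,
                                                                m_calibRoutine,
                                                                m_calibEngine));
// In my runs m_status.percentageComplete stays at 0.0f, the state never
// leaves "not accepted", and m_updatedLidarToRig equals the nominal transform.
```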
So I suspect the problem is earlier in the calibration pipeline (initialization / calibration routine setup), not just output interpretation.
Suspected area: calibration routine initialization (dwCalibrationEngine_initializeLidar)
This part succeeds (no error code), but I suspect the routine is internally not configured correctly for my custom LiDAR.
Calibration engine init (no error; exactly as in the original DriveWorks LiDAR self-calibration sample):
CHECK_DW_ERROR_MSG(dwCalibrationEngine_initialize(&m_calibEngine, m_rigConfig, m_context),
"Error: Initialize calibration engine failed.");
Calibration routine init (with LiDAR spoofing)
Originally this failed because the calibration module did not support the Hesai LiDAR type.
To work around this, I spoofed the LiDAR device string to a supported Velodyne type:
std::strncpy(reinterpret_cast<char*>(m_lidarPropertiesCalib.deviceString),
             "VELO_HDL64E",
             sizeof(m_lidarPropertiesCalib.deviceString) - 1);
m_lidarPropertiesCalib.deviceString[sizeof(m_lidarPropertiesCalib.deviceString) - 1] = '\0';
dwCalibrationLidarParams lidarCalibrationParams{};
lidarCalibrationParams.lidarProperties = &m_lidarPropertiesCalib;
CHECK_DW_ERROR_MSG(dwCalibrationEngine_initializeLidar(&m_calibRoutine,
                                                       m_lidarInfo.sensorId,
                                                       m_canIndex,
                                                       &lidarCalibrationParams,
                                                       cudaStreamDefault,
                                                       m_calibEngine),
                   "Error: Initialize LiDAR calibration routine failed.");
I printed the content of dwLidarProperties (the struct passed into dwCalibrationLidarParams::lidarProperties) and noticed clear discrepancies compared to what I expect for my Hesai AT128P recording. That raises the question whether the calibration routine (m_calibRoutine) is initialized successfully but with incorrect LiDAR model assumptions, causing calibration to never converge / never progress.
Observed vs. expected fields of dwLidarProperties
| Field (dwLidarProperties) | Observed (logged) | Expected (Hesai AT128P, Single Return @ 10 Hz) | Assessment / Comment |
|---|---|---|---|
| deviceString | VELO_HDL64E (spoofed!) | VELO_HDL64E (spoofed!) | Intentionally spoofed |
| spinFrequency | 10 Hz | 10 Hz | ✅ matches |
| numberOfRows | 128 | 128 | ✅ matches |
| pointsPerPacket | 256 | 256 | ✅ matches (2 blocks × 128 channels) |
| pointsPerSpin | 320000 | 153600 | ❌ much too high (~2.08×) |
| pointsPerSecond | 3200000 | 1536000 | ❌ much too high (~2.08×) |
| packetsPerSpin | 1250 | 600 | ❌ much too high (~2.08×) |
| packetsPerSecond | 12 | 6000 | ❌ far too low (even internally inconsistent: 1250 packets/spin × 10 Hz would be 12500) |
| horizontalFOVStart | 0.5 rad | convention-dependent | ❌ |
| horizontalFOVEnd | 2.8 rad | convention-dependent | ❌ |
| horizontal FOV span (end − start) | 2.3 rad ≈ 131.8° | about 120° (slightly more in practice) | ❌ |
| verticalFOVStart | −0.216697 rad (≈ −12.42°) | about −12.47° (≈ −0.21764 rad) | ✅ very close |
| verticalFOVEnd | 0.224673 rad (≈ +12.87°) | about +12.93° (≈ 0.22567 rad) | ✅ very close |
| validAuxInfos | 0x0 | unknown / mode-dependent | ⚠️ |
| lidarSSISizeInBytes | 0 | unknown / sensor-dependent | ⚠️ |
| availableReturns | ANY | unknown / mode-dependent | ⚠️ |
| horizontalAngles[] min/max | 0 / 0 | delta should be around 0.1° | ❌ arrays empty |
| verticalAngles[] min/max | 0 / 0 | −12.42° / +12.87° would normally be expected | ❌ arrays empty |
Questions
- What exactly from the rig file is used by dwCalibrationEngine_initialize(&m_calibEngine, m_rigConfig, m_context)?
- For the LiDAR self-calibration routine:
  - Which fields of dwLidarProperties are actually relevant for calibration?
  - If the fields of dwLidarProperties are not all correct, can that cause the calibration status to stay at 0% (“Not Accepted”) without explicit errors?
- Can the fields of dwLidarProperties be overridden/set manually, and if so, how?
- Or is it more likely that the issue is somewhere else?