FlowCalib: LiDAR-to-Vehicle Miscalibration Detection using Scene Flows


Accurate sensor-to-vehicle calibration is essential for safe autonomous driving. Angular misalignments of LiDAR sensors can lead to safety-critical issues during autonomous operation. However, current methods primarily focus on correcting sensor-to-sensor errors without considering the miscalibration of the individual sensors that causes these errors in the first place. We introduce FlowCalib, the first framework that detects LiDAR-to-vehicle miscalibration using motion cues from the scene flow of static objects. Our approach leverages the systematic bias induced by rotational misalignment in the flow field computed from sequential 3D point clouds, eliminating the need for additional sensors. The architecture integrates a neural scene flow prior for flow estimation and a dual-branch detection network that fuses learned global flow features with handcrafted geometric descriptors. These combined representations allow the system to perform two complementary binary classification tasks: a global decision indicating whether misalignment is present, and axis-specific decisions indicating whether each rotational axis is misaligned. Experiments on the nuScenes dataset demonstrate FlowCalib's ability to robustly detect miscalibration, establishing a benchmark for sensor-to-vehicle miscalibration detection.


💡 Research Summary

Accurate extrinsic calibration between a LiDAR sensor and the vehicle frame (S2V calibration) is a prerequisite for safe autonomous driving, yet most existing work focuses on sensor-to-sensor (S2S) alignment and ignores whether an individual sensor itself is miscalibrated. This paper introduces FlowCalib, the first framework that detects LiDAR-to-vehicle rotational miscalibration solely from the motion cues present in the 3D scene flow of static objects. The key observation is that a rotational error introduces a systematic, distance-dependent bias in the apparent motion of points belonging to static scene elements. When the vehicle moves straight, a correctly calibrated LiDAR yields flow vectors consistent with the ego-motion; a yaw, pitch, or roll error rotates these vectors by a constant angle, and the resulting positional displacement grows with the distance to the object. This bias can be captured without any additional sensors such as an IMU, GPS, or cameras.
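The observation above can be illustrated with a toy model (a sketch, not the paper's implementation): static points seen from a vehicle driving straight should have zero ego-compensated flow, but a small yaw error baked into the assumed extrinsic leaves an identical residual vector on every static point, while the rotation's positional effect on each point grows linearly with its range. The identity extrinsic, the 1° yaw error, and the point ranges are all assumed for illustration.

```python
import numpy as np

def rot_z(yaw):
    """Rotation about the z-axis (yaw), angle in radians."""
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

rng = np.random.default_rng(0)

# Static scene points in the (true) sensor frame at time t0.
pts_t0 = rng.uniform(-60.0, 60.0, size=(500, 3))
pts_t0[:, 2] = rng.uniform(-2.0, 2.0, size=500)   # keep points near ground level

t_ego = np.array([1.0, 0.0, 0.0])   # straight 1 m forward motion between sweeps

# Observed points at t1 in the sensor frame: static world, moving sensor.
pts_t1 = pts_t0 - t_ego

# Ego-compensated scene flow with a *correct* extrinsic (identity here):
flow_good = (pts_t1 + t_ego) - pts_t0             # ~0 for static points

# With a 1-degree yaw miscalibration R_err in the assumed extrinsic, both
# sweeps are mapped to the vehicle frame through the wrong rotation:
R_err = rot_z(np.deg2rad(1.0))
flow_bad = (R_err @ pts_t1.T).T + t_ego - (R_err @ pts_t0.T).T

# Every static point carries the same systematic residual (I - R_err) @ t_ego:
print(flow_bad[0], flow_bad[1])

# ...while the per-point *position* error grows linearly with horizontal range:
pos_err = np.linalg.norm((R_err @ pts_t0.T).T - pts_t0, axis=1)
horiz_range = np.linalg.norm(pts_t0[:, :2], axis=1)
print(np.corrcoef(pos_err, horiz_range)[0, 1])    # ~1.0: linear in distance
```

Because the residual direction and magnitude are shared across all static points, it forms exactly the kind of systematic, learnable cue a flow-based detector can exploit.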

FlowCalib’s pipeline consists of four stages. First, miscalibration is simulated on the pre‑calibrated nuScenes dataset by injecting uniform random rotations in the range
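A minimal sketch of this simulation step, assuming an illustrative ±2° per-axis range (the exact range is truncated in this excerpt) and an identity placeholder for the nuScenes LiDAR-to-vehicle extrinsic; the per-axis on/off mask is likewise an assumption, included so that each sample carries both the global and the axis-wise binary labels described earlier:

```python
import numpy as np

def euler_zyx_to_R(yaw, pitch, roll):
    """Compose a rotation from yaw (z), pitch (y), roll (x), in radians."""
    cz, sz = np.cos(yaw), np.sin(yaw)
    cy, sy = np.cos(pitch), np.sin(pitch)
    cx, sx = np.cos(roll), np.sin(roll)
    Rz = np.array([[cz, -sz, 0.0], [sz, cz, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cy, 0.0, sy], [0.0, 1.0, 0.0], [-sy, 0.0, cy]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cx, -sx], [0.0, sx, cx]])
    return Rz @ Ry @ Rx

def inject_miscalibration(T_sensor_to_vehicle, max_deg=2.0, rng=None):
    """Perturb the rotational part of a 4x4 extrinsic with uniform random
    yaw/pitch/roll errors drawn from [-max_deg, +max_deg] degrees.

    Returns the perturbed extrinsic and per-axis labels (True = that axis
    was perturbed); max_deg=2.0 is a placeholder, not the paper's value."""
    if rng is None:
        rng = np.random.default_rng()
    angles = np.deg2rad(rng.uniform(-max_deg, max_deg, size=3))
    # Randomly disable axes so some samples are clean on a given axis
    # (an assumed sampling scheme, to produce varied axis-wise labels).
    mask = rng.random(3) < 0.5
    angles = angles * mask
    R_err = euler_zyx_to_R(*angles)
    T_bad = T_sensor_to_vehicle.copy()
    T_bad[:3, :3] = R_err @ T_bad[:3, :3]   # rotation perturbed, translation kept
    return T_bad, mask

T = np.eye(4)  # placeholder LiDAR-to-vehicle extrinsic
T_bad, labels = inject_miscalibration(T, rng=np.random.default_rng(42))
print(labels, bool(np.any(labels)))  # axis-wise labels and the global label
```

The global label is simply the OR over the three axis-wise labels, matching the two complementary classification targets the detection network is trained on.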

