A Sensor-Aware Phenomenological Framework for LiDAR Degradation Simulation and SLAM Robustness Evaluation
Authors: Doumegna, Mawuto Koudjo Felix; Yu, Xianjia; Zou, Zhuo; Westerlund, Tomi
Publisher: IEEE
Year: 2026
Journal: IEEE Robotics and Automation Practice
: 1
: 86
: 91
: 2995-4304
DOI: https://doi.org/10.1109/RAP.2026.3684773
URL: https://ieeexplore.ieee.org/document/11482742
Record: https://research.utu.fi/converis/portal/detail/Publication/523284320
Abstract: Light detection and ranging (LiDAR)-based simultaneous localization and mapping (SLAM) systems are highly sensitive to adverse conditions such as occlusion, noise, and field-of-view (FoV) degradation, yet existing robustness evaluation methods either lack physical grounding or do not capture sensor-specific behavior. This article presents a sensor-aware phenomenological framework for simulating interpretable LiDAR degradations directly on real point clouds, enabling controlled and reproducible SLAM stress testing. Unlike image-derived corruption benchmarks (e.g., SemanticKITTI-C) or simulation-only approaches (e.g., LiDARSim), the proposed system preserves per-point geometry, intensity, and temporal structure while applying structured dropout, FoV reduction, Gaussian noise, occlusion masking, sparsification, and motion distortion. The framework features autonomous topic and sensor detection, a modular configuration with four predefined severity tiers (light–extreme), and real-time performance (< 5 ms per frame for solid-state LiDAR and < 20 ms for dense, wide-FoV spinning LiDAR). The implementation is Docker-containerized and compatible with robot operating system (ROS) workflows. Experimental validation across three LiDAR models and five state-of-the-art SLAM systems reveals distinct patterns of robustness shaped by sensor design and environmental context. The open-source implementation provides a practical foundation for benchmarking LiDAR-based SLAM under physically meaningful degradation scenarios.
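To make the idea concrete, the sketch below applies two of the listed degradations, Gaussian range noise and sparsification, to an (N, 4) point cloud of [x, y, z, intensity] rows while preserving per-point intensity. This is not the authors' implementation: the severity presets (light–extreme) mirror the tier names from the abstract, but the noise and keep-ratio values are illustrative assumptions, and the function name `degrade` is hypothetical.

```python
import numpy as np

# Hypothetical severity tiers named after the paper's light-extreme presets;
# the numeric values are illustrative assumptions, not the calibrated ones.
SEVERITY = {
    "light":   {"noise_std": 0.01, "keep_ratio": 0.9},
    "medium":  {"noise_std": 0.03, "keep_ratio": 0.7},
    "heavy":   {"noise_std": 0.05, "keep_ratio": 0.5},
    "extreme": {"noise_std": 0.10, "keep_ratio": 0.3},
}

def degrade(points, severity="medium", rng=None):
    """Apply Gaussian range noise + random sparsification to an (N, 4)
    cloud of [x, y, z, intensity] rows, preserving intensity values."""
    rng = np.random.default_rng(rng)
    cfg = SEVERITY[severity]
    xyz, rest = points[:, :3].copy(), points[:, 3:]
    # Perturb each point along its ray direction so the error behaves
    # like range noise rather than isotropic position noise.
    ranges = np.linalg.norm(xyz, axis=1, keepdims=True)
    dirs = np.divide(xyz, ranges, out=np.zeros_like(xyz), where=ranges > 0)
    xyz += dirs * rng.normal(0.0, cfg["noise_std"], size=(len(xyz), 1))
    # Sparsify: keep a random subset of points at the tier's keep ratio.
    keep = rng.random(len(xyz)) < cfg["keep_ratio"]
    return np.hstack([xyz, rest])[keep]
```

In a ROS pipeline such a function would sit between deserializing a PointCloud2 message and republishing it on the degraded topic; structured dropout, occlusion masking, and motion distortion would follow the same per-frame pattern.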
Funding: This work was supported by the Research Council of Finland’s Digital Waters flagship under Grant 359247.