Camera Sensor Raw Data-Driven Video Blur Effect Prevention: Dataset and Study
Authors: Nahli, Abdelwahed; Li, Dan; Uddin, Rahim; Raza, Tahir; Irfan, Muhammad; Lu, Qiyong; Zhang, Jian Qiu
Publisher: IEEE
Year: 2025
Journal: IEEE Access
Volume: 13
Pages: 184762–184774
ISSN: 2169-3536
DOI: https://doi.org/10.1109/ACCESS.2025.3622993
URL: https://research.utu.fi/converis/portal/detail/Publication/506150065
Abstract: Recent advances in machine vision have played an important role in addressing the challenging problem of motion blur. However, most deep learning–based deblurring methods operate in the RGB domain, rely on recursive strategies, and are often trained on unrealistic synthetic data. In this paper, we introduce a preventive solution from a new perspective, leveraging the opportunity to operate directly in the RAW domain on high-bit sensor data. Since no publicly available high-frame-rate, RAW-based blur prevention dataset exists, we construct Blurry-RAW, a novel dataset containing paired blurry and sharp frames in both RAW and RGB formats. We further propose 3D-ISPNet, a CNN–Transformer hybrid architecture, trained exclusively on RAW sensor data. This model achieves superior quantitative and qualitative performance compared to RGB-based counterparts. Moreover, by fine-tuning on data from different camera sensors, 3D-ISPNet demonstrates strong generalization across diverse hardware. Ultimately, the introduction of RAW-driven blur prevention and the new dataset paves the way for further research in this emerging direction.
Funding: This work was supported in part by Fudan University, and in part by the National Natural Science Foundation of China under Grant 12374431.