Motion-Robust Multimodal Fusion of PPG and Accelerometer Signals for Three-Class Heart Rhythm Classification




Zhao, Yangyang; Kaisti, Matti; Lahdenoja, Olli; Koivisto, Tero

Beigl, Michael; Jacucci, Giulio; Sigg, Stephan; Xiao, Yu; Bardram, Jakob; Tsiropoulou, Eirini Eleni; Xu, Chenren

ACM International Joint Conference on Pervasive and Ubiquitous Computing

2025

UbiComp Companion '25: Companion of the 2025 ACM International Joint Conference on Pervasive and Ubiquitous Computing

171–175

979-8-4007-1477-1

DOI: https://doi.org/10.1145/3714394.3754412

https://research.utu.fi/converis/portal/detail/Publication/500360183

https://arxiv.org/abs/2511.00949

ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp) / ACM International Symposium on Wearable Computers (ISWC)



Atrial fibrillation (AF) is a leading cause of stroke and mortality, particularly in elderly patients. Wrist-worn photoplethysmography (PPG) enables non-invasive, continuous rhythm monitoring, yet it is highly vulnerable to motion artifacts and physiological noise. Many existing approaches rely solely on single-channel PPG and are limited to binary AF detection, often failing to capture the broader range of arrhythmias encountered in clinical settings. We introduce RhythmiNet, a residual neural network enhanced with temporal and channel attention modules, which jointly leverages PPG and accelerometer (ACC) signals. The model performs three-class rhythm classification: AF, sinus rhythm (SR), and Other. To assess robustness across varying movement conditions, test data are stratified by accelerometer-based motion-intensity percentiles without excluding any segments. RhythmiNet achieved a 4.3% improvement in macro-AUC over the PPG-only baseline and outperformed a logistic regression model built on handcrafted heart rate variability (HRV) features by 12%, highlighting the benefit of multimodal fusion and attention-based learning on noisy, real-world clinical data.
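To make the abstract's ingredients concrete, the sketch below shows one plausible PyTorch realization of them: early channel-level fusion of single-channel PPG with tri-axial ACC, residual 1-D convolutional blocks augmented with channel and temporal attention, a three-class output head, and percentile-based motion-intensity bucketing of test segments. This is a minimal illustration under stated assumptions, not the authors' implementation; all module designs, layer sizes, and names (RhythmiNetSketch, motion_percentile_bins, the assumed sampling rate and window length) are hypothetical.

# Minimal sketch (not the authors' code): 1-D residual blocks with channel
# and temporal attention over fused PPG + ACC inputs, plus percentile-based
# motion stratification. All sizes, designs, and names are assumptions.
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """Squeeze-and-excitation-style gating over feature channels."""
    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):                     # x: (batch, channels, time)
        w = self.fc(x.mean(dim=-1))           # global average pool over time
        return x * w.unsqueeze(-1)            # re-weight each channel


class TemporalAttention(nn.Module):
    """Convolutional sigmoid gate emphasizing informative time steps."""
    def __init__(self, channels: int):
        super().__init__()
        self.score = nn.Conv1d(channels, 1, kernel_size=7, padding=3)

    def forward(self, x):                     # x: (batch, channels, time)
        return x * torch.sigmoid(self.score(x))


class ResidualBlock(nn.Module):
    def __init__(self, in_ch: int, out_ch: int, stride: int = 2):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(in_ch, out_ch, 7, stride=stride, padding=3),
            nn.BatchNorm1d(out_ch),
            nn.ReLU(inplace=True),
            nn.Conv1d(out_ch, out_ch, 7, padding=3),
            nn.BatchNorm1d(out_ch),
        )
        self.attn = nn.Sequential(ChannelAttention(out_ch), TemporalAttention(out_ch))
        self.skip = nn.Conv1d(in_ch, out_ch, 1, stride=stride)  # match shapes
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.act(self.attn(self.conv(x)) + self.skip(x))


class RhythmiNetSketch(nn.Module):
    """PPG (1 ch) and tri-axial ACC (3 ch) fused at the input; 3-class head."""
    def __init__(self, n_classes: int = 3):
        super().__init__()
        self.body = nn.Sequential(
            ResidualBlock(4, 32), ResidualBlock(32, 64), ResidualBlock(64, 128)
        )
        self.head = nn.Linear(128, n_classes)  # logits for AF / SR / Other

    def forward(self, ppg, acc):              # ppg: (B, 1, T), acc: (B, 3, T)
        x = torch.cat([ppg, acc], dim=1)      # early channel-level fusion
        return self.head(self.body(x).mean(dim=-1))


def motion_percentile_bins(acc: torch.Tensor, n_bins: int = 4) -> torch.Tensor:
    """Bucket test segments by motion intensity (std of the ACC magnitude),
    mirroring the percentile-based stratification the abstract describes."""
    intensity = acc.norm(dim=1).std(dim=-1)   # one scalar per segment
    edges = torch.quantile(intensity, torch.linspace(0.0, 1.0, n_bins + 1))
    return torch.bucketize(intensity, edges[1:-1])  # bin index 0..n_bins-1


# Shape check with an arbitrary 60 s window at an assumed 32 Hz sampling rate:
# logits = RhythmiNetSketch()(torch.randn(8, 1, 1920), torch.randn(8, 3, 1920))

The squeeze-and-excitation gate and the convolutional sigmoid gate are stand-ins for the attention modules named in the abstract; the published model's actual attention formulations, depth, and training details may differ.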


This study was funded by the Moore4Medical project, supported by the ECSEL JU and Business Finland (Grant Agreements H2020-ECSEL-2019-IA-876190 and 7215/31/2019), and by the ITEA project RM4HEALTH, supported by Business Finland (Grant 8139/31/2022).

