Integrating wearable sensor data and self-reported diaries for personalized affect forecasting

Yang Zhongqi, Wang Yuning, Yamashita Ken S., Khatibi Elahe, Azimi Iman, Dutt Nikil, Borelli Jessica L., Rahmani Amir M.

Publisher: Elsevier

Publication year: 2024

Journal: Smart Health

Article number: 100464

Volume: 32

ISSN: 2352-6483

eISSN: 2352-6491

DOI: https://doi.org/10.1016/j.smhl.2024.100464


https://research.utu.fi/converis/portal/detail/Publication/387507382



Emotional states, as indicators of affect, are pivotal to overall health, making their accurate prediction before onset crucial. Current studies center primarily on immediate, short-term affect detection from wearable and mobile device data, typically focusing on objective sensory measures while neglecting self-reported information such as diaries and notes. In this paper, we propose a multimodal deep learning model for forecasting affect status. The model combines a transformer encoder with a pre-trained language model, enabling integrated analysis of objective metrics and self-reported diaries. To validate the model, we conduct a year-long longitudinal study of college students, collecting an extensive dataset that includes physiological, environmental, sleep, metabolic, and physical activity parameters alongside open-ended textual diaries provided by the participants. Our results demonstrate that the proposed model achieves a predictive accuracy of 82.50% for positive affect and 82.76% for negative affect a full week in advance. The model's effectiveness is further enhanced by its explainability.
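As a rough illustration of the architecture outlined above (a hedged sketch, not the authors' released code: the pre-trained model name distilbert-base-uncased, the feature dimensions, the mean/first-token pooling, and the concatenation-based fusion are assumptions made for this example), a multimodal forecaster of this kind could pair a transformer encoder over the wearable-sensor time series with a pre-trained language model over the diary text, then classify future affect from the fused representation:

# Illustrative sketch only -- not the authors' implementation. Model choice,
# dimensions, pooling, and fusion scheme are assumptions for demonstration.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer


class AffectForecaster(nn.Module):
    def __init__(self, n_sensor_features=16, d_model=128,
                 lm_name="distilbert-base-uncased", n_classes=2):
        super().__init__()
        # Project per-timestep sensor readings (heart rate, sleep, activity, ...)
        # into the transformer's model dimension.
        self.sensor_proj = nn.Linear(n_sensor_features, d_model)
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=4,
                                           dim_feedforward=256, batch_first=True)
        self.sensor_encoder = nn.TransformerEncoder(layer, num_layers=2)
        # Pre-trained language model encodes the self-reported diary text.
        self.lm = AutoModel.from_pretrained(lm_name)
        lm_dim = self.lm.config.hidden_size
        # Late fusion: concatenate pooled sensor and diary representations.
        self.classifier = nn.Sequential(nn.Linear(d_model + lm_dim, 128),
                                        nn.ReLU(),
                                        nn.Linear(128, n_classes))

    def forward(self, sensor_seq, input_ids, attention_mask):
        # sensor_seq: (batch, time_steps, n_sensor_features)
        h_sensor = self.sensor_encoder(self.sensor_proj(sensor_seq)).mean(dim=1)
        # Use the first-token ([CLS]-style) embedding of the diary text.
        h_text = self.lm(input_ids=input_ids,
                         attention_mask=attention_mask).last_hidden_state[:, 0]
        return self.classifier(torch.cat([h_sensor, h_text], dim=-1))


if __name__ == "__main__":
    tok = AutoTokenizer.from_pretrained("distilbert-base-uncased")
    model = AffectForecaster()
    diary = ["Slept badly before the exam, stressed but hopeful."]
    enc = tok(diary, return_tensors="pt", padding=True, truncation=True)
    sensors = torch.randn(1, 7 * 24, 16)   # e.g. one week of hourly readings
    logits = model(sensors, enc["input_ids"], enc["attention_mask"])
    print(logits.shape)                     # (1, 2) affect-class logits

Since the paper forecasts both positive and negative affect a week ahead, a sketch like this would presumably be trained per target (or with two output heads); those details, like the explainability component, are not specified in the abstract.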
