A1 Peer-reviewed original article in a scientific journal

Integrating wearable sensor data and self-reported diaries for personalized affect forecasting




Authors: Yang Zhongqi, Wang Yuning, Yamashita Ken S., Khatibi Elahe, Azimi Iman, Dutt Nikil, Borelli Jessica L., Rahmani Amir M.

Publisher: Elsevier

Publication year: 2024

Journal: Smart Health

Journal name in database: Smart Health

Article number: 100464

Volume: 32

ISSN: 2352-6483

eISSN: 2352-6491

DOI: https://doi.org/10.1016/j.smhl.2024.100464

Web address: https://doi.org/10.1016/j.smhl.2024.100464

Self-archived copy's web address: https://research.utu.fi/converis/portal/detail/Publication/387507382


Abstract
Emotional states, as indicators of affect, are pivotal to overall health, making their accurate prediction before onset crucial. Current studies are primarily centered on immediate short-term affect detection using data from wearable and mobile devices. These studies typically focus on objective sensory measures and often neglect other forms of self-reported information, such as diaries and notes. In this paper, we propose a multimodal deep learning model for affect status forecasting. The model combines a transformer encoder with a pre-trained language model, enabling the integrated analysis of objective metrics and self-reported diaries. To validate the model, we conduct a longitudinal study in which college students are enrolled and monitored over a year, collecting an extensive dataset of physiological, environmental, sleep, metabolic, and physical activity parameters, alongside open-ended textual diaries provided by the participants. Our results demonstrate that the proposed model achieves a predictive accuracy of 82.50% for positive affect and 82.76% for negative affect, a full week in advance. The effectiveness of our model is further enhanced by its explainability.
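
The abstract describes the model only at a high level. As a rough illustration of the general approach (a minimal sketch, not the authors' implementation), the following PyTorch code fuses a transformer encoder over a daily sensor-feature sequence with a pre-trained language model over diary text and classifies affect status; the module names, dimensions, and the distilbert-base-uncased checkpoint are illustrative assumptions.

import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class AffectForecaster(nn.Module):
    """Hypothetical multimodal forecaster: sensor time series + diary text."""

    def __init__(self, n_sensor_features=16, d_model=64, n_classes=2,
                 lm_name="distilbert-base-uncased"):
        super().__init__()
        # Transformer encoder for the objective sensor stream
        # (physiology, sleep, activity, etc.), one timestep per day.
        self.sensor_proj = nn.Linear(n_sensor_features, d_model)
        enc_layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=4,
                                               batch_first=True)
        self.sensor_encoder = nn.TransformerEncoder(enc_layer, num_layers=2)
        # Pre-trained language model for the self-reported diary text.
        self.lm = AutoModel.from_pretrained(lm_name)
        lm_dim = self.lm.config.hidden_size
        # Fusion head: concatenate pooled sensor and text embeddings,
        # then classify affect status (e.g., high vs. low) for the next week.
        self.head = nn.Linear(d_model + lm_dim, n_classes)

    def forward(self, sensor_seq, input_ids, attention_mask):
        # Mean-pool the encoded sensor sequence over the time dimension.
        s = self.sensor_encoder(self.sensor_proj(sensor_seq)).mean(dim=1)
        # Use the first-token embedding as a summary of the diary text.
        t = self.lm(input_ids=input_ids,
                    attention_mask=attention_mask).last_hidden_state[:, 0]
        return self.head(torch.cat([s, t], dim=-1))

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AffectForecaster()
sensors = torch.randn(2, 7, 16)  # 7 days of 16 sensor features per participant
texts = tokenizer(["slept badly, stressful week", "calm day, went for a run"],
                  return_tensors="pt", padding=True, truncation=True)
logits = model(sensors, texts["input_ids"], texts["attention_mask"])
print(logits.shape)  # torch.Size([2, 2])

In a setup like this, the head would be trained with a standard cross-entropy loss against affect labels reported one week after the input window; the actual architecture, features, and training procedure are those described in the article.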

Downloadable publication

This is an electronic reprint of the original article.
This reprint may differ from the original in pagination and typographic detail. Please cite the original version.




