A1 Refereed original research article in a scientific journal
Emotions amplify speaker-listener neural alignment
Authors: Dmitry Smirnov, Heini Saarimäki, Enrico Glerean, Riitta Hari, Mikko Sams, Lauri Nummenmaa
Publisher: WILEY
Publication year: 2019
Journal: Human Brain Mapping
Journal name in source: HUMAN BRAIN MAPPING
Journal acronym: HUM BRAIN MAPP
Volume: 40
First page: 4777
Last page: 4788
Number of pages: 12
ISSN: 1065-9471
eISSN: 1097-0193
DOI: https://doi.org/10.1002/hbm.24736
Abstract
Individuals often align their emotional states during conversation. Here, we reveal how such emotional alignment is reflected in synchronization of brain activity across speakers and listeners. Two "speaker" subjects told emotional and neutral autobiographical stories while their hemodynamic brain activity was measured with functional magnetic resonance imaging (fMRI). The stories were recorded and played back to 16 "listener" subjects during fMRI. After scanning, both speakers and listeners rated the moment-to-moment valence and arousal of the stories. Time-varying similarity of the blood-oxygenation-level-dependent (BOLD) time series was quantified by intersubject phase synchronization (ISPS) between speaker-listener pairs. Telling and listening to the stories elicited similar emotions across speaker-listener pairs. Arousal was associated with increased speaker-listener neural synchronization in brain regions supporting attentional, auditory, somatosensory, and motor processing. Valence was associated with increased speaker-listener neural synchronization in brain regions involved in emotional processing, including amygdala, hippocampus, and temporal pole. Speaker-listener synchronization of subjective feelings of arousal was associated with increased neural synchronization in somatosensory and subcortical brain regions; synchronization of valence was associated with neural synchronization in parietal cortices and midline structures. We propose that emotion-dependent speaker-listener neural synchronization is associated with emotional contagion, thereby implying that listeners reproduce some aspects of the speaker's emotional state at the neural level.
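The abstract quantifies speaker-listener coupling with intersubject phase synchronization (ISPS) of BOLD time series. The sketch below illustrates one common way such a pairwise phase-synchronization measure can be computed (band-pass filter, Hilbert transform, phase comparison); the function names, filter band, and similarity formula are illustrative assumptions, not the authors' exact analysis pipeline.

```python
# Illustrative sketch of pairwise phase synchronization between a
# speaker's and a listener's BOLD time series. Assumptions: a
# Butterworth band-pass in a slow BOLD band, Hilbert-transform phase
# extraction, and a phase-difference similarity score in [0, 1].
import numpy as np
from scipy.signal import butter, filtfilt, hilbert


def bandpass(ts, low_hz, high_hz, tr):
    """Band-pass filter a BOLD time series sampled at 1/TR Hz."""
    nyquist = 0.5 / tr
    b, a = butter(2, [low_hz / nyquist, high_hz / nyquist], btype="band")
    return filtfilt(b, a, ts)


def pairwise_phase_sync(speaker_ts, listener_ts, tr=2.0,
                        low_hz=0.04, high_hz=0.07):
    """Time-varying phase similarity between two BOLD time series.

    Returns one value per time point: 1 = identical instantaneous
    phase, 0 = opposite phase. The frequency band is an assumption.
    """
    phases = []
    for ts in (speaker_ts, listener_ts):
        filtered = bandpass(ts - ts.mean(), low_hz, high_hz, tr)
        phases.append(np.angle(hilbert(filtered)))
    phase_diff = phases[0] - phases[1]
    # Map the phase difference onto a 0..1 similarity score.
    return 1.0 - np.abs(np.sin(phase_diff / 2.0))


# Usage with synthetic data: two noisy copies of one slow signal.
rng = np.random.default_rng(0)
t = np.arange(300) * 2.0                     # 300 volumes, TR = 2 s
shared = np.sin(2 * np.pi * 0.05 * t)
speaker = shared + 0.5 * rng.standard_normal(t.size)
listener = shared + 0.5 * rng.standard_normal(t.size)
sync = pairwise_phase_sync(speaker, listener)
print(sync.mean())                           # high when signals share phase
```

In an analysis like the one described, such a synchronization time course would be computed per voxel for each speaker-listener pair and then related to the moment-to-moment valence and arousal ratings.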