Zhang, Z., Chen, T., Liu, Y., Wang, C., Zhao, K., Liu, C. H. and Fu, X., 2023. Decoding the Temporal Representation of Facial Expression in Face-selective Regions. NeuroImage, 283, 120442.
Full text available as:
PDF (OPEN ACCESS ARTICLE)
Zhang et al 2023 Decoding the Temporal Representation of Facial Expression in Face-selective Regions.pdf - Published Version, available under a Creative Commons Attribution licence (2MB)
Copyright to original material in this document is with the original owner(s). Access to this content through BURO is granted on condition that you use it only for research, scholarly or other non-commercial purposes. If you wish to use it for any other purposes, you must contact BU via BURO@bournemouth.ac.uk. Any third party copyright material in this document remains the property of its respective owner(s). BU grants no licence for further use of that third party material.
DOI: 10.1016/j.neuroimage.2023.120442
Abstract
The ability of humans to discern facial expressions in a timely manner typically relies on rapid neural computations across distributed face-selective regions. To study the time course of this process in regions of interest, we used magnetoencephalography (MEG) to measure neural responses while participants viewed facial expressions depicting seven types of emotion (happiness, sadness, anger, disgust, fear, surprise, and neutral). Time-resolved decoding of neural responses in face-selective sources within the inferior parietal cortex (IP-faces), lateral occipital cortex (LO-faces), fusiform gyrus (FG-faces), and posterior superior temporal sulcus (pSTS-faces) revealed that facial expressions were successfully classified starting from ∼100 to 150 ms after stimulus onset. Interestingly, decoding accuracy was higher in LO-faces and IP-faces than in FG-faces and pSTS-faces. To examine the nature of the information processed in these face-selective regions, we entered the facial expression stimuli into a convolutional neural network (CNN) and performed similarity analyses against the human neural responses. The results showed that neural responses in LO-faces and IP-faces, starting ∼100 ms after stimulus onset, were more strongly correlated with deep representations of emotional categories than with image-level information from the input images. Additionally, we observed a relationship between behavioral performance and neural responses in LO-faces and IP-faces, but not in FG-faces and pSTS-faces. Together, these results provide a comprehensive picture of the time course and nature of the information involved in facial expression discrimination across multiple face-selective regions, advancing our understanding of how the human brain processes facial expressions.
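For readers unfamiliar with the method, the sketch below illustrates the kind of time-resolved pairwise classification described in the abstract: at each time point, a linear classifier is trained to discriminate every pair of emotion categories from the spatial pattern of source activity, and cross-validated accuracies are averaged over pairs. This is not the authors' pipeline; the array names, shapes, and classifier settings are assumptions for illustration only.

```python
import numpy as np
from itertools import combinations
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Hypothetical inputs: source-space MEG responses from one face-selective ROI
# (e.g. LO-faces), shape (n_trials, n_sources, n_times), and one of the seven
# emotion labels (0-6) per trial. Random data stands in for real recordings.
rng = np.random.default_rng(0)
n_trials, n_sources, n_times = 140, 20, 100
source_data = rng.standard_normal((n_trials, n_sources, n_times))
labels = np.repeat(np.arange(7), n_trials // 7)

def timecourse_pairwise_accuracy(data, labels, cv=5):
    """Decode each pair of emotions at every time point and return the
    accuracy time course averaged over all 21 emotion pairs."""
    acc = np.zeros(data.shape[2])
    pairs = list(combinations(np.unique(labels), 2))
    for t in range(data.shape[2]):
        X_t = data[:, :, t]                      # spatial pattern at time t
        pair_scores = []
        for a, b in pairs:
            mask = np.isin(labels, [a, b])       # keep trials of this pair only
            clf = SVC(kernel="linear")
            scores = cross_val_score(clf, X_t[mask], labels[mask], cv=cv)
            pair_scores.append(scores.mean())
        acc[t] = np.mean(pair_scores)
    return acc

accuracy = timecourse_pairwise_accuracy(source_data, labels)
# Chance level for pairwise classification is 0.5; in real data, accuracy
# rising above chance ~100-150 ms after stimulus onset would correspond to
# the effect reported in the abstract.
```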
| Item Type: | Article |
|---|---|
| ISSN: | 1053-8119 |
| Uncontrolled Keywords: | CNN; Facial expression; magnetoencephalography (MEG); pairwise classification; representational similarity analysis |
| Group: | Faculty of Science & Technology |
| ID Code: | 39106 |
| Deposited By: | Symplectic RT2 |
| Deposited On: | 09 Nov 2023 08:45 |
| Last Modified: | 09 Nov 2023 08:45 |