
Recognizing Induced Emotions of Movie Audiences From Multimodal Information

Muszynski, M., Tian, L., Lai, C., Moore, J. D., Kostoulas, T., Lombardo, P., Pun, T. and Chanel, G., 2021. Recognizing Induced Emotions of Movie Audiences From Multimodal Information. IEEE Transactions on Affective Computing, 12 (1), 36-52.

Full text available as:

PDF (©2019 IEEE): j9-Recognizing Induced Emotions of Movie Audiences From Multimodal Information.pdf - Accepted Version (11MB)


DOI: 10.1109/TAFFC.2019.2902091

Abstract

Recognizing the emotional reactions of movie audiences to affective movie content is a challenging task in affective computing. Previous research on induced emotion recognition has focused mainly on audio-visual movie content. However, the relationship between the perception of affective movie content (perceived emotions) and the emotions it evokes in audiences (induced emotions) remains unexplored. In this work, we studied the relationship between the perceived and induced emotions of movie audiences. Moreover, we investigated multimodal modelling approaches to predict movie-induced emotions from movie content-based features as well as from the physiological and behavioral reactions of movie audiences. To analyse induced and perceived emotions, we first extended an existing database for movie affect analysis by annotating perceived emotions in a crowd-sourced manner. We find that perceived and induced emotions are not always consistent with each other. In addition, we show that, besides spectators’ physiological and behavioral reactions, perceived emotions, movie dialogues, and aesthetic highlights are discriminative for movie-induced emotion recognition. Our experiments also revealed that induced emotion recognition benefits from including temporal information and performing multimodal fusion. Finally, our work investigated in depth the gap between affective content analysis and induced emotion recognition by providing insight into the relationships between aesthetic highlights, induced emotions, and perceived emotions.

Item Type: Article
ISSN: 1949-3045
Uncontrolled Keywords: Affective Computing; Implicit Tagging; Emotion Recognition; Multimodal Learning; Multimodal Fusion; Induced and Perceived Emotions; Aesthetic Highlights; Physiological and Behavioral Signals; Crowdsourcing
Group: Faculty of Science & Technology
ID Code: 31903
Deposited By: Symplectic RT2
Deposited On: 27 Feb 2019 09:43
Last Modified: 14 Mar 2022 14:15
