Multimodal Motivation Modelling and Computing towards Motivationally Intelligent ELearning Systems.

Wang, R., Chen, L. and Ayesh, A., 2022. Multimodal Motivation Modelling and Computing towards Motivationally Intelligent ELearning Systems. CCF Transactions on Pervasive Computing and Interaction.

Full text available as:

PDF (open access article): published article.pdf - Published Version, 1MB
Available under License Creative Commons Attribution.

DOI: 10.1007/s42486-022-00107-4

Abstract

Persistent motivation to engage with e-learning systems is essential for users' learning performance. Learners' motivation is traditionally assessed using subjective, self-reported data, which is time-consuming to collect and interrupts the learning process. To address this issue, this paper proposes a novel framework for the multimodal assessment of learners' motivation in e-learning environments, to inform intelligent e-learning systems that can deliver dynamic, context-aware, and personalized services or interventions to maintain learners' motivation during use. The applicability of the framework was evaluated in an empirical study in which we combined eye tracking and electroencephalogram (EEG) sensors to produce a multimodal dataset. The dataset was then processed and used to develop a machine learning classifier for motivation assessment, predicting the levels of a range of motivational factors that represent the multiple dimensions of motivation. We investigated the performance of the machine learning classifier and identified the most and least accurately predicted motivational factors. We also assessed the contribution of different EEG and eye gaze features to motivation assessment. Our study has revealed valuable insights into the roles played by brain activities and eye movements in predicting the levels of different motivational factors. Initial results using a logistic regression classifier achieved significant predictive power for all the motivational factors studied, with accuracies between 68.1% and 92.8%. The present work demonstrates the applicability of the proposed framework for multimodal motivation assessment and will inspire future research towards motivationally intelligent e-learning systems.
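The classification approach described in the abstract can be illustrated with a minimal sketch. This is not the authors' pipeline: the feature names, synthetic data, and training loop below are all hypothetical stand-ins for the paper's EEG and eye-gaze features, showing only how a plain logistic regression classifier predicts a binary motivation level (e.g. high vs. low for one motivational factor) from multimodal feature vectors.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic stand-ins for multimodal features, e.g. EEG band
# powers and eye-gaze statistics (fixation duration, saccade rate, ...).
n_samples, n_features = 200, 6
X = rng.normal(size=(n_samples, n_features))

# Simulated binary labels: high (1) vs. low (0) level of one motivational
# factor, generated from an assumed linear relationship for illustration.
true_w = np.array([1.5, -2.0, 0.8, 0.0, 1.0, -0.5])
y = ((X @ true_w) > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Plain logistic regression trained by batch gradient descent on the
# cross-entropy loss.
w = np.zeros(n_features)
b = 0.0
learning_rate = 0.1
for _ in range(500):
    p = sigmoid(X @ w + b)          # predicted probability of "high"
    grad_w = X.T @ (p - y) / n_samples
    grad_b = np.mean(p - y)
    w -= learning_rate * grad_w
    b -= learning_rate * grad_b

accuracy = np.mean((sigmoid(X @ w + b) > 0.5) == y)
print(f"training accuracy: {accuracy:.3f}")
```

In the paper's setting, one such classifier would be trained per motivational factor, and the learned weights give a rough view of how much each EEG or eye-gaze feature contributes to that factor's prediction.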

Item Type: Article
ISSN: 2524-521X
Uncontrolled Keywords: Multimodal motivation modelling; Eye tracking; EEG; E-learning
Group: Faculty of Science & Technology
ID Code: 37003
Deposited By: Symplectic RT2
Deposited On: 30 May 2022 15:03
Last Modified: 30 May 2022 15:03
