Wang, R., Chen, L. and Ayesh, A., 2022. Multimodal Motivation Modelling and Computing towards Motivationally Intelligent E-Learning Systems. CCF Transactions on Pervasive Computing and Interaction.
Full text available as:

PDF (Open Access Article)
published article.pdf - Published Version, available under a Creative Commons Attribution licence. 1MB

Copyright to original material in this document is with the original owner(s). Access to this content through BURO is granted on condition that you use it only for research, scholarly or other non-commercial purposes. If you wish to use it for any other purposes, you must contact BU via BURO@bournemouth.ac.uk. Any third party copyright material in this document remains the property of its respective owner(s). BU grants no licence for further use of that third party material.
DOI: 10.1007/s42486-022-00107-4
Abstract
Persistent motivation to engage in e-learning systems is essential for users’ learning performance. Learners’ motivation is traditionally assessed using subjective, self-reported data, which is time-consuming and disruptive to their learning process. To address this issue, this paper proposes a novel framework for multimodal assessment of learners’ motivation in e-learning environments, intended to inform intelligent e-learning systems that can deliver dynamic, context-aware, and personalized services or interventions to maintain learners’ motivation during use. The applicability of the framework was evaluated in an empirical study in which we combined eye tracking and electroencephalogram (EEG) sensors to produce a multimodal dataset. The dataset was then processed and used to develop a machine learning classifier for motivation assessment by predicting the levels of a range of motivational factors, which represented the multiple dimensions of motivation. We investigated the performance of the machine learning classifier and identified the most and least accurately predicted motivational factors. We also assessed the contribution of different EEG and eye gaze features to motivation assessment. Our study has revealed valuable insights into the role played by brain activities and eye movements in predicting the levels of different motivational factors. Initial results using a logistic regression classifier achieved significant predictive power for all the motivational factors studied, with accuracies between 68.1% and 92.8%. The present work demonstrates the applicability of the proposed framework for multimodal motivation assessment, which will inspire future research towards motivationally intelligent e-learning systems.
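The classification step described in the abstract can be illustrated with a minimal sketch: EEG and eye-gaze features are concatenated into a single multimodal feature vector, and a logistic regression classifier predicts the level (e.g. high vs. low) of one motivational factor, evaluated by accuracy. The snippet below uses scikit-learn with synthetic placeholder data; the feature names, dimensions, and preprocessing are assumptions for illustration only, not the authors' implementation.

```python
# Minimal sketch (not the authors' code). Assumes scikit-learn is installed and
# that per-learner EEG and eye-gaze features have already been extracted into
# numeric arrays; the data below are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_samples = 120

eeg_features = rng.normal(size=(n_samples, 8))   # e.g. band powers per channel (hypothetical)
gaze_features = rng.normal(size=(n_samples, 4))  # e.g. fixation duration, saccade rate (hypothetical)
X = np.hstack([eeg_features, gaze_features])     # multimodal feature vector
y = rng.integers(0, 2, size=n_samples)           # high/low level of one motivational factor

# Logistic regression on standardized multimodal features,
# scored with cross-validated accuracy (the metric reported in the abstract).
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
print(f"Mean cross-validated accuracy: {scores.mean():.3f}")
```

In practice one such classifier would be trained per motivational factor, which is how per-factor accuracies in the 68.1%-92.8% range could be reported.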
| Item Type: | Article |
|---|---|
| ISSN: | 2524-521X |
| Uncontrolled Keywords: | Multimodal motivation modelling; Eye tracking; EEG; E-learning |
| Group: | Faculty of Science & Technology |
| ID Code: | 37003 |
| Deposited By: | Symplectic RT2 |
| Deposited On: | 30 May 2022 15:03 |
| Last Modified: | 30 May 2022 15:03 |