Affective state recognition in Virtual Reality from electromyography and photoplethysmography using head-mounted wearable sensors.

Mavridou, I., 2021. Affective state recognition in Virtual Reality from electromyography and photoplethysmography using head-mounted wearable sensors. Doctoral Thesis (Doctoral). Bournemouth University.

Full text available as:

MAVRIDOU, Ifigeneia_D.Eng._2021.pdf
Available under License Creative Commons Attribution Non-commercial.



The three core components of Affective Computing (AC) are emotion expression recognition, emotion processing, and emotional feedback. Affective states are typically characterised in a two-dimensional space consisting of arousal, i.e. the intensity of the emotion felt, and valence, i.e. the degree to which the current emotion is pleasant or unpleasant. These fundamental properties of emotion can be measured not only through subjective ratings from users but also with physiological and behavioural measures, which potentially provide an objective evaluation across users. Multiple combinations of measures are utilised in AC for a range of applications, including education, healthcare, marketing, and entertainment. As the uses of immersive Virtual Reality (VR) technologies grow, there is a rapidly increasing need for robust affect recognition in VR settings. However, integrating affect detection methodologies with VR remains an unmet challenge due to constraints posed by current VR technologies, such as Head-Mounted Displays. This EngD project is designed to overcome some of these challenges by effectively integrating valence and arousal recognition methods into VR technologies and by testing their reliability in seated and room-scale fully immersive VR conditions. The aim of this EngD research project is to identify how affective states are elicited in VR and how they can be efficiently measured without constraining movement or decreasing the sense of presence in the virtual world. Through a three-year collaboration with Emteq Labs Ltd, a wearable technology company, we assisted in the development of a novel multimodal affect detection system specifically tailored to the requirements of VR. This thesis describes the architecture of the system, the research studies that enabled this development, and the future challenges.
The studies conducted validated the reliability of our proposed system, including the VR stimuli design, data measures, and processing pipeline. This work could inform future studies in the field of AC in VR and assist in the development of novel applications and healthcare interventions.
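The two-dimensional valence-arousal model mentioned in the abstract can be sketched as a simple quadrant mapping. This is a minimal illustration of the dimensional representation only; the quadrant labels follow common circumplex-model conventions and are not outputs of the classifier developed in the thesis.

```python
def affect_quadrant(valence: float, arousal: float) -> str:
    """Map a (valence, arousal) point in [-1, 1]^2 to a coarse affect label.

    Valence encodes pleasantness (negative = unpleasant) and arousal
    encodes intensity (negative = low intensity), as described above.
    The labels are illustrative, not the thesis's taxonomy.
    """
    if arousal >= 0:
        return "excited/happy" if valence >= 0 else "angry/stressed"
    return "calm/content" if valence >= 0 else "sad/bored"


# Example: a pleasant, high-intensity state vs. an unpleasant, low-intensity one
print(affect_quadrant(0.7, 0.5))    # → excited/happy
print(affect_quadrant(-0.6, -0.4))  # → sad/bored
```

In practice, systems such as the one described here would estimate the valence and arousal coordinates from physiological signals (e.g. EMG and PPG features) before any such coarse labelling is applied.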

Item Type: Thesis (Doctoral)
Additional Information: If you feel that this work infringes your copyright, please contact the BURO Manager.
Uncontrolled Keywords: EMG; PPG; affect; electromyography; photoplethysmography; emotion; valence; arousal; virtual reality; headsets; physiological sensors; timeseries data; machine learning; affect recognition; VR; IMU; signal processing; EngD; Emteq; novel interfaces
Group: Faculty of Media & Communication
ID Code: 35917
Deposited By: Symplectic RT2
Deposited On: 20 Aug 2021 14:03
Last Modified: 14 Mar 2022 14:29
