Savosin, D., Prakoonwit, S., Tian, F., Liang, J.G. and Pan, Z.G., 2017. Representation of Intractable Objects and Action Sequences in VR Using Hand Gesture Recognition. In: 11th International Conference on E-Learning & Games, 26-28 June 2017, Bournemouth, UK.
Full text available as:
PDF: Representation of Intractable Objects.pdf - Accepted Version. Available under License: Creative Commons Attribution Non-commercial No Derivatives. 601kB.
Copyright to original material in this document is with the original owner(s). Access to this content through BURO is granted on condition that you use it only for research, scholarly or other non-commercial purposes. If you wish to use it for any other purposes, you must contact BU via BURO@bournemouth.ac.uk. Any third party copyright material in this document remains the property of its respective owner(s). BU grants no licence for further use of that third party material.
Abstract
We propose a novel approach to using static and dynamic gesture recognition in VR games to represent interactive objects, such as equipment systems, weapons and handy tools. We examine various applications of gesture recognition in games, learning, medicine and VR, including how developers currently use bundles of HMD devices paired with hand-tracking sensors. The proposed approach gives game developers control over recording gestures and binding them to in-game interactive objects and equipment.
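The abstract describes recording gestures and binding them to in-game interactive objects. As a minimal sketch of what such a binding layer could look like (not the paper's implementation; the names Gesture, GestureBinder and on_gesture are hypothetical), a recognised gesture name can simply be mapped to a callback on a game object:

```python
# Hypothetical sketch: binding recorded gestures to in-game interactive objects.
# All names here are illustrative and are not taken from the paper.
from dataclasses import dataclass
from typing import Callable, Dict, List, Tuple


@dataclass
class Gesture:
    """A recorded gesture: a name plus a normalised sequence of hand positions."""
    name: str
    frames: List[Tuple[float, float, float]]  # e.g. palm-centre positions over time


class GestureBinder:
    """Maps recognised gesture names to actions on in-game objects (weapons, tools)."""

    def __init__(self) -> None:
        self._bindings: Dict[str, Callable[[], None]] = {}

    def bind(self, gesture: Gesture, action: Callable[[], None]) -> None:
        # Developer records a gesture once, then associates it with an object action.
        self._bindings[gesture.name] = action

    def on_gesture(self, name: str) -> None:
        # Called by the recognition layer when a static or dynamic gesture is detected.
        action = self._bindings.get(name)
        if action:
            action()


# Usage example: equip a weapon when a "grab" gesture is recognised.
binder = GestureBinder()
grab = Gesture("grab", frames=[(0.0, 0.0, 0.0), (0.0, 0.1, 0.0)])
binder.bind(grab, lambda: print("Sword equipped"))
binder.on_gesture("grab")
```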
Item Type: | Conference or Workshop Item (Paper) |
---|---|
Uncontrolled Keywords: | Games; Gesture Recognition; Virtual Reality |
Group: | Faculty of Science & Technology |
ID Code: | 29986 |
Deposited By: | Symplectic RT2 |
Deposited On: | 15 Nov 2017 14:17 |
Last Modified: | 14 Mar 2022 14:08 |