Gaze modulated disambiguation technique for gesture control in 3D virtual objects selection.

Deng, S., Chang, J., Hu, S.M. and Zhang, J. J., 2017. Gaze modulated disambiguation technique for gesture control in 3D virtual objects selection. In: 3rd IEEE International Conference on Cybernetics (CYBCONF), 21-23 June 2017, Exeter, UK.

Full text available as:

bare_conf.pdf - Accepted Version
Available under License Creative Commons Attribution Non-commercial No Derivatives.


DOI: 10.1109/CYBConf.2017.7985779


© 2017 IEEE. Inputs with multimodal information provide more natural ways to interact with virtual 3D environments. An emerging technique that integrates gaze modulated pointing with mid-air gesture control enables fast target acquisition and rich control expressions. The performance of this technique relies on eye tracking accuracy, which is not yet comparable with that of traditional pointing techniques (e.g., the mouse). This causes problems when fine-grained interactions are required, such as selection in a dense virtual scene where proximity and occlusion are likely to occur. This paper proposes a coarse-to-fine solution that compensates for the degradation introduced by eye tracking inaccuracy, using a gaze cone to detect ambiguity and then a gaze probe for decluttering. It is tested in a comparative experiment involving 12 participants and 3,240 runs. The results show that the proposed technique enhanced selection accuracy and user experience, although its efficiency still has potential for improvement. This study contributes a robust multimodal interface design supported by both eye tracking and mid-air gesture control.
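The abstract's coarse stage — a gaze cone used to detect ambiguity — can be sketched as a simple geometric test: an object is a candidate if the angle between the gaze ray and the eye-to-object direction falls within the cone's half-angle, and more than one candidate signals an ambiguous selection that the fine stage (the gaze probe) must resolve. The sketch below is hypothetical: the paper's actual cone parameters and data structures are not given on this page, and the function names and the 3° half-angle are assumptions for illustration only.

```python
import math

def _normalize(v):
    """Return the unit vector of a 3D tuple."""
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def gaze_cone_candidates(eye, gaze_dir, objects, half_angle_deg=3.0):
    """Coarse stage: collect indices of objects whose line of sight from
    the eye lies inside a cone around the gaze ray.  More than one hit
    indicates an ambiguous selection needing fine-grained decluttering."""
    g = _normalize(gaze_dir)
    # An object is inside the cone when the cosine of its angle to the
    # gaze ray is at least cos(half_angle).
    cos_limit = math.cos(math.radians(half_angle_deg))
    hits = []
    for i, p in enumerate(objects):
        v = _normalize(tuple(p[k] - eye[k] for k in range(3)))
        if sum(v[k] * g[k] for k in range(3)) >= cos_limit:
            hits.append(i)
    return hits

# Two nearby targets fall inside the cone (ambiguous); a distant one does not.
eye, gaze = (0.0, 0.0, 0.0), (0.0, 0.0, 1.0)
scene = [(0.0, 0.0, 10.0), (0.2, 0.0, 10.0), (3.0, 0.0, 10.0)]
candidates = gaze_cone_candidates(eye, gaze, scene)  # → [0, 1]
```

When `len(candidates) > 1`, the technique would hand the candidate set to the fine stage for explicit disambiguation via the mid-air gesture channel.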

Item Type: Conference or Workshop Item (Paper)
Group: Faculty of Media & Communication
ID Code: 29651
Deposited By: Symplectic RT2
Deposited On: 04 Sep 2017 14:21
Last Modified: 14 Mar 2022 14:06

