PerimetryNet: A multiscale fine grained deep network for three-dimensional eye gaze estimation using visual field analysis.

Yu, S., Wang, Z., Zhou, S., Yang, X., Wu, C. and Wang, Z., 2023. PerimetryNet: A multiscale fine grained deep network for three-dimensional eye gaze estimation using visual field analysis. Computer Animation and Virtual Worlds, 34 (5), e2141.

Full text available as:

PDF: CASA2022_PerimetryGaze__CASA_Version_.pdf - Accepted Version (2MB)

DOI: 10.1002/cav.2141

Abstract

Three-dimensional gaze estimation aims to reveal where a person is looking, which plays an important role in identifying users' points of interest in terms of direction, attention, and interaction. Appearance-based gaze estimation methods can provide relatively unconstrained gaze tracking from commodity hardware. Inspired by the medical perimetry test, we propose a multiscale framework with a visual field analysis branch to improve estimation accuracy. The model is built on feature pyramids and predicts the visual field to assist gaze estimation. In particular, we analyze the effect of the multiscale component and the visual field branch on the challenging benchmark datasets MPIIGaze and EYEDIAP. Based on these studies, our proposed PerimetryNet significantly outperforms state-of-the-art methods. In addition, the multiscale mechanism and visual field branch can be easily applied to existing network architectures for gaze estimation. Related code will be available at the public repository https://github.com/gazeEs/PerimetryNet.
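To make the abstract's architectural idea concrete, the sketch below (PyTorch) shows one way a multiscale backbone can feed both a gaze regression head and an auxiliary visual-field classification branch. This is not the authors' released code: the backbone choice, feature fusion by concatenation, the number of visual-field sectors, and the loss weighting are all illustrative assumptions.

```python
# Minimal sketch, assuming a ResNet-18 feature pyramid and an 8-sector visual-field
# branch; all module names and hyperparameters are hypothetical, not from the paper.
import torch
import torch.nn as nn
import torchvision


class MultiscaleGazeNet(nn.Module):
    def __init__(self, num_field_sectors: int = 8):
        super().__init__()
        # ResNet-18 backbone; its four stages provide the multiscale feature pyramid.
        backbone = torchvision.models.resnet18(weights=None)
        self.stem = nn.Sequential(backbone.conv1, backbone.bn1, backbone.relu, backbone.maxpool)
        self.layer1, self.layer2 = backbone.layer1, backbone.layer2
        self.layer3, self.layer4 = backbone.layer3, backbone.layer4
        self.pool = nn.AdaptiveAvgPool2d(1)

        fused_dim = 64 + 128 + 256 + 512  # pooled features from the four stages
        self.gaze_head = nn.Linear(fused_dim, 2)                    # yaw/pitch of 3D gaze
        self.field_head = nn.Linear(fused_dim, num_field_sectors)   # auxiliary visual-field logits

    def forward(self, x):
        x = self.stem(x)
        feats = []
        for stage in (self.layer1, self.layer2, self.layer3, self.layer4):
            x = stage(x)
            feats.append(self.pool(x).flatten(1))
        fused = torch.cat(feats, dim=1)  # multiscale fusion by concatenation
        return self.gaze_head(fused), self.field_head(fused)


if __name__ == "__main__":
    model = MultiscaleGazeNet()
    face = torch.randn(4, 3, 224, 224)   # batch of face crops
    gaze, field_logits = model(face)
    # Joint objective: gaze regression plus an auxiliary visual-field classification term.
    loss = nn.functional.mse_loss(gaze, torch.zeros_like(gaze)) \
         + 0.1 * nn.functional.cross_entropy(field_logits, torch.zeros(4, dtype=torch.long))
    print(gaze.shape, field_logits.shape, loss.item())
```

The auxiliary branch here is trained jointly with the gaze head, so the shared multiscale features are encouraged to encode where in the visual field the target lies, which is the role the abstract attributes to the perimetry-inspired branch.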

Item Type: Article
ISSN: 1546-4261
Uncontrolled Keywords: Gaze Estimation; Multi-Scale; Fine Grained; Visual Field; MPIIGaze; EYEDIAP
Group: Faculty of Science & Technology
ID Code: 38516
Deposited By: Symplectic RT2
Deposited On: 09 Jun 2023 11:53
Last Modified: 24 May 2024 09:34
