Realtime Dynamic 3D Facial Reconstruction for Monocular Video In-the-Wild.

Liu, S., Wang, Z., Yang, X. and Zhang, J. J., 2018. Realtime Dynamic 3D Facial Reconstruction for Monocular Video In-the-Wild. In: IEEE International Conference on Computer Vision Workshops (ICCVW 2017), 22-29 October 2017, Venice, Italy, pp. 777-785.

Full text available as:

PDF: egpaper_final-min.pdf - Accepted Version (337kB)
Available under License Creative Commons Attribution Non-commercial No Derivatives.
DOI: 10.1109/ICCVW.2017.97

Abstract

With the increasing number of videos recorded with 2D mobile cameras, techniques for recovering dynamic 3D facial models from monocular video have become a necessity for many image and video editing applications. Methods based on parametric 3D facial models can reconstruct the 3D shape in dynamic environments, but they ignore large structural changes. Structure-from-motion methods can reconstruct these changes but assume the object is static. To address this problem, we present a novel method for realtime dynamic 3D facial tracking and reconstruction from videos captured in uncontrolled environments. Our method tracks the deforming facial geometry and reconstructs external objects that protrude from the face, such as glasses and hair. It also allows users to move around and perform facial expressions freely without degrading the reconstruction quality.

Item Type:Conference or Workshop Item (Paper)
Uncontrolled Keywords:Three-dimensional displays; Image reconstruction; Two-dimensional displays; Solid modeling; Face; Robustness; Target tracking
Group:Faculty of Media & Communication
ID Code:30711
Deposited On:14 May 2018 15:45
Last Modified:14 May 2018 15:45
