Liu, S., Yang, X., Wang, Z., Xiao, Z. and Zhang, J., 2016. Real-Time Facial Expression Transfer with Single Video Camera. Computer Animation and Virtual Worlds, 27 (3-4), pp. 301-310.
Full text available as:
casa_2016.pdf - Accepted Version
Restricted to Repository staff only until 11 May 2017.
Available under License Creative Commons Attribution Non-commercial.
Facial expression transfer is currently an active research field. However, methods based on 2D image warping suffer from depth ambiguity, while depth-based methods require specific hardware. We present a novel markerless, real-time online facial expression transfer method that requires only a single video camera. Our method adapts a model to user-specific facial data, computes expression variations in real time, and rapidly transfers them to another target. It can be applied to videos without prior camera calibration or focal adjustment, enabling realistic online facial expression editing and performance transfer in many scenarios, such as video conferencing, news broadcasting, and lip-syncing for song performances. With low computational demand and minimal hardware requirements, our method tracks a single user at an average of 38 fps, and runs smoothly even in web browsers despite their slower execution speed.
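The transfer pipeline described in the abstract (track the user, estimate expression coefficients, reapply them to a target) is commonly realized with a blendshape model. The sketch below is illustrative only and is not taken from the paper: the mesh data is toy data, and the function name `apply_blendshapes` is a hypothetical helper, assuming expressions are encoded as a weighted sum of per-blendshape vertex offsets.

```python
import numpy as np

def apply_blendshapes(neutral, blendshapes, weights):
    """Deform a neutral mesh by a weighted sum of blendshape offsets.

    neutral:     (V, 3) vertex positions of the rest pose
    blendshapes: (K, V, 3) per-blendshape vertex offsets from neutral
    weights:     (K,) expression coefficients estimated from tracking
    """
    return neutral + np.tensordot(weights, blendshapes, axes=1)

# Toy example: 2 blendshapes over a 3-vertex "mesh" (illustrative data only).
neutral = np.zeros((3, 3))
blendshapes = np.stack([np.ones((3, 3)), 2 * np.ones((3, 3))])

# Weights estimated on the source face are reapplied on the target's
# blendshape basis, which is what transfers the expression.
weights = np.array([0.5, 0.25])
deformed = apply_blendshapes(neutral, blendshapes, weights)
print(deformed[0])  # each vertex moves by 0.5*1 + 0.25*2 = 1.0 per axis
```

In a real-time system, the expensive step is estimating `weights` from the video frame; applying them to the target mesh, as above, is a cheap linear operation that can run at interactive frame rates.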
Keywords: expression transfer; facial animation; facial tracking
Deposited On: 15 Apr 2016
Last Modified: 21 Jun 2016