
Semantic portrait color transfer with internet images.

Yang, Y., Zhao, H., You, L., Tu, R., Wu, X. and Jin, X., 2017. Semantic portrait color transfer with internet images. Multimedia Tools and Applications, 76(1), 523-541.

Full text available as:

colorTransferS - Copy.pdf - Accepted Version
Available under License Creative Commons Attribution Non-commercial.


DOI: 10.1007/s11042-015-3063-x


We present a novel color transfer method for portraits that exploits their high-level semantic information. First, a database is set up consisting of a collection of portrait images downloaded from the Internet, each of which is manually segmented using image matting as a preprocessing step. Second, we search the database using Face++ to find images whose poses are similar to a given source portrait, and choose one satisfactory image from the results as the target. Third, we extract the portrait foregrounds from both the source and target images. The system then extracts semantic regions, such as the face, eyes, eyebrows, lips, and teeth, from the extracted source foreground using image matting algorithms. After that, we perform color transfer between corresponding parts that share the same semantic label. The final result is obtained by seamlessly compositing the different parts together using alpha blending. Experimental results show that our semantics-driven approach generates better color transfer results for portraits than previous methods and provides users with a new means to retouch their portraits.
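The per-region transfer and compositing steps can be sketched in code. The abstract does not specify which transfer algorithm is used between matched semantic parts, so the sketch below assumes a Reinhard-style mean/standard-deviation matching per region; the function name `region_color_transfer` and the use of soft mattes as blending weights are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def region_color_transfer(src, src_matte, tgt, tgt_matte, eps=1e-6):
    """Mean/std color transfer between one pair of semantic regions
    (assumed Reinhard-style statistics matching, not the paper's exact
    method), alpha-blended back into the source image.

    src, tgt:             H x W x 3 float images (any color space).
    src_matte, tgt_matte: H x W soft mattes in [0, 1], as produced by
                          an image matting step.
    """
    sw = src_matte[..., None]  # broadcastable weights for the source region
    tw = tgt_matte[..., None]  # broadcastable weights for the target region
    s_tot, t_tot = sw.sum(), tw.sum()
    if s_tot < eps or t_tot < eps:
        return src.copy()  # empty region: nothing to transfer

    # Matte-weighted per-channel statistics of each region.
    s_mean = (src * sw).sum((0, 1)) / s_tot
    t_mean = (tgt * tw).sum((0, 1)) / t_tot
    s_std = np.sqrt((((src - s_mean) ** 2) * sw).sum((0, 1)) / s_tot) + eps
    t_std = np.sqrt((((tgt - t_mean) ** 2) * tw).sum((0, 1)) / t_tot) + eps

    # Shift source statistics onto the target's statistics.
    moved = (src - s_mean) / s_std * t_std + t_mean

    # Seamless composite of the recolored region via alpha blending.
    return sw * moved + (1 - sw) * src
```

In the full pipeline this would be applied once per matched semantic pair (face, eyes, lips, ...), with each call compositing its recolored region over the running result.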

Item Type: Article
Group: Faculty of Media & Communication
ID Code: 33100
Deposited By: Symplectic RT2
Deposited On: 02 Dec 2019 15:35
Last Modified: 14 Mar 2022 14:18

