
Photographic style transfer.

Wang, L., Wang, Z., Yang, X., Hu, S.M. and Zhang, J., 2020. Photographic style transfer. Visual Computer, 36 (2), 317-331.

Full text available as:

PDF (Open Access Article): Wang2020_Article_PhotographicStyleTransfer.pdf - Published Version, 18MB
Available under License Creative Commons Attribution.

DOI: 10.1007/s00371-018-1609-4

Abstract

© 2018, The Author(s). Image style transfer has attracted much attention in recent years. However, results produced by existing methods still exhibit many distortions. This paper investigates CNN-based artistic style transfer specifically and identifies two key sources of distortion: the loss of spatial structure in the content image during the content-preserving process, and the unexpected geometric matching introduced by the style transformation process. To tackle this problem, this paper proposes a novel approach consisting of a dual-stream deep convolutional network as the loss network and edge-preserving filters as the style fusion model. Our key contribution is the introduction of an additional similarity loss function that constrains both the detail reconstruction and style transfer procedures. The qualitative evaluation shows that our approach successfully suppresses distortions and obtains faithful stylized results compared with state-of-the-art methods.
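
As a rough illustration of the kind of loss formulation the abstract describes, the sketch below combines a content term, a Gram-matrix style term, and an additional similarity term in PyTorch. The gradient-based similarity loss, the layer handling, and the weights (w_content, w_style, w_sim) are illustrative assumptions, not the paper's exact definitions; the dual-stream loss network and the edge-preserving style fusion filters themselves are not reproduced here.

import torch
import torch.nn.functional as F

def gram_matrix(features):
    # features: (batch, channels, height, width) feature map from a CNN layer
    b, c, h, w = features.size()
    flat = features.view(b, c, h * w)
    # Channel-wise correlations, normalised by the number of elements
    return torch.bmm(flat, flat.transpose(1, 2)) / (c * h * w)

def content_loss(gen_feat, content_feat):
    # Penalise deviation from the content image's feature responses
    return F.mse_loss(gen_feat, content_feat)

def style_loss(gen_feat, style_feat):
    # Penalise deviation from the style image's Gram (texture) statistics
    return F.mse_loss(gram_matrix(gen_feat), gram_matrix(style_feat))

def similarity_loss(gen_img, content_img):
    # Illustrative structural term (assumption): match image gradients so that
    # the spatial structure (edges) of the content image is preserved.
    def grads(x):
        dx = x[:, :, :, 1:] - x[:, :, :, :-1]
        dy = x[:, :, 1:, :] - x[:, :, :-1, :]
        return dx, dy
    gx, gy = grads(gen_img)
    cx, cy = grads(content_img)
    return F.mse_loss(gx, cx) + F.mse_loss(gy, cy)

def total_loss(gen_img, content_img, gen_feats, content_feats, style_feats,
               w_content=1.0, w_style=1e3, w_sim=10.0):
    # gen_feats / content_feats / style_feats: lists of feature maps from the
    # same layers of a fixed loss network (e.g. a VGG-like backbone).
    lc = sum(content_loss(g, c) for g, c in zip(gen_feats, content_feats))
    ls = sum(style_loss(g, s) for g, s in zip(gen_feats, style_feats))
    lsim = similarity_loss(gen_img, content_img)
    return w_content * lc + w_style * ls + w_sim * lsim

In a typical optimisation-based setup, the generated image would be updated by backpropagating this combined objective; the extra similarity term constrains both the detail reconstruction and the style transfer steps, in the spirit of the contribution stated above.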

Item Type: Article
ISSN: 0178-2789
Group: Faculty of Media & Communication
ID Code: 31496
Deposited By: Symplectic RT2
Deposited On: 26 Nov 2018 15:45
Last Modified: 14 Mar 2022 14:13
