Wang, L., 2020. Neural style transfer for images, videos and reliefs. Doctoral Thesis (Doctoral). Bournemouth University.
Full text available as: WANG, Li_Ph.D._2020.pdf (113MB)
Copyright to original material in this document is with the original owner(s). Access to this content through BURO is granted on condition that you use it only for research, scholarly or other non-commercial purposes. If you wish to use it for any other purposes, you must contact BU via BURO@bournemouth.ac.uk. Any third party copyright material in this document remains the property of its respective owner(s). BU grants no licence for further use of that third party material.
Abstract
For hundreds of years, artists have engaged in art creation to present their understanding of the subjective and objective world, and their stylistic expressions can be inspired by other artworks and followed by other people. To grasp the spirit and style of an artwork, followers have to practise for years, even professional artists. In the past two decades, researchers in computer science have dedicated themselves to proposing automatic techniques for creating paintings in different artistic styles, which has gradually formed a research area known as Artistic Style Transfer (AST). Recent breakthroughs in Convolutional Neural Networks (CNNs) have driven AST into a new era called Neural Style Transfer (NST). Since its emergence as a new technique, NST has been researched as a powerful tool that benefits other areas such as colour transfer, video temporal consistency and geometry detail transfer. However, applying NST directly in different research fields often causes unexpected artefacts: distortion artefacts in stylized results are inevitable, especially when the content and style inputs are both photographic; flickering artefacts persist in video style transfer methods; and geometric inputs are mismatched in geometry detail transfer. To address those challenges, this thesis aims to leverage NST to develop new techniques for the related research fields. A new photo style transfer method is proposed to prevent distortion artefacts and preserve style and photorealism simultaneously. To enhance temporal consistency across consecutive frames, a stable video style transfer method is proposed that mitigates flickering artefacts through a set of masking techniques and multi-frame coherence losses. Furthermore, a semantic neural normal transfer network is proposed to match desired texture patterns from a style reference input onto content inputs by an automatic attention-based mask technique.
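The abstract refers to CNN-based NST without detailing it. As background, the classic Gatys-style formulation (not this thesis's own methods) measures style as channel-wise correlations of CNN feature maps via Gram matrices. A minimal sketch of that style loss, using random NumPy arrays as stand-ins for real VGG activations:

```python
# Minimal sketch of the Gram-matrix style loss underlying classic
# neural style transfer (Gatys et al.). Feature maps here are random
# NumPy placeholders, not real CNN activations.
import numpy as np

def gram_matrix(features):
    """Gram matrix of a (channels, height*width) feature map:
    channel-wise correlations that capture texture statistics."""
    c, hw = features.shape
    return features @ features.T / (c * hw)

def style_loss(gen_feats, style_feats):
    """Mean squared difference between the Gram matrices of the
    generated image's features and the style image's features."""
    diff = gram_matrix(gen_feats) - gram_matrix(style_feats)
    return float(np.mean(diff ** 2))

rng = np.random.default_rng(0)
f_style = rng.standard_normal((64, 32 * 32))  # stand-in style features
f_gen = rng.standard_normal((64, 32 * 32))    # stand-in generated features

print(style_loss(f_style, f_style))  # identical features -> 0.0
print(style_loss(f_gen, f_style))    # mismatched textures -> positive
```

In full NST this loss is summed over several CNN layers and combined with a content loss; the photorealism, temporal-coherence and normal-transfer losses described above extend this basic formulation.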
| Item Type | Thesis (Doctoral) |
|---|---|
| Additional Information | If you feel that this work infringes your copyright please contact the BURO Manager. |
| Uncontrolled Keywords | neural style transfer; convolutional neural networks; photo style transfer; video style transfer; bas-relief modelling |
| Group | Faculty of Media & Communication |
| ID Code | 34861 |
| Deposited By | Symplectic RT2 |
| Deposited On | 23 Nov 2020 11:52 |
| Last Modified | 14 Mar 2022 14:25 |