PHOTO STYLE TRANSFER WITH CONSISTENCY LOSSES

Citation Author(s):
Xu Yao, Gilles Puy, Patrick Pérez
Submitted by:
Xu Yao
Last updated:
20 September 2019 - 7:34am
Document Type:
Presentation Slides
Document Year:
2019
Event:
Presenters:
Xu Yao
Paper Code:
P2266
 

We address the problem of style transfer between two photos and propose a new way to preserve photorealism. Using the single pair of photos available as input, we train a pair of deep convolutional networks (convnets), each of which transfers the style of one photo to the other. To enforce photorealism, we introduce a content-preserving mechanism that combines a cycle-consistency loss with a self-consistency loss. Experimental results show that this method does not suffer from the typical artifacts observed in methods working in the same setting. We then analyze further properties of the trained convnets. First, we observe that they can be used to stylize other, unseen images in the same known styles. Second, we show that retraining only a small subset of the network parameters can be sufficient to adapt these convnets to new styles.
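As an illustration of how the two consistency terms could be combined for the pair of networks, the sketch below shows one possible PyTorch-style formulation. The generator names g_ab and g_ba, the use of an L1 distance, the loss weights, and the exact form of the self-consistency term are assumptions made for this example, not the authors' reference implementation.

```python
# Minimal sketch of a combined consistency objective, assuming two trainable
# generators: g_ab (photo A -> style of photo B) and g_ba (photo B -> style of A).
# Names, weights, and the exact form of each term are illustrative assumptions.
import torch
import torch.nn.functional as F

def consistency_losses(g_ab, g_ba, photo_a, photo_b, w_cycle=10.0, w_self=5.0):
    """Return a weighted sum of cycle- and self-consistency terms for one pair."""
    # Cycle consistency: translating a photo to the other style and back
    # should recover the original photo.
    cycle_a = F.l1_loss(g_ba(g_ab(photo_a)), photo_a)
    cycle_b = F.l1_loss(g_ab(g_ba(photo_b)), photo_b)

    # Self consistency (assumed identity-style form): a network fed an image
    # that already has its target style should leave it unchanged.
    self_a = F.l1_loss(g_ba(photo_a), photo_a)
    self_b = F.l1_loss(g_ab(photo_b), photo_b)

    return w_cycle * (cycle_a + cycle_b) + w_self * (self_a + self_b)
```

In this reading, the cycle term discourages content changes that cannot be undone by the reverse mapping, while the self term anchors each network to act as (near) identity on its own target style, which is one way to suppress the painterly distortions that break photorealism.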
