ICCV 2017

Abstract

The optimization-based method of Gatys et al. adapts to arbitrary styles but is too slow for practical use; feed-forward networks are fast but each is tied to a fixed style or a fixed set of styles. The authors propose a real-time transfer method that works for arbitrary styles, whose core is the AdaIN (Adaptive Instance Normalization) layer, and whose speed is comparable to the fastest feed-forward methods.
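As a reference for what the AdaIN layer computes, here is a minimal sketch in PyTorch (not the authors' released code): it aligns the channel-wise mean and standard deviation of the content features to those of the style features.

```python
import torch

def adain(content_feat: torch.Tensor, style_feat: torch.Tensor, eps: float = 1e-5) -> torch.Tensor:
    """Adaptive Instance Normalization (sketch).

    Normalizes the content feature map per sample and per channel, then
    rescales/shifts it with the style feature map's channel-wise statistics.
    Both inputs have shape (N, C, H, W).
    """
    # Per-sample, per-channel statistics over the spatial dimensions.
    c_mean = content_feat.mean(dim=(2, 3), keepdim=True)
    c_std = content_feat.std(dim=(2, 3), keepdim=True) + eps
    s_mean = style_feat.mean(dim=(2, 3), keepdim=True)
    s_std = style_feat.std(dim=(2, 3), keepdim=True) + eps

    normalized = (content_feat - c_mean) / c_std
    return normalized * s_std + s_mean
```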

Related Work

Deep generative image modeling

VAEs (variational autoencoders), autoregressive models, and GANs (generative adversarial networks), among which GANs produce the most visually appealing results.

Background

Batch Normalization

BN eases network training by normalizing feature statistics. It was originally designed to accelerate the training of discriminative networks, but it has also been found useful in generative image modeling.


BN(x) = \gamma\left(\frac{x - \mu(x)}{\sigma(x)}\right) + \beta

γ and β are affine parameters learned from the data; μ(x) and σ(x) are the mean and standard deviation computed per channel over the batch and spatial dimensions.
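A minimal sketch of the BN computation above, assuming a feature map of shape (N, C, H, W) and normalization with the current batch statistics (running averages omitted for brevity):

```python
import torch

def batch_norm(x: torch.Tensor, gamma: torch.Tensor, beta: torch.Tensor, eps: float = 1e-5) -> torch.Tensor:
    """Batch Normalization (sketch).

    Statistics are shared across the batch: the mean and standard deviation
    are computed per channel over the (N, H, W) dimensions.
    x: (N, C, H, W); gamma, beta: (C,) learned affine parameters.
    """
    mu = x.mean(dim=(0, 2, 3), keepdim=True)           # per-channel mean
    sigma = x.std(dim=(0, 2, 3), keepdim=True) + eps    # per-channel std
    x_hat = (x - mu) / sigma
    return gamma.view(1, -1, 1, 1) * x_hat + beta.view(1, -1, 1, 1)
```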