Academic Paper
A method for Style-Based Domain Conversion by Generative Adversarial Network
Document Type
Conference
Author
Source
2021 IEEE 10th Global Conference on Consumer Electronics (GCCE), pp. 818-819, Oct. 2021
Subject
Language
Abstract
Recently, many methods for image generation using the Generative Adversarial Network (GAN) have been proposed. CycleGAN is an unsupervised method for image domain conversion that learns from two unpaired datasets. Separately, the image generation method known as StyleGAN improves accuracy by learning and incorporating the style of the image. In this paper, we propose a method for domain conversion based on the CycleGAN model, adding a new module that can learn the style and perform style transfer. The generated images change depending on the resolution at which noise parameters are input during transposed convolution (deconvolution) in the style transfer module. In experiments, we confirmed that the generated images changed depending on the resolution of the image into which the noise was input.
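The abstract's key idea, injecting noise at a chosen resolution of the generator's upsampling pipeline, can be illustrated with a minimal NumPy sketch. This is not the paper's actual module; the function names, the nearest-neighbor upsampling (a stand-in for transposed convolution), and the `noise_by_res` dictionary are all illustrative assumptions.

```python
import numpy as np

def upsample(x):
    # Nearest-neighbor 2x upsampling (stand-in for transposed convolution)
    return x.repeat(2, axis=0).repeat(2, axis=1)

def generate(latent, noise_by_res, rng=None):
    """Toy sketch of per-resolution noise injection (StyleGAN-style).

    latent       -- 4x4 starting feature map
    noise_by_res -- dict mapping resolution -> noise scale; a scale of 0
                    (or a missing key) disables noise at that resolution.
    Both names are hypothetical, not taken from the paper.
    """
    rng = rng or np.random.default_rng(0)
    x = latent
    for res in (8, 16, 32):
        x = upsample(x)                      # grow to the next resolution
        scale = noise_by_res.get(res, 0.0)   # noise strength at this stage
        x = x + scale * rng.standard_normal(x.shape)
    return x

latent = np.ones((4, 4))
coarse = generate(latent, {8: 1.0})    # noise only at the coarse 8x8 stage
fine = generate(latent, {32: 1.0})     # noise only at the fine 32x32 stage
```

Both outputs are 32x32, but noise injected at the coarse stage is upsampled along with the features and so affects large blocky regions, while noise at the final stage perturbs individual pixels, which is one way to picture why the generated images change depending on where the noise enters.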