Loss functions are crucial in training generative adversarial networks (GANs) and in shaping the resulting outputs. These functions, designed specifically for GANs, optimize the generator and discriminator networks jointly but with opposing objectives. GAN models, which typically require large datasets, have been highly successful in deep learning; however, identifying the factors that determine the success of GAN models developed for limited-data problems remains an important research question. In this study, we conducted a comprehensive investigation of the loss functions commonly used in the GAN literature, namely binary cross entropy (BCE), Wasserstein generative adversarial network (WGAN), least squares generative adversarial network (LSGAN), and hinge loss. Our research examined the impact of these loss functions on output quality and training convergence in single-image GANs. Specifically, we evaluated the performance of a single-image GAN model, SinGAN, with each of these loss functions in terms of image quality and diversity. Our experimental results demonstrate that these loss functions can produce high-quality, diverse images from a single training image, and that the gradient-penalty variants WGAN-GP and LSGAN-GP are more effective than the alternatives for single-image GAN models.
Keywords: Generative adversarial networks, low data regime, single-image GAN, loss functions, image diversity
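To make the comparison concrete, the sketch below shows the four adversarial objectives named in the abstract, plus the gradient penalty used by the WGAN-GP and LSGAN-GP variants. This is a minimal PyTorch illustration under our own assumptions, not code from the study; the function names (`d_loss`, `g_loss`, `gradient_penalty`) are hypothetical, and 4-D image tensors are assumed.

```python
import torch
import torch.nn.functional as F

def d_loss(real_logits, fake_logits, kind="bce"):
    """Discriminator loss on raw scores for real and generated samples."""
    if kind == "bce":    # standard GAN: binary cross entropy on logits
        return (F.binary_cross_entropy_with_logits(real_logits, torch.ones_like(real_logits))
                + F.binary_cross_entropy_with_logits(fake_logits, torch.zeros_like(fake_logits)))
    if kind == "wgan":   # Wasserstein critic: maximize D(real) - D(fake)
        return fake_logits.mean() - real_logits.mean()
    if kind == "lsgan":  # least squares: push real scores toward 1, fake toward 0
        return 0.5 * ((real_logits - 1).pow(2).mean() + fake_logits.pow(2).mean())
    if kind == "hinge":  # hinge loss with unit margin on raw scores
        return F.relu(1 - real_logits).mean() + F.relu(1 + fake_logits).mean()

def g_loss(fake_logits, kind="bce"):
    """Generator loss: make generated samples score as real."""
    if kind == "bce":
        return F.binary_cross_entropy_with_logits(fake_logits, torch.ones_like(fake_logits))
    if kind in ("wgan", "hinge"):
        return -fake_logits.mean()
    if kind == "lsgan":
        return 0.5 * (fake_logits - 1).pow(2).mean()

def gradient_penalty(critic, real, fake):
    """GP regularizer (WGAN-GP / LSGAN-GP): penalize critic gradient
    norms away from 1 at points interpolated between real and fake."""
    eps = torch.rand(real.size(0), 1, 1, 1, device=real.device)
    mix = (eps * real + (1 - eps) * fake).detach().requires_grad_(True)
    scores = critic(mix)
    grads = torch.autograd.grad(scores.sum(), mix, create_graph=True)[0]
    return ((grads.flatten(1).norm(dim=1) - 1) ** 2).mean()
```

In the gradient-penalty settings, the penalty term is added to the discriminator loss with a weighting coefficient (commonly 10, following the original WGAN-GP formulation).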
Ethical approval is not required.
| Primary Language | English |
| --- | --- |
| Subjects | Computer Vision, Pattern Recognition, Deep Learning, Neural Networks, Machine Learning (Other) |
| Journal Section | Research Articles |
| Authors | |
| Early Pub Date | December 11, 2024 |
| Publication Date | |
| Submission Date | June 8, 2024 |
| Acceptance Date | August 9, 2024 |
| Published in Issue | Year 2024, Volume: 8, Issue: 2 |
The works published in Journal of Innovative Science and Engineering (JISE) are licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.