Abstract
Generative Adversarial Networks (GANs) are effective for image generation, but their unstable training limits broader applications. Additionally, one-shot neural architecture search (NAS) for GANs often suffers from insufficient subnet training, where subnets inherit weights from a supernet without adequate optimization, further degrading performance. To address both issues, we propose Architecture Knowledge Distillation for Evolutionary GAN (AKD-EGAN). AKD-EGAN operates in two stages. First, architecture knowledge distillation (AKD) is applied during supernet training to optimize subnets efficiently and accelerate learning. Second, a multi-objective evolutionary algorithm (MOEA) searches for optimal subnet architectures, balancing multiple performance metrics to keep the search efficient. Combined with an architecture-inheritance strategy, this approach improves GAN training stability and image quality. Experiments show that AKD-EGAN surpasses state-of-the-art methods, achieving a Fréchet Inception Distance (FID) of 7.91 and an Inception Score (IS) of 8.97 on CIFAR-10, along with competitive results on STL-10 (FID: 20.32, IS: 10.06). Code and models will be available at https://github.com/njit-ly/AKD-EGAN.