
Enhancing GAN Performance through Neural Architecture Search and Tensor Decomposition

Citation Author(s):
Prasanna Reddy Pulakurthi, Mahsa Mozaffari, Sohail A. Dianat, Majid Rabbani, Jamison Heard, and Raghuveer Rao
Submitted by:
Prasanna Reddy Pulakurthi
Last updated:
17 April 2024 - 4:38am
Document Type:
Presentation Slides
Document Year:
2024
Event:
Presenters:
Prasanna Reddy Pulakurthi
Paper Code:
MLSP-L11.1
 

Generative Adversarial Networks (GANs) have emerged as a powerful tool for generating high-fidelity content. This paper presents a new training procedure that leverages Neural Architecture Search (NAS) to discover the optimal architecture for image generation, while employing the Maximum Mean Discrepancy (MMD) repulsive loss for adversarial training. Moreover, the generator network is compressed using tensor decomposition to reduce its computational footprint and inference time while preserving its generative performance. Experimental results show FID score improvements of 34% and 28% on the CIFAR-10 and STL-10 datasets, respectively, with corresponding footprint reductions of 14× and 31×, compared to the method with the best FID score reported in the literature. The implementation code is available at: github.com/PrasannaPulakurthi/MMD-AdversarialNAS.
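The repulsive MMD loss mentioned above replaces the usual attractive discriminator objective of MMD-GAN (Wang et al., ICLR 2019). As a rough illustration, the PyTorch sketch below computes both the generator loss (the squared MMD between real and generated embeddings) and the repulsive discriminator loss; the mixture-of-Gaussian kernel, the bandwidth values, and the function names are illustrative assumptions rather than the exact choices used in the paper (see the linked repository for the actual training code).

```python
import torch

def gaussian_kernel(a, b, sigmas=(1.0, 2.0, 4.0, 8.0)):
    """Mixture-of-RBF kernel matrix between the rows of a and b (assumed kernel)."""
    d2 = torch.cdist(a, b, p=2).pow(2)  # squared pairwise distances
    return sum(torch.exp(-d2 / (2.0 * s ** 2)) for s in sigmas)

def mmd_losses(real_feat, fake_feat):
    """Return (generator loss, repulsive discriminator loss).

    real_feat, fake_feat: discriminator embeddings of real and generated
    images, shape (batch, dim). Self-similarity (diagonal) terms are
    excluded from the within-set averages.
    """
    n, m = real_feat.size(0), fake_feat.size(0)
    k_rr = gaussian_kernel(real_feat, real_feat)
    k_ff = gaussian_kernel(fake_feat, fake_feat)
    k_rf = gaussian_kernel(real_feat, fake_feat)

    e_rr = (k_rr.sum() - k_rr.diag().sum()) / (n * (n - 1))  # E[k(x, x')]
    e_ff = (k_ff.sum() - k_ff.diag().sum()) / (m * (m - 1))  # E[k(y, y')]
    e_rf = k_rf.mean()                                       # E[k(x, y)]

    # Generator minimizes the squared MMD between real and fake distributions.
    loss_g = e_rr - 2.0 * e_rf + e_ff
    # Repulsive discriminator loss: spreads real embeddings apart while
    # contracting fake embeddings, instead of maximizing the MMD directly.
    loss_d = e_rr - e_ff
    return loss_g, loss_d
```

The compression step applies tensor decomposition to the generator's layers, which is where the reported 14× and 31× footprint reductions come from. The hypothetical sketch below shows the idea in its simplest matrix form, a truncated-SVD factorization of a fully connected layer into two smaller layers; it is not the specific decomposition or rank-selection procedure used in the paper.

```python
import torch

def factorize_linear(layer, rank):
    """Replace an nn.Linear with a rank-limited two-layer factorization.

    Truncated SVD gives W ≈ (U_r S_r)(V_r), so the parameter count drops
    from in*out to rank*(in + out) when rank is small.
    """
    W = layer.weight.data                        # shape (out, in)
    U, S, Vh = torch.linalg.svd(W, full_matrices=False)
    U_r = U[:, :rank] * S[:rank]                 # (out, rank)
    V_r = Vh[:rank, :]                           # (rank, in)

    first = torch.nn.Linear(W.size(1), rank, bias=False)
    second = torch.nn.Linear(rank, W.size(0), bias=layer.bias is not None)
    first.weight.data.copy_(V_r)
    second.weight.data.copy_(U_r)
    if layer.bias is not None:
        second.bias.data.copy_(layer.bias.data)
    return torch.nn.Sequential(first, second)
```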
