Journal Article

PWSNAS: Powering Weight Sharing NAS With General Search Space Shrinking Framework
Document Type
Periodical
Author
Source
IEEE Transactions on Neural Networks and Learning Systems, vol. 34, no. 11, pp. 9171-9184, Nov. 2023
Subject
Computing and Processing
Communication, Networking and Broadcast Technologies
Components, Circuits, Devices and Systems
General Topics for Engineers
Computer architecture
Training
Optimization
Extraterrestrial measurements
Estimation
Computational modeling
Search problems
Metric
neural architecture search (NAS)
search space shrinking
weight sharing
Language
English
ISSN
2162-237X
2162-2388
Abstract
Neural architecture search (NAS) depends heavily on an efficient and accurate performance estimator. To speed up the evaluation process, recent advances such as differentiable architecture search (DARTS) and One-Shot approaches, instead of training every model from scratch, train a weight-sharing super-network that reuses parameters among different candidates, so that all child models can be evaluated efficiently. Although these methods significantly boost search efficiency, they inherently suffer from inaccurate and unstable performance estimation. To this end, we propose PWSNAS, a general and effective framework for powering weight-sharing NAS by automatically shrinking the search space, i.e., discarding candidate operators that are less important. With this strategy, our approach progressively simplifies the original search space into a smaller, more promising one, which makes it easier for existing NAS methods to find superior architectures. In particular, we present two strategies to guide the shrinking process: detecting redundant operators with a new angle-based metric and decreasing the degree of weight sharing in the super-network by increasing its parameters, which differentiates PWSNAS from existing shrinking methods. Comprehensive analysis experiments on NAS-Bench-201 verify the superiority of the proposed metric over existing accuracy-based and magnitude-based metrics. PWSNAS can be readily applied to state-of-the-art NAS methods, e.g., single-path one-shot neural architecture search (SPOS), FairNAS, ProxylessNAS, DARTS, and progressive DARTS (PDARTS). We evaluate PWSNAS and demonstrate consistent performance gains over the baseline methods.
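For illustration only, below is a minimal Python sketch of the kind of angle-based shrinking step the abstract describes, assuming each candidate operator is scored by the angle between its super-network weights at initialization and after training, and that the lowest-scoring operators are discarded. The operator names, the drop_k parameter, and the weight dictionaries are hypothetical and not the authors' implementation.

import numpy as np

def angle(w_init, w_trained):
    # Angle between the flattened weight vectors of an operator
    # before and after super-network training.
    v0, v1 = np.ravel(w_init), np.ravel(w_trained)
    cos = np.dot(v0, v1) / (np.linalg.norm(v0) * np.linalg.norm(v1) + 1e-12)
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

def shrink_search_space(candidates, init_weights, trained_weights, drop_k=1):
    # Rank candidate operators by the angle metric and discard the
    # drop_k operators whose weights moved the least during training
    # (here taken as a proxy for low importance).
    scores = {op: angle(init_weights[op], trained_weights[op]) for op in candidates}
    ranked = sorted(candidates, key=lambda op: scores[op])
    dropped = set(ranked[:drop_k])
    return [op for op in candidates if op not in dropped]

# Example usage with toy weights (hypothetical operators and shapes):
# ops = ["conv3x3", "conv1x1", "skip", "avg_pool"]
# init = {op: np.random.randn(64) for op in ops}
# trained = {op: init[op] + 0.1 * np.random.randn(64) for op in ops}
# print(shrink_search_space(ops, init, trained, drop_k=1))

Applied iteratively, such a step would progressively reduce the number of candidate operators, which is the shrinking behavior the abstract attributes to PWSNAS; the paper itself should be consulted for the exact metric and schedule.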