Academic Article

Toward Evolutionary Multitask Convolutional Neural Architecture Search
Document Type
Periodical
Source
IEEE Transactions on Evolutionary Computation, 28(3):682-695, Jun. 2024
Subject
Computing and Processing
Task analysis
Computer architecture
Multitasking
Statistics
Sociology
Knowledge transfer
Costs
Convolutional neural network (CNN)
evolutionary multitask optimization (EMTO)
evolutionary neural architecture search (ENAS)
knowledge transfer
Language
ISSN
1089-778X
1941-0026
Abstract
Evolutionary neural architecture search (ENAS) methods have been successfully used to design convolutional neural network (CNN) architectures automatically. These methods have achieved excellent performance in creating a specific neural architecture for a single task but are less efficient for multiple tasks. Existing ENAS frameworks repeatedly perform the search from scratch for each task, even though these tasks may be solved by similar CNN architectures. This work presents an evolutionary multitask convolutional neural architecture search (MTNAS) framework to enable efficient architecture searches in multitask scenarios by incorporating architectural similarities. The proposed MTNAS constructs architectures for different tasks simultaneously by implementing a knowledge-sharing mechanism among multiple search processes. Specifically, promising architectures found in one search process can be transferred and reused to generate high-quality architectures for others. Furthermore, we devise an adaptive strategy to dynamically adjust the frequency of knowledge transfer, aiming to alleviate the potential effect of negative transfer. Extensive experiments demonstrate that MTNAS can outperform state-of-the-art neural architecture search (NAS) methods or achieve comparable performance in different tasks but with $2\times$ less search cost.
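The abstract's core idea — parallel evolutionary searches that occasionally exchange their best candidates, with the transfer interval adapted to limit negative transfer — can be illustrated with a toy sketch. Everything here is an illustrative assumption, not the paper's actual MTNAS algorithm: architectures are bit strings, `fitness` stands in for validation accuracy of a trained CNN, and the adaptive rule is a simple heuristic invented for the example.

```python
import random

def fitness(arch, target):
    # Similarity to a task-specific target pattern (a stand-in for the
    # validation accuracy a real NAS method would measure).
    return sum(a == t for a, t in zip(arch, target))

def mutate(arch, rate=0.1):
    return [b ^ 1 if random.random() < rate else b for b in arch]

def evolve_step(pop, target):
    # Elitist step: keep the better half, refill with mutated elites.
    pop = sorted(pop, key=lambda a: fitness(a, target), reverse=True)
    elite = pop[: len(pop) // 2]
    return elite + [mutate(random.choice(elite)) for _ in elite]

def multitask_search(targets, pop_size=10, generations=60, seed=0):
    # Assumes exactly two tasks, so task i's donor is task 1 - i.
    random.seed(seed)
    n = len(targets[0])
    pops = [[[random.randint(0, 1) for _ in range(n)]
             for _ in range(pop_size)] for _ in targets]
    interval = 5  # generations between knowledge transfers (adapted below)
    for gen in range(1, generations + 1):
        pops = [evolve_step(p, t) for p, t in zip(pops, targets)]
        if gen % interval == 0:
            improved = False
            for i, (pop, target) in enumerate(zip(pops, targets)):
                # Transfer the other task's best architecture if it beats
                # this population's worst member.
                donor = max(pops[1 - i],
                            key=lambda a: fitness(a, targets[1 - i]))
                worst = min(range(len(pop)),
                            key=lambda j: fitness(pop[j], target))
                if fitness(donor, target) > fitness(pop[worst], target):
                    pop[worst] = list(donor)
                    improved = True
            # Crude adaptive rule: transfer more often while it helps,
            # back off when it stops helping (to limit negative transfer).
            interval = max(2, interval - 1) if improved else min(20, interval + 2)
    return [max(p, key=lambda a: fitness(a, t)) for p, t in zip(pops, targets)]
```

When the two target patterns overlap (related tasks), transferred architectures give each search a head start; when they diverge, the widening interval suppresses unhelpful transfers.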