TY - JOUR
T1 - Efficient multi-objective neural architecture search framework via policy gradient algorithm
AU - Lyu, Bo
AU - Yang, Yin
AU - Cao, Yuting
AU - Wang, Pengcheng
AU - Zhu, Jian
AU - Chang, Jingfei
AU - Wen, Shiping
N1 - Publisher Copyright:
© 2024 The Author(s)
PY - 2024/3
Y1 - 2024/3
N2 - Differentiable architecture search plays a prominent role in Neural Architecture Search (NAS) and exhibits better efficiency than traditional heuristic NAS methods, including those based on evolutionary algorithms (EA) and reinforcement learning (RL). However, differentiable NAS methods encounter challenges when dealing with non-differentiable objectives such as energy efficiency, resource constraints, and other non-differentiable metrics, especially in multi-objective search scenarios. While multi-objective NAS research addresses these challenges, the individual training required for each candidate architecture demands significant computational resources. To bridge this gap, this work combines the efficiency of differentiable NAS with the metric compatibility of multi-objective NAS. Architectures are discretely sampled according to the architecture parameter alpha within the differentiable NAS framework, and alpha is directly optimized by the policy gradient algorithm. This approach eliminates the need to learn a separate sampling controller and accommodates non-differentiable metrics. We provide an efficient NAS framework that can be readily customized to real-world multi-objective NAS (MNAS) scenarios, encompassing factors such as resource limitations and platform specialization. Notably, compared with other multi-objective NAS methods, our framework effectively reduces the computational burden (just 1/6 of that of NSGA-Net). The search framework is also compatible with other efficiency and performance improvement strategies within the differentiable NAS framework.
AB - Differentiable architecture search plays a prominent role in Neural Architecture Search (NAS) and exhibits better efficiency than traditional heuristic NAS methods, including those based on evolutionary algorithms (EA) and reinforcement learning (RL). However, differentiable NAS methods encounter challenges when dealing with non-differentiable objectives such as energy efficiency, resource constraints, and other non-differentiable metrics, especially in multi-objective search scenarios. While multi-objective NAS research addresses these challenges, the individual training required for each candidate architecture demands significant computational resources. To bridge this gap, this work combines the efficiency of differentiable NAS with the metric compatibility of multi-objective NAS. Architectures are discretely sampled according to the architecture parameter alpha within the differentiable NAS framework, and alpha is directly optimized by the policy gradient algorithm. This approach eliminates the need to learn a separate sampling controller and accommodates non-differentiable metrics. We provide an efficient NAS framework that can be readily customized to real-world multi-objective NAS (MNAS) scenarios, encompassing factors such as resource limitations and platform specialization. Notably, compared with other multi-objective NAS methods, our framework effectively reduces the computational burden (just 1/6 of that of NSGA-Net). The search framework is also compatible with other efficiency and performance improvement strategies within the differentiable NAS framework.
KW - Neural architecture search
KW - Non-differentiable
KW - Reinforcement learning
KW - Supernetwork
UR - https://www.webofscience.com/api/gateway?GWVersion=2&SrcApp=hbku_researchportal&SrcAuth=WosAPI&KeyUT=WOS:001173851400001&DestLinkType=FullRecord&DestApp=WOS_CPL
U2 - 10.1016/j.ins.2024.120186
DO - 10.1016/j.ins.2024.120186
M3 - Article
SN - 0020-0255
VL - 661
JO - Information Sciences
JF - Information Sciences
M1 - 120186
ER -