Abstract

The rapid development of evolutionary deep learning has led to the emergence of various Neural Architecture Search (NAS) algorithms designed to optimize neural network structures. However, these algorithms often incur significant computational costs, because training candidate networks and evaluating their performance is time-consuming. Traditional NAS approaches, which rely on exhaustive evaluations and large training datasets, are inefficient for solving complex image classification tasks within limited time frames. To address these challenges, this paper proposes a novel NAS algorithm that integrates a hierarchical evaluation strategy based on surrogate models, specifically using a supernet to pre-train weights and random forests as performance predictors. The hierarchical framework combines rapid surrogate-model evaluations with traditional, precise evaluations to balance the trade-off between performance accuracy and computational efficiency. The algorithm significantly reduces evaluation time by predicting the fitness of candidate architectures with a random forest surrogate model, thus avoiding a full training cycle for each architecture. The proposed method also incorporates evolutionary operations such as mutation and crossover to refine the search process and improve the accuracy of the resulting architectures. Experimental evaluations on the CIFAR-10 and CIFAR-100 datasets demonstrate that the proposed hierarchical evaluation strategy reduces search time and cost compared with traditional methods, while achieving comparable or even superior model performance. The results suggest that this approach can efficiently handle resource-constrained tasks, providing a promising solution for accelerating the NAS process without compromising the quality of the generated architectures.
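To make the described workflow concrete, the following is a minimal sketch of a hierarchical surrogate-assisted evolutionary search loop in the spirit of the abstract, not the paper's actual implementation. The integer genome encoding, the population and screening sizes, and the `supernet_accuracy` placeholder (standing in for validation with weights inherited from a pre-trained supernet) are all illustrative assumptions; only scikit-learn's `RandomForestRegressor` is a real API.

```python
# Hypothetical sketch: hierarchical surrogate-assisted evolutionary NAS.
# Encoding, sizes, and the supernet evaluation are assumptions for illustration.
import random
import numpy as np
from sklearn.ensemble import RandomForestRegressor

GENOME_LEN = 12   # assumed length of an architecture encoding
N_OPS = 5         # assumed number of candidate operations per position

def random_architecture():
    """Sample a random architecture encoding (integer genome)."""
    return [random.randrange(N_OPS) for _ in range(GENOME_LEN)]

def supernet_accuracy(arch):
    """Placeholder for evaluating `arch` with supernet-inherited weights;
    a real implementation would run validation batches. Here we return a
    deterministic pseudo-accuracy so the sketch is self-contained."""
    rng = np.random.default_rng(hash(tuple(arch)) % (2**32))
    return float(rng.uniform(0.60, 0.95))

def mutate(arch, rate=0.1):
    """Point mutation: resample each gene with probability `rate`."""
    return [random.randrange(N_OPS) if random.random() < rate else g
            for g in arch]

def crossover(a, b):
    """One-point crossover between two parent encodings."""
    cut = random.randrange(1, GENOME_LEN)
    return a[:cut] + b[cut:]

# 1) Build an initial surrogate training set from cheap supernet evaluations.
archive = [random_architecture() for _ in range(50)]
fitness = [supernet_accuracy(a) for a in archive]
surrogate = RandomForestRegressor(n_estimators=100).fit(archive, fitness)

# 2) Evolutionary search: the surrogate pre-screens offspring, and only the
#    top-ranked fraction receives the more expensive, precise evaluation.
population = archive[:20]
for generation in range(10):
    offspring = [mutate(crossover(*random.sample(population, 2)))
                 for _ in range(40)]
    predicted = surrogate.predict(offspring)
    top = np.argsort(predicted)[-10:]          # verify only the best quarter
    evaluated = [(offspring[i], supernet_accuracy(offspring[i])) for i in top]
    # Refresh the archive and retrain the surrogate on new ground truth.
    archive += [a for a, _ in evaluated]
    fitness += [f for _, f in evaluated]
    surrogate.fit(archive, fitness)
    # Environmental selection: keep the best individuals found so far.
    ranked = sorted(zip(archive, fitness), key=lambda p: p[1], reverse=True)
    population = [a for a, _ in ranked[:20]]

print("best verified accuracy found:", max(fitness))
```

Retraining the random forest on each batch of precisely evaluated offspring keeps the predictor aligned with the region of the search space the population currently occupies; this predict-first, verify-the-best loop is the essence of a hierarchical evaluation strategy, since full training is reserved for candidates the surrogate already deems promising.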