Abstract: The larger the size of the data, structured or unstructured, the harder it is to understand and make use of it. Feature selection is one of the fundamentals of machine learning: by reducing the number of irrelevant or redundant features, it dramatically reduces the run time of a learning algorithm and leads to a more general concept. In this paper, the realization of feature selection through a neural-network-based algorithm, with the aid of a topology-optimizing genetic algorithm, is investigated. We have utilized NeuroEvolution of Augmenting Topologies (NEAT) to select the subset of features most relevant to the target concept. Discovery and improvement of solutions are two main goals of machine learning; however, their accuracy varies depending on the dimensionality of the problem space. Although feature selection methods can help improve this accuracy, the complexity of the problem can also affect their performance. Artificial neural networks have proven effective in feature elimination, but as a consequence of the fixed topology of most neural networks, they lose accuracy when the problem contains a considerable number of local minima. To minimize this drawback, the topology of the neural network should be flexible, and it should be able to avoid local minima, especially when a feature is removed. In this work, the power of feature selection through the NEAT method is demonstrated. Compared to the evolution of networks with fixed structure, NEAT discovers significantly more sophisticated strategies. The results show that NEAT provides better accuracy than a conventional Multi-Layer Perceptron and leads to improved feature selection.
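To illustrate the evolutionary feature-selection idea the abstract describes, here is a minimal sketch in Python. This is not the paper's NEAT method (NEAT additionally evolves network topology); it only shows a plain genetic algorithm evolving a binary feature mask toward the subset most relevant to a target concept. The fitness function, feature count, and the assumption that only features 0 and 1 are relevant are all hypothetical choices for the demo.

```python
import random

random.seed(0)
N_FEATURES = 6
RELEVANT = {0, 1}  # hypothetical ground truth: only these features matter

def fitness(mask):
    # Reward masks that keep relevant features and penalize irrelevant ones.
    kept = {i for i, bit in enumerate(mask) if bit}
    return len(kept & RELEVANT) - 0.5 * len(kept - RELEVANT)

def evolve(pop_size=20, generations=30):
    # Random initial population of binary feature masks.
    pop = [[random.randint(0, 1) for _ in range(N_FEATURES)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[:pop_size // 2]            # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, N_FEATURES)  # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.2:              # bit-flip mutation
                j = random.randrange(N_FEATURES)
                child[j] = 1 - child[j]
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
print(best)
```

In a real setting the fitness would be the validation accuracy of a network trained on the masked features, and NEAT would grow the network structure alongside the mask rather than keeping it fixed.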
Funding: Financial support from the Swedish International Development Cooperation Agency (SIDA) through the research capacity-building program between Addis Ababa University and Swedish universities.
Abstract: This paper introduces a novel hybrid optimization algorithm, Adaptive Hybrid PSO-Embedded GA (AHPEGA), which dynamically adapts to optimization performance by integrating Particle Swarm Optimization (PSO) and Genetic Algorithms (GA). The primary objective is to enhance the neuroevolutionary training of multilayer-perceptron-based controllers (MLPCs) through the joint optimization of model parameters and structural hyperparameters. Traditional training methods frequently encounter issues such as premature convergence and limited generalization. AHPEGA addresses these limitations through an adaptive training strategy that dynamically adjusts parameters during the evolutionary process, thereby improving convergence speed and solution quality. By effectively reducing entrapment in local minima and balancing exploration and exploitation, AHPEGA improves the quality of neural controller design. The algorithm's performance is evaluated against conventional optimization methods, demonstrating significant improvements in accuracy, convergence speed, and consistency across multiple runs. The practical applicability of the proposed method is demonstrated through simulation of a VSC-based islanded microgrid (MG), where ensuring reliable and effective control under variable operating conditions is critical. This highlights AHPEGA's capability to optimize intelligent control strategies in MG systems, particularly under dynamic and uncertain conditions, reinforcing its practical value in real-world energy environments.
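The hybrid PSO/GA idea can be sketched minimally as follows. This is not AHPEGA itself (its adaptive parameter schedule and MLPC objective are specific to the paper); it only shows the generic pattern of a standard PSO update with GA-style crossover and mutation injected each iteration to counter premature convergence. The 2-D sphere function stands in, hypothetically, for the controller's loss, and all constants are illustrative.

```python
import random

random.seed(1)
DIM, SWARM, ITERS = 2, 15, 60
W, C1, C2 = 0.7, 1.5, 1.5            # inertia and acceleration weights

def loss(x):                          # sphere function, minimum at the origin
    return sum(v * v for v in x)

pos = [[random.uniform(-5, 5) for _ in range(DIM)] for _ in range(SWARM)]
vel = [[0.0] * DIM for _ in range(SWARM)]
pbest = [p[:] for p in pos]           # personal bests
gbest = min(pbest, key=loss)          # global best

for _ in range(ITERS):
    for i in range(SWARM):
        for d in range(DIM):          # standard PSO velocity/position update
            vel[i][d] = (W * vel[i][d]
                         + C1 * random.random() * (pbest[i][d] - pos[i][d])
                         + C2 * random.random() * (gbest[d] - pos[i][d]))
            pos[i][d] += vel[i][d]
        if loss(pos[i]) < loss(pbest[i]):
            pbest[i] = pos[i][:]
    gbest = min(pbest, key=loss)
    # GA step: replace the two worst particles with a mutated crossover
    # of two randomly chosen personal bests.
    order = sorted(range(SWARM), key=lambda i: loss(pos[i]))
    for i in order[-2:]:
        a, b = random.sample(pbest, 2)
        cut = random.randrange(1, DIM)                      # one-point crossover
        child = [v + random.gauss(0, 0.1) for v in a[:cut] + b[cut:]]
        pos[i], vel[i] = child, [0.0] * DIM

print(loss(gbest))
```

The GA injection gives the swarm a source of diversity that plain PSO lacks; AHPEGA goes further by adapting these operator parameters online based on observed optimization performance.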