Keywords :
Automated Architecture; Genetic Algorithms; NEAT; Supervised Learning; Class labels; Evolutionary mechanisms; Genetic mechanism; Machine learning problem; Multi-class classification; Neuroevolution; Neuroevolution of augmenting topologies; Performance; Specific class; Artificial Intelligence; Computer Science Applications; Computer Vision and Pattern Recognition; Statistics, Probability and Uncertainty; Computational Mathematics; Control and Optimization; Modeling and Simulation
Abstract :
[en] Advances in the field of NeuroEvolution (NE) have highlighted the potential of applying genetic and evolutionary mechanisms to Machine Learning (ML) problems while simultaneously alleviating the need for manual neural architecture design. NeuroEvolution of Augmenting Topologies (NEAT) is a prominent NE method that shows competitive performance in reinforcement learning, but it is less effective when applied to supervised multi-class problems. This may be attributed to the multiple objectives of such problems, which hinder the learning process in NEAT. In this study, we introduce C-NEAT, a novel method developed to address this issue and enhance the performance of NEAT without changing its core implementation. C-NEAT learns and classifies the different classes by creating and maintaining a container that holds the best genomes, i.e. networks, where each one focuses on learning a specific class label of the problem. Each organism in NEAT, a unit that contains the evolved genome and its fitness value, is automatically assigned to a specific class based on its index in the population. During evolution, the container keeps updating itself and stores the best evolved genomes for these classes. In this way, C-NEAT focuses on recognizing each class label and preserves the learning progress, thus ensuring higher learning efficiency.
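The container mechanism described in the abstract can be illustrated with a minimal sketch, assuming a NEAT-like population of organisms (an evolved genome plus its fitness) and a class assignment derived from the organism's index in the population. The names Organism, ClassContainer and assigned_class below are hypothetical and illustrative only, not taken from the paper's implementation.

from dataclasses import dataclass, field

@dataclass
class Organism:
    genome: object          # evolved network encoding
    fitness: float = 0.0    # fitness measured on its assigned class

@dataclass
class ClassContainer:
    """Stores the best genome seen so far for every class label."""
    num_classes: int
    best: dict = field(default_factory=dict)   # class index -> (fitness, genome)

    def assigned_class(self, organism_index: int) -> int:
        # Each organism is mapped to a class purely by its position in the population.
        return organism_index % self.num_classes

    def update(self, population: list) -> None:
        # After each generation, keep the best evolved genome per class.
        for i, org in enumerate(population):
            cls = self.assigned_class(i)
            if cls not in self.best or org.fitness > self.best[cls][0]:
                self.best[cls] = (org.fitness, org.genome)

Calling update once per generation would retain the best genome found so far for every class label even as the population continues to evolve, which corresponds to the preservation of learning progress the abstract refers to.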
Disciplines :
Computer science
Author, co-author :
Alfaham, Abdallah ; UA - University of Antwerp > Faculty of Applied Engineering ; imec, IDLab
Van Raemdonck, Stijn; University of Antwerp, Faculty of Applied Engineering, Antwerp, Belgium
Mercelis, Siegfried; University of Antwerp - imec, IDLab - Faculty of Applied Engineering, Antwerp, Belgium
Language :
English
Title :
Genetic NEAT-Based Method for Multi-Class Classification
Publication date :
2024
Event name :
2024 7th International Conference on Algorithms, Computing and Artificial Intelligence (ACAI)
Event place :
Guangzhou, China
Event date :
20-12-2024 => 22-12-2024
By request :
Yes
Audience :
International
Main work title :
ACAI 2024 - 2024 7th International Conference on Algorithms, Computing and Artificial Intelligence
Editor :
Wang, Zenghui
Publisher :
Institute of Electrical and Electronics Engineers Inc.