Title Global Negative Correlation Learning: A Unified Framework for Global Optimization of Ensemble Models.
Authors PERALES GONZÁLEZ, CARLOS; FERNÁNDEZ NAVARRO, FRANCISCO DE ASÍS; CARBONERO RUZ, MARIANO; PÉREZ RODRÍGUEZ, JAVIER
External publication No
Journal IEEE Transactions on Neural Networks and Learning Systems
Scope Article
Nature Scientific
JCR Quartile 1
SJR Quartile 1
JCR Impact 14.255
SJR Impact 4.222
Web https://www.scopus.com/inward/record.uri?eid=2-s2.0-85100868702&doi=10.1109%2fTNNLS.2021.3055734&partnerID=40&md5=5c0214443fe510fbbed4c42f71af2f27
Publication date 11/02/2021
ISI 000732412000001
Scopus Id 2-s2.0-85100868702
DOI 10.1109/TNNLS.2021.3055734
Abstract Ensembles are a widely implemented approach in the machine learning community and their success is traditionally attributed to the diversity within the ensemble. Most of these approaches foster diversity in the ensemble by data sampling or by modifying the structure of the constituent models. Despite this, there is a family of ensemble models in which diversity is explicitly promoted in the error function of the individuals. The negative correlation learning (NCL) ensemble framework is probably the most well-known algorithm within this group of methods. This article analyzes NCL and reveals that the framework actually minimizes the combination of errors of the individuals of the ensemble instead of minimizing the residuals of the final ensemble. We propose a novel ensemble framework, named global negative correlation learning (GNCL), which focuses on the optimization of the global ensemble instead of the individual fitness of its components. An analytical solution for the parameters of base regressors based on the NCL framework and the global error function proposed is also provided under the assumption of fixed basis functions (although the general framework could also be instantiated for neural networks with nonfixed basis functions). The proposed ensemble framework is evaluated by extensive experiments with regression and classification data sets. Comparisons with other state-of-the-art ensemble methods confirm that GNCL yields the best overall performance.
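As a point of reference for the distinction drawn in the abstract, the per-individual error minimized in classical NCL (Liu and Yao) can be contrasted with the global ensemble error that GNCL targets. The notation below (base learners f_i, ensemble size M, penalty coefficient lambda, N training pairs) is a simplified illustrative sketch for a uniform-average ensemble and is not reproduced from the article itself.

% Ensemble output as the uniform average of M base learners f_i
\bar{f}(\mathbf{x}) = \frac{1}{M}\sum_{i=1}^{M} f_i(\mathbf{x})

% Classical NCL: each individual i minimizes its own squared error plus a
% diversity penalty that decorrelates it from the rest of the ensemble
E_i = \frac{1}{2}\sum_{n=1}^{N}\bigl(f_i(\mathbf{x}_n) - y_n\bigr)^2
      + \lambda \sum_{n=1}^{N}\bigl(f_i(\mathbf{x}_n) - \bar{f}(\mathbf{x}_n)\bigr)
        \sum_{j \neq i}\bigl(f_j(\mathbf{x}_n) - \bar{f}(\mathbf{x}_n)\bigr)

% Global view promoted by GNCL (illustrative form, not the paper's exact
% objective): optimize the residual of the combined ensemble output
% rather than the sum of the individuals' errors
E_{\mathrm{ens}} = \frac{1}{2}\sum_{n=1}^{N}\bigl(\bar{f}(\mathbf{x}_n) - y_n\bigr)^2

Under fixed basis functions, as the abstract notes, an objective of this global form admits an analytical (least-squares-type) solution for the base regressors' output weights.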
Keywords Training; Optimization; Correlation; Estimation; Training data; Data models; Learning systems; Ensemble; global optimization; negative correlation learning (NCL)
Universidad Loyola members
