Title
A generalized decision tree ensemble based on the NeuralNetworks architecture: Distributed Gradient Boosting Forest (DGBF)
Author
Faculty/Center
Subject area
Journal title
Applied Intelligence
Journal issue
19
Bibliographic citation
Delgado-Panadero, Á., Benítez-Andrades, J. A., & García-Ordás, M. T. (2023). A generalized decision tree ensemble based on the NeuralNetworks architecture: Distributed Gradient Boosting Forest (DGBF). Applied Intelligence, 53(19), 22991-23003. https://doi.org/10.1007/S10489-023-04735-W
Publisher
Springer
Date
2023-07-05
ISSN
0924-669X
Abstract
[EN] Tree-ensemble algorithms such as RandomForest and GradientBoosting are currently the dominant methods for modeling discrete or tabular data. However, they cannot perform hierarchical representation learning from raw data as NeuralNetworks do through their multi-layered structure, which is a key feature for DeepLearning problems and for modeling unstructured data. This limitation arises because tree algorithms, by their mathematical nature, cannot be trained with back-propagation. In this work, we demonstrate that the mathematical formulations of bagging and boosting can be combined to define a graph-structured tree-ensemble algorithm with a naturally distributed representation-learning process between trees (without using back-propagation). We call this novel approach Distributed Gradient Boosting Forest (DGBF) and we demonstrate that both RandomForest and GradientBoosting can be expressed as particular graph architectures of DGBF. Finally, we show that distributed learning outperforms both RandomForest and GradientBoosting on 7 of the 9 datasets.
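The abstract describes combining bagging (averaging trees fit on bootstrap samples) with boosting (fitting trees to the residuals of the running prediction) into a layered, graph-structured ensemble. The following is only a minimal illustrative sketch of that general idea under a squared-loss assumption, not the authors' DGBF algorithm: each layer bags several trees over the current residuals, and the layer's averaged output is added boosting-style to the prediction. All names and parameters here are hypothetical.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def layered_tree_ensemble(X, y, n_layers=3, trees_per_layer=4,
                          lr=0.1, max_depth=3, seed=0):
    """Toy sketch: layers of bagged trees, where each layer fits the
    residuals (negative gradient of squared loss) of the running prediction."""
    rng = np.random.default_rng(seed)
    base = y.mean()
    pred = np.full(len(y), base)           # initial prediction: the mean
    layers = []
    for _ in range(n_layers):
        residual = y - pred                # boosting target for this layer
        layer = []
        for _ in range(trees_per_layer):
            idx = rng.integers(0, len(y), len(y))   # bootstrap sample (bagging)
            tree = DecisionTreeRegressor(
                max_depth=max_depth,
                random_state=int(rng.integers(1 << 31)))
            tree.fit(X[idx], residual[idx])
            layer.append(tree)
        # average the layer's trees (bagging), then update the prediction (boosting)
        pred = pred + lr * np.mean([t.predict(X) for t in layer], axis=0)
        layers.append(layer)
    return base, layers

def predict(base, layers, X, lr=0.1):
    pred = np.full(X.shape[0], base)
    for layer in layers:
        pred = pred + lr * np.mean([t.predict(X) for t in layer], axis=0)
    return pred
```

With one layer of one tree this degenerates toward plain boosting; with one layer of many trees it resembles a RandomForest over residuals, loosely mirroring the paper's claim that both methods are special cases of one graph architecture.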
Subject
Keywords
Peer review
Yes
URI
DOI
Publisher's version
Appears in collections
- Articles [4694]
Files in this item
Name:
Generalized_Decision_Tree_Ensemble_Based_NeuralNetworks.pdf
Embargoed until: 2024-07-01
Size:
675.9 kB
Format:
Adobe PDF