Komusiewicz, C., Kunz, P., Sommer, F., & Sorge, M. (2023). On Computing Optimal Tree Ensembles. In A. Krause, E. Brunskill, K. Cho, B. Engelhardt, S. Sabato, & J. Scarlett (Eds.), Proceedings of the 40th International Conference on Machine Learning (pp. 17364–17374). http://hdl.handle.net/20.500.12708/192688
E192-01 - Forschungsbereich Algorithms and Complexity
-
Published in:
Proceedings of the 40th International Conference on Machine Learning
-
Volume:
202
-
Date (published):
2023
-
Event name:
40th International Conference on Machine Learning (ICML 2023)
Event date:
23-Jul-2023 - 29-Jul-2023
-
Event place:
Honolulu, United States of America
-
Number of Pages:
11
-
Peer reviewed:
Yes
-
Keywords:
Decision Trees; Machine Learning
Abstract:
Random forests and, more generally, (decision-)tree ensembles are widely used methods for classification and regression. Recent algorithmic advances allow computing decision trees that are optimal for various measures such as their size or depth. We are not aware of such research for tree ensembles and aim to contribute to this area. Mainly, we provide two novel algorithms and corresponding lower bounds. First, we are able to carry over and substantially improve on tractability results for decision trees, obtaining a (6δDS)^S · poly-time algorithm, where S is the number of cuts in the tree ensemble, D the largest domain size, and δ is the largest number of features in which two examples differ. To achieve this, we introduce the witness-tree technique, which also seems promising for practice. Second, we show that dynamic programming, which has been successful for decision trees, may also be viable for tree ensembles, providing an ℓ^n · poly-time algorithm, where ℓ is the number of trees and n the number of examples. Finally, we compare the number of cuts necessary to classify training data sets for decision trees and tree ensembles, showing that ensembles may need exponentially fewer cuts for an increasing number of trees.