Note
This entry was automatically migrated from a legacy system. The data have not been verified and may not meet the quality criteria of the current system.
Ordyniak, S., & Szeider, S. (2021). Parameterized Complexity of Small Decision Tree Learning. In Thirty-Fifth AAAI Conference on Artificial Intelligence (pp. 1–9). AAAI Press. http://hdl.handle.net/20.500.12708/58602
E192-01 - Research Unit Algorithms and Complexity
-
Published in:
Thirty-Fifth AAAI Conference on Artificial Intelligence
-
Date (published):
2021
-
Event name:
35th AAAI 2021
-
Event period:
2-Feb-2021 - 9-Feb-2021
-
Event location:
virtual event, unknown
-
Extent:
9 pages
-
Publisher:
AAAI Press
-
Peer Reviewed:
Yes
-
Abstract:
We study the NP-hard problem of learning a decision tree (DT) of smallest depth or size from data. We provide the first parameterized complexity analysis of the problem and draw a detailed parameterized complexity map for the natural parameters: size or depth of the DT, maximum domain size of all features, and the maximum Hamming distance between any two examples. Our main result shows that learning DTs of smallest depth or size is fixed-parameter tractable (FPT) parameterized by the combination of all three of these parameters. We contrast this FPT result with various hardness results that underline the algorithmic significance of the considered parameters.
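The two data-dependent parameters named in the abstract (maximum domain size over all features, and maximum Hamming distance between any two examples) are straightforward to compute for a given dataset. A minimal sketch on a hypothetical toy dataset (the data and variable names are illustrative, not from the paper):

```python
from itertools import combinations

# Illustrative toy dataset: (feature vector, class label) pairs.
examples = [
    ((0, 1, 2), 0),
    ((0, 0, 2), 1),
    ((1, 1, 0), 0),
]

vectors = [x for x, _ in examples]
num_features = len(vectors[0])

# Maximum domain size: largest number of distinct values any feature takes.
max_domain = max(len({v[i] for v in vectors}) for i in range(num_features))

# Maximum Hamming distance between any two examples
# (number of feature positions in which they differ).
def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

max_hamming = max(hamming(a, b) for a, b in combinations(vectors, 2))

print(max_domain, max_hamming)  # → 2 3
```

The paper's FPT result applies when these two quantities, together with the target DT size or depth, are all bounded.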