<div class="csl-bib-body">
<div class="csl-entry">Ganian, R., & Korchemna, V. (2021). The Complexity of Bayesian Network Learning: Revisiting the Superstructure. In M. Ranzato, A. Beygelzimer, Y. Dauphin, P. S. Liang, & J. Wortman Vaughan (Eds.), <i>Advances in Neural Information Processing Systems 34 (NeurIPS 2021)</i> (pp. 430–442). Curran Associates, Inc. https://doi.org/10.34726/3905</div>
</div>
-
dc.identifier.uri
http://hdl.handle.net/20.500.12708/175987
-
dc.identifier.uri
https://doi.org/10.34726/3905
-
dc.description.abstract
We investigate the parameterized complexity of Bayesian Network Structure Learning (BNSL), a classical problem that has received significant attention in both empirical and purely theoretical studies. We follow up on previous works that analyzed the complexity of BNSL with respect to the so-called superstructure of the input. While known results imply that BNSL is unlikely to be fixed-parameter tractable even when parameterized by the size of a vertex cover in the superstructure, here we show that a different kind of parameterization - notably by the size of a feedback edge set - yields fixed-parameter tractability. We proceed by showing that this result can be strengthened to a localized version of the feedback edge set, and provide corresponding lower bounds that complement previous results to obtain a complexity classification of BNSL with respect to virtually all well-studied graph parameters. We then analyze how the complexity of BNSL depends on the representation of the input. In particular, while the bulk of past theoretical work on the topic assumed the use of the so-called non-zero representation, here we prove that if an additive representation can be used instead, then BNSL becomes fixed-parameter tractable even under significantly milder restrictions on the superstructure, notably when parameterized by the treewidth alone. Last but not least, we show how our results can be extended to the closely related problem of Polytree Learning.
en
dc.description.sponsorship
FWF Fonds zur Förderung der wissenschaftlichen Forschung (FWF)
-
dc.description.sponsorship
FWF Fonds zur Förderung der wissenschaftlichen Forschung (FWF)
-
dc.language.iso
en
-
dc.rights.uri
http://rightsstatements.org/vocab/InC/1.0/
-
dc.subject
Parameterized complexity
en
dc.subject
Bayesian Network Structure Learning
en
dc.subject
fixed-parameter tractability
en
dc.subject
Polytree Learning
en
dc.title
The Complexity of Bayesian Network Learning: Revisiting the Superstructure
en
dc.type
Inproceedings
en
dc.type
Konferenzbeitrag
de
dc.rights.license
Urheberrechtsschutz
de
dc.rights.license
In Copyright
en
dc.identifier.doi
10.34726/3905
-
dc.relation.isbn
9781713845393
-
dc.description.startpage
430
-
dc.description.endpage
442
-
dc.relation.grantno
P31336-N35
-
dc.relation.grantno
Y1329-N
-
dc.type.category
Full-Paper Contribution
-
tuw.booktitle
Advances in Neural Information Processing Systems 34 (NeurIPS 2021)
-
tuw.container.volume
34
-
tuw.peerreviewed
true
-
tuw.relation.publisher
Curran Associates, Inc.
-
tuw.project.title
New Frontiers for Parameterized Complexity
-
tuw.project.title
Parameterisierte Analyse in der Künstlichen Intelligenz