<div class="csl-bib-body">
<div class="csl-entry">Korchemna, V. (2021). <i>Parameterized algorithms for Bayesian network learning</i> [Diploma Thesis, Technische Universität Wien]. reposiTUm. https://doi.org/10.34726/hss.2021.90847</div>
</div>
-
dc.identifier.uri
https://doi.org/10.34726/hss.2021.90847
-
dc.identifier.uri
http://hdl.handle.net/20.500.12708/18189
-
dc.description.abstract
We investigate the parameterized complexity of Bayesian Network Structure Learning (BNSL), a classical problem that has received significant attention in both empirical and purely theoretical studies. We follow up on previous works that have analyzed the complexity of BNSL w.r.t. the so-called superstructure of the input. While known results imply that BNSL is unlikely to be fixed-parameter tractable even when parameterized by the size of a vertex cover in the superstructure, here we show that a different kind of parameterization—notably by the size of a feedback edge set—yields fixed-parameter tractability. We proceed by showing that this result can be strengthened to a localized version of the feedback edge set. We adapt the corresponding algorithms to the closely related problem of Polytree Learning. Concerning lower bounds, we establish W[1]-hardness of BNSL parameterized by tree-cut width. We then analyze how the complexity of BNSL depends on the representation of the input. In particular, while the bulk of past theoretical work on the topic assumed the use of the so-called non-zero representation, here we prove that if an additive representation can be used instead, then BNSL becomes fixed-parameter tractable even under significantly milder restrictions on the superstructure, notably when parameterized by the treewidth alone.
en
dc.language
English
-
dc.language.iso
en
-
dc.rights.uri
http://rightsstatements.org/vocab/InC/1.0/
-
dc.subject
Parameterized complexity
en
dc.subject
Bayesian network structure learning
en
dc.title
Parameterized algorithms for Bayesian network learning
en
dc.title.alternative
Parametrisierte Algorithmen für das Bayesianische Netzwerklernen
de
dc.type
Thesis
en
dc.type
Hochschulschrift
de
dc.rights.license
In Copyright
en
dc.rights.license
Urheberrechtsschutz
de
dc.identifier.doi
10.34726/hss.2021.90847
-
dc.contributor.affiliation
TU Wien, Österreich
-
dc.rights.holder
Viktoriia Korchemna
-
dc.publisher.place
Wien
-
tuw.version
vor
-
tuw.thesisinformation
Technische Universität Wien
-
tuw.publication.orgunit
E192 - Institut für Logic and Computation
-
dc.type.qualificationlevel
Diploma
-
dc.identifier.libraryid
AC16266894
-
dc.description.numberOfPages
51
-
dc.thesistype
Diplomarbeit
de
dc.thesistype
Diploma Thesis
en
dc.rights.identifier
In Copyright
en
dc.rights.identifier
Urheberrechtsschutz
de
tuw.advisor.staffStatus
staff
-
item.languageiso639-1
en
-
item.openairetype
master thesis
-
item.grantfulltext
open
-
item.fulltext
with Fulltext
-
item.cerifentitytype
Publications
-
item.mimetype
application/pdf
-
item.openairecristype
http://purl.org/coar/resource_type/c_bdcc
-
item.openaccessfulltext
Open Access
-
crisitem.author.dept
E192-01 - Forschungsbereich Algorithms and Complexity