<div class="csl-bib-body">
<div class="csl-entry">Heiss, J. M. (2019). <i>Implicit regularization for artificial neural networks</i> [Diploma Thesis, Technische Universität Wien]. reposiTUm. https://doi.org/10.34726/hss.2019.69320</div>
</div>
-
dc.identifier.uri
https://doi.org/10.34726/hss.2019.69320
-
dc.identifier.uri
http://hdl.handle.net/20.500.12708/4493
-
dc.description.abstract
The main result is a rigorous proof that artificial neural networks without explicit regularization implicitly regularize the integral of the squared second derivative, $\int \big(f''(x)\big)^2\,dx$, when trained by gradient descent: under certain conditions they solve very precisely the smoothing spline regression problem $f^* := \operatorname{arg\,min}_{f \in C^2} \big( \sum_{i=1}^{N} \big(f(x_i^{\mathrm{train}}) - y_i^{\mathrm{train}}\big)^2 + \lambda \int \big(f''(x)\big)^2\,dx \big)$. Artificial neural networks are often used in machine learning to estimate an unknown function $f^{\mathrm{true}}$ from only finitely many observed data points. Many methods guarantee that the estimated function converges to the true function $f^{\mathrm{true}}$ as the number of samples tends to infinity, but in practice only a finite number $N$ of samples is available. Given a finite number of data points, there are infinitely many functions that fit perfectly through the $N$ data points but generalize arbitrarily badly; therefore one needs some regularization to find a suitable function. With the help of the main theorem one can resolve the paradox of why training neural networks without explicit regularization works surprisingly well under certain conditions (in the case of 1-dimensional wide ReLU randomized shallow neural networks).
en
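As a reading aid for the abstract above, here is a minimal numerical sketch of the claimed phenomenon; it is not code from the thesis. In a randomized shallow network the hidden layer is drawn at random and frozen, so gradient descent only trains the output weights on the plain squared loss, with no explicit penalty term. The sketch fits such a network to 1-D toy data and compares it with a cubic smoothing spline; all parameter choices (n_hidden, lr, the step count, and the smoothing factor s) are illustrative assumptions, and SciPy's UnivariateSpline solves a closely related, not identical, objective.

```python
# Minimal numerical sketch of the abstract's claim -- not code from the
# thesis. A wide randomized shallow ReLU network (random fixed hidden
# layer, output layer trained by plain gradient descent, no explicit
# regularization) is fit to 1-D data and compared with a smoothing spline.
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(0)

# Toy 1-D training data: N noisy samples of a sine wave.
N = 20
x_train = np.sort(rng.uniform(-3.0, 3.0, N))
y_train = np.sin(x_train) + 0.1 * rng.standard_normal(N)

# Randomized shallow network: random first layer, frozen after init.
n_hidden = 2000
v = rng.standard_normal(n_hidden)           # hidden weights (fixed)
b = -rng.uniform(-3.0, 3.0, n_hidden) * v   # kinks uniform in [-3, 3]

def features(x):
    """ReLU feature map of the frozen random hidden layer."""
    return np.maximum(np.outer(x, v) + b, 0.0)

# Gradient descent on the plain mean squared error, output weights
# initialized at zero -- no penalty term anywhere in the loss.
Phi = features(x_train)
w = np.zeros(n_hidden)
lr = 0.1 / n_hidden
for _ in range(50_000):
    w -= lr * (2.0 / N) * Phi.T @ (Phi @ w - y_train)

# Compare the trained network with a cubic smoothing spline on a grid.
x_grid = np.linspace(-3.0, 3.0, 601)
f_net = features(x_grid) @ w
spline = UnivariateSpline(x_train, y_train, s=N * 0.1**2)
print("train MSE (network):", np.mean((Phi @ w - y_train) ** 2))
print("max |network - spline| on grid:",
      np.max(np.abs(f_net - spline(x_grid))))
```

The printed gap gives a rough sense of how close the two fits are; stopping the descent earlier plays the role of a larger smoothing parameter $\lambda$, which is the early-stopping connection listed in the subject keywords.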
dc.language
English
-
dc.language.iso
en
-
dc.rights.uri
http://rightsstatements.org/vocab/InC/1.0/
-
dc.subject
implizite Regularisierung
de
dc.subject
maschinelles Lernen
de
dc.subject
neuronale Netzwerke
de
dc.subject
early stopping
de
dc.subject
Spline
de
dc.subject
Regression
de
dc.subject
Gradienten-Verfahren
de
dc.subject
Backpropagation
de
dc.subject
künstliche Intelligenz
de
dc.subject
implicit regularization
en
dc.subject
machine learning
en
dc.subject
neural networks
en
dc.subject
early stopping
en
dc.subject
spline
en
dc.subject
regression
en
dc.subject
gradient descent
en
dc.subject
back-propagation
en
dc.subject
artificial intelligence
en
dc.title
Implicit regularization for artificial neural networks
en
dc.type
Thesis
en
dc.type
Hochschulschrift
de
dc.rights.license
In Copyright
en
dc.rights.license
Urheberrechtsschutz
de
dc.identifier.doi
10.34726/hss.2019.69320
-
dc.contributor.affiliation
TU Wien, Österreich
-
dc.rights.holder
Jakob Michael Heiss
-
dc.publisher.place
Wien
-
tuw.version
vor
-
tuw.thesisinformation
Technische Universität Wien
-
tuw.publication.orgunit
E105 - Institut für Stochastik und Wirtschaftsmathematik
-
dc.type.qualificationlevel
Diploma
-
dc.identifier.libraryid
AC15493687
-
dc.description.numberOfPages
40
-
dc.identifier.urn
urn:nbn:at:at-ubtuw:1-130139
-
dc.thesistype
Diplomarbeit
de
dc.thesistype
Diploma Thesis
en
dc.rights.identifier
In Copyright
en
dc.rights.identifier
Urheberrechtsschutz
de
tuw.advisor.staffStatus
staff
-
item.languageiso639-1
en
-
item.openairetype
master thesis
-
item.grantfulltext
open
-
item.fulltext
with Fulltext
-
item.cerifentitytype
Publications
-
item.mimetype
application/pdf
-
item.openairecristype
http://purl.org/coar/resource_type/c_bdcc
-
item.openaccessfulltext
Open Access
-
crisitem.author.dept
E104 - Institut für Diskrete Mathematik und Geometrie