<div class="csl-bib-body">
<div class="csl-entry">Feischl, M. (2025, May 14). <i>Optimal convergence rates in the context of neural networks</i> [Presentation]. Zurich Colloquium in Applied and Computational Mathematics 2025, Zürich, Switzerland. http://hdl.handle.net/20.500.12708/217630</div>
</div>
-
dc.identifier.uri
http://hdl.handle.net/20.500.12708/217630
-
dc.description.abstract
We present two recent results on the convergence rates of algorithms involving neural networks. First, we propose a hierarchical training algorithm for standard feed-forward neural networks that adaptively extends the network architecture as soon as the optimization reaches a stationary point. By solving small (low-dimensional) optimization problems, the extended network provably escapes any local minimum or stationary point. Under certain assumptions on the approximability of the data by stable neural networks, we show that the algorithm achieves an optimal convergence rate s, in the sense that the loss is bounded by the number of parameters raised to the power -s. Second, we show that quadrature with neural network integrands is inherently hard and that no higher-order quadrature algorithm can exist, even if the algorithm has access to the weights of the network.
en
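To make the abstract's first result concrete, the following is a minimal NumPy sketch of the kind of adaptive-growth loop it describes: plain gradient descent on a small feed-forward network, and, once a (near-)stationary point is detected, an architecture extension in which one new neuron is chosen via a small, low-dimensional fit to the current residual. The toy regression task, the tolerances, the random candidate search, and all names are assumptions made here for illustration; this is not the hierarchical algorithm presented in the talk and does not reproduce the stated rate. Under this reading, the claimed rate means the loss decays roughly like N^{-s}, where N is the number of network parameters.

import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 200)[:, None]      # toy inputs (assumed for the demo)
y = np.sin(3.0 * x)                           # toy regression target (assumed)

# One-hidden-layer tanh network: f(x) = tanh(x W + b) @ c
W = rng.normal(size=(1, 2))
b = np.zeros(2)
c = rng.normal(size=(2, 1))

def forward(x):
    return np.tanh(x @ W + b) @ c

def step(lr=1e-2):
    """One gradient-descent step; returns the gradient norm (stationarity check)."""
    global W, b, c
    h = np.tanh(x @ W + b)                    # hidden activations
    r = (h @ c - y) / len(x)                  # scaled residual
    gc = h.T @ r                              # gradient w.r.t. output weights
    gh = (r @ c.T) * (1.0 - h ** 2)           # backprop through tanh
    gW, gb = x.T @ gh, gh.sum(axis=0)
    W -= lr * gW; b -= lr * gb; c -= lr * gc
    return np.sqrt((gW ** 2).sum() + (gb ** 2).sum() + (gc ** 2).sum())

def fit_new_neuron(wn, bn, res):
    """Closed-form output coefficient for a candidate neuron, plus the resulting residual error."""
    hn = np.tanh(x @ wn + bn)
    cn = (hn.T @ res).item() / ((hn.T @ hn).item() + 1e-12)
    return cn, np.mean((res - cn * hn) ** 2)

for epoch in range(20000):
    if step() < 1e-4:                         # (near-)stationary point reached
        if np.mean((forward(x) - y) ** 2) < 1e-3:
            break                             # loss small enough, stop growing
        # Extend the architecture: draw the new neuron's input weight and bias from
        # random candidates and fit its output coefficient to the current residual,
        # so escaping the stationary point only requires a low-dimensional subproblem.
        res = y - forward(x)
        cands = [(rng.normal(size=(1, 1)), rng.normal(size=1)) for _ in range(50)]
        wn, bn = min(cands, key=lambda p: fit_new_neuron(*p, res)[1])
        cn = fit_new_neuron(wn, bn, res)[0]
        W = np.hstack([W, wn]); b = np.append(b, bn); c = np.vstack([c, [[cn]]])

print("final loss:", np.mean((forward(x) - y) ** 2), "hidden width:", W.shape[1])

The design point mirrored here is that the extension step touches only the new neuron's weight, bias, and output coefficient rather than retraining the whole network; all other details are placeholders chosen for this sketch.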
dc.language.iso
en
-
dc.subject
neural networks
en
dc.subject
training
en
dc.subject
optimality
en
dc.title
Optimal convergence rates in the context of neural networks
en
dc.type
Presentation
en
dc.type
Vortrag
de
dc.type.category
Presentation
-
tuw.publication.invited
invited
-
tuw.researchTopic.id
A3
-
tuw.researchTopic.name
Fundamental Mathematics Research
-
tuw.researchTopic.value
100
-
tuw.publication.orgunit
E101-02-3 - Forschungsgruppe Computational PDEs
-
tuw.event.name
Zurich Colloquium in Applied and Computational Mathematics 2025