Convergence Analysis of Newton's Method for Neural Networks in the Overparameterized Limit
In the infinite-width limit, regularized Newton's method for neural networks converges exponentially to global minimizers, and the analysis via the Newton neural tangent kernel shows the convergence rate is uniform across the frequency spectrum.
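To make the setting concrete, here is a minimal, illustrative sketch (not the paper's implementation, and it does not construct the Newton neural tangent kernel itself): a damped Gauss-Newton step for a small network under squared loss, written through the empirical NTK Gram matrix K = J Jᵀ. All function names (`init_params`, `mlp`, `newton_ntk_step`) and the damping value `lam` are assumptions made for this example; it uses the identity (JᵀJ + λI)⁻¹Jᵀ = Jᵀ(JJᵀ + λI)⁻¹ so the regularized second-order step can be computed in kernel (function) space.

```python
# Illustrative sketch only: a regularized (damped) Gauss-Newton / NTK-style step
# for a tiny MLP on squared loss. Under linearization the residual update is
# r <- lam * (K + lam*I)^{-1} r, so the NTK spectrum enters the per-step
# contraction; the paper's uniform-rate result for regularized Newton's method
# is not reproduced by this toy example.
import jax
import jax.numpy as jnp
from jax.flatten_util import ravel_pytree


def init_params(key, widths=(1, 64, 64, 1)):
    # Standard 1/sqrt(fan_in) initialization for each dense layer.
    params = []
    for din, dout in zip(widths[:-1], widths[1:]):
        key, sub = jax.random.split(key)
        W = jax.random.normal(sub, (din, dout)) / jnp.sqrt(din)
        params.append((W, jnp.zeros(dout)))
    return params


def mlp(params, x):
    h = x
    for W, b in params[:-1]:
        h = jnp.tanh(h @ W + b)
    W, b = params[-1]
    return (h @ W + b).squeeze(-1)


def newton_ntk_step(params, x, y, lam=1e-3):
    """One damped Gauss-Newton step, computed via the empirical NTK Gram matrix."""
    flat, unravel = ravel_pytree(params)
    f = lambda p: mlp(unravel(p), x)
    J = jax.jacrev(f)(flat)                  # (n, num_params) Jacobian of outputs
    r = f(flat) - y                          # residuals on the training set
    K = J @ J.T                              # empirical NTK Gram matrix, (n, n)
    # Parameter update: delta = J^T (K + lam*I)^{-1} r
    delta = J.T @ jnp.linalg.solve(K + lam * jnp.eye(K.shape[0]), r)
    return unravel(flat - delta), jnp.linalg.norm(r)  # norm is pre-update residual


key = jax.random.PRNGKey(0)
x = jnp.linspace(-1.0, 1.0, 32)[:, None]
y = jnp.sin(4.0 * jnp.pi * x[:, 0])          # target mixing low and high frequencies
params = init_params(key)
for step in range(5):
    params, res_norm = newton_ntk_step(params, x, y)
    print(f"step {step}: residual norm before update = {res_norm:.3e}")
```

The design choice of solving the n-by-n kernel system instead of the num_params-by-num_params Gauss-Newton system is what connects the second-order update to the NTK picture: in the overparameterized regime n is much smaller than the parameter count, so the kernel-space form is both cheaper and the natural object for the spectral (frequency-wise) analysis the summary above refers to.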