For ReLU networks with input and hidden widths at least 2, most parameters are identifiable up to symmetry, so the functional dimension equals the parameter count minus the number of hidden neurons.
An Embedding of ReLU Networks and an Analysis of Their Identifiability
1 Pith paper cites this work; polarity classification is still indexing.
Field: cs.LG · Year: 2026 · Verdict: ACCEPT · Representative citing papers: 1
Most ReLU Networks Admit Identifiable Parameters