Slicing and Dicing: Configuring Optimal Mixtures of Experts

A large-scale empirical sweep shows that MoE quality improves with total parameters even at high active-to-total ratios, that optimal expert granularity depends only on active parameters, and that most other design choices have small effects. Across all settings and configurations, adding a generalist expert results in comparable or degraded performance.
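The distinction between total and active parameters can be made concrete with a toy parameter count. The sketch below is illustrative only: the layer sizes are hypothetical, and the formula assumes a standard two-matrix feed-forward expert (a `d_model x d_ff` up-projection and a `d_ff x d_model` down-projection) with top-k routing.

```python
# Toy parameter count for a top-k MoE feed-forward layer.
# All sizes are hypothetical; the per-expert count assumes a
# two-matrix FFN expert (up-projection + down-projection, no biases).

def moe_ffn_params(d_model: int, d_ff: int, n_experts: int, top_k: int):
    per_expert = 2 * d_model * d_ff   # up- and down-projection weights
    total = n_experts * per_expert    # parameters stored in memory
    active = top_k * per_expert       # parameters actually used per token
    return total, active

total, active = moe_ffn_params(d_model=1024, d_ff=4096, n_experts=64, top_k=2)
print(total, active, active / total)
# With top-k routing, the active-to-total ratio is simply top_k / n_experts,
# so scaling n_experts grows total parameters while active compute stays fixed.
```

Under this accounting, "high active-to-total ratio" corresponds to top_k close to n_experts, and "expert granularity" corresponds to how the active budget (here top_k * 2 * d_model * d_ff) is split across many small experts versus few large ones.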