Byz-Clip21-SGD2M delivers high-probability convergence for Byzantine-robust and differentially private federated learning under L-smoothness and σ-sub-Gaussian gradient noise.
1 Pith paper cites this work; polarity classification is still indexing.
Fields: cs.LG · Years: 2026 · Verdicts: CONDITIONAL · Representative citing papers: 1
Byzantine-Robust and Differentially Private Federated Optimization under Weaker Assumptions