pith. machine review for the scientific record.

arxiv: 1709.09565 · v2 · submitted 2017-09-27 · 🧮 math.ST · math.PR · stat.TH

Recognition: unknown

Entrywise Eigenvector Analysis of Random Matrices with Low Expected Rank

Authors on Pith: no claims yet
classification: 🧮 math.ST · math.PR · stat.TH
keywords: eigenvectors, analysis, entrywise, infty, large, matrix, random, approximation
original abstract

Recovering low-rank structures via eigenvector perturbation analysis is a common problem in statistical machine learning, arising in factor analysis, community detection, ranking, and matrix completion, among others. While a large variety of bounds are available for average errors between empirical and population statistics of eigenvectors, few results are tight for entrywise analyses, which are critical for problems such as community detection. This paper investigates the entrywise behavior of eigenvectors for a large class of random matrices whose expectations are low-rank, which helps settle the conjecture in Abbe et al. (2014b) that the spectral algorithm achieves exact recovery in the stochastic block model without any trimming or cleaning steps. The key is a first-order approximation of eigenvectors under the $\ell_\infty$ norm: $$u_k \approx \frac{A u_k^*}{\lambda_k^*},$$ where $\{u_k\}$ and $\{u_k^*\}$ are eigenvectors of a random matrix $A$ and its expectation $\mathbb{E} A$, respectively. The fact that the approximation is both tight and linear in $A$ facilitates sharp comparisons between $u_k$ and $u_k^*$. In particular, it allows for comparing the signs of $u_k$ and $u_k^*$ even if $\| u_k - u_k^*\|_{\infty}$ is large. The results are further extended to perturbations of eigenspaces, yielding new $\ell_\infty$-type bounds for synchronization ($\mathbb{Z}_2$-spiked Wigner model) and noisy matrix completion.
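The first-order approximation in the abstract is easy to probe numerically. The sketch below (not the authors' code; the rank-one spiked model, signal strength, and sample size are illustrative choices) builds a matrix $A = \mathbb{E}A + W$ with $\mathbb{E}A = \lambda^* u^* {u^*}^\top$, then compares the $\ell_\infty$ error of the linear approximation $A u^*/\lambda^*$ against the direct perturbation $\|u - u^*\|_\infty$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Population matrix E[A] = lam_star * u_star u_star^T (rank one,
# delocalized eigenvector), with signal strong enough that the
# top empirical eigenvector tracks u_star.
u_star = np.ones(n) / np.sqrt(n)
lam_star = 20.0 * np.sqrt(n)
EA = lam_star * np.outer(u_star, u_star)

# Symmetric noise with O(1) entries.
G = rng.standard_normal((n, n))
W = (G + G.T) / np.sqrt(2.0)
A = EA + W

# Top empirical eigenvector, sign-aligned with u_star.
vals, vecs = np.linalg.eigh(A)
u = vecs[:, -1]
u *= np.sign(u @ u_star)

# Linear approximation u ≈ A u_star / lam_star from the abstract.
approx = (A @ u_star) / lam_star

err_linear = np.max(np.abs(u - approx))   # ℓ∞ error of the linear approximation
err_naive = np.max(np.abs(u - u_star))    # direct ℓ∞ perturbation
print(f"linear: {err_linear:.2e}  naive: {err_naive:.2e}")
```

In this regime the linear approximation error is an order of magnitude smaller than the naive entrywise perturbation, which is exactly the "tight and linear in $A$" property the abstract highlights.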

This paper has not been read by Pith yet.

discussion (0)


Forward citations

Cited by 1 Pith paper

Reviewed papers in the Pith corpus that reference this work. Sorted by Pith novelty score.

  1. The Statistical Cost of Adaptation in Multi-Source Transfer Learning

    math.ST · 2026-05 · unverdicted · novelty 8.0

    Multi-source transfer learning incurs an intrinsic adaptation cost that can exceed one, with phase transitions separating regimes where bias-agnostic estimators match oracle performance from those where they cannot.