pith. machine review for the scientific record.

arxiv: 1703.00539 · v1 · submitted 2017-03-01 · 🧮 math.ST · stat.TH

Recognition: unknown

Learning Determinantal Point Processes with Moments and Cycles

Authors on Pith: no claims yet
classification: math.ST · stat.TH
keywords: learning, complexity, determinantal, fast, moments, optimal, point processes
abstract

Determinantal Point Processes (DPPs) are a family of probabilistic models that have a repulsive behavior, and lend themselves naturally to many tasks in machine learning where returning a diverse set of objects is important. While there are fast algorithms for sampling, marginalization and conditioning, much less is known about learning the parameters of a DPP. Our contribution is twofold: (i) we establish the optimal sample complexity achievable in this problem and show that it is governed by a natural parameter, which we call the \emph{cycle sparsity}; (ii) we propose a provably fast combinatorial algorithm that implements the method of moments efficiently and achieves optimal sample complexity. Finally, we give experimental results that confirm our theoretical findings.
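To make the method-of-moments idea in the abstract concrete: for a DPP with marginal kernel K, the probability that a fixed set S is contained in a sample equals the principal minor det(K_S), so these minors can be estimated directly as empirical containment frequencies. The sketch below is an illustration of that correspondence only, not the paper's algorithm; the 3×3 kernel K and sample size are made-up values, and exact atomic probabilities P(Y = A) = |det(K − I_{A^c})| are used just to generate ground-truth samples.

```python
import itertools
import random

def det(M):
    # Leibniz formula; fine for the tiny matrices used here
    n = len(M)
    total = 0.0
    for perm in itertools.permutations(range(n)):
        sign = 1
        for i in range(n):
            for j in range(i + 1, n):
                if perm[i] > perm[j]:
                    sign = -sign
        prod = 1.0
        for i in range(n):
            prod *= M[i][perm[i]]
        total += sign * prod
    return total

def minor(K, S):
    # principal minor det(K_S); det of the empty matrix is 1
    return det([[K[i][j] for j in S] for i in S]) if S else 1.0

# hypothetical symmetric marginal kernel with eigenvalues in (0, 1)
K = [[0.5, 0.2, 0.0],
     [0.2, 0.5, 0.1],
     [0.0, 0.1, 0.4]]
n = len(K)

# atomic probabilities of the DPP: P(Y = A) = |det(K - I_{A^c})|
subsets = [A for r in range(n + 1) for A in itertools.combinations(range(n), r)]

def atomic(A):
    M = [row[:] for row in K]
    for i in range(n):
        if i not in A:
            M[i][i] -= 1.0
    return abs(det(M))

probs = [atomic(A) for A in subsets]

random.seed(0)
samples = random.choices(subsets, weights=probs, k=20000)

# method of moments: empirical containment frequency estimates det(K_S)
for S in [(0,), (1,), (0, 1)]:
    freq = sum(1 for Y in samples if set(S) <= set(Y)) / len(samples)
    print(S, round(freq, 3), round(minor(K, S), 3))
```

Each printed line compares the empirical frequency of S appearing in a sample against the corresponding principal minor of K; the two agree up to sampling noise, which is the moment system the learning problem inverts.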

This paper has not been read by Pith yet.

discussion (0)


Forward citations

Cited by 1 Pith paper

Reviewed papers in the Pith corpus that reference this work. Sorted by Pith novelty score.

  1. State-of-art minibatches via novel DPP kernels: discretization, wavelets, and rough objectives

    stat.ML · 2026-05 · unverdicted · novelty 7.0

    Wavelet DPP kernels improve variance reduction in the continuous setting and come with a discretization procedure that preserves decay rates for discrete ML subsampling tasks, including rough objectives.