Kernel Interpolation for Scalable Structured Gaussian Processes (KISS-GP)
Abstract
We introduce a new structured kernel interpolation (SKI) framework, which generalises and unifies inducing point methods for scalable Gaussian processes (GPs). SKI methods produce kernel approximations for fast computations through kernel interpolation. The SKI framework clarifies how the quality of an inducing point approach depends on the number of inducing (aka interpolation) points, interpolation strategy, and GP covariance kernel. SKI also provides a mechanism to create new scalable kernel methods, through choosing different kernel interpolation strategies. Using SKI, with local cubic kernel interpolation, we introduce KISS-GP, which is 1) more scalable than inducing point alternatives, 2) naturally enables Kronecker and Toeplitz algebra for substantial additional gains in scalability, without requiring any grid data, and 3) can be used for fast and expressive kernel learning. KISS-GP costs O(n) time and storage for GP inference. We evaluate KISS-GP for kernel matrix approximation, kernel learning, and natural sound modelling.
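The core of SKI is to approximate the exact kernel matrix as K ≈ W K_UU Wᵀ, where K_UU is the kernel evaluated on a regular grid of inducing points and W holds sparse interpolation weights mapping the data onto that grid. The following is a minimal NumPy sketch of that idea, using linear rather than the paper's local cubic interpolation, and a dense W for brevity; the function names and parameters are illustrative, not from the paper's code.

```python
import numpy as np

def rbf(a, b, ell=0.5):
    # Squared-exponential (RBF) kernel matrix between 1-D point sets a and b.
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

def interp_weights(x, grid):
    # Interpolation weight matrix W (n x m) for a regular 1-D grid:
    # each row has two nonzeros on the neighbouring grid points (linear
    # interpolation; the paper uses local cubic, giving 4 nonzeros per row).
    m = len(grid)
    h = grid[1] - grid[0]
    W = np.zeros((len(x), m))
    idx = np.clip(((x - grid[0]) / h).astype(int), 0, m - 2)
    frac = (x - grid[idx]) / h
    W[np.arange(len(x)), idx] = 1.0 - frac
    W[np.arange(len(x)), idx + 1] = frac
    return W

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 1.0, 200))     # n = 200 (possibly irregular) inputs
grid = np.linspace(0.0, 1.0, 100)           # m = 100 inducing/interpolation points
W = interp_weights(x, grid)
K_ski = W @ rbf(grid, grid) @ W.T           # SKI approximation: W K_UU W^T
K_exact = rbf(x, x)
print(np.abs(K_ski - K_exact).max())        # small approximation error
```

Because the grid is regular, K_UU is Toeplitz here (and Kronecker-structured in higher dimensions), which is what enables the fast matrix-vector multiplies behind KISS-GP's near-linear inference cost; in practice W would be stored as a sparse matrix rather than densely.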
This paper has not been read by Pith yet.
Forward citations
Cited by 1 Pith paper
- Exposure-averaged Gaussian Processes for Combining Overlapping Datasets
Exposure-integrated Gaussian processes allow prediction of both the latent stellar signals and instrument-specific binned versions, supporting the combination of overlapping EPRV datasets with varying exposure times.