Efficient Optimization for Sparse Gaussian Process Regression

Cao, Y., Brubaker, M., Fleet, D.J. and Hertzmann, A.

Advances in Neural Information Processing Systems (NIPS), Lake Tahoe, 2013.

We propose an efficient optimization algorithm for selecting a subset of training data to induce sparsity for Gaussian process regression. The algorithm estimates an inducing set and the hyperparameters using a single objective, either the marginal likelihood or a variational free energy. The space and time complexity are linear in the training set size, and the algorithm can be applied to large regression problems on discrete or continuous domains. Empirical evaluation shows state-of-the-art performance in discrete cases and competitive results in the continuous case.
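For context, a minimal sketch of one of the two objectives mentioned above, the variational free energy for sparse GP regression with a fixed inducing set (the Titsias-style bound). This is not the paper's subset-selection algorithm, just a NumPy illustration of the objective it optimizes; the RBF kernel, its hyperparameter values, and the function names are assumptions. For clarity it forms the full n-by-n Nystrom matrix, whereas a linear-memory implementation (as in the paper's complexity claim) would avoid this.

```python
import numpy as np

def rbf(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel between row-vector sets A and B (assumed kernel choice)."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def variational_free_energy(y, X, Z, noise=0.1, lengthscale=1.0, variance=1.0):
    """Titsias variational lower bound on the log marginal likelihood,
    for inducing inputs Z (a subset of, or near, the training inputs X)."""
    n = X.shape[0]
    Knm = rbf(X, Z, lengthscale, variance)
    Kmm = rbf(Z, Z, lengthscale, variance) + 1e-8 * np.eye(Z.shape[0])  # jitter
    L = np.linalg.cholesky(Kmm)
    A = np.linalg.solve(L, Knm.T)            # m x n
    Qnn = A.T @ A                            # Nystrom approximation Knm Kmm^-1 Kmn
    S = Qnn + noise * np.eye(n)
    Ls = np.linalg.cholesky(S)
    alpha = np.linalg.solve(Ls, y)
    logdet = 2.0 * np.sum(np.log(np.diag(Ls)))
    # log N(y | 0, Qnn + noise*I)
    loglik = -0.5 * (n * np.log(2.0 * np.pi) + logdet + alpha @ alpha)
    # trace correction: -(1/2*noise) * tr(Knn - Qnn); RBF diagonal is `variance`
    trace_term = -0.5 / noise * (n * variance - np.trace(Qnn))
    return loglik + trace_term
```

By construction this bound never exceeds the exact log marginal likelihood; optimizing it over the inducing set and hyperparameters jointly is the single-objective strategy the abstract describes.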


[Figures: CholQR-var on the Snelson 1D dataset; cost reduction, approximate vs. exact]

Journal paper © IEEE

Conference paper

Conference paper supplementary material

Code