#### In Proceedings of the 25th International Conference on Machine Learning (ICML 2008), Helsinki, Finland

Gaussian Process Product Models for Nonparametric Nonstationarity
Ryan Prescott Adams and Oliver Stegle
Stationarity is often an unrealistic prior assumption for Gaussian
process regression. One solution is to predefine an explicit
nonstationary covariance function, but such covariance functions can be
difficult to specify and require detailed prior knowledge of the
nonstationarity. We propose the Gaussian process product model (GPPM),
which models data as the pointwise product of two latent Gaussian
processes, to nonparametrically infer nonstationary variations of
amplitude. This approach differs from other nonparametric approaches
to covariance function inference in that it operates on the outputs
rather than the inputs, resulting in a significant reduction in
computational cost and required data for inference. We present an
approximate inference scheme using Expectation Propagation. This
variational approximation yields convenient GP hyperparameter selection
and compact approximate predictive distributions.
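As a rough illustration of the generative idea only (not the paper's EP inference scheme), one can draw two latent GPs on a grid and take their pointwise product; the squared-exponential kernels, lengthscales, and grid below are illustrative assumptions, with a slowly varying latent function acting as an amplitude envelope:

```python
import numpy as np

def rbf_kernel(x, lengthscale, variance):
    # Squared-exponential covariance matrix for 1-D inputs.
    d = x[:, None] - x[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 200)

# Latent "signal" GP with a short lengthscale ...
K_f = rbf_kernel(x, lengthscale=0.5, variance=1.0)
# ... and a latent "envelope" GP with a long lengthscale, whose slow
# variation modulates the local amplitude of the signal.
K_g = rbf_kernel(x, lengthscale=3.0, variance=1.0)

jitter = 1e-8 * np.eye(len(x))  # for numerical stability
f = rng.multivariate_normal(np.zeros(len(x)), K_f + jitter)
g = rng.multivariate_normal(np.zeros(len(x)), K_g + jitter)

# Pointwise product: a sample whose amplitude varies nonstationarily
# across the input space, even though both latent GPs are stationary.
y = f * g
```

Because the envelope varies slowly relative to the signal, the product `y` exhibits regions of large and small variance, which is exactly the kind of amplitude nonstationarity the GPPM is designed to capture.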
