Tech Report CS-08-07

Incremental Nonparametric Bayesian Regression

Frank Wood, Dan H. Grollman, K. A. Heller, Odest Chad Jenkins and Michael Black

July 2008

Abstract:

In this paper we develop an incremental estimation algorithm for infinite mixtures of Gaussian process experts. Incremental, local, non-linear regression algorithms are required for a wide variety of applications, ranging from robotic control to neural decoding. Arguably the most popular and widely used such algorithm is currently Locally Weighted Projection Regression (LWPR), which has been shown empirically to be both computationally efficient and sufficiently accurate for a number of applications. While incremental variants of non-linear Bayesian regression models have superior theoretical properties and have been shown to produce better function approximations than LWPR, they suffer from high computational and storage costs. By exploiting locality, infinite mixtures of Gaussian process experts (IMGPE) offer the function approximation performance of these fully Bayesian models at reduced computational and storage cost. Our contribution is an incremental regression approach that combines the theoretical benefits of a fully Bayesian model with the computational benefits of exploiting locality.
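As a rough illustration of the locality being exploited, the sketch below (Python/NumPy) fits independent Gaussian process experts on local subsets of a streaming 1-D dataset and blends their predictions with a soft gate on the input. It is only a minimal sketch under simplifying assumptions: the two fixed expert centers, the hard assignment rule, the squared-exponential hyperparameters, and the Gaussian gating weights are illustrative choices, not the paper's IMGPE model, which infers the number of experts and their assignments nonparametrically.

# A minimal sketch (not the paper's algorithm): local Gaussian process "experts"
# fit on disjoint subsets of streaming data, with predictions blended by a
# Gaussian gate on the input. All hyperparameters are illustrative assumptions.
import numpy as np

class GPExpert:
    """Exact GP regression (squared-exponential kernel) on a local data subset.
    Only the posterior mean is computed, to keep the sketch short."""
    def __init__(self, lengthscale=0.5, signal_var=1.0, noise_var=0.01):
        self.ls, self.sv, self.nv = lengthscale, signal_var, noise_var
        self.X = np.empty((0, 1))
        self.y = np.empty(0)

    def _k(self, A, B):
        # Squared-exponential covariance between 1-D input sets A and B.
        d2 = (A[:, None, 0] - B[None, :, 0]) ** 2
        return self.sv * np.exp(-0.5 * d2 / self.ls ** 2)

    def add(self, x, y):
        # Incrementally grow this expert's local training set.
        self.X = np.vstack([self.X, [[x]]])
        self.y = np.append(self.y, y)

    def predict(self, xs):
        # Posterior mean at test inputs xs; zero prior mean if no data yet.
        if len(self.y) == 0:
            return np.zeros(len(xs))
        K = self._k(self.X, self.X) + self.nv * np.eye(len(self.y))
        Ks = self._k(np.asarray(xs)[:, None], self.X)
        return Ks @ np.linalg.solve(K, self.y)

def assign(x, centers):
    """Hard-assign a point to its nearest expert center (a stand-in gate)."""
    return int(np.argmin(np.abs(np.asarray(centers) - x)))

# Toy 1-D stream: two experts with fixed centers stand in for the
# nonparametric (infinite) mixture that the paper actually infers.
rng = np.random.default_rng(0)
centers = [-1.0, 1.0]
experts = [GPExpert() for _ in centers]
for x in rng.uniform(-2, 2, size=80):
    y = np.sin(3 * x) + 0.1 * rng.standard_normal()
    experts[assign(x, centers)].add(x, y)   # incremental, local update

# Blend expert predictions with soft Gaussian responsibilities on the input.
xs = np.linspace(-2, 2, 9)
w = np.exp(-0.5 * (xs[:, None] - np.asarray(centers)[None, :]) ** 2)
w /= w.sum(axis=1, keepdims=True)
mean = sum(w[:, j] * experts[j].predict(xs) for j in range(len(experts)))
print(np.round(mean, 2))

Because each expert inverts only its own small covariance matrix rather than one over the full dataset, cost grows with the local rather than the global number of points, which is the computational benefit of locality that the abstract refers to.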
