Documentation for GPML Matlab Code

The code provided here demonstrates the main algorithms from Rasmussen and Williams: Gaussian Processes for Machine Learning.

The code is written in Matlab®, and should work with versions 6 and 7. Bug reports should be sent to the authors. All the code, including demonstrations and HTML documentation, can be downloaded as a tar or zip archive. Previous versions of the code may be available here. Please read the copyright notice.

After unpacking the tar or zip file you will find three subdirectories: gpml, gpml-demo and doc.

The directory gpml contains the basic functions for GP regression, GP binary classification, and sparse approximate methods for GP regression.

The directory gpml-demo contains Matlab® scripts with names "demo_*.m". These give small demonstrations of the various programs.

The directory doc contains four HTML files providing documentation. This information can also be accessed on the web at http://www.GaussianProcess.org/gpml/code.

The code should run directly as provided, but some demos require a lot of computation. A significant speedup may be attained by compiling the mex files; see the rudimentary instructions on how to do this in the README file.

The documentation is divided into three sections:

Regression

Basic Gaussian process regression (GPR) code allowing flexible specification of the covariance function.
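As a rough illustration, the sketch below shows how the regression code in gpr.m and the optimiser minimize.m are typically called with a sum of a squared exponential and an independent noise covariance. The calling sequence is taken from memory of the package and should be treated as an assumption; the regression documentation page gives the definitive usage.

  % training data: noisy samples of a sine function
  x = (-5:0.2:5)';                          % training inputs (n by 1)
  y = sin(x) + 0.1*randn(size(x));          % noisy training targets
  xstar = (-6:0.1:6)';                      % test inputs

  % covariance: squared exponential plus independent noise
  covfunc = {'covSum', {'covSEiso', 'covNoise'}};

  % log hyperparameters: [log(lengthscale); log(signal std); log(noise std)]
  logtheta0 = [0; 0; log(0.1)];

  % optimise hyperparameters by minimising the negative log marginal likelihood
  logtheta = minimize(logtheta0, 'gpr', -100, covfunc, x, y);

  % predictive mean and variance at the test inputs
  [mu, S2] = gpr(logtheta, covfunc, x, y, xstar);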

Binary Classification

Gaussian process classification (GPC) code implementing the Laplace and expectation propagation (EP) approximations for binary classification.
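The following sketch indicates how the Laplace approximation code might be called on a toy two-class problem; the function name binaryLaplaceGP, the likelihood name 'cumGauss' and the argument order are assumptions based on the package and should be checked against the classification documentation and the gpml-demo scripts (the EP version is used analogously).

  % toy data: two Gaussian clusters, labels in {-1,+1}
  x = [randn(20,2)-1; randn(20,2)+1];
  y = [-ones(20,1); ones(20,1)];
  [t1, t2] = meshgrid(-4:0.5:4, -4:0.5:4);
  xstar = [t1(:), t2(:)];                   % grid of test inputs

  covfunc = {'covSEiso'};                   % squared exponential covariance
  lik = 'cumGauss';                         % cumulative Gaussian (probit) likelihood
  logtheta0 = [0; 0];                       % initial log hyperparameters

  % maximise the approximate marginal likelihood, then predict class probabilities
  logtheta = minimize(logtheta0, 'binaryLaplaceGP', -20, covfunc, lik, x, y);
  p = binaryLaplaceGP(logtheta, covfunc, lik, x, y, xstar);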

Sparse Approximation methods for Gaussian Process Regression

Approximate GPR code demonstrating the subset of datapoints (SD), subset of regressors (SR) and projected process (PP) approximations.
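A rough sketch of how these approximations might be used is given below; the name gprSRPP.m, its argument order and its outputs are assumptions and should be verified against the sparse approximation documentation and the corresponding demo script.

  n = 1000; m = 50;                          % training set size and active subset size
  x = 10*rand(n,1) - 5;
  y = sin(x) + 0.1*randn(n,1);
  xstar = (-6:0.05:6)';

  covfunc = {'covSum', {'covSEiso','covNoise'}};
  logtheta = [0; 0; log(0.1)];               % hyperparameters (e.g. from an SD fit)

  INDEX = randperm(n); INDEX = INDEX(1:m);   % random subset of m training points

  % SD: plain GP regression using only the subset
  [muSD, S2SD] = gpr(logtheta, covfunc, x(INDEX,:), y(INDEX), xstar);

  % SR and PP: common predictive mean, two different predictive variances
  [muSRPP, S2SR, S2PP] = gprSRPP(logtheta, covfunc, x, INDEX, y, xstar);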


Other Gaussian Process Code

A table of other sources of useful Gaussian process software, unrelated to the book, may be found here. This includes pointers to a number of packages that can handle multi-class classification, e.g. fbm (Radford Neal), c++-ivm (Neil Lawrence), gpclass (David Barber and Chris Williams), klr (kernel multiple logistic regression, by Matthias Seeger), and VBGP (Mark Girolami and Simon Rogers).



