Room 961 Rutherford building, x39852, http://wol.ra.phy.cam.ac.uk/mackay/
E-mail: mackay@mrao.cam.ac.uk
There is one project available; it has two parts:
Neural networks [2] and Gaussian processes [3] both show promise
for nonlinear data-modelling.
Gaussian processes are very simple to use, but the extrapolation
properties of `standard' Gaussian processes are different
from those of neural networks.
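To give a flavour of how simple a `standard' Gaussian process is to use, here is a minimal regression sketch (a sketch only, assuming numpy; the squared-exponential kernel, length scale, and noise level are illustrative choices, not part of the project specification). Away from the data the predictive variance reverts to the prior variance, which is the extrapolation behaviour at issue above.

```python
import numpy as np

def rbf_kernel(X1, X2, length_scale=1.0, variance=1.0):
    # Squared-exponential covariance, a common 'standard' GP choice.
    d2 = (np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :]
          - 2.0 * X1 @ X2.T)
    return variance * np.exp(-0.5 * d2 / length_scale**2)

def gp_predict(X, y, Xs, kernel, noise=1e-2):
    # Posterior mean and pointwise variance of a zero-mean GP at Xs.
    K = kernel(X, X) + noise * np.eye(len(X))
    Ks = kernel(X, Xs)                        # train-test covariances
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    v = np.linalg.solve(L, Ks)
    mean = Ks.T @ alpha
    var = np.diag(kernel(Xs, Xs)) - np.sum(v**2, axis=0)
    return mean, var

# Toy 1-D dataset: noisy samples of sin(x).
rng = np.random.default_rng(0)
X = np.linspace(-3, 3, 20)[:, None]
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(20)
Xs = np.linspace(-5, 5, 50)[:, None]
mean, var = gp_predict(X, y, Xs, rbf_kernel)
# var is small where the data are dense and grows back towards the
# prior variance at the extrapolation points x < -3 and x > 3.
```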
Chris Williams [1] recently showed how to implement a Gaussian
process whose predictions are identical to those
of an infinite neural network.
In this project you will implement this modelling method and
test it on previously studied datasets.
If possible, you will also package the software so that
others can try the method.
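The covariance function Williams derived for an infinite single-hidden-layer network of erf units can be sketched as follows (inputs are augmented with a constant 1 to absorb the bias; the weight-prior variances sigma_bias and sigma_w are illustrative hyperparameters, not values prescribed by the paper):

```python
import numpy as np

def erf_network_kernel(X1, X2, sigma_bias=1.0, sigma_w=10.0):
    # Covariance of a Gaussian process whose predictions match an
    # infinite single-hidden-layer network of erf units (Williams'
    # construction): (2/pi) * arcsin of a normalised inner product.
    def aug(X):
        return np.hstack([np.ones((len(X), 1)), X])
    A1, A2 = aug(X1), aug(X2)
    # Diagonal Gaussian prior covariance on (bias, input weights).
    S = np.diag([sigma_bias**2] + [sigma_w**2] * X1.shape[1])
    cross = 2.0 * A1 @ S @ A2.T
    d1 = 1.0 + 2.0 * np.einsum('ij,jk,ik->i', A1, S, A1)
    d2 = 1.0 + 2.0 * np.einsum('ij,jk,ik->i', A2, S, A2)
    return (2.0 / np.pi) * np.arcsin(cross / np.sqrt(np.outer(d1, d2)))

X = np.linspace(-2, 2, 10)[:, None]
K = erf_network_kernel(X, X)
```

The resulting kernel matrix can be dropped into any standard GP regression code in place of the usual squared-exponential covariance.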
Gaussian processes show promise for nonlinear data-modelling,
but the straightforward implementation requires inverting an
N x N covariance matrix, at O(N^3) cost, so it becomes slow
once the data set size N exceeds about 1000.
Chris Williams [4] recently demonstrated how the Nystrom method
can be used to speed up Gaussian process predictions.
In the second part of this project you will implement this method.
(Alternatively, the Sparse Greedy Gaussian Process Regression
method of Smola and Bartlett [6] might be even better.)
[1] Chris Williams: `Computation with Infinite Neural Networks'
[2] David MacKay: `Bayesian Non-Linear Modelling with Neural Networks',
available from
http://wol.ra.phy.cam.ac.uk/mackay/BayesNets.html
[3] David MacKay: `Gaussian Processes - A Replacement for Supervised Neural
Networks?' and `Introduction to Gaussian Processes', available
from
http://wol.ra.phy.cam.ac.uk/mackay/BayesGP.html
[4] Chris Williams and Matthias Seeger: `Using the Nystrom Method to
Speed Up Kernel Machines'
[6] Alex Smola and Peter Bartlett: `Sparse Greedy Gaussian Process
Regression'