Room 961 Rutherford building, x39852, http://wol.ra.phy.cam.ac.uk/mackay/
There is one project available; it has two parts:
Neural networks and Gaussian processes both show promise for nonlinear data-modelling. Gaussian processes are very simple to use, but the extrapolation properties of `standard' Gaussian processes differ from those of neural networks. Chris Williams recently showed how to implement a Gaussian process whose predictions are identical to those of an infinite neural network. In this project you will implement this modelling method and test it on previously studied datasets. If possible, you will create a software package that allows others to try the method too.
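A minimal sketch of what such a predictor might look like. The covariance function of an infinite single-hidden-layer network with erf activations is Williams' published result; everything else here (the weight-variance values, noise level, and function names) is an illustrative assumption, not his implementation:

```python
import numpy as np

def erf_kernel(X, Z, sigma_b=1.0, sigma_w=3.0):
    """Covariance of an infinite erf-activation network (Williams):
    k(x,z) = (2/pi) * arcsin( 2 xt'S zt / sqrt((1+2 xt'S xt)(1+2 zt'S zt)) )
    where xt = (1, x) and S = diag(sigma_b^2, sigma_w^2 I).
    sigma_b, sigma_w (prior std. devs. of hidden-unit biases and
    input weights) are illustrative values, not from the paper."""
    Xa = np.hstack([sigma_b * np.ones((X.shape[0], 1)), sigma_w * X])
    Za = np.hstack([sigma_b * np.ones((Z.shape[0], 1)), sigma_w * Z])
    cross = 2.0 * Xa @ Za.T
    xx = 1.0 + 2.0 * np.sum(Xa**2, axis=1)
    zz = 1.0 + 2.0 * np.sum(Za**2, axis=1)
    return (2.0 / np.pi) * np.arcsin(cross / np.sqrt(np.outer(xx, zz)))

def gp_predict(Xtrain, y, Xtest, noise=0.1):
    """Standard GP regression mean under the infinite-network kernel."""
    K = erf_kernel(Xtrain, Xtrain) + noise**2 * np.eye(len(y))
    alpha = np.linalg.solve(K, y)
    return erf_kernel(Xtest, Xtrain) @ alpha
```

The predictive mean is the usual Gaussian-process formula; only the covariance function changes, which is why the method is "very simple to use" compared with training an actual network.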
Gaussian processes show promise for nonlinear data-modelling, but the straightforward implementation requires inverting an N x N covariance matrix, an O(N^3) computation that becomes impractically slow once the data set size exceeds about 1000. Chris Williams recently demonstrated how the Nystrom method can be used to speed up Gaussian process predictions. In the second part of this project you will implement this method. (Alternatively, the Sparse Greedy Gaussian Process Regression method of Smola and Bartlett might be even better.)
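One plausible way to exploit the Nystrom idea: replace the N x N covariance matrix by the rank-m approximation K ~ C W^{-1} C^T built from m << N landmark points, then use the Woodbury identity so that only m x m systems are solved. The sketch below is an assumption-laden illustration (kernel choice, random landmark selection, and function names are all hypothetical), and the details differ from Williams' eigenfunction formulation:

```python
import numpy as np

def rbf(X, Z, ell=1.0):
    """Squared-exponential covariance (a stand-in for whatever
    covariance function the project would actually use)."""
    d2 = (np.sum(X**2, axis=1)[:, None]
          + np.sum(Z**2, axis=1)[None, :] - 2.0 * X @ Z.T)
    return np.exp(-0.5 * d2 / ell**2)

def nystrom_gp_mean(X, y, Xtest, m=50, noise=0.1, rng=None):
    """Approximate GP predictive mean via a rank-m Nystrom
    approximation plus the Woodbury identity: O(N m^2) instead
    of O(N^3), and no N x N matrix is ever inverted."""
    rng = np.random.default_rng(0) if rng is None else rng
    idx = rng.choice(len(X), size=m, replace=False)
    C = rbf(X, X[idx])            # N x m cross-covariances
    W = rbf(X[idx], X[idx])       # m x m landmark covariances
    s2 = noise**2
    # Woodbury: (s2 I + C W^{-1} C^T)^{-1} y
    #         = (y - C (s2 W + C^T C)^{-1} C^T y) / s2
    A = s2 * W + C.T @ C          # only m x m to factorise
    alpha = (y - C @ np.linalg.solve(A, C.T @ y)) / s2
    return rbf(Xtest, X) @ alpha
```

When m = N the approximation is exact; the interesting empirical question for the project is how small m can be made before predictions degrade.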
Chris Williams: `Computation with Infinite Neural Networks'
 David MacKay: `Bayesian Non-Linear Modelling with Neural Networks', available from http://wol.ra.phy.cam.ac.uk/mackay/BayesNets.html
 David MacKay: `Gaussian Processes - A Replacement for Supervised Neural Networks?' and `Introduction to Gaussian Processes', available from http://wol.ra.phy.cam.ac.uk/mackay/BayesGP.html.