This is an old website.  I have now moved to CMU.
Please visit: http://www.cs.cmu.edu/~andrewgw
You will be redirected shortly.


Andrew Gordon Wilson
Postdoctoral Research Fellow in the Sailing Lab at Carnegie Mellon University
[PhD Thesis] [CV] [Papers] [Talks]
andrewgw@cs.cmu.edu

Welcome.

I have broad interests in machine learning and statistics.  I am particularly interested in developing kernel methods, Gaussian processes, and Bayesian nonparametric models for scalable automatic pattern discovery, extrapolation, and representation learning.  Much of this work involves automatic and expressive approaches to kernel learning, rather than hand-crafting features.

In January 2014 I completed my PhD dissertation, "Covariance Kernels for Fast Automatic Pattern Discovery and Extrapolation with Gaussian Processes" (news story), in the Machine Learning Group at the University of Cambridge, where I am a member of Trinity College.
 
Outside of work, I am a classical pianist who particularly enjoys Glenn Gould's playing of Bach.
I am also interested in modern physics, I write essays and fiction, and I enjoy squash, tennis, badminton, soccer, and chess.

 
Highlights:

- Our paper, "Fast Kernel Learning for Multidimensional Pattern Extrapolation," has been accepted to NIPS 2014! [PDF, BibTeX]

- I am co-organising the NIPS 2014 Workshop
"Modern Nonparametrics 3: Automating the Learning Pipeline"!

- I gave a lecture series on "Kernel Methods for Representation Learning" at MLSS 2014 Pittsburgh!
[Lecture 1+2, Lecture 3+4]

- My PhD Thesis:
Covariance Kernels for Fast Automatic Pattern Discovery and Extrapolation with Gaussian Processes
[PDF, BibTeX]

- Video Lecture on Spectral Mixture (SM) Kernels


A note on GPRNs and changepoints


Thesis

Kernel methods, such as Gaussian processes, have great potential for developing intelligent systems, since the
kernel flexibly and interpretably controls the generalisation properties of these methods.  The predictive performance
of a kernel method is in general extremely sensitive to the choice of kernel.  However, it is standard practice to
use a simple RBF (aka Gaussian, or squared exponential) kernel, which is limited to smoothing and interpolation. 
This thesis argues for the importance of developing new kernels, introduces new kernels for automatic pattern
extrapolation (with a view towards feature extraction, representation learning, and automatic kernel selection),
and discusses how best to scale flexible kernel learning approaches to extract rich structure from large
multidimensional datasets.
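To make the contrast concrete, here is a minimal sketch (illustrative only, not code from the thesis) comparing the standard RBF kernel with a one-component spectral mixture (SM) kernel of the form introduced in Wilson and Adams (ICML 2013); the function names and parameter values are hypothetical.

```python
# Illustrative sketch: RBF vs. one-component spectral mixture (SM) covariance,
# each written as a stationary kernel k(tau) of the lag tau = x - x'.
# Parameter values here are arbitrary choices for illustration.
import numpy as np

def rbf_kernel(tau, lengthscale=1.0):
    """RBF (squared exponential) covariance.

    Correlation decays monotonically to zero with |tau|, which is why a GP
    with this kernel is suited to smoothing and interpolation, but cannot
    extrapolate repeating structure beyond the data.
    """
    return np.exp(-0.5 * (tau / lengthscale) ** 2)

def spectral_mixture_kernel(tau, weight=1.0, mean=0.25, var=0.001):
    """One-component spectral mixture covariance:
        k(tau) = w * exp(-2 pi^2 tau^2 v) * cos(2 pi tau mu)
    The cosine term encodes a dominant frequency mu, so the covariance
    oscillates with lag and a GP with this kernel can extrapolate
    quasi-periodic patterns.
    """
    return weight * np.exp(-2.0 * np.pi**2 * tau**2 * var) * np.cos(2.0 * np.pi * tau * mean)

lags = np.linspace(0.0, 4.0, 9)
print(rbf_kernel(lags))               # strictly positive, monotonically decaying
print(spectral_mixture_kernel(lags))  # oscillates between positive and negative
```

The key qualitative difference is visible directly in the covariances: the RBF kernel can only say "nearby points are similar," while the SM kernel's oscillating covariance expresses that points one period apart are also strongly correlated, which is what enables pattern extrapolation.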

Covariance kernels for fast automatic pattern discovery and extrapolation with Gaussian processes
Andrew Gordon Wilson
PhD Thesis, January 2014.
[PDF, BibTeX]


Papers

Fast kernel learning for multidimensional pattern extrapolation
Andrew Gordon Wilson*, Elad Gilboa*, Arye Nehorai, and John P. Cunningham
To appear in Advances in Neural Information Processing Systems (NIPS) 2014
[PDF, BibTeX]

Bayesian inference for NMR spectroscopy with applications to chemical quantification
Andrew Gordon Wilson, Yuting Wu, Daniel J. Holland, Sebastian Nowozin, Mick D. Mantle, Lynn F. Gladden, and Andrew Blake
February 14, 2014.  In Submission.
[arXiv, PDF, BibTeX]

Student-t processes as alternatives to Gaussian processes
Amar Shah, Andrew Gordon Wilson, and Zoubin Ghahramani
Artificial Intelligence and Statistics, 2014.
[arXiv, PDF, Supplementary, BibTeX]

GPatt: Fast multidimensional pattern extrapolation with Gaussian processes
Andrew Gordon Wilson, Elad Gilboa, Arye Nehorai, and John P. Cunningham
October 21, 2013.   In Submission.
[arXiv, PDF, BibTeX, Resources and Tutorial]

Bayesian optimization using Student-t processes
Amar Shah, Andrew Gordon Wilson, and Zoubin Ghahramani
NIPS Workshop on Bayesian Optimisation, 2013.
[PDF, BibTeX]

Gaussian process kernels for pattern discovery and extrapolation
Andrew Gordon Wilson and Ryan Prescott Adams
International Conference on Machine Learning (ICML), 2013.
Oral Presentation
[arXiv, PDF, Supplementary, BibTeX, Slides, Resources and Tutorial, Video Lecture]

Modelling input varying correlations between multiple responses
Andrew Gordon Wilson and Zoubin Ghahramani
European Conference on Machine Learning (ECML), 2012.
Nectar Track for "significant machine learning results"
Oral Presentation
[PDF, BibTeX]

Gaussian process regression networks
Andrew Gordon Wilson, David A. Knowles, and Zoubin Ghahramani
International Conference on Machine Learning (ICML), 2012.
Oral Presentation
[PDF, BibTeX, Slides, Supplementary, Video Lecture]

Generalised Wishart processes
Andrew Gordon Wilson and Zoubin Ghahramani
Uncertainty in Artificial Intelligence (UAI), 2011.
Best Student Paper Award
[PDF, BibTeX]

Copula processes
Andrew Gordon Wilson and Zoubin Ghahramani
Advances in Neural Information Processing Systems (NIPS), 2010.
Spotlight
[PDF, BibTeX, Slides, Video Lecture]


Talks

Building kernel methods for large scale representation learning
Machine Learning Summer School (MLSS), Pittsburgh, USA, July 2014

Kernels for automatic pattern discovery and extrapolation
International Conference on Machine Learning (ICML), Atlanta, USA, June 2013

The automated Bayesian nonparametric statistician
Information Engineering Conference, University of Cambridge, Cambridge, UK, June 2013

Gaussian processes for pattern discovery
Research Seminar, Sheffield Translational Institute for Neuroscience,
University of Sheffield, Sheffield, UK, March 2013

Models of input dependent covariances
Xerox Research Seminar, Grenoble, France, November 2012

A machine learning approach to NMR spectroscopy
Microsoft Research Cambridge, UK, September 2012

Modelling input dependent correlations between multiple responses
ECML Nectar Track, Bristol, UK, September 2012
Information Engineering Conference, University of Cambridge, Cambridge, UK, June 2012


Gaussian process regression networks
ICML, Edinburgh, UK, June 2012

Bayesian nonparametric density estimation
Machine Learning Group, University of Cambridge, UK, May 2012

Bayesian nonparametric modelling of dependent covariances
Harvard University, April 2012
University of California, Berkeley, May 2012

Generalised Wishart processes
International Joint Conference on Artificial Intelligence (IJCAI), Award Winning Paper Track, Barcelona, July 2011
Uncertainty in Artificial Intelligence, Barcelona, July 2011

Copula and Wishart processes for modelling dependent uncertainty and dynamic correlations
(Poster) Bayesian Nonparametrics Workshop, Veracruz, Mexico, June 2011

Copula and Wishart processes for multivariate volatility
Rimini Bayesian Econometrics Workshop, Rimini, Italy, June 2011

Latent Gaussian process models
Latent Gaussian Models Workshop, Zurich, Switzerland, February 2011
ETH Zurich, Zurich, Switzerland, February 2011

Poisson processes
Machine Learning Group, University of Cambridge, UK, November 2010

Modelling changing uncertainty with copula processes
University College London, London, UK, October 2010

Time series
Machine Learning Group, University of Cambridge, UK, May 2010