
## Gaussian Processes and Kernel Methods

Gaussian processes are non-parametric distributions useful for doing Bayesian inference and learning on unknown functions. They can be used for non-linear regression, time-series modelling, classification, and many other problems.
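
As an illustrative sketch (not part of the original page), the core GP regression computation — the posterior mean and variance at a test input under a squared-exponential kernel — fits in a few lines of NumPy. The hyperparameter values and function names here are assumptions for illustration:

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    """Squared-exponential (RBF) covariance between two sets of 1-D inputs."""
    sqdist = (X1[:, None] - X2[None, :]) ** 2
    return variance * np.exp(-0.5 * sqdist / lengthscale**2)

def gp_posterior(X_train, y_train, X_test, noise=1e-2):
    """Posterior mean and variance of a zero-mean GP with an RBF kernel."""
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    K_s = rbf_kernel(X_train, X_test)
    K_ss = rbf_kernel(X_test, X_test)
    L = np.linalg.cholesky(K)          # Cholesky factorisation for a stable solve
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_s.T @ alpha               # posterior mean at the test inputs
    v = np.linalg.solve(L, K_s)
    var = np.diag(K_ss) - np.sum(v**2, axis=0)   # posterior marginal variances
    return mean, var

# Toy example: noisy-free observations of sin(x) at five points.
X = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
y = np.sin(X)
mean, var = gp_posterior(X, y, np.array([0.5]))
```

The Cholesky route avoids explicitly inverting the kernel matrix, which is the standard numerically stable way to implement these equations.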

## Clustering

Clustering algorithms are unsupervised methods for finding groups of similar points in data. They are closely related to statistical mixture models.
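
As a concrete example (added here as a sketch, not from the original page), k-means is perhaps the simplest clustering algorithm, and can be viewed as the hard-assignment limit of fitting a Gaussian mixture model:

```python
import numpy as np

def kmeans(X, k, n_iters=100, seed=0):
    """Basic k-means (Lloyd's algorithm): alternate assigning points to the
    nearest centroid and recomputing each centroid as its cluster mean."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iters):
        # Assignment step: nearest centroid by squared Euclidean distance.
        d = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        # Update step: move each centroid to the mean of its assigned points,
        # keeping the old centroid if a cluster happens to be empty.
        new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                        else centroids[j] for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return labels, centroids

# Two well-separated blobs of points.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (20, 2)), rng.normal(5, 0.3, (20, 2))])
labels, centroids = kmeans(X, k=2)
```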

## Graphical Models

Graphical models are a graphical representation of the conditional independence relations among a set of variables. The graph is useful both as an intuitive representation of how the variables are related, and as a tool for defining efficient message passing algorithms for probabilistic inference.
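
To make the message passing idea concrete, here is a sketch (not from the original page; the potentials are made up) of sum-product on a three-variable chain, compared against brute-force enumeration:

```python
import numpy as np

# Chain-structured model p(x1, x2, x3) proportional to phi1(x1) psi12(x1,x2) psi23(x2,x3),
# with binary variables and arbitrary illustrative potentials.
phi1 = np.array([0.5, 0.5])
psi12 = np.array([[0.9, 0.1],
                  [0.2, 0.8]])
psi23 = np.array([[0.7, 0.3],
                  [0.4, 0.6]])

# Sum-product along the chain: each message sums out the sender's variable.
m12 = phi1 @ psi12                 # message from x1 to x2
m23 = m12 @ psi23                  # message from x2 to x3
marginal_x3 = m23 / m23.sum()      # normalised marginal p(x3)

# Brute force over all joint configurations, for comparison.
joint = phi1[:, None, None] * psi12[:, :, None] * psi23[None, :, :]
brute = joint.sum(axis=(0, 1))
brute = brute / brute.sum()
```

On a chain the messages cost O(number of variables), whereas brute-force enumeration is exponential — which is the point of exploiting the graph structure.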

## Monte Carlo Methods

Markov chain Monte Carlo (MCMC) methods use sampling to approximate high dimensional integrals and intractable sums. MCMC methods are widely used in many areas of science, applied mathematics and engineering. They are an indispensable approximate inference tool for Bayesian statistics and machine learning.
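
A minimal sketch of one MCMC algorithm, random-walk Metropolis (added for illustration; the target and step size are arbitrary choices), needs the target density only up to a normalising constant:

```python
import numpy as np

def metropolis(log_p, x0, n_samples=20000, step=1.0, seed=0):
    """Random-walk Metropolis: propose x' ~ N(x, step^2) and accept with
    probability min(1, p(x') / p(x)), computed in log space."""
    rng = np.random.default_rng(seed)
    x = x0
    samples = np.empty(n_samples)
    for i in range(n_samples):
        x_prop = x + step * rng.normal()
        # Accept or reject the proposal; on rejection the chain stays put.
        if np.log(rng.uniform()) < log_p(x_prop) - log_p(x):
            x = x_prop
        samples[i] = x
    return samples

# Target: a standard normal, known only up to a constant.
log_p = lambda x: -0.5 * x**2
samples = metropolis(log_p, x0=0.0)
```

The empirical mean and standard deviation of the chain should approach those of the target (0 and 1 here) as the number of samples grows.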

## Semi-Supervised Learning

Often, it is easy and cheap to obtain large amounts of unlabelled data (e.g. images, text documents), while it is hard or expensive to obtain labelled data. Semi-supervised learning methods attempt to use the unlabelled data to improve the performance on supervised learning tasks, such as classification.
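
One simple semi-supervised strategy (a sketch added here, not the original authors' method) is self-training: fit a classifier on the labelled points, pseudo-label the unlabelled ones, and refit on everything. The nearest-centroid classifier below is an illustrative choice:

```python
import numpy as np

def self_train(X_lab, y_lab, X_unlab, n_rounds=5):
    """Self-training with a nearest-centroid base classifier: repeatedly
    pseudo-label the unlabelled points and refit on the augmented data."""
    X_cur, y_cur = X_lab.copy(), y_lab.copy()
    for _ in range(n_rounds):
        classes = np.unique(y_cur)
        centroids = np.array([X_cur[y_cur == c].mean(axis=0) for c in classes])
        # Pseudo-label each unlabelled point with its nearest class centroid.
        d = ((X_unlab[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
        pseudo = classes[d.argmin(axis=1)]
        X_cur = np.vstack([X_lab, X_unlab])
        y_cur = np.concatenate([y_lab, pseudo])
    return y_cur[len(y_lab):]

# One labelled point per class, plus many unlabelled points in two blobs.
rng = np.random.default_rng(0)
X_unlab = np.vstack([rng.normal(0, 0.3, (15, 2)), rng.normal(4, 0.3, (15, 2))])
X_lab = np.array([[0.0, 0.0], [4.0, 4.0]])
y_lab = np.array([0, 1])
pseudo = self_train(X_lab, y_lab, X_unlab)
```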

## Non-parametric Bayesian Learning

Non-parametric models are very flexible statistical models in which the complexity of the model grows with the amount of observed data. While traditional parametric models make strong assumptions about how the data was generated, non-parametric models try to make weaker assumptions and let the data "speak for itself". Many non-parametric models can be seen as infinite limits of finite parametric models, and an important family of non-parametric models is derived from Dirichlet processes. See also Gaussian Processes.
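
A small sketch (added for illustration) of the Dirichlet process's clustering behaviour is the Chinese restaurant process: customer i joins an existing table with probability proportional to its occupancy, or starts a new table with probability proportional to the concentration parameter alpha:

```python
import numpy as np

def crp(n, alpha, seed=0):
    """Sample a random partition of n customers from a Chinese restaurant
    process with concentration parameter alpha."""
    rng = np.random.default_rng(seed)
    tables = []       # occupancy count per table
    assignments = []  # table index chosen by each customer
    for i in range(n):
        # Existing tables have probability n_k/(i+alpha); a new table alpha/(i+alpha).
        probs = np.array(tables + [alpha], dtype=float) / (i + alpha)
        k = rng.choice(len(probs), p=probs)
        if k == len(tables):
            tables.append(1)   # open a new table
        else:
            tables[k] += 1
        assignments.append(k)
    return assignments, tables

assignments, tables = crp(100, alpha=2.0)
```

The number of occupied tables grows without bound (roughly logarithmically) as more customers arrive — the model's complexity grows with the data, as described above.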

## Approximate Inference

For all but the simplest statistical models, exact learning and inference are computationally intractable. Approximate inference methods make it possible to learn realistic models from large data sets. Generally, approximate inference methods trade off computation time for accuracy. Some of the major classes of approximate inference methods include Markov chain Monte Carlo methods, variational methods and related algorithms such as Expectation Propagation.

## Bioinformatics

Recent advances in biology have allowed us to collect vast amounts of genetic, proteomic and biomedical data. While this data offers the potential to help us understand the building blocks of life, and to revolutionise medicine, analysing and understanding it poses immense computational and statistical challenges. Our work in Bioinformatics includes modelling protein secondary and tertiary structure, analysis of gene microarray data, protein-protein interactions, and biomarker discovery.

## Information Retrieval

Information retrieval concerns developing systems that find material from within a large unstructured collection (e.g. the internet) that satisfies the user's need. The best-known examples of such systems are web search engines, such as Google, but there are many other specialized applications of information retrieval (such as collaborative filtering and recommender systems). Information retrieval can be thought of as an inference problem: given the user's query, what are the relevant items in the data collection?
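
A classical (and deliberately minimal) retrieval sketch, added here for illustration, ranks documents by cosine similarity between TF-IDF bag-of-words vectors of the query and each document:

```python
import math
from collections import Counter

def tfidf_search(docs, query):
    """Rank documents by cosine similarity between TF-IDF vectors of the
    query and each document (a minimal bag-of-words retrieval model)."""
    tokenised = [d.lower().split() for d in docs]
    df = Counter()                       # document frequency of each term
    for toks in tokenised:
        df.update(set(toks))
    n = len(docs)
    idf = {t: math.log(n / df[t]) for t in df}

    def vec(tokens):
        tf = Counter(tokens)
        return {t: tf[t] * idf.get(t, 0.0) for t in tf}

    def cosine(u, v):
        dot = sum(u[t] * v.get(t, 0.0) for t in u)
        nu = math.sqrt(sum(x * x for x in u.values()))
        nv = math.sqrt(sum(x * x for x in v.values()))
        return dot / (nu * nv) if nu and nv else 0.0

    q = vec(query.lower().split())
    scores = [(cosine(q, vec(toks)), i) for i, toks in enumerate(tokenised)]
    return sorted(scores, reverse=True)  # best-matching document first

docs = ["the cat sat on the mat",
        "dogs chase cats in the park",
        "stock markets rose sharply today"]
results = tfidf_search(docs, "cat on a mat")
```

Framed as inference, the TF-IDF score is a crude stand-in for the relevance probability of each item given the query.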

## Reinforcement Learning and Control

We are interested in understanding the human sensory motor system from a mathematical, computational and engineering point of view. To do this, we need to use concepts from control theory, optimization, machine learning and statistics, as well as experimental methods based on human psychophysics and virtual reality. These formal tools are also useful for advancing robotics and decision theory.

## Time Series Models

Modelling time series and sequential data is an essential part of many different areas of science and engineering, including, for example, signal processing and control, bioinformatics, speech recognition, econometrics and finance. Using basic building blocks such as hidden Markov models, linear Gaussian state-space models, and Bayesian networks, it is possible to develop sophisticated time series models for real world data. However, learning (parameter inference / system identification) becomes computationally challenging for such sophisticated models.
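
As a sketch of inference in the simplest of these building blocks (added for illustration; the two-state HMM below is made up), the forward algorithm computes the likelihood of an observation sequence by recursively summing over hidden state paths:

```python
import numpy as np

def forward(pi, A, B, obs):
    """Forward algorithm: likelihood p(observations) for a discrete HMM.
    pi: initial state distribution, A: state transition matrix,
    B: emission matrix (states x symbols), obs: observed symbol indices."""
    alpha = pi * B[:, obs[0]]            # initialise with the first observation
    for t in obs[1:]:
        # Propagate through the transition matrix, then weight by the emission.
        alpha = (alpha @ A) * B[:, t]
    return alpha.sum()                   # sum over final hidden states

# Hypothetical two-state HMM emitting two symbols.
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3],
              [0.4, 0.6]])
B = np.array([[0.9, 0.1],
              [0.2, 0.8]])
likelihood = forward(pi, A, B, [0, 0, 1])
```

The recursion costs O(T K^2) for T observations and K states, versus O(K^T) for naive enumeration of hidden paths — the kind of structure that keeps learning tractable for the simpler models mentioned above.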

## Network Modelling

## Active Learning

## Neuroscience

## Signal Processing

## Machine Vision

## Machine Hearing

## Natural Language Processing

## Deep Learning

## Review Articles and Tutorials