Machine Learning 4F13 Lent 2009

Keywords: Machine learning, probabilistic modelling, graphical models, approximate inference, Bayesian statistics

Taught By: Zoubin Ghahramani and Carl Edward Rasmussen

Code and Term: 4F13 Lent term

Year: 4th year (part IIB) Engineering, also open to MPhil and PhD students.

Structure & Assessment: 14 lectures, 2 examples papers, 2 pieces of course work. Assessment is by the course work alone; there is no final exam.

Time: 11am-12 noon Tuesdays and 9-10am Fridays.

Location: Lecture Room 12 (LR12), Engineering Building, Trumpington Street, Cambridge

Prerequisites: A good background in statistics, calculus, linear algebra, and computer science. 3F3 Signal and Pattern Processing and 4F10 Statistical Pattern Processing would both be useful. You should thoroughly review the maths in the following cribsheet [pdf] [ps] before the start of the course. The Matrix Cookbook is also a useful resource. If you want to do the optional coursework you need to know Matlab or Octave, or be willing to learn it on your own. Any student or researcher at Cambridge meeting these requirements is welcome to attend the lectures. Students wishing to take the course for credit should consult the course lecturers.

Textbook: There is no required textbook. However, the material covered is treated in:

Christopher M. Bishop (2006) Pattern Recognition and Machine Learning. Springer

and we will provide references to sections in this book. Another excellent textbook is:
David J.C. MacKay (2003) Information Theory, Inference, and Learning Algorithms, Cambridge University Press; freely available on the web.

For a summary of the topics covered in this module, you can read the following chapter:
Ghahramani (2004) Unsupervised Learning. In Bousquet, O., Rätsch, G. and von Luxburg, U. (eds)
Advanced Lectures on Machine Learning LNAI 3176. Springer-Verlag.
NOTE: If you want to see lecture slides from a similar but not identical course taught last year, see the Lent 2008 course website; be warned that the slides will change this year.

NEW: SAMPLE EXAM PAPER


LECTURE SYLLABUS

Jan 16     Introduction to Machine Learning (1L): review of probabilistic models; relation to coding; terminology: Bayes rule, supervised, unsupervised and reinforcement learning. Lecture 1 slides
Jan 20, 23 Unsupervised learning (2L): factor analysis, PCA, the EM algorithm. Lecture 2 and 3 slides (a short PCA sketch appears after the syllabus)
Jan 27, 30, Feb 3 Bayesian Networks (3L): conditional independence, belief propagation. Lecture 4: Graphical Models
Lecture 5: Inference in Graphical Models
Lecture 6: Learning in Graphical Models
Feb 6, 10 Monte Carlo Methods (2L): simple Monte Carlo, rejection sampling, importance sampling, Metropolis, Gibbs sampling and Hybrid Monte Carlo.

Good introductions to Monte Carlo methods can be found in chapter 29 of MacKay's book and chapter 11 of Bishop's book.

Download a few demo Matlab scripts showing different sampling methods; a minimal sketch in the same spirit appears below.
Lecture 7 and 8 slides
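
For a flavour of these methods, here is a minimal random-walk Metropolis sketch in Matlab/Octave. It is not one of the course demo scripts: the two-component Gaussian mixture target, the step size and the sample count are made-up choices for illustration only.

  % Random-walk Metropolis sampling from a 1-D target density.
  % Unnormalised log target: a two-component Gaussian mixture (illustrative only).
  logp = @(x) log(0.3*exp(-0.5*(x+2).^2) + 0.7*exp(-0.5*(x-2).^2));
  N = 10000;                 % number of samples to draw
  step = 1.0;                % standard deviation of the Gaussian proposal
  x = zeros(N,1);            % the Markov chain, started at 0
  for t = 2:N
    xprop = x(t-1) + step*randn;                % propose a random-walk move
    if log(rand) < logp(xprop) - logp(x(t-1))   % accept with prob min(1, p(x')/p(x))
      x(t) = xprop;                             % accept the proposal
    else
      x(t) = x(t-1);                            % reject: repeat the current state
    end
  end
  hist(x(N/2+1:end), 50)     % histogram of samples after discarding burn-in

Because only the difference of log densities enters the acceptance test, the normalising constant of the target is never needed; that is what makes the method practical.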
Feb 13 Examples Paper 1 (1L): Questions 1-5. Examples paper 1
Feb 17, 20 Variational approximations (2L): KL divergences, mean field, expectation propagation. Lecture 10 and 11 slides
Feb 24 Model comparison (1L): Bayes factors, Occam's razor, BIC, Laplace approximations. Lecture 12 slides
Feb 27, Mar 3, 6 Reinforcement Learning, Decision Making and MDPs (3L): value functions, value iteration, policy iteration, Bellman equations, Q-learning, Bayesian decision theory. Lectures 13, 14 and 15 slides (a value iteration sketch appears after the syllabus)
Mar 10 Examples Paper 2 (1L): Questions 6-9. Examples paper 2
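
The PCA sketch referred to above, in Matlab/Octave. The toy data, the choice of two components and all variable names are assumptions for illustration, not course code:

  % PCA via the singular value decomposition on toy data.
  X = randn(100,5) * randn(5,5);               % 100 cases in 5 dimensions, with correlations
  Xc = X - repmat(mean(X,1), size(X,1), 1);    % centre each dimension
  [U, S, V] = svd(Xc, 'econ');                 % columns of V are the principal directions
  Z = Xc * V(:,1:2);                           % project onto the first two components
  varfrac = diag(S).^2 / sum(diag(S).^2);      % fraction of variance per component

And the value iteration sketch for the reinforcement learning lectures, again in Matlab/Octave. The toy 3-state, 2-action MDP (transitions, rewards, discount) is invented purely to make the Bellman backup concrete:

  % Value iteration on a toy MDP: V(s) = max_a [ R(s) + gamma * sum_s' P(s'|s,a) V(s') ].
  nS = 3; nA = 2;
  P = zeros(nS,nS,nA);                                  % P(s,s',a): transition probabilities
  P(:,:,1) = [0.9 0.1 0.0; 0.1 0.8 0.1; 0.0 0.1 0.9];   % action 1: mostly stay
  P(:,:,2) = [0.1 0.9 0.0; 0.0 0.1 0.9; 0.9 0.1 0.0];   % action 2: mostly advance
  R = [0; 0; 1];                                        % reward depends on the state only
  gamma = 0.9;                                          % discount factor
  V = zeros(nS,1);
  Q = zeros(nS,nA);
  for iter = 1:1000
    for a = 1:nA
      Q(:,a) = R + gamma * P(:,:,a) * V;                % Bellman backup for each action
    end
    Vnew = max(Q, [], 2);                               % greedy over actions
    if max(abs(Vnew - V)) < 1e-8, V = Vnew; break; end  % stop at convergence
    V = Vnew;
  end
  [dummy, policy] = max(Q, [], 2);                      % greedy policy from the final Q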
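Because the discount factor gamma is below 1, each sweep is a contraction, so the values converge to the unique fixed point of the Bellman equation regardless of the starting V.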

COURSE WORK

Assignment 1 is due on Feb 24th 2009.

Assignment 2 is due on Mar 10th 2009. You may need the following files: images.jpg, Mstep.m and genimages.m.