Machine Learning 4F13, Lent 2008

Keywords: Machine learning, probabilistic modelling, graphical models, approximate inference, Bayesian statistics

Taught By: Zoubin Ghahramani and Carl Edward Rasmussen

Code and Term: 4F13 Lent term

Year: 4th year (part IIB) Engineering, also open to MPhil and PhD students.

Structure & Assessment: 14 lectures, 2 examples papers, 2 assignments (25%) and a final exam (75%).

Time: 11am to 12 noon on Wednesdays, and 10am to 11am on Fridays

Location: Lecture Room 5, First floor, Engineering Building, Trumpington Street, Cambridge

Prerequisites: A good background in statistics, calculus, linear algebra, and computer science. 3F3 Signal and Pattern Processing and 4F10 Statistical Pattern Processing would both be useful. You should thoroughly review the maths in the following cribsheet [pdf] [ps] before the start of the course. The following Matrix Cookbook is also a useful resource. If you want to do the optional coursework you need to know Matlab or Octave, or be willing to learn it on your own. Any student or researcher at Cambridge meeting these requirements is welcome to attend the lectures. Students wishing to take the course for credit should consult with the course lecturers.

Textbook: There is no required textbook. However, the material covered is treated in:

Christopher M. Bishop (2006) Pattern Recognition and Machine Learning. Springer

and we will provide references to sections in this book. Another excellent textbook is:
David J.C. MacKay (2003) Information Theory, Inference, and Learning Algorithms, Cambridge University Press, available freely on the web.

For a summary of the topics covered in this module you can read the following chapter:
Ghahramani (2004) Unsupervised Learning. In Bousquet, O., Rätsch, G. and von Luxburg, U. (eds)
Advanced Lectures on Machine Learning LNAI 3176. Springer-Verlag.
NOTE: If you want to see lecture slides from a similar but not identical course ZG taught last year, click on the 2006 course website, but be warned that the slides will change this year.

NEW: SAMPLE EXAM PAPER


LECTURE SYLLABUS

Jan 18     Introduction to Machine Learning (1L): review of probabilistic models, relation to coding; terminology: Bayes rule, supervised, unsupervised and reinforcement learning. Lecture 1 slides
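As a taste of how Bayes' rule is used, here is a minimal Octave/Matlab sketch (Matlab/Octave being the coursework language) for a binary hypothesis; the sensitivity, false-positive rate and prior below are made-up illustrative numbers, not course material.

    % Bayes' rule: P(H|D) = P(D|H) P(H) / P(D), for a binary hypothesis H.
    % All numbers are illustrative: a test with 95% sensitivity,
    % a 10% false-positive rate, and a 1% prior on the condition.
    prior     = 0.01;                % P(condition)
    sens      = 0.95;                % P(positive | condition)
    falsepos  = 0.10;                % P(positive | no condition)
    evidence  = sens*prior + falsepos*(1 - prior);  % P(positive)
    posterior = sens*prior / evidence;              % P(condition | positive)
    fprintf('P(condition | positive) = %.3f\n', posterior);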
Jan 23, 25 Unsupervised learning (2L): factor analysis, PCA, the EM algorithm. Lectures 2 and 3 slides
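For a flavour of this material, here is a minimal sketch of PCA as an eigendecomposition of the sample covariance of some made-up data; this is only one route to PCA, and the lectures relate it to factor analysis and the EM algorithm.

    % PCA via eigendecomposition of the sample covariance (toy data).
    X  = randn(100, 5) * randn(5, 5);            % 100 cases, 5 correlated dims
    Xc = X - repmat(mean(X, 1), size(X, 1), 1);  % centre the data
    [V, D] = eig(cov(Xc));                       % eigenvectors and eigenvalues
    [evals, idx] = sort(diag(D), 'descend');     % order by variance explained
    W = V(:, idx(1:2));                          % top two principal directions
    Y = Xc * W;                                  % data projected onto 2 components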
Jan 30, Feb 1, 6 Bayesian Networks (3L): conditional independence, belief propagation. Lecture 4: Graphical Models
Lecture 5: Inference in Graphical Models
Lecture 6: Learning in Graphical Models
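To see the kind of computation belief propagation makes efficient, here is a brute-force inference-by-enumeration sketch for a toy binary chain A -> B -> C; the probability tables are made up for illustration.

    % Inference by enumeration in a toy chain A -> B -> C (all binary).
    pA  = [0.7 0.3];              % P(A=0), P(A=1)
    pBA = [0.9 0.1; 0.2 0.8];     % P(B|A): row indexes A, column indexes B
    pCB = [0.8 0.2; 0.3 0.7];     % P(C|B): row indexes B, column indexes C
    post = zeros(1, 2);           % unnormalised P(A, C=1), summing out B
    for a = 1:2
      for b = 1:2
        post(a) = post(a) + pA(a) * pBA(a, b) * pCB(b, 2);
      end
    end
    post = post / sum(post)       % posterior P(A | C=1)

Enumeration like this is exponential in the number of variables; belief propagation exploits the graph structure to avoid that blow-up.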
Feb 8, 13 Monte Carlo Methods (2L): simple Monte Carlo, rejection sampling, importance sampling, Metropolis, Gibbs sampling and Hybrid Monte Carlo.

Good introductions to Monte Carlo methods can be found in chapter 29 of MacKay's book and chapter 11 of Bishop's book.
Lectures 7 and 8 slides
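As a concrete instance of one of these methods, here is a minimal Metropolis sampler with a symmetric Gaussian proposal; the one-dimensional target density is an arbitrary illustrative choice, not one from the lectures.

    % Metropolis sampling from an unnormalised target density.
    logp = @(x) -0.5*x.^2 + log(1 + 0.9*sin(3*x).^2);  % toy log density
    n = 10000; step = 1.0;
    x = zeros(n, 1);                    % chain initialised at 0
    for t = 2:n
      prop = x(t-1) + step*randn;       % propose a symmetric move
      if log(rand) < logp(prop) - logp(x(t-1))
        x(t) = prop;                    % accept
      else
        x(t) = x(t-1);                  % reject: repeat the current state
      end
    end
    hist(x(1001:end), 50);              % histogram after discarding burn-in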
Feb 15 Examples Paper 1 (1L): Questions 1-6. Examples paper 1
Feb 20, 22 Variational approximations (2L): KL divergences, mean field, expectation propagation. Lectures 10 and 11 slides
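The KL divergence at the heart of these approximations has a closed form between Gaussians; a short sketch with arbitrary parameter values:

    % KL(q || p) for univariate Gaussians q = N(mq, sq^2), p = N(mp, sp^2).
    mq = 0.0; sq = 1.0;           % parameters of q (arbitrary values)
    mp = 1.0; sp = 2.0;           % parameters of p (arbitrary values)
    kl = log(sp/sq) + (sq^2 + (mq - mp)^2) / (2*sp^2) - 0.5

Note that KL is asymmetric: variational methods minimise KL(q||p), whereas expectation propagation minimises KL(p||q), and the two behave quite differently.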
Feb 27 Model comparison (1L): Bayes factors, Occam's razor, BIC, Laplace approximations. Lecture 12 slides
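As one example, the BIC score approximates the log marginal likelihood by penalising the maximised log likelihood with a complexity term; the numbers in this sketch are made up for illustration.

    % BIC approximation: log p(D) ~ log p(D|theta_hat) - (k/2) log(n),
    % with k free parameters and n data points; larger is better here.
    n      = 200;
    loglik = [-315.2 -310.8];     % maximised log likelihoods, models 1 and 2
    k      = [3 7];               % number of free parameters in each model
    bic    = loglik - 0.5 * k * log(n)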
Feb 29, Mar 5, 7 Reinforcement Learning, Decision Making and MDPs (3L): value functions, value iteration, policy iteration, Bellman equations, Q-learning, Bayesian decision theory. Lectures 13, 14 and 15 slides
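Here is a minimal value iteration sketch on a made-up three-state, two-action MDP, repeating the Bellman backup until the values stop changing.

    % Value iteration on a toy MDP (all numbers illustrative).
    nS = 3; nA = 2; gamma = 0.9;
    P = zeros(nS, nS, nA);        % transition probabilities P(s'|s,a)
    P(:,:,1) = [0.8 0.2 0.0; 0.1 0.8 0.1; 0.0 0.2 0.8];
    P(:,:,2) = [0.2 0.8 0.0; 0.0 0.2 0.8; 0.0 0.0 1.0];
    R = [0 0; 0 0; 1 1];          % rewards R(s,a): only state 3 pays off
    V = zeros(nS, 1);
    for iter = 1:1000             % Bellman backup to convergence
      Q = R + gamma * [P(:,:,1)*V, P(:,:,2)*V];
      Vnew = max(Q, [], 2);       % greedy over actions
      if max(abs(Vnew - V)) < 1e-6, break; end
      V = Vnew;
    end
    [Vstar, policy] = max(Q, [], 2)    % optimal values and greedy policy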
Mar 12 Examples Paper 2 (1L). Examples paper 2

COURSE WORK

Assignment 1 is due on Feb 29th 2008.

Assignment 2 is due on Mar 14th 2008. You may need the following files: images.jpg, Mstep.m and genimages.m.