Gatsby Computational Neuroscience Unit
Programme: MSc in Intelligent Systems, Gatsby Unit Core Course
Prerequisites: A good background in statistics, calculus, linear algebra, and computer science. You should thoroughly review the maths in the following cribsheet [pdf] [ps] before the start of the course. You must either know Matlab or Octave, be taking a class on Matlab/Octave, or be willing to learn it on your own. Any student or researcher at UCL meeting these requirements is welcome to attend the lectures. Students wishing to take it for credit should consult with the course lecturer (email:
Term: 1, 2003
Time: 11.00 to 13.00 Mondays and Thursdays
Location: Gatsby Unit, 17 Queen Square
Taught By: Zoubin Ghahramani
Teaching Assistants: Iain Murray and Ed Snelson.
Homework Assignments: All assignments for this course are to be handed in at the Gatsby Unit, not at the CS department. Please hand in assignments at the beginning of the lecture on the due date, to Zoubin, Iain, or Ed. Late assignments will be penalised. If you are unable to come to class, you may instead hand in assignments to Alexandra Boss, Room 408, Gatsby Unit.
Late Assignment Policy: Assignments handed in late will be penalised 10% for each weekday they are late, up until the answers are discussed in a review session. NO CREDIT will be given for assignments handed in after the answers have been discussed in the review session.
Textbook: There is no required textbook. However, I recommend the following recently published textbook as an excellent source for many of the topics covered here, and I will occasionally assign reading from it:
David J.C. MacKay (2003) Information Theory, Inference, and Learning Algorithms, Cambridge University Press. (also available online)
NOTE: NO LECTURE on Thursday, Oct 2.
Dates and Title | Materials
Sep 29, Oct 6, Oct 9: Introduction and Statistical Foundations | Lecture Slides; Assignment 1 (due Oct 9); Readings: Nuances of Probability Theory by Tom Minka; Probability Theory: The Logic of Science by E.T. Jaynes; Sam Roweis's notes on matrix algebra; Tom Minka's notes on matrix algebra; Probability and Statistics Online Reference
Oct 2 | No lecture (see note above)
Oct 13 and Oct 16: Latent Variable Models | Lecture Slides; Assignment 2 (NEW: due Thurs Oct 23); Suggested Readings: Cribsheet [pdf] [ps] of Basic Maths Needed for Machine Learning; David MacKay's book, Chapters 20, 22 and 23 on k-means and MoG; Max Welling's class notes on PCA and FA [pdf] [ps]
Oct 20 and 23: The EM Algorithm (see the EM sketch after the schedule) | Lecture Slides; Assignment 3 (data: binarydigits.txt; code: bindigit.m)
Oct 27 and 30: Latent Variable Time Series Models | Lecture Slides; Suggested Further Readings: Ghahramani, Z. and Hinton, G.E. (1996) Parameter Estimation for Linear Dynamical Systems; Minka, T. (1999) From Hidden Markov Models to Linear Dynamical Systems; Welling (2002) The Kalman Filter (class notes)
Nov 3 and 6: Reading Week |
Nov 10: Introduction to Graphical Models I | Lecture Slides; Assignment 4 (due Nov 19, deadline extended); Data sets: geyser.txt, data1.txt; Suggested Further Readings: three related articles to appear in Arbib (ed.), The Handbook of Brain Theory and Neural Networks (2nd edition): Jordan and Weiss (2002) Probabilistic Inference in Graphical Models; Ghahramani (2002) Graphical Models: Parameter Learning; Heckerman (2002) Graphical Models: Structure Learning; also Shachter (1998) Bayes Ball
Nov 13: Introduction to Graphical Models II | Belief Propagation Demo: Fluffy and Moby; Factor Graph Propagation
Nov 17 and 20: Hierarchical and Nonlinear Models | Lecture Slides; Demo; Suggested Readings: Max Welling's notes on ICA; David MacKay's book, Ch. 34 on ICA
Nov 24 and Nov 27: Sampling Methods | Lecture Slides (MCMC); Assignment 5 (due Dec 4); Suggested Readings: David MacKay's book, Ch. 29 and 30 on Monte Carlo methods; a more in-depth treatment of Monte Carlo methods is in Radford Neal's technical report
Dec 1: Variational Approximations | Lecture Slides (Variational); Suggested Readings: David MacKay's book, Ch. 33 on variational methods; Jordan et al.'s Introduction to Variational Methods [ps.gz] [pdf]
Dec 4: Bayesian Model Selection | Lecture Slides (Bayesian Model Selection); Assignment 6 (due end of term); Data: images.jpg; Code: genimages.m, MStep.m
Dec 8 and 11 |
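For reference alongside the EM lectures (Oct 20 and 23) and the mixture-model readings above, here is a minimal, illustrative sketch of EM for a one-dimensional mixture of K Gaussians, written in Octave/MATLAB. This is not the course's assignment code or handout: the function name mog_em and all variable names (x, K, niter, mu, s2, pik) are invented for this sketch, and it assumes implicit array broadcasting (Octave, or MATLAB R2016b and later).

    function [mu, s2, pik] = mog_em(x, K, niter)
      % Illustrative sketch only (not course code): EM for a 1-D mixture of K Gaussians.
      x = x(:);  N = numel(x);
      idx = randperm(N);  mu = x(idx(1:K));    % random initial means (K x 1)
      s2  = var(x) * ones(K, 1);               % initial variances
      pik = ones(K, 1) / K;                    % initial mixing proportions
      for it = 1:niter
        % E-step: responsibilities r(n,k) = p(z_n = k | x_n), computed in the log domain
        logr = -0.5 * (x - mu').^2 ./ s2' - 0.5 * log(2*pi*s2') + log(pik');
        logr = logr - max(logr, [], 2);        % stabilise before exponentiating
        r = exp(logr);  r = r ./ sum(r, 2);
        % M-step: re-estimate parameters from responsibility-weighted statistics
        Nk  = sum(r, 1)';                      % effective counts (K x 1)
        mu  = (r' * x) ./ Nk;
        s2  = (r' * x.^2) ./ Nk - mu.^2;
        pik = Nk / N;
      end
    end

    % example: two well-separated clusters; the fitted means should come out near 0 and 3
    % x = [randn(200,1); 3 + randn(200,1)];  [mu, s2, pik] = mog_em(x, 2, 50);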
Aims: This course provides students with an in-depth introduction to unsupervised learning techniques. It presents probabilistic approaches to modelling and their relation to coding theory and Bayesian statistics. A variety of latent variable models will be covered, including mixture models (used for clustering), dimensionality reduction methods, time series models such as hidden Markov models (used in speech recognition and bioinformatics), independent components analysis, hierarchical models, and nonlinear models. The course will present the foundations of probabilistic graphical models (e.g. Bayesian networks and Markov networks) as an overarching framework for unsupervised modelling. We will cover Markov chain Monte Carlo sampling methods and variational approximations for inference. Time permitting, students will also learn about Gaussian processes and the fundamentals of Bayesian decision theory, reinforcement learning, and optimal control.
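As a concrete illustration of the Markov chain Monte Carlo sampling methods mentioned in the aims above, the following is a minimal random-walk Metropolis sampler in Octave/MATLAB. It is a hedged sketch rather than course material: the function name metropolis and its arguments (logp, x0, nsamples, stepsize) are invented for this example, and logp stands for any function handle returning an unnormalised log-density.

    function samples = metropolis(logp, x0, nsamples, stepsize)
      % Illustrative sketch only: random-walk Metropolis sampling from exp(logp(x)).
      x = x0(:)';  lp = logp(x);
      samples = zeros(nsamples, numel(x));
      for t = 1:nsamples
        xprop  = x + stepsize * randn(size(x));  % symmetric Gaussian proposal
        lpprop = logp(xprop);
        if log(rand) < lpprop - lp               % accept with probability min(1, ratio)
          x = xprop;  lp = lpprop;
        end
        samples(t, :) = x;                       % record the current state either way
      end
    end

    % example: 5000 (correlated) draws from a 2-D standard Gaussian
    % samples = metropolis(@(x) -0.5 * sum(x.^2), [5 5], 5000, 0.5);

Because the proposal is symmetric, the acceptance rule needs only the ratio of target densities, so logp need not be normalised; this is the basic building block behind the more elaborate Monte Carlo methods covered in the sampling lectures.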
Learning Outcomes: To understand the theory of unsupervised learning systems; to have in-depth knowledge of the main models used in unsupervised learning; to understand the methods of exact and approximate inference in probabilistic models; and to be able to recognise which models are appropriate for different real-world applications of machine learning methods.
Method: Lecture presentations with associated class problems.
Assessment: