Blog

Machine learning blog. And such.

Uncertainty in Deep Learning (PhD Thesis)

So I finally submitted my PhD thesis, collecting previously published results on how to obtain uncertainty in deep learning, together with lots of bits and pieces of new research I had lying around...
Post

Code and discussion for the paper "A Theoretically Grounded Application of Dropout in Recurrent Neural Networks"

These are just some comments and updates on the code from the paper.
Post

Homoscedastic and heteroscedastic regression with dropout uncertainty

During a talk I gave at Google recently, I was asked about a peculiar behaviour of the uncertainty estimates we get from dropout networks.
Post

The Science of Deep Learning

I've decided to try a new idea: encouraging an interactive discussion to support or falsify a hypothesis in deep learning. This follows some thoughts I've had about the interaction between theoretical and experimental research approaches in our field. This post hosts the discussion board for the paper "A Theoretically Grounded Application of Dropout in Recurrent Neural Networks"; leave a comment if you have any feedback.
Post (Comments)

What my deep model doesn't know...

I recently spent some time trying to understand why dropout deep learning models work so well, relating them to research from the last couple of years. I was quite surprised to see how close these models are to Gaussian processes. I was even more surprised to see that we can get uncertainty information from these deep learning models for free – without changing a thing.
Post (Comments)
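The idea in that post can be sketched in a few lines: keep dropout switched on at test time, run several stochastic forward passes, and treat the spread of the predictions as uncertainty. The toy network, weights, and dropout rate below are all illustrative, not the actual models from the post:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy one-hidden-layer regression network with random (hypothetical) weights.
W1 = rng.standard_normal((1, 50))
W2 = rng.standard_normal((50, 1))

def forward(x, p=0.5):
    """One stochastic forward pass with dropout kept ON at test time."""
    h = np.maximum(x @ W1, 0.0)       # ReLU hidden layer
    mask = rng.random(h.shape) > p    # random dropout mask
    h = h * mask / (1.0 - p)          # inverted-dropout scaling
    return h @ W2

def mc_dropout_predict(x, T=100):
    """Average T stochastic passes: mean = prediction, std = uncertainty."""
    samples = np.stack([forward(x) for _ in range(T)])
    return samples.mean(axis=0), samples.std(axis=0)

x = np.array([[0.5]])
mean, std = mc_dropout_predict(x)
```

The only change from a standard network is that the dropout mask is resampled at prediction time instead of being switched off, which is why the uncertainty comes "for free".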

Contact me

Email

yg279 -at- cam.ac.uk

Cambridge University
Engineering Department
Cambridge, CB2 1PZ
United Kingdom