The goal of my research is to discover new principles of neural computation. In particular, I am interested in understanding how neural networks learn to perform difficult computations, such as probabilistic inference in deep generative models. My approach is to combine techniques from machine learning, statistical physics and neuroscience.

Primary Contributions:

  • Optimal Compensation theory: I developed a theoretical framework to describe neural degeneration by generalising optimal computation theories from the intact brain to the damaged brain.
  • Statistics of Balanced Networks: I calculated noise correlations and Fisher information in balanced networks and found, counterintuitively, that information increases as correlations increase.
  • A new phase transition in foam physics: I made the first measurement of a new state of foam (the eight-fold vertex foam) onboard the European Space Agency's Zero-G Airplane.
  • Other areas of research: neural networks, sparse coding, variational inference, the Helmholtz machine, auto-encoding, optimal compensation theory, quadratic programming, balanced network theory, noise correlations, visual cortex tuning, natural sound processing and information theory.

Academic biography: I received an undergraduate degree in Theoretical Physics from Trinity College Dublin in 2006, and an M.Sc. in Sparse Coding in 2007. I completed a Ph.D. in Computational Neuroscience and Machine Learning at the Gatsby Unit, UCL, with Prof. Peter Latham and Prof. Peter Dayan in 2012. After my Ph.D., I held a joint research position at the École Normale Supérieure, Paris and the Champalimaud Centre for the Unknown, Lisbon. In May 2014, I joined the Computational and Biological Learning Lab at Cambridge University, where I have been working with Máté Lengyel. I am also a College Research Associate at St John's College, Cambridge.

Last updated: July 2015