Thang Bui

Thang Bui, 4th year PhD student
Machine Learning Group
Computational and Biological Learning Lab
University of Cambridge

Supervisor: Richard Turner
Advisor: Carl Rasmussen

CV github twitter @thdbui

News

Conference and journal papers

A Unifying Framework for Sparse Gaussian Process Approximation using Power Expectation Propagation

Thang Bui, Josiah Yan, Rich Turner
JMLR 2017
arxiv code

Streaming Sparse Gaussian Process Approximations

Thang Bui, Cuong Nguyen, Rich Turner
NIPS 2017
arxiv code

Deep Gaussian Processes for Regression using Approximate Expectation Propagation

Thang Bui, José Miguel Hernández-Lobato, Yingzhen Li, Daniel Hernández-Lobato and Rich Turner
ICML 2016 paper code

Black-box alpha-divergence minimization

José Miguel Hernández-Lobato, Yingzhen Li, Mark Rowland, Daniel Hernández-Lobato, Thang Bui and Rich Turner
ICML 2016 paper code

Learning stationary time series using Gaussian processes with nonparametric kernels

Felipe Tobar, Thang Bui and Rich Turner
NIPS 2015 (Spotlight, acceptance rate = 3.6%) paper

Tree-structured Gaussian process approximations

Thang Bui and Rich Turner
NIPS 2014 (Spotlight, acceptance rate = 3.6%) paper code

Workshop papers

Importance weighted autoencoders with random neural network parameters

Daniel Hernández-Lobato, Thang Bui, José Miguel Hernández-Lobato, Yingzhen Li, and Rich Turner
NIPS Workshop on Bayesian Deep Learning, 2016 paper

Black-box alpha divergence for generative models

Thang Bui, Daniel Hernández-Lobato, José Miguel Hernández-Lobato, Yingzhen Li, and Rich Turner
NIPS Workshop on Advances in Approximate Bayesian Inference, 2016 paper

Circular Pseudo-point approximations for scaling Gaussian processes

Will Tebbutt, Thang Bui and Rich Turner
NIPS Workshop on Advances in Approximate Bayesian Inference, 2016 paper

Bayesian Gaussian process state space models via Power-EP

Thang Bui, Carl Rasmussen and Rich Turner
ICML Workshop on Data-Efficient Machine Learning, 2016 paper

Training deep Gaussian processes using stochastic expectation propagation and probabilistic backpropagation

Thang Bui, José Miguel Hernández-Lobato, Yingzhen Li, Daniel Hernández-Lobato and Rich Turner
NIPS Workshop on Advances in Approximate Bayesian Inference, 2015. arXiv

Stochastic variational inference for Gaussian process latent variable models using back constraints

Thang Bui and Rich Turner
NIPS Workshop on Black Box Learning and Inference, 2015 paper

Black-box alpha-divergence minimisation

José Miguel Hernández-Lobato, Yingzhen Li, Daniel Hernández-Lobato, Thang Bui and Rich Turner
NIPS Workshops on Advances in Approximate Bayesian Inference and Black Box Learning and Inference, 2015. arXiv

Stochastic expectation propagation for large scale Gaussian process classification

Daniel Hernández-Lobato, José Miguel Hernández-Lobato, Yingzhen Li, Thang Bui and Rich Turner
NIPS Workshop on Advances in Approximate Bayesian Inference, 2015. arXiv

Design of covariance functions using inter-domain inducing variables

Felipe Tobar, Thang Bui and Rich Turner
NIPS Workshop on Time Series, 2015 (Best Paper Prize) paper

Misc

Sparse Approximations for Non-Conjugate Gaussian Process Regressions

Thang Bui and Rich Turner
report

Through-the-Wall Imaging Radar

Thang Bui, Joseph Rabig, Douglas Gray and Richard Drake
Progress in Radar Research, 2012, Adelaide, Australia.
Link Technical Report Poster

Talks

Short bio

I received a Bachelor of Engineering (Telecommunications) from Adelaide University, Australia, in 2011. I then spent some time working as a research associate at the Teletraffic Research Centre and at Telari in Adelaide. I have done summer internships at Google Research (2016), Toshiba Cambridge Research Lab (2014), CISRA, Canon Australia (2011-2012), UNSW (2010-2011) and UofA (2009-2010).

Other things

Member of Trinity College

Reviewer for JMLR (2016, 2017), NIPS (2016, 2017), ICLR (2017, 2018), ICML (2017), AISTATS (2018)

Contact

CBL, Department of Engineering
Trumpington Street, Cambridge CB2 1PZ, UK
tdb40 at cam.ac.uk