Non-parametric Bayesian Models

Non-parametric models are highly flexible statistical models in which
the complexity of the model grows with the amount of observed
data. While traditional parametric models make strong assumptions
about how the data were generated, non-parametric models make
weaker assumptions and let the data "speak for themselves". Many
non-parametric models can be seen as infinite limits of finite
parametric models, and an important family of non-parametric models
is derived from Dirichlet processes.
See also Gaussian Processes.
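To make the idea of "complexity growing with the data" concrete, the Dirichlet process admits a stick-breaking construction: a unit-length stick is repeatedly broken, and each piece becomes the weight of one mixture component, so components with appreciable mass appear only as needed. The sketch below is illustrative only (the function name, the truncation level `num_sticks`, and the concentration `alpha=2.0` are assumptions for the example, not taken from any paper listed here):

```python
import numpy as np

def stick_breaking(alpha, num_sticks, seed=None):
    """Sample mixture weights from a truncated Dirichlet process.

    Stick-breaking: beta_k ~ Beta(1, alpha), and the k-th weight is
    pi_k = beta_k * prod_{j<k} (1 - beta_j), i.e. a fraction beta_k
    of whatever stick length remains after the first k-1 breaks.
    """
    rng = np.random.default_rng(seed)
    betas = rng.beta(1.0, alpha, size=num_sticks)
    # Stick length remaining before each break: 1, (1-b1), (1-b1)(1-b2), ...
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - betas)[:-1]))
    return betas * remaining

# Smaller alpha concentrates mass on fewer components; larger alpha
# spreads it over more, which is how model complexity scales.
weights = stick_breaking(alpha=2.0, num_sticks=1000, seed=0)
```

In a full DP mixture model, each weight would be paired with a component parameter drawn from a base distribution; the truncation at `num_sticks` is a practical device, since the weights decay geometrically and the discarded tail mass is negligible.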
Some relevant publications:
- van Gael, J., Saatci, Y., Teh, Y.-W., and Ghahramani, Z. (2008)
Beam sampling for the Infinite Hidden
Markov Model
Proceedings of the 25th International Conference on
Machine Learning (ICML-2008).
- Knowles, D. and Ghahramani, Z. (2007)
Infinite Sparse Factor Analysis
and Infinite Independent Components Analysis.
In 7th
International Conference on Independent Component Analysis and
Signal Separation (ICA 2007). Lecture Notes in Computer Science Series
(LNCS). Springer.
- Teh, Y.W., Gorur, D. and Ghahramani, Z. (2007)
Stick-breaking
Construction for the Indian Buffet Process.
To appear in Eleventh International Conference on Artificial
Intelligence and Statistics (AISTATS 2007). San Juan, Puerto
Rico.
- Ghahramani, Z., Griffiths, T.L., Sollich, P. (2007)
Bayesian nonparametric latent feature
models (with discussion and rejoinder).
Bayesian Statistics 8. Oxford University Press.
- Heller, K.A., and Ghahramani, Z. (2007)
A Nonparametric Bayesian Approach to
Modeling Overlapping Clusters.
In Eleventh International
Conference on Artificial Intelligence and Statistics (AISTATS
2007). San Juan, Puerto Rico.
- Meeds, E., Ghahramani, Z., Neal, R. and Roweis, S.T. (2007)
Modeling Dyadic Data with Binary Latent
Factors.
In Advances in Neural Information Processing Systems 19
(NIPS-2006). Cambridge, MA: MIT Press.
- Wood, F., Griffiths, T.L. and Ghahramani, Z. (2006)
A Non-Parametric Bayesian Method for Inferring Hidden Causes.
In Uncertainty in Artificial Intelligence (UAI-2006) pp. 536-543.
- Griffiths, T.L., and Ghahramani, Z. (2006)
Infinite Latent
Feature Models and the Indian Buffet Process.
In Advances in Neural Information Processing Systems
18 (NIPS-2005).
- Heller, K.A. and Ghahramani, Z. (2005)
Bayesian Hierarchical
Clustering. Gatsby Unit Technical Report GCNU-TR 2005-002.
A shorter version was published in the Twenty-second
International Conference on Machine Learning (ICML-2005).
- Griffiths, T. L., and Ghahramani, Z. (2005)
Infinite latent feature models and
the Indian buffet process.
Gatsby Unit Technical Report
GCNU-TR 2005-001.
- Zhu, X., Ghahramani, Z., and Lafferty, J. (2005)
Time-Sensitive Dirichlet Process
Mixture Models.
Carnegie Mellon University Technical
Report CMU-CALD-05-104.
- Zhang, J., Ghahramani, Z. and Yang, Y. (2005)
A Probabilistic Model for
Online Document Clustering with Application to Novelty
Detection.
In Advances in Neural Information
Processing Systems 17 (NIPS-2004).
- Minka, T.P., and Ghahramani, Z. (2003)
Expectation Propagation for Infinite Mixtures.
Technical Report,
presented at the NIPS 2003 Workshop on Nonparametric Bayesian Methods
and Infinite Models.
- Beal, M. J., Ghahramani, Z. and Rasmussen, C. E. (2002)
The Infinite Hidden Markov Model.
In Dietterich, T.G., Becker, S. and Ghahramani, Z. (eds)
Neural Information Processing Systems 14: 577-585. Cambridge,
MA, MIT Press.
- Rasmussen, C. E. and Ghahramani, Z. (2002)
Infinite Mixtures of Gaussian Process Experts.
In Dietterich, T.G., Becker, S. and Ghahramani, Z. (eds)
Neural Information Processing Systems 14: 881-888. Cambridge,
MA, MIT Press.