
Zoubin Ghahramani

Bayesian model selection: non-parametric approaches and recent advances in approximate inference

 

Bayesian models are most practical and robust when the prior assumptions are reasonable. While parametric modelling often focuses on simple models which can yield tractable inferences, the assumptions made by many parametric models can be unreasonably restrictive and fail to capture our prior beliefs. In contrast, non-parametric models, which can often be derived as infinite-parameter limits of parametric models, seek to define more flexible priors that better capture our beliefs. One attraction of the non-parametric approach is that model selection is often unnecessary. For example, a classic non-parametric Bayesian model is the Dirichlet process mixture (DPM) model, which assumes infinitely many mixture components, making model selection over the number of components unnecessary. Classic approaches to non-parametric modelling are based on MCMC. I will describe three modern alternatives that provide more suitable speed/accuracy tradeoffs for large-scale problems: (1) variational Bayesian (VB) approximations, (2) expectation propagation (EP), and (3) a new approximate inference method for DPMs based on Bayesian hierarchical clustering.
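To illustrate the point about model selection, the Dirichlet process prior underlying a DPM can be sampled via the Chinese restaurant process: the number of occupied components is not fixed in advance but grows (roughly logarithmically) with the number of data points. The sketch below is an illustration of this prior only, not of any of the inference methods above; the function name and the use of a fixed seed are my own choices.

```python
import random

def crp_partition(n, alpha, seed=0):
    """Draw a partition of n items from the Chinese restaurant process
    with concentration parameter alpha (a draw from the DP prior over
    mixture-component assignments; illustrative sketch only)."""
    rng = random.Random(seed)
    counts = []  # number of items assigned to each component ("table")
    for i in range(n):
        # item i joins existing component k with prob counts[k]/(i + alpha),
        # or opens a new component with prob alpha/(i + alpha)
        probs = [c / (i + alpha) for c in counts] + [alpha / (i + alpha)]
        r = rng.random()
        acc = 0.0
        for k, p in enumerate(probs):
            acc += p
            if r < acc:
                break
        if k == len(counts):
            counts.append(1)  # a new mixture component appears
        else:
            counts[k] += 1
    return counts
```

Running this with growing `n` shows the effective number of components adapting to the data size, which is why no separate model-selection step over the number of components is needed.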