This workshop took place on December 13th in Whistler, BC, Canada,
and formed part of the NIPS*03 workshop sessions.
The workshops followed the main conference, held in Vancouver on 8-11 December.

Thank you all, speakers, panelists and attendees, for an extremely insightful and productive workshop.
Please contact the workshop organisers with any questions:
Matthew J. Beal & Yee-Whye Teh.

*** New workshop at NIPS*05 ***

Overview

A long-standing issue with learning in graphical models has been determining the appropriate model size and structure. In many real-world applications, traditional models with a small number of latent variables seem inadequate. In the quest for more flexible modelling tools, recent research has turned to the limit of such models with infinitely many latent variables and parameters (for example, neural nets with infinitely many hidden units, mixture models with infinitely many clusters, and hidden Markov models with infinitely many states). This limit corresponds to the field traditionally covered by Nonparametric Bayesian Statistics, which assumes a priori that the data were generated from a nonparametric model with a possibly infinite number of parameters, experts, or hidden states. In particular, infinite models based on Dirichlet processes have recently been introduced as a very attractive alternative to finite models, which require cumbersome model selection.
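To make the idea concrete, the following is a minimal sketch (not material from the workshop itself) of the Chinese restaurant process, the partition distribution induced by a Dirichlet process: each new observation joins an existing cluster with probability proportional to that cluster's size, or opens a new cluster with probability proportional to a concentration parameter. The function name `crp_partition` and the parameter `alpha` are illustrative choices, but the sampling rule is the standard one; note how the number of clusters is not fixed in advance but grows with the data.

```python
import random

def crp_partition(n, alpha, seed=0):
    """Sample a random partition of n items via the Chinese restaurant process.

    Under a Dirichlet process with concentration `alpha`, item i joins an
    existing cluster k with probability counts[k] / (i + alpha), or starts
    a new cluster with probability alpha / (i + alpha).
    """
    rng = random.Random(seed)
    counts = []        # counts[k] = number of items currently in cluster k
    assignments = []   # cluster index assigned to each item
    for i in range(n):
        # Draw a point in [0, i + alpha): mass i for existing clusters,
        # mass alpha for a brand-new cluster.
        r = rng.uniform(0.0, i + alpha)
        acc = 0.0
        for k, c in enumerate(counts):
            acc += c
            if r < acc:
                counts[k] += 1
                assignments.append(k)
                break
        else:
            counts.append(1)                   # open a new cluster
            assignments.append(len(counts) - 1)
    return assignments

if __name__ == "__main__":
    z = crp_partition(100, alpha=2.0, seed=1)
    print("items:", len(z), "clusters used:", max(z) + 1)
```

Larger `alpha` yields more clusters on average (roughly alpha * log n for n items), which is how such models sidestep choosing a fixed number of components up front.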

Goals

The workshop will bring together researchers and practitioners of nonparametric Bayesian methods to share their experiences and expertise with the general NIPS community, in an effort to transfer and build upon key methodologies developed in the statistics community. In particular, we wish to discuss the following themes and questions: