Some papers of mine dealing with latent variable models:

Neal, R. M. (2001) ``Defining priors for distributions using Dirichlet diffusion trees'', Technical Report No. 0104, Dept. of Statistics, University of Toronto, 25 pages: abstract, postscript, pdf, associated software.

Neal, R. M. (2000) ``Markov chain sampling methods for Dirichlet process mixture models'', Journal of Computational and Graphical Statistics, vol. 9, pp. 249-265: abstract, associated references, associated software.

Neal, R. M. (1991) ``Bayesian mixture modeling by Monte Carlo simulation'', Technical Report CRG-TR-91-2, Dept. of Computer Science, University of Toronto, 23 pages: abstract, postscript, pdf, associated references.

Neal, R. M. (1990) ``Learning stochastic feedforward networks'', Technical Report CRG-TR-90-7, Dept. of Computer Science, University of Toronto, 34 pages: abstract, postscript, pdf, associated reference.

Dayan, P., Hinton, G. E., Neal, R. M., and Zemel, R. S. (1995) ``The Helmholtz machine'', Neural Computation, vol. 7, pp. 1022-1037: abstract, associated reference.

Hinton, G. E., Dayan, P., Frey, B. J., and Neal, R. M. (1995) ``The `wake-sleep' algorithm for unsupervised neural networks'', Science, vol. 268, pp. 1158-1161: abstract, associated references.

Neal, R. M. and Dayan, P. (1996) ``Factor analysis using delta-rule wake-sleep learning'', Technical Report No. 9607, Dept. of Statistics, University of Toronto, 23 pages: abstract, postscript, pdf, associated references, associated software.

Maximum likelihood inference for latent variable models is often done with the EM algorithm, an unconventional view of which is explained in the following paper:

Neal, R. M. and Hinton, G. E. (1998) ``A view of the EM algorithm that justifies incremental, sparse, and other variants'', in M. I. Jordan (editor), Learning in Graphical Models, pp. 355-368, Dordrecht: Kluwer Academic Publishers: abstract, postscript, pdf.
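As a toy illustration of the standard EM algorithm for latent variable models (not the incremental or sparse variants discussed in the 1998 paper), here is a minimal sketch that fits a two-component one-dimensional Gaussian mixture; the function and variable names are made up for this example:

```python
import math
import random

def em_gaussian_mixture(data, n_iter=50):
    """Minimal EM for a two-component 1-D Gaussian mixture.

    E-step: compute each point's responsibilities (posterior probabilities
    of the two components given current parameters).
    M-step: re-estimate mixing weights, means, and variances from them.
    """
    # Crude initialization from the range of the data.
    mu = [min(data), max(data)]
    var = [1.0, 1.0]
    pi = [0.5, 0.5]

    def normal_pdf(x, m, v):
        return math.exp(-(x - m) ** 2 / (2 * v)) / math.sqrt(2 * math.pi * v)

    for _ in range(n_iter):
        # E-step: responsibilities r[i][k] for point i, component k.
        r = []
        for x in data:
            p = [pi[k] * normal_pdf(x, mu[k], var[k]) for k in range(2)]
            s = sum(p)
            r.append([pk / s for pk in p])
        # M-step: weighted re-estimation of each component's parameters.
        for k in range(2):
            nk = sum(ri[k] for ri in r)
            pi[k] = nk / len(data)
            mu[k] = sum(ri[k] * x for ri, x in zip(r, data)) / nk
            var[k] = sum(ri[k] * (x - mu[k]) ** 2 for ri, x in zip(r, data)) / nk
            var[k] = max(var[k], 1e-6)  # guard against variance collapse
    return pi, mu, var

# Synthetic data from two well-separated clusters around 0 and 6.
random.seed(0)
data = [random.gauss(0.0, 1.0) for _ in range(200)] + \
       [random.gauss(6.0, 1.0) for _ in range(200)]
pi, mu, var = em_gaussian_mixture(data)
```

Each iteration increases (or leaves unchanged) the data log likelihood; the cited paper's view of EM as coordinate ascent on a single free-energy function is what justifies updating the responsibilities of only some points per pass.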
