Summary
Restricted Boltzmann Machines provide a simple way to
learn a layer of features without any supervision.
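As a concrete illustration, here is a minimal sketch of one common way to train such an RBM without labels: one-step contrastive divergence (CD-1). The NumPy implementation, the class name, and the learning rate are assumptions made for this example, not details given in the text.

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    def __init__(self, n_visible, n_hidden, lr=0.1):
        self.W = 0.01 * rng.standard_normal((n_visible, n_hidden))
        self.b_v = np.zeros(n_visible)   # visible biases
        self.b_h = np.zeros(n_hidden)    # hidden biases
        self.lr = lr

    def hidden_probs(self, v):
        return sigmoid(v @ self.W + self.b_h)

    def visible_probs(self, h):
        return sigmoid(h @ self.W.T + self.b_v)

    def cd1_update(self, v0):
        # Positive phase: hidden probabilities driven by the data.
        h0_prob = self.hidden_probs(v0)
        h0 = (rng.random(h0_prob.shape) < h0_prob).astype(float)
        # Negative phase: one step of Gibbs sampling (a reconstruction).
        v1_prob = self.visible_probs(h0)
        h1_prob = self.hidden_probs(v1_prob)
        # Approximate maximum-likelihood update of weights and biases.
        batch = v0.shape[0]
        self.W   += self.lr * (v0.T @ h0_prob - v1_prob.T @ h1_prob) / batch
        self.b_v += self.lr * (v0 - v1_prob).mean(axis=0)
        self.b_h += self.lr * (h0_prob - h1_prob).mean(axis=0)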
Many layers of representation can be learned by treating
the hidden states of one RBM as the visible data for
training the next RBM.
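A sketch of that greedy layer-by-layer procedure, reusing the hypothetical RBM class above: each trained RBM's hidden activities become the "visible" data for training the next RBM. The layer sizes and epoch count are illustrative assumptions.

def train_stack(data, layer_sizes, epochs=10):
    # Train a list of RBMs greedily, bottom-up.
    rbms, layer_input = [], data
    for n_hidden in layer_sizes:
        rbm = RBM(layer_input.shape[1], n_hidden)
        for _ in range(epochs):
            rbm.cd1_update(layer_input)
        rbms.append(rbm)
        # The hidden activities of this RBM serve as the "visible"
        # data for the next RBM in the stack.
        layer_input = rbm.hidden_probs(layer_input)
    return rbms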
This creates good generative models that can then be
fine-tuned.
Backpropagation can fine-tune them for discrimination.
Contrastive wake-sleep can fine-tune them for generation.
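A sketch of the discriminative fine-tuning step only (contrastive wake-sleep is not shown): the stacked RBM weights initialise a feed-forward sigmoid network, an added softmax output layer gives class probabilities, and backpropagation of the cross-entropy gradient adjusts all the weights. The softmax layer, one-hot labels, and learning rate are assumptions for illustration.

def fine_tune_step(rbms, W_out, b_out, x, labels, lr=0.01):
    # Forward pass through sigmoid layers initialised from the stacked RBMs.
    activations = [x]
    for rbm in rbms:
        activations.append(sigmoid(activations[-1] @ rbm.W + rbm.b_h))
    logits = activations[-1] @ W_out + b_out
    logits -= logits.max(axis=1, keepdims=True)      # numerical stability
    probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)

    # Backward pass: cross-entropy gradient propagated layer by layer.
    batch = x.shape[0]
    delta = (probs - labels) / batch                 # labels are one-hot rows
    grad_W_out = activations[-1].T @ delta
    grad_b_out = delta.sum(axis=0)
    delta = delta @ W_out.T
    W_out -= lr * grad_W_out
    b_out -= lr * grad_b_out
    for rbm, a_in, a_out in zip(reversed(rbms),
                                reversed(activations[:-1]),
                                reversed(activations[1:])):
        delta = delta * a_out * (1.0 - a_out)        # sigmoid derivative
        grad_W = a_in.T @ delta
        grad_b = delta.sum(axis=0)
        delta = delta @ rbm.W.T
        rbm.W   -= lr * grad_W
        rbm.b_h -= lr * grad_b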
The same ideas can be used for non-linear
dimensionality reduction.
This leads to very effective ways of visualizing sets of
documents or searching for similar documents.
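One way the learned codes could support such document search, sketched under the assumption that each document is represented as a fixed-length vector (e.g. word counts): documents are mapped through the stack to low-dimensional codes and compared by cosine similarity. The function names and the choice of similarity measure are illustrative, not taken from the text.

def encode(rbms, x):
    # Map document vectors to their top-level codes.
    for rbm in rbms:
        x = rbm.hidden_probs(x)
    return x

def most_similar(rbms, corpus, query, k=5):
    # Rank documents by cosine similarity between their codes.
    codes = encode(rbms, corpus)
    q = encode(rbms, query[None, :])
    sims = (codes @ q.T).ravel() / (
        np.linalg.norm(codes, axis=1) * np.linalg.norm(q) + 1e-9)
    return np.argsort(-sims)[:k]   # indices of the k most similar documents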